
Saturday, March 22, 2025

Efficient coding hypothesis

From Wikipedia, the free encyclopedia

The efficient coding hypothesis was proposed by Horace Barlow in 1961 as a theoretical model of sensory coding in the brain. Within the brain, neurons communicate with one another by sending electrical impulses referred to as action potentials or spikes. One goal of sensory neuroscience is to decipher the meaning of these spikes in order to understand how the brain represents and processes information about the outside world.

Barlow hypothesized that the spikes in the sensory system form a neural code for efficiently representing sensory information, where an efficient code is one that minimizes the number of spikes needed to transmit a given signal. This is somewhat analogous to transmitting information across the internet, where different file formats can be used to transmit a given image: different formats require different numbers of bits to represent the same image at a given distortion level, and some are better suited to certain classes of images than others. According to this model, the brain uses a code suited to representing the visual and auditory information that is characteristic of an organism's natural environment.


Efficient coding and information theory

The development of Barlow's hypothesis was influenced by information theory, introduced by Claude Shannon only a decade before. Information theory provides a mathematical framework for analyzing communication systems and formally defines concepts such as information, channel capacity, and redundancy. Barlow's model treats the sensory pathway as a communication channel in which neuronal spiking is an efficient code for representing sensory signals. The spiking code aims to maximize available channel capacity by minimizing the redundancy between representational units. Barlow was not the first to introduce the idea; it already appears in a 1954 article by Fred Attneave.
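The notion of redundancy here can be made concrete with a toy calculation. The sketch below is my own illustration, not from the article: it estimates bits per symbol for a correlated and a decorrelated binary "spike train". Both use the same alphabet equally often, but when successive values are predictable, pairs of symbols carry fewer bits than the channel allows.

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Shannon entropy (bits) of the empirical distribution of `symbols`."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def bits_per_symbol(seq, block=2):
    """Entropy-rate estimate: entropy of non-overlapping blocks / block size."""
    blocks = [tuple(seq[i:i + block]) for i in range(0, len(seq), block)]
    return entropy_bits(blocks) / block

# Both toy "spike trains" use 0 and 1 equally often (1 bit of marginal
# entropy each), but the first repeats values and is therefore redundant:
correlated = [0, 0, 0, 0, 1, 1, 1, 1]
decorrelated = [0, 0, 0, 1, 1, 0, 1, 1]

print(bits_per_symbol(correlated))    # 0.5: half the capacity is wasted
print(bits_per_symbol(decorrelated))  # 1.0: every symbol is informative
```

The redundant train could be transmitted with half as many "spikes" after recoding, which is exactly the saving Barlow's efficient code is supposed to capture.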

A key prediction of the efficient coding hypothesis is that sensory processing in the brain should be adapted to natural stimuli: neurons in the visual (or auditory) system should be optimized for coding images (or sounds) representative of those found in nature. Researchers have shown that filters optimized for coding natural images resemble the receptive fields of simple cells in V1. In the auditory domain, optimizing a network for coding natural sounds yields filters that resemble the impulse responses of cochlear filters found in the inner ear.

Constraints on the visual system

Due to constraints on the visual system such as the number of neurons and the metabolic energy required for "neural activities", the visual processing system must have an efficient strategy for transmitting as much information as possible. Information must be compressed as it travels from the retina back to the visual cortex. While the retinal receptors can receive information at about 10^9 bit/s, the optic nerve, composed of roughly one million ganglion cells each transmitting at about 1 bit/s, has a capacity of only about 10^6 bit/s. Further reductions limit the overall rate to around 40 bit/s, a bottleneck that contributes to inattentional blindness. The hypothesis therefore states that neurons should encode information as efficiently as possible in order to make the best use of limited neural resources. For example, it has been shown that visual data can be compressed up to 20-fold without noticeable information loss.
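The quoted figures imply two successive compression factors, which a quick back-of-the-envelope check makes explicit (only the rates stated in the text are used; the variable names are mine):

```python
# Figures quoted in the text above; variable names are my own labels.
retina_rate = 1e9   # bit/s received by retinal photoreceptors
optic_nerve = 1e6   # bit/s: ~10^6 ganglion cells at ~1 bit/s each
awareness = 40      # bit/s remaining after further reductions downstream

print(retina_rate / optic_nerve)  # 1000.0-fold compression to the optic nerve
print(optic_nerve / awareness)    # 25000.0-fold further reduction
```

So the optic nerve alone demands a thousand-fold compression, and the downstream bottleneck is another four orders of magnitude tighter.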

Evidence suggests that our visual processing system engages in bottom-up selection. For example, inattentional blindness suggests that there must be data deletion early on in the visual pathway. This bottom-up approach allows us to respond to unexpected and salient events more quickly and is often directed by attentional selection. This also gives our visual system the property of being goal-directed. Many have suggested that the visual system is able to work efficiently by breaking images down into distinct components. Additionally, it has been argued that the visual system takes advantage of redundancies in inputs in order to transmit as much information as possible while using the fewest resources.

Evolution-based neural system

Simoncelli and Olshausen outline the three major concepts that are assumed to be involved in the development of systems neuroscience:

  1. an organism has specific tasks to perform
  2. neurons have capabilities and limitations
  3. an organism is in a particular environment.

One assumption used in testing the efficient coding hypothesis is that neurons must be evolutionarily and developmentally adapted to the natural signals in their environment. The idea is that perceptual systems respond fastest to "environmental stimuli", and that the visual system should cut out any redundancies in the sensory input.

Natural images and statistics

Central to Barlow's hypothesis is information theory, which when applied to neuroscience, argues that an efficiently coding neural system "should match the statistics of the signals they represent". Therefore, it is important to be able to determine the statistics of the natural images that are producing these signals. Researchers have looked at various components of natural images including luminance contrast, color, and how images are registered over time. They can analyze the properties of natural scenes via digital cameras, spectrophotometers, and range finders.

Researchers look at how luminance contrasts are spatially distributed in an image: luminance contrasts are highly correlated the closer two pixels are, and less correlated the farther apart they are. Independent component analysis (ICA) is an algorithm that attempts to "linearly transform given (sensory) inputs into independent outputs (synaptic currents)". ICA eliminates redundancy by decorrelating the pixels in a natural image, so that the individual components making up the image are rendered statistically independent. However, researchers have argued that ICA is limited because it assumes the neural response is linear and therefore insufficiently describes the complexity of natural images: despite what is assumed under ICA, the components of a natural image have a "higher-order structure" involving correlations among components. Researchers have therefore developed temporal independent component analysis (TICA), which better represents the complex correlations between components in a natural image. Additionally, a "hierarchical covariance model" developed by Karklin and Lewicki expands on sparse coding methods and can represent additional components of natural images such as "object location, scale, and texture".
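A minimal sketch of the decorrelation step, well short of full ICA (which additionally seeks independence beyond second order): subtracting from one "pixel" its best linear prediction from a neighbor removes their covariance, the same operation a center-surround filter performs spatially. All names and numbers below are illustrative.

```python
import random
import statistics as st

random.seed(0)

# Synthetic "neighboring pixels": x2 is strongly correlated with x1,
# mimicking the spatial correlations of natural images.
x1 = [random.gauss(0.0, 1.0) for _ in range(10000)]
x2 = [0.9 * a + 0.1 * random.gauss(0.0, 1.0) for a in x1]

def cov(a, b):
    """Sample covariance (biased estimator; fine for illustration)."""
    ma, mb = st.fmean(a), st.fmean(b)
    return st.fmean((u - ma) * (v - mb) for u, v in zip(a, b))

# Decorrelate by subtracting from x2 its best linear prediction from x1,
# keeping only the unpredictable residual -- the redundancy is removed.
w = cov(x1, x2) / cov(x1, x1)
y2 = [b - w * a for a, b in zip(x1, x2)]

print(cov(x1, x2))   # large: neighboring pixels are redundant
print(cov(x1, y2))   # ~0: decorrelated
```

The residual channel `y2` carries only what `x1` could not predict, so the pair `(x1, y2)` transmits the same scene with no second-order redundancy.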

The chromatic spectrum of natural light, and of light reflected off "natural materials", can be readily characterized with principal components analysis (PCA). Because the cones absorb a specific number of photons from the natural image, researchers can use cone responses as a way of describing the natural image. They have found that the three classes of cone receptors in the retina can accurately code natural images and that color is already decorrelated in the LGN. Time has also been modeled: natural images transform over time, and these transformations can be used to see how the visual input changes over time.

A pedagogical review of efficient coding in visual processing --- efficient spatial coding, color coding, temporal/motion coding, stereo coding, and the combination of them --- is given in chapter 3 of the book "Understanding vision: theory, models, and data". It explains how efficient coding is realized when input noise makes redundancy reduction no longer adequate, and how efficient coding methods in different situations are related to or differ from each other.

Hypotheses for testing the efficient coding hypothesis

If neurons encode according to the efficient coding hypothesis, then individual neurons must be expressing their full output capacity. Before testing this hypothesis it is necessary to define what counts as a neural response. Simoncelli and Olshausen suggest, first, that an efficient neuron needs to be given a maximal response value so that researchers can measure whether a neuron efficiently reaches that maximum. Second, a population of neurons must not be redundant in transmitting signals and must be statistically independent. If the efficient coding hypothesis is accurate, researchers should observe sparsity in the neural responses: only a few neurons at a time should fire for a given input.
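Sparsity of a population response can be quantified; the Treves-Rolls index is one common choice (the specific firing rates below are invented for illustration). The index equals 1 when every neuron fires equally and falls to 1/N when a single neuron out of N carries the whole response.

```python
def treves_rolls(rates):
    """Treves-Rolls population sparseness: (mean r)^2 / mean(r^2).
    1.0 for uniform firing; 1/N when one of N neurons does all the work."""
    n = len(rates)
    mean = sum(rates) / n
    mean_sq = sum(r * r for r in rates) / n
    return mean * mean / mean_sq

dense = [5.0] * 10            # every neuron fires for the input
sparse = [50.0] + [0.0] * 9   # one neuron carries the whole response

print(treves_rolls(dense))   # 1.0
print(treves_rolls(sparse))  # 0.1
```

Under the hypothesis, responses to natural stimuli should yield index values much closer to the sparse end than to the dense end.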

Methodological approaches for testing the hypotheses

One approach is to design a model for early sensory processing based on the statistics of a natural image and then compare this predicted model to how real neurons actually respond to the natural image. The second approach is to measure a neural system responding to a natural environment, and analyze the results to see if there are any statistical properties to this response. A third approach is to derive the necessary and sufficient conditions under which an observed neural computation is efficient, and test whether empirical stimulus statistics satisfy them.

Examples of these approaches

1. Predicted model approach

In one study by Doi et al. in 2012, the researchers created a predicted response model of the retinal ganglion cells based on the statistics of the natural images used, while considering noise and biological constraints. They then compared the actual information transmission observed in real retinal ganglion cells to this optimal model to determine the efficiency. They found that information transmission in the retinal ganglion cells had an overall efficiency of about 80% and concluded that "the functional connectivity between cones and retinal ganglion cells exhibits unique spatial structure...consistent with coding efficiency".

A study by van Hateren and Ruderman in 1998 used ICA to analyze video sequences and compared the independent components the computer extracted from the images with visual-processing data obtained from a cat in DeAngelis et al. 1993. The researchers described the independent components obtained from a video sequence as the "basic building blocks of a signal", with the independent component filter (ICF) measuring "how strongly each building block is present". They hypothesized that if simple cells are organized to pick out the "underlying structure" of images over time, then the cells should act like independent component filters. They found that the ICFs determined by the computer were similar to the "receptive fields" observed in actual neurons.


2. Analyzing actual neural system in response to natural images

In a 2000 report in Science, William E. Vinje and Jack Gallant outlined a series of experiments used to test elements of the efficient coding hypothesis, including the theory that the non-classical receptive field (nCRF) decorrelates projections from the primary visual cortex. To test this, they recorded from V1 neurons in awake macaques during "free viewing of natural images" under conditions that simulated natural vision. The researchers hypothesized that V1 uses sparse code, which is minimally redundant and "metabolically more efficient".

They also hypothesized that interactions between the classical receptive field (CRF) and the nCRF produced this pattern of sparse coding during the viewing of natural scenes; the CRF was defined as the circular area surrounding the locations where stimuli evoked action potentials. To test this, they created eye-scan paths and extracted patches ranging in size from one to four times the diameter of the CRF. They found that the sparseness of the coding increased with patch size: larger patches encompassed more of the nCRF, indicating that interactions between the two regions created sparse code, and suggesting that V1 uses sparse code when natural images span the entire visual field. They also tested whether stimulation of the nCRF increased the independence of responses from V1 neurons by randomly selecting pairs of neurons, and found that the neurons were indeed more decoupled upon stimulation of the nCRF.

In conclusion, the experiments of Vinje and Gallant showed that V1 uses sparse code by employing both the CRF and nCRF when viewing natural images, with the nCRF showing a definitive decorrelating effect on neurons which may increase their efficiency by increasing the amount of independent information they carry. They propose that the cells may represent the individual components of a given natural scene, which may contribute to pattern recognition.

Another study, by Baddeley et al., showed that firing-rate distributions of neurons in cat visual area V1 and monkey inferotemporal (IT) cortex were exponential under naturalistic conditions, which implies optimal information transmission for a fixed average firing rate. A subsequent study of monkey IT neurons found that only a minority were well described by an exponential firing distribution. De Polavieja later argued that this discrepancy arose because the exponential solution is correct only for the noise-free case, and showed that taking noise into consideration could account for the observed results.
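The information-theoretic reading of Baddeley et al.'s observation is that, among non-negative firing-rate distributions with a fixed mean, the exponential has the largest differential entropy. A quick closed-form check against a uniform distribution with the same mean (the particular mean value is my arbitrary choice):

```python
import math

mu = 5.0  # fixed mean firing rate, arbitrary units (an assumed value)

# Closed-form differential entropies, in nats:
h_exponential = 1 + math.log(mu)   # exponential distribution with mean mu
h_uniform = math.log(2 * mu)       # Uniform(0, 2*mu), which has the same mean

# The gap is 1 - ln 2 (about 0.31 nats), independent of mu.
print(h_exponential > h_uniform)   # True
```

Since the gap does not depend on `mu`, the exponential code transmits more information than the matched uniform code at every fixed average rate.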

A study by Dan, Atick, and Reid in 1996 used natural images to test the hypothesis that, early in the visual pathway, incoming visual signals are decorrelated to optimize efficiency. This decorrelation can be observed as the "whitening" of the temporal and spatial power spectra of the neuronal signals. The researchers played natural-image movies in front of cats and used a multielectrode array to record neural signals; the cats' eyes were refracted and then fitted with contact lenses. They found that in the LGN the natural images were decorrelated, and concluded that "the early visual pathway has specifically adapted for efficient coding of natural visual information during evolution and/or development".
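The whitening idea can be illustrated with a toy temporal signal; this is a sketch of the principle, not Dan, Atick, and Reid's analysis. A random walk has a roughly 1/f^2 power spectrum, like the strong temporal correlations of natural image sequences, and a simple first-difference filter (an idealized temporal receptive field) removes those correlations.

```python
import random

random.seed(1)

# A random walk has a ~1/f^2 power spectrum, similar to the strong
# temporal correlations of natural image sequences.
steps = [random.gauss(0.0, 1.0) for _ in range(20000)]
walk = []
total = 0.0
for s in steps:
    total += s
    walk.append(total)

def lag1_corr(x):
    """Autocorrelation of a sequence at lag 1."""
    m = sum(x) / len(x)
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(len(x) - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# A first-difference filter removes the correlations, i.e. it
# "whitens" the signal in time.
diffed = [b - a for a, b in zip(walk, walk[1:])]

print(lag1_corr(walk))    # near 1: successive samples are redundant
print(lag1_corr(diffed))  # near 0: whitened
```

Flattening the power spectrum in this way is the temporal analogue of the spatial decorrelation observed in the LGN.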

Extensions

One of the implications of the efficient coding hypothesis is that the neural coding depends upon the statistics of the sensory signals. These statistics are a function of not only the environment (e.g., the statistics of the natural environment), but also the organism's behavior (e.g., how it moves within that environment). However, perception and behavior are closely intertwined in the perception-action cycle. For example, the process of vision involves various kinds of eye movements. An extension to the efficient coding hypothesis called active efficient coding (AEC) extends efficient coding to active perception. It hypothesizes that biological agents optimize not only their neural coding, but also their behavior to contribute to an efficient sensory representation of the environment. Along these lines, models for the development of active binocular vision, active visual tracking, and accommodation control have been proposed.

The brain has limited resources to process information; in vision this manifests as the visual attentional bottleneck, which forces the brain to select only a small fraction of visual input information for further processing, since merely coding information efficiently is no longer sufficient. A subsequent theory, the V1 Saliency Hypothesis, proposes that this exogenous attentional selection of visual input is guided by a bottom-up saliency map in the primary visual cortex.

Criticisms

Researchers should consider how the visual information is used: The hypothesis does not explain how the information from a visual scene is used—which is the main purpose of the visual system. It seems necessary to understand why we are processing image statistics from the environment because this may be relevant to how this information is ultimately processed. However, some researchers may see the irrelevance of the purpose of vision in Barlow's theory as an advantage for designing experiments.

Some experiments show correlations between neurons: When considering multiple neurons at a time, recordings "show correlation, synchronization, or other forms of statistical dependency between neurons". However, most of these experiments did not use natural stimuli to provoke these responses, so this may not fit directly into the efficient coding hypothesis, which is concerned with natural image statistics. In his review article Simoncelli notes that perhaps redundancy in the efficient coding hypothesis can be interpreted a bit differently: he argues that statistical dependency could be reduced over "successive stages of processing", not just in one area of the sensory pathway. Yet recordings by Hung et al. at the end of the visual pathway also show strong layer-dependent correlations to naturalistic objects and in ongoing activity. They showed that redundancy of neighboring neurons (i.e. a 'manifold' representation) benefits learning of complex shape features, and that network anisotropy/inhomogeneity is a stronger predictor than noise redundancy of encoding/decoding efficiency.

Observed redundancy: A comparison of the number of retinal ganglion cells to the number of neurons in the primary visual cortex shows an increase in the number of sensory neurons in the cortex as compared to the retina. Simoncelli notes that one major argument of critics is that higher up in the sensory pathway greater numbers of neurons handle the processing of sensory information, which would seem to produce redundancy. However, this observation may not be fully relevant because neurons have different neural coding. In his review, Simoncelli notes that "cortical neurons tend to have lower firing rates and may use a different form of code as compared to retinal neurons". Cortical neurons may also have the ability to encode information over longer periods of time than their retinal counterparts. Experiments done in the auditory system have confirmed that redundancy is decreased.

Difficult to test: Estimation of information-theoretic quantities requires enormous amounts of data, and is thus impractical for experimental verification. Additionally, informational estimators are known to be biased. However, some experimental success has occurred.

Need well-defined criteria for what to measure: This criticism illustrates one of the most fundamental issues of the hypothesis: assumptions are made about the definitions of both the inputs and the outputs of the system. The inputs into the visual system are not completely defined, but they are assumed to be encompassed in a collection of natural images. The output must be defined to test the hypothesis, but variability can occur here too, based on the choice of which types of neurons to measure, where they are located, and which types of responses, such as firing rate or spike times, are chosen to be measured.

How to take noise into account: Some argue that experiments that ignore noise, or other physical constraints on the system are too simplistic. However, some researchers have been able to incorporate these elements into their analyses, thus creating more sophisticated systems.

However, with appropriate formulations, efficient coding can also address some of these issues raised above. For example, some quantifiable degree of redundancies in neural representations of sensory inputs (manifested as correlations in neural responses) is predicted to occur when efficient coding is applied to noisy sensory inputs. Falsifiable theoretical predictions can also be made, and some of them subsequently tested.

Biomedical applications

Cochlear Implant

Possible applications of the efficient coding hypothesis include cochlear implant design. These neuroprosthetic devices stimulate the auditory nerve with electrical impulses, restoring some hearing to people who are hearing-impaired or deaf. The implants are considered successful and efficient and are the only such devices currently in use. Using frequency-place mappings in the efficient coding algorithm may benefit cochlear implant use in the future, and changes in design based on this hypothesis could increase speech intelligibility in hearing-impaired patients. Research using vocoded speech processed by different filters showed that humans deciphered the speech more accurately when it was processed with an efficient-code filter than with a cochleotopic filter or a linear filter. This suggests that efficient coding of noisy data offers perceptual benefits and provides listeners with more information. More research is needed to translate current findings into medically relevant changes to cochlear implant design.

Information warfare

Information warfare (IW) is the battlespace use and management of information and communication technology (ICT) in pursuit of a competitive advantage over an opponent. It is distinct from cyberwarfare, which attacks computers, software, and command and control systems. Information warfare is the manipulation of information trusted by a target, without the target's awareness, so that the target will make decisions against its own interest but in the interest of the one conducting information warfare. As a result, it is not clear when information warfare begins or ends, or how strong or destructive it is.

Information warfare may involve the collection of tactical information, assurance(s) that one's information is valid, spreading of propaganda or disinformation to demoralize or manipulate the enemy and the public, undermining the quality of the opposing force's information, and denial of information-collection opportunities to opposing forces. Information warfare is closely linked to psychological warfare.

The United States Armed Forces' use of the term favors technology and hence tends to extend into the realms of electronic warfare, cyberwarfare, information assurance and computer network operations, attack, and defense. Other militaries use the much broader term information operations which, although making use of technology, focuses on the more human-related aspects of information use, including (amongst many others) social network analysis, decision analysis, and the human aspects of command and control.

Overview

Information warfare has been described as "the use of information to achieve our national objectives."[6] According to NATO, "Information war is an operation conducted in order to gain an information advantage over the opponent."

Information warfare can take many forms.

The United States Air Force has had Information Warfare Squadrons since the 1980s. In fact, the official mission of the U.S. Air Force is now "To fly, fight and win... in air, space and cyberspace", with the latter referring to its information warfare role.

As the U.S. Air Force often risks aircraft and aircrews to attack strategic enemy communications targets, remotely disabling such targets using software and other means can provide a safer alternative. In addition, disabling such networks electronically (instead of explosively) allows them to be quickly re-enabled after the enemy territory is occupied. Similarly, counter-information warfare units are employed to deny such capability to the enemy. These techniques were first applied against Iraqi communications networks in the Gulf War.

Also during the Gulf War, Dutch hackers allegedly stole information about U.S. troop movements from U.S. Defense Department computers and tried to sell it to the Iraqis, who thought it was a hoax and turned it down. In January 1999, U.S. Air Intelligence computers were hit by a coordinated attack (Moonlight Maze), part of which came from a Russian mainframe. This could not be confirmed as a Russian cyber attack due to non-attribution – the principle that online identity may not serve as proof of real-world identity.

New battlefield

Within the realm of cyberspace, there are two primary weapons: network-centric warfare and C4ISR, which denotes integrated Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance. Cyberspace attacks initiated by one nation against another have the underlying goal of gaining information superiority over the attacked party, which includes disrupting or denying the victimized party's ability to gather and distribute information.

A real-world occurrence that illustrated the dangerous potential of cyberattacks transpired in 2007, when a strike by Israeli forces demolished an alleged nuclear reactor in Syria that was being constructed through a collaboration between Syria and North Korea. The strike was accompanied by a cyberattack on Syria's air defenses, which left them blind to the attack on the reactor and ultimately allowed it to occur (New York Times 2014). A more basic attack on a nation within cyberspace is a distributed denial-of-service (DDoS) attack, which is used to hinder networks or websites until they lose their primary functionality.

As implied, cyberattacks affect not just the military party being attacked but the whole population of the victimized nation. Since more aspects of daily life are being integrated into networks in cyberspace, civilian populations can be negatively affected during wartime. For example, if a nation attacked another nation's power grid servers in a specific area to disrupt communications, civilians and businesses in that area would also have to deal with power outages, potentially leading to economic disruption as well.

Moreover, physical ICTs have also been implemented into the latest revolution in military affairs by deploying new, more autonomous robots (i.e., unmanned drones) into the battlefield to carry out duties such as patrolling borders and attacking ground targets. Humans pilot many of the unmanned drones from remote locations; however, some of the more advanced robots, such as the Northrop Grumman X-47B, are capable of autonomous decisions. Despite operating from remote locations, a proportion of drone pilots still suffer from the stress factors of more traditional warfare. According to NPR, a study performed by the Pentagon in 2011 found that 29% of drone pilots are "burned out" and undergo high levels of stress; approximately 17% of the drone pilots surveyed in the study were labeled "clinically distressed", with some of those pilots also showing signs of post-traumatic stress disorder.

Modern ICTs have also brought advancements to communications management among military forces. Communication is a vital aspect of war for any involved party and, through the implementation of new ICTs such as data-enabled devices, military forces are now able to disseminate information faster than ever before. For example, some militaries are now employing the use of iPhones to upload data and information gathered by drones in the same area.

Notable examples

An office used by Russian web brigades captured by the Armed Forces of Ukraine during the Russian invasion of Ukraine

Chinese information warfare

The People's Republic of China engages in information warfare through the People's Liberation Army (PLA) and other organizations affiliated or controlled by the Chinese Communist Party (CCP). Laid out in the Chinese Defence White Paper of 2008, informatized warfare includes the utilization of information-based weapons and forces, including battlefield management systems, precision-strike capabilities, and technology-assisted command and control (C4ISR). The term also refers to propaganda and influence operations efforts by the Chinese state.

Russo-Ukrainian War

In 2022, the Armed Forces of Ukraine took advantage of deficiencies in Russian communications by allowing Russian forces to piggyback on Ukrainian networks to connect and communicate. Ukrainian forces would then eavesdrop and cut off Russian communications at a crucial part of the conversation.

To build support before it invaded Ukraine, Russia perpetuated a narrative claiming that the Ukrainian government was committing violence against its own Russian-speaking population. By publishing large amounts of disinformation on the internet, Russia got this alternate narrative picked up in search results, such as on Google News.

Russian interference in foreign elections

Russian interference in foreign elections, most notably the Russian interference in the 2016 United States elections, has been described as information warfare. According to Microsoft, Russia has also begun to interfere in the 2024 US presidential election, and according to NBC, Russia is conducting disinformation campaigns in the 2024 US elections against US President Joe Biden.

Russia vs West

Research suggests that Russia and the West are also engaged in an information war. For instance, Russia believes that the West is undermining its leadership by encouraging the overthrow of authoritarian regimes and the spread of liberal values. In response, Russia promotes anti-liberal sentiments, including racism, antisemitism, homophobia, and misogyny, and has sought to promote the idea that the American democratic state is failing.

Russia, China and pro-Palestinian protests

The Telegraph reported in 2024 that China and Russia were promoting pro-Palestinian influencers in order to manipulate British public opinion in favor of Russian and Chinese interests. NBC reported that Russia was using different tools to cause division within the US, by delegitimizing US police operations against pro-Palestinian protests and by pivoting public conversation from the Russian invasion of Ukraine to the Israeli-Palestinian conflict. Russian media activity increased by 400% in the weeks after Hamas' Oct. 7 attack on Israel.

United States COVID-19 disinformation campaign

According to a report by Reuters, the United States ran a propaganda campaign to spread disinformation about the Sinovac Chinese COVID-19 vaccine, including using fake social media accounts to spread the disinformation that the Sinovac vaccine contained pork-derived ingredients and was therefore haram under Islamic law. The campaign was described as "payback" for COVID-19 disinformation by China directed against the U.S. The campaign primarily targeted people in the Philippines and used a social media hashtag for "China is the virus" in Tagalog. The campaign ran from 2020 to mid-2021. The primary contractor for the U.S. military on the project was General Dynamics IT, which received $493 million for its role.

While information warfare has yielded many advances in the types of attack that a government can make, it has also raised concerns about the moral and legal ambiguities surrounding this new form of war. Traditionally, wars have been analyzed by moral scholars according to just war theory; however, just war theory fails with information warfare because it is based on the traditional conception of war. Compared to traditional warfare, information warfare raises three main issues:

  1. The risk for the party or nation initiating the cyberattack is substantially lower than the risk for a party or nation initiating a traditional attack. This makes it easier for governments, as well as potential terrorist or criminal organizations, to make these attacks more frequently than they could with traditional war.
  2. Information communication technologies (ICT) are so immersed in the modern world that a very wide range of technologies are at risk of a cyberattack. Specifically, civilian technologies can be targeted for cyberattacks and attacks can even potentially be launched through civilian computers or websites. As such, it is harder to enforce control of civilian infrastructures than a physical space. Attempting to do so would also raise many ethical concerns about the right to privacy, making defending against such attacks even tougher.
  3. The mass-integration of ICT into our system of war makes it much harder to assess accountability for situations that may arise when using robotic and/or cyber attacks. For robotic weapons and automated systems, it's becoming increasingly hard to determine who is responsible for any particular event that happens. This issue is exacerbated in the case of cyberattacks, as sometimes it is virtually impossible to trace who initiated the attack in the first place.

Legal concerns have arisen centered on these issues, specifically the issue of the right to privacy in the United States of America. Lt. General Keith B. Alexander, who served as the head of Cyber Command under President Barack Obama, noted a "mismatch between our technical capabilities to conduct operations and the governing laws and policies" in writing to the Senate Armed Services Committee. A key point of concern was the targeting of civilian institutions for cyberattacks, to which the general promised to try to maintain a mindset similar to that of traditional war, seeking to limit the impact on civilians.

Argument from poor design

From Wikipedia, the free encyclopedia
 
The argument from poor design, also known as the dysteleological argument, is an argument against the assumption of the existence of a creator God, based on the reasoning that any omnipotent and omnibenevolent deity or deities would not create organisms with the perceived suboptimal designs that occur in nature.

The argument is structured as a basic modus tollens: a perfect creator would not produce a "creation" containing many defects; since such defects occur, design appears an implausible theory for the origin of earthly existence. Proponents most commonly use the argument in a weaker way, however: not with the aim of disproving the existence of God, but rather as a reductio ad absurdum of the well-known argument from design (which suggests that living things appear too well-designed to have originated by chance, and so an intelligent God or gods must have deliberately created them).

Although the phrase "argument from poor design" has seen little use, this type of argument has been advanced many times using words and phrases such as "poor design", "suboptimal design", "unintelligent design" or "dysteleology/dysteleological". The nineteenth-century biologist Ernst Haeckel applied the term "dysteleology" to the implications of organs so rudimentary as to be useless to the life of an organism. In his 1868 book Natürliche Schöpfungsgeschichte (The History of Creation), Haeckel devoted most of a chapter to the argument, ending with the proposition (perhaps with tongue slightly in cheek) of "a theory of the unsuitability of parts in organisms, as a counter-hypothesis to the old popular doctrine of the suitability of parts". In 2005, Donald Wise of the University of Massachusetts Amherst popularised the term "incompetent design" (a play on "intelligent design"), to describe aspects of nature seen as flawed in design.

Traditional Christian theological responses generally posit that God constructed a perfect universe but that humanity's misuse of its free will to rebel against God has resulted in the corruption of divine good design.

Overview

Natural selection is expected to push fitness to a peak, but that peak often is not the highest.

The argument runs that:

  1. An omnipotent, omniscient, omnibenevolent creator God would create organisms that have optimal design.
  2. Organisms have features that are suboptimal.
  3. Therefore, God either did not create these organisms or is not omnipotent, omniscient and omnibenevolent.
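The three steps above are an instance of modus tollens. Writing P for "an omnipotent, omniscient, omnibenevolent God created these organisms" and Q for "these organisms have optimal design" (so that the conclusion, denying P, covers both disjuncts in step 3), the schema is:

```latex
% Modus tollens form of the argument from poor design:
%   P: a perfect (omnipotent, omniscient, omnibenevolent) God created these organisms
%   Q: these organisms have optimal design
\[
\begin{array}{ll}
P \rightarrow Q    & \text{(premise 1)} \\
\neg Q             & \text{(premise 2)} \\
\hline
\therefore\ \neg P & \text{(conclusion)}
\end{array}
\]
```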

It is sometimes used as a reductio ad absurdum of the well-known argument from design, which runs as follows:

  1. Living things are too well-designed to have originated by chance.
  2. Therefore, life must have been created by an intelligent creator.
  3. This creator is God.

"Poor design" is consistent with the predictions of the scientific theory of evolution by means of natural selection. This predicts that features that evolved for certain uses are then reused or co-opted for different uses, or abandoned altogether; a suboptimal state is due to the inability of the hereditary mechanism to eliminate the particular vestiges of the evolutionary process.

In fitness landscape terms, natural selection will always push "up the hill", but a species cannot normally get from a lower peak to a higher peak without first going through a valley.
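This local-peak behavior can be illustrated with a toy simulation (the landscape and numbers are purely illustrative; hill-climbing stands in for selection that never accepts a loss of fitness):

```python
import random

# Toy fitness landscape: a local peak at x=2 (height 1.0) and a
# higher global peak at x=8 (height 2.0), separated by a valley.
def fitness(x):
    return 1.0 * max(0.0, 1 - abs(x - 2)) + 2.0 * max(0.0, 1 - abs(x - 8))

def hill_climb(x, step=0.1, generations=1000):
    """Accept only mutations that increase fitness (pure selection)."""
    for _ in range(generations):
        candidate = x + random.choice([-step, step])
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

random.seed(0)
# A lineage starting near x=1 climbs the local peak at x=2 and stays
# there: crossing the valley toward x=8 would require a temporary
# fitness loss, which hill-climbing never accepts.
final = hill_climb(1.0)
print(round(final, 1))
```

The lineage ends near the lower peak even though a fitter peak exists, which is exactly the "cannot cross the valley" point made above.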

The argument from poor design is one of the arguments that was used by Charles Darwin; modern proponents have included Stephen Jay Gould, Richard Dawkins, and Nathan H. Lents. They argue that such features can be explained as a consequence of the gradual, cumulative nature of the evolutionary process. Theistic evolutionists generally reject the argument from design, but do still maintain belief in the existence of God.

Examples

In humans

Fatal flaws

Artist's representation of an ectopic pregnancy. Critics cite such common biological occurrences as contradictory to the 'watchmaker analogy'.

American scientist Nathan H. Lents published his book on poor design in the human body and genome in 2018 titled Human Errors. The book ignited a firestorm of criticism from the creationist community but was well received by the scientific community and received unanimously favorable reviews in the dozens of non-creationist media outlets that covered it.

Several defects in human anatomy can result in death, especially without modern medical care:

  • In the human female, a fertilized egg can implant into the fallopian tube, cervix or ovary rather than the uterus, causing an ectopic pregnancy. The existence of a cavity between the ovary and the fallopian tube could indicate a flawed design in the female reproductive system. Prior to modern surgery, ectopic pregnancy invariably caused the deaths of both mother and baby. Even in modern times, in almost all cases the pregnancy must be aborted to save the life of the mother.
  • In the human female, the birth canal passes through the pelvis. The prenatal skull will deform to a surprising extent. However, if the baby's head is significantly larger than the pelvic opening, the baby cannot be born naturally. Prior to the development of modern surgery (caesarean section), such a complication would lead to the death of the mother, the baby, or both. Other birthing complications such as breech birth are worsened by this position of the birth canal.
  • In the human male, testes develop initially within the abdomen. Later during gestation, they migrate through the abdominal wall into the scrotum. This causes two weak points in the abdominal wall where hernias can later form. Prior to modern surgical techniques, complications from hernias, such as intestinal blockage and gangrene, usually resulted in death.
  • The existence of the pharynx, a passage used for both ingestion and respiration, with the consequent drastic increase in the risk of choking.
  • The breathing reflex is stimulated not directly by the absence of oxygen but indirectly by the presence of carbon dioxide. This means that high concentrations of inert gases, such as nitrogen and helium, can cause suffocation without any biological warning. Furthermore, at high altitudes, oxygen deprivation can occur in unadapted individuals who do not consciously increase their breathing rate.
  • The human appendix has traditionally been regarded as a vestigial organ serving no purpose. Appendicitis, an infection of this organ, is almost certain death without medical intervention. "During the past few years, however, several studies have suggested its immunological importance for the development and preservation of the intestinal immune system."
  • Tinnitus, a phantom auditory sensation, is a maladaptation resulting from hearing loss most often caused by exposure to loud noise. Tinnitus serves no practical purpose, reduces quality of life, may cause depression, and when severe can lead to suicide.

Other flaws

  • Barely used nerves and muscles, such as the plantaris muscle of the foot, that are missing in part of the human population and are routinely harvested as spare parts if needed during operations. Another example is the muscles that move the ears, which some people can learn to control to a degree, but serve no purpose in any case.
  • The common malformation of the human spinal column, leading to scoliosis, sciatica and congenital misalignment of the vertebrae. The spinal cord can never properly heal if it is damaged, because mature neurons have become so specialized that they are no longer able to regrow; a broken spinal cord therefore results in permanent paralysis.
  • The route of the recurrent laryngeal nerve is such that it travels from the brain to the larynx by looping around the aortic arch. This same configuration holds true for many animals; in the case of the giraffe, this results in about twenty feet of extra nerve.
  • Almost all animals and plants synthesize their own vitamin C, but humans cannot because the gene for this enzyme is defective (Pseudogene ΨGULO). Lack of vitamin C results in scurvy and eventually death. The gene is also non-functional in other primates and in guinea pigs, but is functional in most other animals.
  • The prevalence of congenital diseases and genetic disorders such as Huntington's disease.
  • The male urethra passes directly through the prostate, which can produce urinary difficulties if the prostate becomes swollen.
  • Crowded teeth and poor sinus drainage, as human faces are significantly flatter than those of other primates although humans share the same tooth set. This results in a number of problems, most notably with wisdom teeth, which can damage neighboring teeth or cause serious infections of the mouth.
  • The structure of human eyes (as well as those of all vertebrates). The retina is 'inside out'. The nerves and blood vessels lie on the surface of the retina instead of behind it as is the case in many invertebrate species. This arrangement forces a number of complex adaptations and gives mammals a blind spot. Having the optic nerve connected to the side of the retina that does not receive the light, as is the case in cephalopods, would avoid these problems. Lents and colleagues have proposed that the tapetum lucidum, the reflective surface behind vertebrate retinas, has evolved to overcome the limitations of the inverted retina, as cephalopods have never evolved this structure. However, an 'inverted' retina actually improves image quality through Müller cells by reducing distortion. The effects of the blind spots resulting from the inverted retina are cancelled by binocular vision, as the blind spots in both eyes are oppositely angled. Additionally, as cephalopod eyes lack cone cells and might be able to judge color by bringing specific wavelengths to a focus on the retina, an inverted retina might interfere with this mechanism.
  • Humans are attracted to junk food's non-nutritious ingredients, and even wholly non-nutritious psychoactive drugs, and can experience physiological adaptations to prefer them to nutrients.

Other life

  • In the African locust, nerve cells start in the abdomen but connect to the wing. This leads to unnecessary use of materials.
  • Intricate reproductive devices in orchids, apparently constructed from components commonly having different functions in other flowers.
  • The use by pandas of their enlarged radial sesamoid bones in a manner similar to how other creatures use thumbs.
  • The existence of unnecessary wings in flightless birds, e.g. ostriches.
  • The enzyme RuBisCO has been described as a "notoriously inefficient" enzyme, as it is inhibited by oxygen, has a very slow turnover and is not saturated at current levels of carbon dioxide in the atmosphere. The enzyme is inhibited as it is unable to distinguish between carbon dioxide and molecular oxygen, with oxygen acting as a competitive enzyme inhibitor. However, RuBisCO remains the key enzyme in carbon fixation, and plants overcome its poor activity by having massive amounts of it inside their cells, making it the most abundant protein on Earth.
  • Sturdy but heavy bones, suited for non-flight, occurring in animals like bats. Or, conversely: unstable, light, hollow bones, suited for flight, occurring in birds like penguins and ostriches, which cannot fly.
  • Various vestigial body parts, like the femur and pelvis in whales (evolution indicates the ancestors of whales lived on land).
  • Turritopsis dohrnii and species of the genus Hydra have biological immortality, but most animals do not.
  • Many species have strong instincts to behave in response to a certain stimulus. Natural selection can leave animals behaving in detrimental ways when they encounter a supernormal stimulus - like a moth flying into a flame.
  • Plants are green and not black, as chlorophyll absorbs green light poorly, even though black plants would absorb more light energy.
  • Whales and dolphins breathe air, but live in the water, meaning they must swim to the surface frequently to breathe.
  • Albatrosses cannot take off or land properly.
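The competitive inhibition noted above for RuBisCO, where O2 competes with CO2 for the enzyme's active site, follows the standard Michaelis–Menten form for a competitive inhibitor (a textbook equation, not RuBisCO-specific measurements):

```latex
% Rate of carboxylation with O2 acting as a competitive inhibitor:
%   [CO2] = substrate concentration, [O2] = inhibitor concentration,
%   K_m = Michaelis constant, K_i = inhibition constant.
\[
v = \frac{V_{\max}\,[\mathrm{CO_2}]}{K_m\left(1 + \dfrac{[\mathrm{O_2}]}{K_i}\right) + [\mathrm{CO_2}]}
\]
```

Raising CO2 relative to O2 increases the rate v, which is consistent with the observation that the enzyme is not saturated at current atmospheric CO2 levels.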

Counterarguments

Specific examples

Intelligent design proponent William Dembski questions the first premise of the argument, claiming that "intelligent design" does not need to be optimal.

While the appendix has been previously credited with very little function, research has shown that it serves an important role in the fetus and young adults. Endocrine cells appear in the appendix of the human fetus at around the 11th week of development, which produce various biogenic amines and peptide hormones, compounds that assist with various biological control (homeostatic) mechanisms. In young adults, the appendix has some immune functions.

Responses to counterarguments

In response to the claim that uses have been found for "junk" DNA, proponents note that the fact that some non-coding DNA has a purpose does not establish that all non-coding DNA has a purpose, and that the human genome does include pseudogenes that are nonfunctional "junk", with others noting that some sections of DNA can be randomized, cut, or added to with no apparent effect on the organism in question. The original study that suggested that the Makorin1-p1 served some purpose has been disputed. However, the original study is still frequently cited in newer studies and articles on pseudogenes previously thought to be nonfunctional.

As an argument regarding God

The argument from poor design is sometimes interpreted, by the argumenter or the listener, as an argument against the existence of God, or against characteristics commonly attributed to a creator deity, such as omnipotence, omniscience, or personality. In a weaker form, it is used as an argument for the incompetence of God. The existence of "poor design" (as well as the perceived prodigious "wastefulness" of the evolutionary process) would seem to imply a "poor" designer, or a "blind" designer, or no designer at all. In Gould's words, "If God had designed a beautiful machine to reflect his wisdom and power, surely he would not have used a collection of parts generally fashioned for other purposes. Orchids are not made by an ideal engineer; they are jury-rigged...."

The apparently suboptimal design of organisms has also been used by proponents of theistic evolution to argue in favour of a creator deity who uses natural selection as a mechanism of his creation. Arguers from poor design regard such counter-arguments as posing a false dilemma: either a creator deity designed life on Earth well, or flaws in design indicate that life was not designed at all. This framing allows proponents of intelligent design to cherry-pick which aspects of life constitute design, rendering the theory unfalsifiable. Christian proponents of both intelligent design and creationism may claim that good design indicates the creative intelligence of their God, while poor design indicates corruption of the world as a result of free will that caused the fall of man (for example, in Genesis 3:16 Yahweh says to Eve "I will increase your trouble in pregnancy").

Adaptationism

From Wikipedia, the free encyclopedia

Adaptationism is a scientific perspective on evolution that focuses on accounting for the products of evolution as collections of adaptive traits, each a product of natural selection with some adaptive rationale.

A formal alternative would be to look at the products of evolution as the result of neutral evolution, in terms of structural constraints, or in terms of a mixture of factors including (but not limited to) natural selection.

The most obvious justification for an adaptationist perspective is the belief that traits are, in fact, always adaptations built by natural selection for their functional role. This position is called "empirical adaptationism" by Godfrey-Smith. However, Godfrey-Smith also identifies "methodological" and "explanatory" flavors of adaptationism, and argues that all three are found in the evolutionary literature.

Although adaptationism has always existed (the view that the features of organisms are wonderfully adapted predates evolutionary thinking) and was sometimes criticized for its "Panglossian" excesses (e.g., by Bateson or Haldane), concerns about the role of adaptationism in scientific research did not become a major issue of debate until evolutionary biologists Stephen Jay Gould and Richard Lewontin penned a famous critique, "The Spandrels of San Marco and the Panglossian Paradigm". According to Gould and Lewontin, evolutionary biologists had a habit of proposing adaptive explanations for any trait by default without considering non-adaptive alternatives, and often by conflating products of adaptation with the process of natural selection. They identified neutral evolution and developmental constraints as potentially important non-adaptive factors and called for alternative research agendas.

This critique provoked defenses by Mayr, Reeve and Sherman, and others, who argued that the adaptationist research program was unquestionably highly successful, and that the causal and methodological basis for considering alternatives was weak. The "Spandrels paper" (as it came to be known) also added fuel to the emergence of an alternative "evo-devo" agenda focused on developmental "constraints". Today, molecular evolutionists often cite neutral evolution as the null hypothesis in evolutionary studies, i.e., offering a direct contrast to the adaptationist approach. Constructive neutral evolution has been suggested as a means by which complex systems emerge through neutral transitions, and has been invoked to help understand the origins of a wide variety of features from the spliceosome of eukaryotes to the interdependency and simplification widespread in microbial communities.

Today, adaptationism is associated with the "reverse engineering" approach. Richard Dawkins noted in The Blind Watchmaker that evolution, an impersonal process, produces organisms that give the appearance of having been designed for a purpose. This observation justifies looking for the function of traits observed in biological organisms. This reverse engineering is used in disciplines such as psychology and economics to explain the features of human cognition. Reverse engineering can, in particular, help explain cognitive biases as adaptive solutions that assist individuals in decision-making when considering constraints such as the cost of processing information. This approach is valuable in understanding how seemingly irrational behaviors may, in fact, be optimal given the environmental and informational limitations under which human cognition operates.

Overview

Criteria to identify a trait as an adaptation

Adaptationism is an approach to studying the evolution of form and function. It attempts to frame the existence and persistence of traits, assuming that each of them arose independently and improved the reproductive success of the organism's ancestors. A trait is an adaptation if it fulfils the following criteria:

  1. The trait is a variation of an earlier form.
  2. The trait is heritable through the transmission of genes.
  3. The trait enhances reproductive success.

Constraints on the power of evolution

Genetic constraints

Genetic reality provides constraints on the power of random mutation followed by natural selection.

With pleiotropy, some genes control multiple traits, so that adaptation of one trait is impeded by effects on other traits that are not necessarily adaptive. Epistasis is a case where the regulation or expression of one gene depends on one or several others; this is true of a good number of genes, though to differing extents. Epistasis muddies the response to selection because an allele selected for its effect on one trait can simultaneously affect the expression of others. This leads to the coregulation of other traits for reasons other than each of those traits being adaptive. As with pleiotropy, traits can reach fixation in a population as a by-product of selection for another trait.

In the context of development the difference between pleiotropy and epistasis is not so clear, but at the genetic level the distinction is clearer. Since such traits are by-products of others, it can be said that they evolved, but not that they necessarily represent adaptations.

Polygenic traits are controlled by a number of separate genes. Many traits are polygenic, for example human height. To drastically change a polygenic trait is likely to require multiple changes.
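A toy additive model (illustrative locus counts and effect sizes, not empirical values) shows why a polygenic trait resists drastic change from any single mutation:

```python
import random

random.seed(1)

# Toy polygenic model: a quantitative trait (e.g. height) is the sum of
# small additive contributions from many independent loci.
N_LOCI = 100

def trait_value():
    # Each locus contributes 0 or 1 unit with equal probability.
    return sum(random.randint(0, 1) for _ in range(N_LOCI))

population = [trait_value() for _ in range(10_000)]
mean = sum(population) / len(population)

# Values cluster tightly around N_LOCI / 2: a mutation at one locus
# shifts the trait by at most 1 unit out of 100, so drastically
# changing the trait requires coordinated changes at many loci.
print(round(mean))
```

The clustering around the mean is the population-genetic reason a single mutation cannot move a polygenic trait far, matching the claim above.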

Anatomical constraints

Anatomical constraints are features of an organism's anatomy that are prevented from changing by being constrained in some way. When organisms diverge from a common ancestor and inherit certain characteristics which become modified by natural selection of mutant phenotypes, it is as if some traits are locked in place and are unable to change in certain ways. Textbook anatomical constraints often include examples of structures that connect parts of the body together through a physical link.

These links are hard if not impossible to break because evolution usually requires that anatomy be formed by small consecutive modifications in populations through generations. In his book, Why We Get Sick, Randolph Nesse uses the "blind spot" in the vertebrate eye (caused by the nerve fibers running through the retina) as an example of this. He argues that natural selection has come up with an elaborate work-around of the eyes wobbling back-and-forth to correct for this, but vertebrates have not found the solution embodied in cephalopod eyes, where the optic nerve does not interrupt the view. See also: Evolution of the eye.

Another example is the cranial nerves in tetrapods. In early vertebrates such as sharks, skates, and rays (collectively, the Chondrichthyes), the cranial nerves run from the part of the brain that interprets sensory information and radiate out towards the organs that produce those sensations. In tetrapods, however, and mammals in particular, the nerves take an elaborate winding path through the cranium around structures that evolved after the common ancestor with sharks.

Debate with structuralism

Adaptationism is sometimes characterized by critics as an unsubstantiated assumption that all or most traits are optimal adaptations. Structuralist critics (most notably Richard Lewontin and Stephen Jay Gould in their "spandrel" paper) contend that the adaptationists have overemphasized the power of natural selection to shape individual traits to an evolutionary optimum. Adaptationists are sometimes accused by their critics of using ad hoc "just-so stories". The critics, in turn, have been accused of misrepresentation (Straw man argumentation), rather than attacking the actual statements of supposed adaptationists.

Adaptationist researchers respond by asserting that they, too, follow George Williams' depiction of adaptation as an "onerous concept" that should only be applied in light of strong evidence. This evidence can be generally characterized as the successful prediction of novel phenomena based on the hypothesis that design details of adaptations should fit a complex evolved design to respond to a specific set of selection pressures. In evolutionary psychology, researchers such as Leda Cosmides, John Tooby, and David Buss contend that the bulk of research findings that were uniquely predicted through adaptationist hypothesizing comprise evidence of the methods' validity.

Purpose and function

There are philosophical issues with the way biologists speak of function, effectively invoking teleology, the purpose of an adaptation.

Function

To say something has a function is to say something about what it does for the organism. It also says something about its history: how it has come about. A heart pumps blood: that is its function. It also emits sound, which is considered to be an ancillary side-effect, not its function. The heart has a history (which may be well or poorly understood), and that history is about how natural selection formed and maintained the heart as a pump. Every aspect of an organism that has a function has a history. Now, an adaptation must have a functional history: therefore we expect it must have undergone selection caused by relative survival in its habitat. It would be quite wrong to use the word adaptation about a trait which arose as a by-product.

Teleology

Teleology was introduced into biology by Aristotle to describe the adaptedness of organisms. Biologists have found the implications of purposefulness awkward as they suggest supernatural intention, an aspect of Plato's thinking which Aristotle rejected. A similar term, teleonomy, grew out of cybernetics and self-organising systems and was used by biologists of the 1960s such as Ernst Mayr and George C. Williams as a less loaded alternative. On the one hand, adaptation is obviously purposeful: natural selection chooses what works and eliminates what does not. On the other hand, biologists want to deny conscious purpose in evolution. The dilemma gave rise to a famous joke by the evolutionary biologist Haldane: "Teleology is like a mistress to a biologist: he cannot live without her but he's unwilling to be seen with her in public." David Hull commented that Haldane's mistress "has become a lawfully wedded wife. Biologists no longer feel obligated to apologize for their use of teleological language; they flaunt it. The only concession which they make to its disreputable past is to rename it 'teleonomy'."

Sterilization law in the United States

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Sterilization_law_in_the_United_States   ...