
Saturday, March 13, 2021

Evolutionary approaches to depression

From Wikipedia, the free encyclopedia

Evolutionary approaches to depression are attempts by evolutionary psychologists to use the theory of evolution to shed light on the problem of mood disorders. Depression is generally thought of as a dysfunction or a mental disorder, but its prevalence does not increase with age the way dementia and other organic dysfunctions commonly do. Some researchers have surmised that the disorder may have evolutionary roots, in the same way that others suggest evolutionary contributions to schizophrenia, sickle cell anemia, psychopathy and other disorders. Psychology and psychiatry have not generally embraced evolutionary explanations for behaviors, and the proposed explanations for the evolution of depression remain controversial.

Background

Major depression (also called "major depressive disorder", "clinical depression" or often simply "depression") is a leading cause of disability worldwide, and in 2000 was the fourth leading contributor to the global burden of disease (measured in DALYs); it is also an important risk factor for suicide. It is understandable, then, that clinical depression is thought to be a pathology—a major dysfunction of the brain.

In most cases, rates of organ dysfunction increase with age, with low rates in adolescents and young adults, and the highest rates in the elderly. These patterns are consistent with evolutionary theories of aging which posit that selection against dysfunctional traits decreases with age (because there is a decreasing probability of surviving to later ages).

In contrast to these patterns, prevalence of clinical depression is high in all age categories, including otherwise healthy adolescents and young adults. In one study of the US population, for example, the 12-month prevalence of a major depressive episode was highest in the youngest age category (15- to 24-year-olds). The high prevalence of unipolar depression (excluding depression associated with bipolar disorder) is also an outlier when compared to the prevalence of other mental disorders such as major mental retardation, autism, schizophrenia and even the aforementioned bipolar disorder, all with prevalence rates about one tenth that of depression, or less. As of 2017, the only mental disorders with a higher prevalence than depression are anxiety disorders.

The common occurrence and persistence of a trait like clinical depression with such negative effects early in life is difficult to explain. (Rates of infectious disease are high in young people, of course, but clinical depression is not thought to be caused by an infection.) Evolutionary psychology and its application in evolutionary medicine suggest how behaviour and mental states, including seemingly harmful states such as depression, may have been beneficial adaptations of human ancestors which improved the fitness of individuals or their relatives. It has been argued, for example, that Abraham Lincoln's lifelong depression was a source of insight and strength. Some even suggest that "we aren't designed to have happiness as our natural default" and so a state of depression is the evolutionary norm.

The following hypotheses attempt to identify a benefit of depression that outweighs its obvious costs.

Such hypotheses are not necessarily incompatible with one another and may explain different aspects, causes, and symptoms of depression.

Psychic pain hypothesis

One reason depression is thought to be a pathology is that it causes so much psychic pain and distress. However, physical pain is also very distressful, yet it has an evolved function: to inform the organism that it is suffering damage, to motivate it to withdraw from the source of damage, and to learn to avoid such damage-causing circumstances in the future. Sadness is also distressing, yet is widely believed to be an evolved adaptation. In fact, perhaps the most influential evolutionary view is that most cases of depression are simply particularly intense cases of sadness in response to adversity, such as the loss of a loved one.

According to the psychic pain hypothesis, depression is analogous to physical pain in that it informs the sufferer that current circumstances, such as the loss of a friend, are imposing a threat to biological fitness. It motivates the sufferer to cease activities that led to the costly situation, if possible, and it causes him or her to learn to avoid similar circumstances in the future. Proponents of this view tend to focus on low mood, and regard clinical depression as a dysfunctional extreme of low mood rather than as a unique set of characteristics that is physiologically distinct from ordinary depressed mood.

Alongside the absence of pleasure, other noticeable changes include psychomotor retardation, disrupted patterns of sleeping and feeding, a loss of sex drive and motivation—which are all also characteristics of the body's reaction to actual physical pain. In depressed people there is an increased activity in the regions of the cortex involved with the perception of pain, such as the anterior cingulate cortex and the left prefrontal cortex. This activity allows the cortex to manifest an abstract negative thought as a true physical stressor to the rest of the brain.

Behavioral shutdown model

The behavioral shutdown model states that if an organism faces more risk or expenditure than reward from activities, the best evolutionary strategy may be to withdraw from them. This model proposes that emotional pain, like physical pain, serves a useful adaptive purpose. Negative emotions like disappointment, sadness, grief, fear, anxiety, anger, and guilt are described as "evolved strategies that allow for the identification and avoidance of specific problems, especially in the social domain." Depression is characteristically associated with anhedonia and lack of energy, and those experiencing it are risk-averse and perceive more negative and pessimistic outcomes because they are focused on preventing further loss. Although the model views depression as an adaptive response, it does not suggest that depression is beneficial by the standards of current society. It does suggest, however, that many approaches to depression treat symptoms rather than causes, and that underlying social problems need to be addressed.

A phenomenon related to the behavioral shutdown model is learned helplessness. In animal subjects, a loss of control or predictability in the subject's experiences results in a condition similar to clinical depression in humans. That is to say, if uncontrollable and unstoppable stressors are repeated for long enough, a rat subject will develop learned helplessness, which shares a number of behavioral and psychological features with human depression. The subject will not attempt to cope with problems, even when placed in a stressor-free novel environment. Even when its rare attempts at coping prove successful in a new environment, a long-lasting cognitive block prevents the subject from perceiving its actions as useful, and the coping strategy is soon abandoned. From an evolutionary perspective, learned helplessness also allows a conservation of energy for an extended period of time should individuals find themselves in a predicament that is outside of their control, such as an illness or a dry season. However, for today's humans whose depression resembles learned helplessness, the phenomenon usually manifests as a loss of motivation and the distortion of viewing one uncontrollable aspect of one's life as representative of all aspects of it – suggesting a mismatch between ultimate cause and modern manifestation.

Analytical rumination hypothesis

This hypothesis suggests that depression is an adaptation that causes the affected individual to concentrate his or her attention and focus on a complex problem in order to analyze and solve it.

One way depression increases the individual's focus on a problem is by inducing rumination. Depression activates the left ventrolateral prefrontal cortex, which increases attention control and maintains problem-related information in an "active, accessible state" referred to as "working memory", or WM. As a result, depressed individuals have been shown to ruminate, reflecting on the reasons for their current problems. Feelings of regret associated with depression also cause individuals to reflect and analyze past events in order to determine why they happened and how they could have been prevented. The rumination hypothesis has come under criticism. Evolutionary fitness is increased by ruminating before rather than after bad outcomes. A situation that resulted in a child being in danger but unharmed should lead the parent to ruminate on how to avoid the dangerous situation in the future. Waiting until the child dies and then ruminating in a state of depression is too late.

Some cognitive psychologists argue that ruminative tendency itself increases the likelihood of the onset of depression.

Another way depression increases an individual's ability to concentrate on a problem is by reducing distraction from the problem. For example, anhedonia, which is often associated with depression, decreases an individual's desire to participate in activities that provide short-term rewards, and instead allows the individual to concentrate on long-term goals. In addition, "psychomotoric changes", such as solitariness, decreased appetite, and insomnia, also reduce distractions. For instance, insomnia enables conscious analysis of the problem to be maintained by preventing sleep from disrupting such processes. Likewise, solitariness, lack of physical activity, and lack of appetite all eliminate sources of distraction, such as social interactions, navigation through the environment, and "oral activity", which would otherwise interrupt the processing of problem-related information.

Possibilities of depression as a dysregulated adaptation

Depression, especially in the modern context, may not necessarily be adaptive. The ability to feel pain and to experience depression are adaptive defense mechanisms, but when they are "too easily triggered, too intense, or long lasting", they can become "dysregulated". In such a case, defense mechanisms, too, can become diseases, such as "chronic pain or dehydration from diarrhea". Depression, which may be a similar kind of defense mechanism, may have become dysregulated as well.

Thus, unlike other evolutionary theories, this one sees depression as a maladaptive extreme of something that is beneficial in smaller amounts. In particular, one theory focuses on the personality trait neuroticism. Low amounts of neuroticism may increase a person's fitness through various processes, but too much may reduce fitness by, for example, causing recurrent depression. Thus, evolution will select for an optimal amount, and most people will have neuroticism near this amount. However, genetic variation continually occurs, and some people will have high neuroticism, which increases the risk of depression.
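
To make the trade-off concrete, here is a minimal numerical sketch of the "optimal amount" argument. The quadratic fitness function and its coefficients are purely illustrative assumptions, not estimates drawn from any study discussed here:

```python
# Hypothetical concave fitness function: a little neuroticism pays off
# (e.g., vigilance toward threats), but the costs (e.g., recurrent
# depression) grow faster than the benefits as the trait increases.
BENEFIT = 2.0   # illustrative benefit coefficient (arbitrary units)
COST = 1.0      # illustrative cost coefficient (arbitrary units)

def relative_fitness(n):
    """Relative fitness as a function of a neuroticism score n >= 0."""
    return BENEFIT * n - COST * n ** 2

optimum = BENEFIT / (2 * COST)   # where the quadratic peaks

for n in [i * 0.25 for i in range(9)]:   # trait values 0.0 .. 2.0
    print(f"neuroticism = {n:.2f} -> relative fitness = {relative_fitness(n):+.2f}")

print(f"fitness peaks near n = {optimum:.2f}; values well above it are maladaptive,")
print("yet ongoing mutation keeps regenerating such extremes in the population.")
```

On this toy curve, selection pushes the population mean toward the intermediate optimum, while individuals far above it correspond to the high-neuroticism, depression-prone tail described above.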

Rank theory

Rank theory is the hypothesis that, if an individual is involved in a lengthy fight for dominance in a social group and is clearly losing, then depression causes the individual to back down and accept the submissive role. In doing so, the individual is protected from unnecessary harm. In this way, depression helps maintain a social hierarchy. This theory is a special case of a more general theory derived from the psychic pain hypothesis: that the cognitive response that produces modern-day depression evolved as a mechanism that allows people to assess whether they are in pursuit of an unreachable goal, and if they are, to motivate them to desist.

Social risk hypothesis

This hypothesis is similar to the social rank hypothesis but focuses more on the importance of avoiding exclusion from social groups, rather than direct dominance contests. The fitness benefits of forming cooperative bonds with others have long been recognised—during the Pleistocene period, for instance, social ties were vital for food foraging and finding protection from predators.

As such, depression is seen to represent an adaptive, risk-averse response to the threat of exclusion from social relationships that would have had a critical impact on the survival and reproductive success of our ancestors. Multiple lines of evidence on the mechanisms and phenomenology of depression suggest that mild to moderate (or "normative") depressed states preserve an individual's inclusion in key social contexts via three intersecting features: a heightened cognitive sensitivity to social risks and situations (e.g., "depressive realism"); the inhibition of confident and competitive behaviours that are likely to put the individual at further risk of conflict or exclusion (as indicated by symptoms such as low self-esteem and social withdrawal); and signalling behaviours directed toward significant others to elicit more of their support (e.g., the so-called "cry for help"). According to this view, the severe cases of depression captured by clinical diagnoses reflect the maladaptive dysregulation of this mechanism, which may partly be due to the uncertainty and competitiveness of the modern, globalised world.

Honest signaling theory

Another reason depression is thought to be a pathology is that key symptoms, such as loss of interest in virtually all activities, are extremely costly to the sufferer. Biologists and economists have proposed, however, that signals with inherent costs can credibly signal information when there are conflicts of interest. In the wake of a serious negative life event, such as those that have been implicated in depression (e.g., death, divorce), "cheap" signals of need, such as crying, might not be believed when social partners have conflicts of interest. The symptoms of major depression, such as loss of interest in virtually all activities and suicidality, are inherently costly, but, as costly signaling theory requires, the costs differ for individuals in different states. For individuals who are not genuinely in need, the fitness cost of major depression is very high because it threatens the flow of fitness benefits. For individuals who are in genuine need, however, the fitness cost of major depression is low, because the individual is not generating many fitness benefits. Thus, only an individual in genuine need can afford to suffer major depression. Major depression therefore serves as an honest, or credible, signal of need.
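
The separating logic of this argument can be illustrated with a toy payoff calculation; the benefit and cost figures below are invented for the sketch and do not come from the signaling literature summarized above:

```python
# Toy costly-signaling check: a costly signal (here, a depressive episode)
# is credible only if it is worth its cost for individuals genuinely in need
# but not worth it for individuals who are not.
HELP_BENEFIT = 4.0                       # fitness gained if social partners respond
SIGNAL_COST = {
    "genuine need": 1.0,                 # little ongoing fitness to forgo
    "no real need": 6.0,                 # the episode interrupts a productive life
}

def pays_to_signal(state):
    """True if sending the costly signal yields a net fitness gain in this state."""
    return HELP_BENEFIT - SIGNAL_COST[state] > 0

for state, cost in SIGNAL_COST.items():
    verdict = "signals (the cost is worth it)" if pays_to_signal(state) else "stays silent"
    print(f"{state}: cost {cost}, benefit {HELP_BENEFIT} -> {verdict}")
```

Because only the genuinely needy individual comes out ahead by signalling, skeptical partners can treat the signal as honest, which is the condition costly signaling theory requires.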

For example, individuals suffering a severe loss such as the death of a spouse are often in need of help and assistance from others. Such individuals who have few conflicts with their social partners are predicted to experience grief—a means, in part, to signal need to others. Such individuals who have many conflicts with their social partners, in contrast, are predicted to experience depression—a means, in part, to credibly signal need to others who might be skeptical that the need is genuine.

Bargaining theory

Depression is not only costly to the sufferer, it also imposes a significant burden on family, friends, and society at large—yet another reason it is thought to be pathological. Yet if sufferers of depression have real but unmet needs, they might have to provide an incentive to others to address those needs.

The bargaining theory of depression is similar to the honest signaling, niche change, and social navigation theories of depression described below. It draws on economists' theories of labor strikes and adds one element to honest signaling theory: the fitness of social partners is generally correlated. When a wife suffers depression and reduces her investment in offspring, for example, the husband's fitness is also put at risk. Thus, not only do the symptoms of major depression serve as costly and therefore honest signals of need, they also compel reluctant social partners to respond to that need in order to prevent their own fitness from being reduced. This explanation for depression has been challenged. Depression decreases the joint product of the family or group, as the husband or helper only partially compensates for the loss of productivity by the depressed person. Instead of becoming depressed, the person could break their own leg and gain help from the social group, but this is obviously a counterproductive strategy. And the lack of a sex drive certainly does not improve marital relations or fitness.

Social navigation or niche change theory

The social navigation or niche change hypothesis proposes that depression is a social navigation adaptation of last resort, designed especially to help individuals overcome costly, complex contractual constraints on their social niche. The hypothesis combines the analytical rumination and bargaining hypotheses and suggests that depression, operationally defined as a combination of prolonged anhedonia and psychomotor retardation or agitation, provides a focused sober perspective on socially imposed constraints hindering a person's pursuit of major fitness enhancing projects. Simultaneously, publicly displayed symptoms, which reduce the depressive's ability to conduct basic life activities, serve as a social signal of need; the signal's costliness for the depressive certifies its honesty. Finally, for social partners who find it uneconomical to respond helpfully to an honest signal of need, the same depressive symptoms also have the potential to extort relevant concessions and compromises. Depression's extortionary power comes from the fact that it retards the flow of just those goods and services such partners have come to expect from the depressive under status quo socioeconomic arrangements.

Thus depression may be a social adaptation especially useful in motivating a variety of social partners, all at once, to help the depressive initiate major fitness-enhancing changes in their socioeconomic life. There are diverse circumstances under which this may become necessary in human social life, ranging from loss of rank or a key social ally which makes the current social niche uneconomic to having a set of creative new ideas about how to make a livelihood which begs for a new niche. The social navigation hypothesis emphasizes that an individual can become tightly ensnared in an overly restrictive matrix of social exchange contracts, and that this situation sometimes necessitates a radical contractual upheaval that is beyond conventional methods of negotiation. Regarding the treatment of depression, this hypothesis calls into question any assumptions by the clinician that the typical cause of depression is related to maladaptive perverted thinking processes or other purely endogenous sources. The social navigation hypothesis calls instead for analysis of the depressive's talents and dreams, identification of relevant social constraints (especially those with a relatively diffuse non-point source within the social network of the depressive), and practical social problem-solving therapy designed to relax those constraints enough to allow the depressive to move forward with their life under an improved set of social contracts. This theory has been the subject of criticism.

Depression as an incentive device

This approach argues that being in a depressed state is not itself adaptive (indeed quite the opposite), but that the threat of depression for bad outcomes and the promise of pleasure for good outcomes are adaptive because they motivate the individual to undertake efforts that increase fitness. The reason for not relying on pleasure alone as an incentive device is that happiness is costly in terms of fitness, as the individual becomes less cautious. This is most readily seen when an individual is manic and undertakes very risky behavior. The physiological manifestation of the incentives is most noticeable when an individual is bipolar, with bouts of extreme elation and extreme depression, as anxiety, which concerns the (possibly immediate) future, is highly correlated with being bipolar. As noted earlier, bipolar disorder and clinical depression, as opposed to event depression, are viewed as dysregulation, just as persistently high (or low) blood pressure is viewed as dysregulation even though at times high or low blood pressure is fitness enhancing.

Prevention of infection

It has been hypothesized that depression is an evolutionary adaptation because it helps prevent infection in both the affected individual and his/her kin.

First, the associated symptoms of depression, such as inactivity and lethargy, encourage the affected individual to rest. Energy conserved in this way is crucial, as immune activation against infections is relatively costly; a rise in body temperature of even 1 °C, for instance, requires roughly a 10% increase in metabolic activity. Depression therefore allows one to conserve energy and allocate it to the immune system more efficiently.

Depression further prevents infection by discouraging social interactions and activities that may result in exchange of infections. For example, the loss of interest discourages one from engaging in sexual activity, which, in turn, prevents the exchange of sexually transmitted diseases. Similarly, depressed mothers may interact less with their children, reducing the probability of the mother infecting her kin. Lastly, the lack of appetite associated with depression may also reduce exposure to food-borne parasites.

However, it should also be noted that chronic illness itself may be involved in causing depression. In animal models, the prolonged overreaction of the immune system, in response to the strain of chronic disease, results in an increased production of cytokines (a diverse group of hormonal regulators and signaling molecules). Cytokines interact with neurotransmitter systems—mainly norepinephrine, dopamine, and serotonin—and induce depressive characteristics. The onset of depression may help an individual recover from their illness by allowing them a more reserved, safe and energetically efficient lifestyle. The overproduction of these cytokines, beyond optimal levels due to the repeated demands of dealing with a chronic disease, may result in clinical depression and its accompanying behavioral manifestations that promote extreme energy conservation.

The third ventricle hypothesis


The third ventricle hypothesis of depression proposes that the behavioural cluster associated with depression (hunched posture, avoidance of eye contact, reduced appetites for food and sex, plus social withdrawal and sleep disturbance) serves to reduce an individual's attack-provoking stimuli within the context of a chronically hostile social environment. It further proposes that this response is mediated by the acute release of an unknown (probably cytokine) inflammatory agent into the third ventricular space. In support of this suggestion, imaging studies reveal that the third ventricle is enlarged in depressives.

Reception

Clinical psychology and psychiatry have historically been relatively isolated from the field of evolutionary psychology. Some psychiatrists raise the concern that evolutionary psychologists seek to explain hidden adaptive advantages without engaging the rigorous empirical testing required to back up such claims. While there is strong research to suggest a genetic link to bipolar disorder and schizophrenia, there is significant debate within clinical psychology about the relative influence and the mediating role of cultural or environmental factors. For example, epidemiological research suggests that different cultural groups may have divergent rates of diagnosis, symptomatology, and expression of mental illnesses. There has also been increasing acknowledgment of culture-bound disorders, which may be viewed as an argument for an environmental versus genetic psychological adaptation. While certain mental disorders may have psychological traits that can be explained as 'adaptive' on an evolutionary scale, these disorders cause afflicted individuals significant emotional and psychological distress and negatively influence the stability of interpersonal relationships and day-to-day adaptive functioning.

 

Before the Dawn (book)

From Wikipedia, the free encyclopedia
 
Before the Dawn: Recovering the Lost History of Our Ancestors
Author: Nicholas Wade
Country: United States
Language: English
Subject: Human evolution
Publisher: Penguin Group
Publication date: 2006 (first edition, hardcover); 2007 (updated edition, paperback)
Media type: Print (hardcover and paperback)
ISBN: 1-59420-079-3 (hardcover); 978-0-14-303832-0 (paperback)
Dewey Decimal: 599.93'8-dc22
LC Class: GN281.W33 2006

Before the Dawn: Recovering the Lost History of Our Ancestors is a non-fiction book by Nicholas Wade, a science reporter for The New York Times. It was published in 2006 by the Penguin Group. By drawing upon research on the human genome, the book attempts to piece together what Wade calls "two vanished periods": the five million years of human evolution from the development of bipedalism leading up to behavioural modernity around 50,000 years ago, and the 45,000 subsequent years of prehistory.

Wade asserts that there is a clear continuity from the earlier apes of five million years ago to the anatomically modern humans who diverged from them, citing the genetic and social similarities between humans and chimpanzees. He attributes the divergence of the two species from a common ancestor to a change in their ecological niche; the ancestors of chimpanzees remained in the forests of equatorial Africa, whereas the ancestors of humans moved to open woodland and were exposed to different evolutionary pressures. Although Wade posits that much of human evolution can be attributed to the physical environment, he also believes that one of the major forces shaping evolution has been the nature of human society itself.

After humans migrated out of their ancestral environment of eastern Africa, they were exposed to new climates and challenges. Thus, Wade argues, human evolution did not end with behavioural modernity, but continued to be shaped by the different environments and lifestyles of each continent. While many adaptations happened in parallel across human populations, Wade believes that genetic isolation – either because of geography or hostile tribalism – also facilitated a degree of independent evolution, leading to genetic and cultural differentiation from the ancestral population and giving rise to different human races and languages.

The book received generally positive reviews, but some criticised the use of the term "race" and the implications of differences between them. In 2007, it won the Science in Society Journalism Award from the National Association of Science Writers.

Summary

Nicholas Wade divides Before the Dawn into twelve chapters, which are roughly in the chronological order of the human past. The first chapter, Genetics & Genesis, gives a general overview of the themes that are explored in the book. The central theme is that the human genome provides a record of the human past, including what Wade calls the "two vanished periods" of human evolution and prehistory. Through information from the human genome, Wade proclaims, it is possible to determine when humans lost their body hair and began to wear clothes, to track their migration out of Africa, to discover if they interbred with Neanderthals, and even to reconstruct the evolution of language.

Origin of humans and language

The second chapter, Metamorphosis, focuses on the evolutionary origins of humans around 5 million years ago in equatorial Africa. Wade suggests that the last common ancestor of humans and chimpanzees lived in forests. Some of them, due to a global climate change between 5 and 10 million years ago, left the shrinking forests and moved to open woodland, and this new ecological niche gave rise to the human lineage. A change in food availability led to an adaptation for the ability to eat meat, and this nutrition facilitated the evolution of a larger brain. The knuckle-walking of the common ancestor gave way to bipedalism, which is more efficient over longer distances. A larger brain in combination with freed-up hands culminated in the evolution of Homo habilis and the first use of tools around 2.5 million years ago, and the more humanlike and larger-brained Homo ergaster about 1.7 million years ago. The adaptations of H. ergaster to hot, dry climates included an external nose to condense moisture in exhaled air and minimise water loss, and the loss of body hair to allow sweating to cool the body and the larger brain. Wade writes that a mutation in the melanocortin receptor gene created an advantageous darkening of the pale, hairless skin. The close descendants of H. ergaster, Homo erectus and Homo heidelbergensis, migrated out of Africa and to Asia (around 1 to 1.66 million years ago) and Europe (around 500,000 years ago), respectively. In Europe, the glacial conditions around 300,000 to 400,000 years ago pressured H. heidelbergensis to evolve into Neanderthals.

The human lineage that remained in Africa eventually evolved into anatomically modern humans with modern-sized brains by about 200,000 years ago, and became common about 100,000 years ago, but did not become behaviourally modern until about 50,000 years ago. Citing the paleoanthropologist Richard Klein, Wade posits that such a great change must have been because of a neurological change, and was therefore genetic. This "genetic revolution", as Wade calls it, facilitated the emergence of language and thus the ability to share thoughts and innovations. Wade discusses the evolutionary origins of language in chapter three, First Words. He references Noam Chomsky's theory of "universal grammar" – which refers to both the hard-wiring of the brain that allows children to learn grammatical rules, and the underlying grammatical similarities of all human languages. Wade cites a number of evolutionary psychologists for an explanation, including Robin Dunbar, who argues that language evolved because it was a more efficient way of establishing social bonds than grooming; Geoffrey Miller, who suggests that speech was a signal of intelligence and thus evolved through sexual selection; and Steven Pinker, who thinks that the ecological niche of humans required the sharing of knowledge. Wade writes that the genetic basis of language is linked to the FOXP2 gene, as it shows signs of significant change in humans but not in chimpanzees, and that mutations of it cause severe speech disorders.

Ancestral humans and migration out of Africa

Chapter four, Eden, discusses the ancestral population of modern humans in Africa. Through the Y chromosome and mitochondrial DNA, Wade gives evidence to suggest that the ancestral population was no more than 5,000 to 10,000 individuals in an area corresponding to what is now Ethiopia. He supposes that the small population would have lived in close proximity, and likely spoke the same language. By looking at the click languages of the genetically "ancient" Khoisan peoples, and the fact that clicks are more often lost than gained in languages, Wade suggests that clicks were present in this ancestral language. To understand the nature of the ancestral population, Donald Brown's theory of "universal people" is raised; that is, the shared behaviours of all modern human societies. A small minority of this ancestral population, Wade continues in chapter five, Exodus, crossed the Gate of Grief and left Africa 50,000 years ago, following the coasts of India and the former continents of Sunda and Sahul. As they moved into the interior of Eurasia, they clashed with H. erectus and Neanderthals, eventually pushing them into refuges and ultimately to their extinction.

The wide dispersal of humans across varying environments with different evolutionary pressures, Wade contends, began to give rise to regional differentiation. He gives examples of variation in two genes related to brain development: an allele of microcephalin that appeared about 37,000 years ago and is common in Europeans and East Asians, but rare in sub-Saharan Africans; and an allele of ASPM that appeared about 6,000 years ago and is common in Europeans, Middle Easterners and to a lesser extent East Asians, but is nearly non-existent in sub-Saharan Africans. Wade believes that the rapid spread of these alleles conferred some cognitive advantage, and one that was enough to be favoured by natural selection. Wade continues to discuss the different human trajectories in chapter six, Stasis; he writes that while the humans across the Eurasian landmass were exposed to similarly harsh glacial conditions during the Upper Paleolithic, they gradually began to diverge between east and west. It was during this period that the dog was domesticated by humans living in Siberia, who in turn crossed the Bering land bridge and populated the Americas. Wade states that an adaptation in mitochondrial DNA for cold conditions possibly facilitated this migration. Furthermore, he theorises that the "mongoloid" skull and body type of East Asians and Native Americans were physical adaptations to the cold, and also partly the result of genetic drift, whereas light skin developed separately in East Asians and "caucasoid" Europeans to better enable them to synthesise vitamin D with sunlight.

Social evolution, emergence of human races and division of languages

Wade believes that large language families, such as Indo-European, may have spread through agriculture.

Chapter seven, Settlement, concerns sedentism – the transition from a nomadic lifestyle to a society which remains in one place permanently – which began to rise in the Near East at the end of the Last Glacial Maximum. It required new ways of thought and social organisation; Wade thinks that an evolutionary adaptation for less aggressiveness allowed this change, noting how the skeletons of the ancestral population were less gracile than those of today. Sedentism facilitated the development of agriculture, including the cultivation and domestication of wild cereals and animals. The domestication of cattle in northern Europe and parts of Africa facilitated the spread of a genetic mutation that allowed lactose tolerance, and Wade believes this is evidence of culture and evolution interacting. The following chapter, Sociality, focuses on the common dynamics of human societies, including warfare, religion, trade, and a division of roles between the sexes. Wade theorises that these institutions have an evolutionary basis, and looks at closely related primate societies – such as those of chimpanzees and bonobos – for evidence. Wade goes on to suggest that cannibalism may have been more common in the human past by noting a common genetic adaptation that protects against Creutzfeldt–Jakob disease, which is associated with the consumption of brain.

Wade writes that along with the ongoing social evolution that occurred after humans left Africa, the human physical form also continued to evolve. This is the subject of chapter nine, Race; because humans were spread across different continents, and distance and tribal hostility limited gene flow between them, they followed different evolutionary paths. Race is not well understood, he says, because its historical implications cause it to be avoided in modern academic studies. Wade states, however, that there are reasons to reconsider the study of it; the genetic differences between races may give evidence of the different evolutionary pressures they faced, and the differences may be medically relevant. Citing Neil Risch, Wade puts forth that there are five continental races – Africans, Caucasians, Asians, Pacific Islanders and Native Americans – which are made up of smaller subdivisions called ethnicities. He explicitly avoids discussing the cause of IQ differences between races, but hypothesises that racial differences may have an influence on sporting achievement. Chapter ten, Language, concerns the spread and division of languages. Wade believes that all languages ultimately came from an ancestral language, and that many of its descendants – such as Proto-Indo-European – possibly spread through agriculture. By citing linguists such as Joseph Greenberg, Wade shows that almost all of today's languages belong to families, which in turn may belong to superfamilies such as Eurasiatic.

Understanding history and predicting the future

In chapter eleven, History, Wade demonstrates how genetics can be related to recorded history. An example given is the substantial genetic legacy of Genghis Khan and his male relatives. History correlates with this, as writers of the period stated that he had hundreds of wives. Wade then covers the origins of the British; contrary to popular belief, he writes, neither the Anglo-Saxons nor the Vikings eradicated the indigenous population, as the Y chromosomes common to Celtic speakers are carried by a large percentage of the male population of Britain. Markers in these Y chromosomes can be linked to the Basques, and he suggests that the British and Irish descend from a refuge in Spain they shared during the Last Glacial Maximum.

Wade also discusses the origin of Jews; they descend from the Middle East through their Y chromosomes, but their mitochondrial DNA resembles that of their host countries. Citing Gregory Cochran and Henry Harpending, Wade argues that Ashkenazi Jews were historically forced into intellectually demanding occupations by their European hosts, and these selective pressures favoured genes that raised their intelligence. The consequence of this, however, was an increase in sphingolipid diseases. The final chapter, Evolution, gives a summary of human evolution from its origins to the present, and declares that it is unlikely that it will ever cease. Wade speculates where evolution will lead humans in the future, suggesting further skeletal gracilisation, increases in intelligence, adaptations to the changing climate, and even the possibility of speciation.

Background and publication

Author Nicholas Wade in 2005

Wade has written for The New York Times since 1981 as an editorial writer, science editor and a reporter, including articles supporting the idea of recent human evolution and racial differentiation. His motivation for writing Before the Dawn began during his reporting of genetics, particularly since the sequencing of the human genome in 2003, as he started to realise how it could be related to the human past. He sought to put all of the recent research of archaeology, paleoanthropology and linguistics together into one narrative revolving around genetics. Before the Dawn was published in hardcover in 2006, and then in an updated paperback in 2007, by the Penguin Group.

One of the principal themes of the book is the continual evolution of humans, especially since their migration out of Africa around 50,000 years ago. In an interview with American Scientist in 2006, Wade recognised the resistance against this idea in the fields of anthropology and archaeology, but thought that as "genetic evidence [forces a] reevaluation of the view that evolution stopped in the distant past", it would ease over time. More controversial, though, is the idea that human populations have diverged enough to be considered "races". When asked about the "idea of race and how it is often thought of as just a social construct", he replied:

Well, I think the subject of race has been so difficult and so polluted by malign ideas that most people have just left it alone, including geneticists. … Most genetic variation is neutral – it doesn't do anything for or against the phenotype, and evolution ignores it – so most previous attempts to look at race have concluded that there's little difference between races. I think this position is the one on which the social scientists are basing their position. … If you look at the genes that do make a difference, selected genes, which are a tiny handful of the whole, you do find a number of differences, not very many, but a number of interesting differences between races as to which genes have been selected. This, of course, makes a lot of sense, because once the human family dispersed from its homeland in Africa, people faced different environments on each continent, different climates, different evolutionary challenges, and each group adapted to its environment in its own way.

American Scientist responded by suggesting that the term "race" is associated with historical baggage, and asked if perhaps a different term should be used. Wade replied, saying:

I'm not sure how that will play out. The geneticists, if you read their papers, have long been using code words. They sort of dropped the term "race" about 1980 or earlier, and instead you see code words like "population" or "population structure." Now that they're able to define race in genetic terms they tend to use other words, like "continental groups" or "continent of origin," which does, indeed, correspond to the everyday conception of race. When I'm writing I prefer to use the word race because that's the word that everyone understands. It's a word with baggage, but it's not necessarily a malign word.

In Before the Dawn, Wade suggests that genetic differences between human populations, or races, may be responsible for differences in sporting achievement. He avoids the more controversial discussion of why there are differences in IQ between them (aside from the high IQ of Ashkenazi Jews), however, saying that "[t]his dispute, whose merits lie beyond the scope of this book, has long made the study of race controversial".

Reception

Biologist E. O. Wilson gave the book an especially positive review, which appears on the cover.

Before the Dawn received generally favourable reviews. E. O. Wilson, often known as the "father of sociobiology", proclaimed that it was "[b]y far the best book I have ever read on humanity's deep history." James Watson, co-discoverer of the DNA double helix and Nobel Prize winner, commended the book for providing a "masterful overview on how changes in our respective DNA lineages let us begin to understand how human beings have evolved from ancestral hunter-gatherer forebears into effective members of today's advanced human societies." Lionel Tiger, Professor of Anthropology at Rutgers University, stated that "Nicholas Wade has delivered an impeccable, fearless, responsible and absorbing account" and that the book is "[b]ound to be the gold standard in the field for a very long time." Similarly, The Washington Post columnist Richard Cohen declared that Wade "is a robust and refreshing critic of scientific political correctness."

Another positive review came from John Derbyshire, a former columnist for the National Review, who declared that:

Stricter adherents of the [standard social science model] will be scandalized by the inclusion of a chapter titled "Race," which according to them is a thing that does not exist, except in the diseased imaginations of "racists." Fiddlesticks, says Wade: Of course race exists. He proceeds to give a calm, factual account of what we know, again carefully rooting it all in the genetic evidence.

Derbyshire concluded:

Before the Dawn is beautifully done, a grand genealogy of modern humanity, rooted in fact but spiced with an appropriate measure of speculation and hypothesis. Even for a reader to whom the material is already familiar – one who, for example, has been following Nicholas Wade's reports in The New York Times – it is well worth the trouble of reading this book for its narrative value, for the elegant way Wade has put it all together as a single compelling story. This is a brilliant book, by one of our best science journalists.

Other positive reviews came from Publishers Weekly; Kirkus Reviews, which called the book "meaty, well-written"; and The New York Review of Books, which said that it was "on the whole, a fascinating account of recent scientific findings."

Craig Stanford, Professor of Biological Sciences and Anthropology at the University of Southern California, gave the book a generally positive review in American Scientist, but offered critique by suggesting that "[i]f there is a flaw in this tightly written, insightful book, it is that Wade provides perhaps too many of the stock examples of human evolution. In my view, he spends too much time and space attempting to convince the reader that we did indeed evolve from apes (duh!) and that our own social behavior and cognition have roots in the deep human past." Peter Dizikes of The New York Times also gave a mostly positive review, calling it a "timely and informative survey." However, he criticised Wade's claim that "the development of lactose tolerance shows broadly that 'genes respond to cultural changes'", and posited the alternative view that it is "a case of genes responding to an environmental change produced by society – the abundance of milk-producing cattle – and not to any abstract cultural practice." Dizikes also took issue with Wade's use of the word "race", saying that the "judgment that these regional genetic tendencies constitute 'races' has no deep scientific rationale, either. Such labels are generalizations situated atop a complicated intermingling of populations."

Contrastingly, the social anthropologists Kenneth M. Weiss and Anne V. Buchanan, writing in the journal Nature (of which Nicholas Wade was formerly deputy editor), gave the book a highly critical review, finding particular fault in the book's social and political implications:

Positions on genetic determinism often correlate with social politics, and few of us are neutral or even changeable on the issues. Wade recognizes that his ideas may not be acceptable to everyone but warns that "to falter in scientific inquiry would be a retreat into darkness". He seems to be warning, appropriately enough, against benighted political correctness. But we should never become casual about how comparable "slopular" science and very similar speculative evolutionary reasoning by leading scientists fed a venomous kind of darkness not too many decades ago. Wade's post-hoc tales often put him in step with a long march of social darwinists who, with comfortable detachment from the (currently) dominant culture, insist that we look starkly at life in the raw and not blink at what we see.

In 2007, the book was given the Science in Society Journalism Award by the National Association of Science Writers. The judges found that Wade's writing was "skillful" in putting together the many findings about human origins in an "engaging" way.

Recent human evolution

From Wikipedia, the free encyclopedia

Recent human evolution refers to evolutionary adaptation, sexual and natural selection, and genetic drift within Homo sapiens populations, since their separation and dispersal in the Middle Paleolithic about 50,000 years ago. Contrary to popular belief, not only are humans still evolving, their evolution since the dawn of agriculture is faster than ever before. It is possible that human culture—itself a selective force—has accelerated human evolution. With a sufficiently large data set and modern research methods, scientists can study the changes in the frequency of an allele occurring in a tiny subset of the population over a single lifetime, the shortest meaningful time scale in evolution. Comparing a given gene with that of other species enables geneticists to determine whether it is rapidly evolving in humans alone. For example, while human DNA is on average 98% identical to chimp DNA, the so-called Human Accelerated Region 1 (HAR1), involved in the development of the brain, is only 85% similar.

Following the peopling of Africa some 130,000 years ago, and the recent Out-of-Africa expansion some 70,000 to 50,000 years ago, some sub-populations of Homo sapiens have been geographically isolated for tens of thousands of years prior to the early modern Age of Discovery. Combined with archaic admixture, this has resulted in significant genetic variation, which in some instances has been shown to be the result of directional selection taking place over the past 15,000 years, which is significantly later than possible archaic admixture events. That the human populations living on different parts of the globe have been evolving on divergent trajectories reflects the different conditions of their habitats. Selection pressures were especially severe for populations affected by the Last Glacial Maximum (LGM) in Eurasia, and for sedentary farming populations since the Neolithic, or New Stone Age.

Single nucleotide polymorphisms (SNPs, pronounced 'snips') are mutations of a single genetic code "letter" in an allele that spread across a population. When they occur in functional parts of the genome, they can potentially modify virtually any conceivable trait, from height and eye color to susceptibility to diabetes and schizophrenia. Approximately 2% of the human genome codes for proteins and a slightly larger fraction is involved in gene regulation, but most of the rest of the genome has no known function. If the environment remains stable, a beneficial mutation will spread throughout the local population over many generations until it becomes the dominant variant. An extremely beneficial allele could become ubiquitous in a population in as little as a few centuries, whereas less advantageous ones typically take millennia.
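
As a rough illustration of these time scales, the standard deterministic selection recursion can be iterated directly. This is a simplified haploid model with arbitrary selection coefficients and an assumed 25-year generation time, not an estimate for any particular human allele:

```python
def generations_to_spread(s, p0=0.01, p_target=0.99):
    """Iterate the haploid selection recursion p' = p(1 + s) / (1 + p*s)
    until an allele with selective advantage s rises from frequency p0
    to p_target, returning the number of generations required."""
    p, generations = p0, 0
    while p < p_target:
        p = p * (1 + s) / (1 + p * s)
        generations += 1
    return generations

GENERATION_TIME = 25  # assumed years per human generation

# Stronger selection -> faster spread through the population.
for s in (0.01, 0.05, 0.20):
    g = generations_to_spread(s)
    print(f"s = {s:.2f}: ~{g} generations (~{g * GENERATION_TIME:,} years)")
```

With these illustrative numbers, a weakly favoured allele (s = 0.01) needs on the order of twenty thousand years to spread, while a very strongly favoured one (s = 0.20) approaches fixation on the order of a thousand years; genetic drift, dominance, and changing environments would all modify these figures.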

Human traits that emerged recently include the ability to free-dive for long periods of time, adaptations for living in high altitudes where oxygen concentrations are low, resistance to contagious diseases (such as malaria), fair skin, blue eyes, lactase persistence (or the ability to digest milk after weaning), lower blood pressure and cholesterol levels, thick hair shaft, dry ear wax, lower chances of drunkenness, higher body-mass index, reduced prevalence of Alzheimer's disease, lower susceptibility to diabetes, genetic longevity, shrinking brain sizes, and changes in the timing of menarche and menopause.

Archaic admixture

Genetic evidence suggests that a species dubbed Homo heidelbergensis is the last common ancestor of Neanderthals, Denisovans, and Homo sapiens. This common ancestor lived between 600,000 and 750,000 years ago, likely in either Europe or Africa. Members of this species migrated throughout Europe, the Middle East, and Africa and became the Neanderthals in Western Asia and Europe, while another group moved further east and evolved into the Denisovans, named after Denisova Cave in Russia where the first known fossils of them were discovered. In Africa, members of this group eventually became anatomically modern humans. Migrations and geographical isolation notwithstanding, the three descendant groups of Homo heidelbergensis later met and interbred.

Reconstruction of a Neanderthal female.

DNA analysis reveals that modern-day Tibetans, Melanesians, and Australian Aboriginals carry about 3%-5% of Denisovan DNA. In addition, DNA analysis of Indonesians and Papua New Guineans indicates that Homo sapiens and Denisovans interbred as recently as between 15,000 and 30,000 years ago.

Archaeological research suggests that as prehistoric humans swept across Europe 45,000 years ago, Neanderthals went extinct. Even so, there is evidence of interbreeding between the two groups as humans expanded their presence in the continent. While prehistoric humans carried 3%-6% Neanderthal DNA, modern humans have only about 2%. This seems to suggest selection against Neanderthal-derived traits.  For example, the neighborhood of the gene FOXP2, affecting speech and language, shows no signs of Neanderthal inheritance whatsoever.

Introgression of genetic variants acquired by Neanderthal admixture has different distributions in Europeans and East Asians, pointing to differences in selective pressures. Though East Asians inherit more Neanderthal DNA than Europeans, East Asians, South Asians, and Europeans all share Neanderthal DNA, so hybridization likely occurred between Neanderthals and their common ancestors coming out of Africa. Their differences also suggest separate hybridization events for the ancestors of East Asians and other Eurasians.

Following the genome sequencing of three Vindija Neanderthals, a draft sequence of the Neanderthal genome was published and revealed that Neanderthals shared more alleles with Eurasian populations—such as French, Han Chinese, and Papua New Guinean—than with sub-Saharan African populations, such as Yoruba and San. According to the authors of the study, the observed excess of genetic similarity is best explained by recent gene flow from Neanderthals to modern humans after the migration out of Africa. But gene flow did not go only one way. The fact that some of the ancestors of modern humans in Europe migrated back into Africa means that modern Africans also carry some genetic material from Neanderthals. In particular, Africans share 7.2% Neanderthal DNA with Europeans but only 2% with East Asians.

Some climatic adaptations, such as high-altitude adaptation in humans, are thought to have been acquired by archaic admixture. The Sherpa people of Nepal, for instance, are believed to have inherited a variant of the gene EPAS1 from the Denisovans, which allows them to breathe easily at high altitudes. A 2014 study reported that Neanderthal-derived variants found in East Asian populations showed clustering in functional groups related to immune and haematopoietic pathways, while European populations showed clustering in functional groups related to the lipid catabolic process. A 2017 study found correlation of Neanderthal admixture in modern European populations with traits such as skin tone, hair color, height, sleeping patterns, mood and smoking addiction. A 2020 study of Africans unveiled Neanderthal haplotypes, or alleles that tend to be inherited together, linked to immunity and ultraviolet sensitivity. The promotion of beneficial traits acquired from admixture is known as adaptive introgression.

Upper Paleolithic, or the Late Stone Age (50,000 to 12,000 years ago)

Epicanthic eye folds are thought to be an adaptation for cold weather.

DNA analyses conducted since 2007 revealed the acceleration of evolution with regards to defenses against disease, skin color, nose shapes, hair color and type, and body shape since about 40,000 years ago, continuing a trend of active selection since humans emigrated from Africa 100,000 years ago. Humans living in colder climates tend to be more heavily built than those in warmer climates because having a smaller surface area relative to volume makes it easier to retain heat. People from warmer climates tend to have thicker lips, which have large surface areas that help them keep cool. With regards to nose shapes, humans residing in hot and dry places tend to have narrow and protruding noses in order to reduce loss of moisture. Humans living in hot and humid places tend to have flat and broad noses that moisturize inhaled air and retain moisture from exhaled air. Humans dwelling in cold and dry places tend to have small, narrow, and long noses in order to warm and moisturize inhaled air. As for hair types, humans from regions with colder climates tend to have straight hair so that the head and neck are kept warm. Straight hair also allows cool moisture to quickly fall off the head. On the other hand, tight and curly hair increases the exposed area of the scalp, easing the evaporation of sweat and allowing heat to be radiated away while keeping the hair itself off the neck and shoulders. Epicanthic eye folds are believed to be an adaptation protecting the eye from snow and reducing snow glare.

Physiological or phenotypical changes have been traced to Upper Paleolithic mutations, such as the East Asian variant of the EDAR gene, dated to about 35,000 years ago. Traits affected by the mutation are sweat glands, teeth, hair thickness and breast tissue. While Africans and Europeans carry the ancestral version of the gene, most East Asians have the mutated version. By testing the gene on mice, Yana G. Kamberov and Pardis C. Sabeti and their colleagues at the Broad Institute found that the mutated version brings thicker hair shafts, more sweat glands, and less breast tissue. East Asian women are known for having comparatively small breasts and East Asians in general tend to have thick hair. The research team calculated that this gene variant originated in Southern China, which was warm and humid, meaning that having more sweat glands would have been advantageous to the hunter-gatherers who lived there. Geneticist Joshua Akey suggested that the mutant gene could also have been favored by sexual selection, in that the visible traits associated with it made the individual carrying it more attractive to potential mates. A third explanation is offered by Kamberov, who argued that each of the traits due to the mutant gene could have been favored at different times. Today, the mutant version of EDAR is found in 93% of Han Chinese, about 70% of Japanese and Thai people, and between 60% and 90% of American Indians, whose ancestors came from East Asia.

The most recent Ice Age peaked in intensity between 19,000 and 25,000 years ago and ended about 12,000 years ago. As the glaciers that once covered Scandinavia all the way down to Northern France retreated, humans began returning to Northern Europe from the Southwest, in modern-day Spain. But about 14,000 years ago, humans from Southeastern Europe, especially Greece and Turkey, began migrating to the rest of the continent, displacing the first group of humans. Analysis of genomic data revealed that all Europeans since 37,000 years ago have descended from a single founding population that survived the Ice Age, with specimens found in various parts of the continent, such as Belgium. Although this human population was displaced 33,000 years ago, a genetically related group began spreading across Europe 19,000 years ago. Recent divergence of Eurasian lineages sped up significantly during the Last Glacial Maximum (LGM), the Mesolithic and the Neolithic, due to increased selection pressures and founder effects associated with migration. Alleles predictive of light skin have been found in Neanderthals, but the light-skin alleles of the genes KITLG and ASIP found in Europeans and East Asians are (as of 2012) thought to derive not from archaic admixture but from recent mutations since the LGM. Phenotypes associated with the white or Caucasian populations of Western Eurasian stock emerged during the LGM, from about 19,000 years ago. The light skin pigmentation characteristic of modern Europeans is estimated to have spread across Europe in a "selective sweep" during the Mesolithic (5,000 years ago). The associated TYRP1, SLC24A5 and SLC45A2 alleles emerged around 19,000 years ago, still during the LGM, most likely in the Caucasus. Within the last 20,000 years or so, light skin has been favored by natural selection in East Asia, Europe, and North America. Similarly, Southern Africans tend to have lighter skin than their equatorial counterparts. In general, people living at higher latitudes tend to have lighter skin. The HERC2 variation for blue eyes first appears around 14,000 years ago in Italy and the Caucasus.

Larger average cranial capacity is correlated with living in cold regions.

Inuit adaptation to a high-fat diet and a cold climate has been traced to a mutation dated to the Last Glacial Maximum (20,000 years ago). Average cranial capacity among modern male human populations varies in the range of 1,200 to 1,450 cm3. Larger cranial volumes are associated with cooler climatic regions, with the largest averages found in populations of Siberia and the Arctic. Humans living in Northern Asia and the Arctic have evolved the ability to develop thick layers of fat on their faces to keep warm. Moreover, the Inuit tend to have flat and broad faces, an adaptation that reduces the likelihood of frostbite. Both Neanderthals and Cro-Magnons had somewhat larger cranial volumes on average than modern Europeans, suggesting a relaxation of selection pressures for larger brain volume after the end of the LGM.

Australian Aboriginals living in the Central Desert, where the temperature can drop below freezing at night, have evolved the ability to reduce their core temperatures without shivering.

Early fossils of Homo sapiens suggest that members of the species had differently shaped brains 300,000 years ago compared to today; in particular, their brains were elongated rather than globular. Only fossils from 35,000 years ago or less share the same basic brain shape as that of current humans. Human brains also appear to have shrunk over the last twenty thousand years. Modern human brains are about 10% smaller than those of the Cro-Magnons, who lived in Europe twenty to thirty thousand years ago, a difference in volume comparable to that of a tennis ball. Scientists are unsure of the implications of this finding. On one hand, it could be that humans are becoming less intelligent as their societies become ever more complex, making it easier for individuals to survive. On the other hand, shrinking brain size could be associated with lower levels of aggression. In any case, evidence for the shrinking human brain can be observed in Africa, China, and Europe.

Even though it has long been thought that human culture (broadly defined as any learned behavior, including technology) has slowed down, if not halted, human evolution, biologists working in the early twenty-first century have concluded that human culture is itself a force of selection. Scans of the entire human genome suggest that large parts of it have been under active selection within the last 10,000 to 20,000 years or so, which is recent in evolutionary terms. Although the details of such genes remain unclear (as of 2010), they can still be categorized by likely function according to the structures of the proteins they encode. Many such genes are linked to the immune system, the skin, metabolism, digestion, bone development, hair growth, smell and taste, and brain function. Since the culture of behaviorally modern humans undergoes rapid change, it is possible that human culture has accelerated human evolution within the last 50,000 years or so. While this possibility remains unproven, mathematical models do suggest that gene-culture interactions can give rise to especially rapid biological evolution. If so, humans are evolving to adapt to selective pressures they created themselves.

Holocene (12,000 years ago to present)

Neolithic or New Stone Age

All blue-eyed humans share a common ancestor.

Blue eyes are an adaptation for living in regions where light is limited, because they allow more light to enter than brown eyes. A research program by geneticist Hans Eiberg and his team at the University of Copenhagen, running from the 1990s to the 2000s and investigating the origins of blue eyes, revealed that a mutation in the gene OCA2 is responsible for this trait. According to them, all humans initially had brown eyes, and the OCA2 mutation took place between 6,000 and 10,000 years ago. It dilutes the production of melanin, the pigment responsible for the color of human hair, eyes, and skin. The mutation does not completely switch off melanin production, however, as that would leave the individual with the condition known as albinism. Variation in eye color from brown to green can be explained by variation in the amount of melanin produced in the iris. While brown-eyed individuals share a large area in their DNA controlling melanin production, blue-eyed individuals have only a small region. By examining mitochondrial DNA of people from multiple countries, Eiberg and his team concluded that blue-eyed individuals all share a common ancestor.

In 2018, an international team of researchers from Israel and the United States announced that their genetic analysis of 6,500-year-old human remains excavated in Israel's Upper Galilee region revealed a number of traits not found in the humans who had previously inhabited the area, including blue eyes. They concluded that the region experienced a significant demographic shift 6,000 years ago due to migration from Anatolia and the Zagros mountains (in modern-day Turkey and Iran) and that this change contributed to the development of the Chalcolithic culture in the region.

In 2006, population geneticist Jonathan Pritchard and his colleagues studied the populations of Africa, East Asia, and Europe and identified some 700 regions of the human genome as having been shaped by natural selection between 15,000 and 5,000 years ago. These genes affect the senses of smell and taste, skin color, digestion, bone structure, and brain function. According to Spencer Wells, director of the Genographic Project of the National Geographic Society, such a study helps anthropologists explain in detail why peoples from different parts of the globe can be so strikingly different in appearance even though most of their DNA is identical.

The advent of agriculture has played a key role in the evolutionary history of humanity. Early farming communities benefited from new and comparatively stable sources of food, but were also exposed to new and initially devastating diseases such as measles and smallpox. Eventually, genetic resistance to such diseases evolved, and humans living today are descendants of those who survived the agricultural revolution and reproduced. Disease is one of the strongest forces of evolution acting on Homo sapiens. As the species migrated throughout Africa and began colonizing new lands outside the continent around 100,000 years ago, its members came into contact with and helped spread a variety of pathogens with deadly consequences. In addition, the dawn of agriculture led to the rise of major disease outbreaks. Malaria is the oldest known human contagion, traced to West Africa around 100,000 years ago, before humans began migrating out of the continent. Malarial infections surged around 10,000 years ago, raising the selective pressure on affected populations and leading to the evolution of resistance.

A study by anthropologists John Hawks, Henry Harpending, Gregory Cochran, and colleagues suggests that human evolution has sped up significantly since the beginning of the Holocene, at an estimated pace around 100 times faster than during the Paleolithic, primarily in the farming populations of Eurasia. By this account, humans living in the twenty-first century are more different from their ancestors of 5,000 years ago than those ancestors were from the Neanderthals who went extinct around 30,000 years ago. The researchers tied this effect to new selection pressures arising from new diets, new modes of habitation, and immunological pressures related to the domestication of animals. For example, populations that cultivate rice, wheat, and other grains have gained an enhanced ability to digest starch thanks to the enzyme amylase, found in saliva. In addition, having a larger population means having more mutations, the raw material on which natural selection acts.

Hawks and colleagues scanned data from the International HapMap Project of Africans, Asians, and Europeans for SNPs and found evidence of evolution speeding up in 1800 genes, or 7% of the human genome. They also discovered that human populations in Africa, Asia, and Europe were evolving along divergent paths, becoming ever more different, and that there was very little gene flow among them. Most of the new traits are unique to their continent of origin.

Humans living in humid tropical areas show the fewest signs of recent evolution, suggesting that ancestral humans were already well suited to these places; only when humans migrated out of them did new selective pressures arise. Moreover, African populations have the highest amount of genetic diversity; the further one moves from Africa, the more genetically homogeneous people become. In fact, most of the variation in the human genome is due not to natural selection but to neutral mutations and the random shuffling of genes down the generations.

John Hawks reported evidence of recent evolution in the human brain within the last 5,000 years or so. Measurements of the skull suggest that the human brain has shrunk by about 150 cubic centimeters, or roughly ten percent. This is likely due to the growing specialization in societies centered on agriculture rather than hunting and gathering. More broadly, human brain size has been diminishing since at least 100,000 years ago, though the change was most significant within the last 12,000 years. About 100,000 years ago, the average brain size was about 1,500 cubic centimeters, compared to around 1,450 cubic centimeters 12,000 years ago and 1,350 today.

Examples of adaptations related to agriculture and animal domestication include the East Asian types of ADH1B associated with rice domestication, and lactase persistence.

About ten thousand years ago, the rice-cultivating residents of Southern China discovered that they could make alcoholic beverages by fermentation. Drunkenness likely became a serious threat to survival, and a mutant gene for alcohol dehydrogenase, an enzyme that breaks alcohol down into something safer and makes people's faces turn red, gradually spread throughout the rest of China.

Today, most Northwestern Europeans can drink milk after weaning.

Around 11,000 years ago, as agriculture was replacing hunting and gathering in the Middle East, people invented ways to reduce the concentration of lactose in milk by fermenting it into yogurt and cheese. At that time people lost the ability to digest lactose as they matured and so could not consume milk as adults. Thousands of years later, a genetic mutation enabled people then living in Europe to continue producing lactase, the enzyme that digests lactose, throughout their lives, allowing them to drink milk after weaning and to survive bad harvests.

These two key developments paved the way for communities of farmers and herders to rapidly displace the hunter-gatherers who once prevailed across Europe. Today, lactase persistence is found in 90% or more of the populations of Northwestern and Northern Central Europe, and in pockets of Western and Southeastern Africa, Saudi Arabia, and South Asia. It is not as common in Southern Europe (40%), because Neolithic farmers had already settled there before the mutation existed, and it is rather rare in inland Southeast Asia and Southern Africa. While all Europeans with lactase persistence share a common ancestor for this ability, pockets of lactase persistence outside Europe are likely due to separate mutations. The European mutation, called the LP allele, is traced to modern-day Hungary, 7,500 years ago. In the twenty-first century, about 35% of the human population is capable of digesting lactose after the age of seven or eight. Humans able to drink milk could produce up to 19% more fertile offspring than those without the ability, putting the mutation among those under the strongest selection known. As an example of gene-culture co-evolution, communities with lactase persistence and dairy farming took over Europe in several hundred generations, or a few thousand years. This raises a chicken-and-egg question: which came first, dairy farming or lactase persistence? To answer it, population geneticists examined DNA samples extracted from skeletons found at archeological sites in Germany, Hungary, Poland, and Lithuania dating from between 3,800 and 6,000 years ago. They found no evidence of the LP allele, indicating that Europeans began dairy farming before they gained the ability to drink milk after early childhood.
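The reported fertility advantage gives a rough sense of how quickly such an allele can spread. The following is a minimal sketch in Python, not the published analysis: it assumes a single locus, a dominant fitness advantage, and illustrative values for the selection coefficient (s = 0.05) and starting frequency (p0 = 0.001), none of which come from the studies cited above.

# Minimal sketch: deterministic spread of a beneficial dominant allele.
# Assumptions (illustrative only): carriers of at least one copy have
# relative fitness 1 + s; non-carriers have fitness 1. The values of
# s and p0 below are assumed, not estimates from the text.

def next_frequency(p, s):
    """One generation of selection on a dominant beneficial allele."""
    q = 1.0 - p
    # Mean fitness under random mating (Hardy-Weinberg genotype frequencies).
    w_mean = (p * p + 2 * p * q) * (1 + s) + q * q
    # Allele frequency after selection.
    return (p * p * (1 + s) + p * q * (1 + s)) / w_mean

def generations_to_reach(p0, target, s):
    p, gens = p0, 0
    while p < target:
        p = next_frequency(p, s)
        gens += 1
    return gens

if __name__ == "__main__":
    # With a 5% advantage, a rare allele can pass 70% frequency in
    # roughly two hundred generations, i.e. several thousand years.
    print(generations_to_reach(p0=0.001, target=0.7, s=0.05))

Even with this deliberately modest advantage, the allele sweeps to high frequency on the timescale described in the paragraph above, which is why lactase persistence is often cited as one of the strongest selective signals in the human genome.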

A Finnish research team reported that the European mutation that allows for lactase persistence is not found among the milk-drinking and dairy-farming Africans, however. Sarah Tishkoff and her students confirmed this by analyzing DNA samples from Tanzania, Kenya, and Sudan, where lactase persistence evolved independently. The uniformity of the mutations surrounding the lactase gene suggests that lactase persistence spread rapidly throughout this part of Africa. According to Tishkoff's data, this mutation first appeared between 3,000 and 7,000 years ago, and has been strongly favored by natural selection, more strongly than even resistance to malaria, in fact. In this part of the world, it provides some protection against drought and enables people to drink milk without diarrhea, which causes dehydration.

Lactase persistence is a rare ability among mammals. It is also a clear and simple example of convergent evolution in humans because it involves a single gene. Other examples of convergent evolution, such as the light skin of Europeans and East Asians or the various means of resistance to malaria, are much more complicated.

Humans evolved light skin after migrating from Africa to Europe and East Asia.

The shift towards settled communities based on farming was a significant cultural change, which in turn may have accelerated human evolution. Agriculture brought about an abundance of cereals, enabling women to wean their babies earlier and have more children over shorter periods of time. Despite the vulnerability of densely populated communities to diseases, this led to a population explosion and thus more genetic variation, the raw material on which natural selection acts. Diets in early agricultural communities were deficient in many nutrients, including vitamin D. This could be one reason why natural selection has favored fair skin among Europeans, as it increases UV absorption and synthesis of vitamin D.

Paleoanthropologist Richard G. Klein of Stanford University told the New York Times that while it was difficult to correlate a given genetic change with a specific archeological period, it was possible to identify a number of modifications as due to the rise of agriculture. Rice cultivation spread across China between 7,000 and 6,000 years ago and reached Europe at about the same time. Scientists have had trouble finding Chinese skeletons from before that period that resemble those of modern Chinese people, or European skeletons older than 10,000 years that resemble those of modern Europeans.

Among the genes Jonathan Pritchard and his team studied were five that influence complexion. Selected versions of these genes, thought to have first emerged 6,600 years ago, were found only among Europeans and were responsible for their pale skin. The consensus among anthropologists is that when the first anatomically modern humans arrived in Europe 45,000 years ago, they shared the dark skin of their African ancestors but eventually acquired lighter skin as an adaptation that helped them synthesize vitamin D using sunlight. This means either that Europeans acquired their light skin much more recently than assumed or that the change was the continuation of an earlier trend. Because East Asians are also pale, natural selection achieved the same result either by acting on different genes not detected by the test or by acting on the same genes thousands of years earlier, making such changes invisible to the test.

Non-human primates have little pigment in their skin because it is covered by fur. When humans lost their fur, enabling them to sweat efficiently, they needed dark skin to protect themselves against ultraviolet radiation. Later research revealed that the so-called golden gene, thus named because of the color it gives to zebrafish, is ubiquitous among Europeans but rare among East Asians, suggesting there was little gene flow between the two populations. Among East Asians, a different gene, DCT, likely contributed to their fair skin.

Bronze Age to Medieval Era

Sickle cell anemia is an adaptation against malaria.

Resistance to malaria is a well-known example of recent human evolution. This disease attacks humans early in life, so humans who are resistant enjoy a higher chance of surviving and reproducing. While humans have evolved multiple defenses against malaria, sickle cell anemia (a condition in which red blood cells are deformed into sickle shapes, thereby restricting blood flow) is perhaps the best known. Sickle cell anemia makes it more difficult for the malarial parasite to infect red blood cells. This mechanism of defense against malaria emerged independently in Africa and in Pakistan and India, and within 4,000 years it spread to 10-15% of the populations of these places. Another mutation that enables humans to resist malaria, and that is strongly favored by natural selection and has spread rapidly in Africa, is the inability to synthesize the enzyme glucose-6-phosphate dehydrogenase, or G6PD.

A combination of poor sanitation and high population density proved ideal for the spread of contagious diseases, which were deadly to the residents of ancient cities. Evolutionary thinking would suggest that people living in places with long-standing urbanization dating back millennia would have evolved resistance to certain diseases, such as tuberculosis and leprosy. Using DNA analysis and archeological findings, scientists from University College London and Royal Holloway studied samples from 17 sites in Europe, Asia, and Africa. They found that, indeed, long-term exposure to pathogens has led to resistance spreading across urban populations, making urbanization a selective force that has influenced human evolution. The allele in question is named SLC11A1 1729+55del4. Among the residents of places that have been settled for thousands of years, such as Susa in Iran, this allele is ubiquitous, whereas in places with just a few centuries of urbanization, such as Yakutsk in Siberia, only 70-80% of the population carry it.

Adaptations have also been found in modern populations living in extreme climatic conditions such as the Arctic, as well as immunological adaptations such as resistance against brain disease in populations practicing mortuary cannibalism, the consumption of human corpses. The Inuit are able to thrive on lipid-rich diets of Arctic mammals. Human populations living at high altitudes, such as on the Tibetan Plateau and in Ethiopia and the Andes, benefit from a mutation that enhances the concentration of oxygen in their blood. This is achieved by having more capillaries, increasing their capacity for carrying oxygen. The mutation is believed to be around 3,000 years old.

Geneticist Ryosuke Kimura and his team at the Tokai University School of Medicine found that a variant of the gene EDAR, practically absent among Europeans and Africans but common among East Asians, gives rise to thicker hair, presumably as an adaptation to the cold. Kohichiro Yoshihura and his team at Nagasaki University found that a variant of the gene ABCC11 produces dry ear wax among East Asians. Africans and Europeans, by contrast, share the older version of the gene, which produces wet ear wax. It is not known what evolutionary advantage, if any, wet ear wax confers, so the East Asian variant was likely selected for some other trait, such as making people sweat less. What scientists do know is that dry ear wax is strongly favored by natural selection in East Asia.

The Sama-Bajau have evolved to become durable free divers.

A recent adaptation has been proposed for the Austronesian Sama-Bajau, also known as the Sea Gypsies or Sea Nomads, thought to have developed over the past thousand years or so under selection pressures associated with subsisting on free-diving. As maritime hunter-gatherers, they depend for their survival on the ability to dive for long periods of time. Due to the mammalian dive reflex, the spleen contracts when a mammal dives, releasing oxygen-carrying red blood cells. Because free-diving is dangerous, individuals with larger spleens were, over time, more likely to survive and thrive. By contrast, communities centered on farming show no signs of evolving larger spleens. Because the Sama-Bajau show no interest in abandoning this lifestyle, further adaptation is likely to continue.

Advances in the biology of genomes have enabled geneticists to investigate the course of human evolution within centuries or even decades. Jonathan Pritchard and a postdoctoral fellow, Yair Field, found a way to track changes in the frequency of an allele using huge genomic data sets. They did this by counting singletons, changes of single DNA bases that are likely to be recent because they are rare and have not spread throughout the population. Since alleles bring neighboring DNA regions with them as they move around the genome, the number of singletons near an allele can be used to roughly estimate how quickly that allele has changed in frequency. This approach can unveil evolution within the last 2,000 years, or about a hundred human generations. Armed with this technique and data from the UK10K project, Pritchard and his team found that alleles for lactase persistence, blond hair, and blue eyes have spread rapidly among Britons within the last two millennia or so. Britain's cloudy skies may have played a role, in that the genes for fair hair can also bring fair skin, reducing the chances of vitamin D deficiency. Sexual selection could play a role, too, driven by a preference for mates with blond hair and blue eyes. The technique also enabled them to track the selection of polygenic traits, those affected by a multitude of genes rather than just one, such as height, infant head circumference, and female hip size (crucial for giving birth). They found that natural selection has been favoring increased height and larger head and female hip sizes among Britons. Moreover, lactase persistence showed signs of active selection during the same period. However, evidence for the selection of polygenic traits is weaker than that for traits affected by a single gene.
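To make the singleton-counting logic concrete, here is a minimal Python sketch of the general intuition only, not Field and Pritchard's published statistic: haplotypes carrying an allele that rose in frequency recently are young, and young haplotypes have had little time to accumulate new singleton mutations nearby. The haplotype matrix, window size, and random toy data are all assumptions made for illustration.

# Illustrative sketch, not the published Singleton Density Score method.
# `haplotypes` is a hypothetical 0/1 matrix: rows = haplotypes, columns = sites.

import numpy as np

def singleton_deficit(haplotypes, focal_site, window):
    """Compare singleton counts near a focal site between carriers and
    non-carriers of the derived allele. A clearly negative value hints at
    a recent, rapid rise of the derived allele (illustration only)."""
    counts = haplotypes.sum(axis=0)            # allele count at each site
    is_singleton = counts == 1                 # variants seen exactly once
    lo, hi = max(0, focal_site - window), focal_site + window + 1
    carriers = haplotypes[:, focal_site] == 1
    near = haplotypes[:, lo:hi] & is_singleton[lo:hi]
    per_hap = near.sum(axis=1)                 # nearby singletons per haplotype
    return per_hap[carriers].mean() - per_hap[~carriers].mean()

# Toy usage with random data (no biological meaning):
rng = np.random.default_rng(0)
haps = (rng.random((200, 1001)) < 0.05).astype(int)
print(singleton_deficit(haps, focal_site=500, window=100))

With random data the statistic hovers around zero; in real data, a pronounced deficit of singletons around carriers of a variant is the kind of signal the approach described above is designed to pick up.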

A 2012 paper studied the DNA sequences of around 6,500 Americans of European and African descent and confirmed earlier work indicating that the majority of changes to a single letter in the sequence (single nucleotide variants) accumulated within the last 5,000-10,000 years. Almost three quarters arose in the last 5,000 years or so. About 14% of the variants are potentially harmful, and among those, 86% were 5,000 years old or younger. The researchers also found that European Americans had accumulated a much larger number of mutations than African Americans, likely a consequence of their ancestors' migration out of Africa, which resulted in a genetic bottleneck; there were few mates available. Despite the subsequent exponential growth in population, natural selection has not had enough time to eradicate the harmful mutations. While humans today carry far more mutations than their ancestors did 5,000 years ago, they are not necessarily more vulnerable to illnesses, because these can be caused by multiple mutations. The finding does, however, support earlier research suggesting that common diseases are not caused by common gene variants. In any case, the fact that the human gene pool has accumulated so many mutations over such a short period of time (in evolutionary terms) and that the human population has exploded in that time means that humanity is more evolvable than ever before. Natural selection might eventually catch up with the variation in the gene pool, as theoretical models suggest that evolutionary pressures increase as a function of population size.

Industrial Revolution to present

Even though modern healthcare reduces infant mortality rates and extends life expectancy, natural selection continues to act on humans.

Geneticist Steve Jones told the BBC that during the sixteenth century, only a third of English babies survived to the age of 21, compared with 99% in the twenty-first century. Medical advances, especially those made in the twentieth century, made this change possible. Yet while people in the developed world today are living longer and healthier lives, many are choosing to have only a few children or none at all, meaning that evolutionary forces continue to act on the human gene pool, just in a different way.

While modern medicine appears to shield humanity from the pressures of natural selection, it does not prevent other evolutionary processes from taking place. According to the neutral theory of molecular evolution, natural selection affects only 8% of the human genome, meaning that mutations in the remaining parts can change in frequency by pure chance. If natural selective pressures are reduced, traits that are normally purged are not removed as quickly, which could increase their frequency and speed up evolution. There is evidence that the rate of human mutation is rising. For humans, the largest source of heritable mutations is sperm; a man accumulates more and more mutations in his sperm as he ages. Hence, men delaying reproduction can affect human evolution. The accumulation of so many mutations in a short period of time could pose genetic problems for future human generations.
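The "pure chance" part of this argument is genetic drift, which can be illustrated with a minimal Wright-Fisher-style simulation. The sketch below is illustrative only; the population size, starting frequency, and number of generations are assumed values, not figures from any study cited here.

# Minimal illustration of genetic drift: a neutral allele (one that natural
# selection ignores) still wanders in frequency from generation to generation
# purely by chance in a finite population. All parameters are assumed.

import random

def wright_fisher(p0, pop_size, generations, seed=1):
    """Binomial (Wright-Fisher) resampling of a neutral allele."""
    random.seed(seed)
    p = p0
    trajectory = [p]
    for _ in range(generations):
        # Each of the 2N gene copies in the next generation is drawn at
        # random from the current gene pool.
        copies = sum(random.random() < p for _ in range(2 * pop_size))
        p = copies / (2 * pop_size)
        trajectory.append(p)
    return trajectory

if __name__ == "__main__":
    traj = wright_fisher(p0=0.5, pop_size=500, generations=200)
    print(traj[0], traj[-1])  # the frequency drifts away from 0.5 by chance

The smaller the population, the larger these chance fluctuations, which is one reason relaxed selection can let normally rare variants become more common.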

A 2012 study led by Augustine Kong suggests that the number of de novo (new) mutations increases by about two per year of delayed reproduction by the father and that the total number of paternal mutations doubles every 16.5 years.
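As a back-of-the-envelope illustration of what those two reported figures imply, the Python snippet below compares a linear rule (about two extra mutations per year) with a doubling rule (doubling every 16.5 years). The baseline of roughly 25 paternal mutations for a 20-year-old father is an assumed round number for illustration, not a figure from the study.

# Illustrative only; baseline and baseline_age are assumed values.

def paternal_mutations_linear(age, baseline=25.0, baseline_age=20.0, per_year=2.0):
    """Linear rule: about two extra de novo mutations per year of delay."""
    return baseline + per_year * (age - baseline_age)

def paternal_mutations_doubling(age, baseline=25.0, baseline_age=20.0, doubling_time=16.5):
    """Exponential rule: paternal mutations double every 16.5 years."""
    return baseline * 2 ** ((age - baseline_age) / doubling_time)

for age in (20, 30, 40):
    print(age, paternal_mutations_linear(age), round(paternal_mutations_doubling(age), 1))

Under these assumptions, a father of 40 would transmit roughly two to three times as many new mutations as a father of 20, which is the effect the study links to delayed reproduction.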

Dependence on modern medicine itself is another evolutionary time bomb. For a long time, it has reduced the fatality of genetic defects and contagious diseases, allowing more and more humans to survive and reproduce, but it has also enabled maladaptive traits that would otherwise be culled to accumulate in the gene pool. This is not a problem as long as access to modern healthcare is maintained. But natural selective pressures will mount considerably if that is taken away. Nevertheless, dependence on medicine rather than genetic adaptations will likely be the driving force behind humanity's fight against diseases for the foreseeable future. Moreover, while the introduction of antibiotics initially reduced the mortality rates due to infectious diseases by significant amounts, abuse has led to the rise of resistant strains of bacteria, making many illnesses major causes of death once again.

Human jaws and teeth have been shrinking in proportion with the decrease in body size in the last 30,000 years as a result of new diets and technology. There are many individuals today who do not have enough space in their mouths for their third molars (or wisdom teeth) due to reduced jaw sizes. In the twentieth century, the trend toward smaller teeth appeared to have been slightly reversed due to the introduction of fluoride, which thickens dental enamel, thereby enlarging the teeth.

In the middle of the eighteenth century, the average height of Dutch soldiers was 165 cm, well below the European and American averages. Over the following 150 years, however, the Dutch gained an average of 20 cm in height while Americans gained only 6 cm. This has been attributed to the fact that tall Dutchmen on average had more children than short ones, as Dutchwomen found them more attractive, and that while tall Dutchwomen on average had fewer children than those of medium height, they still had more children than short women. On this account, good nutrition and good healthcare did not play as important a role as biological evolution. By contrast, in some other countries, such as the United States, men of average height and short women tended to have more children.

Recent research suggests that menopause is evolving to occur later. Other reported trends appear to include lengthening of the human reproductive period and reduction in cholesterol levels, blood glucose and blood pressure in some populations.

Population geneticist Emmanuel Milot and his team studied recent human evolution on an isolated Canadian island using 140 years of church records. They found that selection favored a younger age at first birth among women. In particular, the average age at first birth of women from Coudres Island (Île aux Coudres), 80 km northeast of Québec City, decreased by four years between 1800 and 1930. Women who started having children sooner generally ended up with more children in total who survived to adulthood. In other words, for these French-Canadian women, reproductive success was associated with a lower age at first childbirth. Maternal age at first birth is a highly heritable trait.

Human evolution continues during the modern era, including among industrialized nations. Things like access to contraception and the freedom from predators do not stop natural selection. Among developed countries, where life expectancy is high and infant mortality rates are low, selective pressures are the strongest on traits that influence the number of children a human has. It is speculated that alleles influencing sexual behavior would be subject to strong selection, though the details of how genes can affect said behavior remain unclear.

Historically, as a by-product of the ability to walk upright, humans evolved narrower hips and birth canals while also evolving larger heads. Compared to close relatives such as chimpanzees, childbirth is therefore a highly challenging and potentially fatal experience for humans, and an evolutionary tug-of-war began. For babies, having larger heads proved beneficial as long as their mothers' hips were wide enough; if not, both mother and child typically died. This is an example of stabilizing selection, the removal of extreme traits: in this case, heads that were too large or too small were selected against. The tug-of-war reached an equilibrium, keeping these traits more or less constant over time while allowing genetic variation to persist, thus paving the way for rapid evolution should selective forces shift direction.

All this changed in the twentieth century as Cesarean sections (or C-sections) became safer and more common in some parts of the world. Larger head sizes continue to be favored while selective pressures against smaller hip sizes have diminished. Projecting forward, this means that human heads would continue to grow while hip sizes would not. As a result of increasing fetopelvic disproportion, C-sections would become more and more common in a positive feedback loop, though not necessarily to the extent that natural childbirth would become obsolete.

Paleoanthropologist Briana Pobiner of the Smithsonian Institution noted that cultural factors could play a role in the widely different rates of C-sections across the developed and developing worlds. Daghni Rajasingam of the Royal College of Obstetricians observed that the increasing rates of diabetes and obesity among women of reproductive age also boost the demand for C-sections. Biologist Philipp Mitteroecker of the University of Vienna and his team estimated that about six percent of all births worldwide are obstructed and require medical intervention. In the United Kingdom, one quarter of all births involve a C-section, while in the United States the figure is one in three. Mitteroecker and colleagues estimated that the rate of obstructed births has risen 10% to 20% since the mid-twentieth century. They argued that because the availability of safe Cesarean sections has significantly reduced maternal and infant mortality rates in the developed world, it has induced an evolutionary change. However, "It's not easy to foresee what this will mean for the future of humans and birth," Mitteroecker told The Independent, because the increase in baby size is limited by the mother's metabolic capacity and by modern medicine, which makes it more likely that neonates born prematurely or underweight will survive.

Westerners are evolving to have lower blood pressures because their modern diets contain high amounts of salt (NaCl), which raises blood pressure.

Researchers participating in the Framingham Heart Study, which began in 1948 and was intended to investigate the causes of heart disease among women in Framingham, Massachusetts, and their descendants, found evidence of selective pressure against high blood pressure due to the modern Western diet, which contains high amounts of salt, known to raise blood pressure. They also found evidence of selection against hypercholesterolemia, or high levels of cholesterol in the blood. Evolutionary geneticist Stephen Stearns and his colleagues reported signs that the women were gradually becoming shorter and heavier. Stearns argued that human culture and the changes humans have made to their natural environments are driving human evolution rather than bringing the process to a halt. The data indicate that the women were not eating more; rather, the ones who were heavier tended to have more children. Stearns and his team also discovered that the subjects of the study tended to reach menopause later; they estimated that if the environment remains the same, the average age at menopause will increase by about a year in 200 years, or about ten generations. All these traits have medium to high heritability. Given the starting date of the study, the spread of these adaptations can be observed within just a few generations.
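Projections of this kind are commonly reasoned about with the standard breeder's equation, R = h^2 * S, where the per-generation response R equals the narrow-sense heritability h^2 times the selection differential S. The sketch below uses assumed illustrative numbers, not the Framingham estimates, simply to show how a modest heritable trait can shift by about a year over ten generations.

# Illustrative use of the breeder's equation R = h^2 * S.
# heritability = 0.5 and selection_differential = 0.2 years per generation
# are assumed values for illustration, not estimates from the study.

def response_per_generation(heritability, selection_differential):
    """Expected per-generation change in the trait mean."""
    return heritability * selection_differential

r = response_per_generation(heritability=0.5, selection_differential=0.2)
print(r, r * 10)  # about 0.1 years per generation, roughly 1 year in 10 generations

Under these assumed inputs the predicted shift is on the order of a year per ten generations, the same order of magnitude as the menopause projection quoted above.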

By analyzing genomic data from 60,000 individuals of Caucasian descent from Kaiser Permanente in Northern California and 150,000 people from the UK Biobank, evolutionary geneticist Joseph Pickrell and evolutionary biologist Molly Przeworski were able to identify signs of biological evolution among living human generations. For the purposes of studying evolution, one lifetime is the shortest possible time scale. An allele associated with difficulty quitting tobacco smoking dropped in frequency among the British but not among the Northern Californians, suggesting that heavy smokers, who were common in Britain during the 1950s but not in Northern California, were selected against. A set of alleles linked to later menarche was more common among women who lived longer. An allele called ApoE4, linked to Alzheimer's disease, fell in frequency, as carriers tended not to live very long. These were the only traits reducing life expectancy that Pickrell and Przeworski found, which suggests that other harmful traits have probably already been eliminated. The effects of Alzheimer's disease and smoking are visible only among older people, and smoking is a relatively recent trend. It is not entirely clear why such traits bring evolutionary disadvantages, however, since older people have already had children. Scientists proposed either that they also bring harmful effects in youth or that they reduce an individual's inclusive fitness, the component of fitness that comes from helping relatives who share the same genes; mutations that make it difficult for grandparents to help raise their grandchildren are thus unlikely to propagate throughout the population. Pickrell and Przeworski also investigated 42 traits determined by multiple alleles rather than just one, such as the timing of puberty. They found that later puberty and older age at first birth were correlated with higher life expectancy.

Larger sample sizes allow for the study of rarer mutations. Pickrell and Przeworski told The Atlantic that a sample of half a million individuals would enable them to study mutations that occur among only 2% of the population, which would provide finer details of recent human evolution. While studies of short time scales such as these are vulnerable to random statistical fluctuations, they can improve understanding of the factors that affect survival and reproduction among contemporary human populations.

Evolutionary geneticist Jaleal Sanjak and his team analyzed genetic and medical information from more than 200,000 women over the age of 45 and 150,000 men over the age of 50—people who have passed their reproductive years—from the UK Biobank and identified 13 traits among women and ten among men that were linked to having children at a younger age, having a higher body-mass index, fewer years of education, and lower levels of fluid intelligence, or the capacity for logical reasoning and problem solving. Sanjak noted, however, that it was not known whether having children actually made women heavier or being heavier made it easier to reproduce. Because taller men and shorter women tended to have more children and because the genes associated with height affect men and women equally, the average height of the population will likely remain the same. Among women who had children later, those with higher levels of education had more children.

Evolutionary biologist Hakhamanesh Mostafavi led a 2017 study that analyzed data from 215,000 individuals spanning just a few generations in the United Kingdom and the United States and found a number of genetic changes that affect longevity. The ApoE allele linked to Alzheimer's disease was rare among women aged 70 and over, while the frequency of a CHRNA3 variant associated with smoking addiction fell among middle-aged and older men. Because this is not by itself evidence of evolution (natural selection acts on reproductive success rather than longevity), scientists have proposed a number of explanations. Men who live longer tend to have more children, and men and women who survive to old age can help take care of their children and grandchildren, benefiting their descendants down the generations; this explanation is known as the grandmother hypothesis. It is also possible that Alzheimer's disease and smoking addiction are harmful earlier in life as well, but that the effects are more subtle and larger sample sizes are required to study them. Mostafavi and his team also found that mutations causing health problems, such as asthma, high body-mass index and high cholesterol levels, were more common among those with shorter lifespans, while mutations leading to delayed puberty and reproduction were more common among long-lived individuals. According to geneticist Jonathan Pritchard, while the link between fertility and longevity was identified in previous studies, those studies did not entirely rule out the effects of educational and financial status (people who rank high in both tend to have children later in life); the new result seems to suggest the existence of an evolutionary trade-off between longevity and fertility.

In South Africa, where large numbers of people are infected with HIV, some carry genes that help them fight the virus, making it more likely that they will survive and pass this trait on to their children. If the virus persists, humans living in this part of the world could become resistant to it within as little as a few hundred years. However, because HIV evolves much faster than humans do, it will more likely be dealt with technologically rather than genetically.

The Amish have a mutation that extends their life expectancy and reduces their susceptibility to diabetes.

A 2017 study by researchers from Northwestern University unveiled a mutation among the Old Order Amish living in Berne, Indiana, that reduces their chances of developing diabetes and extends their life expectancy by about ten years on average. The mutation occurs in the gene Serpine1, which codes for the protein PAI-1 (plasminogen activator inhibitor), a regulator of blood clotting that plays a role in the aging process. About 24% of the people sampled carried the mutation and had a life expectancy of 85 years, higher than the community average of 75. Researchers also found that the telomeres (the protective ends of human chromosomes) of those with the mutation were longer than those of people without it. Because telomeres shorten as a person ages, their length is associated with life expectancy, and those with longer telomeres tend to live longer. At present, the Amish live in 22 U.S. states and the Canadian province of Ontario. They lead simple lifestyles that date back centuries and generally insulate themselves from modern North American society. They are mostly indifferent toward modern medicine, but scientists have a good working relationship with the Amish community in Berne, whose detailed genealogical records make them ideal subjects for research.

Multidisciplinary research suggests that ongoing evolution could help explain the rise of certain medical conditions such as autism and autoimmune disorders. Autism and schizophrenia may be due to genes inherited from the mother and the father that are over-expressed and that fight a tug-of-war in the child's body. Allergies, asthma, and autoimmune disorders appear linked to higher standards of sanitation, which prevent the immune systems of modern humans from being exposed to the various parasites and pathogens their ancestors encountered, leaving those immune systems hypersensitive and more likely to overreact. The human body was not built from a professionally engineered blueprint but is a system shaped over long periods of time by evolution, with all kinds of trade-offs and imperfections. Understanding the evolution of the human body can help medical doctors better understand and treat various disorders. Research in evolutionary medicine suggests that diseases are prevalent because natural selection favors reproduction over health and longevity. In addition, biological evolution is slower than cultural evolution, and humans evolve more slowly than pathogens.

Whereas in the ancestral past humans lived in geographically isolated communities where inbreeding was rather common, modern transportation technologies have made it much easier for people to travel great distances, facilitating further genetic mixing and giving rise to additional variation in the human gene pool. They also enable the spread of diseases worldwide, which can in turn affect human evolution. Besides the selection and flow of genes and alleles, another mechanism of biological change is epigenetics: changes not to the DNA sequence itself but to the way it is expressed. Chronic illness and stress, for example, are known to trigger epigenetic changes.
