
Thursday, March 25, 2021

Neanderthal extinction

From Wikipedia, the free encyclopedia

Distribution of the Neanderthal, and main sites.
 
Replacement of Neanderthals by early modern humans.

Neanderthals became extinct around 40,000 years ago. This timing, based on research published in Nature in 2014, is much earlier than previous estimates, and derives from improved radiocarbon-dating methods analyzing 40 sites from Spain to Russia. Evidence for continued Neanderthal presence in the Iberian Peninsula 37,000 years ago was published in 2017.

Various hypotheses have been proposed for the causes of Neanderthal extinction.

It seems unlikely that any single one of these hypotheses is sufficient on its own; rather, multiple factors probably contributed to the demise of an already low population.

Possible coexistence before extinction

Neanderthal tools
 
Modern human tools

In research published in Nature in 2014, an analysis of radiocarbon dates from forty Neanderthal sites from Spain to Russia found that the Neanderthals disappeared in Europe between 41,000 and 39,000 years ago with 95% probability. The study also found with the same probability that modern humans and Neanderthals overlapped in Europe for between 2,600 and 5,400 years. Modern humans reached Europe between 45,000 and 43,000 years ago. Improved radiocarbon dating published in 2015 indicates that Neanderthals disappeared around 40,000 years ago, which overturns older carbon dating which indicated that Neanderthals may have lived as recently as 24,000 years ago, including in refugia on the south coast of the Iberian peninsula such as Gorham's Cave. Zilhão et al. (2017) argue for pushing this date forward by some 3,000 years, to 37,000 years ago. Inter-stratification of Neanderthal and modern human remains has been suggested, but is disputed. Stone tools that have been proposed to be linked to Neanderthals have been found at Byzovya (ru:Бызовая) in the polar Urals, and dated to 31,000 to 34,000 years ago.

Possible cause of extinction

Violence

Some authors have discussed the possibility that Neanderthal extinction was either precipitated or hastened by violent conflict with Homo sapiens. Violence in early hunter-gatherer societies usually occurred as a result of resource competition following natural disasters. It is therefore plausible to suggest that violence, including primitive warfare, would have transpired between the two human species. The hypothesis that early humans violently replaced Neanderthals was first proposed by French paleontologist Marcellin Boule (the first person to publish an analysis of a Neanderthal) in 1912.

Parasites and pathogens

Another possibility is the spread among the Neanderthal population of pathogens or parasites carried by Homo sapiens. Neanderthals would have had limited immunity to diseases they had not been exposed to, so diseases carried into Europe by Homo sapiens could have been particularly lethal to them if Homo sapiens were relatively resistant. If it were relatively easy for pathogens to leap between these two similar species, perhaps because they lived in close proximity, then Homo sapiens would have provided a pool of individuals capable of infecting Neanderthals, potentially preventing the epidemic from burning itself out as the Neanderthal population fell. On the other hand, the same mechanism could work in reverse, and the resistance of Homo sapiens to Neanderthal pathogens and parasites would need explanation. However, there is good reason to suppose that the net movement of novel human pathogens would have been overwhelmingly uni-directional, from Africa into the Eurasian landmass. The most common source of novel human pathogens (like HIV-1 today) would have been our closest phylogenetic relatives, namely other primates, of which there were many in Africa but only one known species in Europe, the Barbary macaque, and only a few species in southern Asia. As a result, African populations of humans would have been exposed to, developed resistance to, and become carriers of more novel pathogens than their Eurasian cousins, with far-reaching consequences. The uni-directional movement of pathogens would have enforced a uni-directional movement of human populations out of Africa, dooming the immunologically naïve indigenous populations of Eurasia whenever they encountered more recent emigrants from Africa. It would also have ensured that Africa remained the crucible of human evolution, despite the widespread distribution of hominins across the highly variable geography of Eurasia.
This putative "African advantage" would have persisted until the agricultural revolution 10,000 years ago in Eurasia, after which domesticated animals overtook other primate species as the most common source of novel human pathogens, replacing the "African advantage" with a "Eurasian advantage". The devastating effect of Eurasian pathogens on Native American populations in the historical era gives us some idea of the effect that modern humans may have had on the precursor populations of hominins in Eurasia 40,000 years ago. An examination of human and Neanderthal genomes and adaptations regarding pathogens or parasites may shed further light on this issue.

Competitive replacement

Sapiens and Neanderthal skulls

Species specific disadvantages

A slight competitive advantage on the part of modern humans could account for the Neanderthals' decline on a timescale of thousands of years.

Generally small and widely dispersed fossil sites suggest that Neanderthals lived in less numerous and more socially isolated groups than contemporary Homo sapiens. Tools such as Mousterian flint stone flakes and Levallois points are remarkably sophisticated from the outset, yet they have a slow rate of variability, and general technological inertia is noticeable during the entire fossil period. Artifacts are of a utilitarian nature, and symbolic behavioral traits are undocumented before the arrival of modern humans in Europe around 40,000 to 35,000 years ago.

The noticeable morphological differences in skull shape between the two human species also have cognitive implications. These include the Neanderthals' smaller parietal lobes and cerebellum, areas implicated in tool use, creativity, and higher-order conceptualization. The differences, while slight, would have been visible to natural selection and may underlie and explain the differences in social behaviors, technological innovation, and artistic output.

Jared Diamond, a supporter of competitive replacement, points out in his book The Third Chimpanzee that the replacement of Neanderthals by modern humans is comparable to patterns of behavior that occur whenever people with advanced technology clash with less advanced people.

Division of labor

In 2006, two anthropologists at the University of Arizona proposed an efficiency explanation for the demise of the Neanderthals. In an article titled "What's a Mother to Do? The Division of Labor among Neanderthals and Modern Humans in Eurasia", they posited that Neanderthal division of labor between the sexes was less developed than that of Middle Paleolithic Homo sapiens. Both male and female Neanderthals participated in the single occupation of hunting big game, such as bison, deer, gazelles, and wild horses. This hypothesis proposes that the Neanderthals' relative lack of labor division resulted in less efficient extraction of resources from the environment as compared to Homo sapiens.

Anatomical differences and running ability

Researchers such as Karen L. Steudel of the University of Wisconsin have examined the relationship between Neanderthal anatomy (shorter and stockier than that of modern humans) and running ability, estimating that Neanderthal locomotion required about 30% more energy.

Nevertheless, in a more recent study, researchers Martin Hora and Vladimir Sladek of Charles University in Prague show that the Neanderthal lower limb configuration, particularly the combination of robust knees, long heels, and short lower limbs, increased the effective mechanical advantage of the Neanderthal knee and ankle extensors, thus significantly reducing the force needed and the energy spent on locomotion. The walking cost of the Neanderthal male is now estimated to be 8–12% higher than that of anatomically modern males, whereas the walking cost of the Neanderthal female is considered to be virtually equal to that of anatomically modern females.
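The mechanical reasoning here can be made concrete with the standard definition of effective mechanical advantage (EMA): the ratio of an extensor muscle's moment arm to the moment arm of the external (ground reaction) force at the joint. A minimal sketch in Python, using illustrative numbers rather than measured Neanderthal values:

```python
# Effective mechanical advantage (EMA) at a joint is the ratio of the
# muscle's moment arm (r) to the ground reaction force's moment arm (R).
# The muscle force required to balance an external force scales as
# external_force / EMA, so a larger muscle moment arm means less force.
def required_muscle_force(external_force, muscle_moment_arm, load_moment_arm):
    ema = muscle_moment_arm / load_moment_arm
    return external_force / ema

# Illustrative values only (newtons, metres) -- not measured Neanderthal data.
# A longer heel gives the Achilles tendon a longer moment arm at the ankle.
force_short_heel = required_muscle_force(700, 0.045, 0.12)
force_long_heel = required_muscle_force(700, 0.055, 0.12)
print(force_short_heel > force_long_heel)  # longer heel -> less muscle force
```

Because required muscle force scales inversely with EMA, a longer heel lowers the force the ankle extensors must produce on each step, and with it the energetic cost of walking.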

Other researchers, like Yoel Rak, from Tel-Aviv University in Israel, have noted that the fossil records show that Neanderthal pelvises in comparison to modern human pelvises would have made it much harder for Neanderthals to absorb shocks and to bounce off from one step to the next, giving modern humans another advantage over Neanderthals in running and walking ability. However, Rak also notes that all archaic humans had wide pelvises, indicating that this is the ancestral morphology and that modern humans underwent a shift towards narrower pelvises in the late Pleistocene.

Modern humans' advantage in hunting warm climate animals

Pat Shipman, of Pennsylvania State University in the United States, argues that the domestication of the dog gave modern humans an advantage when hunting. The oldest remains of domesticated dogs were found in Belgium (31,700 BP) and in Siberia (33,000 BP). A survey of early sites of modern humans and Neanderthals with faunal remains across Spain, Portugal and France provided an overview of what modern humans and Neanderthals ate. Rabbit became more frequent in these diets, while large mammals – mainly eaten by the Neanderthals – became increasingly rare. In 2013, DNA testing on the "Altai dog", a paleolithic dog's remains from the Razboinichya Cave (Altai Mountains), linked this 33,000-year-old dog with the present lineage of Canis lupus familiaris.

Interbreeding

Human-Neandertal mtDNA
 
Neanderthal DNA extraction

Interbreeding can only account for a certain degree of Neanderthal population decrease. A homogeneous absorption of an entire species is a rather unrealistic idea. This would also be counter to strict versions of the Recent African Origin, since it would imply that at least part of the genome of Europeans would descend from Neanderthals, whose ancestors left Africa at least 350,000 years ago.

The most vocal proponent of the hybridization hypothesis is Erik Trinkaus of Washington University. Trinkaus claims various fossils as hybrid individuals, including the "child of Lagar Velho", a skeleton found at Lagar Velho in Portugal. In a 2006 publication co-authored by Trinkaus, the fossils found in 1952 in the cave of Peștera Muierilor, Romania, are likewise claimed as hybrids.

Genetic studies indicate some form of hybridization between archaic humans and modern humans had taken place after modern humans emerged from Africa. An estimated 1–4% of the DNA in Europeans and Asians (e.g. French, Chinese and Papuan probands) is non-modern, and shared with ancient Neanderthal DNA rather than with sub-Saharan Africans (e.g. Yoruba and San probands).

Modern-human findings in Abrigo do Lagar Velho, Portugal allegedly featuring Neanderthal admixtures have been published. However, the interpretation of the Portuguese specimen is disputed.

Jordan, in his work Neanderthal, points out that without some interbreeding, certain features on some "modern" skulls of Eastern European Cro-Magnon heritage are hard to explain. In another study, researchers have recently found in Peștera Muierilor, Romania, remains of European humans from ~37,000–42,000 years ago who possessed mostly diagnostic "modern" anatomical features, but also had distinct Neanderthal features not present in ancestral modern humans in Africa, including a large bulge at the back of the skull, a more prominent projection around the elbow joint, and a narrow socket at the shoulder joint.

The Neanderthal genome project published papers in 2010 and 2014 stating that Neanderthals contributed to the DNA of modern humans, including most humans outside sub-Saharan Africa as well as a few populations in sub-Saharan Africa, through interbreeding, likely between 50,000 and 60,000 years ago. Recent studies also show that some Neanderthals began mating with ancestors of modern humans long before the large out-of-Africa migration of present-day non-Africans, as early as 100,000 years ago. In 2016, research indicated that there were three distinct episodes of interbreeding between modern humans and Neanderthals: the first involved the ancestors of non-African modern humans, probably soon after leaving Africa; the second, after the ancestral Melanesian group had branched off (and subsequently had a unique episode of interbreeding with Denisovans); and the third, involving the ancestors of East Asians only.

Neanderthal DNA Comparison (SharedDNA)

While interbreeding is viewed as the most parsimonious interpretation of the genetic discoveries, the authors point out they cannot conclusively rule out an alternative scenario, in which the source population of non-African modern humans was already more closely related to Neanderthals than other Africans were, due to ancient genetic divisions within Africa. Among the genes shown to differ between present-day humans and Neanderthals were RPTN, SPAG17, CAN15, TTF1 and PCD16.

Climate change

Neanderthals went through a demographic crisis in Western Europe that seems to coincide with a period of extreme cold there brought on by climate change. "The fact that Neanderthals in Western Europe were nearly extinct, but then recovered long before they came into contact with modern humans came as a complete surprise to us," said Love Dalén, associate professor at the Swedish Museum of Natural History in Stockholm. If so, this would indicate that Neanderthals may have been very sensitive to climate change.

Natural catastrophe

A number of researchers have argued that the Campanian Ignimbrite Eruption, a volcanic eruption near Naples, Italy, about 39,280 ± 110 years ago (older estimate ~37,000 years), which ejected about 200 km3 (48 cu mi) of magma (500 km3 (120 cu mi) bulk volume), contributed to the extinction of the Neanderthals. The argument has been developed by Golovanova et al. The hypothesis posits that although Neanderthals had encountered several interglacials during 250,000 years in Europe, their inability to adapt their hunting methods caused their extinction, in the face of competition from H. sapiens, when Europe changed into a sparsely vegetated steppe and semi-desert during the last Ice Age. Studies of sediment layers at Mezmaiskaya Cave suggest a severe reduction of plant pollen. The damage to plant life would have led to a corresponding decline in the plant-eating mammals hunted by the Neanderthals.

 

Higher background radiation linked to lower cancer mortality - study

Higher background radiation levels were also linked to a life expectancy about 2.5 years longer than that of people living in areas with relatively low levels.

A BIRD SITS on a radiation sign at the uranium ore dump near the town of Mailuu-Suu, Kyrgyzstan. (photo credit: PAVEL MIKHEYEV/REUTERS)
Higher background radiation levels are linked to lower levels of some cancers and may extend life expectancy, according to a new study by Ben-Gurion University of the Negev (BGU) and the Nuclear Research Center Negev (NRCN) published in the journal Biogerontology in January.
 
The higher radiation levels were linked to lower levels of lung, pancreatic and colon cancers in both men and women and lower rates of brain and bladder cancers in men. People living in areas with relatively high background radiation had a life expectancy about 2.5 years longer than those living in areas with relatively low levels. The higher life expectancy may be linked to the lower cancer mortality rates found in areas with higher background radiation.
 
The study was conducted by using the US Environmental Protection Agency's radiation dose calculator to retrieve data from 3,129 US counties, cancer rate data from the US Cancer Statistics and life expectancy data from the Institute for Health Metrics and Evaluation at the University of Washington Medical Center.
 
Background radiation is an ionizing radiation that exists in the environment from natural sources, including radiation from space and terrestrial sources.
 
Background radiation ranges from 92 to 227 millirem per year in the US, according to BGU. A millirem is one-thousandth of a rem (roentgen equivalent man), a measure of the health effect of low-level ionizing radiation on the human body.
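For readers more familiar with SI units, the reported range can be converted to millisieverts; a quick sketch, relying only on the standard identity 1 rem = 0.01 Sv:

```python
# Convert the reported US background radiation range from millirem to
# millisieverts. 1 rem = 0.01 Sv, hence 1 mrem = 0.01 mSv.
def mrem_to_msv(mrem):
    return mrem * 0.01

low, high = 92, 227  # mrem/year, the range reported by BGU
print(f"{mrem_to_msv(low):.2f}-{mrem_to_msv(high):.2f} mSv/year")
```

This puts the US range at roughly 0.9 to 2.3 mSv per year, consistent with commonly cited worldwide natural background levels of a few millisieverts per year.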
 
The scientists stated that the findings indicate "clear beneficial health effects in humans" related to higher background radiation, adding that the linear no-threshold (LNT) paradigm, which assumes that any exposure to ionizing radiation is risky, should be reevaluated.

The researchers stressed that hundreds of billions of dollars are spent each year to maintain extremely low radiation levels based on the LNT paradigm, which their findings indicate may not be necessary. They suggested setting a threshold level instead of operating on the assumption that any amount of radiation is bad.

Behavioral modernity

From Wikipedia, the free encyclopedia
 
Upper Paleolithic (16,000-year-old) cave painting from Lascaux cave in France

Behavioral modernity is a suite of behavioral and cognitive traits that distinguishes current Homo sapiens from other anatomically modern humans, hominins, and primates. Most scholars agree that modern human behavior can be characterized by abstract thinking, planning depth, symbolic behavior (e.g., art, ornamentation), music and dance, exploitation of large game, and blade technology, among others. Underlying these behaviors and technological innovations are cognitive and cultural foundations that have been documented experimentally and ethnographically by evolutionary and cultural anthropologists. These human universal patterns include cumulative cultural adaptation, social norms, language, and extensive help and cooperation beyond close kin.

Within the tradition of evolutionary anthropology and related disciplines, it has been argued that the development of these modern behavioral traits, in combination with the climatic conditions of the Last Glacial Period and Last Glacial Maximum causing population bottlenecks, contributed to the evolutionary success of Homo sapiens worldwide relative to Neanderthals, Denisovans, and other archaic humans.

Arising from differences in the archaeological record, debate continues as to whether anatomically modern humans were behaviorally modern as well. There are many theories on the evolution of behavioral modernity. These generally fall into two camps: gradualist and cognitive approaches. The Late Upper Paleolithic Model theorizes that modern human behavior arose through cognitive, genetic changes in Africa abruptly around 40,000–50,000 years ago, around the time of the Out-of-Africa migration, prompting the movement of modern humans out of Africa and across the world. Other models focus on how modern human behavior may have arisen through gradual steps, with the archaeological signatures of such behavior appearing only through demographic or subsistence-based changes. Many cite evidence of behavioral modernity earlier (by at least about 150,000–75,000 years ago and possibly earlier), namely in the African Middle Stone Age. Sally McBrearty and Alison S. Brooks are notable proponents of gradualism, challenging European-centric models by situating more change in the Middle Stone Age of African prehistory, though this version of the story is more difficult to develop in concrete terms due to a thinning fossil record further back in time.

Definition

To classify what should be included in modern human behavior, it is necessary to define behaviors that are universal among living human groups. Some examples of these human universals are abstract thought, planning, trade, cooperative labor, body decoration, and the control and use of fire. Along with these traits, humans rely heavily on social learning. This cumulative cultural change, or cultural "ratchet", separates human culture from social learning in animals. A reliance on social learning may also be responsible in part for humans' rapid adaptation to many environments outside of Africa. Since cultural universals are found in all cultures, including some of the most isolated indigenous groups, these traits must have evolved or been invented in Africa prior to the exodus.

Archaeologically, a number of empirical traits have been used as indicators of modern human behavior. While these are often debated, a few are generally agreed upon as evidence of behavioral modernity.

Critiques

Several critiques have been leveled against the traditional concept of behavioral modernity, both methodologically and philosophically. Shea (2011) outlines a variety of problems with this concept, arguing instead for "behavioral variability", which, according to the author, better describes the archaeological record. The use of trait lists, according to Shea (2011), runs the risk of taphonomic bias, where some sites may yield more artifacts than others despite similar populations; as well, trait lists can be ambiguous in how behaviors may be empirically recognized in the archaeological record. Shea (2011) in particular cautions that population pressure, cultural change, or optimality models, like those in human behavioral ecology, might better predict changes in tool types or subsistence strategies than a change from "archaic" to "modern" behavior. Some researchers argue that a greater emphasis should be placed on identifying only those artifacts which are unquestionably, or purely, symbolic as a metric for modern human behavior.

Theories and models

Late Upper Paleolithic Model or "Upper Paleolithic Revolution"

The Late Upper Paleolithic Model, or Upper Paleolithic Revolution, refers to the idea that, though anatomically modern humans first appeared around 150,000 years ago (as was once believed), they were not cognitively or behaviorally "modern" until around 50,000 years ago, leading to their expansion out of Africa and into Europe and Asia. These authors note that traits used as a metric for behavioral modernity do not appear as a package until around 40,000–50,000 years ago. Klein (1995) specifically describes evidence of fishing, bone shaped as tools, hearths, significant artifact diversity, and elaborate graves as all absent before this point. According to these authors, art only becomes common beyond this switching point, signifying a change from archaic to modern humans. Most researchers argue that a neurological or genetic change, perhaps one enabling complex language, such as a mutation in FOXP2, caused this revolutionary change in humans.

Alternative models

Contrasted with this view of a spontaneous leap in cognition among ancient humans, some authors like Alison S. Brooks, primarily working in African archaeology, point to the gradual accumulation of "modern" behaviors, starting well before the 50,000 year benchmark of the Upper Paleolithic Revolution models. Howiesons Poort, Blombos, and other South African archaeological sites, for example, show evidence of marine resource acquisition, trade, the making of bone tools, blade and microlith technology, and abstract ornamentation at least by 80,000 years ago. Given evidence from Africa and the Middle East, a variety of hypotheses have been put forth to describe an earlier, gradual transition from simple to more complex human behavior. Some authors have pushed back the appearance of fully modern behavior to around 80,000 years ago or earlier in order to incorporate the South African data.

Others focus on the slow accumulation of different technologies and behaviors across time. These researchers describe how anatomically modern humans could have been cognitively the same and what we define as behavioral modernity is just the result of thousands of years of cultural adaptation and learning. D'Errico and others have looked at Neanderthal culture, rather than early human behavior exclusively, for clues into behavioral modernity. Noting that Neanderthal assemblages often portray traits similar to those listed for modern human behavior, researchers stress that the foundations for behavioral modernity may in fact lie deeper in our hominin ancestors. If both modern humans and Neanderthals express abstract art and complex tools then "modern human behavior" cannot be a derived trait for our species. They argue that the original "human revolution" theory reflects a profound Eurocentric bias. Recent archaeological evidence, they argue, proves that humans evolving in Africa some 300,000 or even 400,000 years ago were already becoming cognitively and behaviourally "modern". These features include blade and microlithic technology, bone tools, increased geographic range, specialized hunting, the use of aquatic resources, long distance trade, systematic processing and use of pigment, and art and decoration. These items do not occur suddenly together as predicted by the "human revolution" model, but at sites that are widely separated in space and time. This suggests a gradual assembling of the package of modern human behaviours in Africa, and its later export to other regions of the Old World.

Between these extremes is the view – currently supported by archaeologists Chris Henshilwood, Curtis Marean, Ian Watts and others – that there was indeed some kind of 'human revolution' but that it occurred in Africa and spanned tens of thousands of years. The term "revolution" in this context would mean not a sudden mutation but a historical development along the lines of "the industrial revolution" or "the Neolithic revolution". In other words, it was a relatively accelerated process, too rapid for ordinary Darwinian "descent with modification" yet too gradual to be attributed to a single genetic or other sudden event. These archaeologists point in particular to the relatively explosive emergence of ochre crayons and shell necklaces apparently used for cosmetic purposes. These archaeologists see symbolic organisation of human social life as the key transition in modern human evolution. Recently discovered at sites such as Blombos Cave and Pinnacle Point, South Africa, pierced shells, pigments and other striking signs of personal ornamentation have been dated within a time-window of 70,000–160,000 years ago in the African Middle Stone Age, suggesting that the emergence of Homo sapiens coincided, after all, with the transition to modern cognition and behaviour. While viewing the emergence of language as a 'revolutionary' development, this school of thought generally attributes it to cumulative social, cognitive and cultural evolutionary processes as opposed to a single genetic mutation.

A further view, taken by archaeologists such as Francesco D'Errico and João Zilhão, is a multi-species perspective arguing that evidence for symbolic culture in the form of utilised pigments and pierced shells are also found in Neanderthal sites, independently of any "modern" human influence.

Cultural evolutionary models may also shed light on why, although evidence of behavioral modernity exists before 50,000 years ago, it is not expressed consistently until that point. With small population sizes, human groups would have been affected by demographic and cultural evolutionary forces that may not have allowed for complex cultural traits. According to some authors, until population density became significantly high, complex traits could not have been maintained effectively. Some genetic evidence supports a dramatic increase in population size before human migration out of Africa. High local extinction rates within a population can also significantly decrease the amount of diversity in neutral cultural traits, regardless of cognitive ability.

Highly speculatively, bicameral mind theory argues for an additional, and cultural rather than genetic, shift from selfless to self-perceiving forms of human cognition and behavior very late in human history, in the Bronze Age. This is based on a literary analysis of Bronze Age texts which claims to show the first appearances of the concept of self around this time, replacing the voices of gods as the primary form of recorded human cognition. This non-mainstream theory is not widely accepted but does receive serious academic interest from time to time.

Archaeological evidence

Africa

Recent research indicates that Homo sapiens originated in Africa between around 350,000 and 260,000 years ago. There is some evidence for the beginning of modern behavior among early African H. sapiens around that period.

Before the Out of Africa theory was generally accepted, there was no consensus on where the human species evolved and, consequently, where modern human behavior arose. Now, however, African archaeology has become extremely important in discovering the origins of humanity. The first Cro-Magnon expansion into Europe around 48,000 years ago is generally accepted as already "modern", and it is now generally believed that behavioral modernity appeared in Africa before 50,000 years ago, either significantly earlier or possibly as a late Upper Paleolithic "revolution" shortly beforehand that prompted the migration out of Africa.

A variety of evidence of abstract imagery, widened subsistence strategies, and other "modern" behaviors has been discovered in Africa, especially South, North, and East Africa. The Blombos Cave site in South Africa, for example, is famous for rectangular slabs of ochre engraved with geometric designs. Using multiple dating techniques, the site was dated to around 77,000 and 100,000–75,000 years old. Ostrich egg shell containers engraved with geometric designs dating to 60,000 years ago were found at Diepkloof, South Africa. Beads and other personal ornamentation have been found in Morocco which might be as much as 130,000 years old; as well, the Cave of Hearths in South Africa has yielded a number of beads dating from significantly before 50,000 years ago, and shell beads dating to about 75,000 years ago have been found at Blombos Cave, South Africa.

Specialized projectile weapons as well have been found at various sites in Middle Stone Age Africa, including bone and stone arrowheads at South African sites such as Sibudu Cave (along with an early bone needle, also found at Sibudu), dating to approximately 72,000–60,000 years ago, on some of which poisons may have been used, and bone harpoons at the Central African site of Katanda dating to about 90,000 years ago. Evidence also exists for the systematic heat treating of silcrete stone to increase its flakability for the purpose of toolmaking, beginning approximately 164,000 years ago at the South African site of Pinnacle Point and becoming common there for the creation of microlithic tools around 72,000 years ago.

In 2008, an ochre processing workshop, likely for the production of paints, was uncovered dating to ca. 100,000 years ago at Blombos Cave, South Africa. Analysis shows that a liquefied pigment-rich mixture was produced and stored in two abalone shells, and that ochre, bone, charcoal, grindstones and hammerstones also formed a composite part of the toolkits. Evidence for the complexity of the task includes procuring and combining raw materials from various sources (implying the makers had a mental template of the process they would follow), possibly using pyrotechnology to facilitate fat extraction from bone, using a probable recipe to produce the compound, and the use of shell containers for mixing and storage for later use. Modern behaviors, such as the making of shell beads, bone tools and arrows, and the use of ochre pigment, are evident at a Kenyan site by 78,000–67,000 years ago. Evidence of early stone-tipped projectile weapons (a characteristic tool of Homo sapiens), the stone tips of javelins or throwing spears, was discovered in 2013 at the Ethiopian site of Gademotta, and dates to around 279,000 years ago.

Expanding subsistence strategies beyond big-game hunting, and the consequent diversity in tool types, has been noted as a sign of behavioral modernity. A number of South African sites show an early reliance on aquatic resources, from fish to shellfish. Pinnacle Point, in particular, shows exploitation of marine resources as early as 120,000 years ago, perhaps in response to more arid conditions inland. Establishing a reliance on predictable shellfish deposits, for example, could reduce mobility and facilitate complex social systems and symbolic behavior. Blombos Cave and Site 440 in Sudan both show evidence of fishing as well. Taphonomic changes in fish skeletons from Blombos Cave have been interpreted as evidence of the capture of live fish, clearly an intentional human behavior.

Humans in North Africa (Nazlet Sabaha, Egypt) are known to have dabbled in chert mining, as early as ≈100,000 years ago, for the construction of stone tools.

In 2018, evidence dating to about 320,000 years ago was found at the Kenyan site of Olorgesailie for the early emergence of modern behaviors, including long-distance trade networks (involving goods such as obsidian), the use of pigments, and the possible making of projectile points. The authors of three 2018 studies on the site observe that this evidence is approximately contemporary with the earliest known Homo sapiens fossil remains from Africa (such as at Jebel Irhoud and Florisbad), and they suggest that complex and modern behaviors had already begun in Africa around the time of the emergence of anatomically modern Homo sapiens.

In 2019, further evidence of early complex projectile weapons in Africa was found at Aduma, Ethiopia, dated to 100,000–80,000 years ago, in the form of points considered likely to belong to darts delivered by spear throwers.

Olduvai Hominid 1 wore facial piercings.

Europe

While traditionally cited as evidence for the later Upper Paleolithic Model, European archaeology has shown that the issue is more complex. A variety of stone tool technologies were present at the time of human expansion into Europe and show evidence of modern behavior. Despite the problems of conflating specific tools with cultural groups, the Aurignacian tool complex, for example, is generally taken as a purely modern human signature. The discovery of "transitional" complexes, like the "proto-Aurignacian", has been taken as evidence of human groups progressing through "steps of innovation". If, as this might suggest, human groups were already migrating into eastern Europe around 40,000 years ago and only afterward showed evidence of behavioral modernity, then either the cognitive change must have diffused back into Africa or it was already present before migration.

In light of a growing body of evidence of Neanderthal culture and tool complexes, some researchers have put forth a "multiple species model" for behavioral modernity. Neanderthals were often cited as an evolutionary dead end, apish cousins less advanced than their human contemporaries, and their personal ornaments were dismissed as trinkets or poor imitations compared to the cave art produced by H. sapiens. Despite this, European evidence has shown a variety of personal ornaments and artistic artifacts produced by Neanderthals; for example, the Neanderthal site of Grotte du Renne has produced grooved bear, wolf, and fox incisors, ochre, and other symbolic artifacts. Although burials are few and controversial, there is circumstantial evidence of Neanderthal ritual burials. There are two ways to explain this symbolic behavior among Neanderthals: either they copied cultural traits from arriving modern humans, or they had their own cultural traditions comparable with behavioral modernity. Even if they merely copied cultural traditions, a point debated by several authors, they still possessed the capacity for complex culture described by behavioral modernity. As discussed above, if Neanderthals were also "behaviorally modern", then behavioral modernity cannot be a species-specific derived trait.

Asia

Most debates surrounding behavioral modernity have focused on Africa or Europe, but an increasing amount of attention has been placed on East Asia. This region offers a unique opportunity to test hypotheses of multi-regionalism, replacement, and demographic effects. Unlike Europe, where initial migration occurred around 50,000 years ago, human remains in China have been dated to around 100,000 years ago. This early evidence of human expansion calls into question behavioral modernity as an impetus for migration.

Stone tool technology is of particular interest in East Asia. Following Homo erectus migrations out of Africa, Acheulean technology never seems to appear beyond present-day India and into China. Analogously, Mode 3, or Levallois, technology is not apparent in China following later hominin dispersals. This lack of more advanced technology has been explained by serial founder effects and low population densities outside Africa. Although tool complexes comparable to those of Europe are missing or fragmentary, other archaeological evidence shows behavioral modernity. For example, the peopling of the Japanese archipelago offers an opportunity to investigate the early use of watercraft. Although one site, Kanedori in Honshu, does suggest the use of watercraft as early as 84,000 years ago, there is no other evidence of hominins in Japan until 50,000 years ago.

The Zhoukoudian cave system near Beijing has been excavated since the 1930s and has yielded valuable data on early human behavior in East Asia. Although disputed, there is evidence of possible human burials and interred remains in the cave dated to around 34,000–20,000 years ago. These remains are associated with personal ornaments in the form of beads and worked shell, suggesting symbolic behavior. Along with the possible burials, numerous other symbolic objects, such as punctured animal teeth and beads, some dyed in red ochre, have been found at Zhoukoudian. Although fragmentary, the archaeological record of eastern Asia shows evidence of behavioral modernity before 50,000 years ago, but, like the African record, it does not become fully apparent until that time.

Culture war

Bismarck (left) and the Pope, from the German satirical magazine Kladderadatsch, 1875

A culture war is a cultural conflict between social groups and the struggle for dominance of their values, beliefs, and practices. It commonly refers to topics on which there is general societal disagreement and polarization of values.

The term is commonly used to describe aspects of contemporary politics in the United States, where issues such as abortion, homosexuality, transgender rights, pornography, multiculturalism, and racial viewpoints, along with other conflicts based on values, morality, and lifestyle, are described as the major political cleavage.

Etymology

The term culture war is a loan translation (calque) of the German Kulturkampf ('culture struggle'). In German, Kulturkampf, a term coined by Rudolf Virchow, refers to the clash between cultural and religious groups in the campaign from 1871 to 1878 under Chancellor Otto von Bismarck of the German Empire against the influence of the Roman Catholic Church.[3] The translation was printed in some American newspapers at the time.[4]

United States

1920s–1980s: Origins

In American usage, "culture war" may imply a conflict between values considered traditionalist or conservative and those considered progressive or liberal. This usage originated in the 1920s, when urban and rural American values came into closer conflict, following several decades of immigration to the United States by people whom earlier European immigrants considered "alien". It was also a result of the cultural shifts and modernizing trends of the Roaring Twenties, culminating in the presidential campaign of Al Smith in 1928. In subsequent decades of the 20th century, the term appeared occasionally in American newspapers.

The expression would join the vocabulary of U.S. politics in 1991 with the publication of Culture Wars: The Struggle to Define America by James Davison Hunter, who redefined the American notion of "culture war." Tracing the concept to the 1960s, Hunter perceived a dramatic realignment and polarization that had transformed U.S. politics and culture, including the issues of abortion, federal and state gun laws, immigration, separation of church and state, privacy, recreational drug use, LGBT rights, and censorship. The perceived focus of the American culture war and its definition have taken various forms since then.

1991–2001: Rise in prominence

James Davison Hunter, a sociologist at the University of Virginia, introduced the expression again in his 1991 publication, Culture Wars: The Struggle to Define America. Hunter described what he saw as a dramatic realignment and polarization that had transformed American politics and culture.

He argued that on an increasing number of "hot-button" defining issues—abortion, gun politics, separation of church and state, privacy, recreational drug use, homosexuality, censorship—there existed two definable polarities. Furthermore, not only were there a number of divisive issues, but society had divided along essentially the same lines on these issues, so as to constitute two warring groups, defined primarily not by nominal religion, ethnicity, social class, or even political affiliation, but rather by ideological world-views.

Hunter characterized this polarity as stemming from opposite impulses, toward what he referred to as Progressivism and as Orthodoxy. Others have adopted the dichotomy with varying labels. For example, Bill O'Reilly, a conservative political commentator and former host of the Fox News talk show The O'Reilly Factor, emphasizes differences between "Secular-Progressives" and "Traditionalists" in his 2006 book Culture Warrior.

Historian Kristin Kobes Du Mez attributes the 1990s emergence of culture wars to the end of the Cold War in 1991. She writes that Evangelical Christians viewed a particular Christian masculine gender role as the only defense of America against the threat of communism. When this threat ended upon the close of the Cold War, Evangelical leaders transferred the perceived source of threat from foreign communism to domestic changes in gender roles and sexuality.

Pat Buchanan in 2008

During the 1992 presidential election, commentator Pat Buchanan mounted a campaign for the Republican nomination for president against incumbent George H. W. Bush. In a prime-time slot at the 1992 Republican National Convention, Buchanan gave his speech on the culture war. He argued: "There is a religious war going on in our country for the soul of America. It is a cultural war, as critical to the kind of nation we will one day be as was the Cold War itself." In addition to criticizing environmentalists and feminism, he portrayed public morality as a defining issue:

The agenda [Bill] Clinton and [Hillary] Clinton would impose on America—abortion on demand, a litmus test for the Supreme Court, homosexual rights, discrimination against religious schools, women in combat units—that's change, all right. But it is not the kind of change America wants. It is not the kind of change America needs. And it is not the kind of change we can tolerate in a nation that we still call God's country.

A month later, Buchanan characterized the conflict as about power over society's definition of right and wrong. He named abortion, sexual orientation and popular culture as major fronts—and mentioned other controversies, including clashes over the Confederate flag, Christmas, and taxpayer-funded art. He also said that the negative attention his "culture war" speech received was itself evidence of America's polarization.

The culture war had a significant impact on national politics in the 1990s. The rhetoric of the Christian Coalition of America may have weakened President George H. W. Bush's chances for re-election in 1992 and helped his successor, Bill Clinton, win re-election in 1996. On the other hand, the rhetoric of conservative cultural warriors helped Republicans gain control of Congress in 1994.

The culture wars influenced the debate over state-school history curricula in the United States in the 1990s. In particular, debates over the development of national educational standards in 1994 revolved around whether the study of American history should be a "celebratory" or "critical" undertaking and involved such prominent public figures as Lynne Cheney, Rush Limbaugh, and historian Gary Nash.

2001–2014: Post-9/11 era

43rd President George W. Bush, Donald Rumsfeld, and Paul Wolfowitz were prominent neoconservatives of the 2000s.

A political view called neoconservatism shifted the terms of the debate in the early 2000s. Neoconservatives differed from their opponents in that they interpreted problems facing the nation as moral issues rather than economic or political issues. For example, neoconservatives saw the decline of the traditional family structure as a spiritual crisis that required a spiritual response. Critics accused neoconservatives of confusing cause and effect.

During the 2000s, voting for Republicans began to correlate heavily with traditionalist or orthodox religious belief across diverse religious sects. Voting for Democrats became more correlated to liberal or modernist religious belief, and to being nonreligious. Belief in scientific conclusions, such as climate change, also became tightly coupled to political party affiliation in this era, causing climate scholar Andrew Hoffman to observe that climate change had "become enmeshed in the so-called culture wars."

Rally for Proposition 8, an item on the 2008 California ballot to ban same-sex marriage

Topics traditionally associated with culture war were not prominent in media coverage of the 2008 election season, with the exception of coverage of vice-presidential candidate Sarah Palin, who drew attention to her conservative religion and built a performative brand of climate change denialism. Palin's defeat in the election and subsequent resignation as governor of Alaska led the Center for American Progress to predict "the coming end of the culture wars," which they attributed to demographic change, particularly high rates of acceptance of same-sex marriage among millennials.

2014–present: Broadening of the culture war

While traditional culture war issues, notably abortion, continue to be a focal point, the issues identified with culture war broadened and intensified in the mid-to-late 2010s. Journalist Michael Grunwald says that "President Donald Trump has pioneered a new politics of perpetual culture war" and lists the Black Lives Matter movement, U.S. national anthem protests, climate change, education policy, healthcare policy including Obamacare, and infrastructure policy as culture war issues in 2018. The rights of transgender people and the role of religion in lawmaking were identified as "new fronts in the culture war" by political scientist Jeremiah Castle, as the polarization of public opinion on these two topics resembles that of previous culture war issues. In 2020, during the COVID-19 pandemic, North Dakota governor Doug Burgum described opposition to wearing face masks as a "senseless" culture war issue that jeopardizes human safety.

This broader understanding of culture war issues in the mid-to-late 2010s and 2020s is associated with a political strategy called "owning the libs." Conservative media figures employing this strategy, prominently Ben Shapiro, emphasize and expand upon culture war issues with the goal of upsetting liberal people. According to Nicole Hemmer of Columbia University, this strategy is a substitute for the cohesive conservative ideology that existed during the Cold War: it holds a conservative voting bloc together in the absence of shared policy preferences among the bloc's members.

The Unite the Right rally in Charlottesville, Virginia in August 2017, an alt-right event regarded as a battle of the culture wars.

A number of conflicts about diversity in popular culture occurring in the 2010s, such as the Gamergate controversy, Comicsgate and the Sad Puppies science fiction voting campaign, were identified in the media as being examples of the culture war. Journalist Caitlin Dewey described Gamergate as a "proxy war" for a larger culture war between those who want greater inclusion of women and minorities in cultural institutions versus anti-feminists and traditionalists who do not. The perception that culture war conflict had been demoted from electoral politics to popular culture led writer Jack Meserve to call popular movies, games, and writing the "last front in the culture war" in 2015.

These conflicts about representation in popular culture re-emerged into electoral politics via the alt-right and alt-lite movements. According to media scholar Whitney Phillips, Gamergate "prototyped" strategies of harassment and controversy-stoking that proved useful in political strategy. For example, Republican political strategist Steve Bannon publicized pop-culture conflicts during the 2016 presidential campaign of Donald Trump, encouraging a young audience to "come in through Gamergate or whatever and then get turned onto politics and Trump."

Canada

Some observers in Canada have used the term "culture war" to refer to differing values between Western versus Eastern Canada, urban versus rural Canada, as well as conservatism versus liberalism and progressivism.

Nevertheless, Canadian society is generally not dramatically polarized over immigration, gun control, drug legality, sexual morality, or government involvement in healthcare, the main issues at play in the United States. In all of those cases, the majority of Canadians, including Conservatives, would support the "progressive" position in the United States. In Canada, a different set of issues creates a clash of values, chief among them language policy, minority religious rights, pipeline politics, indigenous land rights, climate policy, and federal-provincial disputes.

The phrase is relatively new in Canadian political commentary. It can be used to describe historical events in Canada, such as the Rebellions of 1837, Western alienation, the Quebec sovereignty movement, and Aboriginal conflicts, but it is more relevant to current events such as the Grand River land dispute and the increasing hostility between conservative and liberal Canadians. The phrase has also been used to describe the Harper government's attitude towards the arts community; Andrew Coyne termed this negative policy towards the arts "class warfare."

Australia

Interpretations of Aboriginal history became part of the wider political debate sometimes called the "culture wars" during the tenure of the Liberal–National Coalition government of 1996 to 2007, with the Prime Minister of Australia John Howard publicly championing the views of some of those associated with Quadrant. This debate extended into a controversy over the presentation of history in the National Museum of Australia and in high-school history curricula. It also migrated into the general Australian media, with major broadsheets such as The Australian, The Sydney Morning Herald and The Age regularly publishing opinion pieces on the topic. Marcia Langton has referred to much of this wider debate as "war porn" and as an "intellectual dead end".

Two Australian Prime Ministers, Paul Keating (in office 1991–1996) and John Howard (in office 1996–2007), became major participants in the "wars". According to Mark McKenna's analysis for the Australian Parliamentary Library, John Howard believed that Paul Keating portrayed Australia pre-Whitlam (Prime Minister from 1972 to 1975) in an unduly negative light, while Keating sought to distance the modern Labor movement from its historical support for the monarchy and for the White Australia policy by arguing that it was the conservative Australian parties which had been barriers to national progress. Keating accused Britain of having abandoned Australia during the Second World War. He staunchly supported a symbolic apology to Australian Aboriginals for their mistreatment at the hands of previous administrations, and outlined his view of the origins of, and potential solutions to, contemporary Aboriginal disadvantage in his Redfern Park Speech of 10 December 1992 (drafted with the assistance of historian Don Watson). In 1999, following the release of the 1998 Bringing Them Home report, Howard passed a Parliamentary Motion of Reconciliation describing the treatment of Aborigines as the "most blemished chapter" in Australian history, but he refused to issue an official apology. Howard saw an apology as inappropriate because it would imply "intergeneration guilt"; he said that "practical" measures were a better response to contemporary Aboriginal disadvantage. Keating argued for the eradication of remaining symbols linked to colonial origins, including deference to ANZAC Day, the Australian flag, and the monarchy, while Howard supported these institutions. Unlike fellow Labor leaders and contemporaries Bob Hawke (Prime Minister 1983–1991) and Kim Beazley (Labor Party leader 2005–2006), Keating never traveled to Gallipoli for ANZAC Day ceremonies, and in 2008 he described those who gathered there as "misguided".

In 2006, John Howard said in a speech marking the 50th anniversary of Quadrant that "Political Correctness" was dead in Australia, but: "we should not underestimate the degree to which the soft-left still holds sway, even dominance, especially in Australia's universities". Also in 2006, Sydney Morning Herald political editor Peter Hartcher reported that Opposition foreign-affairs spokesman Kevin Rudd had entered the philosophical debate, arguing in response that "John Howard is guilty of perpetrating 'a fraud' in his so-called culture wars ... designed not to make real change but to mask the damage inflicted by the Government's economic policies".

The defeat of the Howard government in the Australian federal election of 2007 and its replacement by the Rudd Labor government altered the dynamic of the debate. Rudd made an official apology to the Aboriginal Stolen Generations with bipartisan support. Like Keating, Rudd supported an Australian republic, but in contrast to Keating, he declared support for the Australian flag, supported the commemoration of ANZAC Day, and expressed admiration for Liberal Party founder Robert Menzies.

After the 2007 change of government, and before the passage, with support from all parties, of the Parliamentary apology to indigenous Australians, Professor of Australian Studies Richard Nile argued: "the culture and history wars are over and with them should also go the adversarial nature of intellectual debate", a view contested by others, including conservative commentator Janet Albrechtsen. Liberal Party parliamentarian Christopher Pyne indicated an intention to re-engage in the history wars.

Africa

According to political scientist Constance G. Anthony, American culture war perspectives on human sexuality were exported to Africa as a form of neocolonialism. In this view, the export began during the AIDS epidemic in Africa, with the United States government first tying HIV/AIDS assistance money to evangelical leadership and the Christian right during the Bush administration, then to LGBTQ tolerance during the administration of Barack Obama. This stoked a culture war that resulted in, among other things, the Uganda Anti-Homosexuality Act of 2014.

Europe

Several media outlets have described the Law and Justice party of Poland and Viktor Orbán of Hungary, Aleksandar Vučić of Serbia, and Janez Janša of Slovenia as igniting culture wars in their respective countries by encouraging fights over LGBT rights, legal abortion, and other topics. In the United Kingdom, the Conservative Party has similarly been described as attempting to ignite culture wars over "conservative values" under the tenure of Prime Minister Boris Johnson.

Criticism and evaluation

Since the time that James Davison Hunter first applied the concept of culture wars to American life, the idea has been subject to questions about whether "culture wars" names a real phenomenon, and if so, whether the phenomenon it describes is a cause of, or merely a result of, membership in groups like political parties and religions. Culture wars have also been subject to the criticism of being artificial, imposed, or asymmetric conflicts, rather than a result of authentic differences between cultures.

Validity

Researchers have differed about the scientific validity of the notion of culture war. Some claim it does not describe real behavior, or that it describes only the behavior of a small political elite. Others claim culture war is real and widespread, and even that it is fundamental to explaining Americans' political behavior and beliefs.

Political scientist Alan Wolfe participated in a series of scholarly debates in the 1990s and 2000s against Hunter, claiming that Hunter's concept of culture wars did not accurately describe the opinions or behavior of Americans, which Wolfe claimed were more united than polarized.

A meta-analysis of opinion data from 1992 to 2012 published in the American Political Science Review concluded that, in contrast to a common belief that political party and religious membership shape opinion on culture war topics, instead opinions on culture war topics lead people to revise their political party and religious orientations. The researchers view culture war attitudes as "foundational elements in the political and religious belief systems of ordinary citizens."

Artificiality or asymmetry

Some writers and scholars have said that culture wars are created or perpetuated by political special interest groups, by reactionary social movements, by dynamics within the Republican party, or by electoral politics as a whole. These authors view culture war not as an unavoidable result of widespread cultural differences, but as a technique used to create in-groups and out-groups for a political purpose.

Political commentator E. J. Dionne has written that culture war is an electoral technique to exploit differences and grievances, remarking that the real cultural division is "between those who want to have a culture war and those who don't."

Sociologist Scott Melzer says that culture wars are created by conservative, reactive organizations and movements. Members of these movements possess a "sense of victimization at the hands of a liberal culture run amok. In their eyes, immigrants, gays, women, the poor, and other groups are (undeservedly) granted special rights and privileges." Melzer writes about the example of the National Rifle Association, which he says intentionally created a culture war in order to unite conservative groups, particularly groups of white men, against a common perceived threat.

Similarly, religion scholar Susan B. Ridgely has written that culture wars were made possible by Focus on the Family. This organization produced conservative Christian "alternative news" that began to bifurcate American media consumption, promoting a particular "traditional family" archetype to one part of the population, particularly conservative religious women. Ridgely says that this tradition was depicted as under liberal attack, seeming to necessitate a culture war to defend the tradition.

Political scientists Matt Grossmann and David A. Hopkins have written about an asymmetry between the US's two major political parties, saying the Republican party should be understood as an ideological movement built to wage political conflict, and the Democratic party as a coalition of social groups with less ability to impose ideological discipline on members. This encourages Republicans to perpetuate and to draw new issues into culture wars, because Republicans are well equipped to fight such wars.
