Fringe science refers to ideas that are highly speculative or that rely on premises already refuted. Ideas rejected by editors and published outside the mainstream have only a remote chance of being correct. When the general public does not distinguish between science and its imitators, it risks exploitation,
and in some cases, a "yearning to believe or a generalized suspicion of
experts is a very potent incentive to accepting some pseudoscientific
claims".
A concept that was once accepted by the mainstream scientific community may become fringe science because of a later evaluation of previous research. For example, focal infection theory, which held that focal infections of the tonsils or teeth are a primary cause of systemic disease, was once considered to be medical fact. It has since been dismissed because of a lack of evidence.
Description
The boundary between fringe science and pseudoscience
is disputed. Friedlander writes that there is no widespread
understanding of what separates science from nonscience or
pseudoscience. Pseudoscience, however, is something that is not scientific but is incorrectly characterised as science.
The term may be considered pejorative. For example, Lyell D. Henry Jr. wrote, "Fringe science [is] a term also suggesting kookiness."
The confusion between science and
pseudoscience, between honest scientific error and genuine scientific
discovery, is not new, and it is a permanent feature of the scientific
landscape.... Acceptance of new science can come slowly.
Examples
Historical
Some historical ideas that are considered to have been refuted by mainstream science are:
Wilhelm Reich's work with orgone,
a physical energy he claimed to have discovered, contributed to his
alienation from the psychiatric community. He was eventually sentenced
to two years in a federal prison, where he died. Scientists disputed his claim to scientific evidence for the existence of orgone at the time, and they continue to do so today. Nevertheless, amateurs and a few fringe researchers continue to believe that orgone is real.
Focal infection theory
(FIT), as the primary cause of systemic disease, rapidly became
accepted by mainstream dentistry and medicine after World War I. This
acceptance was largely based upon what later turned out to be
fundamentally flawed studies. As a result, millions of people were
subjected to needless dental extractions and surgeries. The original studies supporting FIT began falling out of favor in the 1930s. By the late 1950s, it was regarded as a fringe theory.
The Clovis First
theory held that the Clovis culture was the first culture in North
America. It was long regarded as a mainstream theory until mounting
evidence of a pre-Clovis culture discredited it.
Modern
Relatively recent fringe sciences include:
Aubrey de Grey, featured in a 2006 60 Minutes special report, is studying human longevity. He calls his work "strategies for engineered negligible senescence" (SENS). Many mainstream scientists believe his research is fringe science (especially his view of the importance of nuclear epimutations and his timeline for antiaging therapeutics). In a 2005 article in Technology Review
(part of a larger series), it was stated that "SENS is highly
speculative. Many of its proposals have not been reproduced, nor could
they be reproduced with today's scientific knowledge and technology.
Echoing Myhrvold,
we might charitably say that de Grey's proposals exist in a kind of
antechamber of science, where they wait (possibly in vain) for
independent verification. SENS does not compel the assent of many
knowledgeable scientists; but neither is it demonstrably wrong."
A hypothesized nuclear fusion reaction called cold fusion, said to occur at or near room temperature and pressure, was reported by chemists Martin Fleischmann and Stanley Pons in March 1989. Numerous research efforts at the time were unable to replicate their results. Subsequently, several scientists have worked on cold fusion or have
participated in international conferences on it. In 2004, the United
States Department of Energy commissioned a panel on cold fusion to
reexamine the concept and determine whether the department's policies should be altered in light of new evidence.
The theory of abiogenic petroleum origin holds that petroleum was formed from deep carbon deposits, perhaps dating to the formation of the Earth. The ubiquity of hydrocarbons in the Solar System
may be evidence that there may be more petroleum on Earth than commonly
thought and that petroleum may originate from carbon-bearing fluids
that migrate upward from the Earth's mantle. Abiogenic hypotheses saw a
revival in the last half of the twentieth century by Russian and
Ukrainian scientists. More interest was generated in the West after the
1992 publication by Thomas Gold of the journal article, "The Deep, Hot Biosphere". Gold's version of the theory is partly based on the existence of a biosphere composed of thermophile bacteria in the Earth's crust, which might explain the existence of specific biomarkers in extracted petroleum.
Michael W. Friedlander has suggested some guidelines for responding to fringe science, which, he argues, is a more difficult problem than scientific misconduct.
His suggested methods include impeccable accuracy, checking cited
sources, not overstating orthodox science, thorough understanding of the
Wegener continental drift example, examples of orthodox science investigating radical proposals, and prepared examples of errors from fringe scientists.
Friedlander suggests that fringe science is necessary so
mainstream science will not atrophy. Scientists must evaluate the
plausibility of each new fringe claim, and certain fringe discoveries
"will later graduate into the ranks of accepted" — while others "will
never receive confirmation".
In her book Physics on the Fringe, Margaret Wertheim profiled many "outsider scientists" who receive little or no attention from professional scientists. She
describes all of them as trying to make sense of the world using the
scientific method but in the face of being unable to understand modern
science's complex theories. She also finds it fair that credentialed
scientists do not bother spending a lot of time learning about and
explaining problems with the fringe theories of uncredentialed
scientists since the authors of those theories have not taken the time
to understand the mainstream theories they aim to disprove.
Controversies
As Donald E. Simanek
asserts, "Too often speculative and tentative hypotheses of cutting
edge science are treated as if they were scientific truths, and so
accepted by a public eager for answers." However, the public is often unaware that "As science progresses from ignorance to understanding it must pass through a transitional phase of confusion and uncertainty."
The media also play a role in propagating the belief that certain
fields of science are controversial. In their 2003 paper "Optimising
Public Understanding of Science and Technology in Europe: A Comparative
Perspective", Jan Nolin et al. write that "From a media
perspective it is evident that controversial science sells, not only
because of its dramatic value, but also since it is often connected to
high-stake societal issues."
Nakedness and clothing use are characteristics of humans related by evolutionary and social prehistory. The major loss of body hair distinguishes humans from other primates. Current evidence indicates that anatomically-modern humans were naked in prehistory for at least 90,000 years before they invented clothing. Today, isolated Indigenous peoples in tropical climates continue to be without clothing in many everyday activities.
Evolution of hairlessness
Humans' closest living relatives, such as chimpanzees and bonobos, have both extensive areas of fur and bare patches.
The general hairlessness of humans in comparison to related species may be due to loss of functionality in the pseudogene called KRT41P (which helps produce keratin) in the human lineage about 240,000 years ago. On an individual basis, mutations in the gene HR can lead to complete hair loss, though this is not typical in humans. Humans may also lose their hair as a result of hormonal imbalance due to drugs or pregnancy.
To comprehend why humans have significantly less body hair than
other primates, one must understand that mammalian body hair is not
merely an aesthetic characteristic; it protects the skin from wounds,
bites, heat, cold, and ultraviolet radiation. Additionally, it can be used as a communication tool and as camouflage.
The first member of the genus Homo to be hairless was Homo erectus, originating about 1.6 million years ago. The dissipation of body heat remains the most widely accepted
evolutionary explanation for the loss of body hair in early members of
the genus Homo, the surviving member of which is modern humans. Less hair and an increase in sweat glands made it easier for their bodies to cool when they moved from living in shady forest to open savanna. This change in environment also resulted in a change in diet, from largely vegetarian to hunting-gathering. Pursuing game on the savanna also increased the need for regulation of body heat.
The anthropologist and paleobiologist Nina Jablonski posits that the ability to dissipate excess body heat through eccrine sweating helped make possible the dramatic enlargement of the brain, the most temperature-sensitive organ in the human body. Thus the loss of fur was also a factor in further adaptations, both
physical and behavioral, that differentiated humans from other primates.
Some of these changes are thought to be the result of sexual selection: by selecting more hairless mates, humans accelerated changes initiated by natural selection. Sexual selection may also account for the remaining human hair in the pubic area and armpits, which are sites for pheromones, while hair on the head continued to provide protection from the sun. Anatomically-modern humans, whose traits include hairlessness, evolved 260,000 to 350,000 years ago.
Phenotypic changes
Humans are the only primate species to have undergone significant hair loss, and of the approximately 5000 extant species of mammal, only a handful are effectively hairless. This list includes elephants, rhinoceroses, hippopotamuses, walruses, some species of pigs, whales and other cetaceans, and naked mole rats. Most mammals have light skin that is covered by fur, and biologists
believe that early human ancestors started out this way also. Dark skin
probably evolved after humans lost their body fur, because the naked
skin was vulnerable to the strong UV radiation as explained in the Out of Africa hypothesis.
Therefore, evidence of the time when human skin darkened has been used
to date the loss of human body hair, assuming that the dark skin was
needed after the fur was gone.
With the loss of fur, darker, high-melanin skin evolved as a protection from ultraviolet radiation damage. As humans migrated outside of the tropics, varying degrees of depigmentation evolved in order to permit UVB-induced synthesis of previtamin D3. The relative lightness of female compared to male skin in a given
population may be due to the greater need for women to produce more
vitamin D during lactation.
The sweat glands in humans could have evolved to spread from the
hands and feet as the body hair changed, or the hair change could have
occurred to facilitate sweating. Horses
and humans are two of the few animals capable of sweating on most of
their body, yet horses are larger and still have fully developed fur. In
humans, the skin hairs lie flat in hot conditions, as the arrector pili muscles relax, preventing heat from being trapped by a layer of still air between the hairs, and increasing heat loss by convection.
Sexual selection hypothesis
Another hypothesis for the pattern of body hair on humans proposes that Fisherian runaway sexual selection played a role (as well as in the selection of long head hair; see terminal and vellus hair), along with a much larger role of testosterone in men. Sexual selection is the only theory thus far that explains the sexual dimorphism seen in the hair patterns of men and women. On average, men have more body hair than women. Males have more terminal hair, especially on the face, chest, abdomen, and back, and females have more vellus hair, which is less visible. The halting of hair development at a juvenile stage, vellus hair, would also be consistent with the neoteny evident in humans, especially in females, and thus these changes could have occurred at the same time. This theory, however, rests heavily on present-day cultural norms. There is no evidence that sexual selection would proceed to such a drastic extent over a million years ago, when a full, lush coat of hair would most likely indicate health and would therefore be more likely to be selected for, not against.
Water-dwelling hypothesis
The aquatic ape hypothesis
(AAH) includes hair loss as one of several characteristics of modern
humans that could indicate adaptations to an aquatic environment.
Contemporary anthropologists give serious consideration to some hypotheses related to the AAH, but hair loss is not one of them.
Parasite hypothesis
A divergent explanation of humans' relative hairlessness holds that ectoparasites (such as ticks) residing in fur became problematic as humans became hunters living in larger groups with a "home base". Nakedness would also make the lack of parasites apparent to prospective mates. However, this theory is inconsistent with the abundance of parasites
that continue to exist in the remaining patches of human hair.
The "ectoparasite" explanation of modern human nakedness is based
on the principle that a hairless primate would harbor fewer parasites.
When our ancestors adopted group-dwelling social arrangements roughly
1.8 Mya
(million years ago), ectoparasite loads increased dramatically. Early
humans became the only one of the 193 primate species to have fleas,
which can be attributed to the close living arrangements of large
groups of individuals. While other primate species have communal sleeping arrangements, these groups are always on the move and thus are less likely to harbor ectoparasites. Humans nevertheless have as many follicles as other primates, but the hair is shorter and finer. This "peach fuzz" may have been retained because it both allows humans to detect the presence of ectoparasites and inhibits their movement on the skin.
It was expected that dating the split of the ancestral human louse into two species, the head louse and the pubic louse,
would date the loss of body hair in human ancestors. However, it turned
out that the human pubic louse does not descend from the ancestral
human louse, but from the gorilla louse,
diverging 3.3 million years ago. This suggests that humans had lost
body hair (but retained head hair) and developed thick pubic hair prior
to this date, were living in or close to the forest where gorillas
lived, and acquired pubic lice from butchering gorillas or sleeping in
their nests.[26][27] The evolution of the body louse from the head louse, on the other hand, places the date of clothing much later, some 100,000 years ago.
A necklace reconstructed from perforated sea snail shells from Upper Palaeolithic Europe, dated between 39,000 and 25,000 BCE. The practice of body adornment is associated with the emergence of behavioral modernity.
A 2010 study published in Molecular Biology and Evolution indicates that the habitual wearing of clothing began at some point in time between 83,000 and 170,000 years ago based upon a genetic analysis indicating when clothing lice
diverged from their head louse ancestors. That information suggests
that the use of clothing likely originated with anatomically-modern
humans in Africa prior to their migration to colder climates, which made the migration possible.
Some of the technology for what is now called clothing may have originated to make other types of adornment, including jewelry, body paint, tattoos, and other body modifications, "dressing" the naked body without concealing it. According to Mark Leary and Nicole R. Buttermore, body adornment is one of the changes that occurred in the late Paleolithic (40,000 to 60,000 years ago) in which humans became not only anatomically modern but also behaviorally modern, and capable of self-reflection and symbolic interaction. More recent studies place the use of adornment at 77,000 years ago in South Africa, and 90,000–100,000 years ago in Palestine and Algeria. While modesty may be a factor, often overlooked purposes for body
coverings are camouflage used by hunters, body armor, and costumes used
to impersonate "spirit-beings".
The origin of complex, fitted clothing required the invention of fine stone knives for cutting skins into pieces, and of the eyed needle for sewing. This was done by Cro-Magnons, who migrated to Europe around 35,000 years ago. Neanderthals occupied the same region but became extinct, in part, because they could not make fitted garments; judging from their simple stone tools, they draped themselves with crudely cut skins, which did not provide the warmth needed to survive as the climate grew colder in the Last Glacial Period. In addition to being less functional, the simple wrappings would not have been habitually worn by Neanderthals, who were more tolerant of the cold than Homo sapiens, and so would not have acquired the secondary functions of decoration and promoting modesty.
The earliest archeological evidence of fabric clothing is inferred from representations in figurines in the southern Levant, dated between 11,700 and 10,500 years ago. The surviving examples of woven cloth are linen from Egypt dated 5000 BCE, while knotted or twisted flax fibers have been found as early as 7000 BCE.
Adults are rarely completely naked in modern societies and cover
at least their genitals, but adornments and clothing often emphasize,
enhance, or otherwise call attention to the sexuality of the body.
Behavioral modernity refers to a suite of behavioral and cognitive traits associated with humans (Homo sapiens), reflecting capacities such as abstract and symbolic thought, planning depth, cumulative culture, and complex social learning. These traits are often inferred archaeologically through evidence including symbolic artifacts (e.g., art, ornamentation), ritualized behavior, music and dance, sophisticated hunting strategies, and advanced lithic technologies such as blade production. Rather than representing an absolute boundary between Homo sapiens and other hominins,
behavioral modernity is increasingly understood as a mosaic of traits
that emerged gradually and were expressed variably across time and
populations.
Evolution
The Venus of Hohle Fels figurine was carved about 40,000 years ago and is a product of behavioral modernity
Anatomically modern humans possessed much of the necessary neural architecture
by at least ~300 thousand years ago, but early populations were small
and fragmented, limiting the persistence and transmission of complex
behaviors. As a result, archaeological signals of symbolism, art, and advanced
technologies appear sporadically in Africa between ~150–75 kya,
reflecting intermittent expression, even though the cognitive capacity
was probably present. Widespread, continuous manifestations of these behaviors became
archaeologically visible only after populations grew denser and social
networks expanded, appearing across continents; one instance is the Upper Paleolithic in Europe.
In this view, behavioral modernity is primarily cultural and
learned, shaped by high-fidelity social learning, cumulative culture,
and demographic thresholds, while resting on an evolved cognitive substrate that predates its full material expression. Differences between Homo sapiens
and other hominins are therefore understood as differences of degree,
stability, and cultural accumulation, not the presence or absence of a
single cognitive mutation.
Underlying these behaviors and technological innovations are
cognitive and cultural foundations that have been documented
experimentally and ethnographically by evolutionary and cultural anthropologists. These human universal patterns include cumulative cultural adaptation, social norms, language, and extensive help and cooperation beyond close kin.
Within the tradition of evolutionary anthropology and related
disciplines, it has been argued that the development of these modern
behavioral traits, in combination with the climatic conditions of the Last Glacial Period and Last Glacial Maximum causing population bottlenecks, contributed to the evolutionary success of Homo sapiens worldwide relative to Neanderthals, Denisovans, and other archaic humans.
There are many other hypotheses on the evolution of behavioral
modernity. These approaches tend to fall into two camps, cognitive and
gradualist:
The late Upper Paleolithic
model hypothesizes that modern human behavior arose through cognitive,
genetic changes in Africa abruptly around 40,000–50,000 years ago around
the time of the Out-of-Africa migration,
dubbed the "cognitive revolution" or the "Upper Paleolithic
revolution", prompting the movement of some modern humans out of Africa
and across the world.
Gradualist
models focus on how modern human behavior may have arisen through
gradual steps, with the archaeological signatures of such behavior
appearing only through demographic or subsistence-based changes. Many
cite evidence of behavioral modernity earlier (by at least about
150,000–75,000 years ago and possibly earlier), namely in the African Middle Stone Age. Anthropologists Sally McBrearty and Alison S. Brooks
have been notable proponents of gradualism—challenging Europe-centered
models by situating more change in the African Middle Stone Age—though
this model is more difficult to substantiate due to the general thinning
of the fossil record further back in time.
Definition
A Māori man performing haka,
a ceremonial dance. He is displaying several hallmarks of behavioral
modernity including the use of jewelry, application of body paint, music
and dance, and symbolic behavior.
To classify what should be included in modern human behavior, it is
necessary to define behaviors that are universal among living human
groups. Some examples of these human universals are abstract thought,
planning, trade, cooperative labor, body decoration, and the control
and use of fire. Along with these traits, humans rely heavily on social learning. This cumulative cultural change, or cultural "ratchet", separates human culture from social learning in animals.
In addition, a reliance on social learning may be responsible in part
for humans' rapid adaptation to many environments outside of Africa.
Since cultural universals are found in all cultures, including isolated
indigenous groups, these traits must have evolved or have been invented
in Africa prior to the exodus.
Archaeologically, a number of empirical traits have been used as indicators of modern human behavior. While these are often debated, a few are generally agreed upon as archaeological evidence of behavioral modernity.
Several critiques have been leveled against the traditional concept of
behavioral modernity, both methodologically and philosophically. Anthropologist John Shea
outlines a variety of problems with this concept, arguing instead for
"behavioral variability", which, according to the author, better
describes the archaeological record. The use of trait lists, according
to Shea, runs the risk of taphonomic
bias, where some sites may yield more artifacts than others despite
similar populations; as well, trait lists can be ambiguous in how
behaviors may be empirically recognized in the archaeological record. In particular, Shea cautions that population pressure, cultural change, or optimality models, like those in human behavioral ecology, might better predict changes in tool types or subsistence strategies than a change from "archaic" to "modern" behavior. Some researchers argue that a greater emphasis should be placed on
identifying only those artifacts which are unquestionably, or purely,
symbolic as a metric for modern human behavior.
Since 2018, dating methods applied to various cave art sites in Spain and France
have shown that Neanderthals performed symbolic artistic expression,
consisting of red "lines, dots, and hand stencils" found in caves, prior
to contact with anatomically modern humans. This is contrary to
previous suggestions that Neanderthals lacked these capabilities.
Hypotheses and models
Late Upper Paleolithic model or "Upper Paleolithic Revolution"
The Late Upper Paleolithic Model, or Upper Paleolithic Revolution, refers to the idea that, though anatomically modern humans
first appear around 150,000 years ago (as was once believed), they were
not cognitively or behaviorally "modern" until around 50,000 years ago,
leading to their expansion out of Africa and into Europe and Asia. Proponents of this model note that traits used as a metric for behavioral modernity do not appear as a package until around 40,000–50,000 years ago.
Anthropologist Richard Klein specifically notes that evidence of fishing, tools made from bone,
hearths, significant artifact diversity, and elaborate graves are all
absent before this point. According to both Shea and Klein, art only becomes common beyond this
switching point, signifying a change from archaic to modern humans. Most researchers argue that a neurological or genetic change, perhaps one enabling complex language, such as FOXP2, caused this revolutionary change in humans. The role of FOXP2 as a driver of evolutionary selection has been called into question following recent research results.
The African Middle Stone Age period gives us some of the earliest evidence of behavioral modernity. In Southern Africa, groups of people would bypass closer deposits of rich, deep-red ochre in favor of mining more distant ones. After mining, pieces of ochre would show evidence of grinding to make a powder, which indicates that ochre was being ground for a reason other than simply for decoration. In North Africa, similar early evidence of behavioral modernity can be seen in the 82,000-year-old Taforalt cave, where marine shells were perforated for the construction of necklaces. Notably, the cave lies far inland, which implies the presence of a trade network in the area through which people were able to acquire shells of the coastal Nassarius species. The beads show evidence of wear, indicating that they were worn as personal ornaments, another marker of behavioral modernity. In Africa, the evidence thus presents behavioral modernity in the form of symbolism, personal ornamentation, and trade in necklace beads, all occurring tens of thousands of years before these behaviors appeared in Europe. These results dispute older models that placed the origin of modern behavior in the Upper Paleolithic period and assumed a 'sudden' appearance of these behaviors.
Building on the FOXP2 gene hypothesis, cognitive scientist Philip Lieberman
has argued that proto-language behaviour existed prior to 50,000 BP,
albeit in a more primitive form. Lieberman has advanced fossil evidence,
such as neck and throat dimensions, to demonstrate that so-called
"anatomically modern" humans from 100,000 BP continued to evolve their
SVT (supralaryngeal vocal tract), which already possessed a horizontal
portion (SVTh) capable of producing many phonemes which were mostly
consonants. According to his hypothesis, Neanderthals and early Homo sapiens would have been able to communicate using sounds and gestures.
From 100,000 BP, Homo sapiens necks continued to lengthen to a point, by around 50,000 BP, where Homo sapiens
necks were long enough to accommodate a vertical portion to their SVT
(SVTv), which is now a universal trait among humans. This SVTv enabled
the enunciation of quantal vowels:
[i]; [u]; and [a]. These quantal vowels could then be immediately put
to use by the already sophisticated neuro-motor-control features of the
FOXP2 gene to generate more nuanced sounds and in effect increase by
orders of magnitude the number of distinct sounds that can be produced,
allowing for fully symbolic language.
Goody (1986) draws an analogy between the development of spoken language and that of writing: the shift from pictographic or ideographic symbols into a fully abstract logographic writing system (such as hieroglyphs), or from a logographic system into an abjad or alphabet, led to dramatic changes in human civilization.
Contrasted with this view of a spontaneous leap in cognition among ancient humans, some anthropologists like Alison S. Brooks,
primarily working in African archaeology, point to the gradual
accumulation of "modern" behaviors, starting well before the 50,000-year
benchmark of the Upper Paleolithic Revolution models. Howiesons Poort, Blombos,
and other South African archaeological sites, for example, show
evidence of marine resource acquisition, trade, the making of bone
tools, blade and microlithic technology, and abstract ornamentation at least by 80,000 years ago. Given evidence from Africa and the Middle East, a variety of hypotheses
have been put forth to describe an earlier, gradual transition from
simple to more complex human behavior. Some authors have pushed back the
appearance of fully modern behavior to around 80,000 years ago or
earlier in order to incorporate the South African data.
Others focus on the slow accumulation of different technologies
and behaviors across time. These researchers describe how anatomically
modern humans could have been cognitively the same, and what we define
as behavioral modernity is just the result of thousands of years of
cultural adaptation and learning. Archaeologist Francesco d'Errico, and others, have looked at Neanderthal culture, rather than early human behavior exclusively, for clues into behavioral modernity. Noting that Neanderthal assemblages often portray traits similar to
those listed for modern human behavior, researchers stress that the
foundations for behavioral modernity may, in fact, lie deeper in our hominin ancestors. If both modern humans and Neanderthals express abstract art and complex tools, then "modern human behavior" cannot be a derived trait for our
species. They argue that the original "human revolution" hypothesis
reflects a profound Eurocentric bias. Recent archaeological evidence,
they argue, proves that humans evolving in Africa some 300,000 or even
400,000 years ago were already becoming cognitively and behaviourally
"modern". These features include blade and microlithic technology, bone
tools, increased geographic range, specialized hunting, the use of
aquatic resources, long-distance trade, systematic processing and use of
pigment, and art and decoration. These items do not occur suddenly
together as predicted by the "human revolution" model, but at sites that
are widely separated in space and time. This suggests a gradual
assembling of the package of modern human behaviours in Africa, and its
later export to other regions of the Old World.
Between these extremes is the view—currently supported by archaeologists Chris Henshilwood, Curtis Marean, Ian Watts and others—that there was indeed some kind of "human revolution" but
that it occurred in Africa and spanned tens of thousands of years. The
term "revolution," in this context, would mean not a sudden mutation but
a historical development along the lines of the industrial revolution or the Neolithic revolution. In other words, it was a relatively accelerated process, too rapid for
ordinary Darwinian "descent with modification" yet too gradual to be
attributed to a single genetic or other sudden event. These
archaeologists point in particular to the relatively explosive emergence
of ochre crayons and shell necklaces, apparently used for cosmetic
purposes. These archaeologists see symbolic organisation of human social
life as the key transition in modern human evolution. Recently
discovered at sites such as Blombos Cave and Pinnacle Point, South
Africa, pierced shells, pigments and other striking signs of personal
ornamentation have been dated within a time-window of 70,000–160,000
years ago in the African Middle Stone Age, suggesting that the emergence
of Homo sapiens coincided, after all, with the transition to modern cognition and behaviour. While viewing the emergence of language as a "revolutionary"
development, this school of thought generally attributes it to
cumulative social, cognitive and cultural evolutionary processes as
opposed to a single genetic mutation.
A further view, taken by archaeologists such as Francesco d'Errico and João Zilhão, is a multi-species perspective arguing that evidence for symbolic culture, in the form of utilised pigments and pierced shells, is also found in Neanderthal sites, independently of any "modern" human influence.
Cultural evolutionary models may also shed light on why, although evidence of behavioral modernity exists before 50,000 years ago, it is
not expressed consistently until that point. With small population
sizes, human groups would have been affected by demographic and cultural
evolutionary forces that may not have allowed for complex cultural
traits. According to some authors, until population density became significantly high, complex traits
could not have been maintained effectively. Some genetic evidence
supports a dramatic increase in population size before human migration
out of Africa. High local extinction rates within a population also can significantly
decrease the amount of diversity in neutral cultural traits, regardless
of cognitive ability.
Africa
Research from 2017 indicates that Homo sapiens originated in Africa between around 350,000 and 260,000 years ago. There is some evidence for the beginning of modern behavior among early African H. sapiens around that period.
Before the Out of Africa theory
was generally accepted, there was no consensus on where the human
species evolved and, consequently, where modern human behavior arose.
Now, however, African archaeology has become extremely important in
discovering the origins of humanity. The first Cro-Magnon expansion into Europe around 48,000 years ago is generally accepted as already "modern", and it is now generally believed that behavioral modernity appeared in
Africa before 50,000 years ago, either significantly earlier or possibly as a late Upper Paleolithic "revolution" shortly beforehand that prompted the migration out of Africa.
A variety of evidence of abstract imagery, widened subsistence
strategies, and other "modern" behaviors have been discovered in Africa,
especially South, North, and East Africa. The Blombos Cave site in South Africa, for example, is famous for rectangular slabs of ochre engraved with geometric designs. Using multiple dating techniques, the site was dated to around 77,000 and 100,000–75,000 years old. Ostrich egg shell containers engraved with geometric designs dating to 60,000 years ago were found at Diepkloof, South Africa. Beads and other personal ornamentation have been found from Morocco
which might be as much as 130,000 years old; as well, the Cave of
Hearths in South Africa has yielded a number of beads dating from
significantly prior to 50,000 years ago, and shell beads dating to about 75,000 years ago have been found at Blombos Cave, South Africa.
Specialized projectile weapons have also been found at various sites in Middle Stone Age Africa, including bone and stone arrowheads dating to approximately 72,000–60,000 years ago at South African sites such as Sibudu Cave (where an early bone needle was also found), some of which may have carried poisons, and bone harpoons at the Central African site of Katanda dating to about 90,000 years ago. Traces of toxic plant alkaloids have been found on microlithic arrowheads in KwaZulu-Natal, South Africa, dated to 60,000 years ago. Evidence also exists for the systematic heat treating of silcrete stone to increase its flakeability for toolmaking, beginning approximately 164,000 years ago at the South African site of Pinnacle Point and becoming common there for the creation of microlithic tools at about 72,000 years ago.
In 2008, an ochre processing workshop likely for the production
of paints was uncovered dating to c. 100,000 years ago at Blombos Cave,
South Africa. Analysis shows that a liquefied pigment-rich mixture was
produced and stored in two abalone shells, and that ochre, bone,
charcoal, grindstones, and hammer-stones also formed a composite part of
the toolkits. Evidence for the complexity of the task includes
procuring and combining raw materials from various sources (implying
they had a mental template of the process they would follow), possibly
using pyrotechnology to facilitate fat extraction from bone, using a
probable recipe to produce the compound, and the use of shell containers
for mixing and storage for later use. Modern behaviors, such as the making of shell beads, bone tools and
arrows, and the use of ochre pigment, are evident at a Kenyan site by
78,000–67,000 years ago. Evidence of early stone-tipped projectile weapons (a characteristic tool of Homo sapiens), the stone tips of javelins or throwing spears, was discovered in 2013 at the Ethiopian site of Gademotta, and dates to around 279,000 years ago.
Expanding subsistence strategies beyond big-game hunting and the
consequential diversity in tool types has been noted as signs of
behavioral modernity. A number of South African sites have shown an
early reliance on aquatic resources from fish to shellfish. Pinnacle Point,
in particular, shows exploitation of marine resources as early as
120,000 years ago, perhaps in response to more arid conditions inland. Establishing a reliance on predictable shellfish deposits, for example,
could reduce mobility and facilitate complex social systems and
symbolic behavior. Blombos Cave and Site 440 in Sudan both show evidence
of fishing as well. Taphonomic changes in fish skeletons from Blombos Cave have been interpreted as evidence of the capture of live fish, clearly an intentional human behavior.
Humans in North Africa (Nazlet Sabaha, Egypt) are known to have dabbled in chert mining, as early as ≈100,000 years ago, for the construction of stone tools.
Evidence was found in 2018, dating to about 320,000 years ago, at the Kenyan site of Olorgesailie,
of the early emergence of modern behaviors including: long-distance
trade networks (involving goods such as obsidian), the use of pigments,
and the possible making of projectile points. The authors of three 2018 studies on the site observe that the evidence of these behaviors is approximately contemporary with the earliest known Homo sapiens fossil remains from Africa (such as at Jebel Irhoud and Florisbad),
and they suggest that complex and modern behaviors had already begun in
Africa around the time of the emergence of anatomically modern Homo sapiens.
In 2019, further evidence of early complex projectile weapons in
Africa was found at Aduma, Ethiopia, dated 100,000–80,000 years ago, in
the form of points considered likely to belong to darts delivered by
spear throwers.
Olduvai Hominid 1 wore facial piercings.
Europe
While traditionally described as evidence for the later Upper Paleolithic Model, European archaeology has shown that the issue is more complex. A
variety of stone tool technologies are present at the time of human
expansion into Europe and show evidence of modern behavior. Despite the
problems of conflating specific tools with cultural groups, the Aurignacian tool complex, for example, is generally taken as a purely modern human signature. The discovery of "transitional" complexes, like "proto-Aurignacian",
have been taken as evidence of human groups progressing through "steps
of innovation". If, as this might suggest, human groups were already migrating into
eastern Europe around 40,000 years and only afterward show evidence of
behavioral modernity, then either the cognitive change must have
diffused back into Africa or was already present before migration.
In light of a growing body of evidence of Neanderthal culture and
tool complexes some researchers have put forth a "multiple species
model" for behavioral modernity. Neanderthals were often cited as being an evolutionary dead-end, apish
cousins who were less advanced than their human contemporaries. Personal
ornaments were dismissed as trinkets or poor imitations compared to the
cave art produced by H. sapiens. Despite this, European evidence
has shown a variety of personal ornaments and artistic artifacts
produced by Neanderthals; for example, the Neanderthal site of Grotte du Renne has produced grooved bear, wolf, and fox incisors, ochre and other symbolic artifacts. Although few and controversial, circumstantial evidence of Neanderthal ritual burials has been uncovered. There are two options to describe this symbolic behavior among
Neanderthals: they copied cultural traits from arriving modern humans or
they had their own cultural traditions comparable with behavioral
modernity. If they just copied cultural traditions, which is debated by
several authors, they still possessed the capacity for complex culture described by
behavioral modernity. As discussed above, if Neanderthals also were
"behaviorally modern" then it cannot be a species-specific derived
trait.
Asia
Most debates surrounding behavioral modernity have focused on Africa or Europe, but an increasing amount of attention has been paid to East Asia. This region offers a unique opportunity to test hypotheses of
multi-regionalism, replacement, and demographic effects. Unlike Europe, where initial migration occurred around 50,000 years
ago, human remains have been dated in China to around 100,000 years ago. This early evidence of human expansion calls into question behavioral modernity as an impetus for migration.
Stone tool technology is particularly of interest in East Asia. Following Homo erectus migrations out of Africa, Acheulean technology never seems to appear beyond present-day India and into China. Analogously, Mode 3, or Levallois technology, is not apparent in China following later hominin dispersals. This lack of more advanced technology has been explained by serial founder effects and low population densities out of Africa. Although tool complexes comparative to Europe are missing or
fragmentary, other archaeological evidence shows behavioral modernity.
For example, the peopling of the Japanese archipelago offers an
opportunity to investigate the early use of watercraft. Although one
site, Kanedori in Honshu, does suggest the use of watercraft as early as
84,000 years ago, there is no other evidence of hominins in Japan until
50,000 years ago.
The Zhoukoudian
cave system near Beijing has been excavated since the 1930s and has
yielded precious data on early human behavior in East Asia. Although
disputed, there is evidence of possible human burials and interred remains in the cave dated to around 34,000–20,000 years ago. These remains have associated personal ornaments in the form of beads
and worked shell, suggesting symbolic behavior. Along with possible
burials, numerous other symbolic objects like punctured animal teeth and
beads, some dyed in red ochre, have all been found at Zhoukoudian. Although fragmentary, the archaeological record of eastern Asia shows evidence of behavioral modernity before 50,000 years ago but, like the African record, it is not fully apparent until that time.
Map of Southwest Asia showing the main archaeological sites of the Pre-Pottery Neolithic period, c. 7500 BCE, in the "Fertile Crescent". Black squares indicate pre-agricultural sites.
The Neolithic Revolution, also known as the First Agricultural Revolution, was the wide-scale transition of many human cultures during the Neolithic period from the egalitarian lifestyle of nomadic and semi-nomadic hunter-gatherers to one of agriculture, settlement, establishment of cross-group organisations, population growth and increasing social differentiation.
Archaeological data indicate that the food producing domestication
of some types of wild animals and plants happened independently in
separate locations worldwide, starting in Mesopotamia after the end of
the last Ice Age,
around 11,700 years ago. The climate became warmer, and vast areas were
flooded due to the relatively sudden rise in sea levels—an event that
some scientists consider the basis of the widespread myths of a catastrophic flood caused by gods. Between 12,000 and 6,000 BC, the coastline was thrust inland by up to
1,000 km, leading to the traces typical of the Neolithic period: a
relatively higher population density in the reduced areas; onset and
intensification of agriculture; rise in birth rate; and deterioration in
general health.
The introduction of agriculture also implied an increase in hard labor (cf. Atra-Hasis) and a significant loss of access to high-quality food compared to what was previously available through hunting and foraging. Nevertheless, many researchers argue that the production of calorie-rich crops
allowed humans to invest their efforts in other activities, describing
it as "ultimately necessary to the rise of modern civilization". A minority of scientists take a critical stance toward this optimistic
view. They consider that since the dawn of agriculture, a reciprocal
relationship may have been initiated whereby more and more people need
to be fed by ever larger areas of cultivated land, and that this process
must be stabilized at a level that prevents the collapse of global
ecosystems (see The Limits to Growth).
The social forms of human co-existence before and since the
beginning of the Neolithic Revolution, the features of political
organisations as well as those of agriculture, the sequence of their
emergence, and empirical interrelations at sites like the megalithic
monuments at Göbekli Tepe
are the subject of current interdisciplinary research and debate. In
anthropology, it is generally assumed that the relatively rapid evolution of the brain toward that of Homo sapiens is a crucial prerequisite for all these cultural achievements, namely adaptive measures that were consciously introduced (supplemented by their cumulative transmission through learning across subsequent generations) to compensate for food shortages and other adverse circumstances, and that therefore do not occur to such an extent, if at all, among our closest relatives in the animal kingdom.
In particular, chimpanzees are unable to establish cross-group
organizations—probably the most challenging of all Neolithic measures.
These achievements did not all occur at exactly the same time; they varied depending on environmental factors (local flora and fauna; climate) and the ingenuity of the groups responsible, and were sometimes even abandoned, with groups reverting to humanity's original way of life when the environment improved accordingly. Only gradually did the general trend emerge: increasing intensification of various technologies and consolidation of political alliances, in which unchecked population growth (see its mythical connection with the Flood catastrophe) became the most decisive factor. Despite this pattern, the complexity of this process cannot be explained as a strictly linear development; rather, the emergence of Neolithic cultures appears to be governed by the trial-and-error principle of Darwinian selection, in which the most economical solution finally prevails, assimilating or displacing all others.
Overview
The Neolithic Revolution encompassed much more than just the
introduction of various food production techniques. Cultivating large
areas of land and erecting monumental works of art such as those at Göbekli Tepe required a level of labour that the small groups of nomadic hunter-gatherers who had hitherto dominated human prehistory could hardly have achieved on their own. Modern scientists such as Klaus Schmidt
therefore assume that the period under discussion was also marked by
the establishment of cross-group organizations. Small communities that
had previously lived autonomously and often in competition with each
other decided instead to cooperate, forming the first alliances, some of which may have decided to settle down and build permanent villages close to their agricultural lands. In the following millennia, the most successful among them would have grown into city-states like Shuruppak, mentioned in humanity's oldest written documents. These societies radically altered their natural environment through animal breeding, deforestation, cultivation of certain crops and irrigation. Other developments that began to spread included pottery, polished stone tools,
and the change from round to rectangular dwellings. In many regions,
agriculture enabled the production of food surpluses, which in turn
resulted in rapid population growth, a phenomenon known as the Neolithic demographic transition.
These developments are sometimes called the Neolithic package. Together with the earliest political alliances, they form the backdrop to an increasing division of labour, leading to the emergence of centralised administrations and specialised crafts, followed by hierarchical ideologies. In turn, there was an expansion of trade and military operations, development of depersonalised systems of knowledge (e.g. writing), and aggregation of property and architecture in densely populated settlements, whose often monumental art primarily proclaims the power of the founders, depicting them as gods.
Three of the monumental artworks from Göbekli Tepe, each of which appears to depict a group of about 11 men. Belonging to the youngest of the roughly 40 monuments in total, they are located right at the top of an artificial hill, a so-called tell. The tell was erected over the course of at least 1,500 years (approx. 50 generations), layer by layer like a tower towards heaven, so the oldest of these representations form its foundation.
Among the oldest known large-scale art in human history, erected between approximately 9,500 BP
and at least 8,000 BP in northern Mesopotamia, are the numerous
circular formations at Göbekli Tepe. Each of these monuments consists of
a group of about eleven megalithic pillars, which, due to their distinctive features, are interpreted as symbolic male figures.
The earliest written records, dated to c. 6,500 BP, originate from the Sumerian civilisation, which reached the Bronze Age and emerged also in the Fertile Crescent. Initially, the records exclusively documented quantities of foodstuffs to be delivered, often signed with impressions of cylinder seals. Over the millennia, these simple signs were developed into a complex cuneiform
script, enabling both the recording of myths (which until then had been
passed down only orally) and the beginning of a more or less realistic
approach to historiography. See, for example, the Atra-Hasis epic's account of Mesopotamia's task-specific transformation into a fertile garden landscape by essentially three groups of gods, as well as the list of rulers, allegedly spanning some millennia from before to after the Flood, that appears in this context.
Background
Battle
from Les Dogues, c. 5800 BC. Armed conflicts between two parties of
hunters as shown here (the top group of 11 men appears to be superior
in a sophisticated defensive formation) illustrate behaviour that male
humans are typically forced into when the sources of life are severely
scarce. Cross-group organisations (politics) and agriculture counteract
this, each in its own way.
Prehistoric hunter-gatherers had different subsistence requirements and lifestyles from agriculturalists. They lived in relatively small groups that were mostly very mobile (migratory), built only temporary shelters, and had limited contact with foreign communities. The self-sufficient
economy of such groups explains their mutual competition for available
resources. Territorial conflicts, as sometimes documented by the actors
themselves, were therefore not uncommon in human prehistory, but
Aristotle already assumed that humans naturally possessed the ability to
form political alliances. Their highly evolved reasoning allowed
Neolithic hunter-gatherer groups to cooperate with foreign communities
based on an understanding of the advantages of such measures – and this
"much earlier than science had previously believed." (Klaus Schmidt).
Agriculture is another achievement of our reason. While human intelligence here deals merely with the handling of other species in order to use them as food or beasts of burden, the establishment of political
organisations entails the challenging task of learning to collaborate
with alien groups of one's own kind, which can become far more dangerous
to each other than any other predator. Various species demonstrate the ability to engage in a form of agricultural domestication
(cf. aphid-herding ants), but only humans are capable of concluding
treaties to regulate the coexistence of participating groups (cf.
Sumer's Tablets of Destiny). In the event of a breach of such contracts, deadly conflicts threaten to erupt, as the struggles of our closest relatives in the primate kingdom show in a frighteningly human way.[21] Without our highly developed thinking skills—deeply connected with the ability to exchange and coordinate ideas by using articulated sounds
between the members of a group—they have no choice but to follow their
territorial urge to fight. When a group that has grown too large splits into two parties and there is no space for one of them to migrate and conquer a new territory of its own, the aggressive energy intended for this purpose begins to discharge itself as a 'war' between them, inevitably continuing until the weaker male party is completely wiped out. In
analogous situations (local overpopulation), humans are presented with
an option that has not existed in evolution until now. Due to their
heightened consciousness, opposing groups of men can choose to establish
treaties, agreeing to live in peace with their former enemies by
adhering to the agreed-upon rules, and sharing the resources of a
previously contested territory. In this regard, Aristotle's definition
of Homo sapiens as zoon politikon (political animal) remains justified to this day.
Agriculture and politics differ massively in terms of their content, which is why both were likely introduced independently of one another, even if they mostly merged in the course of further demographic
and civilisational development. Both represent adaptive measures
designed to compensate for two different adverse living conditions (mere
food shortages versus overpopulation crisis), without it being possible
to determine with certainty which was introduced first. The initiation of agriculture by an already existing political organisation is no more strictly necessary than the reverse, but it is entirely conceivable that the culture of nomadic herders could easily have originated from the idea of a first hunting group feeding captured young animals grass to tame them and let them grow, thereby ensuring a constant supply of living meat.
The creation of the first small gardens, which, by their nature, favour settlement over nomadic herding, would be a parallel innovation, likely initiated by other groups of hunter-gatherers. Both directions of agriculture seem to have clashed significantly over the available areas in Mesopotamia's steppe 'Eden'
(see also the biblical feud between Cain as the 'tiller of the ground'
and Abel as the provider of meat); nevertheless, they reached a
political agreement in the context of the later establishment of first
city-states.
According to current research, the first Neolithic Revolution
began in Mesopotamia about 11,600 years ago. From there, it expanded via
migration into immediately adjacent regions, displacing and/or
assimilating the local hunter-gatherer cultures. This process is called Neolithisation,
reaching northern Europe around 5500 BCE. Cross-group organisations
founded by egalitarian communities may have existed there even before
the introduction of agriculture, a conclusion that archaeologists like
K. Schmidt and C. Renfrew have drawn from their cognitive‑archaeological calculations of the man-hours required for the construction of desert kites
(kilometre‑long traps for catching entire wild animal herds) or
megalithic buildings above a specified scale. An example for northern
Europe is Stonehenge. The initially very simple structure of this
monument: a circular earthen rampart or henge that encompassed an open mortuary ground (referring to Renfrew a Cause Away Camp),
was repeatedly reconsidered, altered, and expanded over a period of
2000 years and more. Ultimately this creative process culminated in a
version for which two fundamentally different types of material were
used (sarsen, a sandstone gathered from the land surface, vs. bluestones of igneous rock from deeper origins)
to erect two pairs of formations, each pair identical in shape but
arranged concentrically in tiers, their menhirs also contrasting in
size, like mythical giants and blue 'dwarfs'.
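As a back-of-the-envelope illustration of the labour-estimate reasoning mentioned above, the following sketch uses assumed round figures (crew size per tonne, haul speed, stone count; none of them Schmidt's or Renfrew's published numbers) to show why transport labour alone is taken to imply cooperation across many groups:

```python
# Purely illustrative sketch of a megalith labour estimate. Every figure
# below is an assumption chosen for round numbers, not a published value
# from Schmidt or Renfrew.

STONE_MASS_T = 25        # assumed average sarsen mass, tonnes
HAUL_DISTANCE_KM = 30    # assumed haul distance from the source area
KM_PER_DAY = 1.0         # assumed sledge progress over rough ground
PULLERS_PER_TONNE = 10   # assumed hauling crew required per tonne
N_STONES = 30            # assumed number of large uprights

crew = STONE_MASS_T * PULLERS_PER_TONNE         # people needed per stone
days_per_stone = HAUL_DISTANCE_KM / KM_PER_DAY  # days to haul one stone
person_days = N_STONES * crew * days_per_stone  # transport labour only

print(f"crew per stone: {crew} people")
print(f"transport labour: {person_days:,.0f} person-days")
# ~225,000 person-days for hauling alone, far more than a single foraging
# band of a few dozen people could supply, hence the inference of
# cooperation across many groups.
```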
An archaeological interpretation suggests that these differences may symbolize two previously unrelated ethnic groups
that encountered one another in southern England and, after initial
conflicts, reached an agreement to unite under an overarching
organization. Viewed in this light, the final version of Stonehenge represents a
politically conceived work of art. It depicts two distinct populations
that jointly administer the area and use their monument for two main
purposes: internally as a gathering place, for example for council
meetings or ceremonial rituals (promoting social cohesion), and
externally as a means of intimidating surrounding rival tribes. (See
also Renfrew’s hypothesis of a mutual 'arms race' to explain the increasingly work-intensive megalithic structures built over time.)
The nucleus of this monument consists of two arc-shaped
formations which, unlike the two circles (which exclude their surroundings
equally on all sides), indicate a clear direction. The larger arc, with
its 10 supporting sarsens, numerically only half as strong but truly
gigantic, encompasses that of the blue 'dwarfs'; both are equipped with two
additional menhirs on the monument’s axis aiming at the sun on the
morning of the summer solstice, just as this heavenly god (cf. Helios; Aton; Shamash) begins to emerge from behind the horizon. These formations have been interpreted in various ways. Alongside the thesis
suggested by Thorpe, that the larger arc with its menhirs as symbolic men
could depict the leading team of a political organization (cf.
Poseidon’s 10 sons as the ruling group on Atlantis), the prevailing scholarly view is that the precise targeting of the
moment when the classical sky deity begins to lose his power may be
linked to a calendar marking the beginning of the harvest season in an
agrarian society. Renfrew supplements this picture with his assumption
that the creators of the megalithic monuments must have lived in
“egalitarian” group relationships—a thesis he bases on the finding that, on average, 9
men and 8 women died per generation (about 30 years) and were laid
together, without any indication of rank differences, in the large
communal burial mounds of southern England.
The amazing monument of Stonehenge appears to have undergone no
further constructive changes since about 1400 BC; on the contrary, there
are traces of deliberate destruction, as archaeology often records when
foreign cultures displace the previous ones. Around the same time, the
custom of communal burial came to an end; in its place, individual
tombs for single rulers began to appear (chieftains, priest‑kings, such
as the pharaohs in their pyramids), which bear witness to a distinctly
non‑egalitarian power hierarchy. Apart from that, the megalith culture
in the south of this island reached the Bronze Age in the late third millennium BC, as evidenced by the tin mines of Cornwall and the proven trade of their metal as far as the Aegean.
Centres of Neolithic Revolution have been discovered at various
archaeological locations worldwide; they emerged independently of one
another and at different times, though always within the current geological epoch. The most recent Neolithizations happened in the last 300 years in connection with the discovery and subsequent colonization
of Australia, the Americas and the polar regions of our planet, a process still
ongoing in the depths of the Amazon rainforest. Communities living there
as Stone Age hunter-gatherers (the women often creating the simplest
of gardens in parallel) were and are wiped out or introduced to the achievements of
our modern civilisation within a few decades.
Shift from egalitarian to hierarchical relationships
The need to plan and coordinate the agricultural communities' food production, manpower and resource allocation encouraged the division of labour, gradually leading to the emergence of specialised professions within increasingly complex societies.
Migration, military conquests, diplomacy, and trade in surplus goods
brought agrarian cultures into contact with outsiders, regardless of
whether these were small foraging societies remaining self-sufficient (see the rebellious herd of alleged animals around beast-man Enkidu), already settled cross-group organizations, or nomadic tribes of ‘predatory’ horsemen. Cultures that were in some cases utterly foreign to one another met;
separately developed traditions, languages and narratives about the
world's creation became mixed. Knowledge was exchanged, and thinkers attempted to create uniform cosmogonies or metaphysical systems, contributing to the rise of civilisations, philosophy and technological advancements.
Traditionally entrenched hierarchies between superior and
inferior groups are difficult to assume among the egalitarian social
associations of hunter-gatherers who founded the first political
organisations (proto-states, primordial polis) such as that of the three
male groups around Enlil, Anu, and Enki. Together with their seven divine wombs of Ninḫursaga,
this organization is described in Sumerian myths not only as the originator
of agriculture and of the first human couples in the landscape of “Eden”, but also as the one responsible for the catastrophic deluge that later became known as the Flood. Both narratives—the Atrahasian
original no less than its biblical echo—tell of the gods’ attempt to
destroy humanity, their wayward creation; yet only the former one speaks
of humans who were made in order to pacify a political conflict among
the gods and to serve them as subservient laborers. It is difficult to
verify the historical authenticity of this story; certain is only that
various myths reveal the pattern of creating artificial humans and/or
arranging their mating, always with the aim of subduing rebellious
groups. (See the manufacture of Pandora in response to Prometheus’s breach of contract; Plato’s dismemberment of the spherical humans into weak individuals; Enkidu’s separation from his group through Shamkat’s
seduction). Whether fictional or not, the epic Atra-Hasis tells of the
introduction of slavery in Eden – of humans reduced to ‘working
cattle’, as the ultimate expression of a hierarchical relationship between
mentally powerless creatures and intellectually superior groups of gods.
Leaving aside the question of how the pantheon of these
distinctly anthropomorphic creators came to be singularised and
abstracted into the infallible and almighty superpower of monotheistic religion,
it is evident that the increasingly specialized division into governing
“thinkers” and executing “workers” can, over time, produce an ever greater
power imbalance. This phenomenon of documented shifts in social
relations (including supportive ideologies) is linked to the emergence
and growth of initially simple cross-group organisations into modern
nations. As such, politics can be distinguished from advancements in the
domain of pure technology, including animal and plant breeding,
metallurgy, and so forth.
Physical health
The diet of hunter-gatherers was and remains well-balanced though
heavily dependent on what the environment could provide each season. In contrast, cultures that had already established the cultivation of calorie-rich crops were able to produce food surpluses, enabling population growth that would have been impossible under a hunter-gatherer lifestyle.
However, food abundance did not necessarily correlate with improved health. Reliance on a very limited variety of staple crops can adversely affect health even while making it possible to feed more people. A prime example of this is maize, which was domesticated in the Americas at the dawn of the Neolithic Revolution there. It is rich in starch but a poor source of iron; it also supplies insufficient amounts of essential amino acids such as lysine and tryptophan. Other factors that likely began to affect the health of early farmers as well as their livestock include the exchange of parasites, damaging bacteria, and viruses between both sides of this relationship. Originally evolutionarily adapted to their specific host, these pathogens jumped to the other species, leading to the emergence of previously unknown diseases.
Increasingly densely populated areas, with their accumulation of human
and animal waste, represent another source of infection by contaminated
food and water supplies. Fertilizers and irrigation may have increased crop yields but also would have promoted proliferation of bacteria in the local environment while grain storage attracted additional insects and rodents.
Evolution of temperatures in the post-glacial period after the Last Glacial Maximum (LGM), according to Greenland ice cores. The birth of agriculture corresponds to the period of quickly rising temperature at the end of the cold spell of the Younger Dryas and the beginning of the long and warm Holocene.
Map of the world showing approximate centres of origin of agriculture and its spread in prehistory: the Fertile Crescent (11,000 BP), the Yangtze and Yellow River basins (9,000 BP), the Papua New Guinea Highlands (9,000–6,000 BP), Central Mexico (5,000–4,000 BP), Northern South America (5,000–4,000 BP), sub-Saharan Africa (5,000–4,000 BP, exact location unknown), and eastern North America (4,000–3,000 BP).
Associations of wild cereals and other wild grasses in Israel.
The term 'Neolithic Revolution' was coined by V. Gordon Childe in his book Man Makes Himself (1936). Childe introduced it as the first in a series of agricultural revolutions in Middle Eastern history, calling it a "revolution" to denote its significance and the degree of
change to communities adopting and refining agricultural practices.
The beginning of this process in different regions has been dated from 10,000 to 8,000 BCE in the Fertile Crescent, and perhaps 8000 BCE in the Kuk Early Agricultural Site of Papua New Guinea in Melanesia. Everywhere, this transition is associated with a change from a largely nomadic hunter-gatherer way of life to a more settled, agrarian one, with the domestication
of various plant and animal species – depending on the species locally
available, and influenced by local culture. Archaeological research in
2003 suggests that in some regions, such as the Southeast Asian
peninsula, the transition from hunter-gatherer to agriculturalist was
not linear, but region-specific.
Once agriculture started gaining momentum, around 9000 BP, human activity resulted in the selective breeding of cereal grasses (beginning with emmer, einkorn and barley),
and not simply of those that favoured greater caloric returns through
larger seeds. Plants with traits such as small seeds or bitter taste
were seen as undesirable. Plants that rapidly shed their seeds on
maturity tended not to be gathered at harvest, therefore not stored and
not seeded the following season; successive years of harvesting
spontaneously selected for strains that retained their edible seeds
longer.
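This mechanism of unconscious selection can be made concrete with a small simulation. The sketch below is illustrative only: its parameters (the initial frequency of the retention trait, and the fraction of a shattering plant's seed still harvestable at reaping time) are assumptions rather than measured values, but under them the seed-retaining trait spreads from rarity to near-fixation on roughly the timescale Hillman and Davies's experiments suggested.

```python
# Minimal sketch of unconscious selection for seed retention under annual
# harvest-and-resow cycles. All parameters are illustrative assumptions.

def simulate(p0=0.01, shattering_yield=0.9, generations=100):
    """Track the frequency of the seed-retaining trait across sowing cycles.

    p0: initial frequency of retaining plants in the wild stand.
    shattering_yield: fraction of a shattering plant's seed still on the
        ear at harvest time (retaining plants contribute all of theirs).
    """
    p, history = p0, [p0]
    for _ in range(generations):
        retained = p                            # seed from retaining plants
        shattered = (1 - p) * shattering_yield  # seed from shattering plants
        p = retained / (retained + shattered)   # composition of next sowing
        history.append(p)
    return history

freqs = simulate()
for year in (0, 25, 50, 75, 100):
    print(f"year {year:3d}: retention frequency = {freqs[year]:.3f}")
# Under these assumptions the trait climbs from 1% to over 99% in about
# a century, inside the 20-200-year window suggested by Hillman and Davies.
```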
An "Orange slice" sickle blade element with inverse, discontinuous retouch on each side, not denticulated. Found in large quantities at Qaraoun II and often with Heavy Neolithic tools in the flint workshops of the Beqaa Valley in Lebanon. Suggested by James Mellaart to be older than the Pottery Neolithic of Byblos (around 8,400 cal. BP).
Daniel Zohary identified several plant species as "pioneer crops" or Neolithic founder crops. He highlighted the importance of wheat, barley and rye, and suggested that domestication of flax, peas, chickpeas, bitter vetch and lentils came a little later. Based on analysis of the genes of domesticated plants, he preferred theories of a single, or at most a very small number of, domestication events for each taxon that spread in an arc from the Levantine corridor around the Fertile Crescent and later into Europe.
Gordon Hillman
and Stuart Davies carried out experiments with varieties of wild wheat
to show that the process of domestication would have occurred over a
relatively short period of between 20 and 200 years.
Some of the pioneering attempts failed at first and crops were
abandoned, sometimes to be taken up again and successfully domesticated
thousands of years later: rye, tried and abandoned in Neolithic Anatolia,
made its way to Europe as weed seeds and was successfully domesticated
in Europe, thousands of years after the earliest agriculture. Wild lentils presented a different problem: most of the wild seeds do
not germinate in the first year; the first evidence of lentil
domestication, breaking dormancy in their first year, appears in the
early Neolithic at Jerf el Ahmar (in modern Syria), and lentils quickly spread south to the Netiv HaGdud site in the Jordan Valley. The process of domestication allowed the founder crops to adapt and
eventually become larger, more easily harvested, more dependable in storage and more useful to the human population.
Neolithic grindstone or quern for processing grain
Selectively propagated figs, wild barley and wild oats were cultivated at the early Neolithic site of Gilgal I, where in 2006 archaeologists found caches of seeds of each in quantities too large to be accounted for even by intensive gathering, at strata datable to c.
11,000 years ago. Some of the plants tried and then abandoned during
the Neolithic period in the Ancient Near East, at sites like Gilgal,
were later successfully domesticated in other parts of the world.
Once early farmers perfected their agricultural techniques like irrigation (traced as far back as the 6th millennium BCE in Khuzistan), their crops yielded
surpluses that needed storage. Most hunter-gatherers could not easily
store food for long due to their migratory lifestyle, whereas those with
a sedentary dwelling could store their surplus grain. Eventually granaries
were developed that allowed villages to store their seeds longer. So
with more food, the population expanded and communities developed
specialized workers and more advanced tools.
The process was not as linear as was once thought, but a more
complicated effort, which was undertaken by different human populations
in different regions in many different ways.
Genetic analysis of the spread of barley from 9,000 to 2,000 BP
One of the world's most important crops, barley, was domesticated in the Near East around 11,000 years ago (c. 9,000 BCE). Barley is a highly resilient crop, able to grow in varied and marginal
environments, such as in regions of high altitude and latitude. Archaeobotanical evidence shows that barley had spread throughout Eurasia by 2,000 BCE. To further elucidate the routes by which barley cultivation was spread
through Eurasia, genetic analysis was used to determine genetic
diversity and population structure in extant barley taxa. Genetic analysis shows that cultivated barley spread through Eurasia
via several different routes, which were most likely separated in both
time and space.
When hunter-gathering began to be replaced by sedentary food production
it became more efficient to keep animals close at hand. Therefore, it
became necessary to bring animals permanently to their settlements,
although in many cases there was a distinction between relatively
sedentary farmers and nomadic herders. The animals' size, temperament, diet, mating patterns, and life span
were factors in the desire and success in domesticating animals. Animals
that provided milk, such as cows and goats, offered a source of protein
that was renewable and therefore quite valuable. The animal's ability
as a worker (for example ploughing or towing), as well as a food source,
also had to be taken into account. Besides being a direct source of
food, certain animals could provide leather, wool, hides, and
fertilizer. Some of the earliest domesticated animals included dogs (East Asia, about 15,000 years ago), sheep, goats, cows, and pigs.
The presence of these animals gave the region a large advantage
in cultural and economic development. As the climate in the Middle East
changed and became drier, many of the farmers were forced to leave,
taking their domesticated animals with them. It was this massive
emigration from the Middle East that later helped distribute these
animals to the rest of Afroeurasia.
This emigration was mainly on an east–west axis of similar climates, as
crops usually have a narrow optimal climatic range outside of which
they cannot grow for reasons of light or rain changes. For instance,
wheat does not normally grow in tropical climates, just like tropical
crops such as bananas do not grow in colder climates. Some authors, like
Jared Diamond, have postulated that this east–west axis is the main reason why plant and animal domestication spread so quickly from the Fertile Crescent to the rest of Eurasia and North Africa, while it did not spread along the north–south axis of Africa to reach the Mediterranean climates of South Africa, where temperate crops were successfully imported by ship only in the last 500 years. Similarly, the African zebu
of central Africa and the domesticated bovines of the Fertile Crescent –
separated by the dry Sahara Desert – were not introduced into each
other's region.
Use-wear analysis of five glossed flint blades found at Ohalo II, a 23,000-year-old fisher-hunter-gatherers' camp on the shore of the Sea of Galilee, Northern Israel, provides the earliest evidence for the use of composite cereal harvesting tools. The Ohalo site is at the junction of the Upper Paleolithic and the Early Epipaleolithic, and has been attributed to both periods.
The wear traces indicate that tools were used for harvesting
near-ripe, semi-green wild cereals, shortly before the grains ripened and
dispersed naturally. The studied tools were not used intensively, and they reflect two
harvesting modes: flint knives held by hand and inserts hafted in a
handle. The finds shed new light on cereal harvesting techniques some 8,000 years before the Natufian and 12,000 years before the establishment of sedentary farming communities in the Near East. Furthermore, the new finds accord well with evidence for the earliest
ever cereal cultivation at the site and the use of stone-made grinding
implements.
Agriculture appeared first in West Asia
about 10,000–9,000 years ago. The region was the centre of
domestication for three cereals (einkorn wheat, emmer wheat and barley),
four legumes (lentil, pea, bitter vetch and chickpea), and flax.
Domestication was a slow process that unfolded across multiple regions,
and was preceded by centuries if not millennia of pre-domestication cultivation.
Other sites in the Levantine corridor that show early evidence of agriculture include Wadi Faynan 16 and Netiv Hagdud. Jacques Cauvin noted that the settlers of Tell Aswad did not domesticate on site, but "arrived, perhaps from the neighbouring Anti-Lebanon, already equipped with the seed for planting". In the Eastern Fertile Crescent, evidence of cultivation of wild plants has been found in Choga Gholan in Iran
dated to 12,000 BP, with domesticated emmer wheat appearing in 9,800
BP, suggesting there may have been multiple regions in the Fertile
Crescent where cereal domestication evolved roughly contemporaneously. The Heavy Neolithic Qaraoun culture has been identified at around fifty sites in Lebanon around the source springs of the River Jordan, but never reliably dated.
Spatial distribution of rice, millet and mixed farming sites in Neolithic China (He et al., 2017)
Agriculture in Neolithic China can be separated into two broad regions, Northern China and Southern China.
The agricultural centre in northern China is believed to be the homelands of the early Sino-Tibetan speakers, associated with the Houli, Peiligang, Cishan, and Xinglongwa cultures, clustered around the Yellow River basin. It was the domestication centre for foxtail millet (Setaria italica) and broomcorn millet (Panicum miliaceum), with early evidence of domestication approximately 8,000 years ago, and widespread cultivation 7,500 years ago. (Soybean was also domesticated in northern China 4,500 years ago. Orange and peach also originated in China, being cultivated c. 2500 BCE.)
Possible language family homelands, and likely routes of early rice transfer (c. 3,500 to 500 BCE). The approximate coastlines during the early Holocene are shown in lighter blue. (Bellwood, 2011)
The agricultural centres in southern China are clustered around the Yangtze River basin. Rice was domesticated in this region, together with the development of paddy field cultivation, between 13,500 and 8,200 years ago.
There are two possible centres of domestication for rice. The first is in the lower Yangtze River, believed to be the homelands of pre-Austronesians and associated with the Kuahuqiao, Hemudu, Majiabang, and Songze cultures.
It is characterized by typical pre-Austronesian features, including
stilt houses, jade carving, and boat technologies. Their diet was also
supplemented by acorns, water chestnuts, foxnuts, and domesticated pigs. The second is in the middle Yangtze River, believed to be the homelands of the early Hmong–Mien speakers and associated with the Pengtoushan and Daxi cultures. Both of these regions were heavily populated and had regular trade contacts with each other, as well as with early Austroasiatic speakers to the west, and early Kra-Dai speakers to the south, facilitating the spread of rice cultivation throughout southern China.
The millet and rice-farming cultures also first came into contact
with each other at around 9,000 to 7,000 BP, resulting in a corridor
between the millet and rice cultivation centres where both rice and
millet were cultivated. At around 5,500 to 4,000 BP, there was increasing migration into Taiwan from the early Austronesian Dapenkeng culture,
bringing rice and millet cultivation technology with them. During this
period, there is evidence of large settlements and intensive rice
cultivation in Taiwan and the Penghu Islands, which may have resulted in overexploitation. Bellwood (2011) proposes that this may have been the impetus of the Austronesian expansion which started with the migration of the Austronesian-speakers from Taiwan to the Philippines at around 5,000 BP.
Austronesians carried rice cultivation technology to Island Southeast Asia
along with other domesticated species. The new tropical island
environments also had new food plants that they exploited. They carried
useful plants and animals during each colonization voyage, resulting in the rapid introduction of domesticated and semi-domesticated species throughout Oceania. They also came into contact with the early agricultural centres of Papuan-speaking populations of New Guinea as well as the Dravidian-speaking regions of South India and Sri Lanka
by around 3,500 BP. They acquired further cultivated food plants like
bananas and pepper from them, and in turn introduced Austronesian
technologies like wetland cultivation and outrigger canoes. During the 1st millennium CE, they also colonized Madagascar and the Comoros, bringing Southeast Asian food plants, including rice, to East Africa.
Africa
On the African continent, three areas have been identified as having independently developed agriculture: the Ethiopian highlands, the Sahel and West Africa. By contrast, agriculture in the Nile River Valley is thought to be related to migration of populations and to have developed from the original Neolithic Revolution in the Fertile Crescent.
Many grinding stones are found with the early Egyptian Sebilian and Mechian cultures and evidence has been found of a Neolithic domesticated crop-based economy dating around 7,000 BP. Unlike the Middle East, this evidence appears as a "false dawn" to
agriculture, as the sites were later abandoned, and permanent farming
then was delayed until 6,500 BP with the Tasian culture and Badarian culture and the arrival of crops and animals from the Near East.
Bananas and plantains, which were first domesticated in Southeast Asia, most likely Papua New Guinea, were re-domesticated in Africa possibly as early as 5,000 years ago. Asian yams and taro were also cultivated in Africa.
The most famous crop domesticated in the Ethiopian highlands is coffee. In addition, khat, ensete, noog, teff and finger millet were also domesticated in the Ethiopian highlands. Crops domesticated in the Sahel region include sorghum and pearl millet. The kola nut was first domesticated in West Africa. Other crops domesticated in West Africa include African rice, yams and the oil palm.
Agriculture spread to Central and Southern Africa in the Bantu expansion during the 1st millennium BCE to 1st millennium CE.
Map of the world in 2000 BCE, just after the end of the 3rd millennium BCE, colour coded by cultural stage.
The term "Neolithic" is not customarily used in describing cultures
in the Americas. However, a broad similarity exists between Eastern
Hemisphere cultures of the Neolithic and cultures in the Americas. Maize (corn), beans and squash were among the earliest crops domesticated in Mesoamerica: squash as early as 6000 BCE, beans no later than 4000 BCE, and maize beginning about 7000 BCE. Potatoes and manioc were domesticated in South America. In what is now the eastern United States, Native Americans domesticated sunflower, sumpweed and goosefoot c. 2500 BCE.
In the highlands of central Mexico, sedentary village life based on
farming did not develop until the "formative period" in the second
millennium BCE.
Evidence of drainage ditches at Kuk Swamp on the borders of the Western and Southern Highlands of Papua New Guinea indicates cultivation of taro and a variety of other crops, dating back to 11,000 BP. Two potentially significant economic species, taro (Colocasia esculenta) and yam (Dioscorea sp.), have been identified dating at least to 10,200 calibrated years before present (cal BP). Further evidence of bananas and sugarcane
dates to 6,950 to 6,440 BCE. This was at the altitudinal limits of
these crops, and it has been suggested that cultivation in more
favourable ranges in the lowlands may have been even earlier. CSIRO has found evidence that taro was introduced into the Solomon Islands for human use from 28,000 years ago, making taro the earliest cultivated crop in the world. The spread of agriculture seems to have resulted in the spread of the Trans–New Guinea languages from New Guinea east into the Solomon Islands and west into Timor and adjacent areas of Indonesia. This seems to confirm the theories of Carl Sauer who, in "Agricultural Origins and Dispersals", suggested as early as 1952 that this region was a centre of early agriculture.
Spread of farming from Southwest Asia to Europe, between 9600 and 3800 BCE
Archaeologists trace the emergence of food-producing societies in the Levantine
region of southwest Asia to the close of the last glacial period, around
12,000 BCE; these developed into a number of regionally distinctive
cultures by the eighth millennium BCE. Remains of food-producing
societies in the Aegean have been carbon-dated to c. 6500 BCE at Knossos, Franchthi Cave, and a number of mainland sites in Thessaly. Neolithic groups appear soon afterwards in the Balkans and south-central Europe. The Neolithic cultures of southeastern Europe (the Balkans and the Aegean) show some continuity with groups in southwest Asia and Anatolia (e.g., Çatalhöyük).
Current evidence suggests that Neolithic material culture was
introduced to Europe via western Anatolia. All Neolithic sites in Europe
contain ceramics, as well as the plants and animals domesticated in Southwest Asia: einkorn, emmer, barley, lentils, pigs, goats, sheep, and cattle.
Genetic data suggest that no independent domestication of animals took
place in Neolithic Europe, and that all domesticated animals were
originally domesticated in Southwest Asia. The only domesticate not from Southwest Asia was broomcorn millet, domesticated in East Asia. The earliest evidence of cheese-making dates to 5500 BCE in Kujawy, Poland.
The diffusion across Europe, from the Aegean to Britain, took
about 2,500 years (8500–6000 BP). The Baltic region was penetrated a bit
later, around 5500 BP, and there was also a delay in settling the Pannonian plain.
In general, colonization shows a "saltatory" pattern, as the Neolithic
advanced from one patch of fertile alluvial soil to another, bypassing
mountainous areas. Analysis of radiocarbon
dates shows clearly that Mesolithic and Neolithic populations lived side
by side for as much as a millennium in many parts of Europe, especially
in the Iberian peninsula and along the Atlantic coast.
Carbon 14 evidence
Ancient European Neolithic farmers were genetically closest to modern Near-Eastern/Anatolian populations. The map shows genetic matrilineal distances between European Neolithic Linear Pottery Culture populations (5,500–4,900 calibrated BP) and modern Western Eurasian populations.
The spread of the Neolithic from the Near East to Europe was first studied quantitatively in the 1970s, when a sufficient number of Carbon 14 age determinations for early Neolithic sites had become available. In 1973, Ammerman and Cavalli-Sforza
discovered a linear relationship between the age of an Early Neolithic
site and its distance from the conventional source in the Near East (Jericho), demonstrating that the Neolithic spread at an average speed of about 1 km/yr. More recent studies (2005) confirm these results and yield a speed of 0.6–1.3 km/yr (at a 95% confidence level).
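The fitting procedure itself is straightforward to reproduce. In the sketch below, the (distance, date) pairs are invented for illustration (they are not the published dataset), but the method mirrors the wave-of-advance analysis: regress the earliest site dates on distance from an assumed origin and read the front speed off the slope.

```python
# Sketch of the wave-of-advance speed estimate. The (distance, date) pairs
# are invented for illustration; they are not the published dataset.
import numpy as np

# (distance from Jericho in km, earliest calibrated date in years BP)
sites = np.array([
    (0,    10500),
    (700,   9800),
    (1500,  9000),
    (2300,  8300),
    (3200,  7300),
    (4000,  6500),
])
dist, age = sites[:, 0], sites[:, 1]

# Least-squares fit: age = a + b * distance. Older sites lie nearer the
# origin, so the slope b is negative; the front speed is -1/b km per year.
b, a = np.polyfit(dist, age, 1)
print(f"slope: {b:.3f} yr/km -> front speed ~ {-1.0 / b:.2f} km/yr")
```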
Analysis of mitochondrial DNA
Since the original human expansions out of Africa 200,000 years ago, different prehistoric and historic migration events have taken place in Europe. Considering that the movement of the people implies a consequent
movement of their genes, it is possible to estimate the impact of these
migrations through the genetic analysis of human populations. Agricultural and husbandry practices originated 10,000 years ago in a region of the Near East known as the Fertile Crescent. According to the archaeological record this phenomenon, known as
"Neolithic", rapidly expanded from these territories into Europe.
However, whether this diffusion was accompanied or not by human migrations is greatly debated. Mitochondrial DNA – a type of maternally inherited DNA located in the cell cytoplasm – was recovered from the remains of Pre-Pottery Neolithic B (PPNB) farmers in the Near East
and then compared to available data from other Neolithic populations in
Europe and also to modern populations from South Eastern Europe and the
Near East. The obtained results show that substantial human migrations were
involved in the Neolithic spread and suggest that the first Neolithic
farmers entered Europe following a maritime route through Cyprus and the Aegean Islands.
Map of the spread of Neolithic farming cultures from the Near East to Europe, with dates in years BCE.
Modern distribution of the haplotypes of PPNB farmers
Genetic distance between PPNB farmers and modern populations
Early Neolithic sites in the Near East and South Asia 10,000–3,800 BP
Neolithic dispersal from the Near East to South Asia suggested by the time of establishment of Neolithic sites as a function of distance from Gesher, Israel. The dispersal rate amounts to about 0.6 km per year
The earliest Neolithic site in South Asia is Mehrgarh, dated to between 6500 and 5500 BCE, in the Kachi plain of Balochistan, Pakistan; the site has evidence of farming (wheat and barley) and herding (cattle, sheep and goats).
There is strong evidence for causal connections between the
Near-Eastern Neolithic and that further east, up to the Indus Valley. There are several lines of evidence that support the idea of connection
between the Neolithic in the Near East and in the Indian subcontinent. The prehistoric site of Mehrgarh in Baluchistan (modern Pakistan) is
the earliest Neolithic site in the north-west Indian subcontinent, dated
as early as 8500 BCE.
Neolithic domesticated crops in Mehrgarh include more than 90%
barley and a small amount of wheat. There is good evidence for the local
domestication of barley and the zebu cattle at Mehrgarh, but the wheat
varieties are suggested to be of Near-Eastern origin, as the modern
distribution of wild varieties of wheat is limited to Northern Levant
and Southern Turkey.
A detailed satellite map study of a few archaeological sites in
the Baluchistan and Khyber Pakhtunkhwa regions also suggests
similarities in early phases of farming with sites in Western Asia. Pottery prepared by sequential slab construction, circular fire pits
filled with burnt pebbles, and large granaries are common to both
Mehrgarh and many Mesopotamian sites.
The postures of the skeletal remains in graves at Mehrgarh bear
strong resemblance to those at Ali Kosh in the Zagros Mountains of
southern Iran. Despite their scarcity, the Carbon-14 and archaeological age
determinations for early Neolithic sites in Southern Asia exhibit
remarkable continuity across the vast region from the Near East to the
Indian Subcontinent, consistent with a systematic eastward spread at a
speed of about 0.65 km/yr.
Causes
The most prominent of several theories (not mutually exclusive) as to the
factors that caused populations to develop agriculture include:
The Oasis Theory, originally proposed by Raphael Pumpelly in 1908, popularized by V. Gordon Childe in 1928 and summarised in Childe's book Man Makes Himself. This theory maintains that as the climate got drier due to the Atlantic
depressions shifting northward, communities contracted to oases
where they were forced into close association with animals, which were
then domesticated together with planting of seeds. However, this theory
now has little support amongst archaeologists because subsequent climate
data suggests that the region was getting wetter rather than drier.
The Hilly Flanks hypothesis, proposed by Robert John Braidwood in 1948, suggests that agriculture began in the hilly flanks of the Taurus and Zagros Mountains,
where the climate was not drier as Childe had believed, and fertile
land supported a variety of plants and animals amenable to
domestication.
The Feasting model by Brian Hayden suggests that agriculture was
driven by ostentatious displays of power, such as giving feasts, to
exert dominance. This required assembling large quantities of food,
which drove agricultural technology.
The Demographic theories proposed by Carl Sauer and adapted by Lewis Binford and Kent Flannery posit an increasingly sedentary population that expanded up to the carrying capacity
of the local environment and required more food than could be gathered.
Various social and economic factors helped drive the need for food.
The evolutionary/intentionality theory, developed by David Rindos
and others, considers agriculture as an evolutionary adaptation of
plants and humans. Starting with domestication by protection of wild
plants, it resulted in specialization of location and then complete
domestication.
Leonid Grinin
argues that whatever plants were cultivated, the independent invention
of agriculture always occurred in special natural environments (e.g.,
South-East Asia). It is supposed that the cultivation of cereals started
somewhere in the Near East: in the hills of Israel or Egypt. So Grinin
dates the beginning of the agricultural revolution within the interval
12,000 to 9,000 BP, though in some cases the first cultivated plants or
the bones of domesticated animals are even older, dating to 14,000–15,000
years ago.
Andrew Moore suggested that the Neolithic Revolution originated over long periods of development in the Levant, possibly beginning during the Epipaleolithic. In "A Reassessment of the Neolithic Revolution", Frank Hole further expanded the relationship between plant and animal domestication.
He suggested the events could have occurred independently during
different periods of time, in as yet unexplored locations. He noted that
no transition site had been found documenting the shift from what he
termed immediate and delayed return social systems. He noted that the full range of domesticated animals (goats, sheep, cattle and pigs) were not found until the sixth millennium BCE at Tell Ramad. Hole concluded that "close attention should be paid in future investigations to the western margins of the Euphrates basin, perhaps as far south as the Arabian Peninsula, especially where wadis carrying Pleistocene rainfall runoff flowed."
The "niche construction" model, which has gained traction in recent
decades, emphasises that humans actively modified their environments —
through burning, selective harvesting, and transplantation — creating
conditions that made agriculture increasingly viable and eventually
necessary. Research by Dolores Piperno and Irene Holst on the phytolith
and starch grain evidence from early Neolithic sites has provided
empirical support for gradual, multi-stage transitions to domestication
rather than sudden revolutionary change.
Genetic and genomic research since the 2000s has also transformed
understanding of agricultural origins. Ancient DNA studies have shown
that the spread of agriculture into Europe involved both the migration
of farming populations from Anatolia and the adoption of farming
practices by indigenous hunter-gatherers, rather than a single process.
Consequences
Social change
World population (estimated) did not rise for a few millennia after the Neolithic revolution.
Despite significant technological advances and advancements in
knowledge, arts and trade, the Neolithic Revolution did not lead
immediately to a rapid growth of population. Its benefits appear to have
been offset by various adverse effects, mostly diseases and warfare.
The introduction of agriculture has not necessarily led to
unequivocal progress. The nutritional standards of the growing Neolithic
populations were inferior to those of hunter-gatherers. Several
ethnological and archaeological studies conclude that the transition to
cereal-based diets caused a reduction in life expectancy and stature, an
increase in infant mortality and infectious diseases, the development
of chronic, inflammatory or degenerative diseases (such as obesity, type 2 diabetes and cardiovascular diseases) and multiple nutritional deficiencies, including vitamin deficiencies, iron deficiency anemia and mineral disorders affecting bones (such as osteoporosis and rickets) and teeth. Average height for Europeans went down from 178 centimetres (5 ft
10 in) for men and 168 centimetres (5 ft 6 in) for women to 165 and 155
centimetres (5 ft 5 in and 5 ft 1 in) respectively, and it took until
the twentieth century for average height for Europeans to return to the
pre-Neolithic Revolution levels.
The traditional view is that agricultural food production
supported a denser population, which in turn supported larger sedentary
communities, the accumulation of goods and tools, and specialization in
diverse forms of new labor. Food surpluses made possible the development
of a social elite who were not otherwise engaged in agriculture,
industry or commerce, but dominated their communities by other means and
monopolized decision-making. Nonetheless, larger societies made it more
feasible for people to adopt diverse decision making and governance
models. Jared Diamond (in The World Until Yesterday)
identifies the availability of milk and cereal grains as permitting
mothers to raise both an older (e.g. a 3- or 4-year-old) and a younger
child concurrently. The result is that a population can increase more
rapidly. Diamond, in agreement with feminist scholars such as V. Spike Peterson, points out that agriculture brought about deep social divisions and encouraged gender inequality. This social reshuffle is traced by historical theorists, like Veronica Strang, through developments in theological depictions. Strang supports her theory through a comparison of aquatic deities
before and after the Neolithic Agricultural Revolution, most notably the
Venus of Lespugue and the Greco-Roman deities such as Circe or Charybdis:
the former venerated and respected, the latter dominated and conquered.
The theory, supplemented by the widely accepted assumption from Parsons
that "society is always the object of religious veneration", argues that with the centralization of government and the dawn of the
Anthropocene, roles within society became more restrictive and were
rationalized through the conditioning effect of religion; a process that
is crystallized in the progression from polytheism to monotheism.
Andrew Sherratt has argued that following upon the Neolithic Revolution was a second phase of discovery that he refers to as the secondary products revolution. Animals, it appears, were first domesticated purely as a source of meat. The Secondary Products Revolution occurred when it was recognised that
animals also provided a number of other useful products. These included:
milk (from goats, cattle, yaks, sheep, horses, and camels)
traction (from oxen, onagers, donkeys, horses, camels, and dogs)
guarding and herding assistance (dogs)
Sherratt argued that this phase in agricultural development enabled
humans to make use of the energy possibilities of their animals in new
ways, and permitted permanent intensive subsistence farming and crop
production, and the opening up of heavier soils for farming. It also
made possible nomadic pastoralism in semi-arid areas, along the margins of deserts, and eventually led to the domestication of both the dromedary and Bactrian camel. Overgrazing of these areas, particularly by herds of goats, greatly extended the areal extent of deserts.
Diet and health
Compared to foragers, Neolithic farmers' diets were higher in carbohydrates but lower in fibre, micronutrients, and protein. This led to an increase in the frequency of carious teeth, slower growth in childhood, and increased body fat,
and studies have consistently found that populations around the world
became shorter after the transition to agriculture. This trend may have
been exacerbated by the greater seasonality of farming diets and with it
the increased risk of famine due to crop failure.
Throughout the development of sedentary societies, disease spread
more rapidly than it had during the time in which hunter-gatherer
societies existed. Inadequate sanitary practices and the domestication
of animals may explain the rise in deaths and sickness following the
Neolithic Revolution, as diseases jumped from the animal to the human
population. Some examples of infectious diseases spread from animals to humans are influenza, smallpox, and measles. Ancient microbial genomics has shown that progenitors to human-adapted strains of Salmonella enterica
infected agro-pastoralists throughout Western Eurasia as long as 5,500
years ago, providing molecular evidence for the hypothesis that the
Neolithization process facilitated the emergence of human-adapted
Salmonella enterica.
In concordance with a process of natural selection, the humans who first domesticated the big mammals
quickly built up immunities to the diseases as within each generation
the individuals with better immunities had better chances of survival.
In their approximately 10,000 years of shared proximity with animals,
such as cows, Eurasians and Africans became more resistant to those
diseases compared with the indigenous populations encountered outside Eurasia and Africa. For instance, the populations of most Caribbean islands and several Pacific islands were completely wiped out by diseases. 90% or more of many populations of the Americas were wiped out by European and African diseases before recorded contact with European explorers or colonists. Some cultures like the Inca Empire did have a large domestic mammal, the llama,
but llama milk was not drunk, nor did llamas live in a closed space
with humans, so the risk of contagion was limited. According to
bioarchaeological research, the effects of agriculture on dental health
in Southeast Asian rice-farming societies from 4000 to 1500 BP were not
detrimental to the same extent as in other world regions.