
Tuesday, March 15, 2022

Bioarchaeology

From Wikipedia, the free encyclopedia

The term bioarchaeology has been attributed to British archaeologist Grahame Clark who, in 1972, defined it as the study of animal and human bones from archaeological sites. Redefined in 1977 by Jane Buikstra, bioarchaeology in the United States now refers to the scientific study of human remains from archaeological sites, a discipline known in other countries as osteoarchaeology, osteology or palaeo-osteology. Compared to bioarchaeology, osteoarchaeology is the scientific study that focuses solely on the human skeleton, which is used to tell us about the health, lifestyle, diet, mortality and physique of people in the past. Palaeo-osteology, by contrast, is simply the study of ancient bones.

In contrast, the term bioarchaeology is used in Europe to describe the study of all biological remains from archaeological sites. Although Clark used it to describe just human remains and animal remains (zoology/archaeozoology), increasingly modern archaeologists also include botanical remains (botany/archaeobotany).

Bioarchaeology was largely born from the practices of New Archaeology, which developed in the United States in the 1970s as a reaction to a mainly cultural-historical approach to understanding the past. Proponents of New Archaeology advocated using processual methods to test hypotheses about the interaction between culture and biology, or a biocultural approach. Some archaeologists advocate a more holistic approach to bioarchaeology that incorporates critical theory and is more relevant to modern descent populations.

If possible, human remains from archaeological sites are analyzed to determine sex, age, and health, all of which fall under the term 'bioarchaeology'.

Paleodemography

Paleodemography is the field that attempts to identify demographic characteristics of past populations and to draw interpretations from that information. Bioarchaeologists sometimes use paleodemography to create life tables, a type of cohort analysis, in order to understand the demographic characteristics (such as risk of death or sex ratio) of a given age cohort within a population. Age and sex are crucial variables in the construction of a life table, although this information is often not available to bioarchaeologists, so it is often necessary to estimate the age and sex of individuals from specific morphological characteristics of the skeleton.
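The cohort-analysis logic behind a life table can be illustrated with a minimal sketch. The age classes and death counts below are invented for illustration, not data from any real skeletal series; the table tracks the proportion surviving to each age class (l(x)) and the probability of dying within it (q(x)).

```python
# Minimal cohort life table sketch, assuming age-at-death counts have
# already been estimated from skeletal remains. All numbers are invented.

age_classes = ["0-4", "5-9", "10-14", "15-19", "20+"]  # illustrative cohorts
deaths = [25, 10, 5, 8, 52]                            # deaths per cohort

total = sum(deaths)
survivors = total          # number entering the first age class
life_table = []
for age, d in zip(age_classes, deaths):
    lx = survivors / total     # proportion surviving to start of this class
    qx = d / survivors         # probability of dying within this class
    life_table.append((age, round(lx, 3), round(qx, 3)))
    survivors -= d

for row in life_table:
    print(row)
```

With these invented counts, everyone entering the final open-ended class dies in it, so its q(x) is 1.0 by construction.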

Age estimation

The estimation of age in bioarchaeology and osteology actually refers to an approximation of skeletal or biological age-at-death. The primary assumption in age estimation is that an individual's skeletal age is closely associated with their chronological age. Age estimation can be based on patterns of growth and development or on degenerative changes in the skeleton. Many methods tracking these types of changes have been developed using a variety of skeletal series. In children, for instance, age is typically estimated by assessing dental development, ossification and fusion of specific skeletal elements, or long bone length. The ages at which different teeth erupt from the gums are well documented, so dental eruption can estimate a child's age to within about a year; once the teeth are fully developed, however, they become much less useful for age estimation. In adults, degenerative changes to the pubic symphysis, the auricular surface of the ilium, the sternal end of the 4th rib, and dental attrition are commonly used to estimate skeletal age.

Several problems complicate age estimation from bone. Until the age of about 30 the human skeleton is still growing, with different bones fusing at different stages, and some individuals deviate from the typical sequence of growth, which can distort an analysis. In adulthood, wear and degeneration accumulate, and age estimates become less precise the older the individual. Adults are therefore often assigned to broad categories: 'young' (20–35 years), 'middle' (35–50 years), or 'old' (50+ years).

Sex determination

Differences in male and female skeletal anatomy are used by bioarchaeologists to determine the biological sex of human skeletons. Humans are sexually dimorphic, although overlap in body shape and sexual characteristics is possible. Not all skeletons can be assigned a sex, and some may be wrongly identified as male or female. Sexing skeletons is based on the observation that biological males and biological females differ most in the skull and pelvis; bioarchaeologists focus on these parts of the body when determining sex, although other body parts can also be used. The female pelvis is generally broader than the male pelvis, and the angle between the two inferior pubic rami (the sub-pubic angle) is wider and more U-shaped, while the sub-pubic angle of the male is more V-shaped and less than 90 degrees. Phenice details numerous visual differences between the male and female pelvis.

In general, the male skeleton is more robust than the female skeleton because of the greater muscles mass of the male. Males generally have more pronounced brow ridges, nuchal crests, and mastoid processes. It should be remembered that skeletal size and robustness are influenced by nutrition and activity levels. Pelvic and cranial features are considered to be more reliable indicators of biological sex. Sexing skeletons of young people who have not completed puberty is more difficult and problematic than sexing adults, because the body has not had time to develop fully.

Bioarchaeological sexing of skeletons is not error-proof. In reviewing the sexing of Egyptian skulls from Qau and Badari, Mann found that 20.3% could be assigned to a different sex than the sex indicated in the archaeological literature. A re-evaluation of Mann's work, however, showed that he had misunderstood the tomb numbering system of the old excavation and assigned wrong tomb numbers to the skulls; the sexing of the bone material itself was largely correct. Even so, recording errors and the re-arranging of human remains can contribute to such apparent misidentification.

Direct testing of bioarchaeological methods for sexing skeletons by comparing gendered names on coffin plates from the crypt at Christ Church, Spitalfields, London to the associated remains resulted in a 98 percent success rate.

Sex-based differences are not inherently a form of inequality, but become an inequality when members of one sex are given privileges based on their sex. This stems from society investing differences with cultural and social meaning. Gendered work patterns may make their marks on the bones and be identifiable in the archaeological record. Molleson found evidence of gendered work patterns by noting extremely arthritic big toes, a collapse of the last dorsal vertebrae, and muscular arms and legs among female skeletons at Abu Hureyra. She interpreted this sex-based pattern of skeletal difference as indicative of gendered work patterns. These kinds of skeletal changes could have resulted from women spending long periods of time kneeling while grinding grain with the toes curled forward. Investigation of gender from mortuary remains is of growing interest to archaeologists.

Non-specific stress indicators

Dental non-specific stress indicators

Enamel hypoplasia

Enamel hypoplasia refers to transverse furrows or pits that form in the enamel surface of teeth when the normal process of tooth growth stops, resulting in a deficit of enamel. Enamel hypoplasias generally form due to disease and/or poor nutrition. Linear furrows are commonly referred to as linear enamel hypoplasias (LEHs); LEHs can range in size from microscopic to visible to the naked eye. By examining the spacing of perikymata grooves (horizontal growth lines), the duration of the stressor can be estimated, although Mays argues that the width of the hypoplasia bears only an indirect relationship to the duration of the stressor.

Studies of dental enamel hypoplasia are used to study child health. Unlike bone, teeth are not remodeled, so as long as the enamel remains intact they provide a more reliable indicator of past health events. Dental hypoplasias indicate health status during the period in childhood when the enamel of the tooth crown is being formed. Not all enamel layers are visible on the surface of the tooth, because layers formed early in crown development are buried by later layers, and hypoplasias in this buried enamel do not show on the tooth's surface. Because of this, teeth record stressors only from a few months after the start of crown formation. The proportion of enamel crown formation time represented by this buried enamel varies from up to 50 percent in molars to 15–20 percent in anterior teeth. Surface hypoplasias record stressors occurring between the ages of about one and seven years, or up to 13 years if the third molar is included.

Skeletal non-specific stress indicators

Porotic hyperostosis/cribra orbitalia

It was long assumed that iron deficiency anemia has marked effects on the flat bones of the cranium of infants and young children: as the body attempts to compensate for low iron levels by increasing red blood cell production in the young, sieve-like lesions develop in the cranial vaults (termed porotic hyperostosis) and/or the orbits (termed cribra orbitalia), leaving bone that is spongy and soft.

It is, however, highly unlikely that iron deficiency anemia is a cause of either porotic hyperostosis or cribra orbitalia; these are more likely the result of vascular activity in these areas and may not be pathological at all. Where the lesions are pathological, their development could also be attributed to causes other than an iron-deficient diet, such as nutrients lost to intestinal parasites, although dietary deficiencies remain the most probable cause.

Anemia incidence may be a result of inequalities within society, and/or indicative of different work patterns and activities among different groups within society. A study of iron-deficiency among early Mongolian nomads showed that although overall rates of cribra orbitalia declined from 28.7 percent (27.8 percent of the total female population, 28.4 percent of the total male population, 75 percent of the total juvenile population) during the Bronze and Iron Ages to 15.5 percent during the Hunnu (2209–1907 BP) period, the rate of females with cribra orbitalia remained roughly the same, while the incidence among males and children declined (29.4 percent of the total female population, 5.3 percent of the total male population, and 25 percent of the juvenile population had cribra orbitalia). Bazarsad posits several reasons for this distribution: adults may have lower rates of cribra orbitalia than juveniles because lesions either heal with age or lead to death, while higher rates among females may indicate lesser health status, or greater survival of young females with cribra orbitalia into adulthood.

Harris lines

Harris lines form before adulthood, when bone growth is temporarily halted or slowed down due to some sort of stress (either disease or malnutrition). During this time, bone mineralization continues, but growth does not, or does so at very reduced levels. If and when the stressor is overcome, bone growth will resume, resulting in a line of increased mineral density that will be visible in a radiograph. If there is not recovery from the stressor, no line will be formed.

Hair

The stress hormone cortisol is deposited in hair as it grows. This has been used successfully to detect fluctuating levels of stress in the later lifespan of mummies.

Mechanical stress and activity indicators

Examining the effects that activities and workload has upon the skeleton allows the archaeologist to examine who was doing what kinds of labor, and how activities were structured within society. The division of labor within the household may be divided according to gender and age, or be based on other hierarchical social structures. Human remains can allow archaeologists to uncover patterns in the division of labor.

Living bones are subject to Wolff's law, which states that bones are physically affected and remodeled by physical activity or inactivity. Increases in mechanical stress tend to produce bones that are thicker and stronger. Disruptions in homeostasis caused by nutritional deficiency or disease or profound inactivity/disuse/disability can lead to bone loss. While the acquisition of bipedal locomotion and body mass appear to determine the size and shape of children's bones, activity during the adolescent growth period seems to exert a greater influence on the size and shape of adult bones than exercise later in life.

Muscle attachment sites (also called entheses) have been thought to be impacted in the same way causing what were once called musculoskeletal stress markers, but now widely named entheseal changes. These changes were widely used to study activity-patterns, but research has shown that processes associated with aging have a greater impact than occupational stresses. It has also been shown that geometric changes to bone structure (described above) and entheseal changes differ in their underlying cause with the latter poorly affected by occupation. Joint changes, including osteoarthritis, have also been used to infer occupations but in general these are also manifestations of the aging process.

Markers of occupational stress, which include morphological changes to the skeleton and dentition as well as joint changes at specific locations have also been widely used to infer specific (rather than general) activities. Such markers are often based on single cases described in clinical literature in the late nineteenth century. One such marker has been found to be a reliable indicator of lifestyle: the external auditory exostosis also called surfer's ear, which is a small bony protuberance in the ear canal which occurs in those working in proximity to cold water.

One example of how these changes have been used to study activities is the New York African Burial Ground in New York. It provides evidence of the brutal working conditions under which the enslaved labored; osteoarthritis of the vertebrae was very common, even among the young. The pattern of osteoarthritis combined with the early age of onset provides evidence of labor that resulted in mechanical strain to the neck. One male skeleton shows stress lesions at 37 percent of 33 muscle or ligament attachments, showing he experienced significant musculoskeletal stress. Overall, the interred show signs of significant musculoskeletal stress and heavy workloads, although workload and activities varied among individuals: some show high levels of stress, while others do not. This reflects the variety of labor (e.g., domestic work versus carrying heavy loads) that enslaved individuals were forced to perform.

Injury and workload

Fractures to bones during or after excavation will appear relatively fresh, with broken surfaces appearing white and unweathered. Distinguishing between fractures around the time of death and post-depositional fractures in bone is difficult, as both types of fractures will show signs of weathering. Unless evidence of bone healing or other factors are present, researchers may choose to regard all weathered fractures as post-depositional.

Evidence of perimortal fractures (or fractures inflicted on a fresh corpse) can be distinguished in unhealed metal blade injuries to the bones. Living or freshly dead bones are somewhat resilient, so metal blade injuries to bone will generate a linear cut with relatively clean edges rather than irregular shattering. Archaeologists have tried using the microscopic parallel scratch marks on cut bones in order to estimate the trajectory of the blade that caused the injury.

Diet and dental health

Caries

Dental caries, commonly referred to as cavities or tooth decay, are caused by localized destruction of tooth enamel by acids produced by bacteria feeding upon and fermenting carbohydrates in the mouth. Subsistence based upon agriculture is strongly associated with a higher rate of caries than subsistence based upon foraging, because of the higher levels of carbohydrates in agricultural diets. For example, bioarchaeologists have used caries in skeletons to correlate rice agriculture with the disease. Females may be more vulnerable to caries than males, due to lower saliva flow, the positive correlation of estrogens with increased caries rates, and physiological changes associated with pregnancy, such as suppression of the immune system and a possible concomitant decrease in antimicrobial activity in the oral cavity.

Stable isotope analysis

Stable isotope analysis of carbon and nitrogen in human bone collagen allows bioarchaeologists to carry out dietary reconstruction and to make nutritional inferences. These chemical signatures reflect long-term dietary patterns, rather than a single meal or feast. Stable isotope analysis monitors the ratio of carbon 13 to carbon 12 (13C/12C), which is expressed as parts per mil (per thousand) using delta notation (δ13C). The ratio of carbon isotopes varies according to the types of plants consumed with different photosynthesis pathways. The three photosynthesis pathways are C3 carbon fixation, C4 carbon fixation and Crassulacean acid metabolism. C4 plants are mainly grasses from tropical and subtropical regions, and are adapted to higher levels of radiation than C3 plants. Corn, millet and sugar cane are some well-known C4 domesticates, while all trees and shrubs use the C3 pathway. C3 plants are more common and numerous than C4 plants. Both types of plants occur in tropical areas, but only C3 plants occur naturally in colder areas. 12C and 13C occur in a ratio of approximately 98.9 to 1.1.

The 13C and 12C ratio is either depleted (more negative) or enriched (more positive) relative to the international standard, which is set to an arbitrary zero. The different photosynthesis pathways used by C3 and C4 plants cause them to discriminate differently against 13C. The C4 and C3 plants have distinctly different ranges of δ13C: C4 plants range between -9 and -16 per mil, and C3 plants between -22 and -34 per mil. δ13C studies have been used in North America to document the transition from a C3 to a C4 (native North American plants to corn) diet. The rapid and dramatic increase in 13C after the adoption of maize agriculture attests to the change in the southeastern American diet by 1300 CE.
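Delta notation and the per-mil ranges above lend themselves to a short illustrative sketch. The VPDB standard ratio below is a commonly quoted value, and the classification thresholds simply restate the C3/C4 ranges from the text; real dietary reconstruction also corrects for fractionation between diet and bone collagen, which this sketch ignores.

```python
# Sketch of delta notation and a crude C3/C4 signal classification.
# PDB_RATIO is the commonly quoted 13C/12C ratio of the VPDB standard.

PDB_RATIO = 0.0112372

def delta13c(sample_ratio):
    """Express a 13C/12C ratio in per-mil delta notation."""
    return (sample_ratio / PDB_RATIO - 1) * 1000

def plant_signal(d13c):
    """Crude classification against the per-mil ranges cited in the text."""
    if -16 <= d13c <= -9:
        return "C4"
    if -34 <= d13c <= -22:
        return "C3"
    return "mixed/indeterminate"

print(plant_signal(-12.0))  # falls in the C4 range
print(plant_signal(-26.5))  # falls in the C3 range
```

By construction, a sample with exactly the standard ratio has δ13C of 0, and values between the two plant ranges come back as mixed/indeterminate.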

Isotope ratios in food, especially plant food, are directly and predictably reflected in bone chemistry, allowing researchers to partially reconstruct recent diet using stable isotopes as tracers. Nitrogen isotopes (14N and 15N) have been used to estimate the relative contributions of legumes versus nonlegumes, as well as terrestrial versus marine resources, to the diet.

The increased consumption of legumes, or of animals that eat them, causes 15N in the body to decrease. Nitrogen isotopes in bone collagen are ultimately derived from dietary protein, while carbon can be contributed by protein, carbohydrate, or fat in the diet. Compared to other plants, legumes have lower 15N/14N ratios because they can fix molecular nitrogen, rather than having to rely on nitrates and nitrites in the soil. Legumes have δ15N values close to 0‰, while other plants have δ15N values ranging from 2 to 6‰. Nitrogen isotope ratios can be used to index the importance of animal protein in the diet: 15N increases about 3–4‰ with each trophic step upward, so 15N values increase with meat consumption and decrease with legume consumption. The 15N/14N ratio can therefore be used to gauge the contribution of meat and legumes to the diet.
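The trophic enrichment described above can be turned into a back-of-the-envelope trophic-level estimate. The baseline of 4‰ for non-legume plants and the 3.5‰ per-step enrichment below are illustrative assumptions drawn from the ranges in the text, not values for any particular study area.

```python
# Hedged sketch: estimating trophic position from collagen delta-15N,
# assuming a plant baseline of ~4 per mil and ~3.5 per mil enrichment
# per trophic step (both illustrative assumptions).

def trophic_level(d15n_collagen, baseline=4.0, enrichment=3.5):
    """Trophic level relative to the plant baseline (plants = level 1)."""
    return 1 + (d15n_collagen - baseline) / enrichment

print(round(trophic_level(7.5), 2))   # one step above plants: herbivore-like
print(round(trophic_level(11.0), 2))  # two steps up: substantial animal protein
```

A diet heavy in legumes would pull collagen δ15N below the non-legume baseline, and this simple linear model would correspondingly return a value below the herbivore level.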

Skeletons excavated from the Cobern Street Burial Ground (1750 to 1827 CE) in Cape Town, South Africa, were analyzed using stable isotope data by Cox et al. in order to determine the geographical and life histories of the interred. The people buried in this cemetery were assumed to be slaves and members of the underclass based on the informal nature of the cemetery; biomechanical stress analysis and stable isotope analysis, combined with other archaeological data, seem to support this supposition.

Based on stable isotope levels, eight Cobern Street Burial Ground individuals consumed a diet based on C4 (tropical) plants in childhood, then consumed more C3 plants, which were more common at the Cape, later in their lives. Six of these individuals had dental modifications similar to those carried out by peoples inhabiting tropical areas known to be targeted by slavers who brought enslaved individuals from other parts of Africa to the colony. Based on this evidence, Cox et al. argue that these individuals represent enslaved persons from areas of Africa where C4 plants are consumed who were brought to the Cape as laborers. Cox et al. do not assign these individuals to a specific ethnicity, but do point out that similar dental modifications are carried out by the Makua, Yao, and Marav peoples. Four individuals were buried with no grave goods, in accordance with Muslim tradition, facing Signal Hill, a point of significance for local Muslims. Their isotopic signatures indicate that they grew up in a temperate environment consuming mostly C3 plants, along with some C4 plants; Cox et al. argue that these individuals came from the Indian Ocean area and suggest that they were Muslims. Cox et al. argue that stable isotopic analysis of burials, combined with historical and archaeological data, can be an effective way of investigating the worldwide migrations forced by the African slave trade, as well as the emergence of the underclass and working class in the colonial Old World.

Stable isotope analysis of strontium and oxygen can also be carried out. The amounts of these isotopes vary in different geological locations. Because bone is a dynamic tissue that is remodeled over time, and because different parts of the skeleton are laid down at particular times over the course of a human life, stable isotope analysis can be used to investigate population movements in the past and indicate where people lived at various points of their lives.

Archaeological uses of DNA

aDNA analysis of past populations is used by archaeology to genetically determine the sex of individuals, determine genetic relatedness, understand marriage patterns, and investigate prehistoric population movements.

For example, in 2012 archaeologists found the skeletal remains of an adult male buried under a car park in England. With the use of DNA evidence, they were able to confirm that the remains belonged to Richard III, the former king of England who died in the Battle of Bosworth.

In 2021, Canadian researchers used DNA analysis on skeletal remains found on King William Island, identifying them as belonging to Warrant Officer John Gregory, an engineer serving aboard HMS Erebus in the ill-fated 1845 Franklin Expedition. He was the first expedition member to be identified by DNA analysis.

Bioarchaeological treatments of equality and inequality

Aspects of the relationship between the physical body and socio-cultural conditions and practices can be recognized through the study of human remains. This is most often emphasized in a "biocultural bioarchaeology" model. It has often been the case that bioarchaeology has been regarded as a positivist, science-based discipline, while theories of the living body in the social sciences have been viewed as constructivist in nature. Physical anthropology and bioarchaeology have been criticized for having little to no concern for culture or history. Blakey has argued that scientific or forensic treatments of human remains from archaeological sites construct a view of the past that is neither cultural nor historic, and has suggested that a biocultural version of bioarchaeology will be able to construct a more meaningful and nuanced history that is more relevant to modern populations, especially descent populations. By biocultural, Blakey means a type of bioarchaeology that is not simply descriptive, but combines the standard forensic techniques of describing stature, sex and age with investigations of demography and epidemiology in order to verify or critique socioeconomic conditions experienced by human communities of the past. The incorporation of analysis regarding the grave goods interred with individuals may further the understanding of the daily activities experienced in life.

Currently, some bioarchaeologists are coming to view the discipline as lying at a crucial interface between the science and the humanities; as the human body is non-static, and is constantly being made and re-made by both biological and cultural factors.

Buikstra considers her work to be aligned with Blakey's biocultural version of bioarchaeology because of her emphasis on models stemming from critical theory and political economy. She acknowledges that scholars such as Larsen are productive, but points out that his is a different type of bioarchaeology that focuses on quality of life, lifestyle, behavior, biological relatedness, and population history. It does not closely link skeletal remains to their archaeological context, and is best viewed as a "skeletal biology of the past."

Inequalities exist in all human societies, even so-called “egalitarian” ones. It is important to note that bioarchaeology has helped to dispel the idea that life for foragers of the past was “nasty, brutish and short”; bioarchaeological studies have shown that foragers of the past were often quite healthy, while agricultural societies tend to have increased incidence of malnutrition and disease. However, based on a comparison of foragers from Oakhurst to agriculturalists from K2 and Mapungubwe, Steyn believes that agriculturalists from K2 and Mapungubwe were not subject to the lower nutritional levels expected for this type of subsistence system.

Danforth argues that more “complex” state-level societies display greater health differences between elites and the rest of society, with elites having the advantage, and that this disparity increases as societies become more unequal. Some status differences in society do not necessarily mean radically different nutritional levels; Powell did not find evidence of great nutritional differences between elites and commoners, but did find lower rates of anemia among elites in Moundville.

An area of increasing interest among bioarchaeologists interested in understanding inequality is the study of violence. Researchers analyzing traumatic injuries on human remains have shown that a person's social status and gender can have a significant impact on their exposure to violence. Numerous researchers study violence, exploring a range of violent behaviors among past human societies, including intimate partner violence, child abuse, institutional abuse, torture, warfare, human sacrifice, and structural violence.

Archaeological ethics

There are ethical issues with bioarchaeology that revolve around treatment and respect for the dead. Large-scale skeletal collections were first amassed in the US in the 19th century, largely from the remains of Native Americans. No permission was ever granted from surviving family for study and display. Recently, federal laws such as NAGPRA (Native American Graves Protection and Repatriation Act) have allowed Native Americans to regain control over the skeletal remains of their ancestors and associated artifacts in order to reassert their cultural identities.

NAGPRA passed in 1990. At the time, many archaeologists underestimated the public perception of archaeologists as non-productive members of society and as grave robbers. Concerns about occasional mistreatment of Native American remains are not unfounded: in a 1971 Minnesota excavation, White and Native American remains were treated differently; the remains of White people were reburied, while the remains of Native American people were placed in cardboard boxes and stored in a natural history museum. Blakey relates the growth of African American bioarchaeology to NAGPRA and its effect of cutting physical anthropologists off from their study of Native American remains.

Bioarchaeology in Europe is not as affected by these repatriation issues as American bioarchaeology, but the ethical considerations associated with working with human remains are, and should be, taken into account regardless. Because much of European archaeology has focused on classical roots, artifacts and art have been overemphasized, and Roman and post-Roman skeletal remains were almost completely neglected until the 1980s. Prehistoric archaeology in Europe is a different story, as biological remains began to be analyzed there earlier than in classical archaeology.

Encephalization quotient

From Wikipedia, the free encyclopedia

Encephalization quotient (EQ), encephalization level (EL), or just encephalization is a relative brain size measure that is defined as the ratio between observed to predicted brain mass for an animal of a given size, based on nonlinear regression on a range of reference species. It has been used as a proxy for intelligence and thus as a possible way of comparing the intelligences of different species. For this purpose it is a more refined measurement than the raw brain-to-body mass ratio, as it takes into account allometric effects. Expressed as a formula, the relationship has been developed for mammals and may not yield relevant results when applied outside this group.
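The paragraph above defines EQ as the ratio of observed to predicted brain mass without reproducing a formula. As an illustrative assumption, the sketch below uses Jerison's commonly cited mammalian fit, where the expected brain mass is roughly 0.12 times body mass to the 2/3 power (masses in grams); constants and exponents vary between authors, so the exact values returned are not authoritative.

```python
# Sketch of Jerison-style EQ: observed brain mass divided by the brain mass
# predicted from body mass by the allometric fit E = 0.12 * P**(2/3),
# with masses in grams. Constant and exponent are assumptions that vary
# between authors.

def eq_jerison(brain_g, body_g):
    expected = 0.12 * body_g ** (2 / 3)
    return brain_g / expected

# Round-number illustrative masses, not authoritative measurements:
print(round(eq_jerison(1350, 65000), 2))  # human-like values: well above 1
print(round(eq_jerison(450, 45000), 2))   # chimpanzee-like values
```

An EQ of 1 means the brain is exactly the size predicted for the body mass; values above 1 indicate a brain larger than expected, which is the sense in which EQ refines the raw brain-to-body mass ratio.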

Perspective on intelligence measures

Encephalization quotient was developed in an attempt to provide a way of correlating an animal's physical characteristics with perceived intelligence. It improved on the previous attempt, brain-to-body mass ratio, so it has persisted. Subsequent work, notably Roth, found EQ to be flawed and suggested brain size was a better predictor, but that has problems as well.

Currently the best predictor for intelligence across all animals is forebrain neuron count. This was not seen earlier because neuron counts were previously inaccurate for most animals. For example, human brain neuron count was given as 100 billion for decades before Herculano-Houzel found a more reliable method of counting brain cells.

It could have been anticipated that EQ might be superseded because of both the number of exceptions and the growing complexity of the formulae it used. (See the rest of this article.) The simplicity of counting neurons has replaced it. The concept in EQ of comparing the brain capacity exceeding that required for body sense and motor activity may yet live on to provide an even better prediction of intelligence, but that work has not been done yet.

Variance in brain sizes

Body size accounts for 80–90% of the variance in brain size between species, and the relationship is described by an allometric equation: the regression of the logarithms of brain size on body size. The distance of a species from the regression line is a measure of its encephalization (Finlay, 2009). Because the scales are logarithmic, this distance, or residual, is an encephalization quotient (EQ), the ratio of actual brain size to expected brain size. Encephalization is a characteristic of a species.
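The regression-and-residual procedure just described can be sketched directly. The species values below are invented placeholders, not real measurements; the point is only the mechanics of fitting a line to log brain mass versus log body mass and reading encephalization off the residual.

```python
# Sketch: fit the log-log allometric line across reference "species" and
# read encephalization as the ratio of observed to predicted brain mass.
# All (body g, brain g) pairs are invented for illustration.

import math

data = [(50, 1.2), (400, 6.0), (3000, 25.0), (60000, 200.0), (500000, 700.0)]

xs = [math.log10(body) for body, _ in data]
ys = [math.log10(brain) for _, brain in data]

# Ordinary least-squares slope and intercept, computed by hand.
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def eq_residual(body_g, brain_g):
    """EQ as observed over predicted brain mass from the fitted line."""
    predicted = 10 ** (intercept + slope * math.log10(body_g))
    return brain_g / predicted

print(round(slope, 2))                      # the fitted allometric exponent
print(round(eq_residual(65000, 1350), 2))   # a brain far above the line
```

Because the axes are logarithmic, a residual on the log scale corresponds to a ratio on the original scale, which is why the quotient form of EQ and the distance-from-the-line form are equivalent.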

If the rules relating brain size to the number of brain neurons have varied in evolution, then not all mammalian brains are necessarily built as larger or smaller versions of the same plan, with proportionately larger or smaller numbers of neurons. Similarly sized brains, such as those of a cow and a chimpanzee, might in that scenario contain very different numbers of neurons, just as a very large cetacean brain might contain fewer neurons than a gorilla brain. Size comparison between the human brain and non-primate brains, larger or smaller, might simply be inadequate and uninformative – and our view of the human brain as an outlier, a special oddity, may have been based on the mistaken assumption that all brains are made the same (Herculano-Houzel, 2012).

Limitations and possible improvements over EQ

There is a distinction between brain parts that are necessary for the maintenance of the body and those that are associated with improved cognitive functions. These brain parts, although functionally different, all contribute to the overall weight of the brain. For this reason, Jerison (1973) considered 'extra neurons', neurons that contribute strictly to cognitive capacities, more important indicators of intelligence than pure EQ. Gibson et al. (2001) reasoned that bigger brains generally contain more 'extra neurons' and are thus better predictors of cognitive abilities than pure EQ among primates.

Factors such as the recent evolution of the cerebral cortex and different degrees of brain folding (gyrification), which increases the surface area (and volume) of the cortex, are positively correlated with intelligence in humans.

In a meta-analysis, Deaner et al. (2007) tested ABS, cortex size, cortex-to-brain ratio, EQ, and corrected relative brain size (cRBS) against global cognitive capacities. They found that, after normalization, only ABS and neocortex size showed significant correlation with cognitive abilities. In primates, ABS, neocortex size, and Nc (the number of cortical neurons) correlated fairly well with cognitive abilities. However, there were inconsistencies for Nc. According to the authors, these inconsistencies resulted from the faulty assumption that Nc increases linearly with the size of the cortical surface. This notion is incorrect because it does not take into account the variability in cortical thickness and cortical neuron density, which should influence Nc.

According to Cairo (2011), EQ has flaws in its design when considering individual data points rather than a species as a whole. It is inherently biased given that the cranial volumes of an obese and an underweight individual would be roughly similar while their body masses would be drastically different. Another difference of this nature is a lack of accounting for sexual dimorphism. For example, human females generally have smaller cranial volumes than males, but this does not mean that a female and a male of the same body mass would have different cognitive abilities. Considering all of these flaws, EQ should be a metric for interspecies comparison only, not for intraspecies comparison.

The notion that encephalization quotient corresponds to intelligence has been disputed by Roth and Dicke (2012). They consider the absolute number of cortical neurons and neural connections as better correlates of cognitive ability. According to Roth and Dicke (2012), mammals with relatively high cortex volume and neuron packing density (NPD) are more intelligent than mammals with the same brain size. The human brain stands out from the rest of the mammalian and vertebrate taxa because of its large cortical volume and high NPD, conduction velocity, and cortical parcellation. All aspects of human intelligence are found, at least in its primitive form, in other nonhuman primates, mammals, or vertebrates, with the exception of syntactical language. Roth and Dicke consider syntactical language an "intelligence amplifier".

Brain-body size relationship

Species         Simple brain-to-body ratio (E/S)
small birds     1/12
human           1/40
mouse           1/40
dolphin         1/50
cat             1/100
chimpanzee      1/113
dog             1/125
frog            1/172
lion            1/550
elephant        1/560
horse           1/600
shark           1/2496
hippopotamus    1/2789

Brain size usually increases with body size in animals (the two are positively correlated); that is, large animals usually have larger brains than small animals. The relationship is not linear, however. Generally, small mammals have relatively larger brains than big ones. Mice have a direct brain/body size ratio similar to humans (1/40), while elephants have a comparatively small brain/body ratio (1/560), despite being quite intelligent animals.

Several reasons for this trend are possible, one of which is that neural cells have a relatively constant size. Some brain functions, like the brain pathway responsible for a basic task like drawing breath, are basically similar in a mouse and an elephant. Thus, the same amount of brain matter can govern breathing in a large or a small body. While not all control functions are independent of body size, some are, and hence large animals need comparatively less brain than small animals. This phenomenon can be described by the equation C = E / S^(2/3), where E and S are brain and body weights respectively, and C is called the cephalization factor. To determine the value of this factor, the brain and body weights of various mammals were plotted against each other, and the curve E = C × S^(2/3) was chosen as the best fit to that data.
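
As a minimal sketch, the cephalization factor defined above can be computed directly. The brain and body weights used here are rough illustrative figures, not measured data:

```python
def cephalization_factor(brain_g: float, body_g: float) -> float:
    """Cephalization factor C from C = E / S^(2/3), with weights in grams."""
    return brain_g / body_g ** (2 / 3)

# Rough illustrative figures (assumptions): two species whose simple
# brain/body ratios are of the same order can still have quite
# different cephalization factors once body size is discounted.
mouse_c = cephalization_factor(0.4, 40.0)
human_c = cephalization_factor(1350.0, 65_000.0)
```

Because the body weight enters with an exponent below 1, the factor discounts sheer body size rather than dividing it out entirely, which is why it separates species that the simple ratio in the table above lumps together.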

The cephalization factor and the subsequent encephalization quotient were developed by H. J. Jerison in the late 1960s. The formula for the curve varies, but an empirical fitting of the formula to a sample of mammals gives E ≈ 0.12 S^(2/3). As this formula is based on data from mammals, it should be applied to other animals with caution. For some of the other vertebrate classes a power of 3/4 rather than 2/3 is sometimes used, and for many groups of invertebrates the formula may give no meaningful results at all.

Calculation

Snell's equation of simple allometry is:

E = C × S^r

Here "E" is the weight of the brain, "C" is the cephalization factor, "S" is body weight, and "r" is the exponential constant.

The "encephalization quotient" (EQ) is the coefficient "C" in Snell's allometry equation, usually normalized with respect to a reference species. In the following table, the coefficients have been normalized with respect to the value for the cat, which is therefore attributed an EQ of 1.

Another way to calculate the encephalization quotient is by dividing the actual weight of an animal's brain by its predicted weight according to Jerison's formula.
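
As a sketch of this second approach, assuming Jerison's commonly cited mammalian fit of roughly E = 0.12 × S^(2/3) with weights in grams, and using approximate illustrative brain and body weights (assumptions, not measured data):

```python
def expected_brain_g(body_g: float) -> float:
    """Expected brain weight under Jerison's mammalian fit, E = 0.12 * S^(2/3)."""
    return 0.12 * body_g ** (2 / 3)

def eq(brain_g: float, body_g: float) -> float:
    """Encephalization quotient: actual brain weight over expected brain weight."""
    return brain_g / expected_brain_g(body_g)

# Approximate illustrative weights in grams (assumptions):
human_eq = eq(1350, 65_000)  # comes out near 7
cat_eq = eq(25, 3_300)       # comes out near 1, the cat being the reference
```

These values land in the same neighborhood as the tabulated figures, which is all the sketch is meant to show; published EQs differ with the datasets and constants used.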

Species EQ
Human 7.4–7.8
Dog 1.2
Bottlenose dolphin 5.3
Cat 1.0
Chimpanzee 2.2–2.5
Horse 0.9
Raven 2.49
Sheep 0.8
Rhesus monkey 2.1
Mouse 0.5
African elephant 1.3
Rat 0.4
Rabbit 0.4
Opossum 0.2

This measurement of approximate intelligence is more accurate for mammals than for other classes and phyla of Animalia.

EQ and intelligence in mammals

Intelligence in animals is hard to establish, but the larger the brain is relative to the body, the more brain weight might be available for more complex cognitive tasks. The EQ formula, as opposed to the method of simply measuring raw brain weight or brain weight to body weight, makes for a ranking of animals that coincides better with observed complexity of behaviour. A primary reason for the use of EQ instead of a simple brain to body mass ratio is that smaller animals tend to have a higher proportional brain mass, but do not show the same indications of higher cognition as animals with a high EQ.

Grey floor

The driving theorization behind the development of EQ is that an animal of a certain size requires a minimum number of neurons for basic functioning, sometimes referred to as a grey floor. There is also a limit to how large an animal's brain can grow given its body size, due to limitations like gestation period, energetics, and the need to physically support the encephalized region throughout maturation. When normalizing a standard brain size for a group of animals, a slope can be determined to show what a species' expected brain-to-body mass ratio would be. Species with brain-to-body mass ratios below this standard are nearing the grey floor and do not need extra grey matter. Species which fall above this standard have more grey matter than is necessary for basic functions. Presumably these extra neurons are used for higher cognitive processes.

Taxonomic trends

Mean EQ for mammals is around 1, with carnivorans, cetaceans and primates above 1, and insectivores and herbivores below. Large mammals tend to have the highest EQs of all animals, while small mammals and avians have similar EQs. This reflects two major trends. One is that brain matter is extremely costly in terms of the energy needed to sustain it. Animals with nutrient-rich diets tend to have higher EQs, since such diets can support the energetically costly tissue of brain matter. Brain matter is not only metabolically demanding to grow throughout embryonic and postnatal development, but also costly to maintain.

Arguments have been made that some carnivores may have higher EQs due to their relatively enriched diets, as well as the cognitive capacity required for effectively hunting prey. One example is the brain of a wolf, about 30% larger than that of a similarly sized domestic dog, potentially a result of the different demands of their respective ways of life.

Dietary trends

It is worth noting, however, that of the animals demonstrating the highest EQs (see associated table), many are primarily frugivores, including apes, macaques, and proboscideans. This dietary categorization is significant for inferring the pressures that drive higher EQs. Specifically, frugivores must use a complex trichromatic map of visual space to locate and pick ripe fruits, and their diets are able to provide for the high energetic demands of increased brain mass.

Trophic level—"height" on the food chain—is yet another factor that has been correlated with EQ in mammals. Eutheria with either high AB (absolute brain-mass) or high EQ occupy positions at high trophic levels. Eutheria low on the network of food chains can only develop a high RB (relative brain-mass) so long as they have small body masses. This presents an interesting conundrum for intelligent small animals, who have behaviors radically different from intelligent large animals.

According to Steinhausen et al. (2016):

Animals with high RB [relative brain-mass] usually have (1) a short life span, (2) reach sexual maturity early, and (3) have short and frequent gestations. Moreover, males of species with high RB also have few potential sexual partners. In contrast, animals with high EQs have (1) a high number of potential sexual partners, (2) delayed sexual maturity, and (3) rare gestations with small litter sizes.

Sociality

Another factor previously thought to have great impact on brain size is sociality and flock size. This was a long-standing theory until the correlation between frugivory and EQ was shown to be more statistically significant. While no longer the predominant inference as to selection pressure for high EQ, the social brain hypothesis still has some support. For example, dogs (a social species) have a higher EQ than cats (a mostly solitary species). Animals with very large flock size and/or complex social systems consistently score high EQ, with dolphins and orcas having the highest EQ of all cetaceans, and humans with their extremely large societies and complex social life topping the list by a good margin.

Comparisons with non-mammalian animals

Birds generally have lower EQs than mammals, but parrots and particularly the corvids show remarkably complex behaviour and high learning ability. Their brains are at the high end of the bird spectrum, but low compared to mammals. Bird brain cells are, on the other hand, generally smaller than those of mammals, which may mean more brain cells and hence synapses per volume, allowing for more complex behaviour from a smaller brain. Both bird intelligence and brain anatomy are, however, very different from those of mammals, making direct comparison difficult.

Manta rays have the highest EQ among fish, and either octopuses or jumping spiders have the highest among invertebrates. Despite the jumping spider having a huge brain for its size, it is minuscule in absolute terms, and humans have a much higher EQ despite having a lower raw brain-to-body weight ratio. Mean EQs for reptiles are about one tenth of those of mammals. EQ in birds (and estimated EQ in other dinosaurs) generally also falls below that of mammals, possibly due to lower thermoregulation and/or motor control demands. Estimation of brain size in Archaeopteryx (one of the oldest known ancestors of birds), shows it had an EQ well above the reptilian range, and just below that of living birds.

Biologist Stephen Jay Gould has noted that if one looks at vertebrates with very low encephalization quotients, their brains are slightly less massive than their spinal cords. Theoretically, intelligence might correlate with the absolute amount of brain an animal has after subtracting the weight of the spinal cord from the brain. This formula is useless for invertebrates because they do not have spinal cords or, in some cases, central nervous systems.

EQ in paleoneurology

Behavioral complexity in living animals can to some degree be observed directly, making the predictive power of the encephalization quotient less relevant. It is however central in paleoneurology, where the endocast of the brain cavity and estimated body weight of an animal is all one has to work from. The behavior of extinct mammals and dinosaurs is typically investigated using EQ formulas.

Encephalization quotient is also used in estimating the evolution of intelligent behavior in human ancestors. This technique can help in mapping the development of behavioral complexity during human evolution. However, it is limited to cases where both cranial and post-cranial remains are associated with individual fossils, to allow brain-to-body size comparisons. For example, remains of one Middle Pleistocene human fossil from the Jinniushan site in northern China have allowed scientists to study the relationship between brain and body size using the encephalization quotient. Researchers obtained an EQ of 4.150 for the Jinniushan fossil and compared this value with preceding Middle Pleistocene estimates of EQ at 3.7770. The difference in EQ estimates has been associated with a rapid increase in encephalization in Middle Pleistocene hominins.

Paleo-neurological comparisons between Neanderthals and anatomically modern Homo sapiens (AMHS) via encephalization quotient often rely on endocasts, but this method has several drawbacks. For example, endocasts provide no information about the internal organization of the brain. Furthermore, endocasts are often unclear in the preservation of their boundaries, making it hard to measure where exactly a given structure starts and ends. If endocasts themselves are not reliable, then the value for brain size used to calculate EQ can also be unreliable. Additionally, previous studies have suggested that Neanderthals had the same encephalization quotient as modern humans, although their post-crania suggest that they weighed more than modern humans. Because EQ relies on values from both postcrania and crania, the margin for error increases when relying on this proxy in paleo-neurology, given the inherent difficulty of obtaining accurate brain and body mass measurements from the fossil record.

EQ of livestock animals

The EQ of livestock farm animals such as the domestic pig may be significantly lower than their apparent intelligence would suggest. According to Minervini et al. (2016), the brain of the domestic pig is rather small compared to the mass of the animal. The tremendous increase in body weight imposed by industrial farming significantly influences brain-to-body weight measures, including the EQ. The EQ of the adult domestic pig is just 0.38, yet pigs can use visual information seen in a mirror to find food and show evidence of self-recognition when presented with their reflections, and there is evidence suggesting that pigs are as socially complex as many other highly intelligent animals, possibly sharing a number of cognitive capacities related to social complexity.

History

Encephalization has been a key evolutionary trend throughout human evolution, and consequently an important area of study. Over the course of hominin evolution, brain size has seen an overall increase from 400 cm3 to 1400 cm3. Furthermore, the genus Homo is specifically defined by a significant increase in brain size. The earliest Homo species were larger in brain size than their contemporary Australopithecus counterparts, with which they co-inhabited parts of Eastern and Southern Africa.

Throughout modern history, humans have been fascinated by the large relative size of our brains, trying to connect brain sizes to overall levels of intelligence. Early brain studies were focused in the field of phrenology, which was pioneered by Franz Joseph Gall in 1796 and remained a prevalent discipline throughout the early 19th century. Specifically, phrenologists paid attention to the external morphology of the skull, trying to relate certain lumps to corresponding aspects of personality. They further measured physical brain size in order to equate larger brain sizes to greater levels of intelligence. Today, however, phrenology is considered a pseudoscience.

Among ancient Greek philosophers, Aristotle in particular believed that after the heart, the brain was the second most important organ of the body. He also focused on the size of the human brain, writing in 335 BCE that "of all the animals, man has the brain largest in proportion to his size." In 1861, French neurologist Paul Broca tried to make a connection between brain size and intelligence. Through observational studies, he noticed that people working in what he deemed to be more complex fields had larger brains than people working in less complex fields. Also, in 1871, Charles Darwin wrote in his book The Descent of Man: "No one, I presume, doubts that the large proportion which the size of man's brain bears to his body, compared to the same proportion in the gorilla or orang, is closely connected with his mental powers." The concept of quantifying encephalization is also not a recent phenomenon. In 1889, Sir Francis Galton, through a study on college students, attempted to quantify the relationship between brain size and intelligence.

Due to Hitler's racial policies during World War II, studies on brain size and intelligence temporarily gained a negative reputation. However, with the advent of imaging techniques such as the fMRI and PET scan, several scientific studies were launched to suggest a relationship between encephalization and advanced cognitive abilities. Harry J. Jerison, who invented the formula for encephalization quotient, believed that brain size was proportional to the ability of humans to process information. With this belief, a higher level of encephalization equated to a higher ability to process information. A larger brain could mean a number of different things, including a larger cerebral cortex, a greater number of neuronal associations, or a greater number of neurons overall.

Influence and reception of Friedrich Nietzsche

Nietzsche portrait

Friedrich Nietzsche's influence and reception varied widely and may be roughly divided into various chronological periods. Reactions were anything but uniform, and proponents of various ideologies attempted to appropriate his work quite early.

Overview

Beginning while Nietzsche was still alive, though incapacitated by mental illness, many Germans discovered his appeals for greater heroic individualism and personality development in Thus Spoke Zarathustra, but responded to those appeals in diverging ways. He had some following among left-wing Germans in the 1890s. Nietzsche's anarchistic influence was particularly strong in France and the United States.

By World War I, German soldiers even received copies of Thus Spoke Zarathustra as gifts. The Dreyfus affair provides another example of his reception: the French antisemitic Right labelled the Jewish and leftist intellectuals who defended Alfred Dreyfus as "Nietzscheans". Such seemingly paradoxical acceptance by diametrically opposed camps is typical of the history of the reception of Nietzsche's thought. In the context of the rise of French fascism, one researcher notes, "Although, as much recent work has stressed, Nietzsche had an important impact on 'leftist' French ideology and theory, this should not obscure the fact that his work was also crucial to the right and to the neither right nor left fusions of developing French fascism."

Indeed, as Ernst Nolte proposed, the Maurrassian ideology of "aristocratic revolt against egalitarian-utopian 'transcendence'" (transcendence being Nolte's term for the ontological absence of a theodic center justifying modern "emancipation culture") and the interrelation between Nietzschean ideology and proto-fascism offer extensive space for criticism; the Nietzschean ambiance pervading the French ideological ferment of extremism, which in time birthed formal fascism, is unavoidable.

Many political leaders of the 20th century were at least superficially familiar with Nietzsche's ideas. However, it is not always possible to determine whether or not they actually read his work. Regarding Hitler, for example, there is a debate. Some authors claim that he probably never read Nietzsche, or that if he did, his reading was not extensive. Hitler more than likely became familiar with Nietzsche quotes during his time in Vienna, when quotes by Nietzsche were frequently published in pan-German newspapers. Nevertheless, others point to a quote in Hitler's Table Talk, where the dictator mentioned Nietzsche when he spoke about what he called "great men", as an indication that Hitler may have been familiar with Nietzsche's work. Other authors, like Melendez (2001), point to the parallels between Hitler's and Nietzsche's titanic anti-egalitarianism, and to the idea of the "übermensch", a term frequently used by Hitler and Mussolini to refer to the so-called "Aryan race", or rather, its projected future after fascist engineering. Alfred Rosenberg, an influential Nazi ideologist, also delivered a speech in which he related National Socialism to Nietzsche's ideology. Broadly speaking, despite Nietzsche's hostility towards anti-semitism and nationalism, the Nazis made very selective use of his philosophy, and this association eventually caused Nietzsche's reputation to suffer following World War II.

On the other hand, it is known that Mussolini early on heard lectures about Nietzsche, Vilfredo Pareto, and others while ideologically forming fascism. A girlfriend of Mussolini, Margherita Sarfatti, who was Jewish, relates that Nietzsche was virtually the transforming factor in Mussolini's "conversion" from hard socialism to spiritualistic, ascetic fascism: "In 1908 he presented his conception of the superman's role in modern society in a writing on Nietzsche entitled 'The Philosophy of Force.'"

Nietzsche's influence on Continental philosophy increased dramatically after the Second World War.

Nietzsche and anarchism

During the 19th century, Nietzsche was frequently associated with anarchist movements, in spite of the fact that in his writings he clearly holds a negative view of egalitarian anarchists. Nevertheless, Nietzsche's ideas generated strong interest among key figures of the historical anarchist movement, beginning in the 1890s. According to a recent study, "Gustav Landauer, Emma Goldman and others reflected on the chances offered and the dangers posed by these ideas in relation to their own politics. Heated debates over meaning, for example on the will to power or on the status of women in Nietzsche’s works, provided even the most vehement critics such as Peter Kropotkin with productive cues for developing their own theories. In recent times, a newer strand called post-anarchism has invoked Nietzsche’s ideas, while also disregarding the historical variants of Nietzschean anarchism. This calls into question the innovative potential of post-anarchism."

Some hypothesize, on certain grounds, that Nietzsche's violent stance against anarchism may (at least partially) be the result of a popular association during this period between his ideas and those of Max Stirner. Thus far no plagiarism has been detected, but a concealed influence on Nietzsche in his formative years is considered probable.

Spencer Sunshine writes, "There were many things that drew anarchists to Nietzsche: his hatred of the state; his disgust for the mindless social behavior of "herds"; his anti-Christianity; his distrust of the effect of both the market and the state on cultural production; his desire for an "overman" — that is, for a new human who was to be neither master nor slave; his praise of the ecstatic and creative self, with the artist as his prototype, who could say, "Yes" to the self-creation of a new world on the basis of nothing; and his forwarding of the "transvaluation of values" as source of change, as opposed to a Marxist conception of class struggle and the dialectic of a linear history." Lacking in Nietzsche, however, is the anarchist utopian-egalitarian belief that every soul is capable of epic greatness: his aristocratic elitism is the death-knell of any conventional Nietzschean anarchism.

According to Sunshine: "The list is not limited to culturally oriented anarchists such as Emma Goldman, who gave dozens of lectures about Nietzsche and baptized him as an honorary anarchist. Pro-Nietzschean anarchists also include prominent Spanish CNT-FAI members in the 1930s such as Salvador Seguí and anarcha-feminist Federica Montseny; anarcho-syndicalist militants like Rudolf Rocker; and even the younger Murray Bookchin, who cited Nietzsche's conception of the 'transvaluation of values' in support of the Spanish anarchist project." In European individualist anarchist circles his influence is also clear in thinker/activists such as Émile Armand and Renzo Novatore, among others. More recently, in post-left anarchy, Nietzsche is present in the thought of Hakim Bey and Wolfi Landstreicher.

Nietzsche and fascism

The Italian and German fascist regimes were eager to lay claim to Nietzsche's ideas, and to position themselves as inspired by them. In 1932, Nietzsche's sister, Elisabeth Förster-Nietzsche, received a bouquet of roses from Adolf Hitler during a German premiere of Benito Mussolini's 100 Days, and in 1934 Hitler personally presented her with a wreath for Nietzsche's grave carrying the words "To A Great Fighter". Also in 1934, Elisabeth gave to Hitler Nietzsche's favorite walking stick, and Hitler was photographed gazing into the eyes of a white marble bust of Nietzsche. Heinrich Hoffmann's popular biography Hitler as Nobody Knows Him (which sold nearly a half-million copies by 1938) featured this photo with the caption reading: "The Führer before the bust of the German philosopher whose ideas have fertilized two great popular movements: the national socialist of Germany and the fascist of Italy."

Nietzsche was no less popular among French fascists, perhaps with more doctrinal truthfulness, as Robert S. Wistrich has pointed out:

The "fascist" Nietzsche was above all considered to be a heroic opponent of necrotic Enlightenment "rationality" and a kind of spiritual vitalist, who had glorified war and violence in an age of herd-lemming shopkeepers, inspiring the anti-Marxist revolutions of the interwar period. According to the French fascist Pierre Drieu La Rochelle, it was the Nietzschean emphasis on the autotelic power of the Will that inspired the mystic voluntarism and political activism of his comrades. Such politicized readings were vehemently rejected by another French writer, the socialo-communist anarchist Georges Bataille, who in the 1930s sought to establish (with ambiguous success) the "radical incompatibility" between Nietzsche (as a thinker who abhorred mass politics) and "the fascist reactionaries." He argued that nothing was more alien to Nietzsche than the pan-Germanism, racism, militarism and anti-Semitism of the Nazis, into whose service the German philosopher had been pressed. Bataille here was sharp-witted but combined half-truths without his customary dialectical finesse.

The German philosopher Martin Heidegger, an active member of the Nazi Party, noted that everyone in his day was either 'for' or 'against' Nietzsche while claiming that this thinker heard a "command to reflect on the essence of a planetary domination." Alan D. Schrift cites this passage and writes, "That Heidegger sees Nietzsche heeding a command to reflect and prepare for earthly domination is of less interest to me than his noting that everyone thinks in terms of a position for or against Nietzsche. In particular, the gesture of setting up 'Nietzsche' as a battlefield on which to take one's stand against or to enter into competition with the ideas of one's intellectual predecessors or rivals has happened quite frequently in the twentieth century."

In the ideological battle against Bataille, Thomas Mann, Albert Camus and others claimed that the Nazi movement, despite Nietzsche's virulent hatred of both völkisch-populist socialism and nationalism ("national socialism"), did, in certain of its emphases, share an affinity with Nietzsche's ideas, including his ferocious attacks against democracy, egalitarianism, the communistic-socialistic social model, popular Christianity, parliamentary government, and a number of other things. In The Will to Power Nietzsche praised, sometimes metaphorically, other times both metaphorically and literally, the sublimity of war and warriors, and heralded an international ruling race that would become the "lords of the earth". Here Nietzsche was referring to a pan-Europeanism of a Caesarist type, positively embracing Jews: not a Germanic master race but a neo-imperial elite of culturally refined "redeemers" of a humanity otherwise considered wretched, plebeian, and ugly in its mindless existence.

The Nazis appropriated, or rather also drew inspiration from, Nietzsche's extremely old-fashioned and semi-feudal views on women: Nietzsche despised modern feminism, along with democracy and socialism, as mere egalitarian leveling movements of nihilism. He forthrightly declared, "Man shall be trained for war and woman for the procreation of the warrior, anything else is folly," and was indeed unified with the Nazi world-view at least in terms of the social role of women: "They belong in the kitchen and their chief role in life is to beget children for German warriors." Here is one area where Nietzsche did not contradict the Nazis in his politics of "aristocratic radicalism."

During the interbellum years, certain Nazis had employed a highly selective reading of Nietzsche's work to advance their ideology, notably Alfred Baeumler, who strikingly omitted the fact of Nietzsche's anti-socialism and anti-nationalism (for Nietzsche, both equally contemptible mass herd movements of modernity) in his reading of The Will to Power. The era of Nazi rule (1933–1945) saw Nietzsche's writings widely studied in German (and, after 1938, Austrian) schools and universities. Despite the fact that Nietzsche had expressed his disgust with plebeian-volkist antisemitism and supremacist German nationalism in the most forthright terms possible (e.g. he resolved "to have nothing to do with anyone involved in the perfidious race-fraud"), phrases like "the will to power" became common in Nazi circles. The wide popularity of Nietzsche among Nazis stemmed in part from the endeavors of his sister, Elisabeth Förster-Nietzsche, the editor of Nietzsche's work after his 1889 breakdown, and an eventual Nazi sympathizer. Mazzino Montinari, while editing Nietzsche's posthumous works in the 1960s, found that Förster-Nietzsche, while editing the posthumous fragments making up The Will to Power, had cut extracts, changed their order, quoted him out of context, etc.

Nietzsche's reception among the more intellectually percipient or zealous fascists was not universally warm. For example, one "rabidly Nazi writer, Curt von Westernhagen, who announced in his book Nietzsche, Juden, Antijuden (1936) that the time had come to expose the 'defective personality of Nietzsche whose inordinate tributes for, and espousal of, Jews had caused him to depart from the Germanic principles enunciated by Meister Richard Wagner'."

The real problem with the labelling of Nietzsche as a fascist, or worse, a Nazi, is that it ignores the fact that Nietzsche's aristocratism seeks to revive an older conception of politics, one which he locates in Greek agon which [...] has striking affinities with the philosophy of action expounded in our own time by Hannah Arendt. Once an affinity like this is appreciated, the absurdity of describing Nietzsche's political thought as 'fascist', or Nazi, becomes readily apparent.

Nietzsche and Zionism

Jacob Golomb observed, "Nietzsche's ideas were widely disseminated among and appropriated by the first Hebrew Zionist writers and leaders." According to Steven Aschheim, "Classical Zionism, that essentially secular and modernizing movement, was acutely aware of the crisis of Jewish tradition and its supporting institutions. Nietzsche was enlisted as an authority for articulating the movement's ruptured relationship with the past and a force in its drive to normalization and its activist ideal of self-creating Hebraic New Man."

Francis R. Nicosia notes, "At the height of his fame between 1895 and 1902, some of Nietzsche's ideas seemed to have a particular resonance for some Zionists, including Theodor Herzl." Under his editorship the Neue Freie Presse dedicated seven consecutive issues to Nietzsche obituaries, and Golomb notes that Herzl's cousin Raoul Auernheimer claimed Herzl was familiar with Nietzsche and had "absorbed his style."

However, Gabriel Sheffer suggests that Herzl was too bourgeois and too eager to be accepted into mainstream society to be much of a revolutionary (even an "aristocratic" one), and hence could not have been strongly influenced by Nietzsche, but remarks, "Some East European Jewish intellectuals, such as the writers Yosef Hayyim Brenner and Micha Josef Berdyczewski, followed after Herzl because they thought that Zionism offered the chance for a Nietzschean 'transvaluation of values' within Jewry". Nietzsche also influenced Theodor Lessing.

Martin Buber was fascinated by Nietzsche, whom he praised as a heroic figure, and he strove to introduce "a Nietzschean perspective into Zionist affairs." In 1901, Buber, who had just been appointed the editor of Die Welt, the primary publication of the World Zionist Organization, published a poem in Zarathustrastil (a style reminiscent of Nietzsche's Thus Spoke Zarathustra) calling for the return of Jewish literature, art and scholarship.

Max Nordau, an early Zionist orator and controversial racial anthropologist, insisted that Nietzsche had been insane since birth, and advocated "branding his disciples [...] as hysterical and imbecile."

Nietzsche, analytical psychology and psychoanalysis

Carl Jung, the psychiatrist and psychoanalyst who founded analytical psychology, recognized Nietzsche's profundity early on. "From the time Jung first became gripped by Nietzsche’s ideas as a student in Basel to his days as a leading figure in the psychoanalytic movement, Jung read, and increasingly developed, his own thought in a dialogue with the work of Nietzsche. … Untangling the exact influence of Nietzsche on Jung, however, is a complicated business. Jung never openly addressed the exact influence Nietzsche had on his own concepts, and when he did link his own ideas to Nietzsche’s, he almost never made it clear whether the idea in question was inspired by Nietzsche or whether he merely discovered the parallel at a later stage." In 1934, Jung held a lengthy and insightful seminar on Nietzsche's Zarathustra. In 1936, Jung explained that Germans of the present day had been seized or possessed by the psychic force known in Germanic mythology as Wotan, "the god of storm and frenzy, the unleasher of passions and the lust of battle"—Wotan being synonymous with Nietzsche's Dionysus, Jung said. "Nietzsche provided Jung both with the terminology (the Dionysian) and the case study (Zarathustra as an example of the Dionysian at work in the psyche) to help him put into words his thoughts about the spirit of his own age: an age confronted with an uprush of the Wotanic/Dionysian spirit in the collective unconscious. This, in a nutshell, is how Jung came to see Nietzsche, and explains why he was so fascinated by Nietzsche as a thinker."

Nietzsche also had an important influence on the psychotherapist Alfred Adler, founder of the school of individual psychology. According to Ernest Jones, biographer and personal acquaintance of Sigmund Freud, Freud frequently referred to Nietzsche as having "more penetrating knowledge of himself than any man who ever lived or was likely to live". Yet Jones also reports that Freud emphatically denied that Nietzsche's writings influenced his own psychological discoveries. In the 1890s, Freud, whose education at the University of Vienna in the 1870s had included a strong relationship with Franz Brentano, his teacher in philosophy, from whom he had acquired an enthusiasm for Aristotle and Ludwig Feuerbach, was acutely aware of the possibility of convergence between his own ideas and those of Nietzsche, and doggedly refused to read the philosopher as a result. In his excoriating but also sympathetic critique of psychoanalysis, The Psychoanalytic Movement, Ernest Gellner depicts Nietzsche as setting out the conditions for elaborating a realistic psychology, in contrast with the eccentrically implausible Enlightenment psychology of Hume and Smith, and assesses the success of Freud and the psychoanalytic movement as resting in large part on its success in meeting this "Nietzschean minimum".

Early 20th-century thinkers

Early twentieth-century thinkers who read or were influenced by Nietzsche include: philosophers Martin Heidegger, Ludwig Wittgenstein, Ernst Jünger, Theodor Adorno, Georg Brandes, Martin Buber, Karl Jaspers, Henri Bergson, Jean-Paul Sartre, Albert Camus, Leo Strauss, Michel Foucault, Julius Evola, Emil Cioran, Miguel de Unamuno, Lev Shestov, Ayn Rand, José Ortega y Gasset, Rudolf Steiner and Muhammad Iqbal; sociologists Ferdinand Tönnies and Max Weber; composers Richard Strauss, Alexander Scriabin, Gustav Mahler, and Frederick Delius; historians Oswald Spengler, Fernand Braudel and Paul Veyne; theologians Paul Tillich and Thomas J.J. Altizer; the occultists Aleister Crowley and Erwin Neutzsky-Wulff; novelists Franz Kafka, Joseph Conrad, Thomas Mann, Hermann Hesse, André Malraux, Nikos Kazantzakis, André Gide, Knut Hamsun, August Strindberg, James Joyce, D. H. Lawrence, Vladimir Bartol and Pío Baroja; psychologists Sigmund Freud, Otto Gross, C. G. Jung, Alfred Adler, Abraham Maslow, Carl Rogers, Rollo May and Kazimierz Dąbrowski; poets John Davidson, Rainer Maria Rilke, Wallace Stevens and William Butler Yeats; painters Salvador Dalí, Wassily Kandinsky, Pablo Picasso and Mark Rothko; playwrights George Bernard Shaw, Antonin Artaud, August Strindberg, and Eugene O'Neill; and authors H. P. Lovecraft, Olaf Stapledon, Menno ter Braak, Richard Wright, Robert E. Howard, and Jack London. American writer H. L. Mencken avidly read and translated Nietzsche's works and has gained the sobriquet "the American Nietzsche". In his book on Nietzsche, Mencken portrayed the philosopher as a proponent of anti-egalitarian aristocratic revolution, a depiction in sharp contrast with left-wing interpretations of Nietzsche. Nietzsche was declared an honorary anarchist by Emma Goldman, and he influenced other anarchists such as Guy Aldred, Rudolf Rocker, Max Cafard and John Moore.

The popular conservative writer, philosopher, poet, journalist and theological apologist of Catholicism G. K. Chesterton expressed contempt for Nietzsche's ideas, deeming his philosophy basically a poison or death-wish of Western culture:

I do not even think that a cosmopolitan contempt for patriotism is merely a matter of opinion, any more than I think that a Nietzscheite contempt for compassion is merely a matter of opinion. I think they are both heresies so horrible that their treatment must not be so much mental as moral, when it is not simply medical. Men are not always dead of a disease and men are not always damned by a delusion; but so far as they are touched by it they are destroyed by it.

— May 31, 1919, Illustrated London News

Thomas Mann's essays mention Nietzsche with respect and even adoration, although one of his final essays, "Nietzsche's Philosophy in the Light of Recent History", looks at his favorite philosopher through the lens of Nazism and World War II and ends up placing Nietzsche at a more critical distance. Many of Nietzsche's ideas, particularly on artists and aesthetics, are incorporated and explored throughout Mann's works. Nietzsche introduced the theme of the aesthetic justification of existence in his earliest writings, declaring in "The Birth of Tragedy" that sublime art is the only metaphysical consolation of existence; in the context of fascism and Nazism, the Nietzschean aestheticization of politics, void of morality and ordered by a caste hierarchy in service of the creative caste, has posed many problems and questions for contemporary thinkers. One of the characters in Mann's 1947 novel Doktor Faustus represents Nietzsche fictionally. In 1938 the German existentialist Karl Jaspers wrote the following about the influence of Nietzsche and Søren Kierkegaard:

The contemporary philosophical situation is determined by the fact that two philosophers, Kierkegaard and Nietzsche, who did not count in their times and, for a long time, remained without influence in the history of philosophy, have continually grown in significance. Philosophers after Hegel have increasingly returned to face them, and they stand today unquestioned as the authentically great thinkers of their age. [...] The effect of both is immeasurably great, even greater in general thinking than in technical philosophy.

— Jaspers, Reason and Existenz

Bertrand Russell in his History of Western Philosophy was scathing in his chapter on Nietzsche, asking whether his work might not be called the "mere power-phantasies of an invalid" and referring to Nietzsche as a "megalomaniac":

It is obvious that in his day-dreams he is a warrior, not a professor; all of the men he admires were military. His opinion of women, like every man's, is an objectification of his own emotion towards them, which is obviously one of fear. "Forget not thy whip"-- but nine women out of ten would get the whip away from him, and he knew it, so he kept away from women, and soothed his wounded vanity with unkind remarks. [...] [H]e is so full of fear and hatred that spontaneous love of mankind seems to him impossible. He has never conceived of the man who, with all the fearlessness and stubborn pride of the superman, nevertheless does not inflict pain because he has no wish to do so. Does any one suppose that Lincoln acted as he did from fear of hell? Yet to Nietzsche, Lincoln is abject, Napoleon magnificent. [...] I dislike Nietzsche because he likes the contemplation of pain, because he erects conceit into duty, because the men whom he most admires are conquerors, whose glory is cleverness in causing men to die. But I think the ultimate argument against his philosophy, as against any unpleasant but internally self-consistent ethic, lies not in an appeal to facts, but in an appeal to the emotions. Nietzsche despises universal love; I feel it the motive power to all that I desire as regards the world. His followers have had their innings, but we may hope that it is coming rapidly to an end.

— Russell, History of Western Philosophy

Likewise, the fictional valet Reginald Jeeves, created by author P. G. Wodehouse, is a fan of Baruch Spinoza, recommending his works to his employer, Bertie Wooster, over those of Friedrich Nietzsche:

You would not enjoy Nietzsche, sir. He is fundamentally unsound.

— Wodehouse, Carry On Jeeves

Nietzsche after World War II

The appropriation of Nietzsche's work by the Nazis, combined with the rise of analytic philosophy, ensured that British and American academic philosophers would almost completely ignore him until at least 1950. Even George Santayana, an American philosopher whose life and work betray some similarity to Nietzsche's, dismissed Nietzsche in his 1916 Egotism in German Philosophy as a "prophet of Romanticism". Analytic philosophers, if they mentioned Nietzsche at all, characterized him as a literary figure rather than as a philosopher. Nietzsche's present stature in the English-speaking world owes much to the exegetical writings and improved Nietzsche translations by the German-American philosopher Walter Kaufmann and the British scholar R.J. Hollingdale.

Nietzsche's influence on Continental philosophy increased dramatically after the Second World War, especially among the French intellectual Left and post-structuralists.

According to the philosopher René Girard, Nietzsche's greatest political legacy lies in his 20th-century French interpreters, among them Georges Bataille, Pierre Klossowski, Michel Foucault, Gilles Deleuze (and Félix Guattari), and Jacques Derrida. This philosophical movement (originating with the work of Bataille) has been dubbed French Nietzscheanism. Foucault's later writings, for example, revise Nietzsche's genealogical method to develop anti-foundationalist theories of power that divide and fragment rather than unite polities (as evinced in the liberal tradition of political theory). Deleuze, arguably the foremost of Nietzsche's Leftist interpreters, used the much-maligned "will to power" thesis in tandem with Marxian notions of commodity surplus and Freudian ideas of desire to articulate concepts such as the rhizome and other "outsides" to state power as traditionally conceived.

Gilles Deleuze and Pierre Klossowski wrote monographs drawing new attention to Nietzsche's work, and a 1972 conference at Cérisy-la-Salle ranks as the most important event in France for a generation's reception of Nietzsche. In Germany interest in Nietzsche was revived from the 1980s onwards, particularly by the German philosopher Peter Sloterdijk, who has devoted several essays to Nietzsche. The German historian Ernst Nolte, in his literature analyzing fascism and Nazism, presented Nietzsche as a force of the Counter-Enlightenment and a foe of all modern "emancipation politics", and Nolte's judgment generated impassioned dialogue.

In recent years, Nietzsche has also influenced members of the analytical philosophy tradition, such as Bernard Williams in his last finished book, Truth and Truthfulness: An Essay in Genealogy (2002). Earlier, Arthur Danto's Nietzsche as Philosopher (1965) presented the first full-length study of Nietzsche by an analytical philosopher, followed by Alexander Nehamas's Nietzsche: Life as Literature (1985).
