Saturday, May 9, 2026

Epistemology

From Wikipedia, the free encyclopedia

Epistemology is the branch of philosophy that examines the nature, origin, and limits of knowledge. Also called the theory of knowledge, it explores different types of knowledge, such as propositional knowledge about facts, practical knowledge in the form of skills, and knowledge by acquaintance as a familiarity through experience. Epistemologists study the concepts of belief, truth, and justification to understand the nature of knowledge. To discover how knowledge arises, they investigate sources of justification, such as perception, introspection, memory, reason, and testimony.

The school of skepticism questions the human ability to attain knowledge, while fallibilism says that knowledge is never certain. Empiricists hold that all knowledge comes from sense experience, whereas rationalists believe that some knowledge does not depend on it. Coherentists argue that a belief is justified if it is consistent with other beliefs. Foundationalists, by contrast, maintain that the justification of basic beliefs does not depend on other beliefs. Internalism and externalism debate whether justification is determined solely by mental states or also by external circumstances.

Separate branches of epistemology focus on knowledge in specific fields, like scientific, mathematical, moral, and religious knowledge. Naturalized epistemology relies on empirical methods and discoveries, whereas formal epistemology uses formal tools from logic. Social epistemology investigates the communal aspect of knowledge, and historical epistemology examines its historical conditions. Epistemology is closely related to psychology, which infers the beliefs people hold from their words and actions, while epistemology studies the norms governing the evaluation of beliefs. It also intersects with fields such as decision theory, education, and anthropology.

Early reflections on the nature, sources, and scope of knowledge are found in ancient Greek, Indian, and Chinese philosophy. The relation between reason and faith was a central topic in the medieval period. The modern era was characterized by the contrasting perspectives of empiricism and rationalism. Epistemologists in the 20th century examined the components, structure, and value of knowledge while integrating insights from the natural sciences and linguistics.

Definition

Epistemology is the philosophical study of knowledge and related concepts, such as justification. Also called theory of knowledge, it examines the nature and types of knowledge. It further investigates the sources of knowledge, like perception, inference, and testimony, to understand how knowledge is created. Another set of questions concerns the extent and limits of knowledge, addressing what people can and cannot know. Central concepts in epistemology include belief, truth, evidence, and reason. As one of the main branches of philosophy, epistemology stands alongside fields like ethics, logic, and metaphysics. The term can also refer to specific positions of philosophers within this branch, as in Plato's epistemology and Immanuel Kant's epistemology.

Epistemology explores how people should acquire beliefs. It determines which beliefs or forms of belief acquisition meet the standards or epistemic goals of knowledge and which ones fail, thereby providing an evaluation of beliefs. The fields of psychology and cognitive sociology are also interested in beliefs and related cognitive processes, but examine them from a different perspective. Unlike epistemology, they study the beliefs people actually have and how people acquire them instead of examining the evaluative norms of these processes. In this regard, epistemology is a normative discipline, whereas psychology and cognitive sociology are descriptive disciplines. Epistemology is relevant to many descriptive and normative disciplines, such as the other branches of philosophy and the sciences, by exploring the principles of how they may arrive at knowledge.

The word epistemology comes from the ancient Greek terms ἐπιστήμη (episteme, meaning knowledge or understanding) and λόγος (logos, meaning study of or reason), literally, the study of knowledge. Despite its ancient roots, the word itself was coined only in the 19th century to designate this field as a distinct branch of philosophy.

Central concepts

Epistemologists examine several foundational concepts to understand their essences and rely on them to formulate theories. Various epistemological disagreements have their roots in disputes about the nature and function of these concepts, like the controversies surrounding the definition of knowledge and the role of justification in it.

Knowledge

Knowledge is an awareness, familiarity, understanding, or skill. Its various forms all involve a cognitive success through which a person establishes epistemic contact with reality. Epistemologists typically understand knowledge as an aspect of individuals, generally as a cognitive mental state that helps them understand, interpret, and interact with the world. While this core sense is of particular interest to epistemologists, the term also has other meanings. For example, the epistemology of groups examines knowledge as a characteristic of a group of people who share ideas. The term can also refer to information stored in documents and computers.

Knowledge contrasts with ignorance, often simply defined as the absence of knowledge. Knowledge is usually accompanied by ignorance because people rarely have complete knowledge of a field, forcing them to rely on incomplete or uncertain information when making decisions. Even though many forms of ignorance can be mitigated through education and research, certain limits to human understanding result in inevitable ignorance. Some limitations are inherent in the human cognitive faculties themselves, such as the inability to know facts too complex for the human mind to conceive. Others depend on external circumstances when no access to the relevant information exists.

Epistemologists disagree on how much people know, for example, whether fallible beliefs can amount to knowledge or whether absolute certainty is required. The most stringent position is taken by radical skeptics, who argue that there is no knowledge at all.

Types

Bertrand Russell originated the distinction between propositional knowledge and knowledge by acquaintance.

Epistemologists distinguish between different types of knowledge. Their primary interest is in knowledge of facts, called propositional knowledge. It is theoretical knowledge that can be expressed in declarative sentences using a that-clause, like "Ravi knows that kangaroos hop". For this reason, it is also called knowledge-that. Epistemologists often understand it as a relation between a knower and a known proposition, in the case above between the person Ravi and the proposition "kangaroos hop". It is use-independent since it is not tied to one specific purpose, unlike practical knowledge. It is a mental representation that embodies concepts and ideas to reflect reality. Because of its theoretical nature, it is typically held that only creatures with highly developed minds, such as humans, possess propositional knowledge.

Propositional knowledge contrasts with non-propositional knowledge in the form of knowledge-how and knowledge by acquaintance. Knowledge-how is a practical ability or skill, like knowing how to read or how to prepare lasagna. It is usually tied to a specific goal and not mastered in the abstract without concrete practice. To know something by acquaintance means to have an immediate familiarity with or awareness of it, usually as a result of direct experiential contact. Examples are "familiarity with the city of Perth", "knowing the taste of tsampa", and "knowing Marta Vieira da Silva personally".

The analytic–synthetic distinction has its roots in the philosophy of Immanuel Kant.

Another influential distinction in epistemology is between a posteriori and a priori knowledge. A posteriori knowledge is knowledge of empirical facts based on sensory experience, like "seeing that the sun is shining" and "smelling that a piece of meat has gone bad". This type of knowledge is associated with the empirical sciences and everyday affairs. A priori knowledge, by contrast, pertains to non-empirical facts and does not depend on evidence from sensory experience, like knowing that 2 + 2 = 4. It belongs to fields such as mathematics and logic. The distinction between a posteriori and a priori knowledge is central to the debate between empiricists and rationalists regarding whether all knowledge depends on sensory experience.

A closely related contrast is between analytic and synthetic truths. A sentence is analytically true if its truth depends only on the meanings of the words it uses. For instance, the sentence "all bachelors are unmarried" is analytically true because the word "bachelor" already includes the meaning "unmarried". A sentence is synthetically true if its truth depends on additional facts. For example, the sentence "snow is white" is synthetically true because its truth depends on the color of snow in addition to the meanings of the words snow and white. A priori knowledge is primarily associated with analytic sentences, whereas a posteriori knowledge is primarily associated with synthetic sentences. However, it is controversial whether this is true for all cases. Some philosophers, such as Willard Van Orman Quine, reject the distinction, saying that there are no analytic truths.

Analysis

The analysis of knowledge is the attempt to identify the essential components or conditions of all and only propositional knowledge states. According to the so-called traditional analysis, knowledge has three components: it is a belief that is justified and true. In the second half of the 20th century, this view was challenged by a series of thought experiments aiming to show that some justified true beliefs do not amount to knowledge. In one of them, a person is unaware of all the fake barns in their area. By coincidence, they stop in front of the only real barn and form a justified true belief that it is a real barn. Many epistemologists agree that this is not knowledge because the justification is not directly relevant to the truth. More specifically, this and similar counterexamples involve some form of epistemic luck, that is, a cognitive success that results from fortuitous circumstances rather than competence.

The so-called traditional analysis says that knowledge is justified true belief. Edmund Gettier tried to show that some justified true beliefs do not amount to knowledge.

Following these thought experiments, philosophers proposed various alternative definitions of knowledge by modifying or expanding the traditional analysis. According to one view, the known fact has to cause the belief in the right way. Another theory states that the belief is the product of a reliable belief formation process. Further approaches require that the person would not have the belief if it was false, that the belief is not inferred from a falsehood, that the justification cannot be undermined, or that the belief is infallible. There is no consensus on which of the proposed modifications and reconceptualizations is correct. Some philosophers, such as Timothy Williamson, reject the basic assumption underlying the analysis of knowledge by arguing that propositional knowledge is a unique state that cannot be dissected into simpler components.

Value

The value of knowledge is the worth it holds by expanding understanding and guiding action. Knowledge can have instrumental value by helping a person achieve their goals. For example, knowledge of a disease helps a doctor cure their patient. The usefulness of a known fact depends on the circumstances. Knowledge of some facts may have little to no use, like memorizing random phone numbers from an outdated phone book. Being able to assess the value of knowledge matters in choosing what information to acquire and share. It affects decisions like which subjects to teach at school and how to allocate funds to research projects.

Epistemologists are particularly interested in whether knowledge is more valuable than a mere true opinion. Knowledge and true opinion often have a similar usefulness since both accurately represent reality. For example, if a person wants to go to Larissa, a true opinion about the directions can guide them as effectively as knowledge. Considering this problem, Plato proposed that knowledge is better because it is more stable. Another suggestion focuses on practical reasoning, arguing that people put more trust in knowledge than in mere true opinions when drawing conclusions and deciding what to do. A different response says that knowledge has intrinsic value in addition to instrumental value. This view asserts that knowledge is always valuable, whereas true opinion is only valuable in circumstances where it is useful.

Belief and truth

Beliefs are mental states about what is the case, like believing that snow is white or that God exists. In epistemology, they are often understood as subjective attitudes that affirm or deny a proposition, which can be expressed in a declarative sentence. For instance, to believe that snow is white is to affirm the proposition "snow is white". According to this view, beliefs are representations of what the universe is like. They are stored in memory and retrieved when actively thinking about reality or deciding how to act. A different view understands beliefs as behavioral patterns or dispositions to act rather than as representational items stored in the mind. According to this perspective, to believe that there is mineral water in the fridge is nothing more than a group of dispositions related to mineral water and the fridge. Examples are the dispositions to answer questions about the presence of mineral water affirmatively and to go to the fridge when thirsty. Some theorists deny the existence of beliefs, saying that this concept borrowed from folk psychology oversimplifies much more complex psychological or neurological processes. Beliefs are central to various epistemological debates, which cover their status as a component of propositional knowledge, the question of whether people have control over and responsibility for their beliefs, and the issue of whether beliefs have degrees, called credences.

As propositional attitudes, beliefs are true or false depending on whether they affirm a true or a false proposition. According to the correspondence theory of truth, to be true means to stand in the right relation to the world by accurately describing what it is like. This means that truth is objective: a belief is true if it corresponds to a fact. The coherence theory of truth says that a belief is true if it belongs to a coherent system of beliefs. A result of this view is that truth is relative since it depends on other beliefs. Further theories of truth include pragmatist, semantic, pluralist, and deflationary theories. Truth plays a central role in epistemology as a goal of cognitive processes and an attribute of propositional knowledge.

Justification

In epistemology, justification is a property of beliefs that meet certain norms about what a person should believe. According to a common view, this means that the person has sufficient reasons for holding this belief because they have information that supports it. Another view states that a belief is justified if it is formed by a reliable belief formation process, such as perception. The terms reasonable, warranted, and supported are sometimes used as synonyms of the word justified. Justification distinguishes well-founded beliefs from superstition and lucky guesses. However, it does not guarantee truth. For example, a person with strong but misleading evidence may form a justified belief that is false.

Epistemologists often identify justification as a key component of knowledge. Usually, they are not only interested in whether a person has a sufficient reason to hold a belief, known as propositional justification, but also in whether the person holds the belief because or based on this reason, known as doxastic justification. For example, if a person has sufficient reason to believe that a neighborhood is dangerous but forms this belief based on superstition then they have propositional justification but lack doxastic justification.

Sources

Sources of justification are ways or cognitive capacities through which people acquire justification. Often-discussed sources include perception, introspection, memory, reason, and testimony, but there is no universal agreement on the extent to which they all provide valid justification. Perception relies on sensory organs to gain empirical information. Distinct forms of perception correspond to different physical stimuli, such as visual, auditory, haptic, olfactory, and gustatory perception. Perception is not merely the reception of sense impressions but an active process that selects, organizes, and interprets sensory signals. Introspection is a closely related process focused on internal mental states rather than external physical objects. For example, seeing a bus at a bus station belongs to perception while feeling tired belongs to introspection.

Rationalists understand reason as a source of justification for non-empirical facts, explaining how people can know about mathematical, logical, and conceptual truths. Reason is also responsible for inferential knowledge, in which one or more beliefs serve as premises to support another belief. Memory depends on information provided by other sources, which it retains and recalls, like remembering a phone number perceived earlier. Justification by testimony relies on information one person communicates to another person. This can happen by talking to each other but can also occur in other forms, like a letter, a newspaper, and a blog.

Other concepts

Rationality is closely related to justification and the terms rational belief and justified belief are sometimes used interchangeably. However, rationality has a wider scope that encompasses both a theoretical side, covering beliefs, and a practical side, covering decisions, intentions, and actions. There are different conceptions about what it means for something to be rational. According to one view, a mental state is rational if it is based on or responsive to good reasons. Another view emphasizes the role of coherence, stating that rationality requires that the different mental states of a person are consistent and support each other. A slightly different approach holds that rationality is about achieving certain goals. Two goals of theoretical rationality are accuracy and comprehensiveness, meaning that a person has as few false beliefs and as many true beliefs as possible.

Epistemologists rely on the concept of epistemic norms as criteria to assess the cognitive quality of beliefs, like their justification and rationality. They distinguish between deontic norms, which prescribe what people should believe, and axiological norms, which identify the goals and values of beliefs. Epistemic norms are closely linked to intellectual or epistemic virtues, which are character traits like open-mindedness and conscientiousness. Epistemic virtues help individuals form true beliefs and acquire knowledge. They contrast with epistemic vices and act as foundational concepts of virtue epistemology.

Epistemologists understand evidence for a belief as information that favors or supports it. They conceptualize evidence primarily in terms of mental states, such as sensory impressions or other known propositions. But in a wider sense, it can also include physical objects, like bloodstains examined by forensic analysts or financial records studied by investigative journalists. Evidence is often understood in terms of probability: evidence for a belief makes it more likely that the belief is true. A defeater is evidence against a belief or evidence that undermines another piece of evidence. For instance, witness testimony linking a suspect to a crime is evidence of their guilt, while an alibi is a defeater. Evidentialists analyze justification in terms of evidence by asserting that for a belief to be justified, it needs to rest on adequate evidence.
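The probability-raising view of evidence and the effect of a defeater can be sketched with a simple Bayesian update. The `update` helper, the crime scenario, and all numbers below are hypothetical assumptions chosen only for illustration, not part of the article's analysis.

```python
# Illustration of the probability-raising view of evidence: E counts as
# evidence for a hypothesis H when P(H | E) > P(H), while a defeater
# lowers the probability again. All numbers here are hypothetical.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Bayesian update: return P(H | E) from P(H) and the two likelihoods."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.1                                       # initial credence in guilt
after_testimony = update(prior, 0.8, 0.1)         # witness testimony as evidence
after_alibi = update(after_testimony, 0.05, 0.6)  # alibi acting as a defeater

assert after_testimony > prior        # testimony raises the probability
assert after_alibi < after_testimony  # the defeater lowers it again
```

On this sketch, "evidence for" and "defeater" are both cashed out as directions of probability change, which matches the evidentialist idea that justified beliefs rest on adequate supporting evidence.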

The presence of evidence usually affects doubt and certainty, which are subjective attitudes toward propositions that differ regarding their level of confidence. Doubt involves questioning the validity or truth of a proposition. Certainty, by contrast, is a strong affirmative conviction, indicating an absence of doubt about the proposition's truth. Doubt and certainty are central to ancient Greek skepticism and its goal of establishing that no belief is immune to doubt. They are also crucial in attempts to find a secure foundation of all knowledge, such as René Descartes' foundationalist epistemology.

While propositional knowledge is the main topic in epistemology, some theorists focus on understanding instead. Understanding is a more holistic notion that involves a wider grasp of a subject. To understand something, a person requires awareness of how different things are connected and why they are the way they are. For example, knowledge of isolated facts memorized from a textbook does not amount to understanding. According to one view, understanding is a unique epistemic good that, unlike propositional knowledge, is always intrinsically valuable. Wisdom is similar in this regard and is sometimes considered the highest epistemic good. It encompasses a reflective understanding with practical applications, helping people grasp and evaluate complex situations and lead a good life.

In epistemology, knowledge ascription is the act of attributing knowledge to someone, expressed in sentences like "Sarah knows that it will rain today". According to invariantism, knowledge ascriptions have fixed standards across different contexts. Contextualists, by contrast, argue that knowledge ascriptions are context-dependent. From this perspective, Sarah may know about the weather in the context of an everyday conversation even though she is not sufficiently informed to know it in the context of a rigorous meteorological debate. Contrastivism, another view, argues that knowledge ascriptions are comparative, meaning that to know something involves distinguishing it from relevant alternatives. For example, if a person spots a bird in the garden, they may know that it is a sparrow rather than an eagle, but they may not know that it is a sparrow rather than an indistinguishable sparrow hologram.

Major schools of thought

Skepticism and fallibilism

Philosophical skepticism questions the human ability to attain knowledge by challenging the foundations upon which knowledge claims rest. Some skeptics limit their criticism to specific domains of knowledge. For example, religious skeptics say that it is impossible to know about the existence of deities or the truth of other religious doctrines. Similarly, moral skeptics challenge the existence of moral knowledge and metaphysical skeptics say that humans cannot know ultimate reality. External world skepticism questions knowledge of external facts, whereas skepticism about other minds doubts knowledge of the mental states of others.

Global skepticism is the broadest form of skepticism, asserting that there is no knowledge in any domain. In ancient philosophy, this view was embraced by academic skeptics, whereas Pyrrhonian skeptics recommended the suspension of belief to attain tranquility. Few epistemologists have explicitly defended global skepticism. The influence of this position stems from attempts by other philosophers to show that their theory overcomes the challenge of skepticism. For example, René Descartes used methodological doubt to find facts that cannot be doubted.

One consideration in favor of global skepticism is the dream argument. It starts from the observation that, while people are dreaming, they are usually unaware of this. This inability to distinguish between dream and regular experience is used to argue that there is no certain knowledge since a person can never be sure that they are not dreaming. Some critics assert that global skepticism is self-refuting because denying the existence of knowledge is itself a knowledge claim. Another objection says that the abstract reasoning leading to skepticism is not convincing enough to overrule common sense.

Fallibilism is another response to skepticism. Fallibilists agree with skeptics that absolute certainty is impossible. They reject the assumption that knowledge requires absolute certainty, leading them to the conclusion that fallible knowledge exists. They emphasize the need to keep an open and inquisitive mind, acknowledging that doubt can never be fully excluded, even for well-established knowledge claims like thoroughly tested scientific theories.

Epistemic relativism is related to skepticism but differs in that it does not question the existence of knowledge in general. Instead, epistemic relativists only reject the notion of universal epistemic standards or absolute principles that apply equally to everyone. This means that what a person knows depends on subjective criteria or social conventions used to assess epistemic status.

Empiricism and rationalism

John Locke and David Hume shaped the philosophy of empiricism.

The debate between empiricism and rationalism centers on the origins of human knowledge. Empiricism emphasizes that sense experience is the primary source of all knowledge. Some empiricists illustrate this view by describing the mind as a blank slate that only develops ideas about the external world through the sense data received from the sensory organs. According to them, the mind can attain various additional insights by comparing impressions, combining them, generalizing to form more abstract ideas, and deducing new conclusions from them. Empiricists say that all these mental operations depend on sensory material and do not function on their own.

Even though rationalists usually accept sense experience as one source of knowledge, they argue that certain forms of knowledge are directly accessed through reason without sense experience, like knowledge of mathematical and logical truths. Some forms of rationalism state that the mind possesses inborn ideas, accessible without sensory assistance. Others assert that there is an additional cognitive faculty, sometimes called rational intuition, through which people acquire nonempirical knowledge. Some rationalists limit their discussion to the origin of concepts, saying that the mind relies on inborn categories to understand the world and organize experience.

Foundationalism and coherentism

Diagram of foundationalism, coherentism, and infinitism with arrows symbolizing support between beliefs. According to foundationalism, some basic beliefs are justified without support from other beliefs. According to coherentism, justification requires that beliefs mutually support each other. According to infinitism, justification requires that beliefs form infinite support chains.

Foundationalists and coherentists disagree about the structure of knowledge. Foundationalism distinguishes between basic and non-basic beliefs. A belief is basic if it is justified directly, meaning that its validity does not depend on the support of other beliefs. A belief is non-basic if it is justified by another belief. For example, the belief that it rained last night is a non-basic belief if it is inferred from the observation that the street is wet. According to foundationalism, basic beliefs are the foundation on which all other knowledge is built while non-basic beliefs act as the superstructure resting on this foundation.

Coherentists reject the distinction between basic and non-basic beliefs, saying that the justification of any belief depends on other beliefs. They assert that a belief must align with other beliefs to amount to knowledge. This occurs when beliefs are consistent and support each other. According to coherentism, justification is a holistic aspect determined by the whole system of beliefs, which resembles an interconnected web.

Foundherentism is an intermediary position combining elements of both foundationalism and coherentism. It accepts the distinction between basic and non-basic beliefs while asserting that the justification of non-basic beliefs depends on coherence with other beliefs.

Infinitism presents a less common alternative perspective on the structure of knowledge. It agrees with coherentism that there are no basic beliefs while rejecting the view that beliefs can support each other in a circular manner. Instead, it argues that beliefs form infinite justification chains, in which each link of the chain supports the belief following it and is supported by the belief preceding it.

Internalism and externalism

Alvin Goldman was an influential defender of externalism.

The disagreement between internalism and externalism is about the sources of justification. Internalists say that justification depends only on factors within the individual, such as perceptual experience, memories, and other beliefs. This view emphasizes the importance of the cognitive perspective of the individual in the form of their mental states. It is commonly associated with the idea that the relevant factors are accessible, meaning that the individual can become aware of their reasons for holding a justified belief through introspection and reflection.

Evidentialism is an influential internalist view, asserting that justification depends on the possession of evidence. In this context, evidence for a belief is any information in the individual's mind that supports the belief. For example, the perceptual experience of rain is evidence for the belief that it is raining. Evidentialists suggest various other forms of evidence, including memories, intuitions, and other beliefs. According to evidentialism, a belief is justified if the individual's evidence supports it and they hold the belief on the basis of this evidence.

Externalism, by contrast, asserts that at least some relevant factors of knowledge are external to the individual. For instance, when considering the belief that a cup of coffee stands on the table, externalists are not primarily interested in the subjective perceptual experience that led to this belief. Instead, they focus on objective factors, like the quality of the person's eyesight, their ability to differentiate coffee from other beverages, and the circumstances under which they observed the cup. A key motivation of many forms of externalism is that justification makes it more likely that a belief is true. Based on this view, justification is external to the extent that some factors contributing to this likelihood are not part of the believer's cognitive perspective.

Reliabilism is an externalist theory asserting that a reliable connection between belief and truth is required for justification. Some reliabilists explain this in terms of reliable processes. According to this view, a belief is justified if it is produced by a reliable process, like perception. A belief-formation process is deemed reliable if most of the beliefs it generates are true. An alternative view focuses on beliefs rather than belief-formation processes, saying that a belief is justified if it is a reliable indicator of the fact it presents. This means that the belief tracks the fact: the person believes it because it is true but would not believe it otherwise.

Virtue epistemology, another type of externalism, asserts that a belief is justified if it manifests intellectual virtues. Intellectual virtues are capacities or traits that perform cognitive functions and help people form true beliefs. Suggested examples include faculties, like vision, memory, and introspection, and character traits, like open-mindedness.

Branches and approaches

Some branches of epistemology are characterized by their research methods. Formal epistemology employs formal tools from logic and mathematics to investigate the nature of knowledge. For example, Bayesian epistemology represents beliefs as degrees of certainty and uses probability theory to formally define norms of rationality governing how certain people should be. Experimental epistemologists base their research on empirical evidence about common knowledge practices. Applied epistemology focuses on the practical application of epistemological principles to diverse real-world problems, like the reliability of knowledge claims on the internet, how to assess sexual assault allegations, and how racism may lead to epistemic injustice. Metaepistemologists study the nature, goals, and research methods of epistemology. As a metatheory, it does not directly advocate for specific epistemological theories but examines their fundamental concepts and background assumptions.
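The Bayesian norm of conditionalization mentioned above says that, on learning evidence E, an agent's new degree of belief in a hypothesis H should equal the old conditional probability P(H | E), computed by Bayes' rule. A minimal sketch, with invented numbers for a rain example:

```python
def conditionalize(prior_h: float, likelihood: float, prior_e: float) -> float:
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior_h / prior_e

# Invented credences: prior 0.3 that it will rain (H); dark clouds (E)
# appear with probability 0.9 given rain, and 0.5 overall.
posterior = conditionalize(prior_h=0.3, likelihood=0.9, prior_e=0.5)
# posterior is about 0.54: the evidence raises the credence from 0.3
```

The rationality norm is then that the agent's credence after seeing the clouds should match this posterior, not some other value.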

Particularism and generalism disagree about the right method of conducting epistemological research. Particularists start their inquiry by looking at specific cases. For example, to find a definition of knowledge, they rely on their intuitions about concrete instances of knowledge and particular thought experiments. They use these observations as methodological constraints that any theory of general principles needs to follow. Generalists proceed in the opposite direction. They prioritize general epistemic principles, saying that it is not possible to accurately identify and describe specific cases without a grasp of these principles. Other methods in contemporary epistemology aim to extract philosophical insights from ordinary language or look at the role of knowledge in making assertions and guiding actions.

Phenomenological epistemology emphasizes the importance of first-person experience. It distinguishes between the natural and the phenomenological attitudes. The natural attitude focuses on objects belonging to common sense and natural science. The phenomenological attitude focuses on the experience of objects and aims to provide a presuppositionless description of how objects appear to the observer.

Naturalized epistemology is closely associated with the natural sciences, relying on their methods and theories to examine knowledge. Arguing that epistemological theories should rest on empirical observation, it is critical of a priori reasoning. Evolutionary epistemology is a naturalistic approach that understands cognition as a product of evolution, examining knowledge and the cognitive faculties responsible for it through the lens of natural selection. Social epistemology focuses on the social dimension of knowledge. While traditional epistemology is mainly interested in the knowledge possessed by individuals, social epistemology covers knowledge acquisition, transmission, and evaluation within groups, with specific emphasis on how people rely on each other when seeking knowledge.

Pragmatist epistemology is a form of fallibilism that emphasizes the close relation between knowing and acting. It sees the pursuit of knowledge as an ongoing process guided by common sense and experience while always open to revision. This approach reinterprets some core epistemological notions, for example, by conceptualizing beliefs as habits that shape actions rather than representations that mirror the world. Motivated by pragmatic considerations, epistemic conservatism is a view about belief revision. It prioritizes pre-existing beliefs, asserting that a person should only change their beliefs if they have a good reason to. One argument for epistemic conservatism rests on the recognition that the cognitive resources of humans are limited, making it impractical to constantly reexamine every belief.

Postmodern epistemology critiques the conditions of knowledge in advanced societies. This concerns in particular the metanarrative of a constant progress of scientific knowledge leading to a universal and foundational understanding of reality. Similarly, feminist epistemology adopts a critical perspective, focusing on the effect of gender on knowledge. Among other topics, it explores how preconceptions about gender influence who has access to knowledge, how knowledge is produced, and which types of knowledge are valued in society. Some postmodern and feminist thinkers adopt a constructivist approach, arguing that the way people view the world is not a simple reflection of external reality but a social construction. This view emphasizes the creative role of interpretation while undermining objectivity since social constructions can vary across societies. Another critical approach, found in decolonial scholarship, opposes the global influence of Western knowledge systems. It seeks to undermine Western hegemony and decolonize knowledge.

The decolonial outlook is also present in African epistemology. Grounded in African ontology, it emphasizes the interconnectedness of reality as a continuum between knowing subject and known object. It understands knowledge as a holistic phenomenon that includes sensory, emotional, intuitive, and rational aspects, extending beyond the limits of the physical domain.

Another epistemological tradition is found in ancient Indian philosophy. Its diverse schools of thought examine different sources of knowledge, called pramāṇa. Perception, inference, and testimony are sources discussed by most schools. Other sources only considered by some schools are non-perception, which leads to knowledge of absences, and presumption. Buddhist epistemology focuses on immediate experience, understood as the presentation of unique particulars without secondary cognitive processes, like thought and desire. Nyāya epistemology is a causal theory of knowledge, understanding sources of knowledge as reliable processes that cause episodes of truthful awareness. It sees perception as the primary source of knowledge and emphasizes its importance for successful action. Mīmāṃsā epistemology considers the holy scriptures known as the Vedas as a key source of knowledge, addressing the problem of their right interpretation. Jain epistemology states that reality is many-sided, meaning that no single viewpoint can capture the entirety of truth.

Historical epistemology examines how the understanding of knowledge and related concepts has changed over time. It asks whether the main issues in epistemology are perennial and to what extent past epistemological theories are relevant to contemporary debates. It is particularly concerned with scientific knowledge and practices associated with it. It contrasts with the history of epistemology, which presents, reconstructs, and evaluates epistemological theories of philosophers in the past.

Knowledge in particular domains

Some branches of epistemology focus on knowledge within specific academic disciplines. The epistemology of science examines how scientific knowledge is generated and what problems arise in the process of validating, justifying, and interpreting scientific claims. A key issue concerns the problem of how individual observations can support universal scientific laws. Other topics include the nature of scientific evidence and the aims of science. The epistemology of mathematics studies the origin of mathematical knowledge. In exploring how mathematical theories are justified, it investigates the role of proofs and whether there are empirical sources of mathematical knowledge.

Distinct areas of epistemology are dedicated to specific sources of knowledge. Examples are the epistemology of perception, the epistemology of memory, and the epistemology of testimony. In the epistemology of perception, direct and indirect realists debate the connection between the perceiver and the perceived object. Direct realists say that this connection is direct, meaning that there is no difference between the object present in perceptual experience and the physical object causing this experience. According to indirect realism, the connection is indirect, involving mental entities, like ideas or sense data, that mediate between the perceiver and the external world. The contrast between direct and indirect realism is important for explaining the nature of illusions.

Epistemological issues are found in most areas of philosophy. The epistemology of logic examines how people know that an argument is valid. For example, it explores how logicians justify that modus ponens is a correct rule of inference or that all contradictions are false. Epistemologists of metaphysics investigate whether knowledge of the basic structure of reality is possible and what sources this knowledge could have. Knowledge of moral statements, like the claim that lying is wrong, belongs to the epistemology of ethics. It studies the role of ethical intuitions, coherence among moral beliefs, and the problem of moral disagreement. The ethics of belief is a closely related field exploring the intersection of epistemology and ethics. It examines the norms governing belief formation and asks whether violating them is morally wrong. Religious epistemology studies the role of knowledge and justification for religious doctrines and practices. It evaluates the reliability of evidence from religious experience and holy scriptures while also asking whether the norms of reason should be applied to religious faith.

Epistemologists of language explore the nature of linguistic knowledge. One of their topics is the role of tacit knowledge, for example, when native speakers have mastered the rules of grammar but are unable to explicitly articulate them. Epistemologists of modality examine knowledge about what is possible and necessary. Epistemic problems that arise when two people have diverging opinions on a topic are covered by the epistemology of disagreement. Epistemologists of ignorance are interested in epistemic faults and gaps in knowledge.

Epistemology and psychology were not defined as distinct fields until the 19th century; earlier investigations about knowledge often do not fit neatly into today's academic categories. Both contemporary disciplines study beliefs and the mental processes responsible for their formation and change. One key contrast is that psychology describes what beliefs people have and how they acquire them, thereby explaining why someone has a specific belief. The focus of epistemology is on evaluating beliefs, leading to a judgment about whether a belief is justified and rational in a particular case. Epistemology also shares a close connection with cognitive science, which understands mental events as processes that transform information. Artificial intelligence relies on the insights of epistemology and cognitive science to implement concrete solutions to problems associated with knowledge representation and automatic reasoning.

Logic is the study of correct reasoning. For epistemology, it is relevant to inferential knowledge, which arises when a person reasons from one known fact to another. This is the case, for example, when inferring that it rained based on the observation that the streets are wet. Whether an inferential belief amounts to knowledge depends on the form of reasoning used, in particular, that the process does not violate the laws of logic. Another overlap between the two fields is found in the epistemic approach to fallacies. Fallacies are faulty arguments based on incorrect reasoning. The epistemic approach to fallacies explains why they are faulty, stating that arguments aim to expand knowledge. According to this view, an argument is a fallacy if it fails to do so. A further intersection is found in epistemic logic, which uses formal logical devices to study epistemological concepts like knowledge and belief.
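The epistemic logic mentioned above is commonly built on possible-worlds (Kripke) semantics: an agent knows a proposition at a world just in case the proposition holds at every world the agent cannot distinguish from it. A minimal sketch over an invented three-world model:

```python
# Invented model: three worlds, one agent's accessibility relation,
# and a valuation saying at which worlds each proposition is true.
access = {"w1": {"w1", "w2"}, "w2": {"w2"}, "w3": {"w3"}}
val = {"p": {"w1", "w2"}, "q": {"w1"}}

def holds(prop: str, world: str) -> bool:
    """Is the proposition true at this world?"""
    return world in val[prop]

def knows(prop: str, world: str) -> bool:
    """K(prop) holds at a world iff prop is true at every accessible world."""
    return all(holds(prop, w) for w in access[world])

print(knows("p", "w1"))  # True: p holds at both w1 and w2
print(knows("q", "w1"))  # False: the agent cannot rule out w2, where q fails
```

The example illustrates the intended reading: at w1 the agent knows p but merely truly believes q, since an epistemically possible world (w2) makes q false.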

Both decision theory and epistemology are interested in the foundations of rational thought and the role of beliefs. Unlike many approaches in epistemology, the main focus of decision theory lies less in the theoretical and more in the practical side, exploring how beliefs are translated into action. Decision theorists examine the reasoning involved in decision-making and the standards of good decisions, identifying beliefs as a central aspect of decision-making. One of their innovations is to distinguish between weaker and stronger beliefs, which helps them consider the effects of uncertainty on decisions.
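The credence-weighted reasoning decision theorists study is standardly formalized as expected utility: each possible outcome's utility is weighted by the agent's degree of belief in it, and the action with the highest weighted sum is chosen. A small sketch with invented credences and utilities:

```python
def expected_utility(credences, utilities):
    """Sum of each outcome's utility weighted by the agent's credence in it."""
    return sum(credences[o] * utilities[o] for o in credences)

# Invented example: should the agent take an umbrella, given credence 0.3 in rain?
credences = {"rain": 0.3, "dry": 0.7}
take_umbrella = {"rain": 5, "dry": 8}    # protected if it rains, mild hassle if dry
leave_umbrella = {"rain": 0, "dry": 10}  # soaked if it rains, unencumbered if dry

# 0.3*5 + 0.7*8 = 7.1 versus 0.3*0 + 0.7*10 = 7.0, so taking it wins narrowly.
best = max(("take", "leave"), key=lambda a: expected_utility(
    credences, take_umbrella if a == "take" else leave_umbrella))
```

This is how "weaker and stronger beliefs" enter decisions: lowering the credence in rain to 0.1 would flip the recommendation, even though the utilities are unchanged.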

Epistemology and education have a shared interest in knowledge, with one difference being that education focuses on the transmission of knowledge, exploring the roles of both learner and teacher. Learning theory examines how people acquire knowledge. Behavioral learning theories explain the process in terms of behavior changes, for example, by associating a certain response with a particular stimulus. Cognitive learning theories study how the cognitive processes that affect knowledge acquisition transform information. Pedagogy looks at the transmission of knowledge from the teacher's perspective, exploring the teaching methods they may employ. In teacher-centered methods, the teacher serves as the main authority delivering knowledge and guiding the learning process. In student-centered methods, the teacher primarily supports and facilitates the learning process, allowing students to take a more active role. The beliefs students have about knowledge, called personal epistemology, influence their intellectual development and learning success.

The anthropology of knowledge examines how knowledge is acquired, stored, retrieved, and communicated. It studies the social and cultural circumstances that affect how knowledge is reproduced and changes, covering the role of institutions like university departments and scientific journals as well as face-to-face discussions and online communications. This field has a broad concept of knowledge, encompassing various forms of understanding and culture, including practical skills. Unlike epistemology, it is not interested in whether a belief is true or justified but in how understanding is reproduced in society. A closely related field, the sociology of knowledge has a similar conception of knowledge. It explores how physical, demographic, economic, and sociocultural factors impact knowledge. This field examines in what sociohistorical contexts knowledge emerges and the effects it has on people, for example, how socioeconomic conditions are related to the dominant ideology in a society.

History

Early reflections on the nature and sources of knowledge are found in ancient history. In ancient Greek philosophy, Plato (427–347 BCE) studied what knowledge is, examining how it differs from true opinion by being based on good reasons. He proposed that learning is a form of recollection in which the soul remembers what it already knew but had forgotten. Plato's student Aristotle (384–322 BCE) was particularly interested in scientific knowledge, exploring the role of sensory experience and the process of making inferences from general principles. Aristotle's ideas influenced the Hellenistic schools of philosophy, which began to arise in the 4th century BCE and included Epicureanism, Stoicism, and skepticism. The Epicureans had an empiricist outlook, stating that sensations are always accurate and act as the supreme standard of judgments. The Stoics defended a similar position but confined their trust to lucid and specific sensations, which they regarded as true. The skeptics questioned that knowledge is possible, recommending instead suspension of judgment to attain a state of tranquility. Emerging in the 3rd century CE and inspired by Plato's philosophy, Neoplatonism distinguished knowledge from true belief, arguing that knowledge is infallible and limited to the realm of immaterial forms.

The Buddhist philosopher Dharmakirti developed a causal theory of knowledge.

The Upanishads, philosophical scriptures composed in ancient India between 700 and 300 BCE, examined how people acquire knowledge, including the role of introspection, comparison, and deduction. In the 6th century BCE, the school of Ajñana developed a radical skepticism questioning the possibility and usefulness of knowledge. By contrast, the school of Nyaya, which emerged around 200 CE, asserted that knowledge is possible. It provided a systematic treatment of how people acquire knowledge, distinguishing between valid and invalid sources. When Buddhist philosophers became interested in epistemology, they relied on concepts developed in Nyaya and other traditions. Buddhist philosopher Dharmakirti (6th or 7th century CE) analyzed the process of knowing as a series of causally related events.

Ancient Chinese philosophers understood knowledge as an interconnected phenomenon fundamentally linked to ethical behavior and social involvement. Many saw wisdom as the goal of attaining knowledge. Mozi (470–391 BCE) proposed a pragmatic approach to knowledge using historical records, sensory evidence, and practical outcomes to validate beliefs. Mencius (c. 372–289 BCE) explored analogical reasoning as a source of knowledge and employed this method to criticize Mozi. Xunzi (c. 310–220 BCE) aimed to combine empirical observation and rational inquiry. He emphasized the importance of clarity and standards of reasoning without excluding the role of feeling and emotion.

The relation between reason and faith was a central topic in the medieval period. In Arabic–Persian philosophy, al-Farabi (c. 870–950) and Averroes (1126–1198) discussed how philosophy and theology interact, debating which one is a better vehicle to truth. Al-Ghazali (c. 1056–1111) criticized many core teachings of previous Islamic philosophers, saying that they relied on unproven assumptions that did not amount to knowledge. Similarly in Western philosophy, Anselm of Canterbury (1033–1109) proposed that theological teaching and philosophical inquiry are in harmony and complement each other. Formulating a more critical approach, Peter Abelard (1079–1142) argued against unquestioned theological authorities and said that all things are open to rational doubt. Influenced by Aristotle, Thomas Aquinas (1225–1274) developed an empiricist theory, stating that "nothing is in the intellect unless it first appeared in the senses". According to an early form of direct realism proposed by William of Ockham (c. 1285–1349), perception of mind-independent objects happens directly without intermediaries. Meanwhile, in 14th-century India, Gaṅgeśa developed a reliabilist theory of knowledge and considered the problems of testimony and fallacies. In China, Wang Yangming (1472–1529) explored the unity of knowledge and action, holding that moral knowledge is inborn and can be attained by overcoming self-interest.

René Descartes used methodological doubt to seek certain foundations for philosophy.

The course of modern philosophy was shaped by René Descartes (1596–1650), who stated that philosophy must begin from a position of indubitable knowledge of first principles. Inspired by skepticism, he aimed to find absolutely certain knowledge by encountering truths that cannot be doubted. He thought that this is the case for the assertion "I think, therefore I am", from which he constructed the rest of his philosophical system. Descartes, together with Baruch Spinoza (1632–1677) and Gottfried Wilhelm Leibniz (1646–1716), belonged to the school of rationalism, which asserts that the mind possesses innate ideas independent of experience. John Locke (1632–1704) rejected this view in favor of an empiricism according to which the mind is a blank slate. This means that all ideas depend on experience, either as "ideas of sense", which are directly presented through the senses, or as "ideas of reflection", which the mind creates by reflecting on its own activities. David Hume (1711–1776) used this idea to explore the limits of what people can know. He said that knowledge of facts is never certain, adding that knowledge of relations between ideas, like mathematical truths, can be certain but contains no information about the world. Immanuel Kant (1724–1804) sought a middle ground between rationalism and empiricism by identifying a type of knowledge overlooked by Hume. For Kant, this knowledge pertains to principles that underlie and structure all experience, such as spatial and temporal relations and fundamental categories of understanding.

In the 19th century and influenced by Kant's philosophy, Georg Wilhelm Friedrich Hegel (1770–1831) rejected empiricism by arguing that sensory impressions alone cannot amount to knowledge since all knowledge is actively structured by the knowing subject. John Stuart Mill (1806–1873), by contrast, defended a wide-sweeping form of empiricism and explained knowledge of general truths through inductive reasoning. Charles Peirce (1839–1914) thought that all knowledge is fallible, emphasizing that knowledge seekers should remain open to revising their beliefs in light of new evidence. He used this idea to argue against Cartesian foundationalism, which seeks absolutely certain truths.

In the 20th century, fallibilism was further explored by J. L. Austin (1911–1960) and Karl Popper (1902–1994). In continental philosophy, Edmund Husserl (1859–1938) applied the skeptical idea of suspending judgment to the study of experience. By not judging whether an experience is accurate, he tried to describe its internal structure instead. Influenced by earlier empiricists, logical positivists, like A. J. Ayer (1910–1989), said that all knowledge is either empirical or analytic, rejecting any form of metaphysical knowledge. Bertrand Russell (1872–1970) developed an empiricist sense-datum theory, distinguishing between direct knowledge by acquaintance of sense data and indirect knowledge by description, which is inferred from knowledge by acquaintance. Common sense had a central place in G. E. Moore's (1873–1958) epistemology. He used trivial observations, like the fact that he has two hands, to argue against abstract philosophical theories that deviate from common sense. Ordinary language philosophy, as practiced by the late Ludwig Wittgenstein (1889–1951), is a similar approach that tries to extract epistemological insights from how ordinary language is used.

Edmund Gettier (1927–2021) conceived counterexamples against the idea that knowledge is justified true belief. These counterexamples prompted many philosophers to suggest alternative definitions of knowledge. Developed by philosophers such as Alvin Goldman (1938–2024), reliabilism emerged as one of the alternatives, asserting that knowledge requires reliable sources and shifting the focus away from justification. Virtue epistemologists, such as Ernest Sosa (born 1940) and Linda Zagzebski (born 1946), analyse belief formation in terms of the intellectual virtues or cognitive competencies involved in the process. Naturalized epistemology, as conceived by Willard Van Orman Quine (1908–2000), employs concepts and ideas from the natural sciences to formulate its theories. Other developments in late 20th-century epistemology were the emergence of social, feminist, and historical epistemology.

Fringe science


Fringe science refers to ideas that are highly speculative or that rely on premises already refuted. Ideas rejected by journal editors and published outside the mainstream have only a remote chance of being correct. When the general public does not distinguish between science and its imitators, it risks exploitation, and in some cases a "yearning to believe or a generalized suspicion of experts is a very potent incentive to accepting some pseudoscientific claims".

The term "fringe science" covers everything from novel hypotheses, which can be tested utilizing the scientific method, to wild ad hoc hypotheses and mumbo jumbo. This has resulted in a tendency to dismiss all fringe science as the domain of pseudoscientists, hobbyists, and quacks.

A concept that was once accepted by the mainstream scientific community may become fringe science because of a later evaluation of previous research. For example, focal infection theory, which held that focal infections of the tonsils or teeth are a primary cause of systemic disease, was once considered to be medical fact. It has since been dismissed because of a lack of evidence.

Description

The boundary between fringe science and pseudoscience is disputed. Friedlander writes that there is no widespread understanding of what separates science from nonscience or pseudoscience. Pseudoscience, however, is something that is not scientific but is incorrectly characterised as science.

The term may be considered pejorative. For example, Lyell D. Henry Jr. wrote, "Fringe science [is] a term also suggesting kookiness."

Continental drift was rejected for decades for lack of conclusive evidence before plate tectonics was accepted.

The confusion between science and pseudoscience, between honest scientific error and genuine scientific discovery, is not new, and it is a permanent feature of the scientific landscape .... Acceptance of new science can come slowly.

Examples

Historical

Some historical ideas that are considered to have been refuted by mainstream science are:

  • Wilhelm Reich's work with orgone, a physical energy he claimed to have discovered, contributed to his alienation from the psychiatric community. He was eventually sentenced to two years in a federal prison, where he died. At that time and continuing today, scientists disputed his claim that he had scientific evidence for the existence of orgone. Nevertheless, amateurs and a few fringe researchers continued to believe that orgone is real.
  • Focal infection theory (FIT), as the primary cause of systemic disease, rapidly became accepted by mainstream dentistry and medicine after World War I. This acceptance was largely based upon what later turned out to be fundamentally flawed studies. As a result, millions of people were subjected to needless dental extractions and surgeries. The original studies supporting FIT began falling out of favor in the 1930s. By the late 1950s, it was regarded as a fringe theory.
  • The Clovis First theory held that the Clovis culture was the first culture in North America. It was long regarded as a mainstream theory until mounting evidence of a pre-Clovis culture discredited it.

Modern

Relatively recent fringe sciences include:

  • Aubrey de Grey, featured in a 2006 60 Minutes special report, is studying human longevity. He calls his work "strategies for engineered negligible senescence" (SENS). Many mainstream scientists believe his research is fringe science (especially his view of the importance of nuclear epimutations and his timeline for antiaging therapeutics). In a 2005 article in Technology Review (part of a larger series), it was stated that "SENS is highly speculative. Many of its proposals have not been reproduced, nor could they be reproduced with today's scientific knowledge and technology. Echoing Myhrvold, we might charitably say that de Grey's proposals exist in a kind of antechamber of science, where they wait (possibly in vain) for independent verification. SENS does not compel the assent of many knowledgeable scientists; but neither is it demonstrably wrong."
  • A nuclear fusion reaction called cold fusion, which occurs near room temperature and pressure, was reported by chemists Martin Fleischmann and Stanley Pons in March 1989. Numerous research efforts at the time were unable to replicate their results. Subsequently, several scientists have worked on cold fusion or have participated in international conferences on it. In 2004, the United States Department of Energy commissioned a panel on cold fusion to reexamine the concept. They wanted to determine whether their policies should be altered because of new evidence.
  • The theory of abiogenic petroleum origin holds that petroleum was formed from deep carbon deposits, perhaps dating to the formation of the Earth. The ubiquity of hydrocarbons in the Solar System may be evidence that there may be more petroleum on Earth than commonly thought and that petroleum may originate from carbon-bearing fluids that migrate upward from the Earth's mantle. Abiogenic hypotheses saw a revival in the last half of the twentieth century by Russian and Ukrainian scientists. More interest was generated in the West after the 1992 publication by Thomas Gold of the journal article, "The Deep, Hot Biosphere". Gold's version of the theory is partly based on the existence of a biosphere composed of thermophile bacteria in the Earth's crust, which might explain the existence of specific biomarkers in extracted petroleum.
  • Young Earth creationism asserts that the Universe is under 10,000 years old; variations, such as Gap creationism and Old Earth creationism, offer related propositions.
  • Modern flat Earth advocates assert that Samuel Rowbotham was correct in his 1849 pamphlet and 1865 book, Zetetic Astronomy: The Earth not a Globe, to reject the Hellenistic (323 BCE–31 BCE) spherical Earth model in favor of a flat disc world centred on the North Pole.
  • The Out of India theory asserts that Indo-European languages evolved in India and spread to Europe through waves of prehistoric Indian migrants.

Accepted as mainstream

Some theories that were once rejected as fringe science but were eventually accepted as mainstream science include:

Responding to fringe science

Michael W. Friedlander has suggested some guidelines for responding to fringe science, which, he argues, is a more difficult problem than scientific misconduct. His suggested methods include impeccable accuracy, checking cited sources, not overstating orthodox science, thorough understanding of the Wegener continental drift example, examples of orthodox science investigating radical proposals, and prepared examples of errors from fringe scientists.

Friedlander suggests that fringe science is necessary so mainstream science will not atrophy. Scientists must evaluate the plausibility of each new fringe claim, and certain fringe discoveries "will later graduate into the ranks of accepted" — while others "will never receive confirmation".

Margaret Wertheim profiled many "outsider scientists" in her book Physics on the Fringe, who receive little or no attention from professional scientists. She describes all of them as trying to make sense of the world using the scientific method despite being unable to understand modern science's complex theories. She also finds it fair that credentialed scientists do not bother spending a lot of time learning about and explaining problems with the fringe theories of uncredentialed scientists, since the authors of those theories have not taken the time to understand the mainstream theories they aim to disprove.

Controversies

As Donald E. Simanek asserts, "Too often speculative and tentative hypotheses of cutting edge science are treated as if they were scientific truths, and so accepted by a public eager for answers." However, the public is often unaware that "As science progresses from ignorance to understanding it must pass through a transitional phase of confusion and uncertainty."

The media also play a role in propagating the belief that certain fields of science are controversial. In their 2003 paper "Optimising Public Understanding of Science and Technology in Europe: A Comparative Perspective", Jan Nolin et al. write that "From a media perspective it is evident that controversial science sells, not only because of its dramatic value, but also since it is often connected to high-stake societal issues."

Prehistory of nakedness and clothing

From Wikipedia, the free encyclopedia

Nakedness and clothing use are characteristics of humans related by evolutionary and social prehistory. The major loss of body hair distinguishes humans from other primates. Current evidence indicates that anatomically-modern humans were naked in prehistory for at least 90,000 years before they invented clothing. Today, isolated Indigenous peoples in tropical climates continue to be without clothing in many everyday activities.

Evolution of hairlessness

Humans' closest living relatives, such as chimpanzees and bonobos, have both extensive areas of fur and bare patches.

The general hairlessness of humans in comparison to related species may be due to loss of functionality in the pseudogene called KRT41P (which helps produce keratin) in the human lineage about 240,000 years ago. On an individual basis, mutations in the gene HR can lead to complete hair loss, though this is not typical in humans. Humans may also lose their hair as a result of hormonal imbalance due to drugs or pregnancy.

To understand why humans have significantly less body hair than other primates, one must recognize that mammalian body hair is not merely an aesthetic characteristic; it protects the skin from wounds, bites, heat, cold, and ultraviolet radiation. It can also serve as a communication tool and as camouflage.

The first member of the genus Homo to be hairless was Homo erectus, which originated about 1.6 million years ago. The dissipation of body heat remains the most widely accepted evolutionary explanation for the loss of body hair in early members of the genus Homo, of which modern humans are the only surviving species. Less hair and an increase in sweat glands made it easier for their bodies to cool when they moved from shady forest to open savanna. This change in environment also brought a change in diet, from a largely vegetarian one to hunting and gathering. Pursuing game on the savanna further increased the need to regulate body heat.

The anthropologist and paleobiologist Nina Jablonski posits that the ability to dissipate excess body heat through eccrine sweating helped make possible the dramatic enlargement of the brain, the most temperature-sensitive organ in the human body. Thus the loss of fur was also a factor in further adaptations, both physical and behavioral, that differentiated humans from other primates. Some of these changes are thought to be the result of sexual selection: by selecting more hairless mates, humans accelerated changes initiated by natural selection. Sexual selection may also account for the remaining human hair in the pubic area and armpits, which are sites for pheromones, while hair on the head continued to provide protection from the sun. Anatomically modern humans, whose traits include hairlessness, evolved 260,000 to 350,000 years ago.

Phenotypic changes

Humans are the only primate species to have undergone significant hair loss, and of the approximately 5,000 extant species of mammal, only a handful are effectively hairless, among them elephants, rhinoceroses, hippopotamuses, walruses, some species of pigs, whales and other cetaceans, and naked mole rats. Most mammals have light skin covered by fur, and biologists believe that early human ancestors started out this way as well. Dark skin probably evolved after humans lost their body fur, because the naked skin was vulnerable to strong UV radiation, as discussed under the Out of Africa hypothesis. Evidence of when human skin darkened has therefore been used to date the loss of human body hair, on the assumption that dark skin was needed only after the fur was gone.

With the loss of fur, darker, high-melanin skin evolved as a protection from ultraviolet radiation damage. As humans migrated outside of the tropics, varying degrees of depigmentation evolved in order to permit UVB-induced synthesis of previtamin D3. The relative lightness of female compared to male skin in a given population may be due to the greater need for women to produce more vitamin D during lactation.

The sweat glands in humans could have evolved to spread from the hands and feet as the body hair changed, or the hair change could have occurred to facilitate sweating. Horses and humans are two of the few animals capable of sweating on most of their body, yet horses are larger and still have fully developed fur. In humans, the skin hairs lie flat in hot conditions, as the arrector pili muscles relax, preventing heat from being trapped by a layer of still air between the hairs, and increasing heat loss by convection.

Sexual selection hypothesis

Another hypothesis for the reduction of human body hair proposes that Fisherian runaway sexual selection played a role, as it may have in the selection of long head hair (see terminal and vellus hair), along with a much larger role of testosterone in men. Sexual selection is the only theory thus far that explains the sexual dimorphism seen in the hair patterns of men and women. On average, men have more body hair than women: males have more terminal hair, especially on the face, chest, abdomen, and back, while females have more vellus hair, which is less visible. The halting of hair development at a juvenile, vellus stage would also be consistent with the neoteny evident in humans, especially in females, so the two changes could have occurred at the same time. This theory, however, may be rooted in today's cultural norms: there is no evidence that sexual selection would proceed to such a drastic extent over a million years ago, when a full, lush coat of hair would most likely have indicated health and would therefore have been selected for, not against.

Water-dwelling hypothesis

The aquatic ape hypothesis (AAH) includes hair loss as one of several characteristics of modern humans that could indicate adaptation to an aquatic environment. Contemporary anthropologists may give serious consideration to some hypotheses related to the AAH, but hair loss is not one of them.

Parasite hypothesis

A divergent explanation of humans' relative hairlessness holds that ectoparasites (such as ticks) residing in fur became problematic as humans became hunters living in larger groups with a "home base". Nakedness would also make the lack of parasites apparent to prospective mates. However, this theory is inconsistent with the abundance of parasites that continue to exist in the remaining patches of human hair.

The "ectoparasite" explanation of modern human nakedness is based on the principle that a hairless primate would harbor fewer parasites. When our ancestors adopted group-dwelling social arrangements roughly 1.8 Mya (million years ago), ectoparasite loads increased dramatically. Early humans became the only one of the 193 primate species to have fleas, which can be attributed to the close living arrangements of large groups of individuals. While other primate species have communal sleeping arrangements, those groups are always on the move and thus less likely to harbor ectoparasites. Humans nevertheless have as many follicles as other primates; the hair is simply shorter and finer. This "peach fuzz" may have been retained because it both allows humans to detect the presence of ectoparasites and inhibits their movement on the skin.

It was expected that dating the split of the ancestral human louse into two species, the head louse and the pubic louse, would date the loss of body hair in human ancestors. However, it turned out that the human pubic louse does not descend from the ancestral human louse, but from the gorilla louse, diverging 3.3 million years ago. This suggests that humans had lost body hair (but retained head hair) and developed thick pubic hair prior to this date, were living in or close to the forest where gorillas lived, and acquired pubic lice from butchering gorillas or sleeping in their nests.[26][27] The evolution of the body louse from the head louse, on the other hand, places the date of clothing much later, some 100,000 years ago.

Origin of clothing

A necklace reconstructed from perforated sea snail shells from Upper Palaeolithic Europe, dated between 39,000 and 25,000 BCE. The practice of body adornment is associated with the emergence of behavioral modernity.

A 2010 study published in Molecular Biology and Evolution indicates that the habitual wearing of clothing began between 83,000 and 170,000 years ago, based upon a genetic analysis of when clothing lice diverged from their head louse ancestors. This suggests that the use of clothing likely originated with anatomically modern humans in Africa prior to their migration to colder climates, and indeed made that migration possible.

Some of the technology for what is now called clothing may have originated to make other types of adornment, including jewelry, body paint, tattoos, and other body modifications, "dressing" the naked body without concealing it. According to Mark Leary and Nicole R. Buttermore, body adornment is one of the changes of the late Paleolithic (40,000 to 60,000 years ago) in which humans became not only anatomically but also behaviorally modern, capable of self-reflection and symbolic interaction. More recent studies place the use of adornment at 77,000 years ago in South Africa, and 90,000–100,000 years ago in Palestine and Algeria. While modesty may be a factor, often-overlooked purposes for body coverings are camouflage used by hunters, body armor, and costumes used to impersonate "spirit-beings".

The origin of complex, fitted clothing required the invention of fine stone knives for cutting skins into pieces and of the eyed needle for sewing. This was achieved by the Cro-Magnons, who migrated to Europe around 35,000 years ago. Neanderthals occupied the same region but became extinct in part because, limited by their simple stone tools, they could not make fitted garments and instead draped themselves with crudely cut skins, which did not provide the warmth needed to survive as the climate grew colder in the Last Glacial Period. In addition to being less functional, such simple wrappings would not have been worn habitually by Neanderthals, who were more tolerant of cold than Homo sapiens, and so the wrappings would not have acquired the secondary functions of decoration and promoting modesty.

The earliest archeological evidence of fabric clothing is inferred from representations in figurines from the southern Levant, dated between 11,700 and 10,500 years ago. The earliest surviving examples of woven cloth are linen from Egypt dated to 5000 BCE, while knotted or twisted flax fibers have been found from as early as 7000 BCE.

Adults are rarely completely naked in modern societies and cover at least their genitals, but adornments and clothing often emphasize, enhance, or otherwise call attention to the sexuality of the body.

Behavioral modernity

From Wikipedia, the free encyclopedia
Upper Paleolithic (16,000-year-old) cave painting from Lascaux cave in France

Behavioral modernity refers to a suite of behavioral and cognitive traits associated with humans (Homo sapiens), reflecting capacities such as abstract and symbolic thought, planning depth, cumulative culture, and complex social learning. These traits are often inferred archaeologically through evidence including symbolic artifacts (e.g., art, ornamentation), ritualized behavior, music and dance, sophisticated hunting strategies, and advanced lithic technologies such as blade production. Rather than representing an absolute boundary between Homo sapiens and other hominins, behavioral modernity is increasingly understood as a mosaic of traits that emerged gradually and were expressed variably across time and populations.

Evolution

The Venus of Hohle Fels figurine was carved about 40,000 years ago and is a product of behavioral modernity

Anatomically modern humans possessed much of the necessary neural architecture by at least ~300 thousand years ago, but early populations were small and fragmented, limiting the persistence and transmission of complex behaviors. As a result, archaeological signals of symbolism, art, and advanced technologies appear only sporadically in Africa between ~150–75 kya, reflecting intermittent expression even though the cognitive capacity was probably present. Widespread, continuous manifestations of these behaviors became archaeologically visible only after populations grew denser and social networks expanded, appearing across continents, as during the Upper Paleolithic in Europe.

In this view, behavioral modernity is primarily cultural and learned, shaped by high-fidelity social learning, cumulative culture, and demographic thresholds, while resting on an evolved cognitive substrate that predates its full material expression. Differences between Homo sapiens and other hominins are therefore understood as differences of degree, stability, and cultural accumulation, not the presence or absence of a single cognitive mutation.

Underlying these behaviors and technological innovations are cognitive and cultural foundations that have been documented experimentally and ethnographically by evolutionary and cultural anthropologists. These human universal patterns include cumulative cultural adaptation, social norms, language, and extensive help and cooperation beyond close kin.

Within the tradition of evolutionary anthropology and related disciplines, it has been argued that the development of these modern behavioral traits, in combination with the climatic conditions of the Last Glacial Period and Last Glacial Maximum causing population bottlenecks, contributed to the evolutionary success of Homo sapiens worldwide relative to Neanderthals, Denisovans, and other archaic humans.

There are many other hypotheses on the evolution of behavioral modernity. These approaches tend to fall into two camps, cognitive and gradualist:

The late Upper Paleolithic model hypothesizes that modern human behavior arose through abrupt cognitive and genetic changes in Africa around 40,000–50,000 years ago, near the time of the Out-of-Africa migration. Dubbed the "cognitive revolution" or the "Upper Paleolithic revolution", this change is held to have prompted the movement of some modern humans out of Africa and across the world.

Gradualist models focus on how modern human behavior may have arisen through gradual steps, with the archaeological signatures of such behavior appearing only through demographic or subsistence-based changes. Many cite evidence of behavioral modernity appearing earlier, by at least about 150,000–75,000 years ago and possibly before, namely in the African Middle Stone Age. Anthropologists Sally McBrearty and Alison S. Brooks have been notable proponents of gradualism, challenging Europe-centered models by situating more change in the African Middle Stone Age, though this model is more difficult to substantiate owing to the general thinning of the fossil record further back in time.

Definition

A Māori man performing haka, a ceremonial dance. He is displaying several hallmarks of behavioral modernity including the use of jewelry, application of body paint, music and dance, and symbolic behavior.

To classify what should be included in modern human behavior, it is necessary to define behaviors that are universal among living human groups. Some examples of these human universals are abstract thought, planning, trade, cooperative labor, body decoration, and the control and use of fire. Along with these traits, humans rely heavily on social learning. This cumulative cultural change, or cultural "ratchet", separates human culture from social learning in animals. A reliance on social learning may also be responsible in part for humans' rapid adaptation to many environments outside of Africa. Since cultural universals are found in all cultures, including isolated indigenous groups, these traits must have evolved or been invented in Africa prior to the exodus.

Archaeologically, a number of empirical traits have been used as indicators of modern human behavior. While these are often debated, a few are generally agreed upon; examples of such archaeological evidence are discussed in the sections below.

Critiques

Several critiques have been leveled against the traditional concept of behavioral modernity, both methodological and philosophical. Anthropologist John Shea outlines a variety of problems with the concept, arguing instead for "behavioral variability", which, according to him, better describes the archaeological record. The use of trait lists, according to Shea, runs the risk of taphonomic bias, where some sites may yield more artifacts than others despite similar populations; trait lists can also be ambiguous about how behaviors are empirically recognized in the archaeological record. In particular, Shea cautions that population pressure, cultural change, or optimality models, like those in human behavioral ecology, might better predict changes in tool types or subsistence strategies than a change from "archaic" to "modern" behavior. Some researchers argue that a greater emphasis should be placed on identifying only those artifacts which are unquestionably, or purely, symbolic as a metric for modern human behavior.

Since 2018, dating of various cave art sites in Spain and France has shown that Neanderthals engaged in symbolic artistic expression, creating the red "lines, dots, and hand stencils" found in caves prior to contact with anatomically modern humans. This contradicts previous suggestions that Neanderthals lacked these capabilities.

Hypotheses and models

Late Upper Paleolithic model or "Upper Paleolithic Revolution"

The Late Upper Paleolithic Model, or Upper Paleolithic Revolution, refers to the idea that, though anatomically modern humans first appeared around 150,000 years ago (as was once believed), they were not cognitively or behaviorally "modern" until around 50,000 years ago, leading to their expansion out of Africa and into Europe and Asia. Its proponents note that the traits used as a metric for behavioral modernity do not appear as a package until around 40,000–50,000 years ago. Anthropologist Richard Klein specifically describes evidence of fishing, tools made from bone, hearths, significant artifact diversity, and elaborate graves as all absent before this point. According to both Shea and Klein, art becomes common only beyond this switching point, signifying a change from archaic to modern humans. Most such researchers argue that a neurological or genetic change, perhaps one enabling complex language, such as FOXP2, caused this revolutionary change in humans. The role of FOXP2 as a driver of evolutionary selection has, however, been called into question by recent research results.

The African Middle Stone Age provides some of the earliest evidence of behavioral modernity. In Southern Africa, groups of people would bypass closer deposits of rich, deep-red ochre in favor of mining more distant ones. After mining, pieces of ochre show evidence of grinding into powder, which indicates that ochre was being processed for a reason beyond simple decoration. In North Africa, similar early evidence of behavioral modernity is found in the 82,000-year-old Taforalt cave, where marine shells were perforated for the construction of necklaces. Notably, the cave lies far inland, which implies a trade network through which people could acquire shells of the coastal Nassarius species. Evidence of wear showed that the beads were worn as personal ornaments, another indicator of behavioral modernity. In Africa, then, the evidence presents behavioral modernity in the form of symbolism, personal ornamentation, and trade in necklace beads, all occurring tens of thousands of years before these behaviors appeared in Europe. These results dispute older models that placed the origin of modern behavior in the Upper Paleolithic and treated its appearance in Africa as "sudden".

Building on the FOXP2 gene hypothesis, cognitive scientist Philip Lieberman has argued that proto-language behaviour existed prior to 50,000 BP, albeit in a more primitive form. Lieberman has advanced fossil evidence, such as neck and throat dimensions, to show that so-called "anatomically modern" humans from 100,000 BP continued to evolve their supralaryngeal vocal tract (SVT), which already possessed a horizontal portion (SVTh) capable of producing many phonemes, mostly consonants. According to his hypothesis, Neanderthals and early Homo sapiens would have been able to communicate using sounds and gestures.

From 100,000 BP, Homo sapiens' necks continued to lengthen until, by around 50,000 BP, they were long enough to accommodate a vertical portion of the SVT (SVTv), now a universal trait among humans. This SVTv enabled the enunciation of the quantal vowels [i], [u], and [a]. These quantal vowels could then be put to immediate use by the already sophisticated neuro-motor-control features associated with the FOXP2 gene to generate more nuanced sounds, in effect increasing by orders of magnitude the number of distinct sounds that can be produced and allowing for fully symbolic language.

Goody (1986) draws an analogy between the development of spoken language and that of writing: the shift from pictographic or ideographic symbols into a fully abstract logographic writing system (such as hieroglyphs), or from a logographic system into an abjad or alphabet, led to dramatic changes in human civilization.

Contrasted with this view of a spontaneous leap in cognition among ancient humans, some anthropologists like Alison S. Brooks, primarily working in African archaeology, point to the gradual accumulation of "modern" behaviors, starting well before the 50,000-year benchmark of the Upper Paleolithic Revolution models. Howiesons Poort, Blombos, and other South African archaeological sites, for example, show evidence of marine resource acquisition, trade, the making of bone tools, blade and microlithic technology, and abstract ornamentation at least by 80,000 years ago. Given evidence from Africa and the Middle East, a variety of hypotheses have been put forth to describe an earlier, gradual transition from simple to more complex human behavior. Some authors have pushed back the appearance of fully modern behavior to around 80,000 years ago or earlier in order to incorporate the South African data.

Others focus on the slow accumulation of different technologies and behaviors across time. These researchers describe how anatomically modern humans could have been cognitively the same, with what we define as behavioral modernity being simply the result of thousands of years of cultural adaptation and learning. Archaeologist Francesco d'Errico and others have looked at Neanderthal culture, rather than early human behavior exclusively, for clues to behavioral modernity. Noting that Neanderthal assemblages often display traits similar to those listed for modern human behavior, these researchers stress that the foundations for behavioral modernity may in fact lie deeper in our hominin ancestors. If both modern humans and Neanderthals expressed abstract art and complex tools, then "modern human behavior" cannot be a derived trait for our species. They argue that the original "human revolution" hypothesis reflects a profound Eurocentric bias. Recent archaeological evidence, they argue, shows that humans evolving in Africa some 300,000 or even 400,000 years ago were already becoming cognitively and behaviourally "modern". These features include blade and microlithic technology, bone tools, increased geographic range, specialized hunting, the use of aquatic resources, long-distance trade, systematic processing and use of pigment, and art and decoration. These items do not occur suddenly together as predicted by the "human revolution" model, but at sites widely separated in space and time, suggesting a gradual assembling of the package of modern human behaviours in Africa and its later export to other regions of the Old World.

Between these extremes is the view—currently supported by archaeologists Chris Henshilwood, Curtis Marean, Ian Watts and others—that there was indeed some kind of "human revolution" but that it occurred in Africa and spanned tens of thousands of years. The term "revolution," in this context, would mean not a sudden mutation but a historical development along the lines of the industrial revolution or the Neolithic revolution. In other words, it was a relatively accelerated process, too rapid for ordinary Darwinian "descent with modification" yet too gradual to be attributed to a single genetic or other sudden event. These archaeologists point in particular to the relatively explosive emergence of ochre crayons and shell necklaces, apparently used for cosmetic purposes. These archaeologists see symbolic organisation of human social life as the key transition in modern human evolution. Recently discovered at sites such as Blombos Cave and Pinnacle Point, South Africa, pierced shells, pigments and other striking signs of personal ornamentation have been dated within a time-window of 70,000–160,000 years ago in the African Middle Stone Age, suggesting that the emergence of Homo sapiens coincided, after all, with the transition to modern cognition and behaviour. While viewing the emergence of language as a "revolutionary" development, this school of thought generally attributes it to cumulative social, cognitive and cultural evolutionary processes as opposed to a single genetic mutation.

A further view, taken by archaeologists such as Francesco d'Errico and João Zilhão, is a multi-species perspective arguing that evidence for symbolic culture, in the form of utilised pigments and pierced shells, is also found in Neanderthal sites, independently of any "modern" human influence.

Cultural evolutionary models may also shed light on why, although evidence of behavioral modernity exists before 50,000 years ago, it was not expressed consistently until that point. With small population sizes, human groups would have been affected by demographic and cultural evolutionary forces that may not have allowed complex cultural traits to persist. According to some authors, complex traits could not have been maintained effectively until population density became significantly high. Some genetic evidence supports a dramatic increase in population size before the human migration out of Africa. High local extinction rates within a population can also significantly decrease the diversity of neutral cultural traits, regardless of cognitive ability.

Archaeological evidence

Africa

Research from 2017 indicates that Homo sapiens originated in Africa between around 350,000 and 260,000 years ago. There is some evidence for the beginning of modern behavior among early African H. sapiens around that period.

Before the Out of Africa theory was generally accepted, there was no consensus on where the human species evolved and, consequently, where modern human behavior arose. Now, however, African archaeology has become extremely important to discovering the origins of humanity. The first Cro-Magnon expansion into Europe around 48,000 years ago is generally accepted as already "modern", and it is now generally believed that behavioral modernity appeared in Africa before 50,000 years ago, either significantly earlier or possibly as a late Upper Paleolithic "revolution" shortly beforehand that prompted the migration out of Africa.

A variety of evidence of abstract imagery, widened subsistence strategies, and other "modern" behaviors has been discovered in Africa, especially in South, North, and East Africa. The Blombos Cave site in South Africa, for example, is famous for rectangular slabs of ochre engraved with geometric designs; using multiple dating techniques, these have been dated to around 77,000 and 100,000–75,000 years old. Ostrich eggshell containers engraved with geometric designs dating to 60,000 years ago were found at Diepkloof, South Africa. Beads and other personal ornamentation found in Morocco might be as much as 130,000 years old; the Cave of Hearths in South Africa has yielded a number of beads dating from significantly before 50,000 years ago, and shell beads dating to about 75,000 years ago have been found at Blombos Cave, South Africa.

Specialized projectile weapons have also been found at various sites in Middle Stone Age Africa. These include bone and stone arrowheads, on some of which poisons may have been used, at South African sites such as Sibudu Cave (along with an early bone needle, also found at Sibudu) dating to approximately 72,000–60,000 years ago, and bone harpoons at the Central African site of Katanda dating to about 90,000 years ago. Traces of toxic plant alkaloids have been found on microlithic arrowheads in KwaZulu-Natal, South Africa, dated to 60,000 years ago. Evidence also exists for the systematic heat treating of silcrete stone to increase its flakeability for toolmaking, beginning approximately 164,000 years ago at the South African site of Pinnacle Point and becoming common there for the creation of microlithic tools by about 72,000 years ago.

In 2008, an ochre-processing workshop, likely for the production of paints, was uncovered at Blombos Cave, South Africa, dating to c. 100,000 years ago. Analysis shows that a liquefied pigment-rich mixture was produced and stored in two abalone shells, and that ochre, bone, charcoal, grindstones, and hammerstones also formed part of the toolkits. Evidence for the complexity of the task includes procuring and combining raw materials from various sources (implying a mental template of the process to be followed), possibly using pyrotechnology to facilitate fat extraction from bone, using a probable recipe to produce the compound, and using shell containers for mixing and storage for later use. Modern behaviors, such as the making of shell beads, bone tools, and arrows, and the use of ochre pigment, are evident at a Kenyan site by 78,000–67,000 years ago. Evidence of early stone-tipped projectile weapons (a characteristic tool of Homo sapiens), the stone tips of javelins or throwing spears, was discovered in 2013 at the Ethiopian site of Gademotta and dates to around 279,000 years ago.

Expanding subsistence strategies beyond big-game hunting, and the consequent diversity in tool types, have been noted as signs of behavioral modernity. A number of South African sites show an early reliance on aquatic resources, from fish to shellfish. Pinnacle Point, in particular, shows exploitation of marine resources as early as 120,000 years ago, perhaps in response to more arid conditions inland. Establishing a reliance on predictable shellfish deposits, for example, could reduce mobility and facilitate complex social systems and symbolic behavior. Blombos Cave and Site 440 in Sudan both show evidence of fishing as well. Taphonomic change in fish skeletons from Blombos Cave has been interpreted as the capture of live fish, clearly an intentional human behavior.

Humans in North Africa (Nazlet Sabaha, Egypt) are known to have dabbled in chert mining, as early as ≈100,000 years ago, for the construction of stone tools.

Evidence was found in 2018, dating to about 320,000 years ago, at the Kenyan site of Olorgesailie, of the early emergence of modern behaviors, including long-distance trade networks (involving goods such as obsidian), the use of pigments, and the possible making of projectile points. The authors of three 2018 studies on the site observe that the evidence of these behaviors is approximately contemporary with the earliest known Homo sapiens fossil remains from Africa (such as at Jebel Irhoud and Florisbad), and they suggest that complex and modern behaviors had already begun in Africa around the time of the emergence of anatomically modern Homo sapiens.

In 2019, further evidence of early complex projectile weapons in Africa was found at Aduma, Ethiopia, dated 100,000–80,000 years ago, in the form of points considered likely to belong to darts delivered by spear throwers.

Olduvai Hominid 1 wore facial piercings.

Europe

While traditionally described as evidence for the later Upper Paleolithic Model, European archaeology has shown that the issue is more complex. A variety of stone tool technologies were present at the time of human expansion into Europe and show evidence of modern behavior. Despite the problems of conflating specific tools with cultural groups, the Aurignacian tool complex, for example, is generally taken as a purely modern human signature. The discovery of "transitional" complexes, like the "proto-Aurignacian", has been taken as evidence of human groups progressing through "steps of innovation". If, as this might suggest, human groups were already migrating into eastern Europe around 40,000 years ago and only afterward show evidence of behavioral modernity, then the cognitive change must either have diffused back into Africa or have been present before the migration.

In light of a growing body of evidence of Neanderthal culture and tool complexes, some researchers have put forth a "multiple species model" for behavioral modernity. Neanderthals were often cited as an evolutionary dead end, apish cousins less advanced than their human contemporaries, and their personal ornaments were dismissed as trinkets or poor imitations compared to the cave art produced by H. sapiens. Despite this, European evidence has shown a variety of personal ornaments and artistic artifacts produced by Neanderthals; for example, the Neanderthal site of Grotte du Renne has yielded grooved bear, wolf, and fox incisors, ochre, and other symbolic artifacts. Although sparse and controversial, circumstantial evidence of Neanderthal ritual burials has also been uncovered. There are two ways to explain this symbolic behavior among Neanderthals: either they copied cultural traits from arriving modern humans, or they had their own cultural traditions comparable with behavioral modernity. Even if they merely copied cultural traditions, which several authors dispute, they still possessed the capacity for the complex culture that behavioral modernity describes. As discussed above, if Neanderthals were also "behaviorally modern", then behavioral modernity cannot be a species-specific derived trait.

Asia

Most debates surrounding behavioral modernity have focused on Africa or Europe, but an increasing amount of attention has been placed on East Asia. This region offers a unique opportunity to test hypotheses of multi-regionalism, replacement, and demographic effects. Unlike in Europe, where initial migration occurred around 50,000 years ago, human remains in China have been dated to around 100,000 years ago. This early evidence of human expansion calls into question behavioral modernity as an impetus for migration.

Stone tool technology is of particular interest in East Asia. Following Homo erectus migrations out of Africa, Acheulean technology never seems to appear beyond present-day India and into China. Similarly, Mode 3, or Levallois, technology is not apparent in China following later hominin dispersals. This lack of more advanced technology has been explained by serial founder effects and low population densities out of Africa. Although tool complexes comparable to those of Europe are missing or fragmentary, other archaeological evidence shows behavioral modernity. For example, the peopling of the Japanese archipelago offers an opportunity to investigate the early use of watercraft. Although one site, Kanedori in Honshu, does suggest the use of watercraft as early as 84,000 years ago, there is no other evidence of hominins in Japan until 50,000 years ago.

The Zhoukoudian cave system near Beijing has been excavated since the 1930s and has yielded valuable data on early human behavior in East Asia. Although disputed, there is evidence of possible human burials and interred remains in the cave dated to around 34,000–20,000 years ago. These remains have associated personal ornaments in the form of beads and worked shell, suggesting symbolic behavior. Along with the possible burials, numerous other symbolic objects, such as punctured animal teeth and beads, some dyed in red ochre, have been found at Zhoukoudian. Although fragmentary, the archaeological record of eastern Asia shows evidence of behavioral modernity before 50,000 years ago, but, as in the African record, it is not fully apparent until that time.
