
Friday, May 30, 2025

Epistemology

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Epistemology

Epistemology is the branch of philosophy that examines the nature, origin, and limits of knowledge. Also called "the theory of knowledge", it explores different types of knowledge, such as propositional knowledge about facts, practical knowledge in the form of skills, and knowledge by acquaintance as a familiarity through experience. Epistemologists study the concepts of belief, truth, and justification to understand the nature of knowledge. To discover how knowledge arises, they investigate sources of justification, such as perception, introspection, memory, reason, and testimony.

The school of skepticism questions the human ability to attain knowledge while fallibilism says that knowledge is never certain. Empiricists hold that all knowledge comes from sense experience, whereas rationalists believe that some knowledge does not depend on it. Coherentists argue that a belief is justified if it coheres with other beliefs. Foundationalists, by contrast, maintain that the justification of basic beliefs does not depend on other beliefs. Internalism and externalism debate whether justification is determined solely by mental states or also by external circumstances.

Separate branches of epistemology focus on knowledge in specific fields, like scientific, mathematical, moral, and religious knowledge. Naturalized epistemology relies on empirical methods and discoveries, whereas formal epistemology uses formal tools from logic. Social epistemology investigates the communal aspect of knowledge, and historical epistemology examines its historical conditions. Epistemology is closely related to psychology, which describes the beliefs people hold, while epistemology studies the norms governing the evaluation of beliefs. It also intersects with fields such as decision theory, education, and anthropology.

Early reflections on the nature, sources, and scope of knowledge are found in ancient Greek, Indian, and Chinese philosophy. The relation between reason and faith was a central topic in the medieval period. The modern era was characterized by the contrasting perspectives of empiricism and rationalism. Epistemologists in the 20th century examined the components, structure, and value of knowledge while integrating insights from the natural sciences and linguistics.

Definition

Epistemology is the philosophical study of knowledge and related concepts, such as justification. Also called theory of knowledge, it examines the nature and types of knowledge. It further investigates the sources of knowledge, like perception, inference, and testimony, to understand how knowledge is created. Another set of questions concerns the extent and limits of knowledge, addressing what people can and cannot know. Central concepts in epistemology include belief, truth, evidence, and reason. As one of the main branches of philosophy, epistemology stands alongside fields like ethics, logic, and metaphysics. The term can also refer to specific positions of philosophers within this branch, as in Plato's epistemology and Immanuel Kant's epistemology.

Epistemology explores how people should acquire beliefs. It determines which beliefs or forms of belief acquisition meet the standards or epistemic goals of knowledge and which ones fail, thereby providing an evaluation of beliefs. The fields of psychology and cognitive sociology are also interested in beliefs and related cognitive processes, but examine them from a different perspective. Unlike epistemology, they study the beliefs people actually have and how people acquire them instead of examining the evaluative norms of these processes. In this regard, epistemology is a normative discipline, whereas psychology and cognitive sociology are descriptive disciplines. Epistemology is relevant to many descriptive and normative disciplines, such as the other branches of philosophy and the sciences, by exploring the principles of how they may arrive at knowledge.

The word epistemology comes from the ancient Greek terms ἐπιστήμη (episteme, meaning 'knowledge' or 'understanding') and λόγος (logos, meaning 'study of' or 'reason'): literally, 'the study of knowledge'. Despite its ancient roots, the word itself was only coined in the 19th century to designate this field as a distinct branch of philosophy.

Central concepts

Epistemologists examine several foundational concepts to understand their essences and rely on them to formulate theories. Various epistemological disagreements have their roots in disputes about the nature and function of these concepts, like the controversies surrounding the definition of knowledge and the role of justification in it.

Knowledge

Knowledge is an awareness, familiarity, understanding, or skill. Its various forms all involve a cognitive success through which a person establishes epistemic contact with reality. Epistemologists typically understand knowledge as an aspect of individuals, generally as a cognitive mental state that helps them understand, interpret, and interact with the world. While this core sense is of particular interest to epistemologists, the term also has other meanings. For example, the epistemology of groups examines knowledge as a characteristic of a group of people who share ideas. The term can also refer to information stored in documents and computers.

Knowledge contrasts with ignorance, often simply defined as the absence of knowledge. Knowledge is usually accompanied by ignorance because people rarely have complete knowledge of a field, forcing them to rely on incomplete or uncertain information when making decisions. Even though many forms of ignorance can be mitigated through education and research, certain limits to human understanding result in inevitable ignorance. Some limitations are inherent in the human cognitive faculties themselves, such as the inability to know facts too complex for the human mind to conceive. Others depend on external circumstances when no access to the relevant information exists.

Epistemologists disagree on how much people know, for example, whether fallible beliefs can amount to knowledge or whether absolute certainty is required. The most stringent position is taken by radical skeptics, who argue that there is no knowledge at all.

Types

Bertrand Russell originated the distinction between propositional knowledge and knowledge by acquaintance.

Epistemologists distinguish between different types of knowledge. Their primary interest is in knowledge of facts, called propositional knowledge. It is theoretical knowledge that can be expressed in declarative sentences using a that-clause, like "Ravi knows that kangaroos hop". For this reason, it is also called knowledge-that. Epistemologists often understand it as a relation between a knower and a known proposition, in the case above between the person Ravi and the proposition "kangaroos hop". It is use-independent since it is not tied to one specific purpose, unlike practical knowledge. It is a mental representation that embodies concepts and ideas to reflect reality. Because of its theoretical nature, it is typically held that only creatures with highly developed minds, such as humans, possess propositional knowledge.

Propositional knowledge contrasts with non-propositional knowledge in the form of knowledge-how and knowledge by acquaintance. Knowledge-how is a practical ability or skill, like knowing how to read or how to prepare lasagna. It is usually tied to a specific goal and not mastered in the abstract without concrete practice. To know something by acquaintance means to have an immediate familiarity with or awareness of it, usually as a result of direct experiential contact. Examples are "familiarity with the city of Perth", "knowing the taste of tsampa", and "knowing Marta Vieira da Silva personally".

The analytic–synthetic distinction has its roots in the philosophy of Immanuel Kant.

Another influential distinction in epistemology is between a posteriori and a priori knowledge. A posteriori knowledge is knowledge of empirical facts based on sensory experience, like "seeing that the sun is shining" and "smelling that a piece of meat has gone bad". This type of knowledge is associated with the empirical sciences and everyday affairs. A priori knowledge, by contrast, pertains to non-empirical facts and does not depend on evidence from sensory experience, like knowing that 2 + 2 = 4. It belongs to fields such as mathematics and logic. The distinction between a posteriori and a priori knowledge is central to the debate between empiricists and rationalists regarding whether all knowledge depends on sensory experience.
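To make the nonempirical character of such knowledge concrete, a proof assistant can verify this kind of truth by pure reasoning, with no appeal to observation. The following is a minimal sketch in Lean (Lean 4 syntax; an illustration added here, not part of the encyclopedic text):

-- A priori knowledge in miniature: the claim that 2 + 2 = 4 is
-- verified by definitional computation alone, not by observation.
example : 2 + 2 = 4 := rfl

-- A simple logical truth is likewise checkable without empirical input.
example (p : Prop) (h : p) : p := h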

A closely related contrast is between analytic and synthetic truths. A sentence is analytically true if its truth depends only on the meanings of the words it uses. For instance, the sentence "all bachelors are unmarried" is analytically true because the word "bachelor" already includes the meaning "unmarried". A sentence is synthetically true if its truth depends on additional facts. For example, the sentence "snow is white" is synthetically true because its truth depends on the color of snow in addition to the meanings of the words snow and white. A priori knowledge is primarily associated with analytic sentences, whereas a posteriori knowledge is primarily associated with synthetic sentences. However, it is controversial whether this is true for all cases. Some philosophers, such as Willard Van Orman Quine, reject the distinction, saying that there are no analytic truths.

Analysis

The analysis of knowledge is the attempt to identify the essential components or conditions of all and only propositional knowledge states. According to the so-called traditional analysis, knowledge has three components: it is a belief that is justified and true. In the second half of the 20th century, this view was challenged by a series of thought experiments aiming to show that some justified true beliefs do not amount to knowledge. In one of them, a person travels through an area full of fake barns without being aware of this. By coincidence, they stop in front of the only real barn and form a justified true belief that it is a real barn. Many epistemologists agree that this is not knowledge because the justification is not directly relevant to the truth. More specifically, this and similar counterexamples involve some form of epistemic luck, that is, a cognitive success that results from fortuitous circumstances rather than competence.
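The structure of the traditional analysis, and why the fake-barn case strains it, can be sketched schematically. The Python below is a purely illustrative toy (the class and predicates are hypothetical, not a standard formalization): all three conditions hold in the barn case, yet intuition denies knowledge.

from dataclasses import dataclass

@dataclass
class BeliefState:
    proposition: str
    believed: bool    # the subject believes that p
    true: bool        # p is in fact the case
    justified: bool   # the subject has good reasons for p

def is_knowledge_jtb(state: BeliefState) -> bool:
    """Traditional (JTB) analysis: knowledge = justified true belief."""
    return state.believed and state.true and state.justified

# Fake-barn case: the belief is justified (it looks exactly like a barn)
# and true (it happens to be the one real barn), but the success is a
# matter of epistemic luck rather than competence.
barn = BeliefState("that is a real barn", believed=True, true=True, justified=True)
print(is_knowledge_jtb(barn))  # True -- yet intuitively not knowledge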

The so-called traditional analysis says that knowledge is justified true belief. Edmund Gettier tried to show that some justified true beliefs do not amount to knowledge.

Following these thought experiments, philosophers proposed various alternative definitions of knowledge by modifying or expanding the traditional analysis. According to one view, the known fact has to cause the belief in the right way. Another theory states that the belief is the product of a reliable belief formation process. Further approaches require that the person would not have the belief if it were false, that the belief is not inferred from a falsehood, that the justification cannot be undermined, or that the belief is infallible. There is no consensus on which of the proposed modifications and reconceptualizations is correct. Some philosophers, such as Timothy Williamson, reject the basic assumption underlying the analysis of knowledge by arguing that propositional knowledge is a unique state that cannot be dissected into simpler components.

Value

The value of knowledge is the worth it holds by expanding understanding and guiding action. Knowledge can have instrumental value by helping a person achieve their goals. For example, knowledge of a disease helps a doctor cure their patient. The usefulness of a known fact depends on the circumstances. Knowledge of some facts may have little to no use, like memorizing random phone numbers from an outdated phone book. Being able to assess the value of knowledge matters in choosing what information to acquire and share. It affects decisions like which subjects to teach at school and how to allocate funds to research projects.

Epistemologists are particularly interested in whether knowledge is more valuable than a mere true opinion. Knowledge and true opinion often have a similar usefulness since both accurately represent reality. For example, if a person wants to go to Larissa, a true opinion about the directions can guide them as effectively as knowledge. Considering this problem, Plato proposed that knowledge is better because it is more stable. Another suggestion focuses on practical reasoning, arguing that people put more trust in knowledge than in mere true opinions when drawing conclusions and deciding what to do. A different response says that knowledge has intrinsic value in addition to instrumental value. This view asserts that knowledge is always valuable, whereas true opinion is only valuable in circumstances where it is useful.

Belief and truth

Beliefs are mental states about what is the case, like believing that snow is white or that God exists. In epistemology, they are often understood as subjective attitudes that affirm or deny a proposition, which can be expressed in a declarative sentence. For instance, to believe that snow is white is to affirm the proposition "snow is white". According to this view, beliefs are representations of what the universe is like. They are stored in memory and retrieved when actively thinking about reality or deciding how to act.

A different view understands beliefs as behavioral patterns or dispositions to act rather than as representational items stored in the mind. According to this perspective, to believe that there is mineral water in the fridge is nothing more than a group of dispositions related to mineral water and the fridge. Examples are the dispositions to answer questions about the presence of mineral water affirmatively and to go to the fridge when thirsty. Some theorists deny the existence of beliefs, saying that this concept borrowed from folk psychology oversimplifies much more complex psychological or neurological processes. Beliefs are central to various epistemological debates, which cover their status as a component of propositional knowledge, the question of whether people have control over and responsibility for their beliefs, and the issue of whether beliefs have degrees, called credences.

As propositional attitudes, beliefs are true or false depending on whether they affirm a true or a false proposition. According to the correspondence theory of truth, to be true means to stand in the right relation to the world by accurately describing what it is like. This means that truth is objective: a belief is true if it corresponds to a fact. The coherence theory of truth says that a belief is true if it belongs to a coherent system of beliefs. A result of this view is that truth is relative since it depends on other beliefs. Further theories of truth include pragmatist, semantic, pluralist, and deflationary theories. Truth plays a central role in epistemology as a goal of cognitive processes and an attribute of propositional knowledge.

Justification

In epistemology, justification is a property of beliefs that meet certain norms about what a person should believe. According to a common view, this means that the person has sufficient reasons for holding this belief because they have information that supports it. Another view states that a belief is justified if it is formed by a reliable belief formation process, such as perception. The terms reasonable, warranted, and supported are sometimes used as synonyms of the word justified. Justification distinguishes well-founded beliefs from superstition and lucky guesses. However, it does not guarantee truth. For example, a person with strong but misleading evidence may form a justified belief that is false.

Epistemologists often identify justification as a key component of knowledge. Usually, they are not only interested in whether a person has a sufficient reason to hold a belief, known as propositional justification, but also in whether the person holds the belief because of or based on this reason, known as doxastic justification. For example, if a person has sufficient reason to believe that a neighborhood is dangerous but forms this belief based on superstition, then they have propositional justification but lack doxastic justification.

Sources

Sources of justification are ways or cognitive capacities through which people acquire justification. Often-discussed sources include perception, introspection, memory, reason, and testimony, but there is no universal agreement on the extent to which they all provide valid justification. Perception relies on sensory organs to gain empirical information. Distinct forms of perception correspond to different physical stimuli, such as visual, auditory, haptic, olfactory, and gustatory perception. Perception is not merely the reception of sense impressions but an active process that selects, organizes, and interprets sensory signals. Introspection is a closely related process focused on internal mental states rather than external physical objects. For example, seeing a bus at a bus station belongs to perception while feeling tired belongs to introspection.

Rationalists understand reason as a source of justification for non-empirical facts, explaining how people can know about mathematical, logical, and conceptual truths. Reason is also responsible for inferential knowledge, in which one or more beliefs serve as premises to support another belief. Memory depends on information provided by other sources, which it retains and recalls, like remembering a phone number perceived earlier. Justification by testimony relies on information one person communicates to another person. This can happen by talking to each other but can also occur in other forms, like a letter, a newspaper, and a blog.

Other concepts

Rationality is closely related to justification and the terms rational belief and justified belief are sometimes used interchangeably. However, rationality has a wider scope that encompasses both a theoretical side, covering beliefs, and a practical side, covering decisions, intentions, and actions. There are different conceptions about what it means for something to be rational. According to one view, a mental state is rational if it is based on or responsive to good reasons. Another view emphasizes the role of coherence, stating that rationality requires that the different mental states of a person are consistent and support each other. A slightly different approach holds that rationality is about achieving certain goals. Two goals of theoretical rationality are accuracy and comprehensiveness, meaning that a person has as few false beliefs and as many true beliefs as possible.

Epistemologists rely on the concept of epistemic norms as criteria to assess the cognitive quality of beliefs, like their justification and rationality. They distinguish between deontic norms, which prescribe what people should believe, and axiological norms, which identify the goals and values of beliefs. Epistemic norms are closely linked to intellectual or epistemic virtues, which are character traits like open-mindedness and conscientiousness. Epistemic virtues help individuals form true beliefs and acquire knowledge. They contrast with epistemic vices and act as foundational concepts of virtue epistemology.

Epistemologists understand evidence for a belief as information that favors or supports it. They conceptualize evidence primarily in terms of mental states, such as sensory impressions or other known propositions. But in a wider sense, it can also include physical objects, like bloodstains examined by forensic analysts or financial records studied by investigative journalists. Evidence is often understood in terms of probability: evidence for a belief makes it more likely that the belief is true. A defeater is evidence against a belief or evidence that undermines another piece of evidence. For instance, witness testimony linking a suspect to a crime is evidence of their guilt, while an alibi is a defeater. Evidentialists analyze justification in terms of evidence by asserting that for a belief to be justified, it needs to rest on adequate evidence.
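On this probabilistic reading, a common formalization (one standard gloss among several, stated here in LaTeX notation) is that evidence raises the probability of the belief it supports, while a defeater lowers it again:

% E is evidence for hypothesis H when conditioning on E raises its probability:
P(H \mid E) > P(H)
% A defeater D undermines that support:
P(H \mid E \wedge D) < P(H \mid E)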

The presence of evidence usually affects doubt and certainty, which are subjective attitudes toward propositions that differ regarding their level of confidence. Doubt involves questioning the validity or truth of a proposition. Certainty, by contrast, is a strong affirmative conviction, indicating an absence of doubt about the proposition's truth. Doubt and certainty are central to ancient Greek skepticism and its goal of establishing that no belief is immune to doubt. They are also crucial in attempts to find a secure foundation of all knowledge, such as René Descartes' foundationalist epistemology.

While propositional knowledge is the main topic in epistemology, some theorists focus on understanding instead. Understanding is a more holistic notion that involves a wider grasp of a subject. To understand something, a person requires awareness of how different things are connected and why they are the way they are. For example, knowledge of isolated facts memorized from a textbook does not amount to understanding. According to one view, understanding is a unique epistemic good that, unlike propositional knowledge, is always intrinsically valuable. Wisdom is similar in this regard and is sometimes considered the highest epistemic good. It encompasses a reflective understanding with practical applications, helping people grasp and evaluate complex situations and lead a good life.

In epistemology, knowledge ascription is the act of attributing knowledge to someone, expressed in sentences like "Sarah knows that it will rain today". According to invariantism, knowledge ascriptions have fixed standards across different contexts. Contextualists, by contrast, argue that knowledge ascriptions are context-dependent. From this perspective, Sarah may know about the weather in the context of an everyday conversation even though she is not sufficiently informed to know it in the context of a rigorous meteorological debate. Contrastivism, another view, argues that knowledge ascriptions are comparative, meaning that to know something involves distinguishing it from relevant alternatives. For example, if a person spots a bird in the garden, they may know that it is a sparrow rather than an eagle, but they may not know that it is a sparrow rather than an indistinguishable sparrow hologram.

Major schools of thought

Skepticism and fallibilism

Philosophical skepticism questions the human ability to attain knowledge by challenging the foundations upon which knowledge claims rest. Some skeptics limit their criticism to specific domains of knowledge. For example, religious skeptics say that it is impossible to know about the existence of deities or the truth of other religious doctrines. Similarly, moral skeptics challenge the existence of moral knowledge and metaphysical skeptics say that humans cannot know ultimate reality. External world skepticism questions knowledge of external facts, whereas skepticism about other minds doubts knowledge of the mental states of others.

Global skepticism is the broadest form of skepticism, asserting that there is no knowledge in any domain. In ancient philosophy, this view was embraced by academic skeptics, whereas Pyrrhonian skeptics recommended the suspension of belief to attain tranquility. Few epistemologists have explicitly defended global skepticism. The influence of this position stems from attempts by other philosophers to show that their theory overcomes the challenge of skepticism. For example, René Descartes used methodological doubt to find facts that cannot be doubted.

One consideration in favor of global skepticism is the dream argument. It starts from the observation that, while people are dreaming, they are usually unaware of this. This inability to distinguish between dream and regular experience is used to argue that there is no certain knowledge since a person can never be sure that they are not dreaming. Some critics assert that global skepticism is self-refuting because denying the existence of knowledge is itself a knowledge claim. Another objection says that the abstract reasoning leading to skepticism is not convincing enough to overrule common sense.

Fallibilism is another response to skepticism. Fallibilists agree with skeptics that absolute certainty is impossible. They reject the assumption that knowledge requires absolute certainty, leading them to the conclusion that fallible knowledge exists. They emphasize the need to keep an open and inquisitive mind, acknowledging that doubt can never be fully excluded, even for well-established knowledge claims like thoroughly tested scientific theories.

Epistemic relativism is related to skepticism but differs in that it does not question the existence of knowledge in general. Instead, epistemic relativists only reject the notion of universal epistemic standards or absolute principles that apply equally to everyone. This means that what a person knows depends on subjective criteria or social conventions used to assess epistemic status.

Empiricism and rationalism

John Locke and David Hume shaped the philosophy of empiricism.

The debate between empiricism and rationalism centers on the origins of human knowledge. Empiricism emphasizes that sense experience is the primary source of all knowledge. Some empiricists illustrate this view by describing the mind as a blank slate that only develops ideas about the external world through the sense data received from the sensory organs. According to them, the mind can attain various additional insights by comparing impressions, combining them, generalizing to form more abstract ideas, and deducing new conclusions from them. Empiricists say that all these mental operations depend on sensory material and do not function on their own.

Even though rationalists usually accept sense experience as one source of knowledge, they argue that certain forms of knowledge are directly accessed through reason without sense experience, like knowledge of mathematical and logical truths. Some forms of rationalism state that the mind possesses inborn ideas, accessible without sensory assistance. Others assert that there is an additional cognitive faculty, sometimes called rational intuition, through which people acquire nonempirical knowledge. Some rationalists limit their discussion to the origin of concepts, saying that the mind relies on inborn categories to understand the world and organize experience.

Foundationalism and coherentism

Diagram of foundationalism, coherentism, and infinitism with arrows symbolizing support between beliefs. According to foundationalism, some basic beliefs are justified without support from other beliefs. According to coherentism, justification requires that beliefs mutually support each other. According to infinitism, justification requires that beliefs form infinite support chains.

Foundationalists and coherentists disagree about the structure of knowledge. Foundationalism distinguishes between basic and non-basic beliefs. A belief is basic if it is justified directly, meaning that its validity does not depend on the support of other beliefs. A belief is non-basic if it is justified by another belief. For example, the belief that it rained last night is a non-basic belief if it is inferred from the observation that the street is wet. According to foundationalism, basic beliefs are the foundation on which all other knowledge is built while non-basic beliefs act as the superstructure resting on this foundation.

Coherentists reject the distinction between basic and non-basic beliefs, saying that the justification of any belief depends on other beliefs. They assert that a belief must align with other beliefs to amount to knowledge. This occurs when beliefs are consistent and support each other. According to coherentism, justification is a holistic aspect determined by the whole system of beliefs, which resembles an interconnected web.

Foundherentism is an intermediary position combining elements of both foundationalism and coherentism. It accepts the distinction between basic and non-basic beliefs while asserting that the justification of non-basic beliefs depends on coherence with other beliefs.

Infinitism presents a less common alternative perspective on the structure of knowledge. It agrees with coherentism that there are no basic beliefs while rejecting the view that beliefs can support each other in a circular manner. Instead, it argues that beliefs form infinite justification chains, in which each link of the chain supports the belief following it and is supported by the belief preceding it.

Internalism and externalism

Alvin Goldman was an influential defender of externalism.

The disagreement between internalism and externalism is about the sources of justification. Internalists say that justification depends only on factors within the individual, such as perceptual experience, memories, and other beliefs. This view emphasizes the importance of the cognitive perspective of the individual in the form of their mental states. It is commonly associated with the idea that the relevant factors are accessible, meaning that the individual can become aware of their reasons for holding a justified belief through introspection and reflection.

Evidentialism is an influential internalist view, asserting that justification depends on the possession of evidence. In this context, evidence for a belief is any information in the individual's mind that supports the belief. For example, the perceptual experience of rain is evidence for the belief that it is raining. Evidentialists suggest various other forms of evidence, including memories, intuitions, and other beliefs. According to evidentialism, a belief is justified if the individual's evidence supports it and they hold the belief on the basis of this evidence.

Externalism, by contrast, asserts that at least some relevant factors of knowledge are external to the individual. For instance, when considering the belief that a cup of coffee stands on the table, externalists are not primarily interested in the subjective perceptual experience that led to this belief. Instead, they focus on objective factors, like the quality of the person's eyesight, their ability to differentiate coffee from other beverages, and the circumstances under which they observed the cup. A key motivation of many forms of externalism is that justification makes it more likely that a belief is true. Based on this view, justification is external to the extent that some factors contributing to this likelihood are not part of the believer's cognitive perspective.

Reliabilism is an externalist theory asserting that a reliable connection between belief and truth is required for justification. Some reliabilists explain this in terms of reliable processes. According to this view, a belief is justified if it is produced by a reliable process, like perception. A belief-formation process is deemed reliable if most of the beliefs it generates are true. An alternative view focuses on beliefs rather than belief-formation processes, saying that a belief is justified if it is a reliable indicator of the fact it presents. This means that the belief tracks the fact: the person believes it because it is true but would not believe it otherwise.
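As a toy illustration of the process-reliabilist idea (with invented numbers, not an established model), the reliability of a belief-forming process can be read off as its truth ratio:

import random

def noisy_perception(fact: bool, accuracy: float = 0.9) -> bool:
    # A toy belief-forming process that reports the fact with some accuracy.
    return fact if random.random() < accuracy else not fact

random.seed(0)
facts = [random.choice([True, False]) for _ in range(10_000)]
beliefs = [noisy_perception(f) for f in facts]

truth_ratio = sum(b == f for b, f in zip(beliefs, facts)) / len(facts)
print(f"truth ratio: {truth_ratio:.2f}")  # about 0.90: most beliefs are true
# On a simple reliabilist reading, the process confers justification
# because most of the beliefs it generates are true.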

Virtue epistemology, another type of externalism, asserts that a belief is justified if it manifests intellectual virtues. Intellectual virtues are capacities or traits that perform cognitive functions and help people form true beliefs. Suggested examples include faculties, like vision, memory, and introspection, and character traits, like open-mindedness.

Branches and approaches

Some branches of epistemology are characterized by their research methods. Formal epistemology employs formal tools from logic and mathematics to investigate the nature of knowledge. For example, Bayesian epistemology represents beliefs as degrees of certainty and uses probability theory to formally define norms of rationality governing how certain people should be. Experimental epistemologists base their research on empirical evidence about common knowledge practices. Applied epistemology focuses on the practical application of epistemological principles to diverse real-world problems, like the reliability of knowledge claims on the internet, how to assess sexual assault allegations, and how racism may lead to epistemic injustice. Metaepistemology studies the nature, goals, and research methods of epistemology. As a metatheory, it does not directly advocate for specific epistemological theories but examines their fundamental concepts and background assumptions.
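As an illustration of the Bayesian approach, the sketch below updates a degree of certainty by conditionalization; the scenario and numbers are invented for the example:

def bayes_update(prior: float, likelihood: float, evidence_prob: float) -> float:
    # Conditionalization: P(H | E) = P(E | H) * P(H) / P(E).
    return likelihood * prior / evidence_prob

# Hypothetical case: credence 0.3 that it will rain; dark clouds are
# observed, which are likely given rain and less likely otherwise.
prior_rain = 0.3
p_clouds_given_rain = 0.9
p_clouds_given_dry = 0.2
p_clouds = p_clouds_given_rain * prior_rain + p_clouds_given_dry * (1 - prior_rain)

posterior_rain = bayes_update(prior_rain, p_clouds_given_rain, p_clouds)
print(f"{posterior_rain:.2f}")  # about 0.66: the evidence raised the credence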

Particularism and generalism disagree about the right method of conducting epistemological research. Particularists start their inquiry by looking at specific cases. For example, to find a definition of knowledge, they rely on their intuitions about concrete instances of knowledge and particular thought experiments. They use these observations as methodological constraints that any theory of general principles needs to follow. Generalists proceed in the opposite direction. They prioritize general epistemic principles, saying that it is not possible to accurately identify and describe specific cases without a grasp of these principles. Other methods in contemporary epistemology aim to extract philosophical insights from ordinary language or look at the role of knowledge in making assertions and guiding actions.

Phenomenological epistemology emphasizes the importance of first-person experience. It distinguishes between the natural and the phenomenological attitudes. The natural attitude focuses on objects belonging to common sense and natural science. The phenomenological attitude focuses on the experience of objects and aims to provide a presuppositionless description of how objects appear to the observer.

Naturalized epistemology is closely associated with the natural sciences, relying on their methods and theories to examine knowledge. Arguing that epistemological theories should rest on empirical observation, it is critical of a priori reasoning. Evolutionary epistemology is a naturalistic approach that understands cognition as a product of evolution, examining knowledge and the cognitive faculties responsible for it through the lens of natural selection.

Social epistemology focuses on the social dimension of knowledge. While traditional epistemology is mainly interested in the knowledge possessed by individuals, social epistemology covers knowledge acquisition, transmission, and evaluation within groups, with specific emphasis on how people rely on each other when seeking knowledge.

Pragmatist epistemology is a form of fallibilism that emphasizes the close relation between knowing and acting. It sees the pursuit of knowledge as an ongoing process guided by common sense and experience while always open to revision. This approach reinterprets some core epistemological notions, for example, by conceptualizing beliefs as habits that shape actions rather than representations that mirror the world. Motivated by pragmatic considerations, epistemic conservatism is a view about belief revision. It prioritizes pre-existing beliefs, asserting that a person should only change their beliefs if they have a good reason to. One argument for epistemic conservatism rests on the recognition that the cognitive resources of humans are limited, making it impractical to constantly reexamine every belief.

The work of Elizabeth S. Anderson combines the perspectives of feminist, social, and naturalized epistemology.

Postmodern epistemology critiques the conditions of knowledge in advanced societies. This concerns in particular the metanarrative of a constant progress of scientific knowledge leading to a universal and foundational understanding of reality. Similarly, feminist epistemology adopts a critical perspective, focusing on the effect of gender on knowledge. Among other topics, it explores how preconceptions about gender influence who has access to knowledge, how knowledge is produced, and which types of knowledge are valued in society. Some postmodern and feminist thinkers adopt a constructivist approach, arguing that the way people view the world is not a simple reflection of external reality but a social construction. This view emphasizes the creative role of interpretation while undermining objectivity since social constructions can vary across societies. Another critical approach, found in decolonial scholarship, opposes the global influence of Western knowledge systems. It seeks to undermine Western hegemony and decolonize knowledge.

The decolonial outlook is also present in African epistemology. Grounded in African ontology, it emphasizes the interconnectedness of reality as a continuum between knowing subject and known object. It understands knowledge as a holistic phenomenon that includes sensory, emotional, intuitive, and rational aspects, extending beyond the limits of the physical domain.

Another epistemological tradition is found in ancient Indian philosophy. Its diverse schools of thought examine different sources of knowledge, called pramāṇa. Perception, inference, and testimony are sources discussed by most schools. Other sources only considered by some schools are non-perception, which leads to knowledge of absences, and presumption. Buddhist epistemology focuses on immediate experience, understood as the presentation of unique particulars without secondary cognitive processes, like thought and desire. Nyāya epistemology is a causal theory of knowledge, understanding sources of knowledge as reliable processes that cause episodes of truthful awareness. It sees perception as the primary source of knowledge and emphasizes its importance for successful action. Mīmāṃsā epistemology considers the holy scriptures known as the Vedas as a key source of knowledge, addressing the problem of their right interpretation. Jain epistemology states that reality is many-sided, meaning that no single viewpoint can capture the entirety of truth.

Historical epistemology examines how the understanding of knowledge and related concepts has changed over time. It asks whether the main issues in epistemology are perennial and to what extent past epistemological theories are relevant to contemporary debates. It is particularly concerned with scientific knowledge and practices associated with it. It contrasts with the history of epistemology, which presents, reconstructs, and evaluates epistemological theories of philosophers in the past.

Knowledge in particular domains

Some branches of epistemology focus on knowledge within specific academic disciplines. The epistemology of science examines how scientific knowledge is generated and what problems arise in the process of validating, justifying, and interpreting scientific claims. A key issue concerns the problem of how individual observations can support universal scientific laws. Other topics include the nature of scientific evidence and the aims of science. The epistemology of mathematics studies the origin of mathematical knowledge. In exploring how mathematical theories are justified, it investigates the role of proofs and whether there are empirical sources of mathematical knowledge.

Distinct areas of epistemology are dedicated to specific sources of knowledge. Examples are the epistemology of perception, the epistemology of memory, and the epistemology of testimony. In the epistemology of perception, direct and indirect realists debate the connection between the perceiver and the perceived object. Direct realists say that this connection is direct, meaning that there is no difference between the object present in perceptual experience and the physical object causing this experience. According to indirect realism, the connection is indirect, involving mental entities, like ideas or sense data, that mediate between the perceiver and the external world. The contrast between direct and indirect realism is important for explaining the nature of illusions.

Epistemological issues are found in most areas of philosophy. The epistemology of logic examines how people know that an argument is valid. For example, it explores how logicians justify that modus ponens is a correct rule of inference or that all contradictions are false. Epistemologists of metaphysics investigate whether knowledge of the basic structure of reality is possible and what sources this knowledge could have. Knowledge of moral statements, like the claim that lying is wrong, belongs to the epistemology of ethics. It studies the role of ethical intuitions, coherence among moral beliefs, and the problem of moral disagreement. The ethics of belief is a closely related field exploring the intersection of epistemology and ethics. It examines the norms governing belief formation and asks whether violating them is morally wrong. Religious epistemology studies the role of knowledge and justification for religious doctrines and practices. It evaluates the reliability of evidence from religious experience and holy scriptures while also asking whether the norms of reason should be applied to religious faith.

Epistemologists of language explore the nature of linguistic knowledge. One of their topics is the role of tacit knowledge, for example, when native speakers have mastered the rules of grammar but are unable to explicitly articulate them. Epistemologists of modality examine knowledge about what is possible and necessary. Epistemic problems that arise when two people have diverging opinions on a topic are covered by the epistemology of disagreement. Epistemologists of ignorance are interested in epistemic faults and gaps in knowledge.

Epistemology and psychology were not defined as distinct fields until the 19th century; earlier investigations about knowledge often do not fit neatly into today's academic categories. Both contemporary disciplines study beliefs and the mental processes responsible for their formation and change. One key contrast is that psychology describes what beliefs people have and how they acquire them, thereby explaining why someone has a specific belief. The focus of epistemology is on evaluating beliefs, leading to a judgment about whether a belief is justified and rational in a particular case. Epistemology also shares a close connection with cognitive science, which understands mental events as processes that transform information. Artificial intelligence relies on the insights of epistemology and cognitive science to implement concrete solutions to problems associated with knowledge representation and automatic reasoning.

Logic is the study of correct reasoning. For epistemology, it is relevant to inferential knowledge, which arises when a person reasons from one known fact to another. This is the case, for example, when inferring that it rained based on the observation that the streets are wet. Whether an inferential belief amounts to knowledge depends on the form of reasoning used, in particular, that the process does not violate the laws of logic. Another overlap between the two fields is found in the epistemic approach to fallacies. Fallacies are faulty arguments based on incorrect reasoning. The epistemic approach to fallacies explains why they are faulty, stating that arguments aim to expand knowledge. According to this view, an argument is a fallacy if it fails to do so. A further intersection is found in epistemic logic, which uses formal logical devices to study epistemological concepts like knowledge and belief.
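Two principles commonly discussed in epistemic logic give the flavor of this formal treatment; writing K_a φ for "agent a knows φ", in LaTeX notation:

% Distribution (axiom K): knowledge is closed under known implication.
K_a(\varphi \rightarrow \psi) \rightarrow (K_a \varphi \rightarrow K_a \psi)
% Factivity (axiom T): only true propositions can be known.
K_a \varphi \rightarrow \varphi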

Both decision theory and epistemology are interested in the foundations of rational thought and the role of beliefs. Unlike many approaches in epistemology, the main focus of decision theory lies less in the theoretical and more in the practical side, exploring how beliefs are translated into action. Decision theorists examine the reasoning involved in decision-making and the standards of good decisions, identifying beliefs as a central aspect of decision-making. One of their innovations is to distinguish between weaker and stronger beliefs, which helps them consider the effects of uncertainty on decisions.
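For example, decision theorists often model the influence of belief strength through expected utility. The sketch below uses invented utilities and credences purely for illustration:

def expected_utility(credence_rain: float) -> dict:
    # Hypothetical utilities: carrying an umbrella is mildly
    # inconvenient either way; getting soaked is much worse.
    utilities = {
        "take umbrella": {"rain": -1, "dry": -1},
        "no umbrella": {"rain": -4, "dry": 0},
    }
    return {
        act: credence_rain * u["rain"] + (1 - credence_rain) * u["dry"]
        for act, u in utilities.items()
    }

print(expected_utility(0.2))  # weak belief in rain: going without wins
print(expected_utility(0.8))  # strong belief in rain: the umbrella wins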

Epistemology and education have a shared interest in knowledge, with one difference being that education focuses on the transmission of knowledge, exploring the roles of both learner and teacher. Learning theory examines how people acquire knowledge. Behavioral learning theories explain the process in terms of behavior changes, for example, by associating a certain response with a particular stimulus. Cognitive learning theories study how the cognitive processes that affect knowledge acquisition transform information. Pedagogy looks at the transmission of knowledge from the teacher's perspective, exploring the teaching methods they may employ. In teacher-centered methods, the teacher serves as the main authority delivering knowledge and guiding the learning process. In student-centered methods, the teacher primarily supports and facilitates the learning process, allowing students to take a more active role. The beliefs students have about knowledge, called personal epistemology, influence their intellectual development and learning success.

The anthropology of knowledge examines how knowledge is acquired, stored, retrieved, and communicated. It studies the social and cultural circumstances that affect how knowledge is reproduced and changes, covering the role of institutions like university departments and scientific journals as well as face-to-face discussions and online communications. This field has a broad concept of knowledge, encompassing various forms of understanding and culture, including practical skills. Unlike epistemology, it is not interested in whether a belief is true or justified but in how understanding is reproduced in society. A closely related field, the sociology of knowledge, has a similar conception of knowledge. It explores how physical, demographic, economic, and sociocultural factors impact knowledge. This field examines in what sociohistorical contexts knowledge emerges and the effects it has on people, for example, how socioeconomic conditions are related to the dominant ideology in a society.

History

Early reflections on the nature and sources of knowledge are found in ancient history. In ancient Greek philosophy, Plato (427–347 BCE) studied what knowledge is, examining how it differs from true opinion by being based on good reasons. He proposed that learning is a form of recollection in which the soul remembers what it already knew but had forgotten. Plato's student Aristotle (384–322 BCE) was particularly interested in scientific knowledge, exploring the role of sensory experience and the process of making inferences from general principles. Aristotle's ideas influenced the Hellenistic schools of philosophy, which began to arise in the 4th century BCE and included Epicureanism, Stoicism, and skepticism. The Epicureans had an empiricist outlook, stating that sensations are always accurate and act as the supreme standard of judgments. The Stoics defended a similar position but confined their trust to lucid and specific sensations, which they regarded as true. The skeptics questioned that knowledge is possible, recommending instead suspension of judgment to attain a state of tranquility. Emerging in the 3rd century CE and inspired by Plato's philosophy, Neoplatonism distinguished knowledge from true belief, arguing that knowledge is infallible and limited to the realm of immaterial forms.

The Buddhist philosopher Dharmakirti developed a causal theory of knowledge.

The Upanishads, philosophical scriptures composed in ancient India between 700 and 300 BCE, examined how people acquire knowledge, including the role of introspection, comparison, and deduction. In the 6th century BCE, the school of Ajñana developed a radical skepticism questioning the possibility and usefulness of knowledge. By contrast, the school of Nyaya, which emerged in the 2nd century BCE, asserted that knowledge is possible. It provided a systematic treatment of how people acquire knowledge, distinguishing between valid and invalid sources. When Buddhist philosophers became interested in epistemology, they relied on concepts developed in Nyaya and other traditions. Buddhist philosopher Dharmakirti (6th or 7th century CE) analyzed the process of knowing as a series of causally related events.

Ancient Chinese philosophers understood knowledge as an interconnected phenomenon fundamentally linked to ethical behavior and social involvement. Many saw wisdom as the goal of attaining knowledge. Mozi (470–391 BCE) proposed a pragmatic approach to knowledge using historical records, sensory evidence, and practical outcomes to validate beliefs. Mencius (c. 372–289 BCE) explored analogical reasoning as a source of knowledge and employed this method to criticize Mozi. Xunzi (c. 310–220 BCE) aimed to combine empirical observation and rational inquiry. He emphasized the importance of clarity and standards of reasoning without excluding the role of feeling and emotion.

The relation between reason and faith was a central topic in the medieval period. In Arabic–Persian philosophy, al-Farabi (c. 870–950) and Averroes (1126–1198) discussed how philosophy and theology interact, debating which one is a better vehicle to truth. Al-Ghazali (c. 1056–1111) criticized many core teachings of previous Islamic philosophers, saying that they relied on unproven assumptions that did not amount to knowledge. Similarly in Western philosophy, Anselm of Canterbury (1033–1109) proposed that theological teaching and philosophical inquiry are in harmony and complement each other. Formulating a more critical approach, Peter Abelard (1079–1142) argued against unquestioned theological authorities and said that all things are open to rational doubt. Influenced by Aristotle, Thomas Aquinas (1225–1274) developed an empiricist theory, stating that "nothing is in the intellect unless it first appeared in the senses". According to an early form of direct realism proposed by William of Ockham (c. 1285–1349), perception of mind-independent objects happens directly without intermediaries. Meanwhile, in 14th-century India, Gaṅgeśa developed a reliabilist theory of knowledge and considered the problems of testimony and fallacies. In China, Wang Yangming (1472–1529) explored the unity of knowledge and action, holding that moral knowledge is inborn and can be attained by overcoming self-interest.

René Descartes used methodological doubt to seek certain foundations for philosophy.

The course of modern philosophy was shaped by René Descartes (1596–1650), who stated that philosophy must begin from a position of indubitable knowledge of first principles. Inspired by skepticism, he aimed to find absolutely certain knowledge by encountering truths that cannot be doubted. He thought that this is the case for the assertion "I think, therefore I am", from which he constructed the rest of his philosophical system. Descartes, together with Baruch Spinoza (1632–1677) and Gottfried Wilhelm Leibniz (1646–1716), belonged to the school of rationalism, which asserts that the mind possesses innate ideas independent of experience. John Locke (1632–1704) rejected this view in favor of an empiricism according to which the mind is a blank slate. This means that all ideas depend on experience, either as "ideas of sense", which are directly presented through the senses, or as "ideas of reflection", which the mind creates by reflecting on its own activities. David Hume (1711–1776) used this idea to explore the limits of what people can know. He said that knowledge of facts is never certain, adding that knowledge of relations between ideas, like mathematical truths, can be certain but contains no information about the world. Immanuel Kant (1724–1804) sought a middle ground between rationalism and empiricism by identifying a type of knowledge overlooked by Hume. For Kant, this knowledge pertains to principles that underlie and structure all experience, such as spatial and temporal relations and fundamental categories of understanding.

In the 19th century and influenced by Kant's philosophy, Georg Wilhelm Friedrich Hegel (1770–1831) rejected empiricism by arguing that sensory impressions alone cannot amount to knowledge since all knowledge is actively structured by the knowing subject. John Stuart Mill (1806–1873), by contrast, defended a wide-sweeping form of empiricism and explained knowledge of general truths through inductive reasoning. Charles Peirce (1839–1914) thought that all knowledge is fallible, emphasizing that knowledge seekers should remain open to revising their beliefs in light of new evidence. He used this idea to argue against Cartesian foundationalism, which seeks absolutely certain truths.

In the 20th century, fallibilism was further explored by J. L. Austin (1911–1960) and Karl Popper (1902–1994). In continental philosophy, Edmund Husserl (1859–1938) applied the skeptical idea of suspending judgment to the study of experience. By not judging whether an experience is accurate, he tried to describe its internal structure instead. Influenced by earlier empiricists, logical positivists, like A. J. Ayer (1910–1989), said that all knowledge is either empirical or analytic, rejecting any form of metaphysical knowledge. Bertrand Russell (1872–1970) developed an empiricist sense-datum theory, distinguishing between direct knowledge by acquaintance of sense data and indirect knowledge by description, which is inferred from knowledge by acquaintance. Common sense had a central place in G. E. Moore's (1873–1958) epistemology. He used trivial observations, like the fact that he has two hands, to argue against abstract philosophical theories that deviate from common sense. Ordinary language philosophy, as practiced by the late Ludwig Wittgenstein (1889–1951), is a similar approach that tries to extract epistemological insights from how ordinary language is used.

Edmund Gettier (1927–2021) conceived counterexamples against the idea that knowledge is justified true belief. These counterexamples prompted many philosophers to suggest alternative definitions of knowledge. Developed by philosophers such as Alvin Goldman (1938–2024), reliabilism emerged as one of the alternatives, asserting that knowledge requires reliable sources and shifting the focus away from justification. Virtue epistemologists, such as Ernest Sosa (born 1940) and Linda Zagzebski (born 1946), analyze belief formation in terms of the intellectual virtues or cognitive competencies involved in the process. Naturalized epistemology, as conceived by Willard Van Orman Quine (1908–2000), employs concepts and ideas from the natural sciences to formulate its theories. Other developments in late 20th-century epistemology were the emergence of social, feminist, and historical epistemology.

Best of all possible worlds

From Wikipedia, the free encyclopedia

The phrase "the best of all possible worlds" (French: Le meilleur des mondes possibles; German: Die beste aller möglichen Welten) was coined by the German polymath and Enlightenment philosopher Gottfried Leibniz in his 1710 work Essais de Théodicée sur la bonté de Dieu, la liberté de l'homme et l'origine du mal (Essays of Theodicy on the Goodness of God, the Freedom of Man and the Origin of Evil),[1] more commonly known simply as the Theodicy. The claim that the actual world is the best of all possible worlds is the central argument in Leibniz's theodicy, or his attempt to solve the problem of evil.[1]

Leibniz

In Leibniz's works, the argument about the best of all possible worlds appears in the context of his theodicy, a word that he coined by combining the Greek words Theos, 'God', and dikē, 'justice'.[2] Its object was to solve the problem of evil, that is, to reconcile the existence of evil and suffering in the world with the existence of a perfectly good, all-powerful and all-knowing God, who would seem required to prevent it. The name reflects Leibniz's conception of the project as a vindication of God's justice against the charges of injustice brought against him by such evils. Proving that this is the best of all possible worlds would dispel such charges by showing that, however it may intuitively appear to us from our limited point of view, any other world – for instance, one without the evils that trouble our lives – would in fact have been worse than the current one, all things considered.

Leibniz's argument for this conclusion may be gathered from the paragraphs 53–55 of his Monadology, which run as follows:

53. Now as there are an infinity of possible universes in the ideas of God, and but one of them can exist, there must be a sufficient reason for the choice of God which determines him to select one rather than another.

54. And this reason is to be found only in the fitness or in the degree of perfection which these worlds possess, each possible thing having the right to claim existence in proportion to the perfection which it involves.

55. This is the cause for the existence of the greatest good; namely, that the wisdom of God permits him to know it, his goodness causes him to choose it, and his power enables him to produce it.

Since this is a very compact exposition, the remainder of this section will explain the argument at greater length. While the text refers to "possible universes", this article will often adopt the more common term "possible worlds", which refers to the same thing and is explained next. As Leibniz said in the Theodicy, this term should not be misunderstood as referring only to a single planet or reality, since it refers to the sum of everything that exists:

I call 'World' the whole succession and the whole agglomeration of all existent things, lest it be said that several worlds could have existed in different times and different places.

Possible worlds

Possible worlds, according to Leibniz's theory, are combinations of beings which are possible together, that is, compossible.

A being is possible, for Leibniz, when it is logically possible, i.e., when its definition involves no contradiction.[7] For example, a married bachelor is impossible because a "bachelor" is, by definition, an unmarried man, which contradicts "married". But a unicorn, if defined as a horse with a horn, contains no contradiction, so that such a being is possible, even if none exist in the actual world.

Beings are possible together, in turn, when they do not enter into contradiction with each other. For instance, it is logically possible that a meteor might have fallen from the sky onto Wikipedia founder Jimmy Wales's head soon after he was born, killing him. But it is not logically possible that what happens in a given world (e.g. that Jimmy Wales founded Wikipedia) also does not happen in the same world (i.e. that Jimmy Wales did not found Wikipedia). While both of these events are logically possible in themselves, they are not logically possible together, or compossible – so, they cannot form part of the same possible world.
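Leibniz's two notions can be rendered compactly in modern logical notation; the following is an interpretive sketch rather than Leibniz's own formalism, with $D_a$ standing for the definition of a being $a$ and $\bot$ for contradiction:

$$\text{possible}(a) \iff D_a \nvdash \bot \qquad \text{compossible}(a,b) \iff D_a \land D_b \nvdash \bot$$

On this reading, a possible world is a maximal collection of compossible beings: the two Jimmy Wales scenarios above are each individually consistent, but their conjunction entails a contradiction, so they cannot belong to the same world.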

Leibniz claims in §53, then, that there are infinitely many of these possible worlds, or combinations of compossible beings, in the ideas of God. These are the worlds which God could possibly bring into existence, since not even God, according to Leibniz, could create a world which contains a contradiction.

Sufficient reason

Although God cannot create a self-contradictory world, he is all-powerful and all-knowing, as emphasized in §55. He cannot be prevented from creating a world by not knowing about it, or by lacking the power to make it. Given these assumptions, it might seem that God could create just any one of the worlds. And since there are infinitely many possible worlds, it might seem that, just as there is no greatest among the infinitely many numbers, there is no best of the possible worlds.

Leibniz rejects these possibilities by appealing to the Principle of Sufficient Reason (PSR), a central principle of his philosophical system. This principle, which he was the first to name, was once described by him as the principle "that nothing happens without a reason"; in the Monadology, which is the work at hand, he described it as follows:

31. Our reasoning is based upon two great principles: first, that of contradiction, by means of which we decide that to be false which involves contradiction and that to be true which contradicts or is opposed to the false. 32. And second, the principle of sufficient reason, in virtue of which we believe that no fact can be real or existing and no statement true unless it has a sufficient reason why it should be thus and not otherwise. Most frequently, however, these reasons cannot be known by us.

Since Leibniz adopted this principle, he could not admit that God chose to create this world rather than another – that God's choice was "thus and not otherwise" – for no reason, or "arbitrarily".

Leibniz then claims that the only possible reason for the choice between these possible worlds is "the fitness or the degree of perfection" which they possess – i.e., the quality that makes some worlds better than others, so that the world with the greatest "fitness" or "perfection" is the best one. As the philosophers Michael Murray and Sean Greenberg interpret it, this claim may be understood through the consideration that basing the choice on any other quality of the worlds would have been arbitrary, contrary to the PSR.

Leibniz claims that God's choice is caused not only by its being the most reasonable, but also by God's perfect goodness, a traditional claim about God which Leibniz accepted. As Leibniz says in §55, God's goodness causes him to produce the best world. Hence, the best possible world, or "greatest good" as Leibniz called it in this work, must be the one that exists.

Evil in the best world

Leibniz, following a long metaphysical tradition that goes back at least to Augustine, conceived of the perfection of the universe as its "metaphysical goodness", which is identical with "being", or "reality". The best world is the one with the greatest "degree of reality", the greatest "quantity of essence", the greatest "perfection" and "intelligibility". According to this tradition, "evil, though real, is not a 'thing', but rather a direction away from the goodness of the One"; evil is the absence of good, and accordingly, it is technically wrong to say that God created evil, properly speaking. Rather, he created a world which was imperfectly good.

According to the privation theory of evil, all examples of evils are analysed as consisting in the absence of some good that ought to be there, or is natural to a thing – for instance, disease is the absence of health, blindness is the absence of sight, and vice is the absence of virtue. Evil may be said to exist in the same way the hole of a donut exists: the donut was created, but the hole itself was not made, it was just never filled in – it is an absence. And just as the hole could not exist without the donut, evil is parasitic upon good, since it is the corruption of a good nature. "God is infinite, and the devil is limited; the good may and does go to infinity, while evil has its bounds."

Leibniz did, nevertheless, concede that God has created a world with evil in it, and could have created a world without it. He claimed, however, that the existence of evil does not necessarily mean a worse world, so that this is still the best world that God could have made. In fact, Leibniz claimed that the presence of evil may make for a better world, insofar as "it may happen that the evil is accompanied by a greater good" – as he said, "an imperfection in the part may be required for a perfection in the whole".

In light of the conceptual tools that have already been explained, this claim may be phrased as stating that there are goods in the universe which would not be compossible with the prevention of certain evils. This claim, which may seem counterintuitive, was elucidated by Leibniz in various ways. For instance, in the Theodicy, he used certain analogies to emphasize how the contrast provided by evil may increase the good, and make it more discernible:

Use has ever been made of comparisons taken from the pleasures of the senses when these are mingled with that which borders on pain, to prove that there is something of like nature in intellectual pleasures. A little acid, sharpness or bitterness is often more pleasing than sugar; shadows enhance colours; and even a dissonance in the right place gives relief to harmony. We wish to be terrified by rope-dancers on the point of falling and we wish that tragedies shall well-nigh cause us to weep. Do men relish health enough, or thank God enough for it, without having ever been sick? And is it not most often necessary that a little evil render the good more discernible, that is to say, greater?

In other works, Leibniz also used his broader theory that there are no "purely extrinsic denominations" – everything that may be said about something is essential to it. So, according to Leibniz, it is technically wrong to say that "I would be better off" in another possible world: each individual is world-bound, so that, if God had not actualized this specific world, I would not exist at all. And even if, due to my great personal suffering, I should think that it would be better for me to not exist, it would nevertheless be worse for the rest of the universe, since this world is the best possible world, as was proved.[2]

Uses outside of theodicy

Leibniz also applied his theory of the best of all possible worlds to solve the problem of induction. Out of all possible worlds, God has chosen "the most perfect, that is to say, the one which is at the same time the simplest in hypotheses and the richest in phenomena". (Discourse on Metaphysics, §6) This justifies human beings in choosing to believe, out of the available theories, those which are simplest and have the most explanatory power.

Before Leibniz

The philosopher Calvin Normore has claimed that, according to the Stoics, this is the best of all possible worlds, and that this opinion was shared by Peter Abelard.

Avicenna argued that divine providence ensures that this is the best of all possible worlds.

Thomas Aquinas, in article 6 of question 25 of the first part of his Summa Theologiae, had affirmed that God can always make better what he has made, but only by making more things: "the universe, the present creation being supposed, cannot be better."

After Leibniz

18th century

Following the devastating Lisbon Earthquake (1 November 1755), which occurred decades after the publication of the Theodicy (1710), Leibniz's philosophical optimism and theodicy incurred considerable criticism both from his fellow Enlightenment philosophers and from Christian theologians. Critics of Leibniz argue that the world contains an amount of suffering too great to permit belief in philosophical optimism.

The claim that we live in the best of all possible worlds drew scorn most notably from Voltaire, who lampooned it in his comic novella Candide by having the character Dr. Pangloss (a parody of Leibniz and Maupertuis) repeat it like a mantra when great catastrophes keep happening to him and the titular protagonist. Derived from this character, the adjective "Panglossian" describes a person who believes that the actual world is the best possible one, or is otherwise excessively optimistic.

19th century

The physiologist Emil du Bois-Reymond, in his "Leibnizian Thoughts in Modern Science" (1870), wrote that Leibniz thought of God as a mathematician:

As is well known, the theory of the maxima and minima of functions was indebted to him for the greatest progress through the discovery of the method of tangents. Well, he conceives God in the creation of the world like a mathematician who is solving a minimum problem, or rather, in our modern phraseology, a problem in the calculus of variations – the question being to determine among an infinite number of possible worlds, that for which the sum of necessary evil is a minimum.
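Du Bois-Reymond's "minimum problem" can be stated in modern notation; the following is an illustrative gloss, with $\mathcal{W}$ denoting the set of possible worlds and $E(W)$ the sum of necessary evil in a world $W$ (both symbols introduced here for exposition):

$$W^{*} = \arg\min_{W \in \mathcal{W}} E(W)$$

On this picture, God actualizes the world $W^{*}$ that minimizes necessary evil, just as a variational problem selects the function minimizing a given quantity.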

Du Bois-Reymond believed that Charles Darwin supported a version of Leibniz's perfect world, since every organism can be understood as relatively adapted to its environment at any point in its evolution.

Arthur Schopenhauer argued, contrary to Leibniz, that our world must be the worst of all possible worlds, because if it were only a little worse, it could not continue to exist.

20th century

The Theodicy was deemed illogical by the philosopher Bertrand Russell. Russell argues that moral and physical evil must result from metaphysical evil (imperfection). But imperfection is merely limitation; if existence is good, as Leibniz maintains, then the mere existence of evil requires that evil also be good. In addition, libertarian Christian theology (not related to political libertarianism) defines sin as not necessary but contingent, the result of free will. Russell maintains that Leibniz failed to show logically that metaphysical necessity (divine will) and human free will are compatible. He also claims that when Leibniz analyzes the propositions, he is "ambiguous or doubtful" (O'Briant); that is, Leibniz sounds unsure of his premises, and they do not work together convincingly.

21st century

The philosopher Alvin Plantinga criticized Leibniz's theodicy by arguing that there probably is not such a thing as the best of all possible worlds, since one can always conceive a better world, such as a world with one more morally righteous person.

The philosopher William C. Lane defended Leibniz from Plantinga's criticism and also claimed that Leibniz's theory has pandeistic consequences:

If divine becoming were complete, God's kenosis – God's self-emptying for the sake of love – would be total. In this pandeistic view, nothing of God would remain separate and apart from what God would become. Any separate divine existence would be inconsistent with God's unreserved participation in the lives and fortunes of the actualized phenomena.

Leibniz's theodicy has been defended by Justin Daeley, who argues that God must create the best, and James Franklin, who argues that goods and evils in creation are interconnected with mathematical necessity and hence cannot be separated by divine power.

Is–ought problem

From Wikipedia, the free encyclopedia
David Hume raised the is–ought problem in his Treatise of Human Nature.

The is–ought problem, as articulated by the Scottish philosopher and historian David Hume, arises when one makes claims about what ought to be that are based solely on statements about what is. Hume found that there seems to be a significant difference between descriptive statements (about what is) and prescriptive statements (about what ought to be), and that it is not obvious how one can coherently transition from descriptive statements to prescriptive ones.

Hume's law or Hume's guillotine is the thesis that an ethical or judgmental conclusion cannot be inferred from purely descriptive factual statements.

A similar view is defended by G. E. Moore's open-question argument, which is intended to refute any identification of moral properties with natural properties. This identification is asserted by ethical naturalists, who do not deem the naturalistic fallacy a fallacy.

The is–ought problem is closely related to the fact–value distinction in epistemology. Though the terms are often used interchangeably, academic discourse concerning the latter may encompass aesthetics in addition to ethics.

Overview

Hume discusses the problem in book III, part I, section I of his book, A Treatise of Human Nature (1739):

In every system of morality, which I have hitherto met with, I have always remarked, that the author proceeds for some time in the ordinary way of reasoning, and establishes the being of a God, or makes observations concerning human affairs; when of a sudden I am surprised to find, that instead of the usual copulations of propositions, is, and is not, I meet with no proposition that is not connected with an ought, or an ought not. This change is imperceptible; but is, however, of the last consequence. For as this ought, or ought not, expresses some new relation or affirmation, it's necessary that it should be observed and explained; and at the same time that a reason should be given, for what seems altogether inconceivable, how this new relation can be a deduction from others, which are entirely different from it. But as authors do not commonly use this precaution, I shall presume to recommend it to the readers; and am persuaded, that this small attention would subvert all the vulgar systems of morality, and let us see, that the distinction of vice and virtue is not founded merely on the relations of objects, nor is perceived by reason.

Hume calls for caution against such inferences in the absence of any explanation of how the ought-statements follow from the is-statements. But how exactly can an "ought" be derived from an "is"? The question, prompted by Hume's small paragraph, has become one of the central questions of ethical theory, and Hume is usually assigned the position that such a derivation is impossible.

In modern times, "Hume's law" often denotes the informal thesis that, if a reasoner only has access to non-moral factual premises, the reasoner cannot logically infer the truth of moral statements; or, more broadly, that one cannot infer evaluative statements (including aesthetic statements) from non-evaluative statements. An alternative definition of Hume's law is that "If P implies Q, and Q is moral, then P is moral". This interpretation-driven definition avoids a loophole with the principle of explosion. Other versions state that the is–ought gap can technically be formally bridged without a moral premise, but only in ways that are formally "vacuous" or "irrelevant", and that provide no "guidance". For example, one can infer from "The Sun is yellow" that "Either the Sun is yellow, or it is wrong to murder". But this provides no relevant moral guidance; absent a contradiction, one cannot deductively infer that "it is wrong to murder" solely from non-moral premises alone, adherents argue.

Implications

The apparent gap between "is" statements and "ought" statements, when combined with Hume's fork, renders "ought" statements of dubious validity. Hume's fork is the idea that all items of knowledge are based either on logic and definitions, or else on observation. If the is–ought problem holds, then "ought" statements do not seem to be known in either of these two ways, and it would seem that there can be no moral knowledge. Moral skepticism and non-cognitivism work with such conclusions.

Responses

Oughts and goals

Ethical naturalists contend that moral truths exist, and that their truth value relates to facts about physical reality. Many modern naturalistic philosophers see no impenetrable barrier in deriving "ought" from "is", believing it can be done whenever we analyze goal-directed behavior. They suggest that a statement of the form "In order for agent A to achieve goal B, A reasonably ought to do C" exhibits no category error and may be factually verified or refuted. "Oughts" exist, then, in light of the existence of goals. A counterargument is that this merely pushes the "ought" back to the subjectively valued "goal", and thus provides no objective basis for one's goals and, consequently, no basis for distinguishing the moral value of fundamentally different goals. A dialectical naturalist response to this objection is that, although individual goals have a degree of subjectivity, the process through which the existence of goals became possible is not subjective: the advent of organisms capable of subjectivity occurred through the objective process of evolution. This dialectical approach goes further, stating that subjectivity should be conceptualized as objectivity at its highest point, being the result of an unfolding developmental process.

This is similar to work done by the moral philosopher Alasdair MacIntyre, who attempts to show that, because ethical language developed in the West in the context of a belief in a human telos (an end or goal), our inherited moral language, including terms such as good and bad, has functioned, and still functions, to evaluate the way in which certain behaviors facilitate the achievement of that telos. In an evaluative capacity, therefore, good and bad carry moral weight without committing a category error. For instance, a pair of scissors that cannot easily cut through paper can legitimately be called bad, since it cannot fulfill its purpose effectively. Likewise, if a person is understood as having a particular purpose, then behaviour can be evaluated as good or bad in reference to that purpose. In plainer words, a person acts well when they fulfill their purpose.

Even if the concept of an "ought" is meaningful, this need not involve morality, because some goals may be morally neutral or (if they exist) contrary to what is moral. A poisoner might realize his victim has not died and say, for example, "I ought to have used more poison", since his goal is to murder. The next challenge for the moral realist is thus to explain what is meant by a "moral ought".

Discourse ethics

Proponents of discourse ethics argue that the very act of discourse implies certain "oughts", that is, certain presuppositions that are necessarily accepted by the participants in discourse, and can be used to further derive prescriptive statements. They therefore argue that it is incoherent to argumentatively advance an ethical position on the basis of the is–ought problem, which contradicts these implied assumptions.

Moral oughts

As MacIntyre explained, someone may be called a good person if people have an inherent purpose. Many ethical systems appeal to such a purpose. This is true of some forms of moral realism, which hold that something can be wrong even if every thinking person believes otherwise (the idea of a brute fact about morality). The ethical realist might suggest that humans were created for a purpose (e.g. to serve God), especially if they are an ethical non-naturalist. If the ethical realist is instead an ethical naturalist, they may start from the fact that humans have evolved and pursue some sort of evolutionary ethics (which risks "committing" the moralistic fallacy). Not all moral systems appeal to a human telos or purpose, since it is not obvious that people have any natural purpose, or what that purpose would be. Although many scientists recognize teleonomy (a tendency in nature), few philosophers appeal to it, this time to avoid the naturalistic fallacy.

Goal-dependent oughts run into problems even without an appeal to an innate human purpose. Consider cases where one has no desire to be good, whatever being good amounts to. If, for instance, a person wants to be good, and being good means washing one's hands, then it seems that person morally ought to wash their hands. The bigger problem for moral philosophy is what happens when someone does not want to be good at all. Put simply, in what sense ought we to hold the goal of being good? It seems one can ask: "How am I rationally required to hold 'good' as a value, or to pursue it?"

The issue mentioned above results from an important ethical relativist critique. Even if "oughts" depend on goals, the ought seems to vary with the person's goal. This is the conclusion of the ethical subjectivist, who says a person can only be called good according to whether they fulfill their own, self-assigned goal. Alasdair MacIntyre himself suggests that a person's purpose comes from their culture, making him a sort of ethical relativist. Ethical relativists acknowledge local, institutional facts about what is right, but these are facts that can still vary by society. Thus, without an objective "moral goal", a moral ought is difficult to establish. G. E. M. Anscombe was particularly critical of the word "ought" for this reason: understood as "We need such-and-such, and will only get it this way", it fails, for somebody may need something immoral, or else find that their noble need requires immoral action. Anscombe went so far as to suggest that "the concepts of obligation, and duty—moral obligation and moral duty, that is to say—and of what is morally right and wrong, and of the moral sense of 'ought,' ought to be jettisoned if this is psychologically possible".

If moral goals depend on private assumptions or public agreement, so may morality as a whole. For example, Canada might call it good to maximize global welfare, while a citizen, Alice, calls it good to focus on herself, then her family, and finally her friends (with little empathy for strangers). It does not seem that Alice can be objectively or rationally bound, without regard to her personal values or those of other people, to act a certain way. In other words, we may not be able to say "You just should do this". Moreover, persuading her to help strangers would necessarily mean appealing to values she already possesses (or else we would have no hope of persuading her). This is another interest of normative ethics: questions of binding forces.

There may be responses to the above relativistic critiques. As mentioned above, non-naturalist ethical realists can appeal to God's purpose for humankind. Naturalistic thinkers, on the other hand, may posit that valuing people's well-being is somehow "obviously" the purpose of ethics, or else the only purpose worth talking about. This is the move made by natural law theorists, scientific moralists, and some utilitarians.

Institutional facts

John Searle also attempts to derive "ought" from "is". He tries to show that the act of making a promise places one under an obligation by definition, and that such an obligation amounts to an "ought". This view is still widely debated, and to answer criticisms, Searle has further developed the concept of institutional facts, for example, that a certain building is in fact a bank and that certain paper is in fact money, which would seem to depend upon general recognition of those institutions and their value.
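Searle's derivation, paraphrased from his 1964 paper "How to Derive 'Ought' from 'Is'", runs roughly as follows:

1. Jones uttered the words "I hereby promise to pay you, Smith, five dollars."
2. Jones promised to pay Smith five dollars.
3. Jones placed himself under an obligation to pay Smith five dollars.
4. Jones is under an obligation to pay Smith five dollars.
5. Therefore, Jones ought to pay Smith five dollars.

Each step is meant to follow from the previous one together with facts about the institution of promising; critics reply that this institutional background itself smuggles in the evaluative premise.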

Indefinables

Indefinables are concepts so global that they cannot be defined; rather, in a sense, they themselves, and the objects to which they refer, define our reality and our ideas. Their meanings cannot be stated in a true definition, but they can be referred to by placing them, with their incomplete definitions, in self-evident statements whose truth can be tested by whether it is impossible to think the opposite without contradiction. Thus, the truth of indefinable concepts, and of propositions using them, is entirely a matter of logic.

An example of the above is that of the concepts "finite parts" and "wholes"; they cannot be defined without reference to each other and thus with some amount of circularity, but we can make the self-evident statement that "the whole is greater than any of its parts", and thus establish a meaning particular to the two concepts.

These two notions being granted, it can be said that statements of "ought" are measured by their prescriptive truth, just as statements of "is" are measured by their descriptive truth; and the descriptive truth of an "is" judgment is defined by its correspondence to reality (actual or in the mind), while the prescriptive truth of an "ought" judgment is defined according to a more limited scope—its correspondence to right desire (conceivable in the mind and able to be found in the rational appetite, but not in the more "actual" reality of things independent of the mind or rational appetite).

To some, this may immediately suggest the question: "How can we know what is a right desire if it is already admitted that it is not based on the more actual reality of things independent of the mind?" The beginning of the answer is found when we consider that the concepts "good", "bad", "right" and "wrong" are indefinables. Thus, right desire cannot be defined properly, but a way to refer to its meaning may be found through a self-evident prescriptive truth.

That self-evident truth which the moral cognitivist claims to exist, and upon which all other prescriptive truths are ultimately based, is: one ought to desire what is really good for one, and nothing else. The terms "real good" and "right desire" cannot be defined apart from each other, so their definitions contain some degree of circularity, but the stated self-evident truth indicates a meaning particular to the ideas sought to be understood, and it is (the moral cognitivist might claim) impossible to think the opposite without a contradiction. Combined with descriptive truths about what is good (particular goods considered in terms of whether they suit a particular end, and whether the possession of such goods is compatible with the general end of possessing the totality of real goods throughout a whole life), this yields a valid body of knowledge of right desire.

Functionalist counterexamples

Several counterexamples have been offered by philosophers claiming to show that there are cases in which an "ought" logically follows from an "is". First, Hilary Putnam, tracing the quarrel back to Hume's dictum, objects that fact and value are entangled, since drawing the distinction between them itself involves a value. A. N. Prior points out that from the statement "He is a sea captain", it logically follows that "He ought to do what a sea captain ought to do". Alasdair MacIntyre points out that from the statement "This watch is grossly inaccurate and irregular in time-keeping and too heavy to carry about comfortably", the evaluative conclusion validly follows, "This is a bad watch". John Searle points out that from the statement "Jones promised to pay Smith five dollars", it logically follows that "Jones ought to pay Smith five dollars". The act of promising by definition places the promiser under obligation.

Moral realism

Philippa Foot adopts a moral realist position, criticizing the idea that when evaluation is superposed on fact there has been a "committal in a new dimension." She introduces, by analogy, the practical implications of using the word "injury." Not just anything counts as an injury. There must be some impairment. If one supposes a man wants the things the injury prevents him from obtaining, has one fallen into the old naturalist fallacy? She states the following:

It may seem that the only way to make a necessary connection between "injury" and the things that are to be avoided, is to say that it is only used in an "action-guiding sense" when applied to something the speaker intends to avoid. But we should look carefully at the crucial move in that argument, and query the suggestion that someone might happen not to want anything for which he would need the use of hands or eyes. Hands and eyes, like ears and legs, play a part in so many operations that a man could only be said not to need them if he had no wants at all.

Foot argues that the virtues, like hands and eyes in the analogy, play so large a part in so many operations that it is implausible to suppose that a committal in a non-naturalist dimension is necessary to demonstrate their goodness.

Philosophers who have supposed that actual action was required if "good" were to be used in a sincere evaluation have got into difficulties over weakness of will, and they should surely agree that enough has been done if we can show that any man has reason to aim at virtue and avoid vice. But is this impossibly difficult if we consider the kinds of things that count as virtue and vice? Consider, for instance, the cardinal virtues, prudence, temperance, courage and justice. Obviously any man needs prudence, but does he not also need to resist the temptation of pleasure when there is harm involved? And how could it be argued that he would never need to face what was fearful for the sake of some good? It is not obvious what someone would mean if he said that temperance or courage were not good qualities, and this not because of the "praising" sense of these words, but because of the things that courage and temperance are.

Misunderstanding

Hilary Putnam argues that philosophers who accept Hume's "is–ought" distinction reject his reasons in making it, and thus undermine the entire claim.

Various scholars have also indicated that, in the very work where Hume argues for the is–ought problem, Hume himself derives an "ought" from an "is". Such seeming inconsistencies in Hume have led to an ongoing debate over whether Hume actually held to the is–ought problem in the first place, or whether he meant that ought inferences can be made but only with good argumentation.

Who's on First?

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Who's...