The relationship between religion and schizophrenia is of particular interest to psychiatrists because of the similarities between religious experiences and psychotic episodes. Religious experiences often involve reports of auditory and/or visual phenomena, much as people with schizophrenia commonly report hallucinations and delusions that can resemble the events described in religious experiences. However, people who report such religious visions and voices typically say that they did not perceive them with their five senses; rather, they understand these experiences as an entirely internal process.
This differs from schizophrenia, in which the person does not recognize that their own thoughts and inner experiences originate within themselves. They report hearing, seeing, smelling, feeling, or tasting things that they believe to be real. Because they experience these hallucinations through their bodily senses, they perceive the events as happening outside their minds and cannot distinguish them from reality. In general, religion has been found to have "both a protective and a risk increasing effect" for schizophrenia.
A common report from those with schizophrenia
is some type of religious belief that many medical practitioners
consider to be delusional—such as the belief that they are possessed by
demons, that a god is talking to them, that they themselves are divine beings, or that they are prophets. Active and adaptive coping skills in subjects with residual
schizophrenia are associated with a sound spiritual, religious, or
personal belief system.
Trans-cultural studies have found that such beliefs are much more common in patients who also identify as Christian and/or reside in predominantly Christian areas such as Europe or North America. By comparison, patients in Japan much more commonly have delusions surrounding matters of shame and slander, and in Pakistan matters of paranoia regarding relatives and neighbors.
Schizophrenia is a complex psychotic disorder
in which symptoms include emotional blunting, intellectual
deterioration, social isolation, disorganized speech and behavior,
delusions, and hallucinations. The causes of schizophrenia are unclear,
but genetics appear to play a substantial role, as individuals with a family history of the disorder are far more likely to develop schizophrenia.
The disorder can be triggered and exacerbated by social and
environmental factors, with episodes becoming more apparent in periods
of high stress.
Neurologists have found that the brains of people with schizophrenia have larger ventricles (fluid-filled cavities) than the brains of those without the disorder, which is hypothesized to be due to a loss of nerve cells. Symptoms usually appear around the onset of early adulthood.
It is rare for a child to be diagnosed with schizophrenia, in part because of the difficulty in establishing which erroneous thoughts and beliefs can be attributed to normal childhood development and which to schizophrenia. With psychiatric medication (usually antipsychotics) and therapy, individuals with schizophrenia can live successful and productive lives.
Role of religion in schizophrenia treatment
Longitudinal studies have shown that people with schizophrenia have varying degrees of success when religion plays a significant role in their recovery. Religion can be either a helpful method of coping with the disorder or a significant hindrance to recovery; it tends to be most valuable for those who are active in a religious community. It can be difficult, however, to distinguish whether a religious experience is genuine to the spiritual person or is a positive symptom of the illness.
This is where a skilled and reliable therapist can help. Provided that a therapist is open to the use of religion in treatment, and that the patient is open to receiving such treatment, it is entirely possible to combine religion with professional therapy and medication in pursuit of a desirable outcome. Patients who are involved in their church and practice their spirituality daily while receiving psychiatric treatment have reported fewer symptoms and a better quality of life.
They learn to see their religion as a source of hope rather than a
tormenting reality.
Religion as a trigger for schizophrenia
Schizophrenia can be triggered by a variety of environmental factors,
including significant stress, intensely emotional situations, and
disturbing or uncomfortable experiences. It is possible that religion
can sometimes be a trigger for schizophrenia in those who are
vulnerable. Religious imagery is often grandiose and calls for profound personal change, which in a vulnerable individual could precipitate a psychotic episode by disrupting realistic thinking. A sufferer may come to believe that they themselves are a deity or messiah, and such delusions can lead to violent behavior toward others or toward themselves. Patients may also experience distressing symptoms if they believe that a god is inflicting illness on them as punishment, and some refuse treatment on religious grounds, believing that their delusions and hallucinations are a divine experience.
The neuroscience of religion, also known as "neurotheology" or "spiritual neuroscience," seeks to explain the biological and neurological processes behind religious experience. Researchers in this field study the neural correlates of subjective experiences of spirituality in order to explain how the brain responds to religious and spiritual practices and beliefs. This contrasts with the psychology of religion, which studies the behavioral responses to religious practices. Some critics warn of the limitations of neurotheology, worrying that it may reduce the socio-cultural complexity of religion to neurological factors.
Researchers in the neuroscience of religion use a range of scientific techniques to understand how brain pathways respond to spiritually based stimuli. The approach is interdisciplinary, drawing on neurological and evolutionary studies to understand the broader subjective experiences under which traditionally categorized spiritual or religious practices are organized, through a combination of scientific and cultural studies. Such studies include, but are not limited to, fMRI and EEG scans as well as theological and anthropological studies. Using these approaches, researchers can better understand how spirituality and religion affect the chemistry of the human brain and, in turn, how brain activity may shape experiences of transcendence and spirituality.
Terminology
Aldous Huxley was a writer and philosopher who wrote over 50 books and novels on topics ranging from dystopian science fiction to philosophical mysticism.
Neurotheology
Aldous Huxley coined the term "neurotheology" in his utopian novel Island, where he described the discipline as combining the cognitive neuroscience of religious experience and spirituality. The term has also been used in a less scientific context, as a subcategory of philosophy, and in some cases the mainstream scientific community considers it a pseudoscience.
Biocultural
In Armin W. Geertz's article Brain, Body and Culture: A Biocultural Theory of Religion, the term "biocultural" refers to humans as simultaneously biological and cultural animals. Geertz discusses the connection between the human brain and the rest of the body, stating that the brain does not work independently but in unison with the body's other sense organs, essentially arguing that "cognition functions in the embodiment of the brain." On this basis, he says that religio-spiritual practices that engage the other senses (such as dancing, chanting, or the use of psychoactive substances) have physical effects on brain chemistry. This varies cross-culturally, as different cultural and religious practices use different methods to induce a sense of divine transcendence. This, in turn, demonstrates the connection between biology and cultural context, since neither is uniform.
Religion
Spiritual practices and religious rituals have existed for hundreds of thousands of years, with some evidence, such as the remains of Homo naledi discovered in the Rising Star Cave, dating back roughly 300,000 years. Dave Vliegenthart's article Can Neurotheology Explain Religion? asks whether neurotheology is a legitimate way of explaining religious experiences. In it he defines the term "religion" as a "state of consciousness in which reality is deemed religious and thought and experienced through the lens of a particular human mind-set." This encompasses feelings of intuition, higher or altered states of consciousness, and a sense of connection to a divine being. Through attempts to achieve religious ecstasy, people have tried to connect with divine or ethereal beings, both to foster human connection and to attain higher wisdom. This goal of attaining eternal knowledge or harmony with the universe appears across cultures, as noted above in Geertz's work on biocultural studies.
Consciousness
According to an article in Scientific American, "consciousness" is everything a person experiences: a personal sense of reality based on the events of one's own life. The article discusses the neuronal correlates of consciousness and the neurological processes that underlie the brain's formation of conscious thought, describing how the senses relay information through the spinal cord to the cerebellum in order to translate physical experience into neurological interpretation. For hundreds of thousands of years humans have tried to find ways to alter their states of consciousness. These methods vary widely across cultural groups and religious practices, and even more from individual to individual. In Ancient Greece, maenads attempted this through ecstatic and frenzied dance. Sufi mysticism, also known as Rumism, has a similar practice in the whirling dervishes, who spin in circles to music in order to create a connection with the Divine. In more extreme cases, such attempts may include forms of asceticism such as fasting, celibacy, or extreme isolation.
History, Developments, and Theoretical Work
In an attempt to focus and clarify a growing interest in this field, educator and businessman Laurence O. McKinney published the first book on the subject in 1994, titled Neurotheology: Virtual Religion in the 21st Century. In addition to being written for a popular audience, it was promoted in the theological journal Zygon. According to McKinney, "neurotheology" grounds religious inquiry in relatively recent findings of developmental neurophysiology. McKinney's
theory emphasizes how pre-frontal development in humans creates an
illusion of chronological time as a fundamental part of normal adult
cognition past the age of three. The inability of the adult brain to
retrieve earlier images experienced by an infantile brain creates
questions such as "Where did I come from?" and "Where does it all go?"
He suggests that this neurological process led to the creation of
various religious explanations. Moreover, his account of the experience of death as a peaceful regression into timelessness as the brain dies won praise from readers as varied as the writer Arthur C. Clarke, the eminent theologian Harvey Cox, and the Dalai Lama, and sparked new interest in the field. Similarly, the radical Catholic theologian Eugen Drewermann developed a two-volume critique of traditional conceptions of God and the soul in which he reinterpreted religion in light of contemporary neuroscientific research.
The neuroscientist Andrew B. Newberg
has claimed that "intensely focused spiritual contemplation triggers an
alteration in the activity of the brain that leads one to perceive
transcendent religious experiences as solid, tangible reality. In other
words, the sensation that Buddhists call oneness with the universe." The orientation
area requires sensory input to do its calculus. "If you block sensory
inputs to this region, as you do during the intense concentration of meditation, you prevent the brain from forming the distinction between self
and not-self," says Newberg. With no information from the senses
arriving, the left orientation area cannot find any boundary between the
self and the world. As a result, the brain seems to have no choice but
"to perceive the self as endless and intimately interwoven with everyone
and everything." "The right orientation area, equally bereft of sensory
data, defaults to a feeling of infinite space. The meditators feel that
they have touched infinity." Still, it has also been argued "that neurotheology should be conceived and practiced within a theological framework."
Experimental Work
In 1969, British biologist Alister Hardy
founded a Religious Experience Research Centre (RERC) at Oxford after
retiring from his post as Linacre Professor of Zoology. Citing William James's The Varieties of Religious Experience (1902), he set out to collect first-hand accounts of numinous experiences. He was awarded the Templeton Prize before his death in 1985. His successor David Hay suggested in God's Biologist: A Life of Alister Hardy (2011) that the RERC later dispersed as investigators turned to newer techniques of scientific investigation.
Magnetic Stimulation Studies
During the 1980s, Michael Persinger stimulated the temporal lobes of human subjects with a weak magnetic field using an apparatus that popularly became known as the "God helmet" and reported that many of his subjects claimed to experience a "sensed presence" during stimulation. This work has been criticised, though some researchers have published a replication of one God Helmet experiment.
Granqvist et al. argued that Persinger's work was not double-blind, and reported that they had failed to replicate Persinger's experiments under double-blind conditions. They concluded that the magnetic field had no effect on the religious or spiritual experiences reported by the participants, which were instead predicted entirely by their suggestibility and personality traits. Following the publication of this study, Persinger et al. disputed these findings. One published attempt to create a "haunted room" using environmental
"complex" electromagnetic fields based on Persinger's theoretical and
experimental work did not produce the sensation of a "sensed presence"
and found that reports of unusual experiences were uncorrelated with the
presence or absence of these fields. As in the study by Granqvist et al., reports of unusual experiences were instead predicted by the personality characteristics and suggestibility of participants. One experiment with a commercial version of the God helmet found no
difference in response to graphic images whether the device was on or
off.
Vilayanur S. Ramachandran explored the neural basis of the hyperreligiosity seen in TLE using the galvanic skin response
(GSR), which correlates with emotional arousal, to determine whether
the hyperreligiosity seen in TLE was due to an overall heightened
emotional state or was specific to religious stimuli. Ramachandran
presented two subjects with neutral, sexually arousing and religious
words while measuring GSR. Ramachandran was able to show that patients
with TLE showed enhanced emotional responses to the religious words,
diminished responses to the sexually charged words, and normal responses
to the neutral words. The study was presented as an abstract at a neuroscience conference and referenced in Ramachandran's book, Phantoms in the Brain, but it was never published as a peer-reviewed scientific article.
Research by Mario Beauregard at the University of Montreal, using fMRI on Carmelite
nuns, has purported to show that religious and spiritual experiences involve several brain regions rather than a single 'God spot'. As Beauregard
has said, "There is no God spot in the brain. Spiritual experiences are
complex, like intense experiences with other human beings." The neuroimaging was conducted when the nuns were asked to recall
past mystical states, not while actually undergoing them; "subjects
were asked to remember and relive (eyes closed) the most intense
mystical experience ever felt in their lives as a member of the
Carmelite Order." A 2011 study by researchers at the Duke University Medical Center found that hippocampal atrophy is associated with older adults who report life-changing religious experiences, as well as with those who are "born-again Protestants, Catholics, and those with no religious affiliation".
A 2016 study using fMRI found "a recognizable feeling central to ... (Mormon)... devotional practice was reproducibly associated with activation in nucleus accumbens, ventromedial prefrontal cortex, and frontal attentional regions. Nucleus accumbens
activation preceded peak spiritual feelings by 1–3 s and was replicated
in four separate tasks. ... The association of abstract ideas and brain
reward circuitry may interact with frontal attentional and emotive
salience processing, suggesting a mechanism whereby doctrinal concepts
may come to be intrinsically rewarding and motivate behavior in
religious individuals."
Psychopharmacology
Some scientists working in the field hypothesize that the basis of spiritual experience arises in neurological physiology. Speculative suggestions have been made that an increase of dimethyltryptamine (DMT) levels in the pineal gland contribute to spiritual experiences. It has also been suggested that stimulation of the temporal lobe by psychoactive ingredients of magic mushrooms mimics religious experiences. This hypothesis has found laboratory validation with respect to psilocybin.
Cognitive science of religion is the study of religious thought, theory, and behavior from the perspective of the cognitive sciences.
Scholars in this field seek to explain how human minds acquire,
generate, and transmit religious thoughts, practices, and schemas by
means of ordinary cognitive capacities.
History
Although religion has been the subject of serious scientific study
since at least the late nineteenth century, the study of religion as a
cognitive phenomenon is relatively recent. While it often relies upon
earlier research within anthropology of religion and sociology of religion,
cognitive science of religion considers the results of that work within
the context of evolutionary and cognitive theories. As such, cognitive
science of religion was only made possible by the cognitive revolution of the 1950s and the development, starting in the 1970s, of sociobiology and other approaches explaining human behaviour in evolutionary terms, especially evolutionary psychology.
While Dan Sperber foreshadowed cognitive science of religion in his 1975 book Rethinking Symbolism, the earliest research to fall within the scope of the discipline was published during the 1980s. Stewart E. Guthrie's "A cognitive theory of religion" was significant for examining anthropomorphism in religion. This work ultimately led to the development of the concept of the hyperactive agency detection device, which is a key concept within cognitive science of religion. The work of Scott Atran on Cognitive Foundations of Natural History: Towards an Anthropology of Science
contrasted the cognitive processing of attention-arresting, and
therefore memorable and culturally transmissible, aspects of
counter-intuitive "mythico-religious beliefs" (e.g., bodiless beings)
with counter-intuitive aspects of scientific thinking that also
initially violate common-sense ontological assumptions about the
structure of the world (e.g., invisible creatures).
The field was formally established in the 1990s. During that
decade, a large number of highly influential and foundational books and
articles were published. These included Rethinking Religion: Connecting Cognition and Culture and Bringing Ritual to Mind: Psychological Foundations of Cultural Forms by E. Thomas Lawson and Robert McCauley, The Naturalness of Religious Ideas by Pascal Boyer, Inside the Cult and Arguments and Icons by Harvey Whitehouse, and Guthrie's book-length development of his earlier theories in Faces in the Clouds.
In the 1990s, these and other researchers, who had been working
independently in a variety of different disciplines, discovered each
other's work and found valuable parallels between their approaches, with
the result that something of a self-aware research tradition began to
coalesce. By 2000, the field was well-enough defined for Justin L. Barrett to coin the term 'cognitive science of religion' in his article "Exploring the natural foundations of religion".
The field remains somewhat loosely defined, bringing together
researchers from various subfields. Much of the cohesion in the field
comes not from shared detailed theoretical commitments but from a shared
methodological perspective: the willingness to view religion in
cognitive and evolutionary terms.
Despite a lack of agreement concerning the theoretical basis for work
in cognitive science of religion, it is possible to outline some
tendencies. Most significant of these is reliance upon the theories
developed within evolutionary psychology.
That particular approach to evolutionary explanations of human
behaviour is particularly suitable to the cognitive byproduct
explanation of religion that is most popular among cognitive scientists
of religion. This is because of the focus on byproduct and ancestral trait
explanations within evolutionary psychology. A particularly significant
concept associated with this approach is modularity of mind,
used as it is to underpin accounts of the mental mechanisms seen to be
responsible for religious beliefs. Important examples of work that falls
under this rubric are provided by research carried out by Pascal Boyer and Justin L. Barrett.
These theoretical commitments are not shared by all cognitive
scientists of religion, however. Ongoing debates regarding the
comparative advantages of different evolutionary explanations for human
behaviour find a reflection within cognitive science of religion with dual inheritance theory
recently gaining adherents among researchers in the field, including
Armin Geertz and Ara Norenzayan. The perceived advantage of this
theoretical framework is its ability to deal with more complex
interactions between cognitive and cultural phenomena, but it comes at
the cost of experimental design having to take into consideration a
richer range of possibilities.
Main concepts
Cognitive byproduct
The view that religious beliefs and practices should be understood as
nonfunctional but as produced by human cognitive mechanisms that are
functional outside of the context of religion. Examples include the hyperactive agent detection device and minimally counterintuitive concepts, as well as the process of initiation as an explanation for Buddhism and Taoism. The cognitive byproduct explanation of religion is an application of the concepts of spandrel and exaptation explored by Stephen Jay Gould, among others. The view that religious beliefs and practices are evolutionary spandrels has a number of critics.
Minimally counterintuitive concepts
Concepts that mostly fit human preconceptions but break with them in
one or two striking ways. These concepts are both easy to remember
(thanks to the counterintuitive elements) and easy to use (thanks to
largely agreeing with what people expect). Examples include talking
trees and noncorporeal agents. Pascal Boyer argues that many religious entities fit into this category. Upal labelled the fact that minimally counterintuitive ideas are better
remembered than intuitive and maximally counterintuitive ideas as the minimal counterintuitiveness effect or the MCI-effect.
Hyperactive agency detection device
Cognitive scientist Justin L. Barrett
postulates that this mental mechanism, whose function is to identify
the activity of agents, may contribute to belief in the presence of the
supernatural. Given the relative costs of failing to spot an agent, the
mechanism is said to be hyperactive, producing a large number of false positive errors. Stewart E. Guthrie and others have claimed these errors can explain the appearance of supernatural concepts.
Pro-social adaptation
According to the prosocial adaptation account of religion, religious
beliefs and practices should be understood as having the function of
eliciting adaptive prosocial behaviour and avoiding the free rider problem. Within the cognitive science of religion this approach is primarily pursued by Richard Sosis. David Sloan Wilson
is another major proponent of this approach and interprets religion as a
group-level adaptation, but his work is generally seen as falling
outside the cognitive science of religion.
Costly signaling
Practices that, due to their inherent cost, can be relied upon to provide an honest signal
regarding the intentions of the agent. Richard Sosis has suggested that
religious practices can be explained as costly signals of the
willingness to cooperate. A similar line of argument has been pursued by
Lyle Steadman and Craig Palmer. Alternatively, D. Jason Slone has
argued that religiosity may be a costly signal used as a mating strategy insofar as religiosity serves as a proxy for "family values".
Dual inheritance
In the context of cognitive science of religion, dual inheritance
theory can be understood as attempting to combine the cognitive
byproduct and prosocial adaptation accounts using the theoretical
approach developed by Robert Boyd and Peter Richerson,
among others. The basic view is that while belief in supernatural
entities is a cognitive byproduct, cultural traditions have recruited
such beliefs to motivate prosocial behaviour. A sophisticated statement
of this approach can be found in Scott Atran and Joseph Henrich (2010).
Humanity's closest living relatives are common chimpanzees and bonobos. These primates share a common ancestor with humans that lived between six and eight million years ago. It is
for this reason that chimpanzees and bonobos are viewed as the best
available surrogate for this common ancestor. Barbara King argues that
while non-human primates are not religious, they do exhibit some traits
that would have been necessary for the evolution of religion. These
traits include high intelligence, a capacity for symbolic communication, a sense of social norms, and realization of "self" continuity.
Elephants
perform rituals for their dead. They demonstrate long periods of
silence and mourning at the point of death; later, elephants return to
grave sites and caress the remains. Some evidence suggests that many species grieve death and loss.
Relevant prerequisites for human religion
Increased brain size
In this set of theories, the religious mind is one consequence of a
brain that is large enough to formulate religious and philosophical
ideas. During human evolution, the hominid brain tripled in size, peaking 500,000 years ago. Much of the brain's expansion took place in the neocortex. The cerebral neocortex is presumed to be responsible for the neural computations underlying complex phenomena such as perception, thought, language, attention, episodic memory and voluntary movement. According to Dunbar's theory, the relative neocortex size of any species correlates with the level of social complexity of the particular species. The neocortex size correlates with a number of social variables that
include social group size and complexity of mating behaviors. In chimpanzees the neocortex occupies 50% of the brain, whereas in modern humans it occupies 80% of the brain.
Robin Dunbar argues that the critical event in the evolution of the neocortex took place at the speciation of archaic Homo sapiens
about 500,000 years ago. His study indicates that only after the
speciation event is the neocortex large enough to process complex social
phenomena such as language and religion. The study is based on a regression analysis of neocortex size plotted against a number of social behaviors of living and extinct hominids.
Stephen Jay Gould
suggests that religion may have grown out of evolutionary changes that favored larger brains as a means of cementing group coherence among savanna hunters, once that larger brain enabled reflection on the inevitability of personal mortality.
Tool use
Lewis Wolpert
argues that causal beliefs that emerged from tool use played a major
role in the evolution of belief. The manufacture of complex tools
requires creating a mental image of an object that does not exist
naturally before actually making the artifact. Furthermore, one must understand how the tool will be used, which requires an understanding of causality. Accordingly, the level of sophistication of stone tools is a useful indicator of causal beliefs. Wolpert contends that the use of tools composed of more than one component, such as hand axes, represents an ability to understand cause and effect.
However, recent studies of other primates indicate that an understanding of causality may not be uniquely human. For example, chimpanzees have been known to escape from pens closed with multiple latches, a feat previously thought possible only for humans who understood causality. Chimpanzees are also known to mourn the dead and to notice things of purely aesthetic value, such as sunsets, both of which may be considered components of religion or spirituality. The difference between human and chimpanzee comprehension of causality is one of degree: the degree of comprehension in an animal depends on the size of the prefrontal cortex, with a larger prefrontal cortex allowing deeper comprehension.
Religion requires a system of symbolic communication, such as language, to be transmitted from one individual to another. Philip Lieberman states "human religious thought and moral sense clearly rest on a cognitive-linguistic base". From this premise science writer Nicholas Wade states:
"Like most behaviors that are found in societies throughout the
world, religion must have been present in the ancestral human population
before the dispersal from Africa 50,000 years ago. Although religious
rituals usually involve dance and music, they are also very verbal,
since the sacred truths have to be stated. If so, religion, at least in
its modern form, cannot pre-date the emergence of language. It has been
argued earlier that language attained its modern state shortly before
the exodus from Africa. If religion had to await the evolution of
modern, articulate language, then it too would have emerged shortly
before 50,000 years ago."
Another view distinguishes individual religious belief from
collective religious belief. While the former does not require prior
development of language, the latter does. The individual human brain has
to explain a phenomenon in order to comprehend and relate to it. This
activity predates by far the emergence of language and may have caused
it. The theory is, belief in the supernatural emerges from hypotheses
arbitrarily assumed by individuals to explain natural phenomena that
cannot be explained otherwise. The resulting need to share individual
hypotheses with others leads eventually to collective religious belief. A socially accepted hypothesis becomes dogma, backed by social sanction.
Language consists of digital contrasts whose cost is essentially
zero. As pure social conventions, signals of this kind cannot evolve in a
Darwinian social world—they are a theoretical impossibility.Being intrinsically unreliable, language works only if one can build up
Darwinian social world—they are a theoretical impossibility. Being intrinsically unreliable, language works only if one can build up
a reputation for trustworthiness within a certain kind of
society—namely, one where symbolic cultural facts (sometimes called
'institutional facts') can be established and maintained through
collective social endorsement. In any hunter-gatherer society, the basic mechanism for establishing trust in symbolic cultural facts is collective ritual.
Transcending the continuity-versus-discontinuity divide, some
scholars view the emergence of language as the consequence of some kind
of social transformation that, by generating unprecedented levels of public trust, liberated a
genetic potential for linguistic creativity that had previously lain
dormant. "Ritual/speech coevolution theory" exemplifies this approach. Scholars in this intellectual camp point to the fact that even chimpanzees and bonobos have latent symbolic capacities that they rarely—if ever—use in the wild. Objecting to the sudden mutation idea, these authors argue that even if
a chance mutation were to install a language organ in an evolving
bipedal primate, it would be adaptively useless under all known primate
social conditions. A very specific social structure—one capable of
upholding unusually high levels of public accountability and trust—must
have evolved before or concurrently with language to make reliance on
"cheap signals" (words) an evolutionarily stable strategy.
The animistic nature of early human language could serve as the
handicap-like cost that helped to ensure the reliability of
communication. The attribution of spiritual essence to everything
surrounding early humans served as a built-in mechanism that provided
instant verification and ensured
the inviolability of one's speech.
Animal vocal signals are, for the most part, intrinsically
reliable. When a cat purrs, the signal constitutes direct evidence of
the animal's contented state. The signal is trusted, not because the cat
is inclined to be honest, but because it just cannot fake that sound.
Primate vocal calls may be slightly more manipulable, but they remain
reliable for the same reason—because they are hard to fake. Primate social intelligence is "Machiavellian"—self-serving
and unconstrained by moral scruples. Monkeys and apes often attempt to
deceive each other, while at the same time remaining constantly on guard
against falling victim to deception themselves. Paradoxically, it is theorized that primates' resistance to deception
is what blocks the evolution of their signalling systems along
language-like lines. Language is ruled out because the best way to guard
against being deceived is to ignore all signals except those that are
instantly verifiable. Words automatically fail this test.
Frans de Waal and Barbara King both view human morality as having grown out of primate sociality. Although morality awareness may be a unique human trait, many social animals, such as primates, dolphins and whales, have been known to exhibit pre-moral sentiments. According to Michael Shermer, the following characteristics are shared by humans and other social animals, particularly the great apes:
attachment and bonding, cooperation
and mutual aid, sympathy and empathy, direct and indirect reciprocity,
altruism and reciprocal altruism, conflict resolution and peacemaking,
deception and deception detection, community concern and caring about
what others think about you, and awareness of and response to the social
rules of the group.
De Waal contends that all social animals have had to restrain or
alter their behavior for group living to be worthwhile. Pre-moral
sentiments evolved in primate societies as a method of restraining
individual selfishness and building more cooperative groups. For any
social species, the benefits of being part of an altruistic group should
outweigh the benefits of individualism. For example, a lack of group
cohesion could make individuals more vulnerable to attack from
outsiders. Being part of a group may also improve the chances of finding
food. This is evident among animals that hunt in packs to take down large or dangerous prey.
All social animals have hierarchical societies in which each
member knows its own place. Social order is maintained by certain rules
of expected behavior and dominant group members enforce order through
punishment. Additionally, higher order primates also have a sense of
fairness.
Chimpanzees live in fission-fusion groups that average 50 individuals. It is likely that early ancestors of humans lived in groups of similar size. Based on the size of extant hunter-gatherer societies, recent Paleolithic hominids
lived in bands of a few hundred individuals. As community size
increased over the course of human evolution, greater enforcement to
achieve group cohesion would have been required. Morality may have
evolved in these bands of 100 to 200 people as a means of social
control, conflict resolution and group solidarity. According to Dr. de
Waal, human morality has two extra levels of sophistication that are not
found in primate societies.
Psychologist Matt J. Rossano argues that religion emerged after
morality and built upon morality by expanding the social scrutiny of
individual behavior to include supernatural
agents. By including ever-watchful ancestors, spirits and gods in the
social realm, humans discovered an effective strategy for restraining
selfishness and building more cooperative groups. The adaptive value of religion would have enhanced group survival. Rossano is referring here to collective religious belief and the social
sanction that institutionalized morality. On Rossano's account,
individual religious belief is thus initially epistemological,
not ethical, in nature.
Cognitive scientists have argued that religions may be explained as a
result of the brain architecture that developed early in the genus Homo over the course of evolutionary history.
Nonetheless, there is disagreement on the exact mechanisms that drove
the evolution of the religious mind. The two main schools of thought
hold:
either that religion evolved due to natural selection and has selective advantage
or that religion is an evolutionary byproduct of other mental adaptations.
Stephen Jay Gould, for example, saw religion as an exaptation or a spandrel; in other words, religion evolved as a byproduct of psychological mechanisms that evolved for other reasons.
Such mechanisms may include the ability to infer the presence of organisms that might do harm (agent detection), the ability to come up with causal narratives for natural events (etiology), and the ability to recognize that other people have minds of their own with their own beliefs, desires and intentions (theory of mind).
These three adaptations (among others) allow human beings to imagine
purposeful agents behind many observations that could not readily be
explained otherwise, e.g. thunder, lightning, movement of planets,
complexity of life. The emergence of collective religious belief identified such agents as deities that standardized the explanation.
Some scholars have suggested that religion is genetically "hardwired" into the human condition. One controversial proposal, the God gene hypothesis, states that some variants of a specific gene, the VMAT2 gene, predispose to spirituality.
Another view builds on the concept of the triune brain: the reptilian brain, the limbic system, and the neocortex, proposed by Paul D. MacLean.
Collective religious belief draws upon the emotions of love, fear, and
gregariousness and is deeply embedded in the limbic system through
socio-biological conditioning and social sanction. Individual religious
belief utilizes reason based in the neocortex and often varies from
collective religion. The limbic system is much older in evolutionary
terms than the neocortex and is, therefore, stronger than it – much in
the same way as the reptilian brain is stronger than both the limbic system
and the neocortex.
Yet another view is that the behavior of people who participate in a religion makes them feel better and this improves their biological fitness,
so that there is a genetic selection in favor of people who are willing
to believe in a religion. Specifically, rituals, beliefs, and the
social contact typical of religious groups may serve to calm the mind
(for example by reducing ambiguity and the uncertainty due to
complexity) and allow it to function better when under stress. Religion could thus serve as a powerful survival mechanism, particularly by facilitating the evolution of warrior hierarchies; if so, this may explain why many modern religions tend to promote fertility and kinship.
Still another view, proposed by Fred H. Previc, sees human religion as a product of an increase in dopaminergic functions in the human brain and of a general intellectual expansion beginning around 80 thousand years ago (kya). Dopamine promotes an emphasis on distant space and time, which can correlate with religious experience. While the earliest extant shamanic cave-paintings date to around 40 kya,
the use of ocher for rock art predates this and there is clear evidence
for abstract thinking along the coast of South Africa 80 kya.
Paul Bloom suggests that "certain early emergent cognitive biases ... make it natural to believe in Gods and spirits".
The earliest evidence of religious thought is based on the ritual
treatment of the dead. Most animals display only a casual interest in
the dead of their own species. Ritual burial thus represents a significant change in human behavior.
Ritual burials represent an awareness of life and death and a possible
belief in the afterlife. Philip Lieberman states "burials with grave goods clearly signify religious practices and concern for the dead that transcends daily life."
The earliest evidence for treatment of the dead comes from Atapuerca in Spain. At this location the bones of 30 individuals believed to be Homo heidelbergensis have been found in a pit. Neanderthals are also contenders for the first hominids
to intentionally bury the dead. They may have placed corpses into
shallow graves along with stone tools and animal bones. The presence of
these grave goods
may indicate an emotional connection with the deceased and possibly a
belief in the afterlife. Neanderthal burial sites include Shanidar in Iraq, Krapina in Croatia, and Kebara Cave in Israel.
The earliest known burial of modern humans is from Qafzeh Cave in Israel, where human remains dated to 100,000 years ago were found stained with red ocher.
A variety of grave goods were found at the burial site. The mandible of
a wild boar was found placed in the arms of one of the skeletons. Philip Lieberman states:
Burial rituals incorporating grave
goods may have been invented by the anatomically modern hominids who
emigrated from Africa to the Middle East roughly 100,000 years ago.
Matt Rossano suggests that the period between 80,000 and 60,000 years
before present, following the retreat of humans from the Levant to
Africa, was a crucial period in the evolution of religion.
Use of symbolism
The use of symbolism in religion is a universally established phenomenon. Archeologist Steven Mithen
contends that it is common for religious practices to involve the
creation of images and symbols to represent supernatural beings and
ideas. Because supernatural beings violate the principles of the natural
world, there will always be difficulty in communicating and sharing
supernatural concepts with others. This problem can be overcome by
anchoring these supernatural beings in material form through
representational art. When translated into material form, supernatural
concepts become easier to communicate and understand. Due to the association of art and religion, evidence of symbolism in
the fossil record is indicative of a mind capable of religious thoughts.
Art and symbolism demonstrate a capacity for abstract thought and
imagination necessary to construct religious ideas. Wentzel van
Huyssteen states that the translation of the non-visible through
symbolism enabled early human ancestors to hold beliefs in abstract
terms.
Some of the earliest evidence of symbolic behavior is associated with Middle Stone Age sites in Africa. From at least 100,000 years ago, there is evidence of the use of pigments such as red ocher.
Pigments are of little practical use to hunter-gatherers, thus evidence
of their use is interpreted as symbolic or for ritual purposes. Among
extant hunter-gatherer populations around the world, red ocher is still
used extensively for ritual purposes. It has been argued that it is
universal among human cultures for the color red to represent blood,
sex, life and death.
The use of red ocher as a proxy for symbolism is often criticized as being too indirect. Some scientists, such as Richard Klein and Steven Mithen,
only recognize unambiguous forms of art as representative of abstract
ideas. Upper Paleolithic cave art provides some of the most unambiguous
evidence of religious thought from the Paleolithic. Cave paintings at Chauvet depict creatures that are half human and half animal.
Organized religion traces its roots to the Neolithic Revolution that began 11,000 years ago in the Near East,
but may have occurred independently in several other locations around
the world. The invention of agriculture transformed many human societies
from a hunter-gatherer lifestyle to a sedentary lifestyle.
The Neolithic Revolution led to a population explosion and an
acceleration in the pace of technological development. The transition
from foraging bands to states and empires precipitated more specialized
and developed forms of religion that reflected the new social and
political environment. While bands and small tribes possess supernatural
beliefs, these beliefs do not serve to justify a central authority,
justify transfer of wealth or maintain peace between unrelated
individuals.
Organized religion emerged as a means of providing social and economic
stability through the following ways:
Justifying the central authority, which in turn possessed the
right to collect taxes in return for providing social and security
services.
Bands and tribes consist of a small number of related individuals.
States and nations are composed of many thousands of unrelated
individuals. Jared Diamond
argues that organized religion served to provide a bond between
unrelated individuals who would otherwise be more prone to enmity. In
his book Guns, Germs, and Steel he argues that the leading cause of death among hunter-gatherer societies is murder.
Religions that revolved around moralizing gods may have facilitated
the rise of large, cooperative groups of unrelated individuals.
The states born out of the Neolithic Revolution, such as those of Ancient Egypt and Mesopotamia, were theocracies with chiefs, kings and emperors playing dual roles of political and spiritual leaders. Anthropologists have found that virtually all state societies and
chiefdoms around the world justify political
power through divine authority. This suggests that political authority
co-opts collective religious belief to bolster itself.
Following the Neolithic Revolution, the pace of technological
development (cultural evolution) intensified due to the invention of
writing 5,000 years ago. Symbols that later became words made
effective communication of ideas possible. Printing, invented roughly a
thousand years ago, rapidly increased the speed of communication and
became the mainspring of cultural evolution.
Writing is thought to have been first invented in either Sumer or
Ancient Egypt, and was initially used for accounting. Soon after,
writing was used to record myth. The first religious texts mark the
beginning of religious history. The Pyramid Texts from ancient Egypt form one of the oldest known religious texts in the world, dating to between 2400 and 2300 BCE. Writing played a major role in sustaining and spreading organized
religion. In pre-literate societies, religious ideas were based on an oral tradition,
which was articulated by shamans and remained limited to the collective
memories of the society's inhabitants. With the advent of writing,
information that was not easy to remember could easily be stored in
sacred texts that were maintained by a select group (clergy). Humans
could store and process large amounts of information with writing that
otherwise would have been forgotten. Writing therefore enabled religions
to develop coherent and comprehensive doctrinal systems that remained
independent of time and place. Writing also brought a measure of objectivity to human knowledge.
Formulation of thoughts in words and the requirement for validation made
possible the mutual exchange of ideas and the sifting of generally
acceptable from unacceptable ideas. The generally acceptable ideas
became objective knowledge reflecting the continuously evolving
framework of human awareness of reality that Karl Popper calls 'verisimilitude' – a stage on the human journey to truth.
The school of skepticism questions the human ability to attain knowledge, while fallibilism says that knowledge is never certain. Empiricists hold that all knowledge comes from sense experience, whereas rationalists believe that some knowledge does not depend on it. Coherentists argue that a belief is justified if it is consistent with other beliefs. Foundationalists, by contrast, maintain that the justification of basic beliefs does not depend on other beliefs. Internalism and externalism debate whether justification is determined solely by mental states or also by external circumstances.
Separate branches of epistemology focus on knowledge in specific
fields, like scientific, mathematical, moral, and religious knowledge. Naturalized epistemology relies on empirical methods and discoveries, whereas formal epistemology uses formal tools from logic. Social epistemology investigates the communal aspect of knowledge, and historical epistemology examines its historical conditions. Epistemology is closely related to psychology,
which infers the beliefs people hold from their words and actions,
while epistemology studies the norms governing the evaluation of
beliefs. It also intersects with fields such as decision theory, education, and anthropology.
Early reflections on the nature, sources, and scope of knowledge are found in ancient Greek, Indian, and Chinese philosophy. The relation between reason and faith was a central topic in the medieval period. The modern era
was characterized by the contrasting perspectives of empiricism and
rationalism. Epistemologists in the 20th century examined the
components, structure, and value of knowledge while integrating insights
from the natural sciences and linguistics.
Definition
Epistemology is the philosophical study of knowledge and related concepts, such as justification. Also called theory of knowledge, it examines the nature and types of knowledge. It further investigates the sources of knowledge, like perception, inference, and testimony,
to understand how knowledge is created. Another set of questions
concerns the extent and limits of knowledge, addressing what people can
and cannot know. Central concepts in epistemology include belief, truth, evidence, and reason. As one of the main branches of philosophy, epistemology stands alongside fields like ethics, logic, and metaphysics. The term can also refer to specific positions of philosophers within this branch, as in Plato's epistemology and Immanuel Kant's epistemology.
Epistemology explores how people should acquire beliefs. It
determines which beliefs or forms of belief acquisition meet the
standards or epistemic goals of knowledge and which ones fail, thereby
providing an evaluation of beliefs. The fields of psychology and cognitive sociology
are also interested in beliefs and related cognitive processes, but
examine them from a different perspective. Unlike epistemology, they
study the beliefs people actually have and how people acquire them
instead of examining the evaluative norms of these processes. In this regard, epistemology is a normative discipline, whereas psychology and cognitive sociology are descriptive disciplines. Epistemology is relevant to many descriptive and normative disciplines,
such as the other branches of philosophy and the sciences, by exploring
the principles of how they may arrive at knowledge.
The word epistemology comes from the ancient Greek terms ἐπιστήμη (episteme, meaning knowledge or understanding) and λόγος (logos, meaning study of or reason), literally,
the study of knowledge. Despite its ancient roots, the word itself was
coined only in the 19th century to designate this field as a distinct
branch of philosophy.
Central concepts
Epistemologists examine several foundational concepts to understand
their essences and rely on them to formulate theories. Various
epistemological disagreements have their roots in disputes about the
nature and function of these concepts, like the controversies
surrounding the definition of knowledge and the role of justification in it.
Knowledge is an awareness, familiarity, understanding, or skill. Its
various forms all involve a cognitive success through which a person
establishes epistemic contact with reality. Epistemologists typically understand knowledge as an aspect of individuals, generally as a cognitive mental state
that helps them understand, interpret, and interact with the world.
While this core sense is of particular interest to epistemologists, the
term also has other meanings. For example, the epistemology of groups
examines knowledge as a characteristic of a group of people who share
ideas. The term can also refer to information stored in documents and computers.
Knowledge contrasts with ignorance,
often simply defined as the absence of knowledge. Knowledge is usually
accompanied by ignorance because people rarely have complete knowledge
of a field, forcing them to rely on incomplete or uncertain information
when making decisions. Even though many forms of ignorance can be mitigated through education
and research, certain limits to human understanding result in inevitable
ignorance. Some limitations are inherent in the human cognitive faculties themselves, such as the inability to know facts too complex for the human mind to conceive. Others depend on external circumstances when no access to the relevant information exists.
Epistemologists disagree on how much people know, for example,
whether fallible beliefs can amount to knowledge or whether absolute
certainty is required. The most stringent position is taken by radical skeptics, who argue that there is no knowledge at all.
Types
Bertrand Russell originated the distinction between propositional knowledge and knowledge by acquaintance.
Epistemologists distinguish between different types of knowledge. Their primary interest is in knowledge of facts, called propositional knowledge. It is theoretical knowledge that can be expressed in declarative sentences using a that-clause, like "Ravi knows that kangaroos hop". For this reason, it is also called knowledge-that. Epistemologists often understand it as a relation between a knower and a known proposition, in the case above between the person Ravi and the proposition "kangaroos hop". It is use-independent since it is not tied to one specific purpose,
unlike practical knowledge. It is a mental representation that embodies
concepts and ideas to reflect reality. Because of its theoretical nature, it is typically held that only
creatures with highly developed minds, such as humans, possess
propositional knowledge.
Propositional knowledge contrasts with non-propositional knowledge in the form of knowledge-how and knowledge by acquaintance. Knowledge-how is a practical ability or skill, like knowing how to read or how to prepare lasagna. It is usually tied to a specific goal and not mastered in the abstract without concrete practice. To know something by acquaintance means to have an immediate
familiarity with or awareness of it, usually as a result of direct
experiential contact. Examples are "familiarity with the city of Perth", "knowing the taste of tsampa", and "knowing Marta Vieira da Silva personally".
The analytic–synthetic distinction has its roots in the philosophy of Immanuel Kant.
Another influential distinction in epistemology is between a posteriori and a priori knowledge. A posteriori knowledge is knowledge of empirical facts based on sensory experience, like "seeing that the sun is shining" and "smelling that a piece of meat has gone bad". This type of knowledge is associated with the empirical sciences and everyday affairs. A priori
knowledge, by contrast, pertains to non-empirical facts and does not
depend on evidence from sensory experience, like knowing that 2 + 2 = 4. It belongs to fields such as mathematics and logic. The distinction between a posteriori and a priori knowledge is central to the debate between empiricists and rationalists regarding whether all knowledge depends on sensory experience.
A closely related contrast is between analytic and synthetic truths.
A sentence is analytically true if its truth depends only on the
meanings of the words it uses. For instance, the sentence "all bachelors
are unmarried" is analytically true because the word "bachelor" already
includes the meaning "unmarried". A sentence is synthetically true if
its truth depends on additional facts. For example, the sentence "snow
is white" is synthetically true because its truth depends on the color
of snow in addition to the meanings of the words snow and white. A priori knowledge is primarily associated with analytic sentences, whereas a posteriori
knowledge is primarily associated with synthetic sentences. However, it
is controversial whether this is true for all cases. Some philosophers,
such as Willard Van Orman Quine, reject the distinction, saying that there are no analytic truths.
The analysis of knowledge is the attempt to identify the essential components or conditions of all and only propositional knowledge states. According to the so-called traditional analysis, knowledge has three components: it is a belief that is justified and true. In the second half of the 20th century, this view was challenged by a series of thought experiments aiming to show that some justified true beliefs do not amount to knowledge. In one of them, a person is unaware of all the fake barns
in their area. By coincidence, they stop in front of the only real barn
and form a justified true belief that it is a real barn. Many epistemologists agree that this is not knowledge because the justification is not directly relevant to the truth. More specifically, this and similar counterexamples involve some form
of epistemic luck, that is, a cognitive success that results from
fortuitous circumstances rather than competence.
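The traditional analysis discussed above can be stated schematically. The operator shorthand used below (B_S for belief, J_S for justification) is introduced here for illustration and is not notation from the text:

```latex
% Traditional ("JTB") analysis of propositional knowledge:
% a subject S knows a proposition p if and only if
%   (i)   p is true,
%   (ii)  S believes that p,
%   (iii) S is justified in believing that p.
S \text{ knows that } p
    \iff
    p \;\land\; B_S\,p \;\land\; J_S\,p
% where B_S p abbreviates "S believes that p" and
% J_S p abbreviates "S is justified in believing that p".
% Counterexamples such as the fake-barn case describe situations in which
% all three conjuncts hold, yet intuitively S does not know that p.
```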
The so-called traditional analysis says that knowledge is justified true belief. Edmund Gettier tried to show that some justified true beliefs do not amount to knowledge.
Following these thought experiments, philosophers proposed various alternative definitions of knowledge by modifying or expanding the traditional analysis. According to one view, the known fact has to cause the belief in the right way. Another theory states that the belief is the product of a reliable belief formation process. Further approaches require that the person would not have the belief if it was false, that the belief is not inferred from a falsehood, that the justification cannot be undermined, or that the belief is infallible. There is no consensus on which of the proposed modifications and reconceptualizations is correct. Some philosophers, such as Timothy Williamson, reject the basic assumption underlying the analysis of knowledge by arguing that propositional knowledge is a unique state that cannot be dissected into simpler components.
Value
The value of knowledge is the worth it holds by expanding understanding and guiding action. Knowledge can have instrumental value by helping a person achieve their goals. For example, knowledge of a disease helps a doctor cure their patient. The usefulness of a known fact depends on the circumstances. Knowledge
of some facts may have little to no use, like memorizing random phone
numbers from an outdated phone book. Being able to assess the value of knowledge matters in choosing what
information to acquire and share. It affects decisions like which
subjects to teach at school and how to allocate funds to research
projects.
Epistemologists are particularly interested in whether knowledge is more valuable than a mere true opinion. Knowledge and true opinion often have a similar usefulness since both
accurately represent reality. For example, if a person wants to go to Larissa, a true opinion about the directions can guide them as effectively as knowledge. Considering this problem, Plato proposed that knowledge is better because it is more stable. Another suggestion focuses on practical reasoning, arguing that people put more trust in knowledge than in mere true opinions when drawing conclusions and deciding what to do. A different response says that knowledge has intrinsic value in
addition to instrumental value. This view asserts that knowledge is
always valuable, whereas true opinion is only valuable in circumstances
where it is useful.
Beliefs are mental states about what is the case, like believing that snow is white or that God exists. In epistemology, they are often understood as subjective attitudes that affirm or deny a proposition, which can be expressed in a declarative sentence.
For instance, to believe that snow is white is to affirm the
proposition "snow is white". According to this view, beliefs are
representations of what the universe is like. They are stored in memory
and retrieved when actively thinking about reality or deciding how to
act. A different view understands beliefs as behavioral patterns or dispositions
to act rather than as representational items stored in the mind.
According to this perspective, to believe that there is mineral water in
the fridge is nothing more than a group of dispositions related to
mineral water and the fridge. Examples are the dispositions to answer
questions about the presence of mineral water affirmatively and to go to
the fridge when thirsty. Some theorists deny the existence of beliefs, saying that this concept borrowed from folk psychology oversimplifies much more complex psychological or neurological processes. Beliefs are central to various epistemological debates, which cover
their status as a component of propositional knowledge, the question of
whether people have control over and responsibility for their beliefs, and the issue of whether beliefs have degrees, called credences.
As propositional attitudes, beliefs are true or false depending on whether they affirm a true or a false proposition. According to the correspondence theory of truth,
to be true means to stand in the right relation to the world by
accurately describing what it is like. This means that truth is
objective: a belief is true if it corresponds to a fact. The coherence theory of truth
says that a belief is true if it belongs to a coherent system of
beliefs. A result of this view is that truth is relative since it
depends on other beliefs. Further theories of truth include pragmatist, semantic, pluralist, and deflationary theories. Truth plays a central role in epistemology as a goal of cognitive processes and an attribute of propositional knowledge.
In epistemology, justification is a property of beliefs that meet
certain norms about what a person should believe. According to a common
view, this means that the person has sufficient reasons for holding this
belief because they have information that supports it. Another view states that a belief is justified if it is formed by a reliable belief formation process, such as perception. The terms reasonable, warranted, and supported are sometimes used as synonyms of the word justified. Justification distinguishes well-founded beliefs from superstition and lucky guesses. However, it does not guarantee truth. For example, a person with strong
but misleading evidence may form a justified belief that is false.
Epistemologists often identify justification as a key component of knowledge. Usually, they are not only interested in whether a person has a sufficient reason to hold a belief, known as propositional justification, but also in whether the person holds the belief because or based on this reason, known as doxastic justification.
For example, if a person has sufficient reason to believe that a
neighborhood is dangerous but forms this belief based on superstition
then they have propositional justification but lack doxastic
justification.
Sources
Sources of justification are ways or cognitive capacities through
which people acquire justification. Often-discussed sources include perception, introspection, memory, reason, and testimony, but there is no universal agreement about the extent to which each of them provides valid justification. Perception relies on sensory organs to gain empirical information. Distinct forms of perception correspond to different physical stimuli, such as visual, auditory, haptic, olfactory, and gustatory perception. Perception is not merely the reception of sense impressions but an active process that selects, organizes, and interprets sensory signals. Introspection is a closely related process focused on internal mental states
rather than external physical objects. For example, seeing a bus at a
bus station belongs to perception while feeling tired belongs to
introspection.
Rationalists understand reason as a source of justification for
non-empirical facts, explaining how people can know about mathematical,
logical, and conceptual truths. Reason is also responsible for
inferential knowledge, in which one or more beliefs serve as premises to
support another belief. Memory depends on information provided by other sources, which it
retains and recalls, like remembering a phone number perceived earlier. Justification by testimony relies on information one person
communicates to another person. This can happen by talking to each other
but can also occur in other forms, like a letter, a newspaper, and a
blog.
Other concepts
Rationality is closely related to justification and the terms rational belief and justified belief
are sometimes used interchangeably. However, rationality has a wider
scope that encompasses both a theoretical side, covering beliefs, and a
practical side, covering decisions, intentions, and actions. There are different conceptions about what it means for something to be
rational. According to one view, a mental state is rational if it is
based on or responsive to good reasons. Another view emphasizes the role
of coherence, stating that rationality requires that the different
mental states of a person are consistent and support each other. A slightly different approach holds that rationality is about achieving
certain goals. Two goals of theoretical rationality are accuracy and
comprehensiveness, meaning that a person has as few false beliefs and as
many true beliefs as possible.
Epistemologists rely on the concept of epistemic norms as
criteria to assess the cognitive quality of beliefs, like their
justification and rationality. They distinguish between deontic norms,
which prescribe what people should believe, and axiological norms, which identify the goals and values of beliefs. Epistemic norms are closely linked to intellectual or epistemic virtues, which are character traits like open-mindedness and conscientiousness.
Epistemic virtues help individuals form true beliefs and acquire
knowledge. They contrast with epistemic vices and act as foundational
concepts of virtue epistemology.
Epistemologists understand evidence
for a belief as information that favors or supports it. They
conceptualize evidence primarily in terms of mental states, such as
sensory impressions or other known propositions. But in a wider sense,
it can also include physical objects, like bloodstains examined by forensic analysts or financial records studied by investigative journalists. Evidence is often understood in terms of probability: evidence for a belief makes it more likely that the belief is true. A defeater is evidence against a belief or evidence that undermines another piece of evidence. For instance, witness testimony linking a suspect to a crime is evidence of their guilt, while an alibi is a defeater. Evidentialists
analyze justification in terms of evidence by asserting that for a
belief to be justified, it needs to rest on adequate evidence.
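The probabilistic reading of evidence described above can be sketched in a few lines of Python. The crime scenario follows the text's example, but all numbers and likelihoods here are illustrative assumptions, not claims from any actual case: evidence for a belief raises its probability, while a defeater lowers it again.

```python
# Sketch of evidence as probability-raising, with made-up numbers.
# E counts as evidence for hypothesis H when P(H | E) > P(H).

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.1  # initial credence that the suspect is guilty (illustrative)

# Witness testimony is likelier if the suspect is guilty, so it raises the credence:
after_testimony = posterior(prior, 0.8, 0.2)
print(after_testimony > prior)  # True: the testimony is evidence of guilt

# A defeater, such as an alibi, is likelier if the suspect is innocent,
# so conditioning on it lowers the credence again:
after_alibi = posterior(after_testimony, 0.1, 0.6)
print(after_alibi < after_testimony)  # True: the alibi defeats the testimony
```

The same arithmetic captures why a defeater need not refute a belief outright: it only shifts the probability back down, possibly leaving some residual credence.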
The presence of evidence usually affects doubt and certainty,
which are subjective attitudes toward propositions that differ
regarding their level of confidence. Doubt involves questioning the
validity or truth of a proposition. Certainty, by contrast, is a strong
affirmative conviction, indicating an absence of doubt about the
proposition's truth. Doubt and certainty are central to ancient Greek
skepticism and its goal of establishing that no belief is immune to
doubt. They are also crucial in attempts to find a secure foundation of
all knowledge, such as René Descartes' foundationalist epistemology.
While propositional knowledge is the main topic in epistemology, some theorists focus on understanding
instead. Understanding is a more holistic notion that involves a wider
grasp of a subject. To understand something, a person requires awareness
of how different things are connected and why they are the way they
are. For example, knowledge of isolated facts memorized from a textbook
does not amount to understanding. According to one view, understanding
is a unique epistemic good that, unlike propositional knowledge, is
always intrinsically valuable. Wisdom
is similar in this regard and is sometimes considered the highest
epistemic good. It encompasses a reflective understanding with practical
applications, helping people grasp and evaluate complex situations and
lead a good life.
In epistemology, knowledge ascription is the act of attributing
knowledge to someone, expressed in sentences like "Sarah knows that it
will rain today". According to invariantism, knowledge ascriptions have fixed standards across different contexts. Contextualists,
by contrast, argue that knowledge ascriptions are context-dependent.
From this perspective, Sarah may know about the weather in the context
of an everyday conversation even though she is not sufficiently informed
to know it in the context of a rigorous meteorological debate. Contrastivism,
another view, argues that knowledge ascriptions are comparative,
meaning that to know something involves distinguishing it from relevant
alternatives. For example, if a person spots a bird in the garden, they
may know that it is a sparrow rather than an eagle, but they may not
know that it is a sparrow rather than an indistinguishable sparrow
hologram.
Philosophical skepticism
questions the human ability to attain knowledge by challenging the
foundations upon which knowledge claims rest. Some skeptics limit their
criticism to specific domains of knowledge. For example, religious skeptics
say that it is impossible to know about the existence of deities or the
truth of other religious doctrines. Similarly, moral skeptics challenge
the existence of moral knowledge and metaphysical skeptics say that
humans cannot know ultimate reality. External world skepticism questions knowledge of external facts, whereas skepticism about other minds doubts knowledge of the mental states of others.
Global skepticism is the broadest form of skepticism, asserting that there is no knowledge in any domain. In ancient philosophy, this view was embraced by academic skeptics, whereas Pyrrhonian skeptics recommended the suspension of belief to attain tranquility. Few epistemologists have explicitly defended global skepticism. The
influence of this position stems from attempts by other philosophers to
show that their theory overcomes the challenge of skepticism. For
example, René Descartes used methodological doubt to find facts that cannot be doubted.
One consideration in favor of global skepticism is the dream argument.
It starts from the observation that, while people are dreaming, they
are usually unaware of this. This inability to distinguish between dream
and regular experience is used to argue that there is no certain
knowledge since a person can never be sure that they are not dreaming. Some critics assert that global skepticism is self-refuting
because denying the existence of knowledge is itself a knowledge claim.
Another objection says that the abstract reasoning leading to
skepticism is not convincing enough to overrule common sense.
Fallibilism is another response to skepticism. Fallibilists agree with skeptics that absolute certainty is impossible.
They reject the assumption that knowledge requires absolute certainty,
leading them to the conclusion that fallible knowledge exists. They emphasize the need to keep an open and inquisitive mind,
acknowledging that doubt can never be fully excluded, even for
well-established knowledge claims like thoroughly tested scientific
theories.
Epistemic relativism is related to skepticism but differs in that
it does not question the existence of knowledge in general. Instead,
epistemic relativists only reject the notion of universal epistemic
standards or absolute principles that apply equally to everyone. This
means that what a person knows depends on subjective criteria or social
conventions used to assess epistemic status.
The debate between empiricism and rationalism centers on the origins of human knowledge. Empiricism emphasizes that sense experience is the primary source of all knowledge. Some empiricists illustrate this view by describing the mind as a blank slate
that only develops ideas about the external world through the sense
data received from the sensory organs. According to them, the mind can
attain various additional insights by comparing impressions, combining
them, generalizing to form more abstract ideas, and deducing new
conclusions from them. Empiricists say that all these mental operations
depend on sensory material and do not function on their own.
Even though rationalists usually accept sense experience as one source of knowledge, they argue that certain forms of knowledge are directly accessed through reason without sense experience, like knowledge of mathematical and logical truths. Some forms of rationalism state that the mind possesses inborn ideas, accessible without sensory assistance. Others assert that there is an additional cognitive faculty, sometimes called rational intuition, through which people acquire nonempirical knowledge. Some rationalists limit their discussion to the origin of concepts, saying that the mind relies on inborn categories to understand the world and organize experience.
[Diagram of foundationalism, coherentism, and infinitism, with arrows symbolizing support between beliefs. According to foundationalism, some basic beliefs are justified without support from other beliefs. According to coherentism, justification requires that beliefs mutually support each other. According to infinitism, justification requires that beliefs form infinite support chains.]
Foundationalists and coherentists disagree about the structure of knowledge. Foundationalism distinguishes between basic and non-basic beliefs. A
belief is basic if it is justified directly, meaning that its validity
does not depend on the support of other beliefs. A belief is non-basic if it is justified by another belief. For example, the belief that it rained last night is a non-basic belief
if it is inferred from the observation that the street is wet. According to foundationalism, basic beliefs are the foundation on which
all other knowledge is built while non-basic beliefs act as the
superstructure resting on this foundation.
Coherentists reject the distinction between basic and non-basic
beliefs, saying that the justification of any belief depends on other
beliefs. They assert that a belief must align with other beliefs to
amount to knowledge. This occurs when beliefs are consistent and support
each other. According to coherentism, justification is a holistic aspect determined by the whole system of beliefs, which resembles an interconnected web.
Foundherentism
is an intermediary position combining elements of both foundationalism
and coherentism. It accepts the distinction between basic and non-basic
beliefs while asserting that the justification of non-basic beliefs
depends on coherence with other beliefs.
Infinitism
presents a less common alternative perspective on the structure of
knowledge. It agrees with coherentism that there are no basic beliefs
while rejecting the view that beliefs can support each other in a circular manner.
Instead, it argues that beliefs form infinite justification chains, in
which each link of the chain supports the belief following it and is
supported by the belief preceding it.
[Image: Alvin Goldman, an influential defender of externalism.]
The disagreement between internalism and externalism is about the sources of justification. Internalists say that justification depends only on factors within the
individual, such as perceptual experience, memories, and other beliefs.
This view emphasizes the importance of the cognitive perspective of the
individual in the form of their mental states. It is commonly associated
with the idea that the relevant factors are accessible, meaning that
the individual can become aware of their reasons for holding a justified
belief through introspection and reflection.
Evidentialism is an influential internalist view, asserting that justification depends on the possession of evidence. In this context, evidence for a belief is any information in the
individual's mind that supports the belief. For example, the perceptual
experience of rain is evidence for the belief that it is raining.
Evidentialists suggest various other forms of evidence, including
memories, intuitions, and other beliefs. According to evidentialism, a belief is justified if the individual's
evidence supports it and they hold the belief on the basis of this
evidence.
Externalism, by contrast, asserts that at least some relevant factors of knowledge are external to the individual. For instance, when considering the belief that a cup of coffee stands
on the table, externalists are not primarily interested in the
subjective perceptual experience that led to this belief. Instead, they
focus on objective factors, like the quality of the person's eyesight,
their ability to differentiate coffee from other beverages, and the
circumstances under which they observed the cup. A key motivation of many forms of externalism is that justification
makes it more likely that a belief is true. Based on this view,
justification is external to the extent that some factors contributing
to this likelihood are not part of the believer's cognitive perspective.
Reliabilism is an externalist theory asserting that a reliable connection between belief and truth is required for justification. Some reliabilists explain this in terms of reliable processes.
According to this view, a belief is justified if it is produced by a
reliable process, like perception. A belief-formation process is deemed
reliable if most of the beliefs it generates are true. An alternative
view focuses on beliefs rather than belief-formation processes, saying
that a belief is justified if it is a reliable indicator of the fact it
presents. This means that the belief tracks the fact: the person
believes it because it is true but would not believe it otherwise.
Virtue epistemology,
another type of externalism, asserts that a belief is justified if it
manifests intellectual virtues. Intellectual virtues are capacities or
traits that perform cognitive functions and help people form true
beliefs. Suggested examples include faculties, like vision, memory, and
introspection, and character traits, like open-mindedness.
Branches and approaches
Some branches of epistemology are characterized by their research methods. Formal epistemology employs formal tools from logic and mathematics to investigate the nature of knowledge. For example, Bayesian epistemology represents beliefs as degrees of certainty and uses probability theory to formally define norms of rationality governing how certain people should be. Experimental epistemologists base their research on empirical evidence about common knowledge practices. Applied epistemology
focuses on the practical application of epistemological principles to
diverse real-world problems, like the reliability of knowledge claims on
the internet, how to assess sexual assault allegations, and how racism may lead to epistemic injustice. Metaepistemologists study the nature, goals, and research methods of epistemology. As a metatheory, metaepistemology does not directly advocate for specific epistemological theories but examines their fundamental concepts and background assumptions.
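The Bayesian approach mentioned above can be illustrated with a minimal sketch. The worlds, propositions, and credence values here are invented for the example; the point is only to show the two norms Bayesian epistemology typically imposes: credences must obey the probability axioms (coherence), and learning proceeds by conditionalization.

```python
# A toy Bayesian agent: credences are a probability distribution over
# possible worlds, and a proposition is the set of worlds where it holds.
worlds = ["rain-warm", "rain-cold", "dry-warm", "dry-cold"]
credence = {"rain-warm": 0.2, "rain-cold": 0.3, "dry-warm": 0.4, "dry-cold": 0.1}

def cr(prop):
    """Credence in a proposition: summed credence of the worlds where it holds."""
    return sum(credence[w] for w in prop)

rain = {"rain-warm", "rain-cold"}
warm = {"rain-warm", "dry-warm"}

# Coherence norm: credences satisfy the probability axioms, for example
# cr(A or B) = cr(A) + cr(B) - cr(A and B).
assert abs(cr(rain | warm) - (cr(rain) + cr(warm) - cr(rain & warm))) < 1e-12

def conditionalize(evidence):
    """Updating norm: move all credence onto the evidence worlds, renormalized."""
    total = cr(evidence)
    return {w: (credence[w] / total if w in evidence else 0.0) for w in worlds}

updated = conditionalize(rain)  # the agent learns that it is raining
print(updated["rain-warm"])     # 0.2 / 0.5 = 0.4
```

Representing propositions as sets of worlds is a standard modeling choice in formal epistemology; it makes logical operations on propositions coincide with set operations.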
Particularism and generalism disagree about the right method of conducting epistemological research.
Particularists start their inquiry by looking at specific cases. For
example, to find a definition of knowledge, they rely on their
intuitions about concrete instances of knowledge and particular thought
experiments. They use these observations as methodological constraints
that any theory of general principles needs to follow. Generalists
proceed in the opposite direction. They prioritize general epistemic
principles, saying that it is not possible to accurately identify and
describe specific cases without a grasp of these principles. Other methods in contemporary epistemology aim to extract philosophical insights from ordinary language or look at the role of knowledge in making assertions and guiding actions.
Phenomenological
epistemology emphasizes the importance of first-person experience. It
distinguishes between the natural and the phenomenological attitudes.
The natural attitude focuses on objects belonging to common sense and
natural science. The phenomenological attitude focuses on the experience
of objects and aims to provide a presuppositionless description of how
objects appear to the observer.
Naturalized epistemology is closely associated with the natural sciences,
relying on their methods and theories to examine knowledge. Arguing
that epistemological theories should rest on empirical observation, it
is critical of a priori reasoning. Evolutionary epistemology is a naturalistic approach that understands cognition as a product of evolution, examining knowledge and the cognitive faculties responsible for it through the lens of natural selection. Social epistemology
focuses on the social dimension of knowledge. While traditional
epistemology is mainly interested in the knowledge possessed by
individuals, social epistemology covers knowledge acquisition,
transmission, and evaluation within groups, with specific emphasis on
how people rely on each other when seeking knowledge.
Pragmatist
epistemology is a form of fallibilism that emphasizes the close
relation between knowing and acting. It sees the pursuit of knowledge as
an ongoing process guided by common sense and experience while always
open to revision. This approach reinterprets some core epistemological
notions, for example, by conceptualizing beliefs as habits that shape
actions rather than representations that mirror the world. Motivated by pragmatic considerations, epistemic conservatism is a view about belief revision.
It prioritizes pre-existing beliefs, asserting that a person should
only change their beliefs if they have a good reason to. One argument
for epistemic conservatism rests on the recognition that the cognitive
resources of humans are limited, making it impractical to constantly
reexamine every belief.
Postmodern epistemology critiques the conditions of knowledge in advanced societies. This concerns in particular the metanarrative of a constant progress of scientific knowledge leading to a universal and foundational understanding of reality. Similarly, feminist epistemology adopts a critical perspective, focusing on the effect of gender
on knowledge. Among other topics, it explores how preconceptions about
gender influence who has access to knowledge, how knowledge is produced,
and which types of knowledge are valued in society. Some postmodern and feminist thinkers adopt a constructivist
approach, arguing that the way people view the world is not a simple
reflection of external reality but a social construction. This view
emphasizes the creative role of interpretation while undermining
objectivity since social constructions can vary across societies. Another critical approach, found in decolonial scholarship, opposes the
global influence of Western knowledge systems. It seeks to undermine
Western hegemony and decolonize knowledge.
The decolonial outlook is also present in African epistemology. Grounded in African ontology, it emphasizes the interconnectedness of reality as a continuum between knowing subject and known object. It understands knowledge as a holistic
phenomenon that includes sensory, emotional, intuitive, and rational
aspects, extending beyond the limits of the physical domain.
Another epistemological tradition is found in ancient Indian philosophy. Its diverse schools of thought examine different sources of knowledge, called pramāṇa. Perception, inference, and testimony
are sources discussed by most schools. Other sources, considered only by some schools, are non-perception, which leads to knowledge of absences, and presumption. Buddhist epistemology focuses on immediate experience, understood as the presentation of unique particulars without secondary cognitive processes, like thought and desire. Nyāya
epistemology is a causal theory of knowledge, understanding sources of
knowledge as reliable processes that cause episodes of truthful
awareness. It sees perception as the primary source of knowledge and
emphasizes its importance for successful action. Mīmāṃsā epistemology considers the holy scriptures known as the Vedas as a key source of knowledge, addressing the problem of their right interpretation. Jain epistemology states that reality is many-sided, meaning that no single viewpoint can capture the entirety of truth.
Historical epistemology
examines how the understanding of knowledge and related concepts has
changed over time. It asks whether the main issues in epistemology are
perennial and to what extent past epistemological theories are relevant
to contemporary debates. It is particularly concerned with scientific
knowledge and practices associated with it. It contrasts with the history of epistemology, which presents,
reconstructs, and evaluates epistemological theories of philosophers in
the past.
Knowledge in particular domains
Some branches of epistemology focus on knowledge within specific academic disciplines. The epistemology of science
examines how scientific knowledge is generated and what problems arise
in the process of validating, justifying, and interpreting scientific
claims. A key issue concerns the problem of how individual observations can support universal scientific laws. Other topics include the nature of scientific evidence and the aims of science. The epistemology of mathematics studies the origin of mathematical
knowledge. In exploring how mathematical theories are justified, it
investigates the role of proofs and whether there are empirical sources
of mathematical knowledge.
Distinct areas of epistemology are dedicated to specific sources of knowledge. Examples are the epistemology of perception, the epistemology of memory, and the epistemology of testimony. In the epistemology of perception, direct and indirect realists
debate the connection between the perceiver and the perceived object.
Direct realists say that this connection is direct, meaning that there
is no difference between the object present in perceptual experience and
the physical object causing this experience. According to indirect
realism, the connection is indirect, involving mental entities, like
ideas or sense data, that mediate between the perceiver and the external
world. The contrast between direct and indirect realism is important
for explaining the nature of illusions.
Epistemological issues are found in most areas of philosophy. The epistemology of logic examines how people know that an argument is valid. For example, it explores how logicians justify that modus ponens is a correct rule of inference or that all contradictions are false. Epistemologists of metaphysics investigate whether knowledge of the basic structure of reality is possible and what sources this knowledge could have. Knowledge of moral statements, like the claim that lying is wrong, belongs to the epistemology of ethics. It studies the role of ethical intuitions, coherence among moral beliefs, and the problem of moral disagreement. The ethics of belief is a closely related field exploring the intersection of epistemology and ethics. It examines the norms governing belief formation and asks whether violating them is morally wrong. Religious epistemology
studies the role of knowledge and justification for religious doctrines
and practices. It evaluates the reliability of evidence from religious experience and holy scriptures while also asking whether the norms of reason should be applied to religious faith.
Epistemologists of language explore the nature of linguistic
knowledge. One of their topics is the role of tacit knowledge, for
example, when native speakers have mastered the rules of grammar but are unable to explicitly articulate them. Epistemologists of modality examine knowledge about what is possible and necessary. Epistemic problems that arise when two people have diverging opinions
on a topic are covered by the epistemology of disagreement. Epistemologists of ignorance are interested in epistemic faults and gaps in knowledge.
Related fields
Epistemology and psychology
were not defined as distinct fields until the 19th century; earlier
investigations about knowledge often do not fit neatly into today's
academic categories. Both contemporary disciplines study beliefs and the mental processes
responsible for their formation and change. One key contrast is that
psychology describes what beliefs people have and how they acquire them,
thereby explaining why someone has a specific belief. The focus of
epistemology is on evaluating beliefs, leading to a judgment about
whether a belief is justified and rational in a particular case. Epistemology also shares a close connection with cognitive science, which understands mental events as processes that transform information. Artificial intelligence relies on the insights of epistemology and cognitive science to implement concrete solutions to problems associated with knowledge representation and automatic reasoning.
Logic
is the study of correct reasoning. For epistemology, it is relevant to
inferential knowledge, which arises when a person reasons from one known
fact to another. This is the case, for example, when inferring that it rained based on the observation that the streets are wet. Whether an inferential belief amounts to knowledge depends on the form of reasoning used, in particular, that the process does not violate the laws of logic. Another overlap between the two fields is found in the epistemic approach to fallacies. Fallacies are faulty arguments based on incorrect reasoning. The epistemic approach to fallacies explains why they are faulty,
stating that arguments aim to expand knowledge. According to this view,
an argument is a fallacy if it fails to do so. A further intersection is found in epistemic logic, which uses formal logical devices to study epistemological concepts like knowledge and belief.
Both decision theory
and epistemology are interested in the foundations of rational thought
and the role of beliefs. Unlike many approaches in epistemology, the
main focus of decision theory lies less in the theoretical and more in
the practical side, exploring how beliefs are translated into action. Decision theorists examine the reasoning involved in decision-making and the standards of good decisions, identifying beliefs as a central aspect of decision-making. One of
their innovations is to distinguish between weaker and stronger beliefs,
which helps them consider the effects of uncertainty on decisions.
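The role that weaker and stronger beliefs play in decisions can be sketched with the standard decision-theoretic rule of maximizing expected utility. The umbrella scenario and all payoff numbers below are invented for illustration.

```python
# Degrees of belief (credences) feeding into a decision via expected utility.

def expected_utility(credences, utilities):
    """Utility in each possible state, weighted by the credence in that state."""
    return sum(credences[state] * utilities[state] for state in credences)

credence = {"rain": 0.3, "dry": 0.7}           # weaker vs. stronger beliefs
utility_umbrella = {"rain": 5, "dry": 2}        # illustrative payoffs
utility_no_umbrella = {"rain": -10, "dry": 6}

eu_with = expected_utility(credence, utility_umbrella)        # 0.3*5 + 0.7*2 = 2.9
eu_without = expected_utility(credence, utility_no_umbrella)  # 0.3*-10 + 0.7*6 = 1.2
print("take umbrella" if eu_with > eu_without else "leave it")  # take umbrella
```

Here the mere 0.3 credence in rain is enough to tip the decision, because the downside of being caught without an umbrella is large; this is how uncertainty, represented as graded belief, shapes rational action.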
Epistemology and education
have a shared interest in knowledge, with one difference being that
education focuses on the transmission of knowledge, exploring the roles
of both learner and teacher. Learning theory examines how people acquire knowledge. Behavioral learning theories explain the process in terms of behavior changes, for example, by associating a certain response with a particular stimulus. Cognitive learning theories study how the cognitive processes that affect knowledge acquisition transform information. Pedagogy looks at the transmission of knowledge from the teacher's perspective, exploring the teaching methods they may employ. In teacher-centered methods, the teacher serves as the main authority
delivering knowledge and guiding the learning process. In student-centered methods, the teacher primarily supports and facilitates the learning process, allowing students to take a more active role. The beliefs students have about knowledge, called personal epistemology, influence their intellectual development and learning success.
The anthropology
of knowledge examines how knowledge is acquired, stored, retrieved, and
communicated. It studies the social and cultural circumstances that
affect how knowledge is reproduced and changes, covering the role of
institutions like university departments and scientific journals as well
as face-to-face discussions and online communications. This field has a
broad concept of knowledge, encompassing various forms of understanding
and culture, including practical skills. Unlike epistemology, it is not
interested in whether a belief is true or justified but in how
understanding is reproduced in society. A closely related field, the sociology of knowledge
has a similar conception of knowledge. It explores how physical,
demographic, economic, and sociocultural factors impact knowledge. This
field examines in what sociohistorical contexts knowledge emerges and
the effects it has on people, for example, how socioeconomic conditions
are related to the dominant ideology in a society.
History
Early reflections on the nature and sources of knowledge are found in ancient history. In ancient Greek philosophy, Plato (427–347 BCE) studied what knowledge is, examining how it differs from true opinion by being based on good reasons. He proposed that learning is a form of recollection in which the soul remembers what it already knew but had forgotten. Plato's student Aristotle
(384–322 BCE) was particularly interested in scientific knowledge,
exploring the role of sensory experience and the process of making
inferences from general principles. Aristotle's ideas influenced the Hellenistic schools of philosophy, which began to arise in the 4th century BCE and included Epicureanism, Stoicism, and skepticism. The Epicureans had an empiricist outlook, stating that sensations are always accurate and act as the supreme standard of judgments. The Stoics defended a similar position but confined their trust to lucid and specific sensations, which they regarded as true. The skeptics questioned whether knowledge is possible, recommending instead suspension of judgment to attain a state of tranquility. Emerging in the 3rd century CE and inspired by Plato's philosophy, Neoplatonism distinguished knowledge from true belief, arguing that knowledge is infallible and limited to the realm of immaterial forms.
[Image: The Buddhist philosopher Dharmakirti, who developed a causal theory of knowledge.]
The Upanishads, philosophical scriptures composed in ancient India between 700 and 300 BCE, examined how people acquire knowledge, including the role of introspection, comparison, and deduction. In the 6th century BCE, the school of Ajñana developed a radical skepticism questioning the possibility and usefulness of knowledge. By contrast, the school of Nyaya,
which emerged around 200 CE, asserted that knowledge is possible. It
provided a systematic treatment of how people acquire knowledge,
distinguishing between valid and invalid sources. When Buddhist philosophers became interested in epistemology, they relied on concepts developed in Nyaya and other traditions. Buddhist philosopher Dharmakirti (6th or 7th century CE) analyzed the process of knowing as a series of causally related events.
Ancient Chinese philosophers
understood knowledge as an interconnected phenomenon fundamentally
linked to ethical behavior and social involvement. Many saw wisdom as
the goal of attaining knowledge. Mozi
(470–391 BCE) proposed a pragmatic approach to knowledge using
historical records, sensory evidence, and practical outcomes to validate
beliefs. Mencius (c. 372–289 BCE) explored analogical reasoning as a source of knowledge and employed this method to criticize Mozi. Xunzi (c. 310–220 BCE)
aimed to combine empirical observation and rational inquiry. He
emphasized the importance of clarity and standards of reasoning without
excluding the role of feeling and emotion.
The relation between reason and faith was a central topic in the medieval period. In Arabic–Persian philosophy, al-Farabi (c. 870–950) and Averroes (1126–1198) discussed how philosophy and theology interact, debating which one is a better vehicle to truth. Al-Ghazali (c. 1056–1111) criticized many core teachings of previous Islamic philosophers, saying that they relied on unproven assumptions that did not amount to knowledge. Similarly in Western philosophy, Anselm of Canterbury (1033–1109) proposed that theological teaching and philosophical inquiry are in harmony and complement each other. Formulating a more critical approach, Peter Abelard (1079–1142) argued against unquestioned theological authorities and said that all things are open to rational doubt. Influenced by Aristotle, Thomas Aquinas (1225–1274) developed an empiricist theory, stating that "nothing is in the intellect unless it first appeared in the senses". According to an early form of direct realism proposed by William of Ockham (c. 1285–1349), perception of mind-independent objects happens directly without intermediaries. Meanwhile, in 14th-century India, Gaṅgeśa developed a reliabilist theory of knowledge and considered the problems of testimony and fallacies. In China, Wang Yangming
(1472–1529) explored the unity of knowledge and action, holding that
moral knowledge is inborn and can be attained by overcoming
self-interest.
The course of modern philosophy was shaped by René Descartes
(1596–1650), who stated that philosophy must begin from a position of
indubitable knowledge of first principles. Inspired by skepticism, he
aimed to find absolutely certain knowledge by doubting all beliefs until he arrived at truths that
cannot be doubted. He thought that this is the case for the assertion "I think, therefore I am", from which he constructed the rest of his philosophical system. Descartes, together with Baruch Spinoza (1632–1677) and Gottfried Wilhelm Leibniz (1646–1716), belonged to the school of rationalism, which asserts that the mind possesses innate ideas independent of experience. John Locke (1632–1704) rejected this view in favor of an empiricism according to which the mind is a blank slate.
This means that all ideas depend on experience, either as "ideas of
sense", which are directly presented through the senses, or as "ideas of
reflection", which the mind creates by reflecting on its own
activities. David Hume
(1711–1776) used this idea to explore the limits of what people can
know. He said that knowledge of facts is never certain, adding that
knowledge of relations between ideas, like mathematical truths, can be
certain but contains no information about the world. Immanuel Kant
(1724–1804) sought a middle ground between rationalism and empiricism
by identifying a type of knowledge overlooked by Hume. For Kant, this
knowledge pertains to principles that underlie and structure all
experience, such as spatial and temporal relations and fundamental categories of understanding.
In the 19th century, influenced by Kant's philosophy, Georg Wilhelm Friedrich Hegel
(1770–1831) rejected empiricism by arguing that sensory impressions
alone cannot amount to knowledge since all knowledge is actively
structured by the knowing subject. John Stuart Mill (1806–1873), by contrast, defended a wide-sweeping form of empiricism and explained knowledge of general truths through inductive reasoning. Charles Peirce (1839–1914) thought that all knowledge is fallible, emphasizing that knowledge seekers should remain open to revising their beliefs in light of new evidence. He used this idea to argue against Cartesian foundationalism, which seeks absolutely certain truths.
In the 20th century, fallibilism was further explored by J. L. Austin (1911–1960) and Karl Popper (1902–1994). In continental philosophy, Edmund Husserl (1859–1938) applied the skeptical idea of suspending judgment to the study of experience. By not judging whether an experience is accurate, he tried to describe its internal structure instead. Influenced by earlier empiricists, logical positivists, like A. J. Ayer (1910–1989), said that all knowledge is either empirical or analytic, rejecting any form of metaphysical knowledge. Bertrand Russell (1872–1970) developed an empiricist sense-datum theory, distinguishing between direct knowledge by acquaintance of sense data and indirect knowledge by description, which is inferred from knowledge by acquaintance. Common sense had a central place in G. E. Moore's
(1873–1958) epistemology. He used trivial observations, like the fact
that he has two hands, to argue against abstract philosophical theories
that deviate from common sense. Ordinary language philosophy, as practiced by the late Ludwig Wittgenstein (1889–1951), is a similar approach that tries to extract epistemological insights from how ordinary language is used.
Edmund Gettier (1927–2021) conceived counterexamples
against the idea that knowledge is justified true belief. These
counterexamples prompted many philosophers to suggest alternative definitions of knowledge. Developed by philosophers such as Alvin Goldman (1938–2024), reliabilism
emerged as one of the alternatives, asserting that knowledge requires
reliable sources and shifting the focus away from justification. Virtue epistemologists, such as Ernest Sosa (born 1940) and Linda Zagzebski
(born 1946), analyze belief formation in terms of the intellectual
virtues or cognitive competencies involved in the process. Naturalized epistemology, as conceived by Willard Van Orman Quine (1908–2000), employs concepts and ideas from the natural sciences to formulate its theories. Other developments in late 20th-century epistemology were the emergence of social, feminist, and historical epistemology.