Joseph Noel Paton, Dante Meditating the Episode of Francesca da Rimini and Paolo Malatesta
Imagination is the production of sensations, feelings and thoughts informing oneself. These experiences can be re-creations of past experiences, such as
vivid memories with imagined changes, or completely invented and
possibly fantastic scenes. Imagination helps apply knowledge to solve problems and is fundamental to integrating experience and the learning process.
Imagination is the process of developing theories and ideas based
on the functioning of the mind through a creative division. Drawing
from actual perceptions, imagination employs intricate conditional
processes that engage both semantic and episodic memory to generate new or refined ideas. This part of the mind helps develop better and easier ways to accomplish tasks, whether old or new.
A way to train imagination is by listening to and practicing storytelling (narrative), wherein imagination is expressed through stories and writings such as fairy tales, fantasies, and science fiction. When children develop their imagination, they often exercise it through pretend play, using role-playing to act out what they have imagined and then carrying the play forward by acting as if their make-believe scenarios were actual reality.
Etymology
The
English word "imagination" originates from the Latin term "imaginatio,"
which is the standard Latin translation of the Greek term "phantasia."
The Latin term also translates to "mental image"
or "fancy." The use of the word "imagination" in English can be traced
back to the mid-14th century, referring to a faculty of the mind that
forms and manipulates images.
Definition
In modern philosophical understanding, imagination is commonly seen as a faculty for creating mental images and for making non-rational, associative transitions among these images.
One view of imagination links it to cognition, suggesting that imagination is a cognitive process in mental functioning. It is also associated with rational thinking
in a way that both imaginative and rational thoughts involve the
cognitive process that "underpins thinking about possibilities". However, imagination is not considered to be purely a cognitive
activity because it is also linked to the body and place. It involves
setting up relationships with materials and people, precluding the
notion that imagination is confined to the mind.
The psychological view of imagination relates this concept to a
cognate term, "mental imagery," which denotes the process of reviving in
the mind recollections of objects previously given in sense perception. Since this use of the term conflicts with that of ordinary language, some psychologists prefer to describe this process as "imaging" or "imagery" or to speak of it as "reproductive" as opposed to "productive" or "constructive" imagination. Constructive imagination is further divided into voluntary imagination driven by the lateral prefrontal cortex (LPFC), such as mental rotation, and involuntary imagination (LPFC-independent), such as REM-sleep dreaming, daydreaming, hallucinations, and spontaneous insight. In clinical settings, clinicians increasingly use visual imagery in the psychological treatment of anxiety disorders, depression, schizophrenia and Parkinson's disease.
Conceptual history
Ancient
Ancient Greek philosophers conceived imagination, or "phantasia," as working with "pictures" in the sense of mental images. Aristotle, in his work De Anima, identified imagination as a faculty that enables an image to occur within us, a definition associating imagination with a broad range of activities involved in thoughts, dreams, and memories.
In Philebus, Plato discusses daydreaming and considers imagination about the future as the work of a painter within the soul. However, Plato
portrayed this painter as an illustrator rather than a creator,
reflecting his view of imagination as a representational rather than an
inventive faculty.
Greek philosophers typically distinguished imagination from perception
and rational thinking: "For imagination is different from either
perceiving or discursive thinking, though it is not found without
sensation, or judgement without it" (De Anima, iii 3). Aristotle viewed imagination as a faculty that mediates between the senses and intellect. The mental images it manipulates, whether arising from visions, dreams
or sensory perception, were thought to be transmitted through the lower
parts of the soul, suggesting that these images could be influenced by
emotions and primal desires, thereby confusing the judgement of the intellect.
Middle Ages
In the Middle Ages, the concept of imagination encompassed domains such as religion, literature, artwork, and notably, poetry. Men of science often recognized poets as "imaginative," viewing imagination as the mental faculty that specifically permitted poetry writing. This association, they suggested, lies in the capacity of imagination
for image-making and image-forming, which results in a sense of
"visualizing" with "the inner eye."
"That
oon of hem was blynd and myghte not see, / But it were with thilke eyen
of his mynde / With whiche men seen, after that they ben blynde."
Medieval theories of faculty psychology posited imagination as a faculty of the internal senses (alongside memory and common sense): imagination receives mental images from memory or perception, organizes them, and transmits them to the reasoning faculties, providing the intellect with sense data. In this way, it enables the reshaping of images from sense perception (even in the absence of perception, such as in dreams), performing a filtering function of reality.
Medieval
paintings of imaginary creatures, as seen in frescos and manuscripts,
often combined body parts of different animals, and even humans.
Although not credited with the capacity for creation, imagination was thought to combine images received from memory or perception in creative ways, allowing for the invention of novel concepts or expressions. For example, it could fuse images of "gold" and "mountain" to produce the idea of a "golden mountain."
In medieval artistic works, imagination served the role of combining
images of perceivable things to portray legendary, mysterious, or
extraordinary creatures. This can be seen in the depiction of a Mongol in the Grandes Chroniques de France (1241), as well as in the portrayal of angels, demons, hell, and the apocalypse in Christian religious paintings.
Renaissance and early modern
The Renaissance saw the revival of classical texts and the celebration of man's dignity, yet scholars of the time did not significantly contribute to the conceptual understanding of "imagination." Marsilio Ficino, for example, did not regard artistic creations such as painting, sculpture and poetry as privileged forms of human creativity, nor did he attribute creativity to the faculty of imagination. Instead, Ficino posited that imagination could be the vehicle through which divine intervention transmits insights in the form of images, which ultimately facilitates the creation of art.
Don Quixote, engrossed in reading books of chivalry.
Nevertheless, the groundwork laid by humanists made it easier for later thinkers to develop the connection between imagination and creativity. Early modern philosophers began to consider imagination as a trait or ability that an individual could possess. Miguel de Cervantes, influenced by Spanish physician and philosopher Juan Huarte de San Juan, crafted the iconic character Don Quixote, who epitomized Huarte's idea of "wits full of invention." This type of wit was thought to be typically found in individuals for
whom imagination was the most prominent component of their "ingenium" (Spanish: ingenio; a term close in meaning to "intellect").
Early modern
philosophers also started to acknowledge imagination as an active,
cognitive faculty, although it was principally seen as a mediator
between sense perception (Latin: sensus) and pure understanding (Latin: intellectio pura). René Descartes, in Meditations on First Philosophy
(1641), interpreted imagination as a faculty actively focusing on
bodies (corporeal entities) while being passively dependent on stimuli
from different senses. In the writing of Thomas Hobbes, imagination became a key element of human cognition.
In the 16th and 17th centuries, the connotations of "imagination" extended to many areas of early modern civic life. Juan Luis Vives noted the connection between imagination and rhetorical skill. Huarte extended this idea, linking imagination to any discipline that necessitates "figures, correspondence, harmony, and proportion," such as medical practice and the art of warfare. Additionally, Galileo used the concept of imagination to conduct thought experiments, such as asking readers to imagine the direction a stone released from a sling would fly.
Enlightenment and thereafter
By the Age of Enlightenment, philosophical discussions frequently linked the power of imagination with creativity, particularly in aesthetics. William Duff
was among the first to identify imagination as a quality of genius,
distinguishing it from talent by emphasizing that only genius is
characterized by creative innovation. Samuel Taylor Coleridge
distinguished between imagination expressing realities of an imaginal
realm above our mundane personal existence, and "fancy", or fantasy,
which represents the creativity of the artistic soul. In the Preliminary Discourse to the Encyclopedia of Diderot (French: Discours Préliminaire des Éditeurs), d'Alembert referred to imagination as the creative force for the Fine Arts.
Immanuel Kant, in his Critique of Pure Reason (German: Kritik der reinen Vernunft), viewed imagination (German: Einbildungskraft) as a faculty of intuition, capable of making "presentations," i.e., sensible representations of objects that are not directly present. Kant distinguished two forms of imagination: productive and
reproductive. Productive imagination functions as the original source of
the presentation of an object, thus preceding experience; while reproductive imagination generates presentations derived from past experiences, recalling empirical intuitions it previously had. Kant's treatise linked imagination to cognition, perception, aesthetic judgement, artistic creation, and morality.
The Kantian idea prepared the way for Fichte, Schelling and the Romantics to transform the philosophical understanding of imagination into an authentic creative force, associated with genius, inventive activity, and freedom. In the work of Hegel,
imagination, though not given as much importance as by his
predecessors, served as a starting point for the defense of Hegelian phenomenology. Hegel distinguished between a phenomenological account of imagination, which focuses on the lived experience and consciousness, and a scientific, speculative account, which seeks to understand the nature and function of imagination in a systematic and theoretical manner.
Modern
Between 1913 and 1916, Carl Jung developed the concept of "active imagination" and introduced it into psychotherapy. For Jung, active imagination often includes working with dreams and the creative self via imagination or fantasy. It is a meditation technique wherein the contents of one's unconscious are translated into images, narratives, or personified as separate entities, thus serving as a bridge between the conscious "ego" and the unconscious.
Albert Einstein famously said: "Imagination... is more important than knowledge. Knowledge is limited. Imagination encircles the world."
Nikola Tesla
described imagination as: "When I get an idea I start at once building
it up in my imagination. I change the construction, make improvements
and operate the device in my mind. It is absolutely immaterial to me
whether I run my turbine in thought or test it in my shop. I even note
if it is out of balance. There is no difference whatever, the results
are the same. In this way I am able to rapidly develop and perfect a
conception without touching anything."
The phenomenology of imagination is discussed in The Imaginary: A Phenomenological Psychology of the Imagination (French: L'Imaginaire: Psychologie phénoménologique de l'imagination), also published under the title The Psychology of the Imagination, a 1940 book by Jean-Paul Sartre. In this book, Sartre
propounded his concept of imagination, with imaginary objects being
"melanges of past impressions and recent knowledge," and discussed what
the existence of imagination shows about the nature of human consciousness. Based on Sartre's work, subsequent thinkers extended this idea into the realm of sociology, proposing ideas such as the imaginary and the ontology of imagination.
Cross-cultural
Imagination has been, and continues to be a well-acknowledged concept in many cultures, particularly within religious contexts, as an image-forming faculty of the mind. In Buddhist aesthetics, imagination plays a crucial role in religious practice, especially in visualization practices, which include the recollection of the Buddha's body, visualization of celestial Buddhas and Buddha-fields (Pure Lands and mandalas), and devotion to images.
In Zhuang Zi's Taoism, imagination is perceived as a complex mental activity that is championed as a vital form of cognition. It is defended on empathetic grounds but discredited by the rational intellect as only a presentation and fantasy.
Memory
Memory and mental imagery are two mental activities involved in the process of imagination, each influencing the other. Functional magnetic resonance imaging (fMRI) shows that remembering and imagining activate the same regions of the brain. Compared to the recall of common ideas, the generation of both new and old original ideas exhibits a similar activation pattern, particularly in the bilateral parahippocampal and medial prefrontal cortex (mPFC) regions. This suggests that the construction of new ideas relies on processes similar to those involved in the reconstruction of original ideas from episodic memory.
Imagination can also contribute to the formation of false memories.
For example, when participants read a description of being lost in a
shopping mall and were asked to write out and imagine the event, around
25% later recalled it as a real memory, despite it never having
occurred. This may be due to similar brain areas being involved in both imagining and remembering, particularly areas associated with visual imagery. An fMRI study found that participants who imagined objects after hearing verbal prompts sometimes later falsely remembered seeing them. This was linked to increased activity in the precuneus and inferior parietal cortex, suggesting that overlap between imagination and perception
may lead to memory distortions. Imagination has also been shown to
influence memory by increasing a person’s confidence that an imagined
event actually occurred, a process known as imagination inflation. When individuals vividly imagine an event they initially believe did
not happen, they begin to feel more certain that it did occur, even
without supporting evidence. In this way, imagination can blur the line
between real and imagined experiences, making it difficult to
distinguish between true and false memories.
Perception
Piaget posited that a person's perceptions
depend on their world view. The world view is the result of arranging
perceptions into existing imagery by imagination. Piaget cites the
example of a child saying that the moon is following her when she walks
around the village at night. In this way, perceptions are integrated into the world view so that they make sense. Imagination is needed to make
sense of perceptions.
Visual imagery involves a network of brain areas from the frontal cortex to sensory areas, overlapping with the default mode network, and can function much like a weak version of afferent perception.
A study that used fMRI
while subjects were asked to imagine precise visual figures, to
mentally disassemble them, or mentally blend them, showed activity in
the occipital, frontoparietal, posterior parietal, precuneus, and dorsolateral prefrontal regions of the subjects' brains.
Cognitive development in children
Imagination is crucial to children’s mental, emotional, and social development. Children often engage in pretend play, using their imagination to create and act out scenarios through role-playing, symbolic use of objects, and more. This can support the development of new cognitive structures and abilities by encouraging skills such as reflection, role-integration, language, and representation, which contribute to a deeper understanding of social relationships and perspectives. It also supports early reading development by helping children make
sense of texts, apply them to new contexts, and explore their meaning
through role-play and movement. This allows reading to become a more interactive process, improving
understanding in a child-centred way. Furthermore, research suggests
that pretend play is linked to the development of emotion regulation. Children who engage in pretend play, especially with caregivers, may show better emotion regulation skills, highlighting the broader benefits of imagination for social and emotional development. Similarly, imaginative play fosters executive function (EF), including both hot EF (related to emotions) and cool EF (related to cognitive information processing). Studies have shown that imaginative play not only strengthens these cognitive abilities but also contributes to the development of prosocial behaviors.
Decision-making
Imagination plays a key role in decision-making
by allowing individuals to mentally simulate different scenarios and
outcomes. Through imagination, people can explore potential consequences
of their choices, consider alternative paths, and assess risks without
directly experiencing them. This enhances problem-solving
skills and supports informed decisions by allowing individuals to
anticipate future outcomes and evaluate various possibilities. Imagination also plays a role in improving decision-making by encouraging greater patience.
It was found that when individuals were prompted to envision future
outcomes as part of a sequence, they tended to be more patient in their
choices. This effect has been linked to increased activity in brain regions
associated with imagination, suggesting that imagining future scenarios
can support more thoughtful decisions.
Mental health
Various studies have shown that imagination can play a role in well-being. Goal-directed imagination, where individuals mentally simulate achieving personal goals, has been shown to influence mental health. A study found that clearer, more detailed, and more positive goal-directed imagination was associated with higher well-being and fewer depressive symptoms. These findings suggest that encouraging goal-directed imagination could be a valuable tool in psychological interventions aimed at improving mental health. Similarly, imagery-based cognitive bias modification, an intervention that involves imagining positive outcomes, can enhance the vividness of positive future thinking, reduce negative affect and anxiety, and increase optimism in adults. One real-world example of using imagination in therapy to support mental health is imagery rescripting,
a technique that involves mentally revisiting and altering distressing
memories to reduce their emotional impact. This method encourages
individuals to reimagine a traumatic or negative event with a more
positive ending, which can help reduce symptoms of anxiety, depression, and PTSD. By changing the emotional tone of the memory through imagination,
patients often experience a greater sense of control and emotional
relief, making the original event feel less threatening.
However, while imagination can be a powerful tool for mental health interventions, it may also contribute to psychological distress when dysregulated. Disruptions in imaginative processes are common in schizophrenia spectrum disorders (SSDs) and may play a role in symptoms such as distorted self-perception and altered reality processing. Imagination has also been found to be closely linked to the sense of identity, and disturbances in embodiment may contribute to challenges in self-experience associated with these conditions. Maladaptive daydreaming (MD) is another example of how imagination can lead to distress when not regulated. Unlike ordinary daydreaming, MD is understood as a form of unusual imagination that is vivid and addictive, and that often involves fantasizing about an idealized self. Research has found that MD is associated with emotional and functional distress, highlighting the potential impact of excessive imagination.
Evolutionary theory
Phylogenesis and ontogenesis of various components of imagination
Phylogenetic acquisition of imagination was a gradual process. The simplest form of imagination, REM-sleep dreaming, evolved in mammals with acquisition of REM sleep 140 million years ago. Spontaneous insight improved in primates with acquisition of the lateral prefrontal cortex 70 million years ago. After hominins split from the chimpanzee line 6 million years ago they further improved their imagination. Prefrontal analysis was acquired 3.3 million years ago when hominins started to manufacture Mode One stone tools. Progress in stone tools culture to Mode Two stone tools by 2 million years ago signifies remarkable improvement of prefrontal analysis. The most advanced mechanism of imagination, prefrontal synthesis, was likely acquired by humans around 70,000 years ago and resulted in behavioral modernity. This leap toward modern imagination has been characterized by paleoanthropologists as the "Cognitive revolution", "Upper Paleolithic Revolution", and the "Great Leap Forward".
Moral imagination
Moral
imagination usually describes the mental capacity to find answers to
ethical questions and dilemmas through the process of imagination and visualization. Different definitions of "moral imagination" can be found in the literature.
The philosopher Mark Johnson
described it as "an ability to imaginatively discern various possibilities for acting in a given situation and to envision the potential help and harm that are likely to result from a given action."
In one proposed example, Claus von Stauffenberg, who attempted to assassinate Hitler, was said to have decided to dare to overthrow the Nazi regime as a result (among other factors) of a process of "moral imagination". His willingness to kill Hitler stemmed less from compassion for his comrades, his family, or friends living at that time than from thinking about the potential problems of later generations and people he did not know. In other words, through a process of moral imagination he was able
to become concerned for "abstract" people (for example, Germans of
later generations, people who were not yet alive, or people outside his
reach).
Artificial imagination
The research fields of artificial imagination traditionally include (artificial) visual and aural imagination, which extend to all actions involved in forming ideas, images, and concepts: activities
linked to imagination. Practitioners are also exploring topics such as
artificial visual memory, modeling and filtering content based on human emotions, and interactive search. Additionally, there is interest in how artificial imagination may evolve to create an artificial world comfortable enough for people to use as an escape from reality.
A subfield of artificial imagination that has attracted rising attention is artificial morals. Artificial intelligence faces challenges regarding responsibility for machines' mistakes or decisions, as well as the difficulty of creating machines with universally accepted moral rules. Recent research in artificial morals bypasses the strict definition of morality, instead using machine learning methods to train machines to imitate human morals. Because it draws on data about moral decisions from thousands of people, the trained moral model may nevertheless reflect widely accepted rules.
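As a rough illustration of this imitation strategy, the following is a minimal sketch in Python with scikit-learn; the situations, labels, and model choice are hypothetical assumptions of this example, not a documented system. A classifier is fit on crowd-labeled moral judgments and then asked for a verdict on an unseen case.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical crowd-sourced moral judgments: short situation descriptions
# labeled 1 (judged acceptable) or 0 (judged unacceptable) by respondents.
situations = [
    "tell a lie to protect a friend from harm",
    "steal medicine to save a dying child",
    "break a promise for personal convenience",
    "take credit for a colleague's work",
]
judgments = [1, 1, 0, 0]

# Fit a text classifier that imitates the human labels rather than
# encoding any explicit moral rule.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(situations, judgments)

# The verdict on an unseen case reflects patterns in the training
# judgments, which is exactly the imitation strategy described above.
print(model.predict(["lie to a customer for personal gain"]))
```

With enough real respondents, the learned decision boundary approximates majority judgment; the sketch makes no claim that such a model captures morality itself.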
Working class in the United States
The working class is often defined as those lacking college degrees, which is a majority of American adults.
In the United States, the concept of a working class
remains vaguely defined, and classifying people or jobs into this class
can be contentious. According to Frank Newport, "for some, working
class is a more literal label; namely, an indication that one is
working."
Economists and pollsters in the United States generally define "working class" adults as those lacking a college degree, rather than by occupation or income. Other definitions refer to those in blue-collar occupations, despite the considerable range in required skills
and income among such occupations. Many members of the working class,
as defined by academic models, are often identified in the vernacular as
being middle-class, despite there being considerable ambiguity over the term's meaning.
Sociologists such as Dennis Gilbert and Joseph Kahl see the working class as the most populous in the United States, while other sociologists such as William Thompson, Joseph Hickey and
James Henslin deem the lower middle class slightly more populous. In the class models devised by these sociologists, the working class
comprises between 30% and 35% of the population, roughly the same
percentages as the lower middle class. According to the class model by
Dennis Gilbert, the working class comprises those between the 25th and
55th percentile of society. In 2018, 31% of Americans described
themselves as working class. Retired American adults are less likely to describe themselves as
"working class", regardless of the actual income or education level of
the adult.
Recent history
Higher educational attainment in the US corresponds with median household wealth.
Since the 1970s, economic and occupational insecurity has become a
major problem for American workers, their families, and their
communities, to a much greater extent than their counterparts in peer
countries. According to Matthew Desmond,
the U.S. "offers some of the lowest wages in the industrialized world"
which has "swelled the ranks of the working poor, most of whom are
thirty-five or older." Jonathan Hopkin writes that the United States took the lead in implementing the neoliberal
agenda in the 1980s, making it "the most extreme case of the subjection
of society to the brute force of the market." As such, he argues this
made the United States an outlier with economic inequality hitting "unprecedented levels for the rich democracies."
While outsourcing, union busting and the decline of unionization and welfare supports, the rise of immigration, the prison-industrial complex,
and unemployment have brought increased competition and considerable
economic insecurity to working-class employees in the "traditional"
blue-collar fields, there is an increasing demand for service personnel, including clerical and retail occupations. Sociologist Gøsta Esping-Andersen describes these supervised service occupations as "junk jobs," as they fail to pay living wages in the face of asset and price inflation,
fail to pay benefits, are often insecure, unstable, or temporary, and
provide little work control and little opportunity for skill development
or advancement. In contrast to other wealthy countries with higher proportions of quality jobs, the U.S. has developed an economy where two-thirds of jobs do not require or reward higher education; the other one-third of jobs consist largely in managing the junk-job workers.
Despite, or perhaps because of, the well-known limits that the US labor market, inequality (including deep educational inequality), and other structural factors place on social mobility in the US, many commentators find the idea of class cultures more interesting.
Education, for example, can pose an especially intransigent barrier in
the US, and not just because of gross educational inequality; culture
plays some role as well. The middle class is often recognized in the US
by educational attainment, which is correlated with (but may not cause)
income and wealth, especially for white men. Members of the working
class commonly have a high school diploma and many have only some
college education. Due to differences between middle and working class cultures, working class college students may face culture shock upon entering the post-secondary education system, with its "middle class" culture.
Social connectedness to people of higher income levels is a strong predictor of upward income mobility. However, data shows substantial social segregation correlating with economic income groups.
Some researchers try to measure the cultural differences between the
American middle class and working class and suggest their historical
sources and implications for educational attainment, future income, and
other life chances. Sociologist Melvin Kohn argues that working class
values emphasize external standards, such as obedience and a strong
respect for authority as well as little tolerance for deviance. This is
opposed to middle-class individuals who, he says, emphasize internal
standards, self-direction, curiosity and a tolerance for non-conformity.
... views were quite varied at
every class level, but the values we are calling working-class become
increasingly common at lower class levels... Kohn's interpretation... is
based on the idea that the middle-class parents who stress the values
of self-control, curiosity, and consideration are cultivating capacities
for self-direction... while working-class parents who focus on
obedience, neatness, and good manners are instilling behavioral
conformity.
— Dennis Gilbert, The American Class Structure, 1998.
Other social scientists, such as Barbara Jensen, show that
middle-class culture tends to be highly individualistic, while
working-class culture tends to center around the community. Such cultural value differences are thought to be closely linked to an
individual's occupation. Working-class employees tend to be closely
supervised and thus emphasize external values and obedience.
Working class culture can be broken down into subgroup
tendencies. According to Rubin (1976), there is a differential in social
and emotional skills both between working-class men and women and
between the blue-collar working class and college-educated workers.
Working-class men are characterized by Rubin as taking a rational
posture while women are characterized as being more emotional and
oriented towards communication of feelings. This constellation of
cultural issues has been explored in the popular media, for example, the
television shows, Roseanne or All in the Family featuring Archie Bunker and his wife Edith Bunker.
These popular television programs also explored generational change and
conflict in working-class families. One does need to note, however,
that there are great variations in cultural values among the members of
all classes and that any statement pertaining to the cultural values of
such large social groups needs to be seen as a broad generalization.
Further, if the hypothesis that culture primarily produces class were true, such a non-dialectical, causal relationship would hold more validly in some low-social-mobility societies. Scandinavian countries,
by contrast, have discovered that removing structural barriers (and to
some extent broadly valorizing working class culture) is effective in
increasing social mobility, if not in eradicating social class under
capitalism.
A map of the county swing from 2012 to 2016. Trump's gains among working class voters, particularly in the Midwest, were decisive in winning the Electoral College.
According to Thomas B. Edsall,
an experienced political commentator, the white working class, defined
as non-Hispanic whites who have not completed college, plays a pivotal
role in the politics of the United States. This segment of the electorate was solidly Democratic during the New Deal,
but its support of Democratic candidates steadily eroded to about 50%
by the end of the 20th century. It is also diminishing as a portion of
the electorate, both due to increased educational opportunities and
because whites make up a declining share of the electorate overall.
The political role of the white working class was re-examined during the 2016 United States presidential election, due to the strong support for Donald Trump by white working class voters. Trump's victory was in part credited to this support in swing states such as Wisconsin, Michigan, and Pennsylvania, that had previously been won by his Democratic predecessor Barack Obama. Professional pollsters did not predict such a large swing for Trump
among the white working class. According to Nate Cohn, the gains that
Trump's opponent Hillary Clinton made among other voter classes "were overwhelmed by Mr. Trump's huge appeal to white voters without a degree." Voter turnout among white voters who did not have a college degree had
increased by 3 percent from 2012 to 2016, despite the composition of
white voters who did not have a college degree decreasing by 1 percent
from 2012 to 2016.
According to Nate Silver,
educational attainment, not income, predicted who would vote for Trump
in 2016. Specifically, Trump gained among White voters without college
degrees and lost ground among White voters with college degrees.
According to Lynn Vavreck
and colleagues, survey data revealed that economic insecurities
mattered to Trump voters most when connected to a racial animus, with
the job losses being specifically important when lost to an out-group, in a composite they called 'racialized economics'. Trump supporters have in turn been claimed to have actually had their jobs threatened by Trump's policies, but have continued supporting him. Jonathan Metzl has claimed that low-income white men in Missouri, Tennessee
and Kansas oppose policies that support people in their position
because they believe that undeserving groups would benefit from them. Arlie Russell Hochschild has studied working-class people in Louisiana and came to the conclusion that what is motivating them is a feeling, which she calls the Deep Story:
You
are patiently standing in a long line leading up a hill, as in a
pilgrimage. You are situated in the middle of this line, along with
others who are also white, older, Christian, predominantly male, some
with college degrees, some not. Just over the brow of the hill is the
American Dream, the goal of everyone waiting in line.... You've suffered
long hours, layoffs, and exposure to dangerous chemicals at work, and
received reduced pensions. You have shown moral character through trial
by fire, and the American Dream of prosperity and security is a reward
for all of this, showing who you have been and are—a badge of honor....
Look! You see people cutting in line ahead of you! You're following the
rules. They aren't. As they cut in, it feels like you are being moved
back. How can they just do that? Who are they? Some are black. Through
affirmative action plans, pushed by the federal government, they are
being given preference for places in colleges and universities,
apprenticeships, jobs, welfare payments, and free lunches.... Women,
immigrants, refugees, public sector workers—where will it end? Your
money is running through a liberal sympathy sieve you don't control or
agree with.... But it's people like you who have made this country
great. You feel uneasy. It has to be said: the line cutters irritate
you.... You are a stranger in your own land. You do not recognize
yourself in how others see you. It is a struggle to feel seen and
honored.... [Y]ou are slipping backward.
Truth
True statements are usually held to be the opposite of false statements. The concept of truth is discussed and debated in various contexts, including philosophy, art, theology, law, and science.
Most human activities, including journalism and everyday life, depend upon the concept, where its nature as a concept is assumed rather than being a subject of discussion. Some philosophers view the concept of truth as
basic, and unable to be explained in any terms that are more easily
understood than the concept of truth itself. Most commonly, truth is viewed as the correspondence of language or thought to a mind-independent world. This is called the correspondence theory of truth.
Various theories and views of truth continue to be debated among scholars, philosophers, and theologians. There are many different questions about the nature of truth which are
still the subject of contemporary debates. These include the question of
defining truth; whether it is even possible to give an informative
definition of truth; identifying things as truth-bearers capable of being true or false; if truth and falsehood are bivalent, or if there are other truth values; identifying the criteria of truth that allow us to identify it and to distinguish it from falsehood; the role that truth plays in constituting knowledge; and, if truth is always absolute or if it can be relative to one's perspective.
The English word truth is derived from Old English triewth, Middle English trewthe, as a -th nominalisation of the adjective true (Old English treowe). In ordinary usage, the word may denote either a quality of "faithfulness, fidelity, loyalty, sincerity, veracity", or that of "agreement with fact or reality".
The adjective, cognate with German treu 'faithful', stems from Proto-Germanic *trewwj- 'having good faith', and perhaps ultimately from Proto-Indo-European *dru- 'tree', on the notion of "steadfast as an oak" (cf. Sanskrit dā́ru '(piece of) wood'; Old Norse trú 'faith, word of honour, belief').
The question of what is a proper basis for deciding how words,
symbols, ideas and beliefs may properly be considered true, whether by a
single person or an entire society, is dealt with by the five most
prevalent substantive theories of truth listed below. Each presents perspectives that are widely shared by published scholars.
Theories other than the most prevalent substantive theories are
also discussed. According to a survey of professional philosophers and
others on their philosophical views which was carried out in November
2009 (taken by 3226 respondents, including 1803 philosophy faculty
members and/or PhDs and 829 philosophy graduate students) 45% of
respondents accept or lean toward correspondence theories, 21% accept or
lean toward deflationary theories and 14% epistemic theories.
Correspondence theories emphasize that true beliefs and true statements correspond to the actual state of affairs. This type of theory stresses a relationship between thoughts or
statements on one hand, and things or objects on the other. It is a
traditional model tracing its origins to ancient Greek philosophers such as Socrates, Plato, and Aristotle. This class of theories holds that the truth or the falsity of a
representation is determined in principle entirely by how it relates to
"things" according to whether it accurately describes those "things". A
classic example of correspondence theory is the statement by the
thirteenth century philosopher and theologian Thomas Aquinas: "Veritas est adaequatio rei et intellectus" ("Truth is the adequation of things and intellect"), which Aquinas attributed to the ninth century Neoplatonist Isaac Israeli. Aquinas also restated the theory as: "A judgment is said to be true when it conforms to the external reality".
Correspondence theory centres around the assumption that truth is a matter of accurately copying what is known as "objective reality" and then representing it in thoughts, words, and other symbols. Many modern theorists have stated that this ideal cannot be achieved without analysing additional factors. For example, language plays a role in that all languages have words to
represent concepts that are virtually undefined in other languages. The German word Zeitgeist
is one such example: one who speaks or understands the language may
"know" what it means, but any translation of the word apparently fails
to accurately capture its full meaning (this is a problem with many
abstract words, especially those derived in agglutinative languages). Thus, some words add an additional parameter to the construction of an accurate truth predicate. Among the philosophers who grappled with this problem is Alfred Tarski, whose semantic theory is summarized further on.
For coherence theories in general, truth requires a proper fit of
elements within a whole system. Very often, coherence is taken to imply
something more than simple logical consistency; often there is a demand
that the propositions in a coherent system lend mutual inferential
support to each other. So, for example, the completeness and
comprehensiveness of the underlying set of concepts is a critical factor
in judging the validity and usefulness of a coherent system. A central tenet of coherence theories is the idea that truth is
primarily a property of whole systems of propositions, and can be
ascribed to an individual proposition only in virtue of its relationship
to that system as a whole. Among the assortment of perspectives
commonly regarded as coherence theory, theorists differ on the question
of whether coherence entails many possible true systems of thought or
only a single absolute system.
Some variants of coherence theory are claimed to describe the essential and intrinsic properties of formal systems in logic and mathematics. Formal reasoners are content to contemplate axiomatically independent and sometimes mutually contradictory systems side by side, for example, the various alternative geometries.
On the whole, coherence theories have been rejected for lacking
justification in their application to other areas of truth, especially
with respect to assertions about the natural world, empirical
data in general, assertions about practical matters of psychology and
society, especially when used without support from the other major
theories of truth.
Three influential forms of the pragmatic theory of truth were introduced around the turn of the 20th century by Charles Sanders Peirce, William James, and John Dewey.
Although there are wide differences in viewpoint among these and other
proponents of pragmatic theory, they all hold that truth is verified and
confirmed by the results of putting one's concepts into practice.
Peirce defines it: "Truth is that concordance of an abstract
statement with the ideal limit towards which endless investigation would
tend to bring scientific belief, which concordance the abstract
statement may possess by virtue of the confession of its inaccuracy and
one-sidedness, and this confession is an essential ingredient of truth." This statement stresses Peirce's view that ideas of approximation,
incompleteness, and partiality, what he describes elsewhere as fallibilism and "reference to the future", are essential to a proper conception of truth. Although Peirce uses words like concordance and correspondence to describe one aspect of the pragmatic sign relation, he is also quite explicit in saying that definitions of truth based on mere correspondence are no more than nominal definitions, which he accords a lower status than real definitions.
James' version of pragmatic theory, while complex, is often
summarized by his statement that "the 'true' is only the expedient in
our way of thinking, just as the 'right' is only the expedient in our
way of behaving." By this, James meant that truth is a quality, the value of which is confirmed by its effectiveness when applying concepts to practice (thus, "pragmatic").
Dewey, less broadly than James but more broadly than Peirce, held that inquiry, whether scientific, technical, sociological, philosophical, or cultural, is self-corrective over time if openly submitted for testing by a community of inquirers in order to clarify, justify, refine, and/or refute proposed truths.
Though not widely known, a new variation of the pragmatic theory
was defined and wielded successfully from the 20th century forward.
Defined and named by William Ernest Hocking,
this variation is known as "negative pragmatism". Essentially, what
works may or may not be true, but what fails cannot be true because the
truth always works. Physicist Richard Feynman also subscribed to it: "We never are definitely right, we can only be sure we are wrong." This approach incorporates many of the ideas from Peirce, James, and
Dewey. For Peirce, the idea of "endless investigation would tend to
bring about scientific belief" fits negative pragmatism in that a
negative pragmatist would never stop testing. As Feynman noted, an idea
or theory "could never be proved right, because tomorrow's experiment
might succeed in proving wrong what you thought was right." Similarly, James and Dewey's ideas also ascribe truth to repeated testing which is "self-corrective" over time.
Pragmatism and negative pragmatism are also closely aligned with the coherence theory of truth
in that any testing should not be isolated but rather incorporate
knowledge from all human endeavors and experience. The universe is a
whole and integrated system, and testing should acknowledge and account
for its diversity. As Feynman said, "...if it disagrees with experiment, it is wrong."
Social constructivism
holds that truth is constructed by social processes, is historically
and culturally specific, and that it is in part shaped through the power
struggles within a community. Constructivism views all of our knowledge
as "constructed," because it does not reflect any external
"transcendent" realities (as a pure correspondence theory might hold).
Rather, perceptions of truth are viewed as contingent on convention,
human perception, and social experience. It is believed by
constructivists that representations of physical and biological reality,
including race, sexuality, and gender, are socially constructed.
Giambattista Vico was among the first to claim that history and culture were man-made. Vico's epistemological orientation unfolds in one axiom: verum ipsum factum—"truth itself is constructed". Hegel and Marx
were among the other early proponents of the premise that truth is, or
can be, socially constructed. Marx, like many critical theorists who
followed, did not reject the existence of objective truth, but rather
distinguished between true knowledge and knowledge that has been
distorted through power or ideology. For Marx, scientific and true
knowledge is "in accordance with the dialectical understanding of
history" and ideological knowledge is "an epiphenomenal expression of
the relation of material forces in a given economic arrangement".
Consensus theory
holds that truth is whatever is agreed upon, or in some versions, might
come to be agreed upon, by some specified group. Such a group might
include all human beings, or a subset thereof consisting of more than one person.
Among the current advocates of consensus theory as a useful accounting of the concept of "truth" is the philosopher Jürgen Habermas. Habermas maintains that truth is what would be agreed upon in an ideal speech situation. Among the current strong critics of consensus theory is the philosopher Nicholas Rescher.
Modern developments in the field of philosophy have resulted in the rise of a new thesis: that the term truth does not denote a real property of sentences or propositions. This thesis is in part a response to the common use of truth predicates (e.g., that some particular thing "...is
true") which was particularly prevalent in philosophical discourse on
truth in the first half of the 20th century. From this point of view, to
assert that "'2 + 2 = 4' is true" is logically equivalent to asserting
that "2 + 2 = 4", and the phrase "is true" is—philosophically, if not
practically (see: "Michael" example, below)—completely dispensable in
this and every other context. In common parlance, truth predicates are
not commonly heard, and it would be interpreted as an unusual occurrence
were someone to utilize a truth predicate in an everyday conversation
when asserting that something is true. Newer perspectives that take this
discrepancy into account, and work with sentence structures as actually
employed in common discourse, can be broadly described:
as deflationary theories of truth, since they attempt to deflate the presumed importance of the words "true" or truth,
as disquotational theories, to draw attention to the disappearance of the quotation marks in cases like the above example, or
as minimalist theories of truth.
Whichever term is used, deflationary theories can be said to hold in
common that "the predicate 'true' is an expressive convenience, not the
name of a property requiring deep analysis." Once we have identified the truth predicate's formal features and
utility, deflationists argue, we have said all there is to be said about
truth. Among the theoretical concerns of these views is to explain away
those special cases where it does appear that the concept of truth has peculiar and interesting properties. (See, e.g., Semantic paradoxes, and below.)
The scope of deflationary principles is generally limited to
representations that resemble sentences. They do not encompass a broader
range of entities that are typically considered true or otherwise. In
addition, some deflationists point out that the concept employed in "...is
true" formulations does enable us to express things that might
otherwise require infinitely long sentences; for example, one cannot
express confidence in Michael's accuracy by asserting the endless
sentence:
Michael says, 'snow is white' and snow is white, or he says 'roses are red' and roses are red or he says... etc.
This assertion can instead be succinctly expressed by saying: What Michael says is true.
An early variety of deflationary theory is the redundancy theory of truth,
so-called because—in examples like those above, e.g. "snow is white [is
true]"—the concept of "truth" is redundant and need not have been
articulated; that is, it is merely a word that is traditionally used in
conversation or writing, generally for emphasis, but not a word that
actually equates to anything in reality. This theory is commonly
attributed to Frank P. Ramsey, who held that the use of words like fact and truth was nothing but a roundabout
way of asserting a proposition, and that treating these words as
separate problems in isolation from judgment was merely a "linguistic
muddle".
A variant of redundancy theory is the "disquotational" theory, which uses a modified form of the logician Alfred Tarski's schema: proponents observe that to say that "'P' is true" is to assert "P". A version of this theory was defended by C. J. F. Williams (in his book What is Truth?).
Yet another version of deflationism is the prosentential theory of
truth, first developed by Dorothy Grover, Joseph Camp, and Nuel Belnap
as an elaboration of Ramsey's claims. They argue that utterances such
as "that's true", when said in response to (e.g.) "it's raining", are "prosentences"—expressions that merely repeat the content of other expressions. In the same way that it means the same as my dog in the statement "my dog was hungry, so I fed it", that's true is supposed to mean the same as it's raining when the former is said in reply to the latter.
As noted above, proponents of these ideas do not necessarily follow Ramsey in asserting that truth is not a property; rather, they can be understood to say that, for instance, the assertion "P" may well
involve a substantial truth—it is only the redundancy involved in
statements such as "that's true" (i.e., a prosentence) which is to be
minimized.
Performative
Attributed to philosopher P. F. Strawson is the performative theory of truth which holds that to say "'Snow is white' is true" is to perform the speech act
of signaling one's agreement with the claim that snow is white (much
like nodding one's head in agreement). The idea that some statements are
more actions than communicative statements is not as odd as it may
seem. For example, when a wedding couple says "I do" at the appropriate
time in a wedding, they are performing the act of taking the other to be
their lawful wedded spouse. They are not describing themselves as taking the other, but actually doing so (perhaps the most thorough analysis of such "illocutionary acts" is J. L. Austin, most notably in How to Do Things With Words).
Strawson holds that a similar analysis is applicable to all
speech acts, not just illocutionary ones: "To say a statement is true is
not to make a statement about a statement, but rather to perform the
act of agreeing with, accepting, or endorsing a statement. When one says
'It's true that it's raining,' one asserts no more than 'It's raining.'
The function of [the statement] 'It's true that...' is to agree with, accept, or endorse the statement that 'it's raining.'"
Philosophical skepticism comes in various forms. Radical forms of skepticism deny that knowledge or rational belief is possible and urge us to suspend judgment
regarding ascription of truth on many or all controversial matters.
More moderate forms of skepticism claim only that nothing can be known
with certainty, or that we can know little or nothing about the "big
questions" in life, such as whether God exists or whether there is an
afterlife. Religious skepticism is "doubt concerning basic religious principles (such as immortality, providence, and revelation)". Scientific skepticism concerns testing beliefs for reliability, by subjecting them to systematic investigation using the scientific method, to discover empirical evidence for them.
A notable early expression of philosophical skepticism appears in the Gospel of John, where the Roman governor Pontius Pilate, after hearing Jesus claim to “bear witness to the truth,” replies with the rhetorical question, “Quid est veritas?” (“What is truth?”) (John 18:38).
Commentators have frequently interpreted this question as a dismissal
of the very possibility of objective or knowable truth. Alfred Plummer
characterizes Pilate as a “practical man of the world” who regards such
talk as visionary and naive. Similarly, Francis Bacon cited the episode in his essay Of Truth as illustrative of those who “jest at truth” and refuse to remain for an answer, reflecting a skeptical or cynical stance. The episode remains a widely referenced example of epistemological
doubt, emblematic of philosophical skepticism toward the existence or
accessibility of absolute truth.
Several of the major theories of truth hold that there is a
particular property the having of which makes a belief or proposition
true. Pluralist theories of truth assert that there may be more than one
property that makes propositions true: ethical propositions might be
true by virtue of coherence. Propositions about the physical world might
be true by corresponding to the objects and properties they are about.
Some of the pragmatic theories, such as those by Charles Peirce and William James, included aspects of correspondence, coherence and constructivist theories. Crispin Wright argued in his 1992 book Truth and Objectivity
that any predicate which satisfied certain platitudes about truth
qualified as a truth predicate. In some discourses, Wright argued, the
role of the truth predicate might be played by the notion of
superassertibility. Michael Lynch, in a 2009 book Truth as One and Many,
argued that we should see truth as a functional property capable of
being multiply manifested in distinct properties like correspondence or
coherence.
Logic is concerned with the patterns in reason that can help tell if a proposition is true or not. Logicians use formal languages to express the truths they are concerned with, and as such there is only truth under some interpretation or truth within some logical system.
A logical truth (also called an analytic truth or a necessary
truth) is a statement that is true in all logically possible worlds or under all possible interpretations, as contrasted to a fact (also called a synthetic claim or a contingency), which is only true in this world
as it has historically unfolded. A proposition such as "If p and q,
then p" is considered to be a logical truth because of the meaning of
the symbols and words in it and not because of any fact of any particular world. They are such that they could not be untrue.
Historically, with the nineteenth century development of Boolean algebra,
mathematical models of logic began to treat "truth", also represented
as "T" or "1", as an arbitrary constant. "Falsity" is also an arbitrary
constant, which can be represented as "F" or "0". In propositional logic, these symbols can be manipulated according to a set of axioms and rules of inference, often given in the form of truth tables.
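To make these conventions concrete, here is a minimal sketch in Python (the helper names are this example's own, not drawn from the text) that builds the truth table for the proposition "If p and q, then p" discussed above and confirms it holds under every interpretation.

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material implication: false only when a is true and b is false."""
    return (not a) or b

def formula(p: bool, q: bool) -> bool:
    """The logical truth discussed above: 'If p and q, then p'."""
    return implies(p and q, p)

# Enumerate every interpretation (each row of the truth table),
# writing truth as 1 and falsity as 0 per the Boolean convention.
for p, q in product([False, True], repeat=2):
    print(f"p={int(p)} q={int(q)}  ((p and q) -> p) = {int(formula(p, q))}")

# The formula is true in every row, i.e. under all interpretations,
# which is what makes it a logical truth rather than a contingent fact.
assert all(formula(p, q) for p, q in product([False, True], repeat=2))
```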
In addition, from at least the time of Hilbert's program at the turn of the twentieth century to the proof of Gödel's incompleteness theorems and the development of the Church–Turing thesis in the early part of that century, true statements in mathematics were generally assumed to be those statements that are provable in a formal axiomatic system.
The works of Kurt Gödel, Alan Turing, and others shook this assumption, with the development of statements that are true but cannot be proven within the system. Two examples of the latter can be found in Hilbert's problems. Work on Hilbert's 10th problem led in the late twentieth century to the construction of specific Diophantine equations for which it is undecidable whether they have a solution, or even if they do, whether they have a finite or infinite number of solutions. More fundamentally, Hilbert's first problem was on the continuum hypothesis. Gödel and Paul Cohen showed that this hypothesis cannot be proved or disproved using the standard axioms of set theory. In the view of some, then, it is equally reasonable to take either the continuum hypothesis or its negation as a new axiom.
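In schematic form (a standard textbook presentation, not a quotation from these authors), Gödel's construction yields, for a sufficiently strong consistent formal system S with an arithmetized provability predicate Prov_S, a sentence G_S that asserts its own unprovability:

\[
G_S \;\leftrightarrow\; \neg\,\mathrm{Prov}_S\left(\ulcorner G_S \urcorner\right)
\]

If S is consistent, it proves neither G_S nor its negation; yet on the standard interpretation G_S is true, since it is indeed unprovable in S.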
Gödel thought that the ability to perceive the truth of a mathematical or logical proposition is a matter of intuition, an ability he admitted could be ultimately beyond the scope of a formal theory of logic or mathematics and perhaps best considered in the realm of human comprehension
and communication. But he commented, "The more I think about language,
the more it amazes me that people ever understand each other at all".
Tarski's theory of truth (named after Alfred Tarski) was developed for formal languages, such as formal logic. Here he restricted it in this way: no language could contain its own truth predicate; that is, the expression "is true" could only apply to sentences in some other language. The latter he called an object language,
the language being talked about. (It may, in turn, have a truth
predicate that can be applied to sentences in still another language.)
The reason for his restriction was that languages that contain their own
truth predicate will contain paradoxical
sentences such as, "This sentence is not true". As a result, Tarski
held that the semantic theory could not be applied to any natural
language, such as English, because they contain their own truth
predicates. Donald Davidson used it as the foundation of his truth-conditional semantics and linked it to radical interpretation in a form of coherentism.
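Tarski's restriction is often summarised by his "Convention T" schema, in which the truth predicate belongs to the metalanguage and applies to sentences of the object language L:

\[
\mathrm{True}_{L}(\ulcorner \varphi \urcorner) \;\leftrightarrow\; \varphi
\]

For example, "snow is white" is true in L if and only if snow is white. Because True_L is not itself an expression of L, a sentence of L cannot ascribe falsity to itself, and the liar sentence cannot be formed within L.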
Bertrand Russell
is credited with noticing the existence of such paradoxes even in the
best symbolic formations of mathematics in his day, in particular the
paradox that came to be named after him, Russell's paradox. Russell and Whitehead attempted to solve these problems in Principia Mathematica by putting statements into a hierarchy of types,
wherein a statement cannot refer to itself, but only to statements
lower in the hierarchy. This in turn led to new orders of difficulty regarding the precise natures of types and the structures of conceptually possible type systems, difficulties that remain unresolved to this day.
Kripke's theory of truth (named after Saul Kripke)
contends that a natural language can in fact contain its own truth
predicate without giving rise to contradiction. He showed how to
construct one as follows:
Begin with a subset of sentences of a natural language that contains no occurrences of the expression "is true" (or "is false"). So "The barn is big" is included in the subset, but not "'The barn is big' is true", nor problematic sentences such as "This sentence is false".
Define truth just for the sentences in that subset.
Extend the definition of truth to include sentences that predicate truth or falsity of one of the original subset of sentences. So "'The barn is big' is true" is now included, but neither "This sentence is false" nor "''The barn is big' is true' is true".
Define truth for all sentences that predicate truth or falsity of a member of the second set. Imagine this process repeated infinitely, so that truth is defined for "The barn is big"; then for "'The barn is big' is true"; then for "''The barn is big' is true' is true", and so on.
Truth never gets defined for sentences like "This sentence is false",
since it was not in the original subset and does not predicate truth of
any sentence in the original or any subsequent set. In Kripke's terms,
these are "ungrounded." Since these sentences are never assigned either
truth or falsehood even if the process is carried out infinitely,
Kripke's theory implies that some sentences are neither true nor false.
This contradicts the principle of bivalence: every sentence must be either true or false. Since this principle is a key premise in deriving the liar paradox, the paradox is dissolved.
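The staged construction lends itself to a small computational sketch (an illustration only, not Kripke's formal fixed-point theory of partial models; the sentence names and dictionary encoding are invented for the example):

```python
# Sketch of Kripke's staged definition of truth. Sentences are either basic
# claims with known truth values, or ascriptions ("TRUE"/"FALSE", target)
# about another sentence. Ungrounded sentences never receive a value.

def kripke_stages(basic_facts, ascriptions, max_stages=100):
    valuation = dict(basic_facts)          # stage 0: the "is true"-free subset
    for _ in range(max_stages):
        new = {}
        for name, (mode, target) in ascriptions.items():
            # Assign a value only if the target already has one.
            if name not in valuation and target in valuation:
                value = valuation[target]
                new[name] = value if mode == "TRUE" else not value
        if not new:                        # fixed point reached: nothing changes
            break
        valuation.update(new)
    return valuation

facts = {"barn_is_big": True}
ascriptions = {
    "s1": ("TRUE", "barn_is_big"),   # '"The barn is big" is true'
    "s2": ("TRUE", "s1"),            # '"... is true" is true'
    "liar": ("FALSE", "liar"),       # "This sentence is false"
}
print(kripke_stages(facts, ascriptions))
# {'barn_is_big': True, 's1': True, 's2': True} -- "liar" stays ungrounded,
# i.e. neither true nor false, mirroring Kripke's truth-value gap.
```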
The truth predicate "P is true" has great practical value in human language, allowing the efficient endorsement or impeachment of claims made by others, emphasis on the truth or falsity of a statement, and various indirect (Gricean) conversational implicatures. Individuals or societies will sometimes punish "false" statements to deter falsehoods; the oldest surviving law text, the Code of Ur-Nammu, lists penalties for false accusations of sorcery or adultery, as well as for committing perjury in court. Even four-year-old children can pass simple "false belief" tests and successfully assess that another individual's belief diverges from reality in a specific way; by adulthood there are strong implicit intuitions about "truth" that form a "folk theory" of truth. These intuitions include:
Normativity: It is usually good to believe what is true
False beliefs: The notion that believing a statement does not necessarily make it true
Like many folk theories, the folk theory of truth is useful in
everyday life but, upon deep analysis, turns out to be technically
self-contradictory; in particular, any formal system that fully obeys "capture and release" semantics for truth (also known as the T-schema), and that also respects classical logic, is provably inconsistent and succumbs to the liar paradox or to a similar contradiction.
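The inconsistency can be made explicit (a standard derivation, not specific to any one author). The T-schema "captures and releases" truth via instances of True(⌜φ⌝) ↔ φ; combined in classical logic with a liar sentence L, it yields a contradiction:

\[
\begin{aligned}
\text{T-schema instance:}\quad & \mathrm{True}(\ulcorner L \urcorner) \leftrightarrow L \\
\text{Liar sentence:}\quad & L \leftrightarrow \neg\,\mathrm{True}(\ulcorner L \urcorner) \\
\text{Hence:}\quad & \mathrm{True}(\ulcorner L \urcorner) \leftrightarrow \neg\,\mathrm{True}(\ulcorner L \urcorner),
\end{aligned}
\]

which is unsatisfiable in classical logic.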
Socrates', Plato's and Aristotle's ideas about truth are seen by some as consistent with correspondence theory. In his Metaphysics,
Aristotle stated: "To say of what is that it is not, or of what is not
that it is, is false, while to say of what is that it is, and of what is
not that it is not, is true". The Stanford Encyclopedia of Philosophy proceeds to say of Aristotle:
...Aristotle sounds much more like a genuine correspondence theorist in the Categories
(12b11, 14b14), where he talks of "underlying things" that make
statements true and implies that these "things" (pragmata) are logically
structured situations or facts (viz., his sitting, his not sitting).
Most influential is his claim in De Interpretatione (16a3) that
thoughts are "likenesses" (homoiosis) of things. Although he nowhere
defines truth in terms of a thought's likeness to a thing or fact, it is
clear that such a definition would fit well into his overall philosophy
of mind....
Similar statements can also be found in Plato's dialogues (Cratylus 385b2, Sophist 263b).
Some Greek philosophers maintained that truth was either not
accessible to mortals, or of greatly limited accessibility, forming
early philosophical skepticism. Among these were Xenophanes, Democritus, and Pyrrho, the founder of Pyrrhonism, who argued that there was no criterion of truth.
The Epicureans believed that all sense perceptions were true, and that errors arise in how we judge those perceptions.
Medieval philosophy
Avicenna (980–1037)
In early Islamic philosophy, Avicenna defined truth in his work The Book of Healing (Book I, Chapter 8) as:
What corresponds in the mind to what is outside it.
Avicenna elaborated on his definition of truth later in Book VIII, Chapter 6:
The truth of a thing is the property of the being of each thing which has been established in it.
This definition is but a rendering of the medieval Latin translation of the work by Simone van Riet. A modern translation of the original Arabic text states:
Truth is also said of the veridical belief in the existence [of something].
Aquinas (1225–1274)
Reevaluating Avicenna, and also Augustine and Aristotle, Thomas Aquinas stated in his Disputed Questions on Truth:
A natural thing, being placed between two intellects, is called true
insofar as it conforms to either. It is said to be true with respect to
its conformity with the divine intellect insofar as it fulfills the end
to which it was ordained by the divine intellect...
With respect to its conformity with a human intellect, a thing is said
to be true insofar as it is such as to cause a true estimate about
itself.
Thus, for Aquinas, the truth of the human intellect (logical truth) is based on the truth in things (ontological truth). Following this, he wrote an elegant re-statement of Aristotle's view in his Summa I.16.1:
Veritas est adæquatio intellectus et rei.
(Truth is the conformity of the intellect and things.)
Aquinas also said that real things participate in the act of being of the Creator God
who is Subsistent Being, Intelligence, and Truth. Thus, these beings
possess the light of intelligibility and are knowable. These things
(beings; reality) are the foundation of the truth that is found in the human mind when it acquires knowledge of things, first through the senses, then through the understanding and the judgement done by reason. For Aquinas, human intelligence ("intus", within, and "legere", to read) has the capability to reach the essence and existence of things because it has a non-material, spiritual element, although moral, educational, and other factors might interfere with it.
Changing concepts of truth in the Middle Ages
Richard Firth Green examined the concept of truth in the later Middle Ages in his A Crisis of Truth, concluding that roughly during the reign of Richard II of England the very meaning of the concept changed. The idea of the oath, which was so much a part and parcel of, for instance, Romance literature, changed from a subjective concept to a more objective one (in Derek Pearsall's summary). Whereas truth (the "trouthe" of Sir Gawain and the Green Knight) was first "an ethical truth in which truth is understood to reside in persons", in Ricardian England it "transforms... into a political truth in which truth is understood to reside in documents".
Modern philosophy
Kant (1724–1804)
Immanuel Kant endorses a definition of truth along the lines of the correspondence theory of truth. Kant writes in the Critique of Pure Reason:
"The nominal definition of truth, namely that it is the agreement of
cognition with its object, is here granted and presupposed". He denies that this correspondence definition of truth provides us with
a test or criterion to establish which judgements are true. He states
in his logic lectures:
...Truth, it is said,
consists in the agreement of cognition with its object. In consequence
of this mere nominal definition, my cognition, to count as true, is
supposed to agree with its object. Now I can compare the object with my
cognition, however, only by cognizing it. Hence my cognition is
supposed to confirm itself, which is far short of being sufficient for
truth. For since the object is outside me, the cognition in me, all I
can ever pass judgement on is whether my cognition of the object agrees
with my cognition of the object.
The ancients called such a circle in explanation a diallelon.
And actually the logicians were always reproached with this mistake by
the sceptics, who observed that with this definition of truth it is just
as when someone makes a statement before a court and in doing so
appeals to a witness with whom no one is acquainted, but who wants to
establish his credibility by maintaining that the one who called him as
witness is an honest man. The accusation was grounded, too. Only the
solution of the indicated problem is impossible without qualification
and for every man....
This passage makes use of his distinction between nominal and real
definitions. A nominal definition explains the meaning of a linguistic
expression. A real definition describes the essence of certain objects and enables us to determine whether any given item falls within the definition. Kant holds that the definition of truth is merely nominal and,
therefore, we cannot employ it to establish which judgements are true.
According to Kant, the ancient skeptics were critical of the logicians
for holding that, by means of a merely nominal definition of truth, they
can establish which judgements are true. They were trying to do
something that is "impossible without qualification and for every man".
Hegel (1770–1831)
G. W. F. Hegel
distanced his philosophy from empiricism by presenting truth as a
self-moving process, rather than a matter of merely subjective thoughts.
Hegel's truth is analogous to an organism in that it is
self-determining according to its own inner logic: "Truth is its own
self-movement within itself."
Schopenhauer (1788–1860)
For Arthur Schopenhauer, a judgment is a combination or separation of two or more concepts. If a judgment is to be an expression of knowledge, it must have a sufficient reason or ground by which the judgment could be called true. Truth is the reference of a judgment to something different from itself which is its sufficient reason (ground). Judgments can have material, formal, transcendental, or metalogical truth. A judgment has material
truth if its concepts are based on intuitive perceptions that are
generated from sensations. If a judgment has its reason (ground) in
another judgment, its truth is called logical or formal. If a
judgment, of, for example, pure mathematics or pure science, is based on
the forms (space, time, causality) of intuitive, empirical knowledge,
then the judgment has transcendental truth. Finally, if a judgment has its ground in the formal conditions of all thinking, the laws of thought, its truth is called metalogical.
Kierkegaard (1813–1855)
When Søren Kierkegaard, as his character Johannes Climacus, ends his writings with the thesis that subjectivity, inwardness, is the truth, he does not advocate for subjectivism in its extreme form (the theory that something is true simply because one believes it to be so), but rather argues that the objective approach to
matters of personal truth cannot shed any light upon that which is most
essential to a person's life. Objective truths are concerned with the
facts of a person's being, while subjective truths are concerned with a
person's way of being. Kierkegaard agrees that objective truths for the
study of subjects like mathematics, science, and history are relevant
and necessary, but argues that objective truths do not shed any light on
a person's inner relationship to existence. At best, these truths can
only provide a severely narrowed perspective that has little to do with
one's actual experience of life.
While objective truths are final and static, subjective truths
are continuing and dynamic. The truth of one's existence is a living,
inward, and subjective experience that is always in the process of
becoming. The values, morals, and spiritual approaches a person adopts,
while not denying the existence of objective truths of those beliefs,
can only become truly known when they have been inwardly appropriated
through subjective experience. Thus, Kierkegaard criticizes all
systematic philosophies which attempt to know life or the truth of
existence via theories and objective knowledge about reality. As
Kierkegaard claims, human truth is something that is continually
occurring, and a human being cannot find truth separate from the
subjective experience of one's own existing, defined by the values and fundamental essence that constitute one's way of life.
Nietzsche (1844–1900)
Friedrich Nietzsche believed the search for truth, or 'the will to truth', was a consequence of the will to power of philosophers. He thought that truth should be used as long as it promoted life and the will to power, and he thought untruth was better than truth if it had this life enhancement as a consequence. As he wrote in Beyond Good and Evil, "The falseness of a judgment is to us not necessarily an objection to a judgment... The question is to what extent it is life-advancing, life-preserving, species-preserving, perhaps even species-breeding..." (aphorism 4). He proposed the will to power as a truth only because, according to him, it was the most life-affirming and sincere perspective one could have.
Robert Wicks discusses Nietzsche's basic view of truth as follows:
...Some scholars regard Nietzsche's 1873 unpublished essay, "On Truth and Lies in a Nonmoral Sense" ("Über Wahrheit und Lüge im außermoralischen Sinn")
as a keystone in his thought. In this essay, Nietzsche rejects the idea
of universal constants, and claims that what we call "truth" is only "a
mobile army of metaphors, metonyms, and anthropomorphisms." His view at
this time is that arbitrariness completely prevails within human
experience: concepts originate via the very artistic transference of
nerve stimuli into images; "truth" is nothing more than the invention of
fixed conventions for merely practical purposes, especially those of
repose, security and consistence....
Separately Nietzsche suggested that an ancient, metaphysical belief
in the divinity of Truth lies at the heart of and has served as the
foundation for the entire subsequent Western intellectual tradition:
"But you will have gathered what I am getting at, namely, that it is
still a metaphysical faith on which our faith in science rests—that even
we knowers of today, we godless anti-metaphysicians still take our
fire too, from the flame lit by the thousand-year old faith, the
Christian faith which was also Plato's faith, that God is Truth; that
Truth is 'Divine'..."
Moreover, Nietzsche challenges the notion of objective truth, arguing
that truths are human creations and serve practical purposes. He wrote,
"Truths are illusions about which one has forgotten that this is what
they are." He argues that truth is a human invention, arising from the artistic
transference of nerve stimuli into images, serving practical purposes
like repose, security, and consistency; formed through metaphorical and
rhetorical devices, shaped by societal conventions and forgotten
origins:
"What,
then, is truth? A mobile army of metaphors, metonyms, and
anthropomorphisms – in short, a sum of human relations which have been
enhanced, transposed, and embellished poetically and rhetorically..."
Nietzsche argues that truth is always filtered through individual
perspectives and shaped by various interests and biases. In his posthumously published notebooks, he asserts, "There are no facts, only interpretations." He suggests that truth is subject to constant reinterpretation and
change, influenced by shifting cultural and historical contexts as he
writes in "Thus Spoke Zarathustra" that "I say unto you: one must still
have chaos in oneself to be able to give birth to a dancing star." In "On Truth and Lies in a Nonmoral Sense", he likewise proclaims, "Truths are illusions which we
have forgotten are illusions; they are metaphors that have become worn
out and have been drained of sensuous force, coins which have lost their
embossing and are now considered as metal and no longer as coins."
Heidegger (1889–1976)
Other philosophers take this common meaning to be secondary and derivative. According to Martin Heidegger, the original meaning and essence of truth in Ancient Greece
was unconcealment, or the revealing or bringing of what was previously
hidden into the open, as indicated by the original Greek term for truth,
aletheia. On this view, the conception of truth as correctness is a later
derivation from the concept's original essence, a development Heidegger
traces to the Latin term veritas. Owing to the primacy of ontology in Heidegger's philosophy, he considered this truth to lie within Being itself, and already in Being and Time (1927) had identified truth with "being-truth" or the "truth of Being" and partially with the Kantian thing-in-itself in an epistemology essentially concerning a mode of Dasein.
Sartre (1905–1980)
In Being and Nothingness (1943), partially following Heidegger, Jean-Paul Sartre identified our knowledge of the truth as a relation between the in-itself and the for-itself of being, yet one closely connected to the data available to the embodied individual in their interaction with the world and with others. Sartre's description that "the world is human" allowed him to postulate that all truth is strictly understood by self-consciousness as self-consciousness of something, a view anticipated by Henri Bergson in Time and Free Will (1889), a work whose reading Sartre credited for his interest in philosophy. This first existentialist theory, fleshed out more fully in Sartre's essay Truth and Existence (1948), departs more radically from Heidegger in its emphasis on the primacy of the idea, already formulated in Being and Nothingness, that existence precedes essence in the formulation of truth. It has nevertheless been critically examined as idealist rather than materialist, departing from more traditional idealist epistemologies such as those of Plato and Aristotle in Ancient Greek philosophy while remaining, as does Heidegger, with Kant.
Later, in the Search for a Method (1957), in which Sartre used a unification of existentialism and Marxism that he would later formulate in the Critique of Dialectical Reason (1960), Sartre, with his growing emphasis on the Hegelian totalisation of historicity,
posited a conception of truth still defined by its process of relation
to a container giving it material meaning, but with specific reference
to a role in this broader totalisation, for "subjectivity is neither
everything nor nothing; it represents a moment in the objective process
(that in which externality is internalised), and this moment is
perpetually eliminated only to be perpetually reborn": "For us, truth is
something which becomes, it has and will have become. It
is a totalisation which is forever being totalised. Particular facts do
not signify anything; they are neither true nor false so long as they
are not related, through the mediation of various partial totalities, to
the totalisation in process." Sartre describes this as a "realistic epistemology", developed out of Marx's ideas but with such a development only possible in an existentialist light, as with the theme of the whole work. In an early segment of the lengthy two-volume Critique
of 1960, Sartre continued to describe truth as a "totalising" "truth of
history" to be interpreted by a "Marxist historian", whilst his break
with Heidegger's epistemological ideas is finalised in the description
of a seemingly antinomous "dualism of Being and Truth" as the essence of a truly Marxist epistemology.
Camus (1913–1960)
The French philosopher Albert Camus wrote in his famous essay The Myth of Sisyphus (1942) that "there are truths but no truth", in fundamental agreement with Nietzsche's perspectivism, and favourably cites Kierkegaard in posing that "no truth is absolute or can render satisfactory an existence that is impossible in itself". Later, in The Rebel (1951), he declared, akin to Sartre, that "the very lowest form of truth" is "the truth of history"; but he describes this in the context of its abuse and, like Kierkegaard in the Concluding Unscientific Postscript,
he criticizes Hegel in holding a historical attitude "which consists of
saying: 'This is truth, which appears to us, however, to be error, but
which is true precisely because it happens to be error. As for proof, it
is not I, but history, at its conclusion, that will furnish it.'"
Peirce (1839–1914)
Pragmatists like C. S. Peirce take truth to have some manner of essential relation to human practices for inquiring into and discovering truth, with Peirce himself holding that truth is what human inquiry
would find out on a matter, if our practice of inquiry were taken as
far as it could profitably go: "The opinion which is fated to be
ultimately agreed to by all who investigate, is what we mean by the
truth..."
Nishida (1870–1945)
According to Kitaro Nishida,
"knowledge of things in the world begins with the differentiation of
unitary consciousness into knower and known and ends with self and
things becoming one again. Such unification takes form not only in
knowing but in the valuing (of truth) that directs knowing, the willing
that directs action, and the feeling or emotive reach that directs
sensing."
Fromm (1900–1980)
Erich Fromm
finds that trying to discuss truth as "absolute truth" is sterile and
that emphasis ought to be placed on "optimal truth". He considers truth
as stemming from the survival imperative of grasping one's environment
physically and intellectually, whereby young children instinctively seek
truth so as to orient themselves in "a strange and powerful world". The
accuracy of their perceived approximation of the truth will therefore
have direct consequences on their ability to deal with their
environment. Fromm can be understood to define truth as a functional
approximation of reality. His vision of optimal truth is described
below:
...the
dichotomy between 'absolute = perfect' and 'relative = imperfect' has
been superseded in all fields of scientific thought, where "it is
generally recognized that there is no absolute truth but nevertheless
that there are objectively valid laws and principles".
[...]
In that respect, "a scientifically or rationally valid statement means
that the power of reason is applied to all the available data of
observation without any of them being suppressed or falsified for the
sake of the desired result". The history of science is "a history of
inadequate and incomplete statements, and every new insight makes
possible the recognition of the inadequacies of previous propositions
and offers a springboard for creating a more adequate formulation."
[...]
As a result "the history of thought is the history of an
ever-increasing approximation to the truth. Scientific knowledge is not
absolute but optimal; it contains the optimum of truth attainable in a
given historical period." Fromm furthermore notes that "different
cultures have emphasized various aspects of the truth" and that
increasing interaction between cultures allows for these aspects to
reconcile and integrate, increasing further the approximation to the
truth.
Foucault (1926–1984)
Truth, says Michel Foucault,
is problematic when any attempt is made to see truth as an "objective"
quality. He prefers not to use the term truth itself but "Regimes of
Truth". In his historical investigations he found truth to be something
that was itself a part of, or embedded within, a given power structure.
Thus Foucault's view shares much in common with the concepts of Nietzsche. Truth for Foucault is also something that shifts through various epistemes throughout history.
Baudrillard (1929–2007)
Jean Baudrillard
considered truth to be largely simulated, that is, pretending to have
something, as opposed to dissimulation, pretending to not have
something. He took his cue from iconoclasts who, he claims, knew that images of God demonstrated that God did not exist. Baudrillard wrote in "The Precession of Simulacra":
The simulacrum is never that which conceals the truth—it is the truth which conceals that there is none. The simulacrum is true.
—Ecclesiastes
Some examples of simulacra that Baudrillard cited were: that prisons simulate the "truth" that society is free; scandals (e.g., Watergate)
simulate that corruption is corrected; Disney simulates that the U.S.
itself is an adult place. Though such examples seem extreme, such
extremity is an important part of Baudrillard's theory. For a less
extreme example, movies usually end with the bad being punished,
humiliated, or otherwise failing, thus affirming for viewers the concept
that the good end happily and the bad unhappily, a narrative which
implies that the status quo and established power structures are largely
legitimate.
Other contemporary positions
Truthmaker theory is "the branch of metaphysics that explores the relationships between what is true and what exists". It is different from substantive theories of truth in the sense that it
does not aim at giving a definition of what truth is. Instead, it has
the goal of determining how truth depends on being.