
Panpsychism

From Wikipedia, the free encyclopedia
Illustration of the Neoplatonic concept of the anima mundi emanating from The Absolute, in some ways a precursor to modern panpsychism

In philosophy of mind, panpsychism (/pænˈsaɪkɪzəm/) is the view that the mind or a mind-like aspect is a fundamental and ubiquitous feature of reality. It is also described as a theory that "the mind is a fundamental feature of the world which exists throughout the universe". It is one of the oldest philosophical theories, and has been ascribed in some form to philosophers including Thales, Plato, Spinoza, Leibniz, Schopenhauer, William James, Alfred North Whitehead, and Bertrand Russell. In the 19th century, panpsychism was the default philosophy of mind in Western thought, but it saw a decline in the mid-20th century with the rise of logical positivism. Renewed attention to the hard problem of consciousness, together with developments in neuroscience, psychology, and quantum mechanics, has revived interest in panpsychism in the 21st century, in part because it addresses the hard problem directly.

Overview

Etymology

The term panpsychism comes from the Greek pan (πᾶν: "all, everything, whole") and psyche (ψυχή: "soul, mind"). The use of "psyche" is controversial because it is synonymous with "soul", a term usually taken to refer to something supernatural; more common terms now found in the literature include mind, mental properties, mental aspect, and experience.

Concept

Panpsychism holds that mind or a mind-like aspect is a fundamental and ubiquitous feature of reality. It is sometimes defined as a theory in which "the mind is a fundamental feature of the world which exists throughout the universe". Panpsychists posit that the type of mentality we know through our own experience is present, in some form, in a wide range of natural bodies. This notion has taken on a wide variety of forms. Some historical and non-Western panpsychists ascribe attributes such as life or spirits to all entities (animism). Contemporary academic proponents, however, hold that sentience or subjective experience is ubiquitous, while distinguishing these qualities from more complex human mental attributes. They therefore ascribe a primitive form of mentality to entities at the fundamental level of physics but may not ascribe mentality to most aggregate things, such as rocks or buildings.

Terminology

The philosopher David Chalmers, who has explored panpsychism as a viable theory, distinguishes between microphenomenal experiences (the experiences of microphysical entities) and macrophenomenal experiences (the experiences of larger entities, such as humans).

Philip Goff draws a distinction between panexperientialism and pancognitivism. In the form of panpsychism under discussion in the contemporary literature, conscious experience is present everywhere at a fundamental level, hence the term panexperientialism. Pancognitivism, by contrast, is the view that thought is present everywhere at a fundamental level—a view that had some historical advocates, but no present-day academic adherents. Contemporary panpsychists do not believe microphysical entities have complex mental states such as beliefs, desires, and fears.

Originally, the term panexperientialism had a narrower meaning, having been coined by David Ray Griffin to refer specifically to the form of panpsychism used in process philosophy (see below).

History

Antiquity

Two iwakura – a rock where a kami or spirit is said to reside in the religion of Shinto

Panpsychist views are a staple in pre-Socratic Greek philosophy. According to Aristotle, Thales (c. 624 – 545 BCE), the first Greek philosopher, posited a theory which held "that everything is full of gods". Thales believed that magnets demonstrated this. This has been interpreted as a panpsychist doctrine. Other Greek thinkers associated with panpsychism include Anaxagoras (who saw the unifying principle or arche as nous or mind), Anaximenes (who saw the arche as pneuma or spirit) and Heraclitus (who said "The thinking faculty is common to all").

Plato argues for panpsychism in his Sophist, in which he writes that all things participate in the form of Being, and that Being must therefore have a psychic aspect of mind and soul (psyche). In the Philebus and Timaeus, Plato argues for the idea of a world soul or anima mundi. According to Plato:

This world is indeed a living being endowed with a soul and intelligence ... a single visible living entity containing all other living entities, which by their nature are all related.

Stoicism developed a cosmology that held that the natural world is infused with the divine fiery essence pneuma, directed by the universal intelligence logos. The relationship between beings' individual logos and the universal logos was a central concern of the Roman Stoic Marcus Aurelius. The metaphysics of Stoicism finds connections with Hellenistic philosophies such as Neoplatonism. Gnosticism also made use of the Platonic idea of anima mundi.

Renaissance

Illustration of the Cosmic order by Robert Fludd, where the World soul is depicted as a woman

After Emperor Justinian closed Plato's Academy in 529 CE, Neoplatonism declined. Though there were mediaeval theologians, such as John Scotus Eriugena, who ventured into what might be called panpsychism, it was not a dominant strain in philosophical theology. But in the Italian Renaissance, it enjoyed something of a revival in the thought of figures such as Gerolamo Cardano, Bernardino Telesio, Francesco Patrizi, Giordano Bruno, and Tommaso Campanella. Cardano argued for the view that soul or anima was a fundamental part of the world, and Patrizi introduced the term panpsychism into philosophical vocabulary. According to Bruno, "There is nothing that does not possess a soul and that has no vital principle". Platonist ideas resembling the anima mundi (world soul) also resurfaced in the work of esoteric thinkers such as Paracelsus, Robert Fludd, and Cornelius Agrippa.

Early modern

In the 17th century, two rationalists, Baruch Spinoza and Gottfried Leibniz, can be said to be panpsychists. In Spinoza's monism, the one single infinite and eternal substance is "God, or Nature" (Deus sive Natura), which has the aspects of mind (thought) and matter (extension). Leibniz's view is that there are infinitely many absolutely simple mental substances called monads that make up the universe's fundamental structure. While it has been said that George Berkeley's idealist philosophy is also a form of panpsychism, Berkeley rejected panpsychism and posited that the physical world exists only in the experiences minds have of it, while restricting minds to humans and certain other specific agents.

19th century

In the 19th century, panpsychism was at its zenith. Philosophers such as Arthur Schopenhauer, C. S. Peirce, Josiah Royce, William James, Eduard von Hartmann, F. C. S. Schiller, Ernst Haeckel, William Kingdon Clifford, and Thomas Carlyle, as well as psychologists such as Gustav Fechner, Wilhelm Wundt, and Rudolf Hermann Lotze, all promoted panpsychist ideas.

Arthur Schopenhauer argued for a two-sided view of reality as both Will and Representation (Vorstellung). According to Schopenhauer, "All ostensible mind can be attributed to matter, but all matter can likewise be attributed to mind".

Josiah Royce, the leading American absolute idealist, held that reality is a "world self", a conscious being that comprises everything, though he did not necessarily attribute mental properties to the smallest constituents of mentalistic "systems". The American pragmatist philosopher Charles Sanders Peirce espoused a sort of psycho-physical monism in which the universe is suffused with mind, which he associated with spontaneity and freedom. Following Peirce, William James also espoused a form of panpsychism. In his lecture notes, James wrote:

Our only intelligible notion of an object in itself is that it should be an object for itself, and this lands us in panpsychism and a belief that our physical perceptions are effects on us of 'psychical' realities

English philosopher Alfred Barratt, the author of Physical Metempiric (1883), has been described as advocating panpsychism.

In 1893, Paul Carus proposed a philosophy similar to panpsychism, "panbiotism", according to which "everything is fraught with life; it contains life; it has the ability to live".

20th century

Bertrand Russell's neutral monist views tended toward panpsychism. The physicist Arthur Eddington also defended a form of panpsychism. The psychologists Gerard Heymans, James Ward and Charles Augustus Strong also endorsed variants of panpsychism.

In 1990, the physicist David Bohm published "A new theory of the relationship of mind and matter," a paper based on his interpretation of quantum mechanics. The philosopher Paavo Pylkkänen has described Bohm's view as a version of panprotopsychism.

One widespread misconception is that Alfred North Whitehead, arguably the greatest systematic metaphysician of the 20th century, was also panpsychism's most significant 20th-century proponent. This misreading attributes to Whitehead an ontology according to which the basic nature of the world is made up of atomic mental events, termed "actual occasions". Rather than signifying such exotic metaphysical objects (which would in fact exemplify the very fallacy of misplaced concreteness that Whitehead criticizes), his concept of an "actual occasion" refers to the "immediate experienced occasion" of any possible perceiver, with Whitehead having in mind only himself as perceiver at the outset, in keeping with his strong commitment to radical empiricism.

Contemporary

Panpsychism has recently seen a resurgence in the philosophy of mind, set into motion by Thomas Nagel's 1979 article "Panpsychism" and further spurred by Galen Strawson's 2006 article "Realistic Monism: Why Physicalism Entails Panpsychism". Other recent proponents include American philosophers David Ray Griffin and David Skrbina, British philosophers Gregg Rosenberg, Timothy Sprigge, and Philip Goff, and Canadian philosopher William Seager. The British philosopher David Papineau, while distancing himself from orthodox panpsychists, has written that his view is "not unlike panpsychism" in that he rejects a line in nature between "events lit up by phenomenology [and] those that are mere darkness".

The integrated information theory of consciousness (IIT), proposed by the neuroscientist and psychiatrist Giulio Tononi in 2004 and since adopted by other neuroscientists such as Christof Koch, postulates that consciousness is widespread and can be found even in some simple systems.

In 2019, cognitive scientist Donald Hoffman published The Case Against Reality: How evolution hid the truth from our eyes. Hoffman argues that consensus reality lacks concrete existence, and is nothing more than an evolved user-interface. He argues that the true nature of reality is abstract "conscious agents". Science editor Annaka Harris argues that panpsychism is a viable theory in her 2019 book Conscious, though she stops short of fully endorsing it.

Panpsychism has been postulated by psychoanalyst Robin S. Brown as a means to theorizing relations between "inner" and "outer" tropes in the context of psychotherapy. Panpsychism has also been applied in environmental philosophy by Australian philosopher Freya Mathews, who has put forward the notion of ontopoetics as a version of panpsychism.

The geneticist Sewall Wright endorsed a version of panpsychism. He believed that consciousness is not a mysterious property emerging at a certain level of the hierarchy of increasing material complexity, but rather an inherent property, implying the most elementary particles have these properties.

Varieties

Panpsychism encompasses many theories, united only by the notion that mind in some form is ubiquitous.

Philosophical frameworks

Cosmopsychism

Cosmopsychism hypothesizes that the cosmos is a unified object that is ontologically prior to its parts. It has been described as an alternative to panpsychism, or as a form of panpsychism. Proponents of cosmopsychism claim that the cosmos as a whole is the fundamental level of reality and that it instantiates consciousness. They differ on that point from panpsychists, who usually claim that the smallest level of reality is fundamental and instantiates consciousness. Accordingly, human consciousness, for example, merely derives from a larger cosmic consciousness.

Panexperientialism

Panexperientialism is associated with the philosophies of, among others, Charles Hartshorne and Alfred North Whitehead, although the term itself was invented by David Ray Griffin to distinguish the process philosophical view from other varieties of panpsychism. Whitehead's process philosophy argues that the fundamental elements of the universe are "occasions of experience", which can together create something as complex as a human being. Building on Whitehead's work, process philosopher Michel Weber argues for a pancreativism. Goff has used the term panexperientialism more generally to refer to forms of panpsychism in which experience rather than thought is ubiquitous.

Panprotopsychism

Panprotopsychists believe that higher-order phenomenal properties (such as qualia) are logically entailed by protophenomenal properties, at least in principle. This is similar to how facts about H2O molecules logically entail facts about water: the lower-level facts are sufficient to explain the higher-order facts, since the former logically entail the latter. It also makes sense of questions about the unity of consciousness relating to the diversity of phenomenal experiences and the deflation of the self. Adherents of panprotopsychism believe that "protophenomenal" facts logically entail consciousness. Protophenomenal properties are usually picked out through a combination of functional and negative definitions: protophenomenal properties are those that logically entail phenomenal properties (a functional definition) while being themselves neither physical nor phenomenal (a negative definition).

Panprotopsychism is advertised as a solution to the combination problem: the problem of explaining how the consciousness of microscopic physical things might combine to give rise to the macroscopic consciousness of the whole brain. Because protophenomenal properties are by definition the constituent parts of consciousness, it is speculated that their existence would make the emergence of macroscopic minds less mysterious. The philosopher David Chalmers argues that the view faces difficulty with the combination problem. He considers it "ad hoc", and believes it diminishes the parsimony that made the theory initially interesting.

Russellian monism

Russellian monism is a type of neutral monism. The theory is attributed to Bertrand Russell, and may also be called Russell's panpsychism, or Russell's neutral monism. Russell believed that all causal properties are extrinsic manifestations of identical intrinsic properties. Russell called these identical intrinsic properties quiddities. Just as the extrinsic properties of matter can form higher-order structure, so can their corresponding and identical quiddities. Russell believed the conscious mind was one such structure.

Religious or mystical ontologies

Advaita Vedānta

Advaita Vedānta is a form of idealism in Indian philosophy which views consensus reality as illusory. Anand Vaidya and Purushottama Bilimoria have argued that it can be considered a form of panpsychism or cosmopsychism.

Animism and hylozoism

Animism maintains that all things have a soul, and hylozoism maintains that all things are alive. Both could reasonably be interpreted as panpsychist, but both have fallen out of favour in contemporary academia. Modern panpsychists have tried to distance themselves from theories of this sort, careful to carve out the distinction between the ubiquity of experience and the ubiquity of mind and cognition.

Panpsychism and metempsychosis

Between 1840 and 1864, the Austrian mystic Jakob Lorber claimed to have received a 26-volume revelation. Various books of the Lorber Revelations say that specifica, closely resembling Leibniz's monads, form the most basic, irreducible substance of all physical and metaphysical creation. According to the Lorber Revelations, specifica grow in complexity and intelligence to form ever higher level clusters of intelligence until a fully intelligent human soul is reached. In this scenario panpsychism and metempsychosis are used to overcome the combination problem.

Buddha-nature

In the art of the Japanese rock garden, the artist must be aware of the "ishigokoro" ('heart', or 'mind') of the rocks.

Buddha-nature is an important and multifaceted doctrine in Mahayana Buddhism that is related to the capacity to attain Buddhahood. In numerous Indian sources, the idea is connected to the mind, especially the Buddhist concept of the luminous mind. In some Buddhist traditions, the Buddha-nature doctrine may be interpreted as implying a form of panpsychism. Graham Parkes argues that most "traditional Chinese, Japanese and Korean philosophy would qualify as panpsychist in nature".

The Huayan, Tiantai, and Tendai schools of Buddhism explicitly attribute Buddha-nature to inanimate objects such as lotus flowers and mountains. This idea was defended by figures such as the Tiantai patriarch Zhanran, who spoke of the Buddha-nature of grasses and trees. Similarly, Soto Zen master Dogen argued that "insentient beings expound" the teachings of the Buddha, and wrote about the "mind" (心, shin) of "fences, walls, tiles, and pebbles". The 9th-century Shingon figure Kukai went so far as to argue that natural objects such as rocks and stones are part of the supreme embodiment of the Buddha. According to Parkes, Buddha-nature is best described "in western terms" as something "psychophysical".

Scientific theories

Conscious realism

It is a natural and near-universal assumption that the world has the properties and causal structures that we perceive it to have; to paraphrase Einstein's famous remark, we naturally assume that the moon is there whether anyone looks or not. Both theoretical and empirical considerations, however, increasingly indicate that this is not correct.

— Donald Hoffman, Conscious agent networks: Formal analysis and applications to cognition

Conscious realism is a theory proposed by Donald Hoffman, a cognitive scientist specialising in perception. He has written numerous papers on the topic, which he summarised in his 2019 book The Case Against Reality: How evolution hid the truth from our eyes. Conscious realism builds upon Hoffman's earlier User-Interface Theory. In combination they argue that (1) consensus reality and spacetime are illusory, being merely a "species-specific evolved user interface"; and (2) reality is made of a complex, dimensionless, and timeless network of "conscious agents".

The consensus view is that perception is a reconstruction of one's environment. Hoffman views perception as a construction rather than a reconstruction. He argues that perceptual systems are analogous to information channels, and thus subject to data compression and reconstruction. The set of possible reconstructions for any given data set is quite large. Of that set, the subset that is homomorphic in relation to the original is minuscule, and does not necessarily—or, seemingly, even often—overlap with the subset that is efficient or easiest to use.

For example, consider a graph, such as a pie chart. A pie chart is easy to understand and use not because it is perfectly homomorphic with the data it represents, but because it is not. If a graph of, for example, the chemical composition of the human body were to look exactly like a human body, then we could not understand it. It is only because the graph abstracts away from the structure of its subject matter that it can be visualized. Alternatively, consider a graphical user interface on a computer. The reason graphical user interfaces are useful is that they abstract away from lower-level computational processes, such as machine code, or the physical state of a circuit-board. In general, it seems that data is most useful to us when it is abstracted from its original structure and repackaged in a way that is easier to understand, even if this comes at the cost of accuracy. Hoffman offers the "fitness beats truth theorem" as mathematical proof that perceptions of reality bear little resemblance to reality's true nature. From this he concludes that our senses do not faithfully represent the external world.

Even if reality is an illusion, Hoffman takes consciousness as an indisputable fact. He represents rudimentary units of consciousness (which he calls "conscious agents") as Markovian kernels. Though the theory was not initially panpsychist, he reports that he and his colleague Chetan Prakash found the math to be more parsimonious if it were. They hypothesize that reality is composed of these conscious agents, who interact to form "larger, more complex" networks.
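A Markovian kernel over a finite state space is simply a row-stochastic matrix, and such kernels compose into kernels, which is the formal property that allows simple agents to combine into larger networks. The sketch below illustrates this mathematical notion only; it is not Hoffman's actual conscious-agent model, and the particular matrices are made up for illustration.

```python
import numpy as np

# A Markovian kernel over a finite state space is a row-stochastic matrix:
# entry K[i, j] is the probability of transitioning from state i to state j.
K1 = np.array([[0.9, 0.1],
               [0.2, 0.8]])
K2 = np.array([[0.5, 0.5],
               [0.3, 0.7]])

# Kernels compose by matrix multiplication (the Chapman-Kolmogorov relation):
# applying K1 and then K2 is itself a Markovian kernel.
K = K1 @ K2

# Each row of the composite still sums to 1, so composing two kernels
# yields another valid kernel.
assert np.allclose(K.sum(axis=1), 1.0)
print(K)
```

Because composition preserves the kernel property, networks of such agents can themselves be treated as agents, which is the closure property Hoffman's framework relies on.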

Axioms and postulates of integrated information theory

Integrated information theory

Giulio Tononi first articulated integrated information theory (IIT) in 2004, and it has undergone two major revisions since then. Tononi approaches consciousness from a scientific perspective, and has expressed frustration with philosophical theories of consciousness for lacking predictive power. Though such notions are integral to his theory, he refrains from philosophical terminology such as qualia or the unity of consciousness, instead opting for mathematically precise alternatives such as the entropy function and information integration. This has allowed Tononi to define a measure of integrated information, which he calls phi (Φ). Because he holds that consciousness is nothing but integrated information, Φ measures consciousness. As it turns out, even basic objects or substances have a nonzero degree of Φ. This would mean that consciousness is ubiquitous, albeit to a minimal degree.
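The full Φ of IIT is computationally involved; as a loose illustration of what "integration" means here, the toy sketch below computes total correlation, a simple information-theoretic quantity (not Tononi's Φ) that is zero exactly when a system's parts are statistically independent. The example data are invented for illustration.

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy in bits of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * np.log2(c / n) for c in counts.values())

# Two binary units where unit B always copies unit A: the joint state
# carries less information than the parts considered separately.
states = [(0, 0), (1, 1), (0, 0), (1, 1)]

h_a = entropy([s[0] for s in states])   # 1 bit
h_b = entropy([s[1] for s in states])   # 1 bit
h_ab = entropy(states)                  # 1 bit: B is redundant given A

# Total correlation: how much less uncertain the whole is than the sum
# of its parts. It vanishes iff the units are independent.
total_correlation = h_a + h_b - h_ab
print(total_correlation)  # 1.0
```

A system of independent units would score zero on this measure, which conveys the intuition behind Φ: consciousness, on IIT, tracks information that the system holds over and above its parts.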

The philosopher Hedda Hassel Mørch views IIT as similar to Russellian monism, while other philosophers, such as Chalmers and John Searle, consider it a form of panpsychism. IIT does not hold that all systems are conscious, leading Tononi and Koch to state that IIT incorporates some elements of panpsychism but not others. Koch has called IIT a "scientifically refined version" of panpsychism.

In relation to other theories

A diagram depicting four positions on the mind-body problem. Versions of panpsychism have been likened to each of these positions as well as contrasted to them.

Because panpsychism encompasses a wide range of theories, it can in principle be compatible with reductive materialism, dualism, functionalism, or other perspectives depending on the details of a given formulation.

Dualism

David Chalmers and Philip Goff have each described panpsychism as an alternative to both materialism and dualism. Chalmers says panpsychism respects the conclusions of both the causal argument against dualism and the conceivability argument for dualism. Goff has argued that panpsychism avoids the disunity of dualism, under which mind and matter are ontologically separate, as well as dualism's problems explaining how mind and matter interact. By contrast, Uwe Meixner argues that panpsychism has dualist forms, which he contrasts to idealist forms.

Emergentism

Panpsychism is incompatible with emergentism. In general, theories of consciousness fall under one or the other umbrella; they hold either that consciousness is present at a fundamental level of reality (panpsychism) or that it emerges higher up (emergentism).

Idealism

There is disagreement over whether idealism is a form of panpsychism or a separate view. Both views hold that everything that exists has some form of experience. According to the philosophers William Seager and Sean Allen-Hermanson, "idealists are panpsychists by default". Charles Hartshorne contrasted panpsychism and idealism, saying that while idealists rejected the existence of the world observed with the senses or understood it as ideas within the mind of God, panpsychists accepted the reality of the world but saw it as composed of minds. Chalmers also contrasts panpsychism with idealism (as well as materialism and dualism). Meixner writes that formulations of panpsychism can be divided into dualist and idealist versions. He further divides the latter into "atomistic idealistic panpsychism", which he ascribes to David Hume, and "holistic idealistic panpsychism", which he favors.

Neutral monism

Neutral monism rejects the dichotomy of mind and matter, instead taking a third substance as fundamental that is neither mental nor physical. Proposals for the nature of the third substance have varied, with some theorists choosing to leave it undefined. This has led to a variety of formulations of neutral monism, which may overlap with other philosophies. In versions of neutral monism in which the world's fundamental constituents are neither mental nor physical, it is quite distinct from panpsychism. In versions where the fundamental constituents are both mental and physical, neutral monism may lead to panpsychism, panprotopsychism, or dual aspect theory.

In The Conscious Mind, David Chalmers writes that, in some instances, the differences between "Russell's neutral monism" and his property dualism are merely semantic. Philip Goff believes that neutral monism can reasonably be regarded as a form of panpsychism "in so far as it is a dual aspect view". Neutral monism, panpsychism, and dual aspect theory are grouped together or used interchangeably in some contexts.

Physicalism and materialism

Chalmers calls panpsychism an alternative to both materialism and dualism. Similarly, Goff calls panpsychism an alternative to both physicalism and substance dualism. Strawson, on the other hand, describes panpsychism as a form of physicalism, in his view the only viable form. Panpsychism can be combined with reductive materialism but cannot be combined with eliminative materialism because the latter denies the existence of the relevant mental attributes.

Arguments for

Hard problem of consciousness

But what consciousness is, we know not; and how it is that anything so remarkable as a state of consciousness comes about as the result of irritating nervous tissue, is just as unaccountable as the appearance of the Djin when Aladdin rubbed his lamp in the story, or as any other ultimate fact of nature.

— Thomas Henry Huxley (1896)

It evidently feels like something to be a human brain. This means that when things in the world are organised in a particular way, they begin to have an experience. The questions of why and how this material structure has experience, and why it has that particular experience rather than another experience, are known as the hard problem of consciousness. The term is attributed to Chalmers. He argues that even after "all the perceptual and cognitive functions within the vicinity of consciousness" are accounted for, "there may still remain a further unanswered question: Why is the performance of these functions accompanied by experience?"

Though Chalmers gave the hard problem of consciousness its present name, similar views were expressed before. Isaac Newton, John Locke, Gottfried Leibniz, John Stuart Mill, Thomas Henry Huxley, and Wilhelm Wundt all wrote about the seeming incompatibility of third-person functional descriptions of mind and matter and first-person conscious experience. Likewise, Asian philosophers like Dharmakirti and Guifeng Zongmi discussed the problem of how consciousness arises from unconscious matter. Similar sentiments have been articulated through philosophical inquiries such as the problem of other minds, solipsism, the explanatory gap, philosophical zombies, and Mary's room. These problems have caused Chalmers to consider panpsychism a viable solution to the hard problem, though he is not committed to any single view.

Brian Jonathan Garrett has compared the hard problem to vitalism, the now discredited hypothesis that life is inexplicable and can only be understood if some vital life force exists. He maintains that, given time, consciousness and its evolutionary origins will be understood just as life is now understood. Daniel Dennett called the hard problem a "hunch", and maintained that conscious experience, as it is usually understood, is merely a complex cognitive illusion. Patricia Churchland, also an eliminative materialist, maintains that philosophers ought to be more patient: neuroscience is still in its early stages, so Chalmers's hard problem is premature. Clarity will come from learning more about the brain, not from metaphysical speculation.

Solutions

In The Conscious Mind (1996), Chalmers attempts to pinpoint why the hard problem is so hard. He concludes that consciousness is irreducible to lower-level physical facts, just as the fundamental laws of physics are irreducible to lower-level physical facts. Therefore, consciousness should be taken as fundamental in its own right and studied as such. Just as fundamental properties of reality are ubiquitous (even small objects have mass), consciousness may also be, though he considers that an open question.

In Mortal Questions (1979), Thomas Nagel argues that panpsychism follows from four premises:

  • P1: There is no spiritual plane or disembodied soul; everything that exists is material.
  • P2: Consciousness is irreducible to lower-level physical properties.
  • P3: Consciousness exists.
  • P4: Higher-order properties of matter (i.e., emergent properties) can, at least in principle, be reduced to their lower-level properties.

Before the first premise is accepted, the range of possible explanations for consciousness is fully open. Each premise, if accepted, narrows down that range of possibilities. If the argument is sound, then by the last premise panpsychism is the only possibility left.

  • If (P1) is true, then either consciousness does not exist, or it exists within the physical world.
  • If (P2) is true, then either consciousness does not exist, or it (a) exists as a distinct property of matter or (b) is fundamentally entailed by matter.
  • If (P3) is true, then consciousness exists, and is either (a) its own property of matter or (b) composed by the matter of the brain but not logically entailed by it.
  • If (P4) is true, then (b) is false, and consciousness must be its own unique property of matter.

Therefore, if all four premises are true, consciousness is its own unique property of matter and panpsychism is true.
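Nagel's elimination argument can be sketched as a filter over candidate positions. The position names and feature tags below are schematic labels chosen for illustration, not Nagel's own terms:

```python
# Candidate explanations of consciousness, tagged with the features
# that the four premises test for (labels are illustrative only).
positions = {
    "dualism":               {"material_only": False, "reducible": False, "conscious_exists": True,  "fundamental": True},
    "eliminativism":         {"material_only": True,  "reducible": True,  "conscious_exists": False, "fundamental": False},
    "reductive_physicalism": {"material_only": True,  "reducible": True,  "conscious_exists": True,  "fundamental": False},
    "strong_emergentism":    {"material_only": True,  "reducible": False, "conscious_exists": True,  "fundamental": False},
    "panpsychism":           {"material_only": True,  "reducible": False, "conscious_exists": True,  "fundamental": True},
}

def survives(p):
    return (p["material_only"]           # P1: everything is material
            and not p["reducible"]       # P2: consciousness is irreducible
            and p["conscious_exists"]    # P3: consciousness exists
            and p["fundamental"])        # P4: reducible higher-order properties
                                         # aside, what is irreducible must be
                                         # fundamental to matter itself

remaining = [name for name, p in positions.items() if survives(p)]
print(remaining)  # ['panpsychism']
```

Each premise eliminates the positions whose corresponding feature fails, mirroring how the argument narrows the space of explanations until only panpsychism remains.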

Mind-body problem

Dualism makes the problem insoluble; materialism denies the existence of any phenomenon to study, and hence of any problem.

— John R. Searle, Consciousness and Language, p. 47

In 2015, Chalmers proposed a possible solution to the mind-body problem through the argumentative format of thesis, antithesis, and synthesis. The goal of such arguments is to argue for sides of a debate (the thesis and antithesis), weigh their vices and merits, and then reconcile them (the synthesis). Chalmers's thesis, antithesis, and synthesis are as follows:

  1. Thesis: materialism is true; everything is fundamentally physical.
  2. Antithesis: dualism is true; not everything is fundamentally physical.
  3. Synthesis: panpsychism is true.

(1) A centerpiece of Chalmers's argument is the physical world's causal closure. Newton's third law of motion expresses this succinctly: for every action there is an equal and opposite reaction; cause and effect form a symmetrical process. There is no room for consciousness to exert any causal power on the physical world unless it is itself physical.

(2) On one hand, if consciousness is separate from the physical world then there is no room for it to exert any causal power on the world (a state of affairs philosophers call epiphenomenalism). If consciousness plays no causal role, then it is unclear how Chalmers could even write this paper. On the other hand, consciousness is irreducible to the physical processes of the brain.

(3) Panpsychism has all the benefits of materialism because it could mean that consciousness is physical while also escaping the grasp of epiphenomenalism. After some argumentation Chalmers narrows it down further to Russellian monism, concluding that thoughts, actions, intentions and emotions may just be the quiddities of neurotransmitters, neurons, and glial cells.

Problem of substance

Physics is mathematical, not because we know so much about the physical world, but because we know so little: it is only its mathematical properties that we can discover. For the rest our knowledge is negative.

— Bertrand Russell, An Outline of Philosophy (1927)

Rather than solely trying to solve the problem of consciousness, Russell also attempted to solve the problem of substance, which is arguably a form of the problem of infinite regress.

(1) Like many sciences, physics describes the world through mathematics. Unlike other sciences, physics cannot describe what Schopenhauer called the "object that grounds" mathematics. Economics is grounded in resources being allocated, and population dynamics is grounded in individual people within that population. The objects that ground physics, however, can be described only through more mathematics. In Russell's words, physics describes "certain equations giving abstract properties of their changes". When it comes to describing "what it is that changes, and what it changes from and to—as to this, physics is silent". In other words, physics describes matter's extrinsic properties, but not the intrinsic properties that ground them.

(2) Russell argued that physics is mathematical because "it is only mathematical properties we can discover". This is true almost by definition: if only extrinsic properties are outwardly observable, then they will be the only ones discovered. This led Alfred North Whitehead to conclude that intrinsic properties are "intrinsically unknowable".

(3) Consciousness has many similarities to these intrinsic properties of physics. It, too, cannot be directly observed from an outside perspective. And it, too, seems to ground many observable extrinsic properties: presumably, music is enjoyable because of the experience of listening to it, and chronic pain is avoided because of the experience of pain, etc. Russell concluded that consciousness must be related to these intrinsic properties of matter. He called these intrinsic properties quiddities. Just as extrinsic physical properties can create structures, so can their corresponding and identical quiddities. The conscious mind, Russell argued, is one such structure.

Proponents of panpsychism who use this line of reasoning include Chalmers, Annaka Harris, and Galen Strawson. Chalmers has argued that the extrinsic properties of physics must have corresponding intrinsic properties; otherwise the universe would be "a giant causal flux" with nothing for "causation to relate", which he deems a logical impossibility. He sees consciousness as a promising candidate for that role. Galen Strawson calls Russell's panpsychism "realistic physicalism". He argues that "the experiential considered specifically as such" is what it means for something to be physical. Just as mass is energy, Strawson believes that consciousness "just is" matter.

Max Tegmark, theoretical physicist and creator of the mathematical universe hypothesis, disagrees with these conclusions. By his account, the universe is not just describable by math but is math; comparing physics to economics or population dynamics is a disanalogy. While population dynamics may be grounded in individual people, those people are grounded in "purely mathematical objects" such as energy and charge. The universe is, in a fundamental sense, made of nothing.

Quantum mechanics

In a 2018 interview, Chalmers called quantum mechanics "a magnet for anyone who wants to find room for crazy properties of the mind", but not entirely without warrant. The relationship between observation (and, by extension, consciousness) and the collapse of the wave function is known as the measurement problem. Atoms, photons, and other quantum particles appear to remain in quantum superposition (which is to say, in many seemingly contradictory states or locations simultaneously) until measured in some way; this transition is known as wave-function collapse. According to the Copenhagen interpretation of quantum mechanics, one of the oldest and the most widely taught interpretations, it is the act of observation that collapses the wave function.

Erwin Schrödinger famously articulated the Copenhagen interpretation's unusual implications in the thought experiment now known as Schrödinger's cat. He imagines a box containing a cat, a flask of poison, radioactive material, and a Geiger counter. The apparatus is configured so that when the Geiger counter detects radioactive decay, the flask shatters and poisons the cat; unless and until the counter detects the decay of a single atom, the cat survives. Each decay the Geiger counter might detect is a quantum event, corresponding to a quantum state transition of a single atom of the radioactive material. According to Schrödinger's wave equation, until it is observed each unmeasured atom of the radioactive material is in a superposition of decayed and not decayed. While the box remains sealed and its contents unobserved, then, the Geiger counter is also in a superposition of decay detected and no decay detected, the flask in a superposition of shattered and not shattered, and the cat in a superposition of dead and alive.

When the box is unsealed, however, the observer finds a cat that is either dead or alive; there is no superposition of states. And since the cat is no longer in a superposition of states, neither is the radioactive atom (nor the flask or the Geiger counter). The wave function that described the atom and its superposition of states is said to have "collapsed": the atom now has only a single state, corresponding to the cat's observed state. But until an observer opens the box and thereby causes the collapse, the cat is both dead and alive. This has raised questions about, in John S. Bell's words, "where the observer begins and ends".
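The pre-measurement state of the combined system can be written compactly. This is an illustrative sketch, not from the source: it assumes equal amplitudes for the two branches, whereas the actual amplitudes would depend on the decay probability over the elapsed time.

```latex
% Pre-measurement state of the atom--cat system (illustrative, equal amplitudes)
|\Psi\rangle \;=\; \tfrac{1}{\sqrt{2}}\Big(\, |\text{decayed}\rangle\,|\text{dead}\rangle
\;+\; |\text{not decayed}\rangle\,|\text{alive}\rangle \,\Big)
```

On the Copenhagen view, observation collapses \(|\Psi\rangle\) to one of the two product states, each here with Born-rule probability \(|1/\sqrt{2}|^2 = 1/2\); on the many-worlds view, both branches persist.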

The measurement problem has largely been characterised as a clash between classical physics and quantum mechanics. Bohm argued that it is rather a clash among classical physics, quantum mechanics, and phenomenology; all three levels of description seem difficult to reconcile, or even contradictory. Though not referring specifically to quantum mechanics, Chalmers has written that if a theory of everything is ever discovered, it will be a set of "psychophysical laws" rather than simply a set of physical laws. With Chalmers as their inspiration, Bohm and Pylkkänen set out to do just that in their panprotopsychism. Chalmers himself is critical of the Copenhagen interpretation and of most quantum theories of consciousness; he has dubbed the temptation to pair the mystery of consciousness with the mystery of quantum mechanics "the Law of the Minimisation of Mystery".

According to the Copenhagen interpretation of quantum mechanics, Schrödinger's cat is both dead and alive until observed or measured in some way.

The many-worlds interpretation of quantum mechanics does not take observation as central to the wave-function collapse, because it denies that the collapse happens. On the many-worlds interpretation, just as the cat is both dead and alive, the observer both sees a dead cat and sees a living cat. Even though observation does not play a central role in this case, questions about observation are still relevant to the discussion. In Roger Penrose's words:

I do not see why a conscious being need be aware of only "one" of the alternatives in a linear superposition. What is it about consciousnesses that says that consciousness must not be "aware" of that tantalising linear combination of both a dead and a live cat? It seems to me that a theory of consciousness would be needed for one to square the many world view with what one actually observes.

Chalmers believes that the tentative variant of panpsychism outlined in The Conscious Mind (1996) does just that. Leaning toward the many-worlds interpretation due to its mathematical parsimony, he believes his variety of panpsychist property dualism may be the theory Penrose is seeking. Chalmers believes that information will play an integral role in any theory of consciousness because the mind and brain have corresponding informational structures. He considers the computational nature of physics further evidence of information's central role, and suggests that information that is physically realised is simultaneously phenomenally realised; both regularities in nature and conscious experience are expressions of information's underlying character. The theory implies panpsychism, and also solves the problem Penrose poses. On Chalmers's formulation, information in any given position is phenomenally realised, whereas the informational state of the superposition as a whole is not. Panpsychist interpretations of quantum mechanics have been put forward by such philosophers as Whitehead, Shan Gao, Michael Lockwood, and Hoffman, who is a cognitive scientist. Protopanpsychist interpretations have been put forward by Bohm and Pylkkänen.

Tegmark has formally calculated the "decoherence rates" of neurons, finding that the brain is a "classical rather than a quantum system" and that quantum mechanics does not relate "to consciousness in any fundamental way". Hagan et al. criticize Tegmark's estimate and present a revised calculation that yields a range of decoherence rates within the realm of physiological relevance.

In 2007, Steven Pinker criticized explanations of consciousness invoking quantum physics, saying: "to my ear, this amounts to the feeling that quantum mechanics sure is weird, and consciousness sure is weird, so maybe quantum mechanics can explain consciousness"; a view echoed by physicist Stephen Hawking. In 2017, Penrose rejected these characterizations, stating that disagreements are about the nature of quantum mechanics.

Arguments against

Theoretical issues

One criticism of panpsychism is that it cannot be empirically tested. A corollary of this criticism is that panpsychism has no predictive power. Tononi and Koch write: "Besides claiming that matter and mind are one thing, [panpsychism] has little constructive to say and offers no positive laws explaining how the mind is organized and works".

John Searle has alleged that panpsychism's unfalsifiability goes deeper than run-of-the-mill untestability: it is unfalsifiable because "It does not get up to the level of being false. It is strictly speaking meaningless because no clear notion has been given to the claim". The need for coherence and clarification is accepted by David Skrbina, a proponent of panpsychism.

Many proponents of panpsychism base their arguments not on empirical support but on panpsychism's theoretical virtues. Chalmers says that while no direct evidence exists for the theory, neither is there direct evidence against it, and that "there are indirect reasons, of a broadly theoretical character, for taking the view seriously". Notwithstanding Tononi and Koch's criticism of panpsychism, they state that it integrates consciousness into the physical world in a way that is "elegantly unitary".

A related criticism is what seems to many to be the theory's bizarre nature. Goff dismisses this objection: though he admits that panpsychism is counterintuitive, he argues that Einstein's and Darwin's theories are also counterintuitive. "At the end of the day," he writes, "you should judge a view not for its cultural associations but by its explanatory power".

Problem of mental causation

Philosophers such as Chalmers have argued that theories of consciousness should be capable of providing insight into the brain and mind to avoid the problem of mental causation. If they fail to do that, the theory will succumb to epiphenomenalism, a view commonly criticised as implausible or even self-contradictory. Proponents of panpsychism (especially those with neutral monist tendencies) hope to bypass this problem by dismissing it as a false dichotomy; mind and matter are two sides of the same coin, and mental causation is merely the extrinsic description of intrinsic properties of mind. Robert Howell has argued that all causal functions are still accounted for dispositionally (i.e., in terms of the behaviors described by science), leaving phenomenality causally inert. He concludes, "This leaves us once again with epiphenomenal qualia, only in a very surprising place". Neutral monists reject such dichotomous views of mind-body interaction.

Combination problem

The combination problem (which is related to the binding problem) can be traced to William James, but was given its present name by William Seager in 1995. The problem arises from the tension between the seemingly irreducible nature of consciousness and its ubiquity. If consciousness is ubiquitous, then in panpsychism, every atom (or every bit, depending on the version of panpsychism) has a minimal level of it. How then, as Keith Frankish puts it, do these "tiny consciousnesses combine" to create larger conscious experiences such as "the twinge of pain" he feels in his knee? This objection has garnered significant attention, and many have attempted to answer it. None of the proposed answers has gained widespread acceptance.

Concepts related to this problem include the classical sorites paradox (aggregates and organic wholes), mereology (the philosophical study of parts and wholes), Gestalt psychology, and Leibniz's concept of the vinculum substantiale.

Philosophy of science

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Philosophy_of_science

Philosophy of science is the branch of philosophy concerned with the foundations, methods, and implications of science. Amongst its central questions are the difference between science and non-science, the reliability of scientific theories, and the ultimate purpose and meaning of science as a human endeavour. Philosophy of science focuses on metaphysical, epistemic and semantic aspects of scientific practice, and overlaps with metaphysics, ontology, logic, and epistemology, for example, when it explores the relationship between science and the concept of truth. Philosophy of science is both a theoretical and empirical discipline, relying on philosophical theorising as well as meta-studies of scientific practice. Ethical issues such as bioethics and scientific misconduct are often considered ethics or science studies rather than the philosophy of science.

Many of the central problems concerned with the philosophy of science lack contemporary consensus, including whether science can infer truth about unobservable entities and whether inductive reasoning can be justified as yielding definite scientific knowledge. Philosophers of science also consider philosophical problems within particular sciences (such as biology, physics and social sciences such as economics and psychology). Some philosophers of science also use contemporary results in science to reach conclusions about philosophy itself.

While philosophical thought pertaining to science dates back at least to the time of Aristotle, the general philosophy of science emerged as a distinct discipline only in the 20th century following the logical positivist movement, which aimed to formulate criteria for ensuring all philosophical statements' meaningfulness and objectively assessing them. Karl Popper criticized logical positivism and helped establish a modern set of standards for scientific methodology. Thomas Kuhn's 1962 book The Structure of Scientific Revolutions was also formative, challenging the view of scientific progress as the steady, cumulative acquisition of knowledge based on a fixed method of systematic experimentation and instead arguing that any progress is relative to a "paradigm", the set of questions, concepts, and practices that define a scientific discipline in a particular historical period.

Subsequently, the coherentist approach to science, in which a theory is validated if it makes sense of observations as part of a coherent whole, became prominent due to W. V. Quine and others. Some thinkers such as Stephen Jay Gould seek to ground science in axiomatic assumptions, such as the uniformity of nature. A vocal minority of philosophers, Paul Feyerabend in particular, argue that there is no such thing as the "scientific method", and that all approaches to science should therefore be allowed, including explicitly supernatural ones. Another approach to thinking about science involves studying how knowledge is created from a sociological perspective, an approach represented by scholars like David Bloor and Barry Barnes. Finally, a tradition in continental philosophy approaches science from the perspective of a rigorous analysis of human experience.

Philosophies of the particular sciences range from questions about the nature of time raised by Einstein's general relativity, to the implications of economics for public policy. A central theme is whether the terms of one scientific theory can be intra- or intertheoretically reduced to the terms of another. Can chemistry be reduced to physics, or can sociology be reduced to individual psychology? The general questions of philosophy of science also arise with greater specificity in some particular sciences. For instance, the question of the validity of scientific reasoning is seen in a different guise in the foundations of statistics. The question of what counts as science and what should be excluded arises as a life-or-death matter in the philosophy of medicine. Additionally, the philosophies of biology, psychology, and the social sciences explore whether the scientific studies of human nature can achieve objectivity or are inevitably shaped by values and by social relations.

Introduction

Defining science

In formulating 'the problem of induction', David Hume devised one of the most pervasive puzzles in the philosophy of science.
Karl Popper in the 1980s. Popper is credited with formulating 'the demarcation problem', which considers the question of how we distinguish between science and pseudoscience.

Distinguishing between science and non-science is referred to as the demarcation problem. For example, should psychoanalysis, creation science, and historical materialism be considered pseudosciences? Karl Popper called this the central question in the philosophy of science. However, no unified account of the problem has won acceptance among philosophers, and some regard the problem as unsolvable or uninteresting. Martin Gardner has argued for the use of a Potter Stewart standard ("I know it when I see it") for recognizing pseudoscience.

Early attempts by the logical positivists grounded science in observation while non-science was non-observational and hence meaningless. Popper argued that the central property of science is falsifiability. That is, every genuinely scientific claim is capable of being proven false, at least in principle.

An area of study or speculation that masquerades as science in an attempt to claim a legitimacy that it would not otherwise be able to achieve is referred to as pseudoscience, fringe science, or junk science. Physicist Richard Feynman coined the term "cargo cult science" for cases in which researchers believe they are doing science because their activities have the outward appearance of it but actually lack the "kind of utter honesty" that allows their results to be rigorously evaluated.

Scientific explanation

A closely related question is what counts as a good scientific explanation. In addition to providing predictions about future events, society often takes scientific theories to provide explanations for events that occur regularly or have already occurred. Philosophers have investigated the criteria by which a scientific theory can be said to have successfully explained a phenomenon, as well as what it means to say a scientific theory has explanatory power.

One early and influential account of scientific explanation is the deductive-nomological model. It says that a successful scientific explanation must deduce the occurrence of the phenomena in question from a scientific law. This view has been subjected to substantial criticism, resulting in several widely acknowledged counterexamples to the theory. It is especially challenging to characterize what is meant by an explanation when the thing to be explained cannot be deduced from any law because it is a matter of chance, or otherwise cannot be perfectly predicted from what is known. Wesley Salmon developed a model in which a good scientific explanation must be statistically relevant to the outcome to be explained. Others have argued that the key to a good explanation is unifying disparate phenomena or providing a causal mechanism.

Justifying science

Although it is often taken for granted, it is not at all clear how one can infer the validity of a general statement from a number of specific instances or infer the truth of a theory from a series of successful tests. For example, a chicken observes that each morning the farmer comes and gives it food, for hundreds of days in a row. The chicken may therefore use inductive reasoning to infer that the farmer will bring food every morning. However, one morning, the farmer comes and kills the chicken. How is scientific reasoning more trustworthy than the chicken's reasoning?

One approach is to acknowledge that induction cannot achieve certainty, but observing more instances of a general statement can at least make the general statement more probable. So the chicken would be right to conclude from all those mornings that it is likely the farmer will come with food again the next morning, even if it cannot be certain. However, there remain difficult questions about the process of interpreting any given evidence into a probability that the general statement is true. One way out of these particular difficulties is to declare that all beliefs about scientific theories are subjective, or personal, and correct reasoning is merely about how evidence should change one's subjective beliefs over time.
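The probabilistic response sketched above has a classical closed form, Laplace's rule of succession: starting from a uniform prior over the unknown propensity of a repeated event, n successes in n trials give a posterior predictive probability of (n+1)/(n+2) for the next trial. A minimal sketch applying it to the chicken example (the numbers are illustrative, not from the source):

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Laplace's rule of succession: posterior predictive probability
    that the next trial succeeds, given `successes` out of `trials`,
    under a uniform (Beta(1,1)) prior over the success propensity."""
    return Fraction(successes + 1, trials + 2)

# The chicken has been fed every morning for 100 days straight.
p_fed_tomorrow = rule_of_succession(100, 100)
print(p_fed_tomorrow)  # 101/102 -- high, but never 1
```

The probability approaches but never reaches certainty, and the model itself (treating mornings as exchangeable trials of a stable process) is exactly the assumption the evidence cannot certify, which is Hume's point.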

Some argue that what scientists do is not inductive reasoning at all but rather abductive reasoning, or inference to the best explanation. In this account, science is not about generalizing specific instances but rather about hypothesizing explanations for what is observed. As discussed in the previous section, it is not always clear what is meant by the "best explanation". Occam's razor, which counsels choosing the simplest available explanation, thus plays an important role in some versions of this approach. To return to the example of the chicken, would it be simpler to suppose that the farmer cares about it and will continue taking care of it indefinitely or that the farmer is fattening it up for slaughter? Philosophers have tried to make this heuristic principle more precise regarding theoretical parsimony or other measures. Yet, although various measures of simplicity have been brought forward as potential candidates, it is generally accepted that there is no such thing as a theory-independent measure of simplicity. In other words, there appear to be as many different measures of simplicity as there are theories themselves, and the task of choosing between measures of simplicity appears to be every bit as problematic as the job of choosing between theories. Nicholas Maxwell has argued for some decades that unity rather than simplicity is the key non-empirical factor in influencing the choice of theory in science, persistent preference for unified theories in effect committing science to the acceptance of a metaphysical thesis concerning unity in nature. In order to improve this problematic thesis, it needs to be represented in the form of a hierarchy of theses, each thesis becoming more insubstantial as one goes up the hierarchy.

Observation inseparable from theory

Seen through a telescope, the Einstein cross seems to provide evidence for five different objects, but this observation is theory-laden. If we assume the theory of general relativity, the image only provides evidence for two objects.

When making observations, scientists look through telescopes, study images on electronic screens, record meter readings, and so on. Generally, on a basic level, they can agree on what they see, e.g., the thermometer shows 37.9 degrees C. But, if these scientists have different ideas about the theories that have been developed to explain these basic observations, they may disagree about what they are observing. For example, before Albert Einstein's general theory of relativity, observers would have likely interpreted an image of the Einstein cross as five different objects in space. In light of that theory, however, astronomers will tell you that there are actually only two objects, one in the center and four different images of a second object around the sides. Alternatively, if other scientists suspect that something is wrong with the telescope and only one object is actually being observed, they are operating under yet another theory. Observations that cannot be separated from theoretical interpretation are said to be theory-laden.

All observation involves both perception and cognition. That is, one does not make an observation passively, but rather is actively engaged in distinguishing the phenomenon being observed from surrounding sensory data. Therefore, observations are affected by one's underlying understanding of the way in which the world functions, and that understanding may influence what is perceived, noticed, or deemed worthy of consideration. In this sense, it can be argued that all observation is theory-laden.

The purpose of science

Should science aim to determine ultimate truth, or are there questions that science cannot answer? Scientific realists claim that science aims at truth and that one ought to regard scientific theories as true, approximately true, or likely true. Conversely, scientific anti-realists argue that science does not aim (or at least does not succeed) at truth, especially truth about unobservables like electrons or other universes. Instrumentalists argue that scientific theories should only be evaluated on whether they are useful. In their view, whether theories are true or not is beside the point, because the purpose of science is to make predictions and enable effective technology.

Realists often point to the success of recent scientific theories as evidence for the truth (or near truth) of current theories. Antirealists point to either the many false theories in the history of science, epistemic morals, the success of false modeling assumptions, or widely termed postmodern criticisms of objectivity as evidence against scientific realism. Antirealists attempt to explain the success of scientific theories without reference to truth. Some antirealists claim that scientific theories aim at being accurate only about observable objects and argue that their success is primarily judged by that criterion.

Real patterns

The notion of real patterns has been propounded, notably by philosopher Daniel C. Dennett, as an intermediate position between strong realism and eliminative materialism. This concept delves into the investigation of patterns observed in scientific phenomena to ascertain whether they signify underlying truths or are mere constructs of human interpretation. Dennett provides a unique ontological account concerning real patterns, examining the extent to which these recognized patterns have predictive utility and allow for efficient compression of information.
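Dennett's compression criterion can be illustrated with a small sketch (the example and helper name are my own, not from the source): data exhibiting a real pattern admits a description far shorter than the data itself, while patternless noise does not.

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Size of `data` after zlib compression, in bytes."""
    return len(zlib.compress(data, level=9))

patterned = b"ab" * 5000          # a highly regular 10,000-byte sequence
random_ish = os.urandom(10_000)   # incompressible noise of the same length

print(compressed_size(patterned))   # a few dozen bytes: the pattern is "real"
print(compressed_size(random_ish))  # roughly 10,000 bytes: no pattern to exploit
```

On this reading, a pattern is real to the extent that tracking it buys predictive and descriptive economy over simply listing the raw data.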

The discourse on real patterns extends beyond philosophical circles, finding relevance in various scientific domains. For example, in biology, inquiries into real patterns seek to elucidate the nature of biological explanations, exploring how recognized patterns contribute to a comprehensive understanding of biological phenomena. Similarly, in chemistry, debates around the reality of chemical bonds as real patterns continue.

Evaluation of real patterns also holds significance in broader scientific inquiries. Researchers, like Tyler Millhouse, propose criteria for evaluating the realness of a pattern, particularly in the context of universal patterns and the human propensity to perceive patterns, even where there might be none. This evaluation is pivotal in advancing research in diverse fields, from climate change to machine learning, where recognition and validation of real patterns in scientific models play a crucial role.

Values and science

Values intersect with science in several ways. Epistemic values chiefly guide scientific research itself. The scientific enterprise is also embedded in particular cultures and value systems through its individual practitioners. Values emerge from science in turn, both as product and as process, and can be distributed among several cultures in a society. And when science is invoked to justify standards and policies to the general public, it acts as a mediator between society and its individual members, a role that exposes it to misuse when its means are adapted to predetermined ends.

Thomas Kuhn is credited with coining the term "paradigm shift" to describe the creation and evolution of scientific theories.

If it is unclear what counts as science, how the process of confirming theories works, and what the purpose of science is, there is considerable scope for values and other social influences to shape science. Indeed, values can play a role ranging from determining which research gets funded to influencing which theories achieve scientific consensus. For example, in the 19th century, cultural values held by scientists about race shaped research on evolution, and values concerning social class influenced debates on phrenology (considered scientific at the time). Feminist philosophers of science, sociologists of science, and others explore how social values affect science.

History

Pre-modern

The origins of philosophy of science trace back to Plato and Aristotle, who distinguished the forms of approximate and exact reasoning, set out the threefold scheme of abductive, deductive, and inductive inference, and also analyzed reasoning by analogy. The eleventh-century Arab polymath Ibn al-Haytham (known in Latin as Alhazen) conducted his research in optics by way of controlled experimental testing and applied geometry, especially in his investigations into the images resulting from the reflection and refraction of light. Roger Bacon (1214–1294), an English thinker and experimenter heavily influenced by al-Haytham, is recognized by many to be the father of modern scientific method. His view that mathematics was essential to a correct understanding of natural philosophy is considered to have been 400 years ahead of its time.

Modern

Francis Bacon's statue at Gray's Inn, South Square, London
Theory of Science by Auguste Comte

Francis Bacon (no direct relation to Roger Bacon, who lived 300 years earlier) was a seminal figure in philosophy of science at the time of the Scientific Revolution. In his work Novum Organum (1620)—an allusion to Aristotle's Organon—Bacon outlined a new system of logic to improve upon the old philosophical process of syllogism. Bacon's method relied on experimental histories to eliminate alternative theories. In 1637, René Descartes established a new framework for grounding scientific knowledge in his treatise, Discourse on Method, advocating the central role of reason as opposed to sensory experience. By contrast, in 1713, the 2nd edition of Isaac Newton's Philosophiae Naturalis Principia Mathematica argued that "... hypotheses ... have no place in experimental philosophy. In this philosophy[,] propositions are deduced from the phenomena and rendered general by induction." This passage influenced a "later generation of philosophically-inclined readers to pronounce a ban on causal hypotheses in natural philosophy". In particular, later in the 18th century, David Hume would famously articulate skepticism about the ability of science to determine causality and gave a definitive formulation of the problem of induction, though both theses would be contested by the end of the 18th century by Immanuel Kant in his Critique of Pure Reason and Metaphysical Foundations of Natural Science. In the 19th century, Auguste Comte made a major contribution to the theory of science. The 19th-century writings of John Stuart Mill are also considered important in the formation of current conceptions of the scientific method, as well as anticipating later accounts of scientific explanation.

Logical positivism

Instrumentalism became popular among physicists around the turn of the 20th century, after which logical positivism defined the field for several decades. Logical positivism accepts only testable statements as meaningful, rejects metaphysical interpretations, and embraces verificationism (a set of theories of knowledge that combines logicism, empiricism, and linguistics to ground philosophy on a basis consistent with examples from the empirical sciences). Seeking to overhaul all of philosophy and convert it to a new scientific philosophy, the Berlin Circle and the Vienna Circle propounded logical positivism in the late 1920s.

Interpreting Ludwig Wittgenstein's early philosophy of language, logical positivists identified a verifiability principle or criterion of cognitive meaningfulness. From Bertrand Russell's logicism they sought reduction of mathematics to logic. They also embraced Russell's logical atomism, Ernst Mach's phenomenalism—whereby the mind knows only actual or potential sensory experience, which is the content of all sciences, whether physics or psychology—and Percy Bridgman's operationalism. Thereby, only the verifiable was scientific and cognitively meaningful, whereas the unverifiable was unscientific, cognitively meaningless "pseudostatements"—metaphysical, emotive, or such—not worthy of further review by philosophers, who were newly tasked to organize knowledge rather than develop new knowledge.

Logical positivism is commonly portrayed as taking the extreme position that scientific language should never refer to anything unobservable—even the seemingly core notions of causality, mechanism, and principles—but that is an exaggeration. Talk of such unobservables could be allowed as metaphorical—direct observations viewed in the abstract—or at worst metaphysical or emotional. Theoretical laws would be reduced to empirical laws, while theoretical terms would garner meaning from observational terms via correspondence rules. Mathematics in physics would reduce to symbolic logic via logicism, while rational reconstruction would convert ordinary language into standardized equivalents, all networked and united by a logical syntax. A scientific theory would be stated with its method of verification, whereby a logical calculus or empirical operation could verify its falsity or truth.

In the late 1930s, logical positivists fled Germany and Austria for Britain and America. By then, many had replaced Mach's phenomenalism with Otto Neurath's physicalism, and Rudolf Carnap had sought to replace verification with simply confirmation. With World War II's close in 1945, logical positivism gave way to a milder variant, logical empiricism, led largely in America by Carl Hempel, who expounded the covering law model of scientific explanation as a way of identifying the logical form of explanations without any reference to the suspect notion of "causation". The logical positivist movement became a major underpinning of analytic philosophy, and dominated Anglosphere philosophy, including philosophy of science, while influencing sciences, into the 1960s. Yet the movement failed to resolve its central problems, and its doctrines were increasingly assaulted. Nevertheless, it brought about the establishment of philosophy of science as a distinct subdiscipline of philosophy, with Hempel playing a key role.
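Hempel's covering law (deductive-nomological) model can be stated schematically: an explanation is a valid deduction of the statement of the phenomenon to be explained from general laws together with particular antecedent conditions. This is a standard textbook rendering of the schema, not a quotation from Hempel:

```latex
\begin{array}{ll}
L_1, \ldots, L_m & \text{(general laws)} \\
C_1, \ldots, C_n & \text{(antecedent conditions)} \\
\hline
E & \text{(explanandum: the phenomenon to be explained)}
\end{array}
```

On this model, explanation and prediction share the same logical form: an event is explained when it could, in principle, have been predicted from the laws and conditions cited.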

For Kuhn, the addition of epicycles in Ptolemaic astronomy was "normal science" within a paradigm, whereas the Copernican Revolution was a paradigm shift.

Thomas Kuhn

In the 1962 book The Structure of Scientific Revolutions, Thomas Kuhn argued that the process of observation and evaluation takes place within a "paradigm", which he describes as "universally recognized achievements that for a time provide model problems and solutions to a community of practitioners." A paradigm implicitly identifies the objects and relations under study and suggests what experiments, observations or theoretical improvements need to be carried out to produce a useful result. He characterized normal science as the process of observation and "puzzle solving" which takes place within a paradigm, whereas revolutionary science occurs when one paradigm overtakes another in a paradigm shift.

Kuhn was a historian of science, and his ideas were inspired by the study of older paradigms that have been discarded, such as Aristotelian mechanics or aether theory. These had often been portrayed by historians as using "unscientific" methods or beliefs. But careful examination showed that they were no less "scientific" than modern paradigms: both rested on valid evidence, and both left some questions unanswered.

A paradigm shift occurred when a significant number of observational anomalies arose in the old paradigm and efforts to resolve them within the paradigm were unsuccessful. A new paradigm was available that handled the anomalies with less difficulty and yet still covered (most of) the previous results. Over a period of time, often as long as a generation, more practitioners began working within the new paradigm and eventually the old paradigm was abandoned. For Kuhn, acceptance or rejection of a paradigm is a social process as much as a logical process.

Kuhn's position, however, is not one of relativism; he wrote "terms like 'subjective' and 'intuitive' cannot be applied to [paradigms]." Paradigms are grounded in objective, observable evidence, but our use of them is psychological and our acceptance of them is social.

Current approaches

Naturalism's axiomatic assumptions

According to Robert Priddy, all scientific study inescapably builds on at least some essential assumptions that cannot be tested by scientific processes; that is, scientists must start with some assumptions as to the ultimate analysis of the facts with which science deals. These assumptions would then be justified partly by their adherence to the types of occurrence of which we are directly conscious, and partly by their success in representing the observed facts with a certain generality, devoid of ad hoc suppositions. Kuhn also claims that all science is based on assumptions about the character of the universe, rather than merely on empirical facts. These assumptions – a paradigm – comprise a collection of beliefs, values and techniques that are held by a given scientific community, which legitimize their systems and set the limitations to their investigation. For naturalists, nature is the only reality, the "correct" paradigm, and there is no such thing as the supernatural, i.e. anything above, beyond, or outside of nature. The scientific method is to be used to investigate all reality, including the human spirit.

Some claim that naturalism is the implicit philosophy of working scientists, and that the following basic assumptions are needed to justify the scientific method:

  1. That there is an objective reality shared by all rational observers. "The basis for rationality is acceptance of an external objective reality." "Objective reality is clearly an essential thing if we are to develop a meaningful perspective of the world. Nevertheless, its very existence is assumed." "Our belief that objective reality exists is an assumption that it arises from a real world outside of ourselves. As infants we made this assumption unconsciously. People are happier to make this assumption, which adds meaning to our sensations and feelings, than to live with solipsism." "Without this assumption, there would be only the thoughts and images in our own mind (which would be the only existing mind) and there would be no need of science, or anything else."
  2. That this objective reality is governed by natural laws;
    "Science, at least today, assumes that the universe obeys knowable principles that don't depend on time or place, nor on subjective parameters such as what we think, know or how we behave." Hugh Gauch argues that science presupposes that "the physical world is orderly and comprehensible."
  3. That reality can be discovered by means of systematic observation and experimentation.
    Stanley Sobottka said: "The assumption of external reality is necessary for science to function and to flourish. For the most part, science is the discovering and explaining of the external world." "Science attempts to produce knowledge that is as universal and objective as possible within the realm of human understanding."
  4. That Nature has uniformity of laws and most if not all things in nature must have at least a natural cause.
    Biologist Stephen Jay Gould referred to these two closely related propositions as the constancy of nature's laws and the operation of known processes. Simpson agrees that the axiom of uniformity of law, an unprovable postulate, is necessary in order for scientists to extrapolate inductive inference into the unobservable past in order to meaningfully study it. "The assumption of spatial and temporal invariance of natural laws is by no means unique to geology since it amounts to a warrant for inductive inference which, as Bacon showed nearly four hundred years ago, is the basic mode of reasoning in empirical science. Without assuming this spatial and temporal invariance, we have no basis for extrapolating from the known to the unknown and, therefore, no way of reaching general conclusions from a finite number of observations. (Since the assumption is itself vindicated by induction, it can in no way "prove" the validity of induction — an endeavor virtually abandoned after Hume demonstrated its futility two centuries ago)." Gould also notes that natural processes such as Lyell's "uniformity of process" are an assumption: "As such, it is another a priori assumption shared by all scientists and not a statement about the empirical world." According to R. Hooykaas: "The principle of uniformity is not a law, not a rule established after comparison of facts, but a principle, preceding the observation of facts ... It is the logical principle of parsimony of causes and of economy of scientific notions. By explaining past changes by analogy with present phenomena, a limit is set to conjecture, for there is only one way in which two things are equal, but there are an infinity of ways in which they could be supposed different."
  5. That experimental procedures will be done satisfactorily without any deliberate or unintentional mistakes that will influence the results.
  6. That experimenters won't be significantly biased by their presumptions.
  7. That random sampling is representative of the entire population.
    A simple random sample (SRS) is the most basic probabilistic option for creating a sample from a population. The benefit of SRS is that every member of the population has an equal chance of selection, which helps ensure that the sample is representative and that statistically valid conclusions can be drawn.
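A simple random sample of the kind described in point 7 can be drawn with Python's standard library; the population and sample size here are hypothetical, chosen only for illustration:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# A hypothetical population of 1,000 measurement units, labeled 0..999.
population = list(range(1000))

# Simple random sample: random.sample draws without replacement, and every
# subset of size k is equally likely to be chosen.
sample = random.sample(population, k=50)

# Under SRS, the sample mean is an unbiased estimator of the population mean.
sample_mean = sum(sample) / len(sample)
population_mean = sum(population) / len(population)
```

The key property is that selection probabilities are known and equal, which is what licenses the inference from sample statistics to population parameters.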

Coherentism

Jeremiah Horrocks makes the first observation of the transit of Venus in 1639, as imagined by the artist W. R. Lavender in 1903.

In contrast to the view that science rests on foundational assumptions, coherentism asserts that statements are justified by being a part of a coherent system. Or, rather, individual statements cannot be validated on their own: only coherent systems can be justified. A prediction of a transit of Venus is justified by its being coherent with broader beliefs about celestial mechanics and earlier observations. As explained above, observation is a cognitive act. That is, it relies on a pre-existing understanding, a systematic set of beliefs. An observation of a transit of Venus requires a huge range of auxiliary beliefs, such as those that describe the optics of telescopes, the mechanics of the telescope mount, and an understanding of celestial mechanics. If the prediction fails and a transit is not observed, that is likely to occasion an adjustment in the system, a change in some auxiliary assumption, rather than a rejection of the theoretical system.

According to the Duhem–Quine thesis, after Pierre Duhem and W.V. Quine, it is impossible to test a theory in isolation. One must always add auxiliary hypotheses in order to make testable predictions. For example, to test Newton's Law of Gravitation in the solar system, one needs information about the masses and positions of the Sun and all the planets. Famously, the failure to predict the orbit of Uranus in the 19th century led not to the rejection of Newton's Law but rather to the rejection of the hypothesis that the Solar System comprises only seven planets. The investigations that followed led to the discovery of an eighth planet, Neptune. If a test fails, something is wrong. But there is a problem in figuring out what that something is: a missing planet, badly calibrated test equipment, an unsuspected curvature of space, or something else.
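The logical point behind the Duhem–Quine thesis can be put schematically. If a theory T together with auxiliary hypotheses A₁, …, Aₙ jointly entails a prediction P, then a failed prediction refutes only the conjunction, not T itself (a schematic rendering, not a formula taken from Duhem or Quine):

```latex
% Theory plus auxiliaries entail the prediction:
(T \land A_1 \land \cdots \land A_n) \rightarrow P
% A failed prediction refutes only the conjunction:
\qquad \neg P \;\vdash\; \neg T \lor \neg A_1 \lor \cdots \lor \neg A_n
```

The disjunction on the right is exactly the ambiguity described above: logic alone does not say whether the theory, the planetary inventory, or the instrument calibration is the culprit.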

One consequence of the Duhem–Quine thesis is that one can make any theory compatible with any empirical observation by the addition of a sufficient number of suitable ad hoc hypotheses. Karl Popper accepted this thesis, leading him to reject naïve falsification. Instead, he favored a "survival of the fittest" view in which the most falsifiable scientific theories are to be preferred.

Anything goes methodology

Paul Karl Feyerabend

Paul Feyerabend (1924–1994) argued that no description of scientific method could possibly be broad enough to include all the approaches and methods used by scientists, and that there are no useful and exception-free methodological rules governing the progress of science. He argued that "the only principle that does not inhibit progress is: anything goes".

Feyerabend said that science started as a liberating movement, but that over time it had become increasingly dogmatic and rigid and had some oppressive features, and thus had become increasingly an ideology. Because of this, he said it was impossible to come up with an unambiguous way to distinguish science from religion, magic, or mythology. He saw the exclusive dominance of science as a means of directing society as authoritarian and ungrounded. Promulgation of this epistemological anarchism earned Feyerabend the title of "the worst enemy of science" from his detractors.

Sociology of scientific knowledge methodology

According to Kuhn, science is an inherently communal activity which can only be done as part of a community. For him, the fundamental difference between science and other disciplines is the way in which the communities function. Others, especially Feyerabend and some post-modernist thinkers, have argued that there is insufficient difference between social practices in science and other disciplines to maintain this distinction. For them, social factors play an important and direct role in scientific method, but they do not serve to differentiate science from other disciplines. On this account, science is socially constructed, though this does not necessarily imply the more radical notion that reality itself is a social construct.

Michel Foucault sought to analyze and uncover how disciplines within the social sciences developed and adopted the methodologies used by their practitioners. In works like The Archaeology of Knowledge, he used the term human sciences. The human sciences are not mainstream academic disciplines; rather, they form an interdisciplinary space for reflection on man, who is the subject of more mainstream scientific knowledge but is here taken as an object, sitting between the more conventional areas and associating with disciplines such as anthropology, psychology, sociology, and even history. Rejecting the realist view of scientific inquiry, Foucault argued throughout his work that scientific discourse is not simply an objective study of phenomena, as both natural and social scientists like to believe, but is rather the product of systems of power relations struggling to construct scientific disciplines and knowledge within given societies. With the advances of scientific disciplines, such as psychology and anthropology, the need to separate, categorize, normalize and institutionalize populations into constructed social identities became a staple of the sciences. Constructions of what were considered "normal" and "abnormal" stigmatized and ostracized groups of people, like the mentally ill and sexual and gender minorities.

However, some (such as Quine) do maintain that scientific reality is a social construct:

Physical objects are conceptually imported into the situation as convenient intermediaries not by definition in terms of experience, but simply as irreducible posits comparable, epistemologically, to the gods of Homer ... For my part I do, qua lay physicist, believe in physical objects and not in Homer's gods; and I consider it a scientific error to believe otherwise. But in point of epistemological footing, the physical objects and the gods differ only in degree and not in kind. Both sorts of entities enter our conceptions only as cultural posits.

The public backlash of scientists against such views, particularly in the 1990s, became known as the science wars.

A major development in recent decades has been the study of the formation, structure, and evolution of scientific communities by sociologists and anthropologists – including David Bloor, Harry Collins, Bruno Latour, Ian Hacking and Anselm Strauss. Concepts and methods (such as rational choice, social choice or game theory) from economics have also been applied for understanding the efficiency of scientific communities in the production of knowledge. This interdisciplinary field has come to be known as science and technology studies. Here the approach to the philosophy of science is to study how scientific communities actually operate.

Continental philosophy

Philosophers in the continental philosophical tradition are not traditionally categorized as philosophers of science. However, they have much to say about science, some of which has anticipated themes in the analytical tradition. For example, in The Genealogy of Morals (1887) Friedrich Nietzsche advanced the thesis that the motive for the search for truth in sciences is a kind of ascetic ideal.

In general, continental philosophy views science from a world-historical perspective. Philosophers such as Pierre Duhem (1861–1916) and Gaston Bachelard (1884–1962) wrote their works with this world-historical approach to science, predating Kuhn's 1962 work by a generation or more. All of these approaches involve a historical and sociological turn to science, with a priority on lived experience (a kind of Husserlian "life-world"), rather than a progress-based or anti-historical approach as emphasised in the analytic tradition. One can trace this continental strand of thought through the phenomenology of Edmund Husserl (1859–1938), the late works of Merleau-Ponty (Nature: Course Notes from the Collège de France, 1956–1960), and the hermeneutics of Martin Heidegger (1889–1976).

The largest effect on the continental tradition with respect to science came from Martin Heidegger's critique of the theoretical attitude in general, which of course includes the scientific attitude. For this reason, the continental tradition has remained much more skeptical of the importance of science in human life and in philosophical inquiry. Nonetheless, there have been a number of important works: especially those of a Kuhnian precursor, Alexandre Koyré (1892–1964). Another important development was that of Michel Foucault's analysis of historical and scientific thought in The Order of Things (1966) and his study of power and corruption within the "science" of madness. Post-Heideggerian authors contributing to continental philosophy of science in the second half of the 20th century include Jürgen Habermas (e.g., Truth and Justification, 1998), Carl Friedrich von Weizsäcker (The Unity of Nature, 1980; German: Die Einheit der Natur (1971)), and Wolfgang Stegmüller (Probleme und Resultate der Wissenschaftstheorie und Analytischen Philosophie, 1973–1986).

Other topics

Reductionism

Analysis involves breaking an observation or theory down into simpler concepts in order to understand it. Reductionism can refer to one of several philosophical positions related to this approach. One type of reductionism suggests that phenomena are amenable to scientific explanation at lower levels of analysis and inquiry. For example, a historical event might be explained in sociological and psychological terms, which in turn might be described in terms of human physiology, which in turn might be described in terms of chemistry and physics. Daniel Dennett distinguishes legitimate reductionism from what he calls greedy reductionism, which denies real complexities and leaps too quickly to sweeping generalizations.

Social accountability

A broad issue affecting the neutrality of science concerns the areas which science chooses to explore—that is, what part of the world and of humankind are studied by science. Philip Kitcher in his Science, Truth, and Democracy argues that scientific studies that attempt to show one segment of the population as being less intelligent, less successful, or emotionally backward compared to others have a political feedback effect which further excludes such groups from access to science. Thus such studies undermine the broad consensus required for good science by excluding certain people, and so proving themselves in the end to be unscientific.

Philosophy of particular sciences

There is no such thing as philosophy-free science; there is only science whose philosophical baggage is taken on board without examination.

— Daniel Dennett, Darwin's Dangerous Idea, 1995

In addition to addressing the general questions regarding science and induction, many philosophers of science are occupied by investigating foundational problems in particular sciences. They also examine the implications of particular sciences for broader philosophical questions. The late 20th and early 21st centuries have seen a rise in the number of practitioners of philosophy of a particular science.

Philosophy of statistics

The problem of induction discussed above is seen in another form in debates over the foundations of statistics. The standard approach to statistical hypothesis testing avoids claims about whether evidence supports a hypothesis or makes it more probable. Instead, the typical test yields a p-value: the probability, assuming the null hypothesis is true, of obtaining evidence at least as extreme as that observed. If the p-value is too low, the null hypothesis is rejected, in a way analogous to falsification. In contrast, Bayesian inference seeks to assign probabilities to hypotheses. Related topics in philosophy of statistics include probability interpretations, overfitting, and the difference between correlation and causation.
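The contrast between the two approaches can be sketched with a toy example: a coin lands heads 8 times in 10 flips. The numbers and the three candidate biases are hypothetical, chosen only to illustrate the two styles of inference:

```python
import math

def binom_pmf(k, n, p):
    """Probability of exactly k heads in n flips of a coin with bias p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, k = 10, 8

# Frequentist: two-sided p-value for the null hypothesis "the coin is fair".
# Sum the probability, under p = 0.5, of every outcome at least as extreme
# (i.e., at least as improbable) as the one observed.
p_value = sum(binom_pmf(i, n, 0.5) for i in range(n + 1)
              if binom_pmf(i, n, 0.5) <= binom_pmf(k, n, 0.5))

# Bayesian: assign probabilities to hypotheses directly. Here, three candidate
# biases with a uniform prior, updated by Bayes' theorem on the same data.
prior = {0.3: 1/3, 0.5: 1/3, 0.7: 1/3}
likelihood = {p: binom_pmf(k, n, p) for p in prior}
evidence = sum(likelihood[p] * prior[p] for p in prior)
posterior = {p: likelihood[p] * prior[p] / evidence for p in prior}
```

The frequentist output is a statement about the data given the null hypothesis (here p_value ≈ 0.109, so a fair coin would not be rejected at the conventional 0.05 level), while the Bayesian output is a probability distribution over the hypotheses themselves, with most of the posterior mass on the 0.7 bias.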

Philosophy of mathematics

Philosophy of mathematics is concerned with the philosophical foundations and implications of mathematics. The central questions are whether numbers, triangles, and other mathematical entities exist independently of the human mind and what is the nature of mathematical propositions. Is asking whether "1 + 1 = 2" is true fundamentally different from asking whether a ball is red? Was calculus invented or discovered? A related question is whether learning mathematics requires experience or reason alone. What does it mean to prove a mathematical theorem and how does one know whether a mathematical proof is correct? Philosophers of mathematics also aim to clarify the relationships between mathematics and logic, human capabilities such as intuition, and the material universe.
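The question of what it means to prove a statement like "1 + 1 = 2" can be made concrete in a proof assistant. In Lean 4, for instance, natural numbers are an inductive type and addition is defined by recursion, so the proof reduces to definitional computation (a minimal illustration, assuming Lean 4's core library):

```lean
-- `rfl` (reflexivity) closes the goal because both sides compute
-- to the same normal form under the definition of addition.
example : 1 + 1 = 2 := rfl
```

That such a proof is a finite, mechanically checkable object is itself philosophically suggestive: it contrasts sharply with empirical questions like whether a ball is red, which no amount of computation over definitions can settle.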

Philosophy of physics

Philosophy of physics is the study of the fundamental, philosophical questions underlying modern physics, the study of matter and energy and how they interact. The main questions concern the nature of space and time, atoms and atomism. Also included are the predictions of cosmology, the interpretation of quantum mechanics, the foundations of statistical mechanics, causality, determinism, and the nature of physical laws. Classically, several of these questions were studied as part of metaphysics (for example, those about causality, determinism, and space and time).

Philosophy of chemistry

Philosophy of chemistry is the philosophical study of the methodology and content of the science of chemistry. It is explored by philosophers, chemists, and philosopher-chemist teams. It includes research on general philosophy of science issues as applied to chemistry. For example, can all chemical phenomena be explained by quantum mechanics or is it not possible to reduce chemistry to physics? For another example, chemists have discussed the philosophy of how theories are confirmed in the context of confirming reaction mechanisms. Determining reaction mechanisms is difficult because they cannot be observed directly. Chemists can use a number of indirect measures as evidence to rule out certain mechanisms, but they are often unsure if the remaining mechanism is correct because there are many other possible mechanisms that they have not tested or even thought of. Philosophers have also sought to clarify the meaning of chemical concepts which do not refer to specific physical entities, such as chemical bonds.

Philosophy of astronomy

The philosophy of astronomy seeks to understand and analyze the methodologies and technologies used by experts in the discipline, focusing on how observations of space and astrophysical phenomena can be studied. Because astronomers rely on theories and formulas from other scientific disciplines, such as chemistry and physics, a main point of inquiry is how knowledge about the cosmos can be obtained and reconciled with other established knowledge, as well as how the Earth and the Solar System figure in humanity's view of its place in the universe.

Philosophy of Earth sciences

The philosophy of Earth science is concerned with how humans obtain and verify knowledge of the workings of the Earth system, including the atmosphere, hydrosphere, and geosphere (solid earth). Earth scientists' ways of knowing and habits of mind share important commonalities with other sciences, but also have distinctive attributes that emerge from the complex, heterogeneous, unique, long-lived, and non-manipulatable nature of the Earth system.

Philosophy of biology

Peter Godfrey-Smith was awarded the Lakatos Award for his 2009 book Darwinian Populations and Natural Selection, which discusses the philosophical foundations of the theory of evolution.

Philosophy of biology deals with epistemological, metaphysical, and ethical issues in the biological and biomedical sciences. Although philosophers of science and philosophers generally have long been interested in biology (e.g., Aristotle, Descartes, Leibniz and even Kant), philosophy of biology only emerged as an independent field of philosophy in the 1960s and 1970s. Philosophers of science began to pay increasing attention to developments in biology, from the rise of the modern synthesis in the 1930s and 1940s to the discovery of the structure of deoxyribonucleic acid (DNA) in 1953 to more recent advances in genetic engineering. Other key ideas, such as the reduction of all life processes to biochemical reactions and the incorporation of psychology into a broader neuroscience, are also addressed. Current research in philosophy of biology includes investigation of the foundations of evolutionary theory (such as Peter Godfrey-Smith's work) and the role of viruses as persistent symbionts in host genomes. On the latter view, the evolution of genetic content order is seen as the result of competent genome editors, in contrast to former narratives in which error-replication events (mutations) dominated.

Philosophy of medicine

A fragment of the Hippocratic Oath from the third century

Beyond medical ethics and bioethics, the philosophy of medicine is a branch of philosophy that includes the epistemology and ontology/metaphysics of medicine. Within the epistemology of medicine, evidence-based medicine (EBM) (or evidence-based practice (EBP)) has attracted attention, most notably the roles of randomisation, blinding and placebo controls. Related to these areas of investigation, ontologies of specific interest to the philosophy of medicine include Cartesian dualism, the monogenetic conception of disease and the conceptualization of 'placebos' and 'placebo effects'. There is also a growing interest in the metaphysics of medicine, particularly the idea of causation. Philosophers of medicine might not only be interested in how medical knowledge is generated, but also in the nature of such phenomena. Causation is of interest because the purpose of much medical research is to establish causal relationships, e.g. what causes disease, or what causes people to get better.

Philosophy of psychiatry

Philosophy of psychiatry explores philosophical questions relating to psychiatry and mental illness. The philosopher of science and medicine Dominic Murphy identifies three areas of exploration in the philosophy of psychiatry. The first concerns the examination of psychiatry as a science, using the tools of the philosophy of science more broadly. The second entails the examination of the concepts employed in discussion of mental illness, including the experience of mental illness, and the normative questions it raises. The third area concerns the links and discontinuities between the philosophy of mind and psychopathology.

Philosophy of psychology

Wilhelm Wundt (seated) with colleagues in his psychological laboratory, the first of its kind

Philosophy of psychology refers to issues at the theoretical foundations of modern psychology. Some of these issues are epistemological concerns about the methodology of psychological investigation. For example, is the best method for studying psychology to focus only on the response of behavior to external stimuli or should psychologists focus on mental perception and thought processes? If the latter, an important question is how the internal experiences of others can be measured. Self-reports of feelings and beliefs may not be reliable because, even in cases in which there is no apparent incentive for subjects to intentionally deceive in their answers, self-deception or selective memory may affect their responses. Then even in the case of accurate self-reports, how can responses be compared across individuals? Even if two individuals respond with the same answer on a Likert scale, they may be experiencing very different things.

Other issues in philosophy of psychology are philosophical questions about the nature of mind, brain, and cognition, and are perhaps more commonly thought of as part of cognitive science, or philosophy of mind. For example, are humans rational creatures? Is there any sense in which they have free will, and how does that relate to the experience of making choices? Philosophy of psychology also closely monitors contemporary work conducted in cognitive neuroscience, psycholinguistics, and artificial intelligence, questioning what they can and cannot explain in psychology.

Philosophy of psychology is a relatively young field, because psychology only became a discipline of its own in the late 1800s. In particular, neurophilosophy has just recently become its own field with the works of Paul Churchland and Patricia Churchland. Philosophy of mind, by contrast, has been a well-established discipline since before psychology was a field of study at all. It is concerned with questions about the very nature of mind, the qualities of experience, and particular issues like the debate between dualism and monism.

Philosophy of social science

The philosophy of social science is the study of the logic and method of the social sciences, such as sociology and cultural anthropology. Philosophers of social science are concerned with the differences and similarities between the social and the natural sciences, causal relationships between social phenomena, the possible existence of social laws, and the ontological significance of structure and agency.

The French philosopher Auguste Comte (1798–1857) established the epistemological perspective of positivism in The Course in Positivist Philosophy, a series of texts published between 1830 and 1842. The first three volumes of the Course dealt chiefly with the natural sciences already in existence (geoscience, astronomy, physics, chemistry, biology), whereas the latter two emphasised the inevitable coming of social science: "sociologie". For Comte, the natural sciences necessarily had to arrive first, before humanity could adequately channel its efforts into the most challenging and complex 'queen science' of human society itself. Comte offered an evolutionary system proposing that society undergoes three phases in its quest for truth, according to a general 'law of three stages'. These are (1) the theological, (2) the metaphysical, and (3) the positive.

Comte's positivism established the initial philosophical foundations for formal sociology and social research. Durkheim, Marx, and Weber are more typically cited as the fathers of contemporary social science. In psychology, a positivistic approach has historically been favoured in behaviourism. Positivism has also been espoused by 'technocrats' who believe in the inevitability of social progress through science and technology.

The positivist perspective has been associated with 'scientism': the view that the methods of the natural sciences may be applied to all areas of investigation, be they philosophical, social scientific, or otherwise. Among most social scientists and historians, orthodox positivism has long since lost popular support. Today, practitioners of both the social and physical sciences instead take into account the distorting effects of observer bias and structural limitations. This scepticism has been facilitated by a general weakening of deductivist accounts of science by philosophers such as Thomas Kuhn, and by new philosophical movements such as critical realism and neopragmatism. The philosopher-sociologist Jürgen Habermas has critiqued pure instrumental rationality, arguing that scientific thinking thereby becomes something akin to ideology itself.

Philosophy of technology

The philosophy of technology is a sub-field of philosophy that studies the nature of technology. Specific research topics include study of the role of tacit and explicit knowledge in creating and using technology, the nature of functions in technological artifacts, the role of values in design, and ethics related to technology. Technology and engineering can both involve the application of scientific knowledge. The philosophy of engineering is an emerging sub-field of the broader philosophy of technology.

Empirical evidence
