Thursday, July 31, 2025

Philosophy of mind

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Philosophy_of_mind

Philosophy of mind is a branch of philosophy that deals with the nature of the mind and its relation to the body and the external world.

The mind–body problem is a paradigmatic issue in philosophy of mind, although a number of other issues are addressed, such as the hard problem of consciousness and the nature of particular mental states. Aspects of the mind that are studied include mental events, mental functions, mental properties, consciousness and its neural correlates, the ontology of the mind, the nature of cognition and of thought, and the relationship of the mind to the body.

Dualism and monism are the two central schools of thought on the mind–body problem, although nuanced views have arisen that do not fit one or the other category neatly.

  • Dualism entered modern Western philosophy chiefly through René Descartes in the 17th century. Substance dualists like Descartes argue that the mind is an independently existing substance, whereas property dualists maintain that the mind is a group of independent properties that emerge from and cannot be reduced to the brain, but that it is not a distinct substance.
  • Monism is the position that mind and body are ontologically indiscernible entities, not dependent substances. This view was espoused by the 17th-century rationalist Baruch Spinoza. Physicalists argue that only entities postulated by physical theory exist, and that mental processes will eventually be explained in terms of these entities as physical theory continues to evolve. Physicalists maintain various positions on the prospects of reducing mental properties to physical properties (many of them adopting compatible forms of property dualism), and the ontological status of such mental properties remains unclear. Idealists maintain that the mind is all that exists and that the external world is either mental itself, or an illusion created by the mind. Neutral monists such as Ernst Mach and William James argue that events in the world can be thought of as either mental (psychological) or physical depending on the network of relationships into which they enter, and dual-aspect monists such as Spinoza adhere to the position that there is some other, neutral substance, and that both matter and mind are properties of this unknown substance. The most common monisms in the 20th and 21st centuries have all been variations of physicalism; these positions include behaviorism, the type identity theory, anomalous monism and functionalism.

Most modern philosophers of mind adopt either a reductive physicalist or non-reductive physicalist position, maintaining in their different ways that the mind is not something separate from the body. These approaches have been particularly influential in the sciences, especially in the fields of sociobiology, computer science (specifically, artificial intelligence), evolutionary psychology and the various neurosciences. Reductive physicalists assert that all mental states and properties will eventually be explained by scientific accounts of physiological processes and states. Non-reductive physicalists argue that although the mind is not a separate substance, mental properties supervene on physical properties, or that the predicates and vocabulary used in mental descriptions and explanations are indispensable, and cannot be reduced to the language and lower-level explanations of physical science. Continued neuroscientific progress has helped to clarify some of these issues; however, they are far from being resolved. Modern philosophers of mind continue to ask how the subjective qualities and the intentionality of mental states and properties can be explained in naturalistic terms.

The problems of physicalist theories of the mind have led some contemporary philosophers to assert that the traditional view of substance dualism should be defended. From this perspective, this theory is coherent, and problems such as "the interaction of mind and body" can be rationally resolved.

Mind–body problem

Illustration of mind–body dualism by René Descartes. Inputs are passed by the sensory organs to the pineal gland, and from there to the immaterial spirit.

The mind–body problem concerns the explanation of the relationship that exists between minds, or mental processes, and bodily states or processes. The main aim of philosophers working in this area is to determine the nature of the mind and mental states/processes, and how—or even if—minds are affected by and can affect the body.

Perceptual experiences depend on stimuli that arrive at our various sensory organs from the external world, and these stimuli cause changes in our mental states, ultimately causing us to feel a sensation, which may be pleasant or unpleasant. For example, someone's desire for a slice of pizza will tend to cause that person to move his or her body in a specific manner and direction to obtain what he or she wants. The question, then, is how it can be possible for conscious experiences to arise out of a lump of gray matter endowed with nothing but electrochemical properties.

A related problem is how someone's propositional attitudes (e.g. beliefs and desires) cause that individual's neurons to fire and muscles to contract. These comprise some of the puzzles that have confronted epistemologists and philosophers of mind from the time of René Descartes.

Dualist solutions to the mind–body problem

Dualism is a set of views about the relationship between mind and matter (or body). It begins with the claim that mental phenomena are, in some respects, non-physical.

Conflating purusha ("spirit", or better, pure consciousness) with mind or manas (an evolute of non-conscious prakriti), while correctly distinguishing purusha and prakriti as two eternally different ontological entities, leads to the erroneous conclusion that Samkhya supports mind–body dualism, specifically substance dualism (see below). It does not. In Samkhya, both mind and body (better, Indriya, the sensors and effectors) are equally jaDaa (non-conscious), and while they are different evolutes of prakriti, both are made up of gunas. It is true, however, that mind and sensors-effectors are different functions of prakriti, so a form of property dualism (also defined below) may be found in the ancient (ca. 6th c. BCE) Indian philosophical schools of Samkhya and Yoga.

In Western philosophy, the earliest discussions of dualist ideas are in the writings of Plato who suggested that humans' intelligence (a faculty of the mind or soul) could not be identified with, or explained in terms of, their physical body. However, the best-known version of dualism is due to René Descartes (1641), and holds that the mind is a non-extended, non-physical substance, a "res cogitans". Descartes was the first to clearly identify the mind with consciousness and self-awareness, and to distinguish this from the brain, which was the seat of intelligence. He was therefore the first to formulate the mind–body problem in the form in which it still exists today.

Arguments for dualism

The most frequently used argument in favor of dualism appeals to the common-sense intuition that conscious experience is distinct from inanimate matter. If asked what the mind is, the average person would usually respond by identifying it with their self, their personality, their soul, or another related entity. They would almost certainly deny that the mind simply is the brain, or vice versa, finding the idea that there is just one ontological entity at play to be too mechanistic or unintelligible. Modern philosophers of mind think that these intuitions are misleading, and that critical faculties, along with empirical evidence from the sciences, should be used to examine these assumptions and determine whether there is any real basis to them.

According to some, the mental and the physical seem to have quite different, and perhaps irreconcilable, properties. Mental events have a subjective quality, whereas physical events do not. So, for example, one can reasonably ask what a burnt finger feels like, or what a blue sky looks like, or what nice music sounds like to a person. But it is meaningless, or at least odd, to ask what a surge in the uptake of glutamate in the dorsolateral portion of the prefrontal cortex feels like.

Philosophers of mind call the subjective aspects of mental events "qualia" or "raw feels". There are qualia involved in these mental events that seem particularly difficult to reduce to anything physical. David Chalmers explains this argument by stating that we could conceivably know all the objective information about something, such as the brain states and wavelengths of light involved with seeing the color red, but still not know something fundamental about the situation – what it is like to see the color red.

If consciousness (the mind) can exist independently of physical reality (the brain), one must explain how physical memories are created concerning consciousness. Dualism must therefore explain how consciousness affects physical reality. One possible explanation is that of a miracle, proposed by Arnold Geulincx and Nicolas Malebranche, where all mind–body interactions require the direct intervention of God.

Another argument that has been proposed by C. S. Lewis is the Argument from Reason: if, as monism implies, all of our thoughts are the effects of physical causes, then we have no reason for assuming that they are also the consequent of a reasonable ground. Knowledge, however, is apprehended by reasoning from ground to consequent. Therefore, if monism is correct, there would be no way of knowing this—or anything else—we could not even suppose it, except by a fluke.

The zombie argument is based on a thought experiment proposed by Todd Moody, and developed by David Chalmers in his book The Conscious Mind. The basic idea is that one can imagine one's body, and therefore conceive the existence of one's body, without any conscious states being associated with this body. Chalmers' argument is that it seems possible that such a being could exist because all that is needed is that all and only the things that the physical sciences describe about a zombie must be true of it. Since none of the concepts involved in these sciences make reference to consciousness or other mental phenomena, and any physical entity can be by definition described scientifically via physics, the move from conceivability to possibility is not such a large one. Others such as Dennett have argued that the notion of a philosophical zombie is an incoherent, or unlikely, concept. It has been argued under physicalism that one must either believe that anyone including oneself might be a zombie, or that no one can be a zombie—following from the assertion that one's own conviction about being (or not being) a zombie is a product of the physical world and is therefore no different from anyone else's. This argument has been expressed by Dennett who argues that "Zombies think they are conscious, think they have qualia, think they suffer pains—they are just 'wrong' (according to this lamentable tradition) in ways that neither they nor we could ever discover!" See also the problem of other minds.

Avshalom Elitzur has described himself as a "reluctant dualist". One argument Elitzur makes in favor of dualism is an argument from bafflement. According to Elitzur, a conscious being can conceive of a P-zombie version of themselves. However, a P-zombie cannot conceive of a version of itself that has qualia.

Christian List argues that the existence of first-person perspectives is evidence against physicalist views of consciousness. According to List, first-personal phenomenal facts cannot supervene on third-person physical facts. However, List argues that this also refutes versions of dualism that have purely third-personal metaphysics. List has proposed a model he calls the "many-worlds theory of consciousness" in order to reconcile the subjective nature of consciousness without lapsing into solipsism.

Interactionist dualism

Portrait of René Descartes by Frans Hals (1648)

Interactionist dualism, or simply interactionism, is the particular form of dualism first espoused by Descartes in the Meditations. In the 20th century, its major defenders have been Karl Popper and John Carew Eccles. It is the view that mental states, such as beliefs and desires, causally interact with physical states.

Descartes's argument for this position can be summarized as follows: Seth has a clear and distinct idea of his mind as a thinking thing that has no spatial extension (i.e., it cannot be measured in terms of length, weight, height, and so on). He also has a clear and distinct idea of his body as something that is spatially extended, subject to quantification and not able to think. It follows that mind and body are not identical because they have radically different properties.

Seth's mental states (desires, beliefs, etc.) have causal effects on his body and vice versa: A child touches a hot stove (physical event) which causes pain (mental event) and makes her yell (physical event), this in turn provokes a sense of fear and protectiveness in the caregiver (mental event), and so on.

Descartes' argument depends on the premise that what Seth believes to be "clear and distinct" ideas in his mind are necessarily true. Many contemporary philosophers doubt this. For example, Joseph Agassi suggests that several scientific discoveries made since the early 20th century have undermined the idea of privileged access to one's own ideas. Freud claimed that a psychologically-trained observer can understand a person's unconscious motivations better than the person himself does. Duhem has shown that a philosopher of science can know a person's methods of discovery better than that person herself does, while Malinowski has shown that an anthropologist can know a person's customs and habits better than the person whose customs and habits they are. Agassi also asserts that modern psychological experiments that cause people to see things that are not there provide grounds for rejecting Descartes' argument, because scientists can describe a person's perceptions better than the person themself can.

Other forms of dualism

Four varieties of dualism. The arrows indicate the direction of the causal interactions. Occasionalism is not shown.
Psychophysical parallelism

Psychophysical parallelism, or simply parallelism, is the view that mind and body, while having distinct ontological statuses, do not causally influence one another. Instead, they run along parallel paths (mind events causally interact with mind events and brain events causally interact with brain events) and only seem to influence each other. This view was most prominently defended by Gottfried Leibniz. Although Leibniz was an ontological monist who believed that only one type of substance, the monad, exists in the universe, and that everything is reducible to it, he nonetheless maintained that there was an important distinction between "the mental" and "the physical" in terms of causation. He held that God had arranged things in advance so that minds and bodies would be in harmony with each other. This is known as the doctrine of pre-established harmony.

Occasionalism

Occasionalism is the view espoused by Nicholas Malebranche as well as Islamic philosophers such as Abu Hamid Muhammad ibn Muhammad al-Ghazali that asserts all supposedly causal relations between physical events, or between physical and mental events, are not really causal at all. While body and mind are different substances, causes (whether mental or physical) are related to their effects by an act of God's intervention on each specific occasion.

Property dualism

Property dualism is the view that the world is constituted of one kind of substance – the physical kind – and there exist two distinct kinds of properties: physical properties and mental properties. It is the view that non-physical, mental properties (such as beliefs, desires and emotions) inhere in some physical bodies (at least, brains). Sub-varieties of property dualism include:

  1. Emergent materialism asserts that when matter is organized in the appropriate way (i.e., in the way that living human bodies are organized), mental properties emerge in a way not fully accountable for by physical laws. These emergent properties have an independent ontological status and cannot be reduced to, or explained in terms of, the physical substrate from which they emerge. They are dependent on the physical properties from which they emerge, but opinions vary as to the coherence of top–down causation, that is, the causal effectiveness of such properties. A form of emergent materialism has been espoused by David Chalmers and the concept has undergone something of a renaissance in recent years, but it was already suggested in the 19th century by William James.
  2. Epiphenomenalism is a doctrine first formulated by Thomas Henry Huxley. It consists of the view that mental phenomena are causally ineffectual, where one or more mental states do not have any influence on physical states or mental phenomena are the effects, but not the causes, of physical phenomena. Physical events can cause other physical and mental events, but mental events cannot cause anything since they are just causally inert by-products (i.e., epiphenomena) of the physical world. This view has been defended by Frank Jackson.
  3. Non-reductive physicalism is the view that mental properties form a separate ontological class to physical properties: mental states (such as qualia) are not reducible to physical states. The ontological stance towards qualia in the case of non-reductive physicalism does not imply that qualia are causally inert; this is what distinguishes it from epiphenomenalism.
  4. Panpsychism is the view that all matter has a mental aspect, or, alternatively, all objects have a unified center of experience or point of view. Superficially, it seems to be a form of property dualism, since it regards everything as having both mental and physical properties. However, some panpsychists say that mechanical behaviour is derived from the primitive mentality of atoms and molecules—as are sophisticated mentality and organic behaviour, the difference being attributed to the presence or absence of complex structure in a compound object. So long as the reduction of non-mental properties to mental ones is in place, panpsychism is not a (strong) form of property dualism; otherwise it is.
Dual aspect theory

Dual aspect theory or dual-aspect monism is the view that the mental and the physical are two aspects of, or perspectives on, the same substance. (Thus it is a mixed position, which is monistic in some respects). In modern philosophical writings, the theory's relationship to neutral monism has become somewhat ill-defined, but one proffered distinction says that whereas neutral monism allows the context of a given group of neutral elements and the relationships into which they enter to determine whether the group can be thought of as mental, physical, both, or neither, dual-aspect theory suggests that the mental and the physical are manifestations (or aspects) of some underlying substance, entity or process that is itself neither mental nor physical as normally understood. Various formulations of dual-aspect monism also require the mental and the physical to be complementary, mutually irreducible and perhaps inseparable (though distinct).

Experiential dualism

This is a philosophy of mind that regards mental and physical well-being as having distinct degrees of freedom rather than being synonymous, thus implying an experiential dualism between body and mind. An example of these disparate degrees of freedom is given by Allan Wallace, who notes that it is "experientially apparent that one may be physically uncomfortable—for instance, while engaging in a strenuous physical workout—while mentally cheerful; conversely, one may be mentally distraught while experiencing physical comfort". Experiential dualism notes that our subjective experience of merely seeing something in the physical world seems qualitatively different from mental processes like the grief that comes from losing a loved one. This philosophy is a proponent of causal dualism, defined as the dual ability of mental states and physical states to affect one another: mental states can cause changes in physical states and vice versa.

However, unlike Cartesian dualism or some other systems, experiential dualism does not posit two fundamental substances in reality: mind and matter. Rather, experiential dualism is to be understood as a conceptual framework that gives credence to the qualitative difference between the experience of mental and physical states. Experiential dualism is accepted as the conceptual framework of Madhyamaka Buddhism.

Madhyamaka Buddhism goes further, finding fault with the monist view of physicalist philosophies of mind as well, in that these generally posit matter and energy as the fundamental substance of reality. Nonetheless, this does not imply that the Cartesian dualist view is correct; rather, Madhyamaka regards as error any affirming view of a fundamental substance to reality.

In denying the independent self-existence of all the phenomena that make up the world of our experience, the Madhyamaka view departs from both the substance dualism of Descartes and the substance monism—namely, physicalism—that is characteristic of modern science. The physicalism propounded by many contemporary scientists seems to assert that the real world is composed of physical things-in-themselves, while all mental phenomena are regarded as mere appearances, devoid of any reality in and of themselves. Much is made of this difference between appearances and reality.

Indeed, physicalism, or the idea that matter is the only fundamental substance of reality, is explicitly rejected by Buddhism.

In the Madhyamaka view, mental events are no more or less real than physical events. In terms of our common-sense experience, differences of kind do exist between physical and mental phenomena. While the former commonly have mass, location, velocity, shape, size, and numerous other physical attributes, these are not generally characteristic of mental phenomena. For example, we do not commonly conceive of the feeling of affection for another person as having mass or location. These physical attributes are no more appropriate to other mental events such as sadness, a recalled image from one's childhood, the visual perception of a rose, or consciousness of any sort. Mental phenomena are, therefore, not regarded as being physical, for the simple reason that they lack many of the attributes that are uniquely characteristic of physical phenomena. Thus, Buddhism has never adopted the physicalist principle that regards only physical things as real.

Monist solutions to the mind–body problem

In contrast to dualism, monism does not accept any fundamental divisions. A fundamentally undivided view of reality has been central to forms of Eastern philosophy for over two millennia. In Indian and Chinese philosophy, monism is integral to how experience is understood. Today, the most common forms of monism in Western philosophy are physicalist. Physicalistic monism asserts that the only existing substance is physical, in some sense of that term to be clarified by our best science. However, a variety of formulations (see below) are possible. Another form of monism, idealism, states that the only existing substance is mental. Although pure idealism, such as that of George Berkeley, is uncommon in contemporary Western philosophy, a more sophisticated variant called panpsychism, according to which mental experience and properties may be at the foundation of physical experience and properties, has been espoused by some philosophers such as Alfred North Whitehead and David Ray Griffin.

Phenomenalism is the theory that representations (or sense data) of external objects are all that exist. Such a view was briefly adopted by Bertrand Russell and many of the logical positivists during the early 20th century. A third possibility is to accept the existence of a basic substance that is neither physical nor mental. The mental and physical would then both be properties of this neutral substance. Such a position was adopted by Baruch Spinoza and was popularized by Ernst Mach in the 19th century. This neutral monism, as it is called, resembles property dualism.

Physicalistic monisms

Behaviorism

Behaviorism dominated philosophy of mind for much of the 20th century, especially the first half. In psychology, behaviorism developed as a reaction to the inadequacies of introspectionism. Introspective reports on one's own interior mental life are not subject to careful examination for accuracy and cannot be used to form predictive generalizations. Without generalizability and the possibility of third-person examination, the behaviorists argued, psychology cannot be scientific. The way out, therefore, was to eliminate the idea of an interior mental life (and hence an ontologically independent mind) altogether and focus instead on the description of observable behavior.

Parallel to these developments in psychology, a philosophical behaviorism (sometimes called logical behaviorism) was developed. This is characterized by a strong verificationism, which generally considers unverifiable statements about interior mental life pointless. For the behaviorist, mental states are not interior states on which one can make introspective reports. They are just descriptions of behavior or dispositions to behave in certain ways, made by third parties to explain and predict another's behavior.

Philosophical behaviorism has fallen out of favor since the latter half of the 20th century, coinciding with the rise of cognitivism.

Identity theory

Type physicalism (or type-identity theory) was developed by Jack Smart and Ullin Place as a direct reaction to the failure of behaviorism. These philosophers reasoned that, if mental states are something material, but not behavioral, then mental states are probably identical to internal states of the brain. In very simplified terms: a mental state M is nothing other than brain state B. The mental state "desire for a cup of coffee" would thus be nothing more than the "firing of certain neurons in certain brain regions".

The classic Identity theory and Anomalous Monism in contrast. For the Identity theory, every token instantiation of a single mental type corresponds (as indicated by the arrows) to a physical token of a single physical type. For anomalous monism, the token–token correspondences can fall outside of the type–type correspondences. The result is token identity.

On the other hand, even granted the above, it does not follow that identity theories of all types must be abandoned. According to token identity theories, the fact that a certain brain state is connected with only one mental state of a person does not have to mean that there is an absolute correlation between types of mental state and types of brain state. The type–token distinction can be illustrated by a simple example: the word "green" contains four types of letters (g, r, e, n) with two tokens (occurrences) of the letter e along with one each of the others. The idea of token identity is that only particular occurrences of mental events are identical with particular occurrences or tokenings of physical events. Anomalous monism (see below) and most other non-reductive physicalisms are token-identity theories. Despite these problems, there is a renewed interest in the type identity theory today, primarily due to the influence of Jaegwon Kim.
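The type–token distinction can be made concrete in a few lines of code. The sketch below is only an illustration added here (not part of the philosophical literature); it counts letter types and letter tokens in the word "green", mirroring the example above.

    from collections import Counter

    word = "green"
    tokens = list(word)        # every occurrence counts: ['g', 'r', 'e', 'e', 'n']
    types = Counter(tokens)    # each distinct letter counts once, with a tally

    print(len(tokens))         # 5 tokens
    print(len(types))          # 4 types: g, r, e, n
    print(types["e"])          # 2 tokens of the type 'e'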

Functionalism

Functionalism was formulated by Hilary Putnam and Jerry Fodor as a reaction to the inadequacies of the identity theory. Putnam and Fodor saw mental states in terms of an empirical computational theory of the mind. At about the same time or slightly after, D.M. Armstrong and David Kellogg Lewis formulated a version of functionalism that analyzed the mental concepts of folk psychology in terms of functional roles. Finally, Wittgenstein's idea of meaning as use led to a version of functionalism as a theory of meaning, further developed by Wilfrid Sellars and Gilbert Harman. A further variant, psychofunctionalism, is an approach adopted by the naturalistic philosophy of mind associated with Jerry Fodor and Zenon Pylyshyn.

Mental states are characterized by their causal relations with other mental states and with sensory inputs and behavioral outputs. Functionalism abstracts away from the details of the physical implementation of a mental state by characterizing it in terms of non-mental functional properties. For example, a kidney is characterized scientifically by its functional role in filtering blood and maintaining certain chemical balances.
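The functionalist idea that the same role can have different physical realizers (multiple realizability) can be illustrated with a toy sketch. The example below is added here purely as an illustration; the class names and the "pain" role are invented. Two very different substrates share the same input-output causal profile, so a functionalist would count them as being in the same kind of state.

    # Toy illustration of functionalism: a mental state is defined by its causal
    # role (what causes it and what it causes), not by its physical realizer.

    class CarbonBrain:
        def __init__(self):
            self.state = "neutral"
        def receive(self, stimulus):              # sensory input
            if stimulus == "tissue damage":
                self.state = "pain"               # realized by an (imagined) neural state
        def behave(self):                         # behavioral output
            return "withdraw limb" if self.state == "pain" else "carry on"

    class SiliconController:
        def __init__(self):
            self.register = 0
        def receive(self, stimulus):
            if stimulus == "tissue damage":
                self.register = 1                 # realized by a flipped bit instead
        def behave(self):
            return "withdraw limb" if self.register == 1 else "carry on"

    for system in (CarbonBrain(), SiliconController()):
        system.receive("tissue damage")
        # Same causal role, different physical implementation.
        print(type(system).__name__, "->", system.behave())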

Non-reductive physicalism

Non-reductionist philosophers hold firmly to two essential convictions with regard to mind–body relations: 1) Physicalism is true and mental states must be physical states, but 2) All reductionist proposals are unsatisfactory: mental states cannot be reduced to behavior, brain states or functional states. Hence, the question arises whether there can still be a non-reductive physicalism. Donald Davidson's anomalous monism is an attempt to formulate such a physicalism. He "thinks that when one runs across what are traditionally seen as absurdities of Reason, such as akrasia or self-deception, the personal psychology framework is not to be given up in favor of the subpersonal one, but rather must be enlarged or extended so that the rationality set out by the principle of charity can be found elsewhere."

Davidson uses the thesis of supervenience: mental states supervene on physical states, but are not reducible to them. "Supervenience" therefore describes a functional dependence: there can be no change in the mental without some change in the physical. This allows a causal dependence between the mental and the physical without ontological reducibility.
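A common textbook gloss of the supervenience claim (a formal restatement added here, not Davidson's own wording) is that there can be no mental difference without a physical difference. In logical notation, with P ranging over physical properties and M over mental properties:

    \forall x \, \forall y \; [\, \forall P \,(Px \leftrightarrow Py) \rightarrow \forall M \,(Mx \leftrightarrow My) \,]

Read: any two things (or world states) that agree on all their physical properties agree on all their mental properties, which leaves open whether mental properties can be reduced to physical ones.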

Weak emergentism

Weak emergentism is a form of "non-reductive physicalism" that involves a layered view of nature, with the layers arranged in terms of increasing complexity and each corresponding to its own special science. Some philosophers hold that emergent properties causally interact with more fundamental levels, while others maintain that higher-order properties simply supervene over lower levels without direct causal interaction. The latter group therefore holds a less strict, or "weaker", definition of emergentism, which can be rigorously stated as follows: a property P of composite object O is emergent if it is metaphysically impossible for another object to lack property P if that object is composed of parts with intrinsic properties identical to those in O and has those parts in an identical configuration.

Sometimes emergentists use the example of water acquiring a new property when hydrogen (H) and oxygen (O) combine to form H2O (water). In this example a new property of a transparent liquid "emerges" that would not have been predicted by understanding hydrogen and oxygen as gases. This is analogous to physical properties of the brain giving rise to a mental state. Emergentists try to solve the notorious mind–body gap this way. One problem for emergentism is the idea of causal closure in the world, which does not allow for mind-to-body causation.

Eliminative materialism

If one is a materialist and believes that all aspects of our common-sense psychology will find reduction to a mature cognitive neuroscience, and that non-reductive materialism is mistaken, then one can adopt a final, more radical position: eliminative materialism.

There are several varieties of eliminative materialism, but all maintain that our common-sense "folk psychology" badly misrepresents the nature of some aspect of cognition. Eliminativists such as Patricia and Paul Churchland argue that while folk psychology treats cognition as fundamentally sentence-like, the non-linguistic vector/matrix model of neural network theory or connectionism will prove to be a much more accurate account of how the brain works.
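The contrast the Churchlands draw can be sketched in a few lines of code (an illustration added here, with made-up numbers): folk psychology treats a belief as a sentence-like, structured item, whereas a connectionist model treats the corresponding state as a pattern of activation transformed by a matrix of connection weights.

    import numpy as np

    # "Sentence-cruncher" picture: a propositional attitude is a structured,
    # language-like item that the system stores and manipulates.
    belief = ("believes", "it", "is", "raining")

    # Connectionist picture: the state is a distributed activation vector, and
    # processing is multiplication by learned connection weights, not sentence logic.
    rng = np.random.default_rng(0)
    activation = rng.random(4)              # toy input pattern
    weights = rng.random((3, 4))            # toy learned connection strengths
    hidden = np.tanh(weights @ activation)  # next layer's activation pattern

    print(belief)   # discrete, sentence-like parts
    print(hidden)   # no sentence-like parts, just a vector of numbers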

The Churchlands often invoke the fate of other, erroneous popular theories and ontologies that have arisen in the course of history. For example, Ptolemaic astronomy served to explain and roughly predict the motions of the planets for centuries, but eventually this model of the Solar System was eliminated in favor of the Copernican model. The Churchlands believe the same eliminative fate awaits the "sentence-cruncher" model of the mind in which thought and behavior are the result of manipulating sentence-like states called "propositional attitudes". Sociologist Jacy Reese Anthis argues for eliminative materialism on all faculties of mind, including consciousness, stating, "The deepest mysteries of the mind are within our reach."

Mysterianism

Some philosophers take an epistemic approach and argue that the mind–body problem is currently unsolvable, and perhaps will always remain unsolvable to human beings. This position is usually termed new mysterianism. Colin McGinn holds that human beings are cognitively closed with regard to their own minds. According to McGinn, human minds lack the concept-forming procedures needed to fully grasp how mental properties such as consciousness arise from their causal basis. An example would be how an elephant is cognitively closed with regard to particle physics.

A more moderate conception has been expounded by Thomas Nagel: the mind–body problem is unsolvable at the present stage of scientific development, and it might take a future scientific paradigm shift or revolution to bridge the explanatory gap. Nagel posits that in the future a sort of "objective phenomenology" might be able to bridge the gap between subjective conscious experience and its physical basis.

Linguistic criticism of the mind–body problem

Each attempt to answer the mind–body problem encounters substantial problems. Some philosophers argue that this is because there is an underlying conceptual confusion. These philosophers, such as Ludwig Wittgenstein and his followers in the tradition of linguistic criticism, therefore reject the problem as illusory. They argue that it is an error to ask how mental and biological states fit together. Rather it should simply be accepted that human experience can be described in different ways—for instance, in a mental and in a biological vocabulary. Illusory problems arise if one tries to describe the one in terms of the other's vocabulary or if the mental vocabulary is used in the wrong contexts.[74] This is the case, for instance, if one searches for mental states of the brain. The brain is simply the wrong context for the use of mental vocabulary—the search for mental states of the brain is therefore a category error or a sort of fallacy of reasoning.

Today, such a position is often adopted by interpreters of Wittgenstein such as Peter Hacker. However, Hilary Putnam, the originator of functionalism, has also adopted the position that the mind–body problem is an illusory problem which should be dissolved according to the manner of Wittgenstein.

Naturalism and its problems

The thesis of physicalism is that the mind is part of the material (or physical) world. Such a position faces the problem that the mind has certain properties that no other material thing seems to possess. Physicalism must therefore explain how it is possible that these properties can nonetheless emerge from a material thing. The project of providing such an explanation is often referred to as the "naturalization of the mental". Some of the crucial problems that this project attempts to resolve include the existence of qualia and the nature of intentionality.

Qualia

Many mental states seem to be experienced subjectively in different ways by different individuals. And it is characteristic of a mental state that it has some experiential quality, e.g. of pain, that it hurts. However, the sensation of pain between two individuals may not be identical, since no one has a perfect way to measure how much something hurts or of describing exactly how it feels to hurt. Philosophers and scientists therefore ask where these experiences come from. The existence of cerebral events, in and of themselves, cannot explain why they are accompanied by these corresponding qualitative experiences. The puzzle of why many cerebral processes occur with an accompanying experiential aspect in consciousness seems impossible to explain.

Yet it also seems to many that science will eventually have to explain such experiences. This follows from an assumption about the possibility of reductive explanations. According to this view, if an attempt can be successfully made to explain a phenomenon reductively (e.g., water), then it can be explained why the phenomenon has all of its properties (e.g., fluidity, transparency). In the case of mental states, this means that there needs to be an explanation of why they have the property of being experienced in a certain way.

The 20th-century German philosopher Martin Heidegger criticized the ontological assumptions underpinning such a reductive model, and claimed that it was impossible to make sense of experience in these terms. This is because, according to Heidegger, the nature of our subjective experience and its qualities is impossible to understand in terms of Cartesian "substances" that bear "properties". Another way to put this is that the very concept of qualitative experience is incoherent in terms of—or is semantically incommensurable with the concept of—substances that bear properties.

This problem of explaining introspective first-person aspects of mental states and consciousness in general in terms of third-person quantitative neuroscience is called the explanatory gap. There are several different views of the nature of this gap among contemporary philosophers of mind. David Chalmers and the early Frank Jackson interpret the gap as ontological in nature; that is, they maintain that qualia can never be explained by science because physicalism is false. There are two separate categories involved and one cannot be reduced to the other. An alternative view is taken by philosophers such as Thomas Nagel and Colin McGinn. According to them, the gap is epistemological in nature. For Nagel, science is not yet able to explain subjective experience because it has not yet arrived at the level or kind of knowledge that is required. We are not even able to formulate the problem coherently. For McGinn, on the other hand, the problem is one of permanent and inherent biological limitations. We are not able to resolve the explanatory gap because the realm of subjective experiences is cognitively closed to us in the same manner that quantum physics is cognitively closed to elephants. Other philosophers treat the gap as a purely semantic problem.

Intentionality

John Searle—one of the most influential philosophers of mind, proponent of biological naturalism (Berkeley 2002)

Intentionality is the capacity of mental states to be directed towards (about) or be in relation with something in the external world. This property of mental states entails that they have contents and semantic referents and can therefore be assigned truth values. When one tries to reduce these states to natural processes there arises a problem: natural processes are not true or false, they simply happen. It would not make any sense to say that a natural process is true or false. But mental ideas or judgments are true or false, so how then can mental states (ideas or judgments) be natural processes? The possibility of assigning semantic value to ideas must mean that such ideas are about facts. Thus, for example, the idea that Herodotus was a historian refers to Herodotus and to the fact that he was a historian. If the fact is true, then the idea is true; otherwise, it is false. But where does this relation come from? In the brain, there are only electrochemical processes and these seem not to have anything to do with Herodotus.

Philosophy of perception

Philosophy of perception is concerned with the nature of perceptual experience and the status of perceptual objects, in particular how perceptual experience relates to appearances and beliefs about the world. The main contemporary views within philosophy of perception include naive realism, enactivism and representational views.

Philosophy of mind and science

A phrenological mapping of the brain. Phrenology was among the first attempts to correlate mental functions with specific parts of the brain, although it is now widely discredited.

Humans are corporeal beings and, as such, they are subject to examination and description by the natural sciences. Since mental processes are intimately related to bodily processes (e.g., embodied cognition theory of mind), the descriptions that the natural sciences furnish of human beings play an important role in the philosophy of mind. There are many scientific disciplines that study processes related to the mental. The list of such sciences includes: biology, computer science, cognitive science, cybernetics, linguistics, medicine, pharmacology, and psychology.

Neurobiology

The theoretical background of biology, as is the case with modern natural sciences in general, is fundamentally materialistic. The objects of study are, in the first place, physical processes, which are considered to be the foundations of mental activity and behavior. The increasing success of biology in the explanation of mental phenomena can be seen by the absence of any empirical refutation of its fundamental presupposition: "there can be no change in the mental states of a person without a change in brain states."

Within the field of neurobiology, there are many subdisciplines that are concerned with the relations between mental and physical states and processes: Sensory neurophysiology investigates the relation between the processes of perception and stimulation. Cognitive neuroscience studies the correlations between mental processes and neural processes. Neuropsychology describes the dependence of mental faculties on specific anatomical regions of the brain. Lastly, evolutionary biology studies the origins and development of the human nervous system and, in as much as this is the basis of the mind, also describes the ontogenetic and phylogenetic development of mental phenomena beginning from their most primitive stages. Evolutionary biology furthermore places tight constraints on any philosophical theory of the mind, as the gene-based mechanism of natural selection does not allow any giant leaps in the development of neural complexity or neural software but only incremental steps over long time periods.

Since the 1980s, sophisticated neuroimaging procedures, such as fMRI, have furnished increasing knowledge about the workings of the human brain, shedding light on ancient philosophical problems.

The methodological breakthroughs of the neurosciences, in particular the introduction of high-tech neuroimaging procedures, have propelled scientists toward the elaboration of increasingly ambitious research programs: one of the main goals is to describe and comprehend the neural processes which correspond to mental functions (see: neural correlate). Several groups are inspired by these advances.

Neurophilosophy

Neurophilosophy is an interdisciplinary field that examines the intersection of neuroscience and philosophy, particularly focusing on how neuroscientific findings inform and challenge traditional arguments in the philosophy of mind, offering insights into the nature of consciousness, cognition, and the mind-brain relationship.

Patricia Churchland argues for a deep integration of neuroscience and philosophy, emphasizing that understanding the mind requires grounding philosophical questions in empirical findings about the brain. Churchland challenges traditional dualistic and purely conceptual approaches to the mind, advocating for a materialistic framework where mental phenomena are understood as brain processes. She posits that philosophical theories of mind must be informed by advances in neuroscience, such as the study of neural networks, brain plasticity, and the biochemical basis of cognition and behavior. Churchland critiques the idea that introspection or purely conceptual analysis can sufficiently explain consciousness, arguing instead that empirical methods can illuminate how subjective experiences arise from neural mechanisms.

An unsolved question in neuroscience and the philosophy of mind is the binding problem, which is the problem of how objects, background, and abstract or emotional features are combined into a single experience. It is considered a "problem" because no complete model exists. The binding problem can be subdivided into the four areas of perception, neuroscience, cognitive science, and the philosophy of mind. It includes general considerations on coordination, the subjective unity of perception, and variable binding. Another related problem is known as the boundary problem. The boundary problem is essentially the inverse of the binding problem, and asks how binding stops occurring and what prevents other neurological phenomena from being included in first-person perspectives, giving first-person perspectives hard boundaries.

Computer science

Computer science concerns itself with the automatic processing of information (or at least with physical systems of symbols to which information is assigned) by means of such things as computers. From the beginning, computer programmers have been able to develop programs that permit computers to carry out tasks for which organic beings need a mind. A simple example is multiplication. It is not clear whether computers could be said to have a mind. Could they, someday, come to have what we call a mind? This question has been propelled into the forefront of much philosophical debate because of investigations in the field of artificial intelligence (AI).

Within AI, it is common to distinguish between a modest research program and a more ambitious one: this distinction was coined by John Searle in terms of weak AI and strong AI. The exclusive objective of "weak AI", according to Searle, is the successful simulation of mental states, with no attempt to make computers become conscious or aware. The objective of strong AI, on the contrary, is a computer with consciousness similar to that of human beings. The program of strong AI goes back to one of the pioneers of computation, Alan Turing. As an answer to the question "Can computers think?", he formulated the famous Turing test. Turing believed that a computer could be said to "think" when, placed in a room by itself next to another room containing a human being, with the same questions asked of both the computer and the human by a third-party human being, the computer's responses turned out to be indistinguishable from those of the human. Essentially, Turing's view of machine intelligence followed the behaviourist model of the mind: intelligence is as intelligence does. The Turing test has received many criticisms, among which the most famous is probably the Chinese room thought experiment formulated by Searle.
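The structure of the test can be sketched in code. The sketch below is a toy illustration added here (the canned replies and the random judge are invented for the example): a judge sees only anonymized transcripts from two respondents and must guess which one is the machine. If judges cannot do better than chance, the machine passes on Turing's criterion.

    import random

    def machine_reply(question):
        # Hypothetical stand-in for the AI system under test.
        canned = {"What is 7 * 8?": "56",
                  "How do you feel today?": "I feel fine, thank you."}
        return canned.get(question, "That is an interesting question.")

    def human_reply(question):
        # Scripted stand-in for the human respondent, so the example runs.
        scripted = {"What is 7 * 8?": "56, I think.",
                    "How do you feel today?": "A bit tired, honestly."}
        return scripted.get(question, "Hmm, let me think about that.")

    def run_trial(questions, judge):
        # Randomly hide the two respondents behind the labels "A" and "B".
        funcs = [machine_reply, human_reply]
        random.shuffle(funcs)
        respondents = dict(zip("AB", funcs))
        transcripts = {label: [(q, fn(q)) for q in questions]
                       for label, fn in respondents.items()}
        guess = judge(transcripts)  # the judge names the label it takes to be the machine
        actual = next(label for label, fn in respondents.items() if fn is machine_reply)
        return guess == actual

    def naive_judge(transcripts):
        # A toy judge that guesses at random; it is right only about half the time.
        return random.choice(list(transcripts))

    questions = ["What is 7 * 8?", "How do you feel today?"]
    trials = 1000
    correct = sum(run_trial(questions, naive_judge) for _ in range(trials))
    print(f"Judge identified the machine in {correct}/{trials} trials")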

The question of whether computers or robots could have sentience (qualia) remains open. Some computer scientists believe that the field of AI can still make new contributions to the resolution of the mind–body problem. They suggest that, based on the reciprocal influences between software and hardware that take place in all computers, it may someday be possible to discover theories that help us understand the reciprocal influences between the human mind and the brain (wetware).

Psychology

Psychology is the science that investigates mental states directly. It generally uses empirical methods to investigate concrete mental states like joy, fear or obsessions. Psychology investigates the laws that bind these mental states to each other or with inputs and outputs to the human organism.

An example of this is the psychology of perception. Scientists working in this field have discovered general principles of the perception of forms. A law of Gestalt psychology (the psychology of forms) says that objects that move in the same direction are perceived as related to each other. This law describes a relation between visual input and mental perceptual states. However, it does not suggest anything about the nature of perceptual states. The laws discovered by psychology are compatible with all the answers to the mind–body problem already described.

Cognitive science

Cognitive science is the interdisciplinary scientific study of the mind and its processes. It examines what cognition is, what it does, and how it works. It includes research on intelligence and behavior, especially focusing on how information is represented, processed, and transformed (in faculties such as perception, language, memory, reasoning, and emotion) within nervous systems (human or other animals) and machines (e.g. computers). Cognitive science consists of multiple research disciplines, including psychology, artificial intelligence, philosophy, neuroscience, linguistics, anthropology, sociology, and education. It spans many levels of analysis, from low-level learning and decision mechanisms to high-level logic and planning; from neural circuitry to modular brain organization. Over the years, cognitive science has evolved from a representational and information processing approach to explaining the mind to embrace an embodied perspective of it. Accordingly, bodily processes play a significant role in the acquisition, development, and shaping of cognitive capabilities. For instance, Rowlands (2012) argues that cognition is enactive, embodied, embedded, affective and (potentially) extended. The position is taken that the "classical sandwich" of cognition sandwiched between perception and action is artificial; cognition has to be seen as a product of a strongly coupled interaction that cannot be divided this way.

Near-death research

In the field of near-death research, the following phenomenon, among others, occurs: during some brain operations the brain is artificially and measurably deactivated. Nevertheless, some patients report that during this phase they perceived what was happening in their surroundings, that is, that they had consciousness. Patients also report experiences during cardiac arrest. The problem is this: as soon as the brain is no longer supplied with blood, and thus with oxygen, after a cardiac arrest, it ceases its normal operation after about 15 seconds, that is, it falls into a state of unconsciousness.

Philosophy of mind in the continental tradition

Most of the discussion in this article has focused on one style or tradition of philosophy in modern Western culture, usually called analytic philosophy (sometimes described as Anglo-American philosophy). Many other schools of thought exist, however, which are sometimes subsumed under the broad (and vague) label of continental philosophy. In any case, though topics and methods here are numerous, in relation to the philosophy of mind the various schools that fall under this label (phenomenology, existentialism, etc.) can globally be seen to differ from the analytic school in that they focus less on language and logical analysis alone but also take in other forms of understanding human existence and experience. With reference specifically to the discussion of the mind, this tends to translate into attempts to grasp the concepts of thought and perceptual experience in some sense that does not merely involve the analysis of linguistic forms.

Immanuel Kant's Critique of Pure Reason, first published in 1781 and presented again with major revisions in 1787, represents a significant intervention into what would later become known as the philosophy of mind. Kant's first critique is generally recognized as among the most significant works of modern philosophy in the West. Kant is a figure whose influence is marked in both continental and analytic/Anglo-American philosophy. Kant's work develops an in-depth study of transcendental consciousness, or the life of the mind as conceived through the universal categories of understanding.

In Georg Wilhelm Friedrich Hegel's Philosophy of Mind (frequently translated as Philosophy of Spirit or Geist), the third part of his Encyclopedia of the Philosophical Sciences, Hegel discusses three distinct types of mind: the "subjective mind/spirit", the mind of an individual; the "objective mind/spirit", the mind of society and of the State; and the "Absolute mind/spirit", the position of religion, art, and philosophy. See also Hegel's The Phenomenology of Spirit. Nonetheless, Hegel's work differs radically from the style of Anglo-American philosophy of mind.

In 1896, in Matter and Memory: Essay on the Relation of Body and Spirit, Henri Bergson made a forceful case for the ontological difference of body and mind by reducing the problem to the more definite one of memory, thus allowing for a solution built on the empirical test case of aphasia.

In modern times, the two main schools that have developed in response or opposition to this Hegelian tradition are phenomenology and existentialism. Phenomenology, founded by Edmund Husserl, focuses on the contents of the human mind (see noema) and how processes shape our experiences. Existentialism, a school of thought founded upon the work of Søren Kierkegaard, focuses on the human predicament and how people deal with the situation of being alive. Existential phenomenology represents a major branch of continental philosophy (the two are not contradictory), rooted in the work of Husserl but expressed in its fullest forms in the work of Martin Heidegger, Jean-Paul Sartre, Simone de Beauvoir and Maurice Merleau-Ponty. See Heidegger's Being and Time, Merleau-Ponty's Phenomenology of Perception, Sartre's Being and Nothingness, and Simone de Beauvoir's The Second Sex.

There are countless subjects that are affected by the ideas developed in the philosophy of mind. Clear examples are the nature of death and its definitive character, the nature of emotion, of perception and of memory. Questions about what a person is and what constitutes his or her identity also have to do with the philosophy of mind. Two subjects that, in connection with the philosophy of mind, have aroused special attention are free will and the self.

Free will

In the context of philosophy of mind, the problem of free will takes on renewed intensity. This is the case for materialistic determinists. According to this position, natural laws completely determine the course of the material world. Mental states, and therefore the will as well, would be material states, which means human behavior and decisions would be completely determined by natural laws. Some take this reasoning a step further: people cannot determine by themselves what they want and what they do. Consequently, they are not free.

This argumentation is rejected, on the one hand, by the compatibilists. Those who adopt this position suggest that the question "Are we free?" can only be answered once we have determined what the term "free" means. The opposite of "free" is not "caused" but "compelled" or "coerced". It is not appropriate to identify freedom with indetermination. A free act is one where the agent could have done otherwise if it had chosen otherwise. In this sense a person can be free even though determinism is true. The most important compatibilist in the history of philosophy was David Hume. More recently, this position has been defended, for example, by Daniel Dennett.

On the other hand, there are also many incompatibilists who reject the argument because they believe that the will is free in a stronger sense called libertarianism. These philosophers affirm that the course of the world is either a) not completely determined by natural law, where natural law is intercepted by physically independent agency; b) determined by indeterministic natural law only; or c) determined by indeterministic natural law in line with the subjective effort of physically non-reducible agency. Under libertarianism, the will does not have to be deterministic and is therefore potentially free. Critics of the second proposition (b) accuse the incompatibilists of using an incoherent concept of freedom. They argue as follows: if our will is not determined by anything, then we desire what we desire by pure chance. And if what we desire is purely accidental, we are not free. So if our will is not determined by anything, we are not free.

Self

The philosophy of mind also has important consequences for the concept of the "self". If by "self" or "I" one refers to an essential, immutable nucleus of the person, some modern philosophers of mind, such as Daniel Dennett, believe that no such thing exists. According to Dennett and other contemporaries, the self is an illusion. The idea of a self as an immutable essential nucleus derives from the idea of an immaterial soul. Such an idea is unacceptable to many modern philosophers because of their physicalist orientations and because of a general acceptance of the skepticism about the concept of a "self" postulated by David Hume, who could never catch himself not doing, thinking or feeling anything. However, in the light of empirical results from developmental psychology, developmental biology and neuroscience, the idea of an essential, inconstant, material nucleus—an integrated representational system distributed over changing patterns of synaptic connections—seems reasonable.

One question central to the philosophy of personal identity is Benj Hellie's vertiginous question. The vertiginous question asks why, of all the subjects of experience out there, this one—the one corresponding to the human being referred to as Benj Hellie—is the one whose experiences are live? (The reader is supposed to substitute their own case for Hellie's.) In other words: Why am I me and not someone else? A common response to the question is that it reduces to "Why are Hellie's experiences live from Hellie's perspective," and thus the entire question is a tautology. However, Hellie argues, through a parable, that this response leaves something out. His parable describes two situations, one reflecting a broad global constellation view of the world and everyone's phenomenal features, and one describing an embedded view from the perspective of a single subject. Caspar Hare has discussed similar ideas with the concepts of egocentric presentism and perspectival realism.

In his book I am You: The Metaphysical Foundations for Global Ethics, Daniel Kolak advocates for a philosophy he calls open individualism. Open individualism states that individual personal identity is an illusion and all individual conscious minds are in reality the same being, similar to the idea of anattā in Buddhist philosophy. Kolak describes three opposing philosophical views of personal identity: closed individualism, empty individualism, and open individualism. Closed individualism is considered to be the default view of personal identity, which is that one's personal identity consists of a ray or line traveling through time, and that one has a future self. Empty individualism is another view, which is that personal identity exists, but one's "identity" only persists for an infinitesimally small amount of time, and the "you" that will exist in the future is an ontologically different being from the "you" that exists now. Similar ideas have been discussed by Derek Parfit in the book Reasons and Persons with thought experiments such as the teletransportation paradox.

Thomas Nagel further discusses the philosophy of self and perspective in the book The View from Nowhere. It contrasts passive and active points of view in how humanity interacts with the world, relying either on a subjective perspective that reflects a point of view or an objective perspective that takes a more detached perspective. Nagel describes the objective perspective as the "view from nowhere", one where the only valuable ideas are ones derived independently.

Problems in Philosophy of Mind

1) Is the emergent level autonomous?

2) Can constraint and constitution be causal relations for mental causation?

3) Does downward causation violate fundamental micro-level explanation?

4) Can downward causation between two levels be generalised to other levels?

5) Is self-organization an answer to the reductionism–anti-reductionism debate? Is it a paradigm shift from substance and process philosophy?

6) What is the meaning of causation in downward causation?

7) Synchronic and diachronic identity of the individual self; the problem of identity (the Ship of Theseus).

8) Is the reductionism–anti-reductionism debate the same as the physicalism–non-reductive physicalism debate?

9) How can a higher level of organization, which is dependent on the lower level for its existence, have any causal impact on the lower level?

10) Is causal asymmetry violated in mental causation?

11) Is artificial intelligence theoretical psychology?

12) Are special sciences such as psychology autonomous in their explanations, or are they reducible to lower levels?

13) Is perception a controlled form of hallucination?

14) Why is there a subjective feeling when the brain is processing information?

15) What is epiphenomenal causation?

Norse mythology

From Wikipedia, the free encyclopedia
The Tjängvide image stone with illustrations from Norse mythology

Norse, Nordic, or Scandinavian mythology, is the body of myths belonging to the North Germanic peoples, stemming from Old Norse religion and continuing after the Christianization of Scandinavia as the Nordic folklore of the modern period. The northernmost extension of Germanic mythology and stemming from Proto-Germanic folklore, Norse mythology consists of tales of various deities, beings, and heroes derived from numerous sources from both before and after the pagan period, including medieval manuscripts, archaeological representations, and folk tradition. The source texts mention numerous gods such as the thunder-god Thor, the raven-flanked god Odin, the goddess Freyja, and numerous other deities.

The god Loki, son of Fárbauti and Laufey

Most of the surviving mythology centers on the plights of the gods and their interaction with several other beings, such as humanity and the jötnar, beings who may be friends, lovers, foes, or family members of the gods. The cosmos in Norse mythology consists of Nine Worlds that flank a central sacred tree, Yggdrasil. Units of time and elements of the cosmology are personified as deities or beings. Various forms of a creation myth are recounted, where the world is created from the flesh of the primordial being Ymir, and the first two humans are Ask and Embla. These worlds are foretold to be reborn after the events of Ragnarök when an immense battle occurs between the gods and their enemies, and the world is enveloped in flames, only to be reborn anew. There the surviving gods will meet, and the land will be fertile and green, and two humans will repopulate the world.

Norse mythology has been the subject of scholarly discourse since the 17th century when key texts attracted the attention of the intellectual circles of Europe. By way of comparative mythology and historical linguistics, scholars have identified elements of Germanic mythology reaching as far back as Proto-Indo-European mythology. During the modern period, the Romanticist Viking revival re-awoke an interest in the subject matter, and references to Norse mythology may now be found throughout modern popular culture. The myths have further been revived in a religious context among adherents of Germanic Neopaganism.

Terminology

The historical religion of the Norse people is commonly referred to as Norse mythology. Other terms are Scandinavian mythology, North Germanic mythology or Nordic mythology.

Sources

The Rök runestone (Ög 136), located in Rök, Sweden, features a Younger Futhark runic inscription that makes various references to Norse mythology.

Norse mythology is primarily attested in dialects of Old Norse, a North Germanic language spoken by the Scandinavian people during the European Middle Ages and the ancestor of modern Scandinavian languages. The majority of these Old Norse texts were created in Iceland, where the oral tradition stemming from the pre-Christian inhabitants of the island was collected and recorded in manuscripts. This occurred primarily in the 13th century. These texts include the Prose Edda, composed in the 13th century by the Icelandic scholar, lawspeaker, and historian Snorri Sturluson, and the Poetic Edda, a collection of poems from earlier traditional material anonymously compiled in the 13th century.

The Prose Edda was composed as a prose manual for producing skaldic poetry—traditional Old Norse poetry composed by skalds. Originally composed and transmitted orally, skaldic poetry utilizes alliterative verse, kennings, and several metrical forms. The Prose Edda presents numerous examples of works by various skalds from before and after the Christianization process and also frequently refers back to the poems found in the Poetic Edda. The Poetic Edda consists almost entirely of poems, with some prose narrative added, and this poetry—Eddic poetry—utilizes fewer kennings. In comparison to skaldic poetry, Eddic poetry is relatively unadorned.

Title page of a late manuscript of the Prose Edda written by Snorri Sturluson (13th century), showing the Ancient Norse Gods Odin, Heimdallr, Sleipnir, and other figures from Norse mythology

The Prose Edda features layers of euhemerization, a process in which deities and supernatural beings are presented as having been either actual, magic-wielding human beings who have been deified in time or beings demonized by way of Christian mythology. Texts such as Heimskringla, composed in the 13th century by Snorri and Gesta Danorum, composed in Latin by Saxo Grammaticus in Denmark in the 12th century, are the results of heavy amounts of euhemerization.

Numerous additional texts, such as the sagas, provide further information. The saga corpus consists of thousands of tales recorded in Old Norse ranging from Icelandic family histories (Sagas of Icelanders) to Migration period tales mentioning historic figures such as Attila the Hun (legendary sagas). Objects and monuments such as the Rök runestone and the Kvinneby amulet feature runic inscriptions—texts written in the runic alphabet, the indigenous alphabet of the Germanic peoples—that mention figures and events from Norse mythology.

Objects from the archaeological record may also be interpreted as depictions of subjects from Norse mythology, such as amulets of the god Thor's hammer Mjölnir found among pagan burials and small silver female figures interpreted as valkyries or dísir, beings associated with war, fate or ancestor cults. By way of historical linguistics and comparative mythology, comparisons to other attested branches of Germanic mythology (such as the Old High German Merseburg Incantations) may also lend insight. Wider comparisons to the mythology of other Indo-European peoples by scholars have resulted in the potential reconstruction of far earlier myths.

Only a small fraction of the many mythical tales and poems presumed to have existed during the Middle Ages, Viking Age, Migration Period, and before survives. Later sources reaching into the modern period, such as a medieval charm recorded as used by the Norwegian woman Ragnhild Tregagås—convicted of witchcraft in Norway in the 14th century—and spells found in the 17th-century Icelandic Galdrabók grimoire, also sometimes make references to Norse mythology. Other traces, such as place names bearing the names of gods, may provide further information about deities, such as a potential association between deities based on the placement of locations bearing their names, their local popularity, and associations with geological features.

Mythology

Gods and other beings

The god Thor wades through a river, while the Æsir ride across the bridge, Bifröst, in an illustration by Lorenz Frølich (1895).

Central to accounts of Norse mythology are the plights of the gods and their interaction with various other beings, such as with the jötnar, who may be friends, lovers, foes, or family members of the gods. Numerous gods are mentioned in the source texts. As evidenced by records of personal names and place names, the most popular god among the Scandinavians during the Viking Age was Thor the thunder god, who is portrayed as unrelentingly pursuing his foes, his mountain-crushing, thunderous hammer Mjölnir in hand. In the mythology, Thor lays waste to numerous jötnar who are foes to the gods or humanity, and is wed to the beautiful, golden-haired goddess Sif.

The god Odin is also frequently mentioned in surviving texts. One-eyed, wolf- and raven-flanked, with a spear in hand, Odin pursues knowledge throughout the nine realms. In an act of self-sacrifice, Odin is described as having hung himself upside-down for nine days and nights on the cosmological tree Yggdrasil to gain knowledge of the runic alphabet, which he passed on to humanity; he is also associated closely with death, wisdom, and poetry. Odin is portrayed as the ruler of Asgard, and leader of the Aesir. Odin's wife is the powerful goddess Frigg who can see the future but tells no one, and together they have a beloved son, Baldr. After a series of dreams had by Baldr of his impending death, his death is engineered by Loki, and Baldr thereafter resides in Hel, a realm ruled over by an entity of the same name.

Odin must share half of his share of the dead with a powerful goddess, Freyja. She is beautiful, sensual, wears a feathered cloak, and practices seiðr. She rides to battle to choose among the slain and brings her chosen to her afterlife field Fólkvangr. Freyja weeps for her missing husband Óðr and seeks after him in faraway lands. Freyja's brother, the god Freyr, is also frequently mentioned in surviving texts, and in his association with the weather, royalty, human sexuality, and agriculture brings peace and pleasure to humanity. Deeply lovesick after catching sight of the beautiful jötunn Gerðr, Freyr seeks and wins her love, yet at the price of his future doom. Their father is the powerful god Njörðr. Njörðr is strongly associated with ships and seafaring, and so also wealth and prosperity. Freyja and Freyr's mother is Njörðr's unnamed sister (her name is unprovided in the source material). However, there is more information about his pairing with the skiing and hunting goddess Skaði. Their relationship is ill-fated, as Skaði cannot stand to be away from her beloved mountains, nor Njörðr from the seashore. Together, Freyja, Freyr, and Njörðr form a portion of gods known as the Vanir. While the Aesir and the Vanir retain distinct identification, they came together as the result of the Aesir–Vanir War.

While they receive less mention, numerous other gods and goddesses appear in the source material. (For a list of these deities, see List of Germanic deities.) Gods mentioned less often include the apple-bearing goddess Iðunn and her husband, the skaldic god Bragi; the gold-toothed god Heimdallr, born of nine mothers; the ancient god Týr, who lost his right hand while binding the great wolf Fenrir; and the goddess Gefjon, who formed modern-day Zealand, Denmark.

Various beings outside of the gods are mentioned. Elves and dwarfs are commonly mentioned and appear to be connected, but their attributes are vague and the relation between the two is ambiguous. Elves are described as radiant and beautiful, whereas dwarfs often act as earthen smiths. A group of beings variously described as jötnar, thursar, and trolls (in English these are all often glossed as "giants") frequently appear. These beings may either aid, deter, or take their place among the gods. The Norns, dísir, and aforementioned valkyries also receive frequent mention. While their functions and roles may overlap and differ, all are collective female beings associated with fate.

Cosmology

The cosmological, central tree Yggdrasil is depicted in The Ash Yggdrasil by Friedrich Wilhelm Heine (1886).
Sól, the Sun, and Máni, the Moon, are chased by the wolves Sköll and Háti in The Wolves Pursuing Sol and Mani by J. C. Dollman (1909).

In Norse cosmology, all beings live in Nine Worlds that center around the cosmological tree Yggdrasil. The gods inhabit the heavenly realm of Asgard whereas humanity inhabits Midgard, a region in the center of the cosmos. Outside of the gods, humanity, and the jötnar, these Nine Worlds are inhabited by beings, such as elves and dwarfs. Travel between the worlds is frequently recounted in the myths, where the gods and other beings may interact directly with humanity. Numerous creatures live on Yggdrasil, such as the insulting messenger squirrel Ratatoskr and the perching hawk Veðrfölnir. The tree itself has three major roots, and at the base of one of these roots live the Norns, female entities associated with fate. Elements of the cosmos are personified, such as the Sun (Sól, a goddess), the Moon (Máni, a god), and Earth (Jörð, a goddess), as well as units of time, such as day (Dagr, a god) and night (Nótt, a jötunn).

The afterlife is a complex matter in Norse mythology. The dead may go to the murky realm of Hel—a realm ruled over by a female being of the same name—may be ferried away by valkyries to Odin's martial hall Valhalla, or may be chosen by the goddess Freyja to dwell in her field Fólkvangr. The goddess Rán may claim those that die at sea, and the goddess Gefjon is said to be attended by virgins upon their death. Texts also make reference to reincarnation. Time itself is presented as something between cyclic and linear, and some scholars have argued that cyclic time was the original format for the mythology. Various forms of a cosmological creation story are provided in Icelandic sources, and references to a future destruction and rebirth of the world—Ragnarök—appear frequently in some texts.

Humanity

According to the Prose Edda and the Poetic Edda poem Völuspá, the first human couple consisted of Ask and Embla: driftwood found by a trio of gods and imbued with life in the form of three gifts. After the cataclysm of Ragnarök, this process is mirrored in the survival of two humans from a wood, Líf and Lífþrasir. From these two, humankind is foretold to repopulate the new and green earth.

Wednesday, July 30, 2025

History of randomness

From Wikipedia, the free encyclopedia
Ancient fresco of dice players in Pompei

In ancient history, the concepts of chance and randomness were intertwined with that of fate. Many ancient peoples threw dice to determine fate, and this later evolved into games of chance. At the same time, most ancient cultures used various methods of divination to attempt to circumvent randomness and fate. Beyond religion and games of chance, randomness has been attested for sortition since at least ancient Athenian democracy in the form of a kleroterion.

The formalization of odds and chance was perhaps first undertaken by the Chinese 3,000 years ago. The Greek philosophers discussed randomness at length, but only in non-quantitative forms. It was only in the sixteenth century that Italian mathematicians began to formalize the odds associated with various games of chance. The invention of modern calculus had a positive impact on the formal study of randomness. In the 19th century the concept of entropy was introduced in physics.

The early part of the twentieth century saw a rapid growth in the formal analysis of randomness, and mathematical foundations for probability were introduced, leading to its axiomatization in 1933. At the same time, the advent of quantum mechanics changed the scientific perspective on determinacy. In the mid to late 20th-century, ideas of algorithmic information theory introduced new dimensions to the field via the concept of algorithmic randomness.

Although randomness had often been viewed as an obstacle and a nuisance for many centuries, in the twentieth century computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms are able to outperform the best deterministic methods.

Antiquity to the Middle Ages

Depiction of Roman goddess Fortuna who determined fate, by Hans Beham, 1541

Pre-Christian people along the Mediterranean threw dice to determine fate, and this later evolved into games of chance. There is also evidence of games of chance played by ancient Egyptians, Hindus and Chinese, dating back to 2100 BC. The Chinese used dice before the Europeans, and have a long history of playing games of chance.

Over 3,000 years ago, the problems concerned with the tossing of several coins were considered in the I Ching, one of the oldest Chinese mathematical texts, that probably dates to 1150 BC. The two principal elements yin and yang were combined in the I Ching in various forms to produce Heads and Tails permutations of the type HH, TH, HT, etc. and the Chinese seem to have been aware of Pascal's triangle long before the Europeans formalized it in the 17th century. However, Western philosophy focused on the non-mathematical aspects of chance and randomness until the 16th century.
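The correspondence hinted at here is easy to check: the number of n-toss heads/tails sequences containing exactly k heads is the binomial coefficient C(n, k), i.e. the k-th entry of row n of Pascal's triangle. A minimal Python sketch, illustrative only and not drawn from the article:

from itertools import product

def pascal_row(n):
    # Row n of Pascal's triangle: the binomial coefficients C(n, 0) .. C(n, n).
    row = [1]
    for k in range(n):
        row.append(row[-1] * (n - k) // (k + 1))
    return row

def toss_counts(n):
    # Count all 2**n heads/tails sequences, grouped by the number of heads.
    counts = [0] * (n + 1)
    for seq in product("HT", repeat=n):
        counts[seq.count("H")] += 1
    return counts

for n in range(1, 6):
    assert toss_counts(n) == pascal_row(n)
    print(n, pascal_row(n))

For n = 2 this prints [1, 2, 1], matching the grouping of TT, {HT, TH} and HH mentioned above.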

The development of the concept of chance throughout history has been very gradual. Historians have wondered why progress in the field of randomness was so slow, given that humans have encountered chance since antiquity. Deborah J. Bennett suggests that ordinary people face an inherent difficulty in understanding randomness, although the concept is often taken as being obvious and self-evident. She cites studies by Kahneman and Tversky; these concluded that statistical principles are not learned from everyday experience because people do not attend to the detail necessary to gain such knowledge.

The Greek philosophers were the earliest Western thinkers to address chance and randomness. Around 400 BC, Democritus presented a view of the world as governed by the unambiguous laws of order and considered randomness as a subjective concept that only originated from the inability of humans to understand the nature of events. He used the example of two men who would send their servants to bring water at the same time to cause them to meet. The servants, unaware of the plan, would view the meeting as random.

Aristotle saw chance and necessity as opposite forces. He argued that nature had rich and constant patterns that could not be the result of chance alone, but that these patterns never displayed the machine-like uniformity of necessary determinism. He viewed randomness as a genuine and widespread part of the world, but as subordinate to necessity and order. Aristotle classified events into three types: certain events that happen necessarily; probable events that happen in most cases; and unknowable events that happen by pure chance. He considered the outcome of games of chance as unknowable.

Around 300 BC Epicurus proposed the concept that randomness exists by itself, independent of human knowledge. He believed that in the atomic world, atoms would swerve at random along their paths, bringing about randomness at higher levels.

Hotei, the deity of fortune observing a cock fight in a 16th-century Japanese print

For several centuries thereafter, the idea of chance continued to be intertwined with fate. Divination was practiced in many cultures, using diverse methods. The Chinese analyzed the cracks in turtle shells, while the Germans, who according to Tacitus had the highest regards for lots and omens, utilized strips of bark. In the Roman Empire, chance was personified by the Goddess Fortuna. The Romans would partake in games of chance to simulate what Fortuna would have decided. In 49 BC, Julius Caesar allegedly decided on his fateful decision to cross the Rubicon after throwing dice.

Aristotle's classification of events into the three classes: certain, probable and unknowable was adopted by Roman philosophers, but they had to reconcile it with deterministic Christian teachings in which even events unknowable to man were considered to be predetermined by God. About 960 Bishop Wibold of Cambrai correctly enumerated the 56 different outcomes (without permutations) of playing with three dice. No reference to playing cards has been found in Europe before 1350. The Church preached against card playing, and card games spread much more slowly than games based on dice. The Christian Church specifically forbade divination; and wherever Christianity went, divination lost most of its old-time power.
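Wibold's figure of 56 can be reproduced by counting the multisets of three faces drawn from 1–6, that is, C(6 + 3 − 1, 3) = 56. A minimal Python sketch, illustrative only and not drawn from the article:

from itertools import combinations_with_replacement

# All unordered outcomes of three six-sided dice (order ignored, repeats allowed).
outcomes = list(combinations_with_replacement(range(1, 7), 3))
print(len(outcomes))  # 56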

Over the centuries, many Christian scholars wrestled with the conflict between the belief in free will and its implied randomness, and the idea that God knows everything that happens. Saints Augustine and Aquinas tried to reach an accommodation between foreknowledge and free will, but Martin Luther argued against randomness and took the position that God's omniscience renders human actions unavoidable and determined. In the 13th century, Thomas Aquinas viewed randomness not as the result of a single cause, but of several causes coming together by chance. While he believed in the existence of randomness, he rejected it as an explanation of the end-directedness of nature, for he saw too many patterns in nature to have been obtained by chance.

The Greeks and Romans had not noticed the magnitudes of the relative frequencies of the games of chance. For centuries, chance was discussed in Europe with no mathematical foundation, and it was only in the 16th century that Italian mathematicians began to discuss the outcomes of games of chance as ratios. In his 1565 Liber de Ludo Aleae (a gambler's manual published after his death) Gerolamo Cardano wrote one of the first formal tracts to analyze the odds of winning at various games.

17th–19th centuries

Statue of Blaise Pascal, Louvre

Around 1620 Galileo wrote a paper called On a discovery concerning dice that used an early probabilistic model to address specific questions. In 1654, prompted by Chevalier de Méré's interest in gambling, Blaise Pascal corresponded with Pierre de Fermat, and much of the groundwork for probability theory was laid. Pascal's Wager was noted for its early use of the concept of infinity, and the first formal use of decision theory. The work of Pascal and Fermat influenced Leibniz's work on the infinitesimal calculus, which in turn provided further momentum for the formal analysis of probability and randomness.

The first known suggestion for viewing randomness in terms of complexity was made by Leibniz in an obscure 17th-century document discovered after his death. Leibniz asked how one could know whether a set of points on a piece of paper had been selected at random (e.g. by splattering ink) or not. Given that for any finite set of points there is always a mathematical equation that can describe them (e.g. by Lagrangian interpolation), the question focuses on the way the points are expressed mathematically. Leibniz viewed the points as random if the function describing them had to be extremely complex. Three centuries later, the same concept was formalized as algorithmic randomness by A. N. Kolmogorov and Gregory Chaitin, who characterized the randomness of a finite string in terms of the minimal length of a computer program needed to produce it.
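Kolmogorov–Chaitin complexity is uncomputable in general, but a lossless compressor gives a crude upper bound on it, which makes Leibniz's intuition easy to illustrate: a strongly patterned string compresses far more than a random-looking one. A minimal Python sketch, illustrative only and not drawn from the article:

import os
import zlib

patterned = b"0123456789" * 100   # 1,000 bytes generated by an obvious rule
random_like = os.urandom(1000)    # 1,000 bytes from the operating system's entropy source

for label, data in [("patterned", patterned), ("random-like", random_like)]:
    compressed = zlib.compress(data, level=9)
    print(label, len(data), "->", len(compressed), "bytes")

The patterned string typically shrinks to a few dozen bytes, while the random-like string stays close to its original length.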

The Doctrine of Chances, the first textbook on probability theory, was published in 1718, and the field continued to grow thereafter. The frequency theory approach to probability was first developed by Robert Ellis and John Venn late in the 19th century.

The Fortune Teller by Vouet, 1617

While the mathematical elite was making progress in understanding randomness from the 17th to the 19th century, the public at large continued to rely on practices such as fortune telling in the hope of taming chance. Fortunes were told in a multitude of ways both in the Orient (where fortune telling was later termed an addiction) and in Europe by gypsies and others. English practices such as the reading of eggs dropped in a glass were exported to Puritan communities in North America.

"I know of scarcely anything so apt to impress the imagination as the wonderful form of cosmic order expressed by the "Law of Frequency of Error." The law would have been personified by the Greeks and deified, if they had known of it. It reigns with serenity and in complete self-effacement amidst the wildest confusion. The huger the mob, and the greater the apparent anarchy, the more perfect is its sway. It is the supreme law of Unreason. Whenever a large sample of chaotic elements are taken in hand and marshalled in the order of their magnitude, an unsuspected and most beautiful form of regularity proves to have been latent all along. The tops of the marshalled row form a flowing curve of invariable proportions; and each element, as it is sorted into place, finds, as it were, a pre-ordained niche, accurately adapted to fit it."

Galton (1894)

The term entropy, which is now a key element in the study of randomness, was coined by Rudolf Clausius in 1865 as he studied heat engines in the context of the second law of thermodynamics. Clausius was the first to state "entropy always increases".

From the time of Newton until about 1890, it was generally believed that if one knows the initial state of a system with great accuracy, and if all the forces acting on the system can be formulated with equal accuracy, it would be possible, in principle, to make predictions of the state of the universe for an infinitely long time. The limits to such predictions in physical systems became clear as early as 1893 when Henri Poincaré showed that in the three-body problem in astronomy, small changes to the initial state could result in large changes in trajectories during the numerical integration of the equations.

During the 19th century, as probability theory was formalized and better understood, the attitude towards "randomness as nuisance" began to be questioned. Goethe wrote:

The tissue of the world is built from necessities and randomness; the intellect of men places itself between both and can control them; it considers the necessity and the reason of its existence; it knows how randomness can be managed, controlled, and used.

The words of Goethe proved prophetic, when in the 20th century randomized algorithms were discovered as powerful tools. By the end of the 19th century, Newton's model of a mechanical universe was fading away as the statistical view of the collision of molecules in gases was studied by Maxwell and Boltzmann. Boltzmann's equation S = k logₑ W (inscribed on his tombstone) first related entropy with logarithms.

20th century

Antony Gormley's Quantum Cloud sculpture in London was designed by a computer using a random walk algorithm.

During the 20th century, the five main interpretations of probability theory (classical, logical, frequency, propensity and subjective) became better understood, were discussed, compared and contrasted. A significant number of application areas were developed in this century, from finance to physics. In 1900 Louis Bachelier applied Brownian motion to evaluate stock options, effectively launching the fields of financial mathematics and stochastic processes.
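In the spirit of Bachelier's idea—though using the geometric Brownian motion convention that later became standard rather than his arithmetic one—a stock price can be simulated as a random walk and a European call option valued by Monte Carlo. A minimal Python sketch with purely illustrative parameters, not taken from the article:

import math
import random

S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.2, 1.0   # assumed spot, strike, rate, volatility, maturity
N = 100_000                                          # number of simulated price paths

random.seed(1)
total_payoff = 0.0
for _ in range(N):
    z = random.gauss(0.0, 1.0)
    ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    total_payoff += max(ST - K, 0.0)

price = math.exp(-r * T) * total_payoff / N
print(round(price, 2))   # converges on the corresponding Black–Scholes value as N grows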

Émile Borel was one of the first mathematicians to formally address randomness in 1909, and introduced normal numbers. In 1919 Richard von Mises gave the first definition of algorithmic randomness via the impossibility of a gambling system. He advanced the frequency theory of randomness in terms of what he called the collective, i.e. a random sequence. Von Mises regarded the randomness of a collective as an empirical law, established by experience. He related the "disorder" or randomness of a collective to the lack of success of attempted gambling systems. This approach led him to suggest a definition of randomness that was later refined and made mathematically rigorous by Alonzo Church by using computable functions in 1940. Von Mises likened the principle of the impossibility of a gambling system to the principle of the conservation of energy, a law that cannot be proven, but has held true in repeated experiments.

Von Mises never totally formalized his rules for sub-sequence selection, but in his 1940 paper "On the concept of random sequence", Alonzo Church suggested that the functions used for place settings in the formalism of von Mises be computable functions rather than arbitrary functions of the initial segments of the sequence, appealing to the Church–Turing thesis on effectiveness.

The advent of quantum mechanics in the early 20th century and the formulation of the Heisenberg uncertainty principle in 1927 saw the end of the Newtonian mindset among physicists regarding the determinacy of nature. In quantum mechanics, there is not even a way to consider all observable elements in a system as random variables at once, since many observables do not commute.

Café Central, one of the early meeting places of the Vienna circle

By the early 1940s, the frequency theory approach to probability was well accepted within the Vienna circle, but in the 1950s Karl Popper proposed the propensity theory. Given that the frequency approach cannot deal with "a single toss" of a coin, and can only address large ensembles or collectives, the single-case probabilities were treated as propensities or chances. The concept of propensity was also driven by the desire to handle single-case probability settings in quantum mechanics, e.g. the probability of decay of a specific atom at a specific moment. In more general terms, the frequency approach can not deal with the probability of the death of a specific person given that the death can not be repeated multiple times for that person. Karl Popper echoed the same sentiment as Aristotle in viewing randomness as subordinate to order when he wrote that "the concept of chance is not opposed to the concept of law" in nature, provided one considers the laws of chance.

Claude Shannon's development of information theory in 1948 gave rise to the entropy view of randomness. In this view, randomness is the opposite of determinism in a stochastic process. Hence if a stochastic system has entropy zero it has no randomness, and any increase in entropy increases randomness. Shannon's formulation reduces to Boltzmann's 19th-century formulation of entropy in the case where all probabilities are equal. Entropy is now widely used in diverse fields of science from thermodynamics to quantum chemistry.
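Concretely, Shannon's entropy of a discrete distribution is H = −Σ pᵢ log pᵢ, and when all W outcomes are equally likely (pᵢ = 1/W) this reduces to log W, Boltzmann's form up to the constant k. A minimal Python sketch, illustrative only and not drawn from the article:

import math

def shannon_entropy(probs):
    # Entropy in nats of a discrete probability distribution.
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 8
print(shannon_entropy([1.0 / W] * W), math.log(W))   # equal: the uniform case reduces to log W
print(shannon_entropy([0.7, 0.1, 0.1, 0.05, 0.05]))  # a skewed distribution has lower entropy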

Martingales for the study of chance and betting strategies were introduced by Paul Lévy in the 1930s and were formalized by Joseph L. Doob in the 1950s. The application of random walk hypothesis in financial theory was first proposed by Maurice Kendall in 1953. It was later promoted by Eugene Fama and Burton Malkiel.

Random strings were first studied in the 1960s by A. N. Kolmogorov (who had provided the first axiomatic definition of probability theory in 1933), Chaitin and Martin-Löf. The algorithmic randomness of a string was defined as the minimum size of a program (e.g. in bits) executed on a universal computer that yields the string. Chaitin's Omega number later related randomness and the halting probability for programs.

In 1964, Benoît Mandelbrot suggested that most statistical models approached only a first stage of dealing with indeterminism, and that they ignored many aspects of real-world turbulence. In his 1997 book he defined seven states of randomness ranging from "mild" to "wild", with traditional randomness being at the mild end of the scale.

Despite mathematical advances, reliance on other methods of dealing with chance, such as fortune telling and astrology continued in the 20th century. The government of Myanmar reportedly shaped 20th century economic policy based on fortune telling and planned the move of the capital of the country based on the advice of astrologers. White House Chief of Staff Donald Regan criticized the involvement of astrologer Joan Quigley in decisions made during Ronald Reagan's presidency in the 1980s. Quigley claims to have been the White House astrologer for seven years.

During the 20th century, limits in dealing with randomness were better understood. The best-known example of both theoretical and operational limits on predictability is weather forecasting, simply because models have been used in the field since the 1950s. Predictions of weather and climate are necessarily uncertain. Observations of weather and climate are uncertain and incomplete, and the models into which the data are fed are uncertain. In 1961, Edward Lorenz noticed that a very small change to the initial data submitted to a computer program for weather simulation could result in a completely different weather scenario. This later became known as the butterfly effect, often paraphrased as the question: "Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?". A key example of serious practical limits on predictability is in geology, where the ability to predict earthquakes either on an individual or on a statistical basis remains a remote prospect.
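Lorenz's observation is simple to reproduce with his 1963 convection equations: integrating the same system twice from initial states that differ by one part in a billion yields trajectories that soon bear no resemblance to each other. A minimal Python sketch using a crude Euler integrator and the classical parameters, illustrative only and not drawn from the article:

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz 1963 system.
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)   # perturbed by one part in a billion
for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        print(step, abs(a[0] - b[0]))   # the gap grows by orders of magnitude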

In the late 1970s and early 1980s, computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms outperform the best deterministic methods.
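A standard textbook illustration of this advantage (not drawn from this article) is Freivalds' algorithm: to check whether A·B = C, multiply by a random 0/1 vector and compare A(Br) with Cr, which costs O(n²) per trial instead of the O(n³) of recomputing the product, with an error probability of at most 2⁻ᵏ after k trials. A minimal Python sketch:

import random

def mat_vec(M, v):
    # Multiply a square matrix (list of rows) by a vector.
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def freivalds(A, B, C, trials=10):
    # Probabilistically verify that the matrix product of A and B equals C.
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        if mat_vec(A, mat_vec(B, r)) != mat_vec(C, r):
            return False             # a mismatch proves the product is not C
    return True                      # no mismatch found: equal with high probability

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(freivalds(A, B, [[19, 22], [43, 50]]), freivalds(A, B, [[19, 22], [43, 51]]))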

Consilience

From Wikipedia, the free encyclopedia

In science and history, consilience (also convergence of evidence or concordance of evidence) is the principle that evidence from independent, unrelated sources can "converge" on strong conclusions. That is, when multiple sources of evidence are in agreement, the conclusion can be very strong even when none of the individual sources of evidence is significantly so on its own. Most established scientific knowledge is supported by a convergence of evidence: if not, the evidence is comparatively weak, and there will probably not be a strong scientific consensus.

The principle is based on unity of knowledge; measuring the same result by several different methods should lead to the same answer. For example, it should not matter whether one measures distances within the Giza pyramid complex by laser rangefinding, by satellite imaging, or with a metre-stick – in all three cases, the answer should be approximately the same. For the same reason, different dating methods in geochronology should concur, a result in chemistry should not contradict a result in geology, etc.

The word consilience was originally coined as the phrase "consilience of inductions" by William Whewell (consilience refers to a "jumping together" of knowledge). The word comes from Latin com- "together" and -siliens "jumping" (as in resilience).

Description

Consilience requires the use of independent methods of measurement, meaning that the methods have few shared characteristics. That is, the mechanism by which the measurement is made is different; each method is dependent on an unrelated natural phenomenon. For example, the accuracy of laser range-finding measurements is based on the scientific understanding of lasers, while satellite pictures and metre-sticks (or yardsticks) rely on different phenomena. Because the methods are independent, when one of several methods is in error, it is very unlikely to be in error in the same way as any of the other methods, and a difference between the measurements will be observed. If the scientific understanding of the properties of lasers was inaccurate, then the laser measurement would be inaccurate but the others would not.

As a result, when several different methods agree, this is strong evidence that none of the methods are in error and the conclusion is correct. This is because of a greatly reduced likelihood of errors: for a consensus estimate from multiple measurements to be wrong, the errors would have to be similar for all samples and all methods of measurement, which is extremely unlikely. Random errors will tend to cancel out as more measurements are made, due to regression to the mean; systematic errors will be detected by differences between the measurements and will also tend to cancel out since the direction of the error will still be random. This is how scientific theories reach high confidence—over time, they build up a large degree of evidence which converges on the same conclusion.
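The error-cancellation argument can be illustrated with a toy simulation: three hypothetical, independent "methods" measure the same true quantity with different random errors, and the pooled estimate lands much closer to the truth than typical individual readings. A minimal Python sketch in which all names and noise levels are illustrative assumptions, not from the article:

import random

TRUE_VALUE = 100.0                                             # the quantity being measured
NOISE = {"laser": 0.05, "satellite": 0.5, "metre_stick": 0.2}  # assumed standard deviations

random.seed(0)
samples = [random.gauss(TRUE_VALUE, sd) for sd in NOISE.values() for _ in range(30)]
pooled = sum(samples) / len(samples)
print(round(pooled, 3))   # close to 100.0: independent random errors largely cancel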

When results from different strong methods do appear to conflict, this is treated as a serious problem to be reconciled. For example, in the 19th century, the Sun appeared to be no more than 20 million years old, but the Earth appeared to be no less than 300 million years (resolved by the discovery of nuclear fusion and radioactivity, and the theory of quantum mechanics); or current attempts to resolve theoretical differences between quantum mechanics and general relativity.

Significance

Because of consilience, the strength of evidence for any particular conclusion is related to how many independent methods are supporting the conclusion, as well as how different these methods are. Those techniques with the fewest (or no) shared characteristics provide the strongest consilience and result in the strongest conclusions. This also means that confidence is usually strongest when considering evidence from different fields because the techniques are usually very different.

For example, the theory of evolution is supported by a convergence of evidence from genetics, molecular biology, paleontology, geology, biogeography, comparative anatomy, comparative physiology, and many other fields. In fact, the evidence within each of these fields is itself a convergence providing evidence for the theory. As a result, to disprove evolution, most or all of these independent lines of evidence would have to be found to be in error. The strength of the evidence, considered together as a whole, results in the strong scientific consensus that the theory is correct. In a similar way, evidence about the history of the universe is drawn from astronomy, astrophysics, planetary geology, and physics.

Finding similar conclusions from multiple independent methods is also evidence for the reliability of the methods themselves, because consilience eliminates the possibility of all potential errors that do not affect all the methods equally. This is also used for the validation of new techniques through comparison with the consilient ones. If only partial consilience is observed, this allows for the detection of errors in methodology; any weaknesses in one technique can be compensated for by the strengths of the others. Alternatively, if using more than one or two techniques for every experiment is infeasible, some of the benefits of consilience may still be obtained if it is well-established that these techniques usually give the same result.

Consilience is important across all of science, including the social sciences, and is often used as an argument for scientific realism by philosophers of science. Each branch of science studies a subset of reality that depends on factors studied in other branches. Atomic physics underlies the workings of chemistry, which studies emergent properties that in turn are the basis of biology. Psychology is not separate from the study of properties emergent from the interaction of neurons and synapses. Sociology, economics, and anthropology are each, in turn, studies of properties emergent from the interaction of countless individual humans. The concept that all the different areas of research are studying one real, existing universe is an apparent explanation of why scientific knowledge determined in one field of inquiry has often helped in understanding other fields.

Deviations

Consilience does not forbid deviations: in fact, since not all experiments are perfect, some deviations from established knowledge are expected. However, when the convergence is strong enough, then new evidence inconsistent with the previous conclusion is not usually enough to outweigh that convergence. Without an equally strong convergence on the new result, the weight of evidence will still favor the established result. This means that the new evidence is most likely to be wrong.

Science denialism (for example, AIDS denialism) is often based on a misunderstanding of this property of consilience. A denier may promote small gaps not yet accounted for by the consilient evidence, or small amounts of evidence contradicting a conclusion without accounting for the pre-existing strength resulting from consilience. More generally, to insist that all evidence converge precisely with no deviations would be naïve falsificationism, equivalent to considering a single contrary result to falsify a theory when another explanation, such as equipment malfunction or misinterpretation of results, is much more likely.

In history

Historical evidence also converges in an analogous way. For example: if five ancient historians, none of whom knew each other, all claim that Julius Caesar seized power in Rome in 49 BCE, this is strong evidence in favor of that event occurring even if each individual historian is only partially reliable. By contrast, if the same historian had made the same claim five times in five different places (and no other types of evidence were available), the claim is much weaker because it originates from a single source. The evidence from the ancient historians could also converge with evidence from other fields, such as archaeology: for example, evidence that many senators fled Rome at the time, that the battles of Caesar's civil war occurred, and so forth.

Consilience has also been discussed in reference to Holocaust denial.

"We [have now discussed] eighteen proofs all converging on one conclusion...the deniers shift the burden of proof to historians by demanding that each piece of evidence, independently and without corroboration between them, prove the Holocaust. Yet no historian has ever claimed that one piece of evidence proves the Holocaust. We must examine the collective whole."

That is, individually the evidence may underdetermine the conclusion, but together they overdetermine it. A similar way to state this is that to ask for one particular piece of evidence in favor of a conclusion is a flawed question.

Outside the sciences

In addition to the sciences, consilience can be important to the arts, ethics and religion. Both artists and scientists have identified the importance of biology in the process of artistic innovation.

History of the concept

Consilience has its roots in the ancient Greek concept of an intrinsic orderliness that governs our cosmos, inherently comprehensible by logical process, a vision at odds with mystical views in many cultures that surrounded the Hellenes. The rational view was recovered during the high Middle Ages, separated from theology during the Renaissance and found its apogee in the Age of Enlightenment.

Whewell's definition was that:

The Consilience of Inductions takes place when an Induction, obtained from one class of facts, coincides with an Induction obtained from another different class. Thus Consilience is a test of the truth of the Theory in which it occurs.

More recent descriptions include:

"Where there is a convergence of evidence, where the same explanation is implied, there is increased confidence in the explanation. Where there is divergence, then either the explanation is at fault or one or more of the sources of information is in error or requires reinterpretation."

"Proof is derived through a convergence of evidence from numerous lines of inquiry—multiple, independent inductions, all of which point to an unmistakable conclusion."

Edward O. Wilson

Although the concept of consilience in Whewell's sense was widely discussed by philosophers of science, the term was unfamiliar to the broader public until the end of the 20th century, when it was revived in Consilience: The Unity of Knowledge, a 1998 book by the author and biologist E. O. Wilson, as an attempt to bridge the cultural gap between the sciences and the humanities that was the subject of C. P. Snow's The Two Cultures and the Scientific Revolution (1959). Wilson believed that "the humanities, ranging from philosophy and history to moral reasoning, comparative religion, and interpretation of the arts, will draw closer to the sciences and partly fuse with them" with the result that science and the scientific method, from within this fusion, would not only explain the physical phenomenon but also provide moral guidance and be the ultimate source of all truths.

Wilson held that with the rise of the modern sciences, the sense of unity gradually was lost in the increasing fragmentation and specialization of knowledge in the last two centuries. He asserted that the sciences, humanities, and arts have a common goal: to give a purpose to understand the details, to lend to all inquirers "a conviction, far deeper than a mere working proposition, that the world is orderly and can be explained by a small number of natural laws." An important point made by Wilson is that hereditary human nature and evolution itself profoundly affect the evolution of culture, in essence, a sociobiological concept. Wilson's concept is a much broader notion of consilience than that of Whewell, who was merely pointing out that generalizations invented to account for one set of phenomena often account for others as well.

A parallel view lies in the term universology, which literally means "the science of the universe." Universology was first promoted for the study of the interconnecting principles and truths of all domains of knowledge by Stephen Pearl Andrews, a 19th-century utopian futurist and anarchist.

Epithelium

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Epitheliu...