Thursday, September 30, 2021

Interpretations of quantum mechanics

An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum mechanics "corresponds" to reality. Although quantum mechanics has held up to rigorous and extremely precise tests in an extraordinarily broad range of experiments (not one prediction from quantum mechanics has been found to be contradicted by experiments), there exist a number of contending schools of thought over its interpretation. These views on interpretation differ on such fundamental questions as whether quantum mechanics is deterministic or stochastic, which elements of quantum mechanics can be considered real, and what the nature of measurement is, among other matters.

Despite nearly a century of debate and experiment, no consensus has been reached among physicists and philosophers of physics concerning which interpretation best "represents" reality.

History

The definition of quantum theorists' terms, such as wave function and matrix mechanics, progressed through many stages. For instance, Erwin Schrödinger originally viewed the electron's wave function as its charge density smeared across space, but Max Born reinterpreted the absolute square of the wave function as the electron's probability density distributed across space.
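Born's reinterpretation can be illustrated numerically. The following is a minimal sketch (standard-library Python only; the Gaussian wave packet is an illustrative choice, not anything specific to the historical debate) that discretizes a wavefunction and treats its squared modulus as a probability density:

```python
import math

# Born's rule: |psi(x)|^2 is a probability density, not a charge density.
# Discretize an illustrative Gaussian wave packet on a grid.
dx = 0.01
xs = [i * dx for i in range(-500, 501)]
psi = [math.exp(-x * x / 4.0) for x in xs]  # unnormalized amplitude

# Normalize so that sum |psi|^2 dx = 1.
norm = math.sqrt(sum(p * p for p in psi) * dx)
psi = [p / norm for p in psi]

# |psi|^2 is then a standard normal density; probability of finding
# the particle in [-1, 1] is about one standard deviation's worth:
prob = sum(p * p for p, x in zip(psi, xs) if abs(x) <= 1.0 + 1e-9) * dx
print(round(prob, 2))  # ≈ 0.68
```

The point of the sketch is only that, once normalized, the squared amplitudes behave as a genuine probability distribution: they are non-negative and sum to one.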

The views of several early pioneers of quantum mechanics, such as Niels Bohr and Werner Heisenberg, are often grouped together as the "Copenhagen interpretation", though physicists and historians of physics have argued that this terminology obscures differences between the views so designated. Copenhagen-type ideas were never universally embraced, and challenges to a perceived Copenhagen orthodoxy gained increasing attention in the 1950s with the pilot-wave interpretation of David Bohm and the many-worlds interpretation of Hugh Everett III.

Moreover, the strictly formalist position, shunning interpretation, has been challenged by proposals for experiments that might one day distinguish among interpretations, such as by measuring an AI consciousness or via quantum computing.

The physicist N. David Mermin once quipped, "New interpretations appear every year. None ever disappear." As a rough guide to the development of the mainstream view during the 1990s and 2000s, a "snapshot" of opinions was collected in a poll by Schlosshauer et al. at the "Quantum Physics and the Nature of Reality" conference of July 2011. The authors reference a similarly informal poll carried out by Max Tegmark at the "Fundamental Problems in Quantum Theory" conference in August 1997. The main conclusion of the authors is that "the Copenhagen interpretation still reigns supreme", receiving the most votes in their poll (42%), alongside the rise to mainstream notability of the many-worlds interpretation: "The Copenhagen interpretation still reigns supreme here, especially if we lump it together with intellectual offsprings such as information-based interpretations and the Quantum Bayesian interpretation. In Tegmark's poll, the Everett interpretation received 17% of the vote, which is similar to the number of votes (18%) in our poll."

Nature

More or less, all interpretations of quantum mechanics share two qualities:

  1. They interpret a formalism—a set of equations and principles to generate predictions via input of initial conditions
  2. They interpret a phenomenology—a set of observations, including those obtained by empirical research and those obtained informally, such as humans' experience of an unequivocal world

Two qualities vary among interpretations:

  1. Ontology—claims about what things, such as categories and entities, exist in the world
  2. Epistemology—claims about the possibility, scope, and means toward relevant knowledge of the world

In philosophy of science, the distinction of knowledge versus reality is termed epistemic versus ontic. A general law is a regularity of outcomes (epistemic), whereas a causal mechanism may regulate the outcomes (ontic). A phenomenon can receive an interpretation that is either ontic or epistemic. For instance, indeterminism may be attributed to limitations of human observation and perception (epistemic), or may be explained as real indeterminism intrinsic to the universe (ontic). Confusing the epistemic with the ontic, for example by presuming that a general law actually "governs" outcomes and that a statement of regularity plays the role of a causal mechanism, is a category mistake.

In a broad sense, a scientific theory can be viewed as offering scientific realism (an approximately true description or explanation of the natural world) or can be understood in antirealist terms. A realist stance seeks both the epistemic and the ontic, whereas an antirealist stance seeks the epistemic but not the ontic. In the first half of the 20th century, the main form of antirealism was logical positivism, which sought to exclude unobservable aspects of reality from scientific theory.

Since the 1950s, antirealism has been more modest, usually taking the form of instrumentalism, which permits talk of unobservable aspects but ultimately sets aside the very question of realism, posing scientific theory as a tool to help humans make predictions rather than to attain metaphysical understanding of the world. The instrumentalist view is captured by the famous quip of N. David Mermin, "Shut up and calculate", often misattributed to Richard Feynman.

Other approaches to resolve conceptual problems introduce new mathematical formalism, and so propose alternative theories with their interpretations. An example is Bohmian mechanics, whose empirical equivalence with the three standard formalisms—Schrödinger's wave mechanics, Heisenberg's matrix mechanics, and Feynman's path integral formalism—has been demonstrated.

Interpretive challenges

  1. Abstract, mathematical nature of quantum field theories: the mathematical structure of quantum mechanics is abstract, without a clear interpretation of its quantities.
  2. Existence of apparently indeterministic and irreversible processes: in classical field theory, a physical property at a given location in the field is readily derived. In most mathematical formulations of quantum mechanics, measurement is given a special role in the theory, as it is the sole process that can cause a nonunitary, irreversible evolution of the state.
  3. Role of the observer in determining outcomes: the Copenhagen interpretation implies that the wavefunction is a calculational tool and represents reality only immediately after a measurement, perhaps performed by an observer; Everettian interpretations grant that all the possibilities can be real and that measurement-type interactions cause an effective branching process.
  4. Classically unexpected correlations between remote objects: entangled quantum systems, as illustrated in the EPR paradox, obey statistics that seem to violate principles of local causality.
  5. Complementarity of proffered descriptions: complementarity holds that no set of classical physical concepts can simultaneously refer to all properties of a quantum system. For instance, wave description A and particulate description B can each describe quantum system S, but not simultaneously. This implies the composition of physical properties of S does not obey the rules of classical propositional logic when using propositional connectives (see "Quantum logic"). Like contextuality, the "origin of complementarity lies in the non-commutativity of operators" that describe quantum objects (Omnès 1999).
  6. Rapidly rising intricacy, far exceeding humans' present calculational capacity, as a system's size increases: since the state space of a quantum system is exponential in the number of subsystems, it is difficult to derive classical approximations.
  7. Contextual behaviour of systems locally: Quantum contextuality demonstrates that classical intuitions, in which properties of a system hold definite values independent of the manner of their measurement, fail even for local systems. Also, physical principles such as Leibniz's Principle of the identity of indiscernibles no longer apply in the quantum domain, signalling that most classical intuitions may be incorrect about the quantum world.
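Two of the challenges above can be made concrete with a small sketch (standard-library Python; the matrix helper is ours, purely for illustration). Item 5's complementarity is traced to non-commuting operators, which the Pauli matrices exhibit directly; item 6's intricacy reflects the exponential growth of the state space with the number of subsystems:

```python
# Item 5: complementarity traces to non-commuting operators.
# Pauli matrices as 2x2 nested lists.
X = [[0, 1], [1, 0]]   # observable for spin along x
Z = [[1, 0], [0, -1]]  # observable for spin along z

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

XZ, ZX = matmul(X, Z), matmul(Z, X)
print(XZ != ZX)  # True: XZ = -ZX, so the commutator [X, Z] is nonzero

# Item 6: n qubits require 2**n complex amplitudes, so classical
# simulation cost grows exponentially with system size.
for n in (10, 20, 30):
    print(n, 2 ** n)
```

Because XZ and ZX differ, no joint assignment of definite x-spin and z-spin values is available, which is the operator-level root of complementarity noted by Omnès.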

Influential interpretations

Copenhagen interpretation

The Copenhagen interpretation is a collection of views about the meaning of quantum mechanics principally attributed to Niels Bohr and Werner Heisenberg. It is one of the oldest of numerous proposed interpretations of quantum mechanics, as features of it date to the development of quantum mechanics during 1925–1927, and it remains one of the most commonly taught. There is no definitive historical statement of what the Copenhagen interpretation is. There are some fundamental agreements and disagreements between the views of Bohr and Heisenberg.

Hans Primas describes nine theses of the Copenhagen interpretation:

  • quantum physics applies to individual objects, not only to ensembles of objects;
  • their description is probabilistic;
  • their description is the result of experiments described in terms of classical (non-quantum) physics;
  • the "frontier" that separates the classical from the quantum can be chosen arbitrarily;
  • the act of "observation" or "measurement" is irreversible;
  • the act of "observation" or "measurement" involves an action upon the object measured and reduces the wave packet;
  • complementary properties cannot be observed simultaneously;
  • no truth can be attributed to an object except according to the results of its measurement;
  • quantum descriptions are objective, in that they are independent of physicists' mental arbitrariness.

Heisenberg emphasized a sharp "cut" between the observer (or the instrument) and the system being observed, while Bohr offered an interpretation that is independent of a subjective observer, or measurement, or collapse: there is an "irreversible" or effectively irreversible process causing the decay of quantum coherence or the wave packet which imparts the classical behavior of "observation" or "measurement".

Many worlds

The many-worlds interpretation is an interpretation of quantum mechanics in which a universal wavefunction obeys the same deterministic, reversible laws at all times; in particular there is no (indeterministic and irreversible) wavefunction collapse associated with measurement. The phenomena associated with measurement are claimed to be explained by decoherence, which occurs when states interact with the environment. More precisely, the parts of the wavefunction describing observers become increasingly entangled with the parts of the wavefunction describing their experiments. Although all possible outcomes of experiments continue to lie in the wavefunction's support, the times at which they become correlated with observers effectively "split" the universe into mutually unobservable alternate histories.
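The entanglement-driven "splitting" can be caricatured in a few lines of standard-library Python. The CNOT-style interaction below is an illustrative assumption (a toy stand-in for a real measurement Hamiltonian), and the basis labels are ours:

```python
import math

# Toy Everettian branching: amplitudes over two-qubit basis states
# |system, observer>, stored in a dict keyed by label pairs.
s = 1 / math.sqrt(2)
state = {("up", "ready"): s, ("down", "ready"): s}  # system superposed

def measurement_interaction(state):
    """A CNOT-like, unitary 'measurement-type' interaction (illustrative):
    the observer's record becomes perfectly correlated with the system,
    with no collapse anywhere."""
    return {(sys, "saw_" + sys): amp for (sys, _obs), amp in state.items()}

branched = measurement_interaction(state)
for basis, amp in sorted(branched.items()):
    print(basis, round(amp ** 2, 2))  # each branch carries Born weight 0.5
```

Both outcomes remain in the wavefunction's support, but each observer record is now tied to one system state, which is the "effective branching" the text describes.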

Quantum information theories

Quantum informational approaches have attracted growing support. They subdivide into two kinds.

  • Information ontologies, such as J. A. Wheeler's "it from bit". These approaches have been described as a revival of immaterialism.
  • Interpretations where quantum mechanics is said to describe an observer's knowledge of the world, rather than the world itself. This approach has some similarity with Bohr's thinking. Collapse (also known as reduction) is often interpreted as an observer acquiring information from a measurement, rather than as an objective event. These approaches have been appraised as similar to instrumentalism. James Hartle writes,

The state is not an objective property of an individual system but is that information, obtained from a knowledge of how a system was prepared, which can be used for making predictions about future measurements. ...A quantum mechanical state being a summary of the observer's information about an individual physical system changes both by dynamical laws, and whenever the observer acquires new information about the system through the process of measurement. The existence of two laws for the evolution of the state vector...becomes problematical only if it is believed that the state vector is an objective property of the system...The "reduction of the wavepacket" does take place in the consciousness of the observer, not because of any unique physical process which takes place there, but only because the state is a construct of the observer and not an objective property of the physical system.

Relational quantum mechanics

The essential idea behind relational quantum mechanics, following the precedent of special relativity, is that different observers may give different accounts of the same series of events: for example, to one observer at a given point in time, a system may be in a single, "collapsed" eigenstate, while to another observer at the same time, it may be in a superposition of two or more states. Consequently, if quantum mechanics is to be a complete theory, relational quantum mechanics argues that the notion of "state" describes not the observed system itself, but the relationship, or correlation, between the system and its observer(s). The state vector of conventional quantum mechanics becomes a description of the correlation of some degrees of freedom in the observer, with respect to the observed system. However, it is held by relational quantum mechanics that this applies to all physical objects, whether or not they are conscious or macroscopic. Any "measurement event" is seen simply as an ordinary physical interaction, an establishment of the sort of correlation discussed above. Thus the physical content of the theory has to do not with objects themselves, but the relations between them.

QBism

QBism, which originally stood for "quantum Bayesianism", is an interpretation of quantum mechanics that takes an agent's actions and experiences as the central concerns of the theory. This interpretation is distinguished by its use of a subjective Bayesian account of probabilities to understand the quantum mechanical Born rule as a normative addition to good decision-making. QBism draws from the fields of quantum information and Bayesian probability and aims to eliminate the interpretational conundrums that have beset quantum theory.

QBism deals with common questions in the interpretation of quantum theory about the nature of wavefunction superposition, quantum measurement, and entanglement. According to QBism, many, but not all, aspects of the quantum formalism are subjective in nature. For example, in this interpretation, a quantum state is not an element of reality—instead it represents the degrees of belief an agent has about the possible outcomes of measurements. For this reason, some philosophers of science have deemed QBism a form of anti-realism. The originators of the interpretation disagree with this characterization, proposing instead that the theory more properly aligns with a kind of realism they call "participatory realism", wherein reality consists of more than can be captured by any putative third-person account of it.

Consistent histories

The consistent histories interpretation generalizes the conventional Copenhagen interpretation and attempts to provide a natural interpretation of quantum cosmology. The theory is based on a consistency criterion that allows the history of a system to be described so that the probabilities for each history obey the additive rules of classical probability. It is claimed to be consistent with the Schrödinger equation.

According to this interpretation, the purpose of a quantum-mechanical theory is to predict the relative probabilities of various alternative histories (for example, of a particle).

Ensemble interpretation

The ensemble interpretation, also called the statistical interpretation, can be viewed as a minimalist interpretation. That is, it claims to make the fewest assumptions associated with the standard mathematics. It takes the statistical interpretation of Born to the fullest extent. The interpretation states that the wave function does not apply to an individual system – for example, a single particle – but is an abstract statistical quantity that only applies to an ensemble (a vast multitude) of similarly prepared systems or particles. In the words of Einstein:

The attempt to conceive the quantum-theoretical description as the complete description of the individual systems leads to unnatural theoretical interpretations, which become immediately unnecessary if one accepts the interpretation that the description refers to ensembles of systems and not to individual systems.

— Einstein in Albert Einstein: Philosopher-Scientist, ed. P.A. Schilpp (Harper & Row, New York)

The most prominent current advocate of the ensemble interpretation is Leslie E. Ballentine, professor at Simon Fraser University, author of the textbook Quantum Mechanics, A Modern Development.

De Broglie–Bohm theory

The de Broglie–Bohm theory of quantum mechanics (also known as the pilot wave theory) is a theory proposed by Louis de Broglie and later extended by David Bohm to include measurements. Particles, which always have positions, are guided by the wavefunction. The wavefunction evolves according to the Schrödinger wave equation, and the wavefunction never collapses. The theory takes place in a single spacetime, is non-local, and is deterministic. The simultaneous determination of a particle's position and velocity is subject to the usual uncertainty principle constraint. The theory is considered a hidden-variable theory, and by embracing non-locality it evades the constraint of Bell's theorem, which rules out only local hidden-variable theories. The measurement problem is resolved, since the particles have definite positions at all times. Collapse is explained as phenomenological.
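The "guided by the wavefunction" claim has a standard mathematical form, the guidance equation; for a single spinless particle of mass m it is usually written (a standard textbook statement, not specific to this article) as:

```latex
% Guidance equation for a single spinless particle of mass m,
% with the polar decomposition \psi = R\,e^{iS/\hbar}:
\frac{d\mathbf{Q}}{dt}
  \;=\; \left.\frac{\nabla S}{m}\right|_{\mathbf{x}=\mathbf{Q}(t)}
  \;=\; \frac{\hbar}{m}\,
        \operatorname{Im}\!\left.\left(\frac{\nabla\psi}{\psi}\right)
        \right|_{\mathbf{x}=\mathbf{Q}(t)}
```

The particle's velocity is thus fixed by the phase of the wavefunction evaluated at the particle's actual position, which is what makes the dynamics deterministic.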

Quantum Darwinism

Quantum Darwinism is a theory meant to explain the emergence of the classical world from the quantum world as due to a process of Darwinian natural selection induced by the environment interacting with the quantum system, whereby the many possible quantum states are selected against in favor of a stable pointer state. It was proposed in 2003 by Wojciech Zurek and a group of collaborators including Ollivier, Poulin, Paz, and Blume-Kohout. The development of the theory is due to the integration of a number of Zurek's research topics pursued over the course of twenty-five years, including pointer states, einselection, and decoherence.

Transactional interpretation

The transactional interpretation of quantum mechanics (TIQM) by John G. Cramer is an interpretation of quantum mechanics inspired by the Wheeler–Feynman absorber theory. It describes the collapse of the wave function as resulting from a time-symmetric transaction between a possibility wave from the source to the receiver (the wave function) and a possibility wave from the receiver to the source (the complex conjugate of the wave function). This interpretation of quantum mechanics is unique in that it views not only the wave function as a real entity but also the complex conjugate of the wave function, which appears in the Born rule for calculating the expected value of an observable, as real.

Objective collapse theories

Objective collapse theories differ from the Copenhagen interpretation by regarding both the wave function and the process of collapse as ontologically objective (meaning these exist and occur independent of the observer). In objective theories, collapse occurs either randomly ("spontaneous localization") or when some physical threshold is reached, with observers having no special role. Thus, objective-collapse theories are realistic, indeterministic, no-hidden-variables theories. Standard quantum mechanics does not specify any mechanism of collapse; QM would need to be extended if objective collapse is correct. The requirement for an extension means that objective collapse is more of a theory than an interpretation. Examples include the Ghirardi–Rimini–Weber (GRW) theory and the Penrose interpretation.

Consciousness causes collapse (von Neumann–Wigner interpretation)

In his treatise The Mathematical Foundations of Quantum Mechanics, John von Neumann deeply analyzed the so-called measurement problem. He concluded that the entire physical universe could be made subject to the Schrödinger equation (the universal wave function). He also described how measurement could cause a collapse of the wave function. This point of view was prominently expanded on by Eugene Wigner, who argued that human experimenter consciousness was critical for the collapse, but he later abandoned this interpretation.

Quantum logic

Quantum logic can be regarded as a kind of propositional logic suitable for understanding the apparent anomalies regarding quantum measurement, most notably those concerning composition of measurement operations of complementary variables. This research area and its name originated in the 1936 paper by Garrett Birkhoff and John von Neumann, who attempted to reconcile some of the apparent inconsistencies of classical Boolean logic with the facts related to measurement and observation in quantum mechanics.
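A worked spin-1/2 example shows the kind of anomaly Birkhoff and von Neumann targeted: the distributive law of classical propositional logic fails for quantum propositions (here meet is intersection of subspaces and join is their span):

```latex
% Let p = ``S_z = +\hbar/2'', q = ``S_x = +\hbar/2'', r = ``S_x = -\hbar/2''.
% Since q \vee r spans the whole state space, it is the trivially true
% proposition \top, while p shares no eigenstate with q or with r:
p \wedge (q \vee r) = p \wedge \top = p,
\qquad
(p \wedge q) \vee (p \wedge r) = 0 \vee 0 = 0 .
```

The two sides disagree whenever p is not the zero proposition, so the lattice of quantum propositions is non-distributive, unlike a Boolean algebra.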

Modal interpretations of quantum theory

Modal interpretations of quantum mechanics were first conceived of in 1972 by Bas van Fraassen, in his paper "A formal approach to the philosophy of science." However, this term now is used to describe a larger set of models that grew out of this approach. The Stanford Encyclopedia of Philosophy describes several versions:

  • The Copenhagen variant
  • Kochen–Dieks–Healey interpretations
  • Motivating early modal interpretations, based on the work of R. Clifton, M. Dickson and J. Bub.

Time-symmetric theories

Time-symmetric interpretations of quantum mechanics were first suggested by Walter Schottky in 1921. Several theories have been proposed which modify the equations of quantum mechanics to be symmetric with respect to time reversal. This creates retrocausality: events in the future can affect ones in the past, exactly as events in the past can affect ones in the future. In these theories, a single measurement cannot fully determine the state of a system (making them a type of hidden-variables theory), but given two measurements performed at different times, it is possible to calculate the exact state of the system at all intermediate times. The collapse of the wavefunction is therefore not a physical change to the system, just a change in our knowledge of it due to the second measurement. Similarly, they explain entanglement as not being a true physical state but just an illusion created by ignoring retrocausality. The point where two particles appear to "become entangled" is simply a point where each particle is being influenced by events that occur to the other particle in the future.

Not all advocates of time-symmetric causality favour modifying the unitary dynamics of standard quantum mechanics. Thus a leading exponent of the two-state vector formalism, Lev Vaidman, states that the two-state vector formalism dovetails well with Hugh Everett's many-worlds interpretation.

Other interpretations

As well as the mainstream interpretations discussed above, a number of other interpretations have been proposed which have not made a significant scientific impact for whatever reason. These range from proposals by mainstream physicists to the more occult ideas of quantum mysticism.

Comparisons

The most common interpretations are summarized in the table below. The values shown in the cells of the table are not without controversy, for the precise meanings of some of the concepts involved are unclear and, in fact, are themselves at the center of the controversy surrounding the given interpretation. For another table comparing interpretations of quantum theory, see reference.

No experimental evidence exists that distinguishes among these interpretations. To that extent, the physical theory stands, and is consistent with itself and with reality; difficulties arise only when one attempts to "interpret" the theory. Nevertheless, designing experiments which would test the various interpretations is the subject of active research.

Most of these interpretations have variants. For example, it is difficult to get a precise definition of the Copenhagen interpretation as it was developed and argued about by many people.

Determinism

From Wikipedia, the free encyclopedia

Determinism is the philosophical view that all events are determined completely by previously existing causes. Deterministic theories throughout the history of philosophy have sprung from diverse and sometimes overlapping motives and considerations. The opposite of determinism is some kind of indeterminism (otherwise called nondeterminism) or randomness. Determinism is often contrasted with free will, although some philosophers claim that the two are compatible.

Determinism often is taken to mean causal determinism, which in physics is known as cause-and-effect. It is the concept that events within a given paradigm are bound by causality in such a way that any state (of an object or event) is completely determined by prior states. This meaning can be distinguished from other varieties of determinism mentioned below.

Other debates often concern the scope of determined systems, with some maintaining that the entire universe is a single determinate system and others identifying other more limited determinate systems (or multiverse). Numerous historical debates involve many philosophical positions and varieties of determinism. They include debates concerning determinism and free will, technically denoted as compatibilistic (allowing the two to coexist) and incompatibilistic (denying their coexistence is a possibility).

Determinism should not be confused with the self-determination of human actions by reasons, motives, and desires. Determinism is about the interactions that affect our cognitive processes in our lives, and about the causes and results of what we have done; cause and result are always bound together in our cognitive processes. It assumes that if an observer had sufficient information about an object or human being, such an observer might be able to predict every consequent move of that object or human being. Determinism rarely requires that perfect prediction be practically possible.

Varieties

"Determinism" may commonly refer to any of the following viewpoints.

Causal

Causal determinism, sometimes synonymous with historical determinism (a sort of path dependence), is "the idea that every event is necessitated by antecedent events and conditions together with the laws of nature." However, it is a broad enough term to consider that:

...one's deliberations, choices, and actions will often be necessary links in the causal chain that brings something about. In other words, even though our deliberations, choices, and actions are themselves determined like everything else, it is still the case, according to causal determinism, that the occurrence or existence of yet other things depends upon our deliberating, choosing and acting in a certain way.

Causal determinism proposes that there is an unbroken chain of prior occurrences stretching back to the origin of the universe. The relation between events may not be specified, nor the origin of that universe. Causal determinists believe that there is nothing in the universe that is uncaused or self-caused. Causal determinism has also been considered more generally as the idea that everything that happens or exists is caused by antecedent conditions. In the case of nomological determinism, these conditions are considered events also, implying that the future is determined completely by preceding events—a combination of prior states of the universe and the laws of nature. Yet they can also be considered metaphysical in origin (such as in the case of theological determinism).

Many philosophical theories of determinism frame themselves with the idea that reality follows a sort of predetermined path.

Nomological

Nomological determinism, generally synonymous with physical determinism (its opposite being physical indeterminism), the most common form of causal determinism, is the notion that the past and the present dictate the future entirely and necessarily by rigid natural laws, that every occurrence results inevitably from prior events. Nomological determinism is sometimes illustrated by the thought experiment of Laplace's demon. Nomological determinism is sometimes called scientific determinism, although that is a misnomer.

Necessitarianism

Necessitarianism is closely related to the causal determinism described above. It is a metaphysical principle that denies all mere possibility; there is exactly one way for the world to be. Leucippus claimed there were no uncaused events, and that everything occurs for a reason and by necessity.

Predeterminism

Predeterminism is the idea that all events are determined in advance. The concept is often argued by invoking causal determinism, implying that there is an unbroken chain of prior occurrences stretching back to the origin of the universe. In the case of predeterminism, this chain of events has been pre-established, and human actions cannot interfere with the outcomes of this pre-established chain.

Predeterminism can be used to mean such pre-established causal determinism, in which case it is categorised as a specific type of determinism. It can also be used interchangeably with causal determinism—in the context of its capacity to determine future events. Despite this, predeterminism is often considered as independent of causal determinism.

Biological

The term predeterminism is also frequently used in the context of biology and heredity, in which case it represents a form of biological determinism, sometimes called genetic determinism. Biological determinism is the idea that all human behaviors, beliefs, and desires are fixed by human genetic nature.

Fatalism

Fatalism is normally distinguished from "determinism", as a form of teleological determinism. Fatalism is the idea that everything is fated to happen, so that humans have no control over their future. Fate has arbitrary power, and need not follow any causal or otherwise deterministic laws. Types of fatalism include hard theological determinism and the idea of predestination, where there is a God who determines all that humans will do. This may be accomplished either by knowing their actions in advance, via some form of omniscience, or by decreeing their actions in advance.

Theological determinism

Theological determinism is a form of determinism that holds that all events that happen are either preordained (i.e., predestined) to happen by a monotheistic deity, or are destined to occur given its omniscience. Two forms of theological determinism exist, referred to as strong and weak theological determinism.

Strong theological determinism is based on the concept of a creator deity dictating all events in history: "everything that happens has been predestined to happen by an omniscient, omnipotent divinity."

Weak theological determinism is based on the concept of divine foreknowledge—"because God's omniscience is perfect, what God knows about the future will inevitably happen, which means, consequently, that the future is already fixed." There exist slight variations on this categorisation, however. Some claim either that theological determinism requires predestination of all events and outcomes by the divinity—i.e., they do not classify the weaker version as theological determinism unless libertarian free will is assumed to be denied as a consequence—or that the weaker version does not constitute theological determinism at all.

With respect to free will, "theological determinism is the thesis that God exists and has infallible knowledge of all true propositions including propositions about our future actions," more minimal criteria designed to encapsulate all forms of theological determinism.

Theological determinism can also be seen as a form of causal determinism, in which the antecedent conditions are the nature and will of God. Some have asserted that Augustine of Hippo introduced theological determinism into Christianity in 412 CE, whereas all prior Christian authors supported free will against Stoic and Gnostic determinism. However, there are many Biblical passages that seem to support the idea of some kind of theological determinism.

Logical determinism

Adequate determinism focuses on the fact that, even without a full understanding of microscopic physics, we can predict the distribution of 1000 coin tosses.

Logical determinism, or determinateness, is the notion that all propositions, whether about the past, present, or future, are either true or false. Note that one can support causal determinism without necessarily supporting logical determinism, and vice versa (depending on one's views on the nature of time, but also randomness). The problem of free will is especially salient with logical determinism: how can choices be free, given that propositions about the future already have a truth value in the present? This is referred to as the "problem of future contingents".

Often synonymous with logical determinism are the ideas behind spatio-temporal determinism or eternalism: the view of special relativity. J. J. C. Smart, a proponent of this view, uses the term tenselessness to describe the simultaneous existence of past, present, and future. In physics, the "block universe" of Hermann Minkowski and Albert Einstein assumes that time is a fourth dimension (like the three spatial dimensions).

Adequate determinism

Adequate determinism is the idea that, because of quantum decoherence, quantum indeterminacy can be ignored for most macroscopic events. Random quantum events "average out" in the limit of large numbers of particles (where the laws of quantum mechanics asymptotically approach the laws of classical mechanics). Stephen Hawking explains a similar idea: he says that the microscopic world of quantum mechanics is one of determined probabilities. That is, quantum effects rarely alter the predictions of classical mechanics, which are quite accurate (albeit still not perfectly certain) at larger scales. Something as large as an animal cell, then, would be "adequately determined" (even in light of quantum indeterminacy).
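The "averaging out" of random events can be illustrated with a toy simulation (the function name `heads_fraction` is purely illustrative): no single coin toss is predictable, but the distribution of many tosses is.

```python
import random

def heads_fraction(n, seed=0):
    """Simulate n fair coin tosses and return the fraction of heads."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# Any single toss is unpredictable, but the aggregate is not: the heads
# fraction settles toward 0.5 as n grows (the random events "average out").
for n in (10, 1_000, 100_000):
    print(n, heads_fraction(n))
```

This is the statistical sense in which a macroscopic system built from many random microscopic events can still be "adequately determined".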

Many-worlds

The many-worlds interpretation accepts the linear causal sets of sequential events with adequate consistency, yet also suggests constant forking of causal chains, creating "multiple universes" to account for multiple outcomes from single events. On this view, the causal set of events leading to the present is fully valid, yet it appears as a single linear time stream within a much broader, unseen conic probability field of other outcomes that "split off" from the locally observed timeline. Under this model, causal sets are still "consistent", yet not exclusive to singular iterated outcomes.

The interpretation sidesteps the exclusive retrospective causal chain problem of "could not have done otherwise" by suggesting that "the other outcome does exist" in a set of parallel-universe time streams that split off when the action occurred. The theory is sometimes described with the example of agent-based choices, but more involved models argue that recursive causal splitting occurs with all particle wave functions at play. The model is highly contested, with multiple objections from the scientific community.

Philosophical varieties

Determinism in nature/nurture controversy

Nature and nurture interact in humans. A scientist looking at a sculpture after some time does not ask whether we are seeing the effects of the starting materials or of environmental influences.

Although some of the above forms of determinism concern human behaviors and cognition, others frame themselves as an answer to the debate on nature and nurture, suggesting that a single factor entirely determines behavior. As scientific understanding has grown, however, the strongest versions of these theories have been widely rejected as a single-cause fallacy; modern deterministic theories instead attempt to explain how the interaction of both nature and nurture is entirely predictable. The concept of heritability has been helpful in making this distinction.

Determinism and prediction

A technological determinist might suggest that technology like the mobile phone is the greatest factor shaping human civilization.

Other 'deterministic' theories actually seek only to highlight the importance of a particular factor in predicting the future. These theories often use the factor as a sort of guide or constraint on the future. They need not suppose that complete knowledge of that one factor would allow us to make perfect predictions.

Structural determinism

Structural determinism is the philosophical view that actions, events, and processes are predicated on and determined by structural factors: given any particular structure or set of estimable components, the concept emphasises rational and predictable outcomes. Chilean biologists Humberto Maturana and Francisco Varela popularised the notion, writing that a living system's general order is maintained via a circular process of ongoing self-referral, and thus its organisation and structure define the changes it undergoes. According to the authors, a system can undergo changes of state (alteration of structure without loss of identity) or disintegrations (alteration of structure with loss of identity). Such changes or disintegrations are not determined by the elements of the disturbing agent, as each disturbance will only trigger responses in the respective system, which in turn are determined by each system's own structure.

On an individualistic level, what this means is that human beings as free and independent entities are triggered to react by external stimuli or change in circumstance. However, their own internal state and existing physical and mental capacities determine their responses to those triggers. On a much broader societal level, structural determinists believe that larger issues in the society—especially those pertaining to minorities and subjugated communities—are predominantly assessed through existing structural conditions, making change of prevailing conditions difficult, and sometimes outright impossible. For example, the concept has been applied to the politics of race in the United States of America and other Western countries such as the United Kingdom and Australia, with structural determinists lamenting structural factors for the prevalence of racism in these countries. Additionally, Marxists have conceptualised the writings of Karl Marx within the context of structural determinism as well. For example, Louis Althusser, a structural Marxist, argues that the state, in its political, economic, and legal structures, reproduces the discourse of capitalism, in turn, allowing for the burgeoning of capitalistic structures.

Proponents of the notion highlight the usefulness of structural determinism to study complicated issues related to race and gender, as it highlights often gilded structural conditions that block meaningful change. Critics call it too rigid, reductionist and inflexible. Additionally, they also criticise the notion for overemphasising deterministic forces such as structure over the role of human agency and the ability of the people to act. These critics argue that politicians, academics, and social activists have the capability to bring about significant change despite stringent structural conditions.

With free will

Philosophers have debated both the truth of determinism, and the truth of free will. This creates the four possible positions in the figure. Compatibilism refers to the view that free will is, in some sense, compatible with determinism. The three incompatibilist positions deny this possibility: the hard incompatibilists hold that free will is incompatible with both determinism and indeterminism, the libertarians that determinism does not hold and free will might exist, and the hard determinists that determinism does hold and free will does not exist. The Dutch philosopher Baruch Spinoza was a determinist thinker, and argued that human freedom can be achieved through knowledge of the causes that determine our desires and affections. He defined human servitude as the state of bondage of anyone who is aware of their own desires, but ignorant of the causes that determined them. However, the free or virtuous person becomes capable, through reason and knowledge, of being genuinely free, even while being "determined". For the Dutch philosopher, acting out of one's own internal necessity is genuine freedom, while being driven by exterior determinations is akin to bondage. Spinoza's thoughts on human servitude and liberty are respectively detailed in the fourth and fifth volumes of his work Ethics.

The standard argument against free will, according to philosopher J. J. C. Smart, focuses on the implications of determinism for free will. He suggests free will is denied whether determinism is true or not. For if determinism is true, all actions are predicted and no one is assumed to be free; however, if determinism is false, all actions are presumed to be random and as such no one seems free because they have no part in controlling what happens.

With the soul

Some determinists argue that materialism does not present a complete understanding of the universe, because while it can describe determinate interactions among material things, it ignores the minds or souls of conscious beings.

A number of positions can be delineated:

  • Immaterial souls are all that exist (idealism).
  • Immaterial souls exist and exert a non-deterministic causal influence on bodies (traditional free-will, interactionist dualism).
  • Immaterial souls exist, but are part of a deterministic framework.
  • Immaterial souls exist, but exert no causal influence, free or determined (epiphenomenalism, occasionalism)
  • Immaterial souls do not exist – there is no mind-body dichotomy, and there is a materialistic explanation for intuitions to the contrary.

With ethics and morality

Another topic of debate is the implication that determinism has on morality. Hard determinism is particularly criticized for seeming to make traditional moral judgments impossible. Some philosophers find this an acceptable conclusion.

Philosopher and incompatibilist Peter van Inwagen introduces this thesis, arguing that free will is required for moral judgments, as follows:

  1. The moral judgment that X should not have been done implies that something else should have been done instead
  2. That something else should have been done instead implies that there was something else to do
  3. That there was something else to do implies that something else could have been done
  4. That something else could have been done implies that there is free will
  5. If there is no free will to have done other than X we cannot make the moral judgment that X should not have been done.

History

Determinism was developed by the Greek philosophers during the 6th and 5th centuries BCE by the Pre-Socratic philosophers Heraclitus and Leucippus, later Aristotle, and mainly by the Stoics. Some of the main philosophers who have dealt with this issue are Marcus Aurelius, Omar Khayyám, Thomas Hobbes, Baruch Spinoza, Gottfried Leibniz, David Hume, Baron d'Holbach (Paul Heinrich Dietrich), Pierre-Simon Laplace, Arthur Schopenhauer, William James, Friedrich Nietzsche, Albert Einstein, Niels Bohr, Ralph Waldo Emerson and, more recently, John Searle, Ted Honderich, and Daniel Dennett.

Mecca Chiesa notes that the probabilistic or selectionistic determinism of B. F. Skinner comprised a wholly separate conception of determinism that was not mechanistic at all. Mechanistic determinism assumes that every event has an unbroken chain of prior occurrences, but a selectionistic or probabilistic model does not.

Western tradition

In the West, some elements of determinism have been expressed in Greece from the 6th century BCE by the Presocratics Heraclitus and Leucippus. The first full-fledged notion of determinism appears to originate with the Stoics, as part of their theory of universal causal determinism. The resulting philosophical debates, which involved the confluence of elements of Aristotelian Ethics with Stoic psychology, led in the 1st–3rd centuries CE in the works of Alexander of Aphrodisias to the first recorded Western debate over determinism and freedom, an issue that is known in theology as the paradox of free will. The writings of Epictetus as well as middle Platonist and early Christian thought were instrumental in this development. Jewish philosopher Moses Maimonides said of the deterministic implications of an omniscient god: "Does God know or does He not know that a certain individual will be good or bad? If thou sayest 'He knows', then it necessarily follows that [that] man is compelled to act as God knew beforehand he would act, otherwise God's knowledge would be imperfect."

Newtonian mechanics

Determinism in the West is often associated with Newtonian mechanics/physics, which depicts the physical matter of the universe as operating according to a set of fixed, knowable laws. The "billiard ball" hypothesis, a product of Newtonian physics, argues that once the initial conditions of the universe have been established, the rest of the history of the universe follows inevitably. If it were actually possible to have complete knowledge of physical matter and all of the laws governing that matter at any one time, then it would be theoretically possible to compute the time and place of every event that will ever occur (Laplace's demon). In this sense, the basic particles of the universe operate in the same fashion as the rolling balls on a billiard table, moving and striking each other in predictable ways to produce predictable results.
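The "billiard ball" picture can be sketched in a few lines of code: a minimal, illustrative integrator (the function name `trajectory` and the step sizes are assumptions, not part of the source) in which each state follows entirely from the previous one.

```python
def trajectory(x0, v0, a=-9.81, dt=0.001, t_end=1.0):
    """Step a particle under constant acceleration from known initial
    conditions. Each state is fixed entirely by the previous one, so the
    whole path follows inevitably from (x0, v0) and the law of motion."""
    x, v, t = x0, v0, 0.0
    while t < t_end:
        v += a * dt   # update velocity from the force law
        x += v * dt   # update position from the velocity
        t += dt
    return x

# Identical initial conditions always yield the identical final state --
# Laplace's demon in miniature:
print(trajectory(0.0, 10.0) == trajectory(0.0, 10.0))  # True
```

Given complete initial conditions and the governing law, the computation (like the Newtonian universe it models) has exactly one possible outcome.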

Whether or not it is all-encompassing in so doing, Newtonian mechanics deals only with caused events; for example, if an object begins in a known position and is hit dead on by an object with some known velocity, then it will be pushed straight toward another predictable point. If it goes somewhere else, the Newtonians argue, one must question one's measurements of the original position of the object, the exact direction of the striking object, gravitational or other fields that were inadvertently ignored, etc. Then, they maintain, repeated experiments and improvements in accuracy will always bring one's observations closer to the theoretically predicted results. When dealing with situations on an ordinary human scale, Newtonian physics has been so enormously successful that it has no competition. But it fails spectacularly as velocities become some substantial fraction of the speed of light and when interactions at the atomic scale are studied. Before the discovery of quantum effects and other challenges to Newtonian physics, "uncertainty" was always a term that applied to the accuracy of human knowledge about causes and effects, and not to the causes and effects themselves.

Newtonian mechanics, as well as the physical theories that followed it, are results of observations and experiments, and so they describe "how it all works" within a tolerance. However, earlier Western scientists believed that if any logical connections are found between an observed cause and effect, there must also be some absolute natural law behind them. Belief in perfect natural laws driving everything, rather than laws that merely describe what we should expect, led to a search for a set of universal simple laws that rule the world. This movement significantly encouraged deterministic views in Western philosophy, as well as the related theological views of classical pantheism.

Eastern tradition

The idea that the entire universe is a deterministic system has been articulated in both Eastern and non-Eastern religion, philosophy, and literature.

In the I Ching and philosophical Taoism, the ebb and flow of favorable and unfavorable conditions suggests the path of least resistance is effortless (see Wu wei).

In the philosophical schools of the Indian Subcontinent, the concept of karma deals with similar philosophical issues to the western concept of determinism. Karma is understood as a spiritual mechanism which causes the entire cycle of rebirth (i.e. Saṃsāra). Karma, either positive or negative, accumulates according to an individual's actions throughout their life, and at their death determines the nature of their next life in the cycle of Saṃsāra. Most major religions originating in India hold this belief to some degree, most notably Hinduism, Jainism, Sikhism, and Buddhism.

The views on the interaction of karma and free will are numerous, and diverge from each other greatly. For example, in Sikhism, god's grace, gained through worship, can erase one's karmic debts, a belief which reconciles the principle of karma with a monotheistic god one must freely choose to worship. Jains believe in a sort of compatibilism, in which the cycle of Saṃsāra is a completely mechanistic process, occurring without any divine intervention. The Jains hold an atomic view of reality, in which particles of karma form the fundamental microscopic building material of the universe.

Buddhism

Buddhist philosophy contains several concepts which some scholars describe as deterministic to various levels. However, the direct analysis of Buddhist metaphysics through the lens of determinism is difficult, due to the differences between European and Buddhist traditions of thought.

One concept which is argued to support a hard determinism is the idea of dependent origination, which claims that all phenomena (dharma) are necessarily caused by some other phenomenon, on which they can be said to depend, like links in a massive chain. In traditional Buddhist philosophy, this concept is used to explain the functioning of the cycle of saṃsāra; all actions exert a karmic force, which will manifest results in future lives. In other words, righteous or unrighteous actions in one life will necessarily cause good or bad responses in another.

Another Buddhist concept which many scholars perceive to be deterministic is the idea of non-self, or anatta. In Buddhism, attaining enlightenment involves one realizing that in humans there is no fundamental core of being which can be called the "soul", and that humans are instead made of several constantly changing factors which bind them to the cycle of Saṃsāra.

Some scholars argue that the concept of non-self necessarily disproves the ideas of free will and moral culpability. If there is no autonomous self, in this view, and all events are necessarily and unchangeably caused by others, then no type of autonomy can be said to exist, moral or otherwise. However, other scholars disagree, claiming that the Buddhist conception of the universe allows for a form of compatibilism. Buddhism perceives reality occurring on two different levels, the ultimate reality which can only be truly understood by the enlightened, and the illusory and false material reality. Therefore, Buddhism perceives free will as a notion belonging to material reality, while concepts like non-self and dependent origination belong to the ultimate reality; the transition between the two can be truly understood, Buddhists claim, by one who has attained enlightenment.

Modern scientific perspective

Generative processes

Although it was once thought by scientists that any indeterminism in quantum mechanics occurred at too small a scale to influence biological or neurological systems, there is indication that nervous systems are influenced by quantum indeterminism due to chaos theory. It is unclear what implications this has for the problem of free will, given various possible reactions to the problem in the first place. Many biologists do not grant determinism: Christof Koch, for instance, argues against it, and in favour of libertarian free will, by making arguments based on generative processes (emergence). Other proponents of emergentist or generative philosophy, cognitive sciences, and evolutionary psychology argue that a certain form of determinism (not necessarily causal) is true. They suggest instead that an illusion of free will is experienced due to the generation of infinite behaviour from the interaction of a finite, deterministic set of rules and parameters. Thus the unpredictability of the emerging behaviour from deterministic processes leads to a perception of free will, even though free will as an ontological entity does not exist.

 

In Conway's Game of Life, the interaction of just four simple rules creates patterns that seem somehow "alive".

As an illustration, the strategy board games chess and Go have rigorous rules in which no information (such as cards' face values) is hidden from either player and no random events (such as dice rolls) occur within the game. Yet chess, and especially Go with its extremely simple deterministic rules, can still produce an extremely large number of unpredictable moves. When chess is simplified to 7 or fewer pieces, however, endgame tables are available that dictate which moves to play to achieve a perfect game. This implies that, given a less complex environment (with the original 32 pieces reduced to 7 or fewer pieces), a perfectly predictable game of chess is possible. In this scenario, the winning player can announce that a checkmate will happen within a given number of moves, assuming a perfect defense by the losing player, or fewer moves if the defending player chooses sub-optimal moves as the game progresses into its inevitable, predicted conclusion. By this analogy, it is suggested, the experience of free will emerges from the interaction of finite rules and deterministic parameters that generate nearly infinite and practically unpredictable behavioural responses. In theory, if all these events could be accounted for, and there were a known way to evaluate them, the seemingly unpredictable behaviour would become predictable. Another hands-on example of generative processes is John Horton Conway's playable Game of Life. Nassim Taleb is wary of such models, and coined the term "ludic fallacy".
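The generative character of the Game of Life can be made concrete with a minimal implementation sketch (the function name `life_step` and the set-of-coordinates representation are choices made here, not canonical):

```python
from collections import Counter

def life_step(live):
    """Advance Conway's Game of Life by one generation.
    `live` is a set of (x, y) coordinates of live cells; the update
    rule is completely deterministic."""
    # Count the live neighbours of every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives next generation if it has exactly 3 live neighbours,
    # or 2 live neighbours and is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The horizontal "blinker" flips to a vertical one and back: the same
# deterministic rules always generate the same successor pattern.
blinker = {(0, 0), (1, 0), (2, 0)}
print(life_step(blinker))             # the vertical blinker {(1, -1), (1, 0), (1, 1)}
print(life_step(life_step(blinker)))  # back to the horizontal blinker
```

A handful of fixed rules, applied mechanically, generate patterns whose long-run behaviour is practically unpredictable without simply running the system forward, which is the point of the generative-process argument above.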

Compatibility with the existence of science

Certain philosophers of science argue that, while causal determinism (in which everything including the brain/mind is subject to the laws of causality) is compatible with minds capable of science, fatalism and predestination is not. These philosophers make the distinction that causal determinism means that each step is determined by the step before and therefore allows sensory input from observational data to determine what conclusions the brain reaches, while fatalism in which the steps between do not connect an initial cause to the results would make it impossible for observational data to correct false hypotheses. This is often combined with the argument that if the brain had fixed views and the arguments were mere after-constructs with no causal effect on the conclusions, science would have been impossible and the use of arguments would have been a meaningless waste of energy with no persuasive effect on brains with fixed views.

Mathematical models

Many mathematical models of physical systems are deterministic. This is true of most models involving differential equations (notably, those measuring rate of change over time). Mathematical models that are not deterministic because they involve randomness are called stochastic. Because of sensitive dependence on initial conditions, some deterministic models may appear to behave non-deterministically; in such cases, a deterministic interpretation of the model may not be useful due to numerical instability and a finite amount of precision in measurement. Such considerations can motivate the use of a stochastic model even though the underlying system is governed by deterministic equations.
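Sensitive dependence on initial conditions can be demonstrated with the logistic map, a standard example of deterministic chaos (the function name `logistic_orbit` and the chosen parameters are illustrative):

```python
def logistic_orbit(x0, r=4.0, steps=60):
    """Iterate the deterministic logistic map x -> r*x*(1-x), returning
    the full orbit. Every step is exactly determined by the previous one."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(r * orbit[-1] * (1 - orbit[-1]))
    return orbit

# A deterministic model can still look unpredictable: two starting points
# differing by only 1e-10 soon produce completely different orbits.
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)
print(max(abs(x - y) for x, y in zip(a, b)))
```

The model is fully deterministic (rerunning it reproduces the orbit exactly), yet any finite imprecision in the initial measurement is amplified until prediction fails, which is precisely why a stochastic description can become the more useful one.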

Quantum and classical mechanics

Day-to-day physics

Since the beginning of the 20th century, quantum mechanics—the physics of the extremely small—has revealed previously concealed aspects of events. Before that, Newtonian physics—the physics of everyday life—dominated. Taken in isolation (rather than as an approximation to quantum mechanics), Newtonian physics depicts a universe in which objects move in perfectly determined ways. At the scale where humans exist and interact with the universe, Newtonian mechanics remain useful, and make relatively accurate predictions (e.g. calculating the trajectory of a bullet). But whereas in theory, absolute knowledge of the forces accelerating a bullet would produce an absolutely accurate prediction of its path, modern quantum mechanics casts reasonable doubt on this main thesis of determinism.

Quantum realm

Quantum physics works differently in many ways from Newtonian physics. Physicist Aaron D. O'Connell explains that understanding our universe, at such small scales as atoms, requires a different logic than day-to-day life does. O'Connell does not deny that it is all interconnected: the scale of human existence ultimately does emerge from the quantum scale. O'Connell argues that we must simply use different models and constructs when dealing with the quantum world. Quantum mechanics is the product of a careful application of the scientific method, logic and empiricism. The Heisenberg uncertainty principle is frequently confused with the observer effect. The uncertainty principle actually describes how precisely we may measure the position and momentum of a particle at the same time—if we increase the accuracy in measuring one quantity, we are forced to lose accuracy in measuring the other. "These uncertainty relations give us that measure of freedom from the limitations of classical concepts which is necessary for a consistent description of atomic processes."

Although it is not possible to predict the trajectory of any one particle, they all obey determined probabilities which do permit some prediction.

This is where statistical mechanics comes into play, and where physicists begin to require rather unintuitive mental models: a particle's path simply cannot be exactly specified in its full quantum description. "Path" is a classical, practical attribute in our everyday life, but one that quantum particles do not meaningfully possess. The probabilities discovered in quantum mechanics do nevertheless arise from measurement (of the perceived path of the particle). As Stephen Hawking explains, the result is not traditional determinism, but rather determined probabilities. In some cases, a quantum particle may indeed trace an exact path, and the probability of finding the particle on that path is one (certain to be true). In fact, as far as prediction goes, the quantum development is at least as predictable as the classical motion, but the key is that it describes wave functions that cannot be easily expressed in ordinary language. As far as the thesis of determinism is concerned, these probabilities, at least, are quite determined. These findings from quantum mechanics have found many applications, and allow us to build transistors and lasers. Put another way: personal computers, Blu-ray players and the Internet all work because humankind discovered the determined probabilities of the quantum world.

On the topic of predictable probabilities, the double-slit experiments are a popular example. Photons are fired one-by-one through a double-slit apparatus at a distant screen. They do not arrive at any single point, nor even the two points lined up with the slits (the way it might be expected of bullets fired by a fixed gun at a distant target). Instead, the light arrives in varying concentrations at widely separated points, and the distribution of its collisions with the target can be calculated reliably. In that sense the behavior of light in this apparatus is deterministic, but there is no way to predict where in the resulting interference pattern any individual photon will make its contribution (although, there may be ways to use weak measurement to acquire more information without violating the uncertainty principle).

Some (including Albert Einstein) have argued that the inability to predict any more than probabilities is simply due to ignorance. The idea is that, beyond the conditions and laws that can be observed or deduced, there are also hidden factors or "hidden variables" that determine absolutely in which order photons reach the detector screen. They argue that the course of the universe is absolutely determined, but that humans are screened from knowledge of the determinative factors. So, they say, it only appears that things proceed in a merely probabilistically determinative way; in actuality, they proceed in an absolutely deterministic way.

John S. Bell criticized Einstein's work in his famous Bell's theorem, which proved that quantum mechanics can make statistical predictions that would be violated if local hidden variables really existed. A number of experiments have tested these predictions, and so far quantum mechanics does not appear to be violated. Current experiments continue to verify the result, including the 2015 "loophole-free test" that plugged all known sources of error and the 2017 "Cosmic Bell Test" experiment that used cosmic data streaming from different directions toward the Earth, precluding the possibility that the sources of data could have had prior interactions. However, it is possible to augment quantum mechanics with non-local hidden variables to achieve a deterministic theory that is in agreement with experiment. An example is the Bohm interpretation of quantum mechanics. Bohm's interpretation, though, violates special relativity, and it is highly controversial whether or not it can be reconciled without giving up on determinism.

More advanced variations on these arguments include quantum contextuality, by Bell, Simon B. Kochen and Ernst Specker, which argues that hidden variable theories cannot be "sensible", meaning that the values of the hidden variables inherently depend on the devices used to measure them.

This debate is relevant because there are possibly specific situations in which the arrival of an electron at a screen at a certain point and time would trigger one event, whereas its arrival at another point would trigger an entirely different event (e.g. see Schrödinger's cat – a thought experiment used as part of a deeper debate).

Thus, quantum physics casts reasonable doubt on the traditional determinism of classical, Newtonian physics in so far as reality does not seem to be absolutely determined. This was the subject of the famous Bohr–Einstein debates between Einstein and Niels Bohr and there is still no consensus. Adequate determinism (see Varieties, above) is the reason that Stephen Hawking calls libertarian free will "just an illusion".
