Sunday, May 3, 2020

Newcomb's paradox

From Wikipedia, the free encyclopedia
 
In philosophy and mathematics, Newcomb's paradox, also referred to as Newcomb's problem, is a thought experiment involving a game between two players, one of whom is able to predict the future.
Newcomb's paradox was created by William Newcomb of the University of California's Lawrence Livermore Laboratory. However, it was first analyzed in a philosophy paper by Robert Nozick in 1969, and appeared in the March 1973 issue of Scientific American, in Martin Gardner's "Mathematical Games." Today it is a much debated problem in the philosophical branch of decision theory.

The problem

There is an infallible predictor, a player, and two boxes designated A and B. The player is given a choice between taking only box B, or taking both boxes A and B. The player knows the following:[4]
  • Box A is clear, and always contains a visible $1,000.
  • Box B is opaque, and its content has already been set by the predictor:
    • If the predictor has predicted the player will take both boxes A and B, then box B contains nothing.
    • If the predictor has predicted that the player will take only box B, then box B contains $1,000,000.
The player does not know what the predictor predicted or what box B contains while making his/her choice.

Game theory strategies

Predicted choice | Actual choice | Payout
A + B            | A + B         | $1,000
A + B            | B             | $0
B                | A + B         | $1,001,000
B                | B             | $1,000,000

In his 1969 article, Nozick noted that "To almost everyone, it is perfectly clear and obvious what should be done. The difficulty is that these people seem to divide almost evenly on the problem, with large numbers thinking that the opposing half is just being silly." The problem continues to divide philosophers today.

Game theory offers two strategies for this game that rely on different principles: the expected utility principle and the strategic dominance principle. The problem is called a paradox because two analyses that both sound intuitively logical give conflicting answers to the question of what choice maximizes the player's payout.
  • Considering the expected utility when the probability of the predictor being right is almost certain or certain, the player should choose box B. This choice statistically maximizes the player's winnings, setting them at about $1,000,000 per game.
  • Under the dominance principle, the player should choose the strategy that is always better; choosing both boxes A and B will always yield $1,000 more than choosing only B. However, the expected utility of "always $1,000 more than B" depends on the statistical payout of the game; when the predictor's prediction is almost certain or certain, choosing both A and B sets the player's winnings at about $1,000 per game. (The sketch after this list works through both analyses.)
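A minimal sketch of the two analyses, assuming the predictor is correct with probability p (the payoff amounts follow the table above; the function name is illustrative only):

```python
# Sketch: expected utility vs. dominance in Newcomb's problem,
# assuming the predictor is correct with probability p.

def expected_utility(p):
    """Expected payout of each choice when the predictor is right with probability p."""
    one_box = p * 1_000_000 + (1 - p) * 0        # take only box B
    two_box = p * 1_000 + (1 - p) * 1_001_000    # take boxes A + B
    return one_box, two_box

# Dominance reasoning ignores p: for either fixed prediction,
# taking both boxes pays exactly $1,000 more than taking only B.
for p in (0.5, 0.9, 0.99, 1.0):
    one_box, two_box = expected_utility(p)
    print(f"p={p:.2f}  one-box: ${one_box:,.0f}  two-box: ${two_box:,.0f}")

# With p close to 1, one-boxing yields about $1,000,000 per game,
# while two-boxing yields about $1,000: the conflict described above.
```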
David Wolpert and Gregory Benford point out that paradoxes arise when not all relevant details of a problem are specified, and there is more than one "intuitively obvious" way to fill in those missing details. They suggest that in the case of Newcomb's paradox, the conflict over which of the two strategies is "obviously correct" reflects the fact that filling in the details in Newcomb's problem can result in two different noncooperative games, and each of the strategies is reasonable for one game but not the other. They then derive the optimal strategies for both of the games, which turn out to be independent of the predictor's infallibility, questions of causality, determinism, and free will.

Causality and free will

Predicted choice | Actual choice | Payout
A + B            | A + B         | $1,000
B                | B             | $1,000,000

Causality issues arise when the predictor is posited as infallible and incapable of error; Nozick avoids this issue by positing that the predictor's predictions are "almost certainly" correct, thus sidestepping any issues of infallibility and causality. Nozick also stipulates that if the predictor predicts that the player will choose randomly, then box B will contain nothing. This assumes that inherently random or unpredictable events would not come into play anyway during the process of making the choice, such as free will or quantum mind processes. However, these issues can still be explored in the case of an infallible predictor. Under this condition, it seems that taking only B is the correct option. This analysis argues that we can ignore the possibilities that return $0 and $1,001,000, as they both require that the predictor has made an incorrect prediction, and the problem states that the predictor is never wrong. Thus, the choice becomes whether to take both boxes with $1,000 or to take only box B with $1,000,000—so taking only box B is always better.

William Lane Craig has suggested that, in a world with perfect predictors (or time machines, because a time machine could be used as a mechanism for making a prediction), retrocausality can occur. If a person truly knows the future, and that knowledge affects their actions, then events in the future will be causing effects in the past. The chooser's choice will have already caused the predictor's action. Some have concluded that if time machines or perfect predictors can exist, then there can be no free will and choosers will do whatever they're fated to do. Taken together, the paradox is a restatement of the old contention that free will and determinism are incompatible, since determinism enables the existence of perfect predictors. Put another way, this paradox can be equivalent to the grandfather paradox; the paradox presupposes a perfect predictor, implying the "chooser" is not free to choose, yet simultaneously presumes a choice can be debated and decided. This suggests to some that the paradox is an artifact of these contradictory assumptions.

Gary Drescher argues in his book Good and Real that the correct decision is to take only box B, by appealing to a situation he argues is analogous—a rational agent in a deterministic universe deciding whether or not to cross a potentially busy street.

Andrew Irvine argues that the problem is structurally isomorphic to Braess' paradox, a non-intuitive but ultimately non-paradoxical result concerning equilibrium points in physical systems of various kinds.

Simon Burgess has argued that the problem can be divided into two stages: the stage before the predictor has gained all the information on which the prediction will be based, and the stage after it. While the player is still in the first stage, they are presumably able to influence the predictor's prediction, for example by committing to taking only one box. Burgess argues that after the first stage is done, the player can decide to take both boxes A and B without influencing the predictor, thus reaching the maximum payout. This assumes that the predictor cannot predict the player's thought process in the second stage, and that the player can change their mind at the second stage without influencing the predictor's prediction. Burgess says that given his analysis, Newcomb's problem is akin to the toxin puzzle. This is because both problems highlight the fact that one can have a reason to intend to do something without having a reason to actually do it.

Consciousness

Newcomb's paradox can also be related to the question of machine consciousness, specifically if a perfect simulation of a person's brain will generate the consciousness of that person. Suppose we take the predictor to be a machine that arrives at its prediction by simulating the brain of the chooser when confronted with the problem of which box to choose. If that simulation generates the consciousness of the chooser, then the chooser cannot tell whether they are standing in front of the boxes in the real world or in the virtual world generated by the simulation in the past. The "virtual" chooser would thus tell the predictor which choice the "real" chooser is going to make.

Fatalism

Newcomb's paradox is related to logical fatalism in that they both suppose absolute certainty of the future. In logical fatalism, this assumption of certainty creates circular reasoning ("a future event is certain to happen, therefore it is certain to happen"), while Newcomb's paradox considers whether the participants of its game are able to affect a predestined outcome.

Extensions to Newcomb's problem

Many thought experiments similar to or based on Newcomb's problem have been discussed in the literature. For example, a quantum-theoretical version of Newcomb's problem in which box B is entangled with box A has been proposed.

The meta-Newcomb problem

Another related problem is the meta-Newcomb problem. The setup of this problem is similar to the original Newcomb problem. However, the twist here is that the predictor may elect to decide whether to fill box B after the player has made a choice, and the player does not know whether box B has already been filled. There is also another predictor: a "meta-predictor" who has reliably predicted both the players and the predictor in the past, and who predicts the following: "Either you will choose both boxes, and the predictor will make its decision after you, or you will choose only box B, and the predictor will already have made its decision."

In this situation, a proponent of choosing both boxes is faced with the following dilemma: if the player chooses both boxes, the predictor will not yet have made its decision, and therefore a more rational choice would be for the player to choose box B only. But if the player so chooses, the predictor will already have made its decision, making it impossible for the player's decision to affect the predictor's decision.

Ensemble interpretation

From Wikipedia, the free encyclopedia
 
The ensemble interpretation of quantum mechanics considers the quantum state description to apply only to an ensemble of similarly prepared systems, rather than supposing that it exhaustively represents an individual physical system.

The advocates of the ensemble interpretation of quantum mechanics claim that it is minimalist, making the fewest physical assumptions about the meaning of the standard mathematical formalism. It proposes to take to the fullest extent the statistical interpretation of Max Born, for which he won the Nobel Prize in Physics. For example, a new version of the ensemble interpretation that relies on a new formulation of probability theory was introduced by Raed Shaiia, which showed that the laws of quantum mechanics are the inevitable result of this new formulation. On the face of it, the ensemble interpretation might appear to contradict the doctrine proposed by Niels Bohr, that the wave function describes an individual system or particle, not an ensemble, though he accepted Born's statistical interpretation of quantum mechanics. It is not quite clear exactly what kind of ensemble Bohr intended to exclude, since he did not describe probability in terms of ensembles. The ensemble interpretation is sometimes, especially by its proponents, called "the statistical interpretation", but it seems perhaps different from Born's statistical interpretation.

As is the case for "the" Copenhagen interpretation, "the" ensemble interpretation might not be uniquely defined. In one view, the ensemble interpretation may be defined as that advocated by Leslie E. Ballentine, Professor at Simon Fraser University. His interpretation does not attempt to justify, or otherwise derive, or explain quantum mechanics from any deterministic process, or make any other statement about the real nature of quantum phenomena; it intends simply to interpret the wave function. It does not propose to lead to actual results that differ from orthodox interpretations. It makes the statistical operator primary in reading the wave function, deriving the notion of a pure state from that. In the opinion of Ballentine, perhaps the most notable supporter of such an interpretation was Albert Einstein:
The attempt to conceive the quantum-theoretical description as the complete description of the individual systems leads to unnatural theoretical interpretations, which become immediately unnecessary if one accepts the interpretation that the description refers to ensembles of systems and not to individual systems.
— Albert Einstein
Nevertheless, one may doubt as to whether Einstein, over the years, had in mind one definite kind of ensemble.

Meaning of "ensemble" and "system"

Perhaps the first expression of an ensemble interpretation was that of Max Born. In a 1968 article, he used the German words 'Haufen gleicher', which are often translated into English, in this context, as 'ensemble' or 'assembly'. The atoms in his assembly were uncoupled, meaning that they were an imaginary set of independent atoms that defines its observable statistical properties. Born did not mean an ensemble of instances of a certain kind of wave function, nor one composed of instances of a certain kind of state vector. There may be room here for confusion or miscommunication.

An example of an ensemble is composed by preparing and observing many copies of one and the same kind of quantum system. This is referred to as an ensemble of systems. It is not, for example, a single preparation and observation of one simultaneous set ("ensemble") of particles. A single body of many particles, as in a gas, is not an "ensemble" of particles in the sense of the "ensemble interpretation", although a repeated preparation and observation of many copies of one and the same kind of body of particles may constitute an "ensemble" of systems, each system being a body of many particles. The ensemble is not in principle confined to such a laboratory paradigm, but may be a natural system conceived of as occurring repeatedly in nature; it is not quite clear whether or how this might be realized.

The members of the ensemble are said to be in the same state, and this defines the term 'state'. The state is mathematically denoted by a mathematical object called a statistical operator. Such an operator is a map from a certain corresponding Hilbert space to itself, and may be written as a density matrix. It is characteristic of the ensemble interpretation to define the state by the statistical operator. Other interpretations may instead define the state by the corresponding Hilbert space. Such a difference between the modes of definition of state seems to make no difference to the physical meaning. Indeed, according to Ballentine, one can define the state by an ensemble of identically prepared systems, denoted by a point in the Hilbert space, as is perhaps more customary. The link is established by making the observing procedure a copy of the preparative procedure; mathematically the corresponding Hilbert spaces are mutually dual. Since Bohr's concern was that the specimen phenomena are joint preparation-observation occasions, it is not evident that the Copenhagen and ensemble interpretations differ substantially in this respect.

According to Ballentine, the distinguishing difference between the Copenhagen interpretation (CI) and the ensemble interpretation (EI) is the following:

CI: A pure state |ψ⟩ provides a "complete" description of an individual system, in the sense that a dynamical variable represented by the operator R has a definite value (r, say) if and only if R|ψ⟩ = r|ψ⟩.

EI: A pure state describes the statistical properties of an ensemble of identically prepared systems, of which the statistical operator is idempotent.

Ballentine emphasizes that the meaning of the "Quantum State" or "State Vector" may be described, essentially, by a one-to-one correspondence to the probability distributions of measurement results, not the individual measurement results themselves. A mixed state is a description only of the probabilities of positions, not a description of actual individual positions. A mixed state is a mixture of probabilities of physical states, not a coherent superposition of physical states.
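The distinction between a pure state (idempotent statistical operator, coherent superposition) and a mixed state (probabilities only) can be made concrete with a small numerical sketch; the particular states below are arbitrary illustrations:

```python
import numpy as np

# Pure state |psi> = (|0> + |1>)/sqrt(2): a coherent superposition.
psi = np.array([1, 1]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())          # statistical operator of the pure state

# Mixed state: 50/50 classical mixture of |0> and |1> (probabilities only).
rho_mixed = 0.5 * np.diag([1, 0]) + 0.5 * np.diag([0, 1])

print(np.allclose(rho_pure @ rho_pure, rho_pure))     # True: idempotent
print(np.allclose(rho_mixed @ rho_mixed, rho_mixed))  # False: not idempotent

# Off-diagonal terms (coherences) are present only in the pure-state operator.
print(rho_pure)    # [[0.5, 0.5], [0.5, 0.5]]
print(rho_mixed)   # [[0.5, 0.0], [0.0, 0.5]]
```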

Ensemble interpretation applied to single systems

The statement that the quantum mechanical wave function itself does not apply to a single system in one sense does not imply that the ensemble interpretation itself does not apply to single systems in the sense meant by the ensemble interpretation. The condition is that there is not a direct one-to-one correspondence of the wave function with an individual system that might imply, for example, that an object might physically exist in two states simultaneously. The ensemble interpretation may well be applied to a single system or particle, predicting the probability that, on repeated measurements, that single system will yield a given value of one of its properties.

Consider the throwing of two dice simultaneously on a craps table. The system in this case would consist of only the two dice. There are probabilities of various results, e.g. two fives, two twos, a one and a six, etc. Throwing the pair of dice 100 times would result in an ensemble of 100 trials. Classical statistics would then be able to predict what typically would be the number of times that certain results would occur. However, classical statistics would not be able to predict what definite single result would occur with a single throw of the pair of dice. That is, probabilities applied to single one-off events are, essentially, meaningless, except in the case of a probability equal to 0 or 1. It is in this way that the ensemble interpretation states that the wave function does not apply to an individual system. That is, by individual system, it is meant a single experiment or single throw of the dice, of that system.
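A short simulation makes the point concrete (a hypothetical sketch; the seed and trial count are arbitrary): classical statistics describes the ensemble of throws, but says nothing definite about any single throw.

```python
import random
from collections import Counter

random.seed(0)

# An "ensemble" of 100 trials: each trial is one throw of a pair of dice.
totals = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100)]

# Statistics describes the ensemble: a total of 7 should occur in roughly 1/6 of trials.
print(Counter(totals))
# But nothing in the statistics says what the next single throw will be.
```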

The craps throws could equally well have been of only one die, that is, a single system or particle. Classical statistics would also equally account for repeated throws of this single die. It is in this manner that the ensemble interpretation is quite able to deal with "single" or individual systems on a probabilistic basis. The standard Copenhagen Interpretation (CI) is no different in this respect. A fundamental principle of QM is that only probabilistic statements may be made, whether for individual systems/particles, a simultaneous group of systems/particles, or a collection (ensemble) of systems/particles. An identification that the wave function applies to an individual system in standard CI QM does not defeat the inherent probabilistic nature of any statement that can be made within standard QM. To verify the probabilities of quantum mechanical predictions, however interpreted, inherently requires the repetition of experiments, i.e. an ensemble of systems in the sense meant by the ensemble interpretation. QM cannot state that a single particle will definitely be in a certain position, with a certain momentum, at a later time, irrespective of whether or not the wave function is taken to apply to that single particle. In this way, the standard CI also "fails" to completely describe "single" systems.

However, it should be stressed that, in contrast to classical systems and older ensemble interpretations, the modern ensemble interpretation as discussed here, does not assume, nor require, that there exist specific values for the properties of the objects of the ensemble, prior to measurement.

Preparative and observing devices as origins of quantum randomness

An isolated quantum mechanical system, specified by a wave function, evolves in time in a deterministic way according to the Schrödinger equation that is characteristic of the system. Though the wave function can generate probabilities, no randomness or probability is involved in the temporal evolution of the wave function itself. This is agreed, for example, by Born, Dirac, von Neumann, London & Bauer, Messiah, and Feynman & Hibbs. An isolated system is not subject to observation; in quantum theory, this is because observation is an intervention that violates isolation.
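The deterministic character of this evolution is easy to exhibit numerically (a minimal sketch with an arbitrary two-level Hamiltonian; none of the numbers come from the authors cited): the wave function at time t follows uniquely from the wave function at time zero, and probability enters only when the Born rule is applied.

```python
import numpy as np

H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                      # arbitrary two-level Hamiltonian, hbar = 1

def evolve(psi, t):
    """Deterministic evolution psi(t) = exp(-iHt) psi(0), built from the eigenbasis of H."""
    evals, evecs = np.linalg.eigh(H)
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    return U @ psi

psi0 = np.array([1.0, 0.0], dtype=complex)      # start in |0>
psi_t = evolve(psi0, 0.7)

# No randomness enters the evolution; rerunning gives the same psi_t every time.
# Probability appears only when the Born rule is applied to the evolved state:
print(np.abs(psi_t) ** 2)                       # P(|0>), P(|1>), summing to 1
```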

The system's initial state is defined by the preparative procedure; this is recognized in the ensemble interpretation, as well as in the Copenhagen approach. The system's state as prepared, however, does not entirely fix all properties of the system. The fixing of properties goes only as far as is physically possible, and is not physically exhaustive; it is, however, physically complete in the sense that no physical procedure can make it more detailed. This is stated clearly by Heisenberg in his 1927 paper. It leaves room for further unspecified properties. For example, if the system is prepared with a definite energy, then the quantum mechanical phase of the wave function is left undetermined by the mode of preparation. The ensemble of prepared systems, in a definite pure state, then consists of a set of individual systems, all having one and the same definite energy, but each having a different quantum mechanical phase, regarded as probabilistically random. The wave function, however, does have a definite phase, and thus specification by a wave function is more detailed than specification by state as prepared. The members of the ensemble are logically distinguishable by their distinct phases, though the phases are not defined by the preparative procedure. The wave function can be multiplied by a complex number of unit magnitude without changing the state as defined by the preparative procedure.
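The last point, that multiplication by a unit-magnitude complex number leaves the prepared state unchanged, can be checked directly (an illustrative NumPy sketch with an arbitrary state):

```python
import numpy as np

psi = np.array([0.6, 0.8j])                    # some normalized state
phase = np.exp(1j * 1.234)                     # arbitrary unit-magnitude factor
psi_shifted = phase * psi

# Born-rule probabilities are unchanged by the overall phase...
print(np.allclose(np.abs(psi) ** 2, np.abs(psi_shifted) ** 2))         # True
# ...and so is the statistical operator that defines the prepared state.
print(np.allclose(np.outer(psi, psi.conj()),
                  np.outer(psi_shifted, psi_shifted.conj())))          # True
```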

The preparative state, with unspecified phase, leaves room for the several members of the ensemble to interact, each in its own way, with other systems. An example is when an individual system is passed to an observing device so as to interact with it. Individual systems with various phases are scattered in various respective directions in the analyzing part of the observing device, in a probabilistic way. In each such direction, a detector is placed, in order to complete the observation. When the system hits the analyzing part of the observing device, which scatters it, it ceases to be adequately described by its own wave function in isolation. Instead it interacts with the observing device in ways partly determined by the properties of the observing device. In particular, there is in general no phase coherence between system and observing device. This lack of coherence introduces an element of probabilistic randomness to the system–device interaction. It is this randomness that is described by the probability calculated by the Born rule. There are two independent originative random processes, one that of the preparative phase, the other that of the phase of the observing device. The random process that is actually observed, however, is neither of those originative ones. It is the phase difference between them, a single derived random process.

The Born rule describes that derived random process, the observation of a single member of the preparative ensemble. In the ordinary language of classical or Aristotelian scholarship, the preparative ensemble consists of many specimens of a species. The quantum mechanical technical term 'system' refers to a single specimen, a particular object that may be prepared or observed. Such an object, as is generally so for objects, is in a sense a conceptual abstraction, because, according to the Copenhagen approach, it is defined, not in its own right as an actual entity, but by the two macroscopic devices that should prepare and observe it. The random variability of the prepared specimens does not exhaust the randomness of a detected specimen. Further randomness is injected by the quantum randomness of the observing device. It is this further randomness that makes Bohr emphasize that there is randomness in the observation that is not fully described by the randomness of the preparation. This is what Bohr means when he says that the wave function describes "a single system". He is focusing on the phenomenon as a whole, recognizing that the preparative state leaves the phase unfixed, and therefore does not exhaust the properties of the individual system. The phase of the wave function encodes further detail of the properties of the individual system. The interaction with the observing device reveals that further encoded detail. It seems that this point, emphasized by Bohr, is not explicitly recognized by the ensemble interpretation, and this may be what distinguishes the two interpretations. It seems, however, that this point is not explicitly denied by the ensemble interpretation. 

Einstein perhaps sometimes seemed to interpret the probabilistic "ensemble" as a preparative ensemble, recognizing that the preparative procedure does not exhaustively fix the properties of the system; therefore he said that the theory is "incomplete". Bohr, however, insisted that the physically important probabilistic "ensemble" was the combined prepared-and-observed one. Bohr expressed this by demanding that an actually observed single fact should be a complete "phenomenon", not a system alone, but always with reference to both the preparing and the observing devices. The Einstein–Podolsky–Rosen criterion of "completeness" is clearly and importantly different from Bohr's. Bohr regarded his concept of "phenomenon" as a major contribution that he offered for quantum theoretical understanding. The decisive randomness comes from both preparation and observation, and may be summarized in a single randomness, that of the phase difference between preparative and observing devices. The distinction between these two devices is an important point of agreement between Copenhagen and ensemble interpretations. Though Ballentine claims that Einstein advocated "the ensemble approach", a detached scholar would not necessarily be convinced by that claim of Ballentine. There is room for confusion about how "the ensemble" might be defined.

"Each photon interferes only with itself"

Niels Bohr famously insisted that the wave function refers to a single individual quantum system. He was expressing the idea that Dirac expressed when he famously wrote: "Each photon then interferes only with itself. Interference between different photons never occurs.". Dirac clarified this by writing: "This, of course, is true only provided the two states that are superposed refer to the same beam of light, i.e. all that is known about the position and momentum of a photon in either of these states must be the same for each." Bohr wanted to emphasize that a superposition is different from a mixture. He seemed to think that those who spoke of a "statistical interpretation" were not taking that into account. To create, by a superposition experiment, a new and different pure state, from an original pure beam, one can put absorbers and phase-shifters into some of the sub-beams, so as to alter the composition of the re-constituted superposition. But one cannot do so by mixing a fragment of the original unsplit beam with component split sub-beams. That is because one photon cannot both go into the unsplit fragment and go into the split component sub-beams. Bohr felt that talk in statistical terms might hide this fact. 

The physics here is that the effect of the randomness contributed by the observing apparatus depends on whether the detector is in the path of a component sub-beam, or in the path of the single superposed beam. This is not explained by the randomness contributed by the preparative device.

Measurement and collapse

Bras and kets

The ensemble interpretation is notable for its relative de-emphasis on the duality and theoretical symmetry between bras and kets. The approach emphasizes the ket as signifying a physical preparation procedure. There is little or no expression of the dual role of the bra as signifying a physical observational procedure. The bra is mostly regarded as a mere mathematical object, without very much physical significance. It is the absence of the physical interpretation of the bra that allows the ensemble approach to by-pass the notion of "collapse". Instead, the density operator expresses the observational side of the ensemble interpretation. It hardly needs saying that this account could be expressed in a dual way, with bras and kets interchanged, mutatis mutandis. In the ensemble approach, the notion of the pure state is conceptually derived by analysis of the density operator, rather than the density operator being conceived as conceptually synthesized from the notion of the pure state. 

An attraction of the ensemble interpretation is that it appears to dispense with the metaphysical issues associated with reduction of the state vector, Schrödinger cat states, and other issues related to the concepts of multiple simultaneous states. The ensemble interpretation postulates that the wave function only applies to an ensemble of systems as prepared, but not observed. There is no recognition of the notion that a single specimen system could manifest more than one state at a time, as assumed, for example, by Dirac. Hence, the wave function is not envisaged as being physically required to be "reduced". This can be illustrated by an example:

Consider a quantum die. If this is expressed in Dirac notation, the "state" of the die can be represented by a "wave" function describing the probability of an outcome given by:

|ψ⟩ = (1/√6)(|1⟩ + |2⟩ + |3⟩ + |4⟩ + |5⟩ + |6⟩)

where the "+" sign of a probabilistic equation is not an addition operator but a standard probabilistic or Boolean logical OR operator. The state vector is inherently defined as a probabilistic mathematical object such that the result of a measurement is one outcome OR another outcome.

It is clear that on each throw, only one of the states will be observed, but this is not expressed by a bra. Consequently, there appears to be no requirement for a notion of collapse of the wave function/reduction of the state vector, or for the die to physically exist in the summed state. In the ensemble interpretation, wave function collapse would make as much sense as saying that the number of children a couple produced collapsed to 3 from its average value of 2.4.
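Read this way, the quantum die can be sketched in a few lines (hypothetical code, assuming the equal-amplitude state written above): the ket fixes only the statistics of many throws, and each individual throw yields exactly one face, with nothing left over to "collapse".

```python
import numpy as np

rng = np.random.default_rng(0)

amplitudes = np.full(6, 1 / np.sqrt(6))        # the quantum die state written above
probs = np.abs(amplitudes) ** 2                # Born rule: 1/6 for each face

# Each individual "throw" (measurement) produces a single outcome; the ket only
# fixes the statistics of the ensemble of repeated throws.
throws = rng.choice(np.arange(1, 7), size=6000, p=probs)
print(np.bincount(throws)[1:])                 # counts per face, each near 1000
```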

The state function is not taken to be physically real, or to be a literal summation of states. The wave function is taken to be an abstract statistical function, only applicable to the statistics of repeated preparation procedures. The ket does not directly apply to a single particle detection, but only to the statistical results of many. This is why the account does not refer to bras, and mentions only kets.

Diffraction

The ensemble approach differs significantly from the Copenhagen approach in its view of diffraction. The Copenhagen interpretation of diffraction, especially in the viewpoint of Niels Bohr, puts weight on the doctrine of wave–particle duality. In this view, a particle that is diffracted by a diffractive object, such as for example a crystal, is regarded as really and physically behaving like a wave, split into components, more or less corresponding to the peaks of intensity in the diffraction pattern. Though Dirac does not speak of wave–particle duality, he does speak of "conflict" between wave and particle conceptions. He indeed does describe a particle, before it is detected, as being somehow simultaneously and jointly or partly present in the several beams into which the original beam is diffracted. So does Feynman, who speaks of this as "mysterious".

The ensemble approach points out that this seems perhaps reasonable for a wave function that describes a single particle, but hardly makes sense for a wave function that describes a system of several particles. The ensemble approach demystifies this situation along the lines advocated by Alfred Landé, accepting Duane's hypothesis. In this view, the particle really and definitely goes into one or other of the beams, according to a probability given by the wave function appropriately interpreted. There is definite quantal transfer of translative momentum between particle and diffractive object. This is recognized also in Heisenberg's 1930 textbook, though usually not recognized as part of the doctrine of the so-called "Copenhagen interpretation". This gives a clear and utterly non-mysterious physical or direct explanation instead of the debated concept of wave function "collapse". It is presented in terms of quantum mechanics by other present day writers also, for example, Van Vliet. For those who prefer physical clarity rather than mysterianism, this is an advantage of the ensemble approach, though it is not the sole property of the ensemble approach. With a few exceptions, this demystification is not recognized or emphasized in many textbooks and journal articles.
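The quantitative content of Duane's hypothesis can be sketched numerically (a hypothetical example with illustrative parameters, not drawn from Heisenberg's or Landé's texts): if a periodic structure of spacing d can exchange transverse momentum only in quanta of h/d, a particle of momentum p = h/λ is deflected through angles satisfying sin θ = nλ/d, which are exactly the usual diffraction maxima.

```python
import numpy as np

h = 6.62607015e-34          # Planck constant, J*s
m = 9.1093837015e-31        # electron mass, kg

def deflection_angles(v, d, n_max=3):
    """Allowed deflection angles (degrees) for an electron of speed v meeting a
    periodic structure of spacing d, if transverse momentum is exchanged only in
    quanta n*h/d (Duane's hypothesis)."""
    p = m * v                                   # electron momentum
    wavelength = h / p                          # de Broglie wavelength
    n = np.arange(1, n_max + 1)
    sin_theta = n * wavelength / d              # quantized transverse kick n*h/d on momentum p
    return np.degrees(np.arcsin(sin_theta[sin_theta <= 1]))

# Example: 1e6 m/s electrons on a 2 nm grating; the angles coincide with the
# usual diffraction maxima d*sin(theta) = n*lambda.
print(deflection_angles(v=1.0e6, d=2e-9))
```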

Criticism

David Mermin sees the ensemble interpretation as being motivated by an adherence ("not always acknowledged") to classical principles.
"[...] the notion that probabilistic theories must be about ensembles implicitly assumes that probability is about ignorance. (The 'hidden variables' are whatever it is that we are ignorant of.) But in a non-deterministic world probability has nothing to do with incomplete knowledge, and ought not to require an ensemble of systems for its interpretation".
However, according to Einstein and others, a key motivation for the ensemble interpretation is not any alleged, implicitly assumed probabilistic ignorance, but the removal of "…unnatural theoretical interpretations…". A specific example is the Schrödinger cat problem discussed below, but the concept applies to any system where an interpretation postulates, for example, that an object might exist in two positions at once.

Mermin also emphasises the importance of describing single systems, rather than ensembles.
"The second motivation for an ensemble interpretation is the intuition that because quantum mechanics is inherently probabilistic, it only needs to make sense as a theory of ensembles. Whether or not probabilities can be given a sensible meaning for individual systems, this motivation is not compelling. For a theory ought to be able to describe as well as predict the behavior of the world. The fact that physics cannot make deterministic predictions about individual systems does not excuse us from pursuing the goal of being able to describe them as they currently are."

Single particles

According to proponents of this interpretation, no single system is ever required to be postulated to exist in a physical mixed state, so the state vector does not need to collapse.

It can also be argued that this notion is consistent with the standard interpretation in that, in the Copenhagen interpretation, statements about the exact system state prior to measurement cannot be made. That is, if it were possible to absolutely, physically measure, say, a particle in two positions at once, then quantum mechanics would be falsified, as quantum mechanics explicitly postulates that the result of any measurement must be a single eigenvalue of a single eigenstate.

Criticism

Arnold Neumaier finds limitations with the applicability of the ensemble interpretation to small systems.
"Among the traditional interpretations, the statistical interpretation discussed by Ballentine in Rev. Mod. Phys. 42, 358-381 (1970) is the least demanding (assumes less than the Copenhagen interpretation and the Many Worlds interpretation) and the most consistent one. It explains almost everything, and only has the disadvantage that it explicitly excludes the applicability of QM to single systems or very small ensembles (such as the few solar neutrinos or top quarks actually detected so far), and does not bridge the gulf between the classical domain (for the description of detectors) and the quantum domain (for the description of the microscopic system)".
(spelling amended)
However, the "ensemble" of the ensemble interpretation is not directly related to a real, existing collection of actual particles, such as a few solar neutrinos, but it is concerned with the ensemble collection of a virtual set of experimental preparations repeated many times. This ensemble of experiments may include just one particle/one system or many particles/many systems. In this light, it is arguably, difficult to understand Neumaier's criticism, other than that Neumaier possibly misunderstands the basic premise of the ensemble interpretation itself.

Schrödinger's cat

The ensemble interpretation states that superpositions are nothing but subensembles of a larger statistical ensemble. That being the case, the state vector would not apply to individual cat experiments, but only to the statistics of many similarly prepared cat experiments. Proponents of this interpretation state that this makes the Schrödinger's cat paradox a trivial non-issue. However, the application of state vectors to individual systems, rather than ensembles, has claimed explanatory benefits, in areas like single-particle twin-slit experiments and quantum computing. As an avowedly minimalist approach, the ensemble interpretation does not offer any specific alternative explanation for these phenomena.

The frequentist probability variation

The claim that the wave functional approach fails to apply to single particle experiments cannot be taken as a claim that quantum mechanics fails in describing single-particle phenomena. In fact, it gives correct results within the limits of a probabilistic or stochastic theory.

Probability always requires a set of multiple data, and thus single-particle experiments are really part of an ensemble — an ensemble of individual experiments that are performed one after the other over time. In particular, the interference fringes seen in the double-slit experiment require repeated trials to be observed.

The quantum Zeno effect

Leslie Ballentine promoted the ensemble interpretation in his book Quantum Mechanics, A Modern Development. In it, he described what he called the "Watched Pot Experiment". His argument was that, under certain circumstances, a repeatedly measured system, such as an unstable nucleus, would be prevented from decaying by the act of measurement itself. He initially presented this as a kind of reductio ad absurdum of wave function collapse.

The effect has been shown to be real. Ballentine later wrote papers claiming that it could be explained without wave function collapse.
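A minimal numerical sketch of the watched-pot reasoning (a two-level model with illustrative parameters, not Ballentine's own calculation): if the undecayed state is remeasured N times during a fixed interval, the probability of never seeing a decay approaches one as N grows.

```python
import numpy as np

omega = 1.0          # transition (Rabi) frequency of the unstable two-level model
T = 2.0              # total observation time

def survival(n_measurements):
    """Probability the system is still found undecayed after n ideal measurements
    spread evenly over time T (short-time survival per step: cos^2(omega*T/n))."""
    dt = T / n_measurements
    return np.cos(omega * dt) ** (2 * n_measurements)

for n in (1, 10, 100, 1000):
    print(n, survival(n))
# The survival probability approaches 1 as measurements become more frequent:
# in this idealized limit, the "watched pot" never decays.
```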

Classical ensemble ideas

These views regard the randomness of the ensemble as fully defined by the preparation, neglecting the subsequent random contribution of the observing process. This neglect was particularly criticized by Bohr.

Einstein

Early proponents, for example Einstein, of statistical approaches regarded quantum mechanics as an approximation to a classical theory. John Gribbin writes:
"The basic idea is that each quantum entity (such as an electron or a photon) has precise quantum properties (such as position or momentum) and the quantum wavefunction is related to the probability of getting a particular experimental result when one member (or many members) of the ensemble is selected by an experiment"
But hopes for turning quantum mechanics back into a classical theory were dashed. Gribbin continues:
"There are many difficulties with the idea, but the killer blow was struck when individual quantum entities such as photons were observed behaving in experiments in line with the quantum wave function description. The Ensemble interpretation is now only of historical interest."
In 1936 Einstein wrote a paper, in German, in which, amongst other matters, he considered quantum mechanics in general conspectus.

He asked "How far does the ψ-function describe a real state of a mechanical system?" Following this, Einstein offers some argument that leads him to infer that "It seems to be clear, therefore, that the Born statistical interpretation of the quantum theory is the only possible one." At this point a neutral student may ask do Heisenberg and Bohr, considered respectively in their own rights, agree with that result? Born in 1971 wrote about the situation in 1936: "All theoretical physicists were in fact working with the statistical concept by then; this was particularly true of Niels Bohr and his school, who also made a vital contribution to the clarification of the concept."

Where, then, is disagreement between Bohr and Einstein on the statistical interpretation to be found? Not in the basic link between theory and experiment; they agree on the Born "statistical" interpretation. They disagree on the metaphysical question of the determinism or indeterminism of evolution of the natural world. Einstein believed in determinism while Bohr (and it seems many physicists) believed in indeterminism; the context is atomic and sub-atomic physics. It seems that this is a fine question. Physicists generally believe that the Schrödinger equation describes deterministic evolution for atomic and sub-atomic physics. Exactly how that might relate to the evolution of the natural world may be a fine question.

Objective-realist version

Willem de Muynck describes an "objective-realist" version of the ensemble interpretation featuring counterfactual definiteness and the "possessed values principle", in which values of the quantum mechanical observables may be attributed to the object as objective properties the object possesses independent of observation. He states that there are "strong indications, if not proofs" that neither is a possible assumption.

Causal loop

From Wikipedia, the free encyclopedia
Top: original billiard ball trajectory. Middle: the ball emerges from the future at a different trajectory from the original, and collides with its past self, changing its trajectory. Bottom: the changed trajectory causes the ball to enter and exit the time machine in exactly the same way that changed its trajectory. The changed trajectory is its own cause, without an origin.
 
A causal loop is a theoretical proposition in which, by means of either retrocausality or time travel, a sequence of events (actions, information, objects, people) is among the causes of another event, which is in turn among the causes of the first-mentioned event. Such causally looped events then exist in spacetime, but their origin cannot be determined. A hypothetical example of a causality loop is that of a billiard ball striking its past self: the billiard ball moves in a path towards a time machine, and the future self of the billiard ball emerges from the time machine before its past self enters it, giving its past self a glancing blow, altering the past ball's path and causing it to enter the time machine at an angle that would cause its future self to strike its past self the very glancing blow that altered its path. The question this paradox raises, then, is how the ball was struck in the first place.

Terminology in physics, philosophy, and fiction

Backwards time travel would allow for causal loops involving events, information, people or objects whose histories form a closed loop, and thus seem to "come from nowhere." The notion of objects or information that are "self-existing" in this way is often viewed as paradoxical, with several authors referring to a causal loop involving information or objects without origin as a bootstrap paradox, an information paradox, or an ontological paradox. The use of "bootstrap" in this context refers to the expression "pulling yourself up by your bootstraps" and to Robert A. Heinlein's time travel story "By His Bootstraps". The term "time loop" is sometimes used to refer to a causal loop, but although they appear similar, causal loops are unchanging and self-originating, whereas time loops are constantly resetting.

An example of a causal loop paradox involving information is given by Everett: suppose a time traveler copies a mathematical proof from a textbook, then travels back in time to meet the mathematician who first published the proof, at a date prior to publication, and allows the mathematician to simply copy the proof. In this case, the information in the proof has no origin. A similar example is given in the television series Doctor Who of a hypothetical time-traveler who copies Beethoven's music from the future and publishes it in Beethoven's time in Beethoven's name. Everett gives the movie Somewhere in Time as an example involving an object with no origin: an old woman gives a watch to a playwright who later travels back in time and meets the same woman when she was young, and gives her the same watch that she will later give to him.

Krasnikov writes that these bootstrap paradoxes – information or an object looping through time – are the same; the primary apparent paradox is a physical system evolving into a state in a way that is not governed by its laws. He does not find this paradoxical, and attributes problems regarding the validity of time travel to other factors in the interpretation of general relativity.

A 1992 paper by physicists Andrei Lossev and Igor Novikov labeled such items without origin as Jinn, with the singular term Jinnee. This terminology was inspired by the Jinn of the Quran, which are described as leaving no trace when they disappear. Lossev and Novikov allowed the term "Jinn" to cover both objects and information with reflexive origin; they called the former "Jinn of the first kind", and the latter "Jinn of the second kind". They point out that an object making circular passage through time must be identical whenever it is brought back to the past, otherwise it would create an inconsistency; the second law of thermodynamics seems to require that the object become more disordered over the course of its history, and such objects that are identical in repeating points in their history seem to contradict this, but Lossev and Novikov argued that since the second law only requires disorder to increase in closed systems, a Jinnee could interact with its environment in such a way as to regain lost order. They emphasize that there is no "strict difference" between Jinn of the first and second kind. Krasnikov equivocates between "Jinn", "self-sufficient loops", and "self-existing objects", calling them "lions" or "looping or intruding objects", and asserts that they are no less physical than conventional objects, "which, after all, also could appear only from either infinity, or a singularity."

The term predestination paradox is used in the Star Trek franchise to mean "a time loop in which a time traveler who has gone into the past causes an event that ultimately causes the original future version of the person to go back into the past." This use of the phrase was created for a sequence in a 1996 episode of Star Trek: Deep Space Nine titled "Trials and Tribble-ations", although the phrase had been used previously to refer to belief systems such as Calvinism and some forms of Marxism that encouraged followers to strive to produce certain outcomes while at the same time teaching that the outcomes were predetermined. Smeenk and Morgenstern use the term "predestination paradox" to refer specifically to situations in which a time traveler goes back in time to try to prevent some event in the past, but ends up helping to cause that same event.

Self-fulfilling prophecy

A self-fulfilling prophecy may be a form of causality loop only when the prophecy can be said to be truly known to occur, since only then do events in the future cause effects in the past. Otherwise, it would be a simple case of events in the past causing events in the future. Predestination does not necessarily involve a supernatural power, and could be the result of other "infallible foreknowledge" mechanisms. Problems arising from infallibility and influencing the future are explored in Newcomb's paradox. A notable fictional example of a self-fulfilling prophecy occurs in the classical play Oedipus Rex, in which Oedipus becomes the king of Thebes and in the process unwittingly fulfills a prophecy that he would kill his father and marry his mother. The prophecy itself serves as the impetus for his actions, and thus it is self-fulfilling. The movie 12 Monkeys deals heavily with themes of predestination and the Cassandra complex: the protagonist, who travels back in time, explains that he cannot change the past.

Novikov self-consistency principle

General relativity permits some exact solutions that allow for time travel. Some of these exact solutions describe universes that contain closed timelike curves, or world lines that lead back to the same point in spacetime. Physicist Igor Dmitriyevich Novikov discussed the possibility of closed timelike curves in his books in 1975 and 1983, offering the opinion that only self-consistent trips back in time would be permitted. In a 1990 paper by Novikov and several others, "Cauchy problem in spacetimes with closed timelike curves", the authors suggested the principle of self-consistency, which states that the only solutions to the laws of physics that can occur locally in the real Universe are those which are globally self-consistent. The authors later concluded that time travel need not lead to unresolvable paradoxes, regardless of what type of object was sent to the past.

Physicist Joseph Polchinski argued that one could avoid questions of free will by considering a potentially paradoxical situation involving a billiard ball sent back in time. In this situation, the ball is fired into a wormhole at an angle such that, if it continues along its course, it will exit in the past at just the right angle to hit its earlier self, knocking it off course, which would stop it from entering the wormhole in the first place. Thorne referred to this problem as "Polchinski's paradox". Two students at Caltech, Fernando Echeverria and Gunnar Klinkhammer, went on to find a solution that avoided any inconsistencies. In the revised scenario, the ball would emerge from the future at a different angle than the one that had generated the paradox, and delivers its past self a glancing blow instead of knocking it completely away from the wormhole. This blow changes its trajectory by just the right degree, meaning it will travel back in time with the angle required to deliver its younger self the necessary glancing blow. Echeverria and Klinkhammer actually found that there was more than one self-consistent solution, with slightly different angles for the glancing blow in each case. Later analysis by Thorne and Robert Forward showed that for certain initial trajectories of the billiard ball, there could actually be an infinite number of self-consistent solutions.

Echeverria, Klinkhammer and Thorne published a paper discussing these results in 1991; in addition, they reported that they had tried to see if they could find any initial conditions for the billiard ball for which there were no self-consistent extensions, but were unable to do so. Thus it is plausible that there exist self-consistent extensions for every possible initial trajectory, although this has not been proven. The lack of constraints on initial conditions only applies to spacetime outside of the chronology-violating region of spacetime; the constraints on the chronology-violating region might prove to be paradoxical, but this is not yet known.

Novikov's views are not widely accepted. Visser views causal loops and Novikov's self-consistency principle as an ad hoc solution, and supposes that there are far more damaging implications of time travel. Krasnikov similarly finds no inherent fault in causal loops, but finds other problems with time travel in general relativity.

Quantum computation with negative delay

Physicist David Deutsch showed in a 1991 paper that quantum computation with a negative delay (backwards time travel) could solve NP problems in polynomial time, and Scott Aaronson later extended this result to show that the model could also be used to solve PSPACE problems in polynomial time. Deutsch showed that quantum computation with a negative delay produces only self-consistent solutions, and that the chronology-violating region imposes constraints that are not apparent through classical reasoning. In 2014, researchers published a simulation validating Deutsch's model with photons. However, an article by Tolksdorf and Verch showed that Deutsch's CTC (closed timelike curve, or causal loop) fixed-point condition can be fulfilled to arbitrary precision in any quantum system described by relativistic quantum field theory on spacetimes where CTCs are excluded, casting doubt on whether Deutsch's condition is really characteristic of quantum processes mimicking CTCs in the sense of general relativity.
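Deutsch's consistency condition requires the state sent around the closed timelike curve to be a fixed point of the map induced by the interaction, ρ = Tr_CR[U(ρ_CR ⊗ ρ)U†]. A minimal NumPy sketch (the CNOT interaction and input state are arbitrary choices for illustration, not Deutsch's own example) finds such a fixed point by simple iteration.

```python
import numpy as np

def trace_out_first(rho_joint, d=2):
    """Trace out the first (chronology-respecting) qubit of a two-qubit state."""
    r = rho_joint.reshape(d, d, d, d)
    return np.einsum('ajak->jk', r)

# Chronology-respecting (CR) qubit enters the interaction region in state |+>.
plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho_cr = np.outer(plus, plus.conj())

# Illustrative interaction between the CR qubit (control) and the CTC qubit (target).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def deutsch_map(rho_ctc):
    """Deutsch's consistency map: rho -> Tr_CR[ U (rho_CR (x) rho) U^dagger ]."""
    joint = CNOT @ np.kron(rho_cr, rho_ctc) @ CNOT.conj().T
    return trace_out_first(joint)

# Iterate the map; Deutsch's condition demands a fixed point rho = deutsch_map(rho).
rho = np.outer([1, 0], [1, 0]).astype(complex)      # start the CTC qubit in |0><0|
for _ in range(20):
    rho = deutsch_map(rho)

print(np.round(rho, 3))                             # converges to the maximally mixed state
print(np.allclose(rho, deutsch_map(rho)))           # True: a self-consistent solution exists
```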

Indeterminism

From Wikipedia, the free encyclopedia

Indeterminism is the idea that events (or certain events, or events of certain types) are not caused, or not caused deterministically.

It is the opposite of determinism and related to chance. It is highly relevant to the philosophical problem of free will, particularly in the form of metaphysical libertarianism. In science, most specifically quantum theory in physics, indeterminism is the belief that no event is certain and the entire outcome of anything is probabilistic. The Heisenberg uncertainty relations and the "Born rule", proposed by Max Born, are often starting points in support of the indeterministic nature of the universe. Indeterminism is also asserted by Sir Arthur Eddington and Murray Gell-Mann, and was promoted in the French biologist Jacques Monod's essay "Chance and Necessity". The physicist and chemist Ilya Prigogine argued for indeterminism in complex systems.

Necessary but insufficient causation

Indeterminists do not have to deny that causes exist. Instead, they can maintain that the only causes that exist are of a type that do not constrain the future to a single course; for instance, they can maintain that only necessary and not sufficient causes exist. The necessary/sufficient distinction works as follows: 

If x is a necessary cause of y, then the presence of y implies that x definitely preceded it. The presence of x, however, does not imply that y will occur.

If x is a sufficient cause of y, then the presence of x implies that y will occur. (However, another cause z may alternatively cause y. Thus the presence of y does not imply the presence of x, or of z, or of any other particular cause.)

It is possible for everything to have a necessary cause, even while indeterminism holds and the future is open, because a necessary condition does not lead to a single inevitable effect. Indeterministic (or probabilistic) causation is a proposed possibility, such that "everything has a cause" is not a clear statement of determinism.

Probabilistic causation

Interpreting causation as a deterministic relation means that if A causes B, then A must always be followed by B. In this sense, war does not cause deaths, nor does smoking cause cancer. As a result, many turn to a notion of probabilistic causation. Informally, A probabilistically causes B if A's occurrence increases the probability of B. This is sometimes interpreted to reflect the imperfect knowledge of a deterministic system but other times interpreted to mean that the causal system under study has an inherently indeterministic nature. (Propensity probability is an analogous idea, according to which probabilities have an objective existence and are not just limitations in a subject's knowledge).
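A small simulated illustration of the informal definition (all probabilities here are invented for illustration): A probabilistically causes B when A's occurrence raises the probability of B, even though A neither guarantees nor is required for B.

```python
import random

random.seed(1)

def trial():
    """One simulated individual: exposure A raises, but does not determine, outcome B."""
    a = random.random() < 0.5                 # exposure occurs half the time
    p_b = 0.30 if a else 0.05                 # invented probabilities
    b = random.random() < p_b
    return a, b

trials = [trial() for _ in range(100_000)]
p_b = sum(b for _, b in trials) / len(trials)
p_b_given_a = sum(b for a, b in trials if a) / sum(a for a, _ in trials)

print(p_b_given_a > p_b)     # True: A raises the probability of B without necessitating it
```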

It can be proved that realizations of any probability distribution other than the uniform one can be obtained by applying a deterministic function (namely, the inverse distribution function) to a uniformly distributed, "absolutely random" variable; the probabilities are contained in the deterministic element. A simple way of demonstrating this would be shooting randomly within a square and then (deterministically) interpreting a relatively large subsquare as the more probable outcome.
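A standard illustration of this statement (assuming the exponential distribution as the target; any distribution with an invertible distribution function would do): applying the deterministic inverse distribution function to uniform random numbers reproduces the target distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)            # the "absolutely random" uniform variable

# Deterministic function: the inverse distribution function of Exp(1).
x = -np.log(1.0 - u)

# The deterministic transform has produced exponentially distributed samples:
print(x.mean(), x.var())                 # both close to 1, as expected for Exp(1)
```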

Intrinsic indeterminism versus unpredictability

A distinction is generally made between indeterminism and the mere inability to measure the variables (limits of precision). This is especially the case for physical indeterminism (as proposed by various interpretations of quantum mechanics). Yet some philosophers have argued that indeterminism and unpredictability are synonymous.

Philosophy

One of the important philosophical implications of determinism is that, according to incompatibilists, it undermines many versions of free will, and with them the sense of moral responsibility and the judgement of regret. If moral responsibility is irrelevant, murdering a man would be no different from drinking water when thirsty; yet the two acts are clearly morally distinct. In a deterministic world, an action such as murdering a man is the only thing that could have happened; the outcome in which the man is not murdered is literally impossible. As Kant argues, if our will is determined by antecedent causes, then we are no longer responsible for those actions, because they are determined by a force outside ourselves. Determinism thus disturbs the moral reality of our world, in which murdering a man is clearly wrong. According to William James in his "Dilemma of Determinism", the judgement of regret is likewise irrelevant in a deterministic world: we would have no logical reason to regret, to wish that an "impossible" event had happened in place of a "necessary" one, or to pass moral judgement on past events that could not have turned out any other way. On this view, our ability and willingness to pass the judgement of regret is itself evidence that the world is indeterministic and that the outcomes of events are genuinely uncertain. Bertrand Russell presents an argument against such antecedent causes in his essay "Elements of Ethics": when we are presented with two alternative choices, determinism maintains that our will to choose one of them is driven by an antecedent cause and that the other alternative is impossible, "but that does not prevent our will from being itself the cause of the other effects" (Russell). The fact that different possibilities can be caused and chosen by our will means that morality (right and wrong) can be distinguished among the choices, and the ability to judge the different possible outcomes is taken as evidence that moral responsibility exists and accords with indeterminism.

Ancient Greek philosophy

Leucippus

The oldest mention of the concept of chance is by the earliest philosopher of atomism, Leucippus, who said:
"The cosmos, then, became like a spherical form in this way: the atoms being submitted to a casual and unpredictable movement, quickly and incessantly".

Aristotle

Aristotle described four possible causes (material, efficient, formal, and final). Aristotle's word for these causes was αἰτίαι (aitiai, as in aetiology), which translates as causes in the sense of the multiple factors responsible for an event. Aristotle did not subscribe to the simplistic "every event has a (single) cause" idea that was to come later.

In his Physics and Metaphysics, Aristotle said there were accidents (συμβεβηκός, sumbebekos) caused by nothing but chance (τύχη, tukhe). He noted that he and the early physicists found no place for chance among their causes.
We have seen how far Aristotle distances himself from any view which makes chance a crucial factor in the general explanation of things. And he does so on conceptual grounds: chance events are, he thinks, by definition unusual and lacking certain explanatory features: as such they form the complement class to those things which can be given full natural explanations.
— R.J. Hankinson, "Causes" in Blackwell Companion to Aristotle
Aristotle opposed his accidental chance to necessity:
Nor is there any definite cause for an accident, but only chance (τυχόν), namely an indefinite (ἀόριστον) cause.
It is obvious that there are principles and causes which are generable and destructible apart from the actual processes of generation and destruction; for if this is not true, everything will be of necessity: that is, if there must necessarily be some cause, other than accidental, of that which is generated and destroyed. Will this be, or not? Yes, if this happens; otherwise not.

Pyrrhonism

The philosopher Sextus Empiricus described the Pyrrhonist position on causes as follows:
...if the arguments by which we show the existence of causes are plausible, and if those, too, are plausible which prove that it is incorrect to assert the existence of a cause, and if there is no way to give preference to any of these over others – since we have no agreed-upon sign, criterion, or proof, as has been pointed out earlier – then, if we go by the statements of the Dogmatists, it is necessary to suspend judgment about the existence of causes, too, saying that they are no more existent than non-existent

Epicureanism

Epicurus argued that as atoms moved through the void, there were occasions when they would "swerve" (clinamen) from their otherwise determined paths, thus initiating new causal chains. Epicurus argued that these swerves would allow us to be more responsible for our actions, something impossible if every action was deterministically caused. For Epicureanism, the occasional interventions of arbitrary gods would be preferable to strict determinism.

Early modern philosophy

In 1729, the Testament of Jean Meslier states:
"Matter, by virtue of its own active force, moves and acts in a blind manner."
Soon after, Julien Offroy de la Mettrie wrote in his L'Homme Machine (1748, published anonymously):
"Perhaps the cause of man's existence lies simply in existence itself? Perhaps he has been thrown by chance onto some point of this terrestrial surface, without any how or why."
In his Anti-Sénèque [Traité de la vie heureuse, par Sénèque, avec un Discours du traducteur sur le même sujet, 1750] we read:
"Then chance has thrown us into life."
In the 19th century the French philosopher Antoine-Augustin Cournot theorized chance in a new way, as a series of non-linear causes. He wrote in Essai sur les fondements de nos connaissances (1851):
"It is not because of rarity that chance is actual. On the contrary, it is because of chance that they produce many possible others."

Modern philosophy

Charles Peirce

Tychism (Greek: τύχη "chance") is a thesis proposed by the American philosopher Charles Sanders Peirce in the 1890s. It holds that absolute chance, also called spontaneity, is a real factor operative in the universe. It may be considered both the direct opposite of Albert Einstein's oft quoted dictum that: "God does not play dice with the universe" and an early philosophical anticipation of Werner Heisenberg's uncertainty principle.

Peirce does not, of course, assert that there is no law in the universe. On the contrary, he maintains that an absolutely chance world would be a contradiction and thus impossible. Complete lack of order is itself a sort of order. The position he advocates is rather that there are in the universe both regularities and irregularities.

Karl Popper comments that Peirce's theory received little contemporary attention, and that other philosophers did not adopt indeterminism until the rise of quantum mechanics.

Arthur Holly Compton

In 1931, Arthur Holly Compton championed the idea of human freedom based on quantum indeterminacy and invented the notion of amplification of microscopic quantum events to bring chance into the macroscopic world. In his somewhat bizarre mechanism, he imagined sticks of dynamite attached to his amplifier, anticipating the Schrödinger's cat paradox.

Reacting to criticisms that his ideas made chance the direct cause of our actions, Compton clarified the two-stage nature of his idea in an Atlantic Monthly article in 1955. First there is a range of random possible events, then one adds a determining factor in the act of choice.
A set of known physical conditions is not adequate to specify precisely what a forthcoming event will be. These conditions, insofar as they can be known, define instead a range of possible events from among which some particular event will occur. When one exercises freedom, by his act of choice he is himself adding a factor not supplied by the physical conditions and is thus himself determining what will occur. That he does so is known only to the person himself. From the outside one can see in his act only the working of physical law. It is the inner knowledge that he is in fact doing what he intends to do that tells the actor himself that he is free.
Compton welcomed the rise of indeterminism in 20th century science, writing:
In my own thinking on this vital subject I am in a much more satisfied state of mind than I could have been at any earlier stage of science. If the statements of the laws of physics were assumed correct, one would have had to suppose (as did most philosophers) that the feeling of freedom is illusory, or if [free] choice were considered effective, that the laws of physics ... [were] unreliable. The dilemma has been an uncomfortable one.
Together with Arthur Eddington in Britain, Compton was one of those rare distinguished physicists in the English-speaking world of the late 1920s and 1930s who argued for the "liberation of free will" with the help of Heisenberg's indeterminacy principle, but their efforts were met not only with physical and philosophical criticism but, above all, with fierce political and ideological campaigns.

Karl Popper

In his essay Of Clouds and Clocks, included in his book Objective Knowledge, Popper contrasted "clouds", his metaphor for indeterministic systems, with "clocks", meaning deterministic ones. He sided with indeterminism, writing
I believe Peirce was right in holding that all clocks are clouds to some considerable degree — even the most precise of clocks. This, I think, is the most important inversion of the mistaken determinist view that all clouds are clocks.
Popper was also a promoter of propensity probability.

Robert Kane

Kane is one of the leading contemporary philosophers on free will. Advocating what is termed within philosophical circles "libertarian freedom", Kane argues that "(1) the existence of alternative possibilities (or the agent's power to do otherwise) is a necessary condition for acting freely, and (2) determinism is not compatible with alternative possibilities (it precludes the power to do otherwise)". It is important to note that the crux of Kane's position is grounded not in a defense of alternative possibilities (AP) but in the notion of what Kane refers to as ultimate responsibility (UR). Thus, AP is a necessary but insufficient criterion for free will. It is necessary that there be (metaphysically) real alternatives for our actions, but that is not enough; our actions could be random without being in our control. The control is found in "ultimate responsibility".

What allows for ultimate responsibility of creation in Kane's picture are what he refers to as "self-forming actions" or SFAs — those moments of indecision during which people experience conflicting wills. These SFAs are the undetermined, regress-stopping voluntary actions or refrainings in the life histories of agents that are required for UR. UR does not require that every act done of our own free will be undetermined and thus that, for every act or choice, we could have done otherwise; it requires only that certain of our choices and actions be undetermined (and thus that we could have done otherwise), namely SFAs. These form our character or nature; they inform our future choices, reasons and motivations in action. If a person has had the opportunity to make a character-forming decision (SFA), he is responsible for the actions that are a result of his character.

Mark Balaguer

Mark Balaguer, in his book Free Will as an Open Scientific Problem, argues similarly to Kane. He believes that, conceptually, free will requires indeterminism, and that the question of whether the brain behaves indeterministically is open to further empirical research. He has also written on the subject in "A Scientifically Reputable Version of Indeterministic Libertarian Free Will".

Science

Mathematics

In probability theory, a stochastic process /stoʊˈkæstɪk/, or sometimes random process, is the counterpart to a deterministic process (or deterministic system). Instead of dealing with only one possible way the process might evolve over time (as is the case, for example, for solutions of an ordinary differential equation), in a stochastic or random process there is some indeterminacy in its future evolution, described by probability distributions. This means that even if the initial condition (or starting point) is known, there are many possible trajectories the process might follow, though some paths are more probable than others.
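A minimal sketch of the contrast, assuming a symmetric random walk as the stochastic process: repeated runs from the same starting point yield different trajectories, while a deterministic recurrence always yields the same one.

```python
import random

def random_walk(steps, start=0):
    # Stochastic process: each step is +1 or -1 with equal probability.
    position = start
    path = [position]
    for _ in range(steps):
        position += random.choice([-1, 1])
        path.append(position)
    return path

def deterministic_sequence(steps, start=0):
    # Deterministic counterpart: the same initial condition always
    # yields the same trajectory.
    return [start + k for k in range(steps + 1)]

print(random_walk(10))             # differs from run to run
print(random_walk(10))
print(deterministic_sequence(10))  # always identical
```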

Classical and relativistic physics

The idea that Newtonian physics proved causal determinism was highly influential in the early modern period. "Thus physical determinism [..] became the ruling faith among enlightened men; and everybody who did not embrace this new faith was held to be an obscurantist and a reactionary". However: "Newton himself may be counted among the few dissenters, for he regarded the solar system as imperfect, and consequently as likely to perish".

Classical chaos is not usually considered an example of indeterminism, as it can occur in deterministic systems such as the three-body problem.
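As an assumed toy illustration (not one given in the text), the logistic map is a fully deterministic rule whose trajectories nonetheless diverge rapidly from nearly identical initial conditions, which is why chaotic systems are unpredictable in practice without being indeterministic:

```python
def logistic_map(x0, r=4.0, steps=30):
    # Deterministic recurrence x_{n+1} = r * x_n * (1 - x_n).
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.200000)
b = logistic_map(0.200001)   # tiny change in the initial condition
for n in (0, 10, 20, 30):
    print(n, a[n], b[n])     # the two trajectories soon differ completely
```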

John Earman has argued that most physical theories are indeterministic. For instance, Newtonian physics admits solutions in which particles accelerate continuously, heading out towards infinity. Because the laws in question are time-reversible, particles could also head inwards, unprompted by any pre-existing state. He calls such hypothetical particles "space invaders".

John D. Norton has suggested another indeterministic scenario, known as Norton's Dome, where a particle is initially situated on the exact apex of a dome.
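For context, the example is usually stated with the radial equation of motion

$$ \frac{d^{2}r}{dt^{2}} = \sqrt{r}, $$

which, for a particle initially at rest at the apex ($r = 0$), admits not only the solution $r(t) = 0$ for all times but also, for any time $T$, the solution $r(t) = \tfrac{1}{144}(t - T)^{4}$ for $t \ge T$ (with $r(t) = 0$ beforehand), so the particle may spontaneously begin to slide at an arbitrary, undetermined moment.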

Branching space-time is a theory uniting indeterminism and the special theory of relativity. The idea was originated by Nuel Belnap. The equations of general relativity admit of both indeterministic and deterministic solutions.

Boltzmann

Ludwig Boltzmann was one of the founders of statistical mechanics and the modern atomic theory of matter. He is remembered for his discovery that the second law of thermodynamics is a statistical law stemming from disorder. He also speculated that the ordered universe we see is only a small bubble in a much larger sea of chaos; the Boltzmann brain is a similar idea. He can be considered one of the few indeterminists to embrace pure chance.
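Boltzmann's statistical reading of the second law is usually summarized by his entropy formula, quoted here as standard background rather than from the article:

$$ S = k_{B} \ln W, $$

where $W$ counts the microstates compatible with a given macrostate; entropy tends to increase because disordered macrostates correspond to vastly more microstates.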

Evolution and biology

Darwinian evolution has an enhanced reliance on the chance element of random mutation compared to the earlier evolutionary theory of Herbert Spencer. However, the question of whether evolution requires genuine ontological indeterminism is open to debate.

In his essay Chance and Necessity (1970), Jacques Monod rejected the role of final causation in biology, arguing instead that a mixture of efficient causation and "pure chance" leads to teleonomy, or merely apparent purposefulness.

The Japanese theoretical population geneticist Motoo Kimura emphasized the role of indeterminism in evolution. According to his neutral theory of molecular evolution: "at the molecular level most evolutionary change is caused by random drift of gene mutants that are equivalent in the face of selection."

Prigogine

In his 1997 book, The End of Certainty, Prigogine contends that determinism is no longer a viable scientific belief. "The more we know about our universe, the more difficult it becomes to believe in determinism." This is a major departure from the approach of Newton, Einstein and Schrödinger, all of whom expressed their theories in terms of deterministic equations. According to Prigogine, determinism loses its explanatory power in the face of irreversibility and instability.

Prigogine traces the dispute over determinism back to Darwin, whose attempt to explain individual variability according to evolving populations inspired Ludwig Boltzmann to explain the behavior of gases in terms of populations of particles rather than individual particles. This led to the field of statistical mechanics and the realization that gases undergo irreversible processes. In deterministic physics, all processes are time-reversible, meaning that they can proceed backward as well as forward through time. As Prigogine explains, determinism is fundamentally a denial of the arrow of time. With no arrow of time, there is no longer a privileged moment known as the "present," which follows a determined "past" and precedes an undetermined "future." All of time is simply given, with the future as determined or undetermined as the past. With irreversibility, the arrow of time is reintroduced to physics. Prigogine notes numerous examples of irreversibility, including diffusion, radioactive decay, solar radiation, weather and the emergence and evolution of life. Like weather systems, organisms are unstable systems existing far from thermodynamic equilibrium. Instability resists standard deterministic explanation. Instead, due to sensitivity to initial conditions, unstable systems can only be explained statistically, that is, in terms of probability.

Prigogine asserts that Newtonian physics has now been "extended" three times, first with the use of the wave function in quantum mechanics, then with the introduction of spacetime in general relativity and finally with the recognition of indeterminism in the study of unstable systems.

Quantum mechanics

At one time, it was assumed in the physical sciences that if the behavior observed in a system cannot be predicted, the problem is due to lack of fine-grained information, so that a sufficiently detailed investigation would eventually result in a deterministic theory ("If you knew exactly all the forces acting on the dice, you would be able to predict which number comes up").

However, the advent of quantum mechanics removed the underpinning from that approach, with the claim that (at least according to the Copenhagen interpretation) the most basic constituents of matter at times behave indeterministically. This comes from the collapse of the wave function, in which the state of a system upon measurement cannot in general be predicted. Quantum mechanics only predicts the probabilities of possible outcomes, which are given by the Born rule. Non-deterministic behavior upon wave function collapse is not only a feature of the Copenhagen interpretation, with its observer-dependence, but also of objective collapse theories.
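As a minimal sketch of the Born rule for a two-level system (the state below is chosen arbitrarily for illustration), outcome probabilities are the squared magnitudes of the complex amplitudes, and only these statistics, not the individual outcome, are predicted:

```python
import random

# Arbitrary normalized qubit state: complex amplitudes for outcomes 0 and 1.
amplitudes = [complex(3, 0) / 5, complex(0, 4) / 5]

# Born rule: probability of each outcome is |amplitude|^2.
probabilities = [abs(a) ** 2 for a in amplitudes]
print(probabilities)  # approximately [0.36, 0.64]

# A single measurement yields one outcome at random with those weights;
# only the statistics of many repetitions are predicted.
outcome = random.choices([0, 1], weights=probabilities)[0]
print(outcome)
```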

Opponents of quantum indeterminism suggested that determinism could be restored by formulating a new theory in which additional information, so-called hidden variables, would allow definite outcomes to be determined. For instance, in 1935, Einstein, Podolsky and Rosen wrote a paper titled "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?" arguing that such a theory was in fact necessary to preserve the principle of locality. In 1964, John S. Bell was able to define a theoretical test for these local hidden variable theories, which was reformulated as a workable experimental test through the work of Clauser, Horne, Shimony and Holt. The negative result of the 1980s tests by Alain Aspect ruled such theories out, provided certain assumptions about the experiment hold. Thus any interpretation of quantum mechanics, including deterministic reformulations, must either reject locality or reject counterfactual definiteness altogether. David Bohm's theory is the main example of a non-local deterministic quantum theory.
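The CHSH reformulation mentioned above compares a combination $S$ of correlations for two measurement settings on each side; as standard background, local hidden-variable theories obey the first bound, while quantum mechanics allows violations up to the second:

$$ |S| = |E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2 \ \text{(local hidden variables)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)} $$

Experiments such as Aspect's observed values above 2, in line with the quantum prediction.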

The many-worlds interpretation is said to be deterministic, but experimental results still cannot be predicted: experimenters do not know which 'world' they will end up in. Technically, counterfactual definiteness is lacking.

A notable consequence of quantum indeterminism is the Heisenberg uncertainty principle, which prevents the simultaneous accurate measurement of all a particle's properties.
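For position and momentum the principle takes the familiar form

$$ \Delta x \, \Delta p \ge \frac{\hbar}{2}, $$

so reducing the statistical spread of one quantity necessarily increases that of the other.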

Cosmology

Primordial fluctuations are density variations in the early universe which are considered the seeds of all structure in the universe. Currently, the most widely accepted explanation for their origin is in the context of cosmic inflation. According to the inflationary paradigm, the exponential growth of the scale factor during inflation caused quantum fluctuations of the inflaton field to be stretched to macroscopic scales, and, upon leaving the horizon, to "freeze in". At the later stages of radiation- and matter-domination, these fluctuations re-entered the horizon, and thus set the initial conditions for structure formation.

Neuroscience

Neuroscientists such as Bjoern Brembs and Christof Koch believe that thermodynamically stochastic processes in the brain are the basis of free will, and that even very simple organisms such as flies have a form of free will. Similar ideas are put forward by some philosophers, such as Robert Kane.

While recognizing indeterminism to be a necessary low-level prerequisite, Bjoern Brembs says that it is not even close to being sufficient for addressing questions of morality and responsibility. Edward O. Wilson does not extrapolate from bugs to people, and Corina E. Tarnita cautions against trying to draw parallels between people and insects, since human selflessness and cooperation are of a different sort, involving the interaction of culture and sentience, not just genetics and environment.

Other views

Against Einstein and others who advocated determinism, indeterminism—as championed by the English astronomer Sir Arthur Eddington—says that a physical object has an ontologically undetermined component that is not due to the epistemological limitations of physicists' understanding. The uncertainty principle, then, would not necessarily be due to hidden variables but to an indeterminism in nature itself.

Determinism and indeterminism are examined in Causality and Chance in Modern Physics by David Bohm. He speculates that, since determinism can emerge from underlying indeterminism (via the law of large numbers), and that indeterminism can emerge from determinism (for instance, from classical chaos), the universe could be conceived of as having alternating layers of causality and chaos.
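The law-of-large-numbers step can be stated briefly (standard background, not a quotation from Bohm): for independent, identically distributed random variables with mean $\mu$, the sample average converges to that mean,

$$ \frac{1}{n}\sum_{i=1}^{n} X_i \;\longrightarrow\; \mu \quad \text{as } n \to \infty, $$

so aggregate behavior can appear effectively deterministic even when each individual event is chancy.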
