
Saturday, November 2, 2024

Action at a distance

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Action_at_a_distance

Action at a distance is the concept in physics that an object's motion can be affected by another object without the two being in physical contact; that is, it is the concept of the non-local interaction of objects that are separated in space. Coulomb's law and Newton's law of universal gravitation are based on action at a distance.

Historically, action at a distance was the earliest scientific model for gravity and electricity, and it continues to be useful in many practical cases. In the 19th and 20th centuries, field models arose to explain these phenomena with more precision. The discovery of electrons and of special relativity led to new action-at-a-distance models providing an alternative to field theories. In our modern understanding, none of the four fundamental interactions (gravity, electromagnetism, the strong interaction, and the weak interaction) is described by action at a distance.

Categories of action

In the study of mechanics, action at a distance is one of three fundamental actions on matter that cause motion. The other two are direct impact (elastic or inelastic collisions) and actions in a continuous medium as in fluid mechanics or solid mechanics. Historically, physical explanations for particular phenomena have moved between these three categories over time as new models were developed.

Action-at-a-distance and actions in a continuous medium may be easily distinguished when the medium dynamics are visible, like waves in water or in an elastic solid. In the case of electricity or gravity, no medium is required. In the nineteenth century, criteria like the effect of actions on intervening matter, the observation of a time delay, the apparent storage of energy, or even the possibility of a plausible mechanical model for action transmission were all accepted as evidence against action at a distance. Aether theories were alternative proposals to replace apparent action-at-a-distance in gravity and electromagnetism, in terms of continuous action inside an (invisible) medium called "aether".

Direct impact of macroscopic objects seems visually distinguishable from action at a distance. If, however, the objects are built of atoms whose volume is not sharply defined and which interact through electric and magnetic forces, the distinction is less clear.

Roles

The concept of action at a distance plays multiple roles in physics, and it can co-exist with other models according to the needs of each physical problem.

One role is as a summary of physical phenomena, independent of any understanding of the cause of such an action. For example, astronomical tables of planetary positions can be compactly summarized using Newton's law of universal gravitation, which assumes the planets interact without contact or an intervening medium. As a summary of data, the concept does not need to be evaluated as a plausible physical model.

Action at a distance also acts as a model explaining physical phenomena even in the presence of other models. Again in the case of gravity, hypothesizing an instantaneous force between masses allows the return time of comets to be predicted as well as predicting the existence of previously unknown planets, like Neptune. These triumphs of physics predated the alternative more accurate model for gravity based on general relativity by many decades.

Introductory physics textbooks discuss central forces, like gravity, by models based on action-at-a-distance without discussing the cause of such forces or issues with them until the topics of relativity and fields are discussed. For example, see The Feynman Lectures on Physics on gravity.

History

Early inquiries into motion

Action-at-a-distance as a physical concept requires identifying objects, distances, and their motion. In antiquity, ideas about the natural world were not organized in these terms. Objects in motion were modeled as living beings. Around 1600, the scientific method began to take root. René Descartes held a more fundamental view, developing ideas of matter and action independent of theology. Galileo Galilei wrote about experimental measurements of falling and rolling objects. Johannes Kepler's laws of planetary motion summarized Tycho Brahe's astronomical observations. Many experiments with electrical and magnetic materials led to new ideas about forces. These efforts set the stage for Newton's work on forces and gravity.

Newtonian gravity

In 1687 Isaac Newton published his Principia which combined his laws of motion with a new mathematical analysis able to reproduce Kepler's empirical results. His explanation was in the form of a law of universal gravitation: any two bodies are attracted by a force proportional to their mass and inversely proportional to the square of the distance between them. Thus the motions of planets were predicted by assuming forces working over great distances.
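In modern symbolic form (notation Newton himself did not use), the law reads:

```latex
F = G\,\frac{m_1 m_2}{r^2}
```

where G is the gravitational constant, m₁ and m₂ are the masses, and r is the distance between them.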

This mathematical expression of the force did not imply a cause. Newton considered action-at-a-distance to be an inadequate model for gravity; in his own words, it was:

so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it.

— Isaac Newton, Letters to Bentley, 1692/3

Metaphysical scientists of the early 1700s strongly objected to the unexplained action-at-a-distance in Newton's theory. Gottfried Wilhelm Leibniz complained that the mechanism of gravity was "invisible, intangible, and not mechanical". Moreover, initial comparisons with astronomical data were not favorable. As mathematical techniques improved throughout the 1700s, the theory showed increasing success, predicting the date of the return of Halley's comet and aiding the discovery of planet Neptune in 1846. These successes and the increasingly empirical focus of science towards the 19th century led to acceptance of Newton's theory of gravity despite distaste for action-at-a-distance.

Electrical action at a distance

Jean-Antoine Nollet reproducing Stephen Gray's "electric boy" experiment, in which a boy hanging from insulating silk ropes is given an electric charge. A group is gathered around; a woman is encouraged to bend forward and poke the boy's nose, to get an electric shock.

Electrical and magnetic phenomena also began to be explored systematically in the early 1600s. William Gilbert's early theory of "electric effluvia," a kind of electric atmosphere, ruled out action-at-a-distance on the grounds that "no action can be performed by matter save by contact". However, subsequent experiments, especially those by Stephen Gray, showed electrical effects over distance. Gray developed an experiment called the "electric boy" demonstrating electric transfer without direct contact. Franz Aepinus was the first to show, in 1759, that a theory of action at a distance for electricity provides a simpler replacement for the electric-effluvia theory. Despite this success, Aepinus himself considered the nature of the forces to be unexplained: he did "not approve of the doctrine which assumes the possibility of action at a distance", setting the stage for a shift to theories based on aether.

By 1785 Charles-Augustin de Coulomb showed that two electric charges at rest experience a force inversely proportional to the square of the distance between them, a result now called Coulomb's law. The striking similarity to gravity strengthened the case for action at a distance, at least as a mathematical model.
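In modern notation, with kₑ the Coulomb constant:

```latex
F = k_e\,\frac{q_1 q_2}{r^2}
```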

As mathematical methods improved, especially through the work of Pierre-Simon Laplace, Joseph-Louis Lagrange, and Siméon Denis Poisson, more sophisticated mathematical methods began to influence the thinking of scientists. The concept of potential energy applied to small test particles led to the concept of a scalar field, a mathematical model representing the forces throughout space. While this mathematical model is not a mechanical medium, the mental picture of such a field resembles a medium.

Fields as an alternative

Glazed frame, containing "Delineation of Lines of Magnetic Force by Iron filings" prepared by Michael Faraday

Michael Faraday was the first to suggest that action at a distance was inadequate as an account of electric and magnetic forces, even in the form of a (mathematical) potential field. Faraday, an empirical experimentalist, cited three reasons in support of some medium transmitting electrical force: 1) electrostatic induction across an insulator depends on the nature of the insulator, 2) cutting a charged insulator causes opposite charges to appear on each half, and 3) electric discharge sparks are curved at an insulator. From these reasons he concluded that the particles of an insulator must be polarized, with each particle contributing to continuous action. He also experimented with magnets, demonstrating lines of force made visible by iron filings. However, in both cases his field-like model depends on particles that interact through an action-at-a-distance: his mechanical field-like model has no more fundamental physical cause than the long-range central field model.

Faraday's observations, as well as others, led James Clerk Maxwell to a breakthrough formulation in 1865: a set of equations that combined electricity and magnetism, both static and dynamic, and which included electromagnetic radiation – light. Maxwell started with elaborate mechanical models but ultimately produced a purely mathematical treatment using dynamical vector fields. The sense that these fields must be set to vibrate to propagate light set off a search for a medium of propagation; the medium was called the luminiferous aether, or simply the aether.

In 1873 Maxwell addressed action at a distance explicitly. He reviews Faraday's lines of force, carefully pointing out that Faraday himself did not provide a mechanical model of these lines in terms of a medium. Nevertheless the many properties of these lines of force imply these "lines must not be regarded as mere mathematical abstractions". Faraday himself viewed these lines of force as a model, a "valuable aid" to the experimentalist, a means to suggest further experiments.

In distinguishing between different kinds of action, Faraday suggested three criteria: 1) do additional material objects alter the action? 2) does the action take time? and 3) does it depend upon the receiving end? For electricity, Faraday knew that all three criteria were met, but gravity was thought to meet only the third. After Maxwell's time a fourth criterion, the transmission of energy, was added, thought also to apply to electricity but not gravity. With the advent of new theories of gravity, the modern account would give gravity all of the criteria except dependence on additional objects.

Fields fade into spacetime

The success of Maxwell's field equations led to numerous efforts in the later decades of the 19th century to represent electrical, magnetic, and gravitational fields, primarily with mechanical models. No model emerged that explained all the existing phenomena. In particular, no good model emerged for stellar aberration, the shift in the apparent position of stars with the Earth's relative velocity. The best models required the aether to be stationary while the Earth moved, but experimental efforts to measure the effect of Earth's motion through the aether found no effect.

In 1892 Hendrik Lorentz proposed a modified aether based on the emerging microscopic molecular model rather than the strictly macroscopic continuous theory of Maxwell. Lorentz investigated the mutual interaction of solitary moving electrons within a stationary aether. He rederived Maxwell's equations in this way but, critically, in the process he changed them to represent the wave in the coordinates of the moving electrons. He showed that the wave equations had the same form if they were transformed using a particular scaling factor, √(1 − v²/c²), where v is the velocity of the moving electrons and c is the speed of light. Lorentz noted that if this factor were applied as a length contraction to moving matter in a stationary aether, it would eliminate any effect of motion through the aether, in agreement with experiment.

In 1899, Henri Poincaré questioned the existence of an aether, showing that the principle of relativity prohibits the absolute motion assumed by proponents of the aether model. He named the transformation used by Lorentz the Lorentz transformation, but interpreted it as a transformation between two inertial frames with relative velocity v. This transformation makes the electromagnetic equations look the same in every uniformly moving inertial frame. Then, in 1905, Albert Einstein demonstrated that the principle of relativity, applied to the relativity of simultaneity and the constancy of the speed of light, precisely predicts the Lorentz transformation. This theory of special relativity quickly became the modern concept of spacetime.
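For reference, in modern notation the Lorentz transformation between frames in relative motion with velocity v along x reads:

```latex
x' = \gamma\,(x - vt), \qquad
t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```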

Thus the aether model, initially so very different from action at a distance, slowly changed to resemble simple empty space.

In 1905, Poincaré proposed gravitational waves, emanating from a body and propagating at the speed of light, as being required by the Lorentz transformations, and suggested that, in analogy to an accelerating electrical charge producing electromagnetic waves, accelerated masses in a relativistic field theory of gravity should produce gravitational waves. However, until 1915 gravity stood apart as a force still described by action-at-a-distance. In that year, Einstein showed that a field theory of spacetime, general relativity, can explain gravity in a manner consistent with relativity. New effects resulting from this theory were dramatic for cosmology but minor for planetary motion and physics on Earth. Einstein himself noted Newton's "enormous practical success".

Modern action at a distance

In the early decades of the 20th century, Karl Schwarzschild, Hugo Tetrode, and Adriaan Fokker independently developed non-instantaneous models for action at a distance consistent with special relativity. In 1949 John Archibald Wheeler and Richard Feynman built on these models to develop a new field-free theory of electromagnetism. While Maxwell's field equations are generally successful, the Lorentz model of a moving electron interacting with the field encounters mathematical difficulties: the self-energy of the moving point charge within the field is infinite. The Wheeler–Feynman absorber theory of electromagnetism avoids the self-energy issue. They interpret the Abraham–Lorentz force, the apparent force resisting electron acceleration, as a real force returning from all the other existing charges in the universe.

The Wheeler–Feynman theory has inspired new thinking about the arrow of time and about the nature of quantum non-locality. The theory has implications for cosmology; it has been extended to quantum mechanics. A similar approach has been applied to develop an alternative theory of gravity consistent with general relativity. John G. Cramer has extended the Wheeler–Feynman ideas to create the transactional interpretation of quantum mechanics.

"Spooky action at a distance"

Albert Einstein wrote to Max Born about issues in quantum mechanics in 1947, using a phrase translated as "spooky action at a distance"; in 1964, John Stewart Bell proved that quantum mechanics predicts stronger statistical correlations between the outcomes of certain far-apart measurements than any local theory possibly could. The phrase has since been picked up and used as a description for the cause of non-classical correlations between physically separated measurements of entangled quantum states. The correlations are predicted by quantum mechanics (Bell's theorem) and verified by experiments (Bell tests). Rather than a postulate like Newton's gravitational force, this use of "action-at-a-distance" concerns observed correlations which cannot be explained with localized particle-based models. Describing these correlations as "action-at-a-distance" requires assuming that the particles became entangled and then traveled to distant locations, an assumption that is not required by quantum mechanics.

Force in quantum field theory

Quantum field theory does not need action at a distance. At the most fundamental level, only four forces are needed. Each force is described as resulting from the exchange of specific bosons. Two are short range: the strong interaction, mediated by mesons, and the weak interaction, mediated by the weak bosons; two are long range: electromagnetism, mediated by the photon, and gravity, hypothesized to be mediated by the graviton. However, the entire concept of force is of secondary concern in advanced modern particle physics. Energy forms the basis of physical models, and the word action has shifted away from implying a force to a specific technical meaning: an integral over the difference between kinetic and potential energy.
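In symbols, this technical meaning is the classical action, the time integral of the Lagrangian:

```latex
S = \int_{t_1}^{t_2} L\,dt , \qquad L = T - V
```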

Many-worlds interpretation

The quantum-mechanical "Schrödinger's cat" paradox according to the many-worlds interpretation. In this interpretation, every quantum event is a branch point; the cat is both alive and dead, even before the box is opened, but the "alive" and "dead" cats are in different branches of the multiverse, both of which are equally real, but which do not interact with each other.

The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wave function collapse. This implies that all possible outcomes of quantum measurements are physically realized in some "world". The evolution of reality as a whole in MWI is rigidly deterministic and local. Many-worlds is also called the relative state formulation or the Everett interpretation, after physicist Hugh Everett, who first proposed it in 1957. Bryce DeWitt popularized the formulation and named it many-worlds in the 1970s.

In modern versions of many-worlds, the subjective appearance of wave function collapse is explained by the mechanism of quantum decoherence. Decoherence approaches to interpreting quantum theory have been widely explored and developed since the 1970s. MWI is considered a mainstream interpretation of quantum mechanics, along with the other decoherence interpretations, the Copenhagen interpretation, and hidden variable theories such as Bohmian mechanics.

The many-worlds interpretation implies that there are many parallel, non-interacting worlds. It is one of a number of multiverse hypotheses in physics and philosophy. MWI views time as a many-branched tree, wherein every possible quantum outcome is realized. This is intended to resolve the measurement problem and thus some paradoxes of quantum theory, such as Wigner's friend, the EPR paradox and Schrödinger's cat, since every possible outcome of a quantum event exists in its own world.

Overview of the interpretation

The many-worlds interpretation's key idea is that the linear and unitary dynamics of quantum mechanics applies everywhere and at all times, and so describes the whole universe. In particular, it models a measurement as a unitary transformation, a correlation-inducing interaction, between observer and object, without using a collapse postulate, and models observers as ordinary quantum-mechanical systems. This stands in contrast to the Copenhagen interpretation, in which a measurement is a "primitive" concept, not describable by unitary quantum mechanics; there the universe is divided into a quantum and a classical domain, and the collapse postulate is central. In MWI there is no division between classical and quantum: everything is quantum and there is no collapse. MWI's main conclusion is that the universe (or multiverse, in this context) is composed of a quantum superposition of an uncountable or undefinable number of increasingly divergent, non-communicating parallel universes or quantum worlds. Sometimes dubbed Everett worlds, each is an internally consistent and actualized alternative history or timeline.

The many-worlds interpretation uses decoherence to explain the measurement process and the emergence of a quasi-classical world. Wojciech H. Zurek, one of decoherence theory's pioneers, said: "Under scrutiny of the environment, only pointer states remain unchanged. Other states decohere into mixtures of stable pointer states that can persist, and, in this sense, exist: They are einselected." Zurek emphasizes that his work does not depend on a particular interpretation.

The many-worlds interpretation shares many similarities with the decoherent histories interpretation, which also uses decoherence to explain the process of measurement or wave function collapse. MWI treats the other histories or worlds as real, since it regards the universal wave function as the "basic physical entity" or "the fundamental entity, obeying at all times a deterministic wave equation". The decoherent histories interpretation, on the other hand, needs only one of the histories (or worlds) to be real.

Several authors, including Everett, John Archibald Wheeler and David Deutsch, call many-worlds a theory or metatheory, rather than just an interpretation. Everett argued that it was the "only completely coherent approach to explaining both the contents of quantum mechanics and the appearance of the world." Deutsch dismissed the idea that many-worlds is an "interpretation", saying that to call it an interpretation "is like talking about dinosaurs as an 'interpretation' of fossil records."

Formulation

In his 1957 doctoral dissertation, Everett proposed that, rather than relying on external observation for analysis of isolated quantum systems, one could mathematically model an object, as well as its observers, as purely physical systems within the mathematical framework developed by Paul Dirac, John von Neumann, and others, discarding altogether the ad hoc mechanism of wave function collapse.

Relative state

Everett's original work introduced the concept of a relative state. Two (or more) subsystems, after a general interaction, become correlated, or as is now said, entangled. Everett noted that such entangled systems can be expressed as a sum of products of states, where the two or more subsystems are each in a state relative to the others. After a measurement or observation, one member of the pair (or triple, and so on) is the measured, object, or observed system, and another member is the measuring apparatus (which may include an observer) having recorded the state of the measured system. Each product of subsystem states in the overall superposition evolves over time independently of the other products. Once the subsystems interact, their states become correlated or entangled and can no longer be considered independent. In Everett's terminology, each subsystem state was now correlated with its relative state, since each subsystem must now be considered relative to the other subsystems with which it has interacted.

In the example of Schrödinger's cat, after the box is opened, the entangled system is the cat, the poison vial and the observer. One relative triple of states would be the alive cat, the unbroken vial and the observer seeing an alive cat. Another relative triple of states would be the dead cat, the broken vial and the observer seeing a dead cat.

In the example of a measurement of a continuous variable (e.g., position q), the object-observer system decomposes into a continuum of pairs of relative states: the object system's relative state becomes a Dirac delta function centered on a particular value of q, and the corresponding observer relative state represents an observer having recorded that value of q. The members of each pair of relative states are, post measurement, correlated with each other.

In Everett's scheme, there is no collapse; instead, the Schrödinger equation, or its relativistic quantum-field-theory analog, holds all the time, everywhere. An observation or measurement is modeled by applying the wave equation to the entire system, comprising the object being observed and the observer. One consequence is that every observation causes the combined observer–object wavefunction to change into a quantum superposition of two or more non-interacting branches.

Thus the process of measurement or observation, or any correlation-inducing interaction, splits the system into sets of relative states, where each set of relative states, forming a branch of the universal wave function, is consistent within itself, and all future measurements (including by multiple observers) will confirm this consistency.
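To make this concrete, here is a minimal numerical sketch (an illustration of the idea, not anything from Everett's papers) in Python/NumPy, modeling a measurement as a purely unitary, CNOT-like interaction between a two-level object and a two-level observer:

```python
# A minimal sketch of the relative-state picture: a two-level "object"
# in superposition interacts unitarily with a two-level "observer",
# producing two correlated, non-interacting branches -- no collapse.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Object starts in a superposition; observer starts "ready" (|0>).
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
obj = alpha * ket0 + beta * ket1
obs = ket0

# Pre-measurement product state of the combined system (basis |obj,obs>).
psi = np.kron(obj, obs)

# Model the measurement as a CNOT-like unitary: the observer's state
# becomes correlated with the object's state.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi_after = CNOT @ psi

# Result: alpha*|0>|sees 0> + beta*|1>|sees 1>, a sum of products, each
# pairing an object state with the observer state relative to it.
print(psi_after)  # approximately [0.7071 0. 0. 0.7071]
```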

Renamed many-worlds

Everett had referred to the combined observer–object system as split by an observation, each split corresponding to the different or multiple possible outcomes of an observation. These splits generate a branching tree, where each branch is a set of all the states relative to each other. Bryce DeWitt popularized Everett's work with a series of publications calling it the Many Worlds Interpretation. Focusing on the splitting process, DeWitt introduced the term "world" to describe a single branch of that tree, which is a consistent history. All observations or measurements within any branch are consistent within themselves.

Since many observation-like events have happened and are constantly happening, Everett's model implies that there are an enormous and growing number of simultaneously existing states or "worlds".

Properties

MWI removes the observer-dependent role in the quantum measurement process by replacing wave function collapse with the established mechanism of quantum decoherence. As the observer's role lies at the heart of all "quantum paradoxes" such as the EPR paradox and von Neumann's "boundary problem", this provides a clearer and easier approach to their resolution.

Since the Copenhagen interpretation requires the existence of a classical domain beyond the one described by quantum mechanics, it has been criticized as inadequate for the study of cosmology. While there is no evidence that Everett was inspired by issues of cosmology, he developed his theory with the explicit goal of allowing quantum mechanics to be applied to the universe as a whole, hoping to stimulate the discovery of new phenomena. This hope has been realized in the later development of quantum cosmology.

MWI is a realist, deterministic and local theory. It achieves this by removing wave function collapse, which is indeterministic and nonlocal, from the deterministic and local equations of quantum theory.

MWI (like other, broader multiverse theories) provides a context for the anthropic principle, which may provide an explanation for the fine-tuned universe.

MWI depends crucially on the linearity of quantum mechanics, which underpins the superposition principle. If the final theory of everything is non-linear with respect to wavefunctions, then many-worlds is invalid. All quantum field theories are linear and compatible with the MWI, a point Everett emphasized as a motivation for the MWI. While quantum gravity or string theory may be non-linear in this respect, there is as yet no evidence of this.

Alternative to wavefunction collapse

As with the other interpretations of quantum mechanics, the many-worlds interpretation is motivated by behavior that can be illustrated by the double-slit experiment. When particles of light (or anything else) pass through the double slit, a calculation assuming wavelike behavior of light can be used to identify where the particles are likely to be observed. Yet when the particles are observed in this experiment, they appear as particles (i.e., at definite places) and not as non-localized waves.
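As an aside on that calculation, here is a minimal sketch (illustrative wavelength, slit separation, and geometry, not values from the text) of the wavelike prediction for where particles are likely to be observed:

```python
# Two-slit interference: the detection probability at screen position x
# follows from summing the amplitudes contributed by the two slits.
import numpy as np

lam, d, L = 500e-9, 1e-4, 1.0        # wavelength, slit separation, screen distance (m)
x = np.linspace(-0.02, 0.02, 9)      # sample detector positions on the screen (m)

# Path difference ~ d*x/L for small angles; add the two slit amplitudes.
phase = 2 * np.pi * d * x / (lam * L)
amplitude = 1 + np.exp(1j * phase)   # slit 1 plus slit 2
prob = np.abs(amplitude) ** 2        # detection probability ~ |amplitude|^2

print(prob / prob.max())             # bright and dark fringes
```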

Some versions of the Copenhagen interpretation of quantum mechanics proposed a process of "collapse" in which an indeterminate quantum system would probabilistically collapse onto, or select, just one determinate outcome to "explain" this phenomenon of observation. Wave function collapse was widely regarded as artificial and ad hoc, so an alternative interpretation in which the behavior of measurement could be understood from more fundamental physical principles was considered desirable.

Everett's PhD work provided such an interpretation. He argued that for a composite system—such as a subject (the "observer" or measuring apparatus) observing an object (the "observed" system, such as a particle)—the claim that either the observer or the observed has a well-defined state is meaningless; in modern parlance, the observer and the observed have become entangled: we can only specify the state of one relative to the other, i.e., the state of the observer and the observed are correlated after the observation is made. This led Everett to derive from the unitary, deterministic dynamics alone (i.e., without assuming wave function collapse) the notion of a relativity of states.

Everett noticed that the unitary, deterministic dynamics alone entailed that after an observation is made each element of the quantum superposition of the combined subject–object wave function contains two "relative states": a "collapsed" object state and an associated observer who has observed the same collapsed outcome; what the observer sees and the state of the object have become correlated by the act of measurement or observation. The subsequent evolution of each pair of relative subject–object states proceeds with complete indifference as to the presence or absence of the other elements, as if wave function collapse has occurred, which has the consequence that later observations are always consistent with the earlier observations. Thus the appearance of the object's wave function's collapse has emerged from the unitary, deterministic theory itself. (This answered Einstein's early criticism of quantum theory: that the theory should define what is observed, not for the observables to define the theory.) Since the wave function appears to have collapsed then, Everett reasoned, there was no need to actually assume that it had collapsed. And so, invoking Occam's razor, he removed the postulate of wave function collapse from the theory.

Testability

In 1985, David Deutsch proposed a variant of the Wigner's friend thought experiment as a test of many-worlds versus the Copenhagen interpretation. It consists of an experimenter (Wigner's friend) making a measurement on a quantum system in an isolated laboratory, and another experimenter (Wigner) who would make a measurement on the first one. According to the many-worlds theory, the first experimenter would end up in a macroscopic superposition of seeing one result of the measurement in one branch, and another result in another branch. The second experimenter could then interfere these two branches in order to test whether the first experimenter is in fact in a macroscopic superposition or has collapsed into a single branch, as predicted by the Copenhagen interpretation. Since then Lockwood, Vaidman, and others have made similar proposals, which require placing macroscopic objects in a coherent superposition and interfering them, a task currently beyond experimental capability.

Probability and the Born rule

Since the many-worlds interpretation's inception, physicists have been puzzled about the role of probability in it. As put by Wallace, there are two facets to the question: the incoherence problem, which asks why we should assign probabilities at all to outcomes that are certain to occur in some worlds, and the quantitative problem, which asks why the probabilities should be given by the Born rule.
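For reference, the rule in question: writing the post-measurement superposition over outcomes i, the Born rule assigns each outcome the squared magnitude of its amplitude:

```latex
|\psi\rangle = \sum_i c_i\,|i\rangle
\quad\Longrightarrow\quad
P(i) = |c_i|^2
```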

Everett tried to answer these questions in the paper that introduced many-worlds. To address the incoherence problem, he argued that an observer who makes a sequence of measurements on a quantum system will in general have an apparently random sequence of results in their memory, which justifies the use of probabilities to describe the measurement process. To address the quantitative problem, Everett proposed a derivation of the Born rule based on the properties that a measure on the branches of the wave function should have. His derivation has been criticized as relying on unmotivated assumptions. Since then several other derivations of the Born rule in the many-worlds framework have been proposed. There is no consensus on whether this has been successful.

Frequentism

DeWitt and Graham, and Farhi et al., among others, have proposed derivations of the Born rule based on a frequentist interpretation of probability. They try to show that in the limit of infinitely many measurements, no worlds would have relative frequencies that didn't match the probabilities given by the Born rule, but these derivations have been shown to be mathematically incorrect.

Decision theory

A decision-theoretic derivation of the Born rule was produced by David Deutsch (1999) and refined by Wallace and Saunders. They consider an agent who takes part in a quantum gamble: the agent makes a measurement on a quantum system, branches as a consequence, and each of the agent's future selves receives a reward that depends on the measurement result. The agent uses decision theory to evaluate the price they would pay to take part in such a gamble, and concludes that the price is given by the utility of the rewards weighted according to the Born rule. Some reviews have been positive, although these arguments remain highly controversial; some theoretical physicists have taken them as supporting the case for parallel universes. For example, a New Scientist story on a 2007 conference about Everettian interpretations quoted physicist Andy Albrecht as saying, "This work will go down as one of the most important developments in the history of science." In contrast, the philosopher Huw Price, also attending the conference, found the Deutsch–Wallace–Saunders approach fundamentally flawed.

Symmetries and invariance

In 2005, Zurek produced a derivation of the Born rule based on the symmetries of entangled states; Schlosshauer and Fine argue that Zurek's derivation is not rigorous, as it does not define what probability is and has several unstated assumptions about how it should behave.

In 2016, Charles Sebens and Sean M. Carroll, building on work by Lev Vaidman, proposed a similar approach based on self-locating uncertainty. In this approach, decoherence creates multiple identical copies of observers, who can assign credences to being on different branches using the Born rule. The Sebens–Carroll approach has been criticized by Adrian Kent, and Vaidman does not find it satisfactory.

Branch counting

In 2021, Simon Saunders produced a branch counting derivation of the Born rule. The crucial feature of this approach is to define the branches so that they all have the same magnitude or 2-norm. The ratios of the numbers of branches thus defined give the probabilities of the various outcomes of a measurement, in accordance with the Born rule.
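Schematically, the idea can be illustrated as follows (a simplified sketch, not Saunders's full construction, assuming for simplicity that each |cᵢ|² is rational): split each outcome's term into micro-branches of equal 2-norm, so that branch ratios reproduce the Born weights:

```latex
|\psi\rangle = \sum_i c_i\,|i\rangle
             = \sum_i \sum_{k=1}^{n_i} \frac{1}{\sqrt{N}}\,|i,k\rangle ,
\qquad
\frac{n_i}{N} = |c_i|^2
```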

The preferred basis problem

As originally formulated by Everett and DeWitt, the many-worlds interpretation had a privileged role for measurements: they determined which basis of a quantum system would give rise to the eponymous worlds. Without this the theory was ambiguous, as a quantum state can equally well be described (e.g.) as having a well-defined position or as being a superposition of two delocalized states. The assumption is that the preferred basis to use is the one which assigns a unique measurement outcome to each world. This special role for measurements is problematic for the theory, as it contradicts Everett and DeWitt's goal of having a reductionist theory and undermines their criticism of the ill-defined measurement postulate of the Copenhagen interpretation. This is known today as the preferred basis problem.

The preferred basis problem has been solved, according to Saunders and Wallace, among others, by incorporating decoherence into the many-worlds theory. In this approach, the preferred basis does not have to be postulated, but rather is identified as the basis stable under environmental decoherence. In this way measurements no longer play a special role; rather, any interaction that causes decoherence causes the world to split. Since decoherence is never complete, there will always remain some infinitesimal overlap between two worlds, making it arbitrary whether a pair of worlds has split or not. Wallace argues that this is not problematic: it only shows that worlds are not a part of the fundamental ontology, but rather of the emergent ontology, where these approximate, effective descriptions are routine in the physical sciences. Since in this approach the worlds are derived, it follows that they must be present in any other interpretation of quantum mechanics that does not have a collapse mechanism, such as Bohmian mechanics.

This approach to deriving the preferred basis has been criticized as creating circularity with derivations of probability in the many-worlds interpretation, as decoherence theory depends on probability and probability depends on the ontology derived from decoherence. Wallace contends that decoherence theory depends not on probability but only on the notion that one is allowed to do approximations in physics.

History

MWI originated in Everett's Princeton University PhD thesis "The Theory of the Universal Wave Function", developed under his thesis advisor John Archibald Wheeler, a shorter summary of which was published in 1957 under the title "Relative State Formulation of Quantum Mechanics" (Wheeler contributed the title "relative state"; Everett originally called his approach the "Correlation Interpretation", where "correlation" refers to quantum entanglement). The phrase "many-worlds" is due to Bryce DeWitt, who was responsible for the wider popularization of Everett's theory, which had been largely ignored for a decade after publication in 1957.

Everett's proposal was not without precedent. In 1952, Erwin Schrödinger gave a lecture in Dublin in which at one point he jocularly warned his audience that what he was about to say might "seem lunatic". He went on to assert that while the Schrödinger equation seemed to be describing several different histories, they were "not alternatives but all really happen simultaneously". According to David Deutsch, this is the earliest known reference to many-worlds; Jeffrey A. Barrett describes it as indicating the similarity of "general views" between Everett and Schrödinger. Schrödinger's writings from the period also contain elements resembling the modal interpretation originated by Bas van Fraassen. Because Schrödinger subscribed to a kind of post-Machian neutral monism, in which "matter" and "mind" are only different aspects or arrangements of the same common elements, treating the wave function as physical and treating it as information became interchangeable.

Leon Cooper and Deborah Van Vechten developed a very similar approach before reading Everett's work. H. Dieter Zeh also came to the same conclusions as Everett before reading his work, and later built a new theory of quantum decoherence based on these ideas.

According to people who knew him, Everett believed in the literal reality of the other quantum worlds. His son and wife reported that he "never wavered in his belief over his many-worlds theory". In their detailed review of Everett's work, Osnaghi, Freitas, and Freire Jr. note that Everett consistently used quotes around "real" to indicate a meaning within scientific practice.

Reception

MWI's initial reception was overwhelmingly negative, in the sense that it was ignored, with the notable exception of DeWitt. Wheeler made considerable efforts to formulate the theory in a way that would be palatable to Bohr, visited Copenhagen in 1956 to discuss it with him, and convinced Everett to visit as well, which happened in 1959. Nevertheless, Bohr and his collaborators completely rejected the theory. Everett had already left academia in 1957, never to return, and in 1980, Wheeler disavowed the theory.

Support

One of MWI's strongest longtime advocates is David Deutsch. According to him, the single photon interference pattern observed in the double slit experiment can be explained by interference of photons in multiple universes. Viewed this way, the single photon interference experiment is indistinguishable from the multiple photon interference experiment. In a more practical vein, in one of the earliest papers on quantum computing, Deutsch suggested that parallelism that results from MWI could lead to "a method by which certain probabilistic tasks can be performed faster by a universal quantum computer than by any classical restriction of it". He also proposed that MWI will be testable (at least against "naive" Copenhagenism) when reversible computers become conscious via the reversible observation of spin.

Equivocal

Philosophers of science James Ladyman and Don Ross say that MWI could be true, but do not embrace it. They note that no quantum theory is yet empirically adequate for describing all of reality, given its lack of unification with general relativity, and so do not see a reason to regard any interpretation of quantum mechanics as the final word in metaphysics. They also suggest that the multiple branches may be an artifact of incomplete descriptions and of using quantum mechanics to represent the states of macroscopic objects. They argue that macroscopic objects are significantly different from microscopic objects in not being isolated from the environment, and that using quantum formalism to describe them lacks explanatory and descriptive power and accuracy.

Rejection

Some scientists consider some aspects of MWI to be unfalsifiable and hence unscientific because the multiple parallel universes are non-communicating, in the sense that no information can be passed between them.

Victor J. Stenger remarked that Murray Gell-Mann's published work explicitly rejects the existence of simultaneous parallel universes. Collaborating with James Hartle, Gell-Mann worked toward the development of a more "palatable" post-Everett quantum mechanics. Stenger thought it fair to say that most physicists find MWI too extreme, though it "has merit in finding a place for the observer inside the system being analyzed and doing away with the troublesome notion of wave function collapse".

Roger Penrose argues that the idea is flawed because it is based on an oversimplified version of quantum mechanics that does not account for gravity. In his view, applying conventional quantum mechanics to the universe implies the MWI, but the lack of a successful theory of quantum gravity negates the claimed universality of conventional quantum mechanics. According to Penrose, "the rules must change when gravity is involved". He further asserts that gravity helps anchor reality and "blurry" events have only one allowable outcome: "electrons, atoms, molecules, etc., are so minute that they require almost no amount of energy to maintain their gravity, and therefore their overlapping states. They can stay in that state forever, as described in standard quantum theory". On the other hand, "in the case of large objects, the duplicate states disappear in an instant due to the fact that these objects create a large gravitational field".

Philosopher of science Robert P. Crease says that MWI is "one of the most implausible and unrealistic ideas in the history of science" because it means that everything conceivable happens. Science writer Philip Ball calls MWI's implications fantasies, since "beneath their apparel of scientific equations or symbolic logic, they are acts of imagination, of 'just supposing'".

Theoretical physicist Gerard 't Hooft also dismisses the idea: "I do not believe that we have to live with the many-worlds interpretation. Indeed, it would be a stupendous number of parallel worlds, which are only there because physicists couldn't decide which of them is real."

Asher Peres was an outspoken critic of MWI. A section of his 1993 textbook had the title Everett's interpretation and other bizarre theories. Peres argued that the various many-worlds interpretations merely shift the arbitrariness or vagueness of the collapse postulate to the question of when "worlds" can be regarded as separate, and that no objective criterion for that separation can actually be formulated.

Polls

A poll of 72 "leading quantum cosmologists and other quantum field theorists" conducted before 1991 by L. David Raub showed 58% agreement with "Yes, I think MWI is true".

Max Tegmark reports the result of a "highly unscientific" poll taken at a 1997 quantum mechanics workshop. According to Tegmark, "The many worlds interpretation (MWI) scored second, comfortably ahead of the consistent histories and Bohm interpretations."

In response to Sean M. Carroll's statement "As crazy as it sounds, most working physicists buy into the many-worlds theory", Michael Nielsen counters: "at a quantum computing conference at Cambridge in 1998, a many-worlder surveyed the audience of approximately 200 people... Many-worlds did just fine, garnering support on a level comparable to, but somewhat below, Copenhagen and decoherence." But Nielsen notes that it seemed most attendees found it to be a waste of time: Peres "got a huge and sustained round of applause…when he got up at the end of the polling and asked 'And who here believes the laws of physics are decided by a democratic vote?'"

A 2005 poll of fewer than 40 students and researchers, taken after a course on the Interpretation of Quantum Mechanics at the Institute for Quantum Computing, University of Waterloo, found "Many Worlds (and decoherence)" to be the least favored.

A 2011 poll of 33 participants at an Austrian conference on quantum foundations found 6 endorsed MWI, 8 "Information-based/information-theoretical", and 14 Copenhagen; the authors remark that MWI received a similar percentage of votes as in Tegmark's 1997 poll.

Speculative implications

DeWitt has said that Everett, Wheeler, and Graham "do not in the end exclude any element of the superposition. All the worlds are there, even those in which everything goes wrong and all the statistical laws break down." Tegmark affirmed that absurd or highly unlikely events are rare but inevitable under MWI: "Things inconsistent with the laws of physics will never happen—everything else will... it's important to keep track of the statistics, since even if everything conceivable happens somewhere, really freak events happen only exponentially rarely." David Deutsch speculates in his book The Beginning of Infinity that some fiction, such as alternate history, could occur somewhere in the multiverse, as long as it is consistent with the laws of physics.

According to Ladyman and Ross, many seemingly physically plausible but unrealized possibilities, such as those discussed in other scientific fields, generally have no counterparts in other branches, because they are in fact incompatible with the universal wave function. According to Carroll, human decision-making, contrary to common misconceptions, is best thought of as a classical process, not a quantum one, because it works on the level of neurochemistry rather than fundamental particles. Human decisions do not cause the world to branch into equally realized outcomes; even for subjectively difficult decisions, the "weight" of realized outcomes is almost entirely concentrated in a single branch.

Quantum suicide is a thought experiment in quantum mechanics and the philosophy of physics that can purportedly distinguish between the Copenhagen interpretation of quantum mechanics and the many-worlds interpretation by a variation of the Schrödinger's cat thought experiment, from the cat's point of view. Quantum immortality refers to the subjective experience of surviving quantum suicide. Most experts believe the experiment would not work in the real world, because the world with the surviving experimenter has a lower "measure" than the world before the experiment, making it less likely that the experimenter will experience their survival.

Superdense coding

From Wikipedia, the free encyclopedia
In quantum information theory, superdense coding (also referred to as dense coding) is a quantum communication protocol to communicate a number of classical bits of information by only transmitting a smaller number of qubits, under the assumption of sender and receiver pre-sharing an entangled resource. In its simplest form, the protocol involves two parties, often referred to as Alice and Bob in this context, which share a pair of maximally entangled qubits, and allows Alice to transmit two bits (i.e., one of 00, 01, 10 or 11) to Bob by sending only one qubit. This protocol was first proposed by Charles H. Bennett and Stephen Wiesner in 1970 (though not published by them until 1992) and experimentally actualized in 1996 by Klaus Mattle, Harald Weinfurter, Paul G. Kwiat and Anton Zeilinger using entangled photon pairs. Superdense coding can be thought of as the opposite of quantum teleportation, in which one transfers one qubit from Alice to Bob by communicating two classical bits, as long as Alice and Bob have a pre-shared Bell pair.

The transmission of two bits via a single qubit is made possible by the fact that Alice can choose among four quantum gate operations to perform on her share of the entangled state. Alice determines which operation to perform according to the pair of bits she wants to transmit. She then sends Bob the qubit state evolved through the chosen gate. Said qubit thus encodes information about the two bits Alice used to select the operation, and this information can be retrieved by Bob thanks to the pre-shared entanglement between them. After receiving Alice's qubit, operating on the pair and measuring both, Bob obtains two classical bits of information. It is worth stressing that if Alice and Bob do not pre-share entanglement, then the superdense protocol is impossible, as this would violate Holevo's theorem.

Superdense coding is the underlying principle of secure quantum secret coding. The necessity of having both qubits to decode the information being sent eliminates the risk of eavesdroppers intercepting messages.

Overview

When the sender and receiver share a Bell state, two classical bits can be packed into one qubit. In the diagram, single lines carry qubits, while doubled lines carry classical bits. The variables b1 and b2 are classical Boolean, and the zeroes at the left-hand side represent the pure quantum state |0⟩. See the section named "The protocol" below for more details regarding this picture.

Suppose Alice wants to send two classical bits of information (00, 01, 10, or 11) to Bob using qubits (instead of classical bits). To do this, an entangled state (e.g. a Bell state) is prepared using a Bell circuit or gate by Charlie, a third person. Charlie then sends one of these qubits (in the Bell state) to Alice and the other to Bob. Once Alice obtains her qubit in the entangled state, she applies a certain quantum gate to her qubit depending on which two-bit message (00, 01, 10 or 11) she wants to send to Bob. Her entangled qubit is then sent to Bob who, after applying the appropriate quantum gate and making a measurement, can retrieve the classical two-bit message. Observe that Alice does not need to communicate to Bob which gate to apply in order to obtain the correct classical bits from his projective measurement.
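The whole exchange is small enough to simulate directly with state vectors. Below is a minimal Python/NumPy sketch of the full protocol (the helper names encode and decode are this example's own, not a standard API); the individual steps it implements are spelled out in the sections that follow:

```python
# End-to-end superdense coding with plain NumPy state vectors.
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

# Charlie prepares |Phi+> = (|00> + |11>)/sqrt(2); qubit A goes to Alice,
# qubit B to Bob (basis order |A,B>).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

GATES = {'00': I, '01': X, '10': Z, '11': Z @ X}

def encode(bits, state):
    """Alice applies her local gate to qubit A only."""
    return np.kron(GATES[bits], I) @ state

def decode(state):
    """Bob applies CNOT (A control, B target), then H on A, then reads out."""
    state = np.kron(H, I) @ (CNOT @ state)
    idx = int(np.argmax(np.abs(state)))   # the outcome is deterministic here
    return format(idx, '02b')

for bits in ['00', '01', '10', '11']:
    assert decode(encode(bits, bell)) == bits
print("all four two-bit messages recovered from one transmitted qubit")
```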

The protocol

The protocol can be split into five different steps: preparation, sharing, encoding, sending, and decoding.

Preparation

The protocol starts with the preparation of an entangled state, which is later shared between Alice and Bob. For example, the following Bell state

|Φ⁺⟩ = (1/√2)(|0⟩_A ⊗ |0⟩_B + |1⟩_A ⊗ |1⟩_B)

is prepared, where ⊗ denotes the tensor product. In common usage the tensor product symbol may be omitted:

|Φ⁺⟩ = (1/√2)(|00⟩ + |11⟩).

Sharing

After the preparation of the Bell state |Φ⁺⟩, the qubit denoted by subscript A is sent to Alice and the qubit denoted by subscript B is sent to Bob. Alice and Bob may be in different locations, an unlimited distance from each other.

There may be an arbitrary period between the preparation and sharing of the entangled state and the rest of the steps in the procedure.

Encoding

By applying a quantum gate to her qubit locally, Alice can transform the entangled state |Φ⁺⟩ into any of the four Bell states (including, of course, |Φ⁺⟩ itself). Note that this process cannot "break" the entanglement between the two qubits.

Let's now describe which operations Alice needs to perform on her entangled qubit, depending on which classical two-bit message she wants to send to Bob. We'll later see why these specific operations are performed. There are four cases, which correspond to the four possible two-bit strings that Alice may want to send.

1. If Alice wants to send the classical two-bit string 00 to Bob, then she applies the identity quantum gate, I, to her qubit, so that it remains unchanged. The resultant entangled state is then

|B₀₀⟩ = |Φ⁺⟩ = (1/√2)(|00⟩ + |11⟩)

In other words, the entangled state shared between Alice and Bob has not changed, i.e. it is still |Φ⁺⟩. The notation |B₀₀⟩ indicates that Alice wants to send the two-bit string 00.

2. If Alice wants to send the classical two-bit string 01 to Bob, then she applies the quantum NOT (or bit-flip) gate, X, to her qubit, so that the resultant entangled quantum state becomes

|B₀₁⟩ = (X ⊗ I)|Φ⁺⟩ = (1/√2)(|10⟩ + |01⟩)

3. If Alice wants to send the classical two-bit string 10 to Bob, then she applies the quantum phase-flip gate, Z, to her qubit, so the resultant entangled state becomes

|B₁₀⟩ = (Z ⊗ I)|Φ⁺⟩ = (1/√2)(|00⟩ − |11⟩)

4. If, instead, Alice wants to send the classical two-bit string 11 to Bob, then she applies the quantum gate ZX to her qubit, so that the resultant entangled state becomes

|B₁₁⟩ = (ZX ⊗ I)|Φ⁺⟩ = (1/√2)(|01⟩ − |10⟩)

The matrices X and Z are known as Pauli matrices; their product ZX equals the third Pauli matrix Y up to a phase (ZX = iY).

Sending

After having performed one of the operations described above, Alice can send her entangled qubit to Bob using a quantum network through some conventional physical medium.

Decoding

In order for Bob to find out which classical bits Alice sent, he will perform the CNOT unitary operation, with A as control qubit and B as target qubit. Then, he will perform the Hadamard unitary H on qubit A. In other words, the Hadamard quantum gate H is only applied to A (see the figure above).

  • If the resultant entangled state was |B₀₀⟩ = (1/√2)(|00⟩ + |11⟩), then after the application of the above unitary operations the entangled state will become |00⟩
  • If the resultant entangled state was |B₀₁⟩ = (1/√2)(|10⟩ + |01⟩), then after the application of the above unitary operations the entangled state will become |01⟩
  • If the resultant entangled state was |B₁₀⟩ = (1/√2)(|00⟩ − |11⟩), then after the application of the above unitary operations the entangled state will become |10⟩
  • If the resultant entangled state was |B₁₁⟩ = (1/√2)(|01⟩ − |10⟩), then after the application of the above unitary operations the entangled state will become |11⟩

These operations performed by Bob can be seen as a measurement which projects the entangled state onto one of the four two-qubit basis vectors |00⟩, |01⟩, |10⟩, or |11⟩ (as you can see from the outcomes and the example below).

Example

For example, if the resultant entangled state (after the operations performed by Alice) was |B₀₁⟩ = (1/√2)(|1⟩_A|0⟩_B + |0⟩_A|1⟩_B), then a CNOT with A as control bit and B as target bit will change the state to become (1/√2)(|1⟩_A|1⟩_B + |0⟩_A|1⟩_B). Now, the Hadamard gate is applied only to A, to obtain

(1/√2)( (1/√2)(|0⟩_A − |1⟩_A)|1⟩_B + (1/√2)(|0⟩_A + |1⟩_A)|1⟩_B ) = |0⟩_A|1⟩_B

For simplicity, subscripts may be removed: |01⟩.

Now, Bob has the basis state |01⟩, so he knows that Alice wanted to send the two-bit string 01.

Security

Superdense coding is a form of secure quantum communication. If an eavesdropper, commonly called Eve, intercepts Alice's qubit en route to Bob, all that is obtained by Eve is part of an entangled state. Without access to Bob's qubit, Eve is unable to get any information from Alice's qubit. A third party is unable to eavesdrop on information being communicated through superdense coding and an attempt to measure either qubit would collapse the state of that qubit and alert Bob and Alice.

General dense coding scheme

General dense coding schemes can be formulated in the language used to describe quantum channels. Alice and Bob share a maximally entangled state ω. Let the subsystems initially possessed by Alice and Bob be labeled 1 and 2, respectively. To transmit the message x, Alice applies an appropriate channel ℰₓ on subsystem 1. On the combined system, this is effected by

ρₓ = (ℰₓ ⊗ I)(ω),

where I denotes the identity map on subsystem 2. Alice then sends her subsystem to Bob, who performs a measurement on the combined system to recover the message. Let Bob's measurement be modelled by a POVM {F_y}, with positive semidefinite operators F_y such that Σ_y F_y = I. The probability that Bob's measuring apparatus registers the message y is then Tr[ F_y ρₓ ]. Therefore, to achieve the desired transmission, we require that

Tr[ F_y ρₓ ] = δ_xy,

where δ_xy is the Kronecker delta.
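For the qubit protocol above, Bob's POVM can be taken to be the projectors onto the four Bell states, and the condition Tr[ F_y ρₓ ] = δ_xy can be checked numerically. A minimal NumPy sketch under those assumptions (the variable names are this sketch's own):

```python
# Numerical check that Tr[F_y rho_x] = delta_{xy} for the qubit protocol:
# Bob's POVM is the set of projectors onto the four encoded Bell states.
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
omega = np.outer(bell, bell.conj())            # shared maximally entangled state

encodings = [I2, X, Z, Z @ X]                  # Alice's four local operations
rhos = [np.kron(U, I2) @ omega @ np.kron(U, I2).conj().T for U in encodings]

# Rank-one projectors onto the four encoded Bell states.
povm = [np.outer(np.kron(U, I2) @ bell, (np.kron(U, I2) @ bell).conj())
        for U in encodings]
assert np.allclose(sum(povm), np.eye(4))       # POVM elements sum to identity

probs = np.array([[np.real(np.trace(F @ r)) for F in povm] for r in rhos])
print(np.round(probs, 10))                     # identity matrix: delta_{xy}
```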

Experimental

The protocol of superdense coding has been realized in several experiments using different systems, with varying channel capacities and fidelities. In 2004, trapped beryllium-9 ions were used in a maximally entangled state to achieve a channel capacity of 1.16 with a fidelity of 0.85. In 2017, a channel capacity of 1.665 was achieved with a fidelity of 0.87 through optical fibers. High-dimensional ququarts (states formed in photon pairs by non-degenerate spontaneous parametric down-conversion) were used to reach a channel capacity of 2.09 (with a limit of 2.32) with a fidelity of 0.98. Nuclear magnetic resonance (NMR) has also been used to implement superdense coding among three parties.

Delayed-choice quantum eraser

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser A delayed-cho...