
Saturday, July 5, 2025

Quantum pseudo-telepathy

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Quantum_pseudo-telepathy

Quantum pseudo-telepathy describes the use of quantum entanglement to eliminate the need for classical communication. A nonlocal game is said to display quantum pseudo-telepathy if players who can use entanglement can win it with certainty while players without it cannot. The prefix pseudo refers to the fact that quantum pseudo-telepathy does not involve the exchange of information between any parties. Instead, quantum pseudo-telepathy removes the need for parties to exchange information in some circumstances.

Quantum pseudo-telepathy is generally used as a thought experiment to demonstrate the non-local characteristics of quantum mechanics. However, quantum pseudo-telepathy is a real-world phenomenon which can be verified experimentally. It is thus an especially striking example of an experimental confirmation of Bell inequality violations.

The magic square game

When attempting to construct a 3×3 table filled with the numbers +1 and −1, such that each row has an even number of negative entries and each column an odd number of negative entries, a conflict is bound to emerge.

A simple magic square game demonstrating nonclassical correlations was introduced by P. K. Aravind, based on a series of papers by N. David Mermin, Asher Peres, and Adán Cabello that developed simplifying demonstrations of Bell's theorem. The game has been reformulated to demonstrate quantum pseudo-telepathy.

Game rules

This is a cooperative game featuring two players, Alice and Bob, and a referee. The referee asks Alice to fill in one row, and Bob one column, of a 3×3 table with plus and minus signs. Their answers must respect the following constraints: Alice's row must contain an even number of minus signs, Bob's column must contain an odd number of minus signs, and they both must assign the same sign to the cell where the row and column intersect. If they manage to do so, they win—otherwise they lose.

Alice and Bob are allowed to elaborate a strategy together, but crucially are not allowed to communicate after they know which row and column they will need to fill in (as otherwise the game would be trivial).

Classical strategy

It is easy to see that if Alice and Bob can come up with a classical strategy where they always win, they can represent it as a 3×3 table encoding their answers. But this is not possible, as the number of minus signs in this hypothetical table would need to be even and odd at the same time: every row must contain an even number of minus signs, making the total number of minus signs even, and every column must contain an odd number of minus signs, making the total number of minus signs odd.

With a bit of further analysis one can see that the best possible classical strategy can be represented by a table in which each cell now contains both Alice's and Bob's answers, which may differ. It is possible to make their answers equal in 8 out of 9 cells, while respecting the parity of Alice's rows and Bob's columns. This implies that if the referee asks for a row and column whose intersection is one of the cells where their answers match they win, and otherwise they lose. Under the usual assumption that the referee asks for them uniformly at random, the best classical winning probability is 8/9.
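The counting argument can be checked by brute force. The short Python sketch below is an illustration (not part of the article): it enumerates every deterministic strategy pair — an even-parity row answer for each of Alice's three possible questions and an odd-parity column answer for each of Bob's — and confirms that no pair agrees on more than 8 of the 9 possible (row, column) questions.

```python
from itertools import product

# All ways to fill a row of three ±1 entries with an even number of -1s,
# and a column with an odd number of -1s.
even_rows = [r for r in product([1, -1], repeat=3) if r.count(-1) % 2 == 0]
odd_cols = [c for c in product([1, -1], repeat=3) if c.count(-1) % 2 == 1]

best = 0
# Alice's deterministic strategy: one even-parity row answer per possible row index.
# Bob's deterministic strategy: one odd-parity column answer per possible column index.
for alice in product(even_rows, repeat=3):
    for bob in product(odd_cols, repeat=3):
        # Count referee questions (row i, column j) where the shared cell agrees.
        wins = sum(alice[i][j] == bob[j][i] for i in range(3) for j in range(3))
        best = max(best, wins)

print(best / 9)  # prints 0.888..., i.e. the classical bound of 8/9
```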

Pseudo-telepathic strategies

Use of quantum pseudo-telepathy would enable Alice and Bob to win the game 100% of the time without any communication once the game has begun.

This requires Alice and Bob to possess two pairs of particles with entangled states. These particles must have been prepared before the start of the game. One particle of each pair is held by Alice and the other by Bob, so they each have two particles. When Alice and Bob learn which row and column they must fill, each uses that information to select which measurements to perform on their particles. The result of the measurements will appear to each of them to be random (and the observed partial probability distribution of either particle will be independent of the measurement performed by the other party), so no real "communication" takes place.

However, the process of measuring the particles imposes sufficient structure on the joint probability distribution of the results of the measurement such that if Alice and Bob choose their actions based on the results of their measurement, then there will exist a set of strategies and measurements allowing the game to be won with probability 1.

Note that Alice and Bob could be light years apart from one another, and the entangled particles will still enable them to coordinate their actions sufficiently well to win the game with certainty.

Each round of this game uses up one entangled state. Playing N rounds requires that N entangled states (2N independent Bell pairs, see below) be shared in advance. This is because each round requires measuring 2 bits of information (the third entry is determined by the first two, so measuring it isn't necessary), which destroys the entanglement. There is no way to reuse measurements from earlier rounds.

The trick is for Alice and Bob to share an entangled quantum state and to use specific measurements on their components of the entangled state to derive the table entries. A suitable correlated state consists of a pair of entangled Bell states:

|φ⟩ = (1/√2)(|+⟩_a ⊗ |+⟩_b + |−⟩_a ⊗ |−⟩_b) ⊗ (1/√2)(|+⟩_c ⊗ |+⟩_d + |−⟩_c ⊗ |−⟩_d)

here |+⟩ and |−⟩ are eigenstates of the Pauli operator S_x with eigenvalues +1 and −1, respectively, whilst the subscripts a, b, c, and d identify the components of each Bell state, with a and c going to Alice, and b and d going to Bob. The symbol ⊗ represents a tensor product.

Observables for these components can be written as products of the Pauli spin matrices σ_x, σ_y, and σ_z acting on the two particles held by each party.

Products of these Pauli spin operators can be used to fill the 3×3 table such that each row and each column contains a mutually commuting set of observables with eigenvalues +1 and −1, with the product of the observables in each row being the identity operator, and the product of observables in each column equating to minus the identity operator. This is a so-called Mermin–Peres magic square. One such assignment is shown in the table below:

     σ_z ⊗ I        I ⊗ σ_z        σ_z ⊗ σ_z
     I ⊗ σ_x        σ_x ⊗ I        σ_x ⊗ σ_x
    −σ_z ⊗ σ_x     −σ_x ⊗ σ_z      σ_y ⊗ σ_y

Effectively, while it is not possible to construct a 3×3 table with entries +1 and −1 such that the product of the elements in each row equals +1 and the product of elements in each column equals −1, it is possible to do so with the richer algebraic structure based on spin matrices.
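As a sanity check (not part of the article), the nine observables of the square shown above can be multiplied out numerically. The sketch below uses numpy to verify that the entries of each row and each column mutually commute, that every row multiplies to +I, and that every column multiplies to −I.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

kron = np.kron
# The Mermin–Peres magic square used above (rows of two-qubit observables).
square = [
    [kron(Z, I),  kron(I, Z),  kron(Z, Z)],
    [kron(I, X),  kron(X, I),  kron(X, X)],
    [-kron(Z, X), -kron(X, Z), kron(Y, Y)],
]

I4 = np.eye(4)
for i in range(3):
    row = square[i]
    col = [square[k][i] for k in range(3)]
    # Observables within a row (and within a column) mutually commute ...
    assert all(np.allclose(A @ B, B @ A) for A in row for B in row)
    assert all(np.allclose(A @ B, B @ A) for A in col for B in col)
    # ... each row multiplies to +I, each column multiplies to -I.
    assert np.allclose(row[0] @ row[1] @ row[2], I4)
    assert np.allclose(col[0] @ col[1] @ col[2], -I4)

print("Mermin-Peres square verified")
```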

The play proceeds by having each player make one measurement on their part of the entangled state per round of play. Each of Alice's measurements will give her the values for a row, and each of Bob's measurements will give him the values for a column. It is possible to do that because all observables in a given row or column commute, so there exists a basis in which they can be measured simultaneously. For Alice's first row she needs to measure both her particles in the σ_z basis, for the second row she needs to measure them in the σ_x basis, and for the third row she needs to measure them in an entangled basis. For Bob's first column he needs to measure his first particle in the σ_z basis and the second in the σ_x basis, for the second column he needs to measure his first particle in the σ_x basis and the second in the σ_z basis, and for his third column he needs to measure both his particles in a different entangled basis, the Bell basis. As long as the table above is used, the measurement results are guaranteed to always multiply out to +1 for Alice along her row, and −1 for Bob down his column. Of course, each completely new round requires a new entangled state, as different rows and columns are not compatible with each other.

Current research

It has been demonstrated that the above-described game is the simplest two-player game of its type in which quantum pseudo-telepathy allows a win with probability one. Other games in which quantum pseudo-telepathy occurs have been studied, including larger magic square games, graph colouring games giving rise to the notion of quantum chromatic number, and multiplayer games involving more than two participants.

In July 2022, a study reported an experimental demonstration of quantum pseudo-telepathy by playing the nonlocal version of the Mermin–Peres magic square game.

Greenberger–Horne–Zeilinger game

The Greenberger–Horne–Zeilinger (GHZ) game is another example of quantum pseudo-telepathy. Classically, the best winning probability for the game is 0.75. However, with a quantum strategy, the players can achieve a winning probability of 1, meaning they always win.

In the game there are three players, Alice, Bob, and Carol, playing against a referee. The referee poses a binary question (either 0 or 1) to each player. The three players each respond with an answer, again in the form of either 0 or 1. Therefore, when the game is played the three questions of the referee, x, y, z, are drawn from the four options {000, 110, 101, 011}. For example, if the question triple 011 is chosen, then Alice receives bit 0, Bob receives bit 1, and Carol receives bit 1 from the referee. Based on the question bit received, Alice, Bob, and Carol each respond with an answer a, b, c, also in the form of 0 or 1. The players can formulate a strategy together prior to the start of the game. However, no communication is allowed during the game itself.

The players win if a ⊕ b ⊕ c = x ∨ y ∨ z, where ∨ indicates the OR condition and ⊕ indicates summation of the answers modulo 2. In other words, the sum of the three answers has to be even if x = y = z = 0. Otherwise, the sum of the answers has to be odd.

Winning condition of the GHZ game

 x y z | a ⊕ b ⊕ c
 0 0 0 | 0 mod 2
 1 1 0 | 1 mod 2
 1 0 1 | 1 mod 2
 0 1 1 | 1 mod 2

Classical strategy

Classically, Alice, Bob, and Carol can employ a deterministic strategy that always ends up with an odd sum (e.g., Alice always outputs 1 while Bob and Carol always output 0). The players win 75% of the time and only lose if the question triple is 000.

This is the best classical strategy: only 3 out of the 4 winning conditions can be satisfied simultaneously. Let a_0, a_1 be Alice's responses to questions 0 and 1 respectively, b_0, b_1 be Bob's responses to questions 0, 1, and c_0, c_1 be Carol's responses to questions 0, 1. We can write all constraints that satisfy the winning conditions as

a_0 ⊕ b_0 ⊕ c_0 = 0
a_1 ⊕ b_1 ⊕ c_0 = 1
a_1 ⊕ b_0 ⊕ c_1 = 1
a_0 ⊕ b_1 ⊕ c_1 = 1

Suppose there were a classical strategy satisfying all four winning conditions, so that all four equations held simultaneously. Summing the four equations, each variable appears exactly twice on the left-hand side, so the left-hand side sums to 0 mod 2. However, the right-hand side sums to 1 mod 2. The contradiction shows that all four winning conditions cannot be simultaneously satisfied.
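This bound can also be confirmed by brute force. The sketch below is illustrative (not from the article): it enumerates all 64 deterministic strategies, one answer bit per question bit for each player, and finds that none wins more than 3 of the 4 question triples.

```python
from itertools import product

QUESTIONS = [(0, 0, 0), (1, 1, 0), (1, 0, 1), (0, 1, 1)]

def wins(a, b, c, x, y, z):
    # Win condition: a XOR b XOR c must equal x OR y OR z.
    return (a ^ b ^ c) == (x | y | z)

best = 0
# A deterministic strategy is one answer bit per question bit for each player:
# (a0, a1) for Alice, (b0, b1) for Bob, (c0, c1) for Carol.
for a0, a1, b0, b1, c0, c1 in product([0, 1], repeat=6):
    won = sum(wins((a0, a1)[x], (b0, b1)[y], (c0, c1)[z], x, y, z)
              for x, y, z in QUESTIONS)
    best = max(best, won)

print(best / len(QUESTIONS))  # prints 0.75, the classical bound
```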

Quantum strategy

When Alice, Bob, and Carol decide to adopt a quantum strategy, they share a tripartite entangled state |ψ⟩ = (|000⟩ + |111⟩)/√2, known as the GHZ state.

If question 0 is received, the player makes a measurement in the X basis {|+⟩, |−⟩}, where |±⟩ = (|0⟩ ± |1⟩)/√2. If question 1 is received, the player makes a measurement in the Y basis {|+i⟩, |−i⟩}, where |±i⟩ = (|0⟩ ± i|1⟩)/√2. In both cases, the players give answer 0 if the result of the measurement is the first state of the pair, and answer 1 if the result is the second state of the pair. With this strategy the players win the game with probability 1.
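A direct numerical check of this strategy is possible. The sketch below (an illustration, not from the article) assumes the canonical GHZ state (|000⟩ + |111⟩)/√2 and the X/Y measurement rule described above, and computes the winning probability for each question triple via the Born rule.

```python
import numpy as np

# Canonical GHZ state (|000> + |111>)/sqrt(2) -- assumed form of the shared state.
ghz = np.zeros(8, dtype=complex)
ghz[0b000] = ghz[0b111] = 1 / np.sqrt(2)

# Measurement eigenvectors: answer 0 -> first state of the pair, answer 1 -> second.
X_basis = [np.array([1, 1]) / np.sqrt(2), np.array([1, -1]) / np.sqrt(2)]     # |+>, |->
Y_basis = [np.array([1, 1j]) / np.sqrt(2), np.array([1, -1j]) / np.sqrt(2)]   # |+i>, |-i>

def win_probability(x, y, z):
    bases = [X_basis if q == 0 else Y_basis for q in (x, y, z)]
    p_win = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                vec = np.kron(np.kron(bases[0][a], bases[1][b]), bases[2][c])
                p = abs(np.vdot(vec, ghz)) ** 2        # Born rule probability
                if (a ^ b ^ c) == (x | y | z):         # winning condition
                    p_win += p
    return p_win

for q in [(0, 0, 0), (1, 1, 0), (1, 0, 1), (0, 1, 1)]:
    print(q, round(win_probability(*q), 10))           # each prints 1.0
```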

 

Friday, July 4, 2025

Quantum foundations

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Quantum_foundations

Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, reformulate it and even propose new generalizations thereof. Unlike other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition. While they lead to the right experimental predictions, they do not come with a mental picture of the world in which they fit.

There exist different approaches to resolve this conceptual gap:

  • First, one can contrast quantum physics with classical physics: by identifying scenarios, such as Bell experiments, where quantum theory deviates radically from classical predictions, one hopes to gain physical insight into the structure of quantum physics.
  • Second, one can attempt to find a re-derivation of the quantum formalism in terms of operational axioms.
  • Third, one can search for a full correspondence between the mathematical elements of the quantum framework and physical phenomena: any such correspondence is called an interpretation.
  • Fourth, one can renounce quantum theory altogether and propose a different model of the world.

Research in quantum foundations is structured along these roads.

Non-classical features of quantum theory

Quantum nonlocality

Two or more separate parties conducting measurements over a quantum state can observe correlations which cannot be explained with any local hidden variable theory. Whether this should be regarded as proving that the physical world itself is "nonlocal" is a topic of debate, but the terminology of "quantum nonlocality" is commonplace. Nonlocality research efforts in quantum foundations focus on determining the exact limits that classical or quantum physics enforces on the correlations observed in a Bell experiment or more complex causal scenarios. This research program has so far provided a generalization of Bell's theorem that allows falsifying all classical theories with a superluminal, yet finite, hidden influence.

Quantum contextuality

Nonlocality can be understood as an instance of quantum contextuality. A situation is contextual when the value of an observable depends on the context in which it is measured (namely, on which other observables are being measured as well). The original definition of measurement contextuality can be extended to state preparations and even general physical transformations.

Epistemic models for the quantum wave-function

A physical property is epistemic when it represents our knowledge or beliefs about the value of a second, more fundamental feature. The probability that an event occurs is an example of an epistemic property. In contrast, a non-epistemic or ontic variable captures the notion of a "real" property of the system under consideration.

There is an ongoing debate on whether the wave-function represents the epistemic state of a yet-to-be-discovered ontic variable or whether, on the contrary, it is a fundamental entity. Under some physical assumptions, the Pusey–Barrett–Rudolph (PBR) theorem demonstrates the inconsistency of quantum states as epistemic states, in the sense above. Note that, in QBism and Copenhagen-type views, quantum states are still regarded as epistemic, not with respect to some ontic variable, but with respect to one's expectations about future experimental outcomes. The PBR theorem does not exclude such epistemic views on quantum states.

Axiomatic reconstructions

Some of the counter-intuitive aspects of quantum theory, as well as the difficulty of extending it, follow from the fact that its defining axioms lack a physical motivation. An active area of research in quantum foundations is therefore to find alternative formulations of quantum theory which rely on physically compelling principles. Those efforts come in two flavors, depending on the desired level of description of the theory: the so-called Generalized Probabilistic Theories approach and the Black boxes approach.

The framework of generalized probabilistic theories

Generalized Probabilistic Theories (GPTs) are a general framework to describe the operational features of arbitrary physical theories. Essentially, they provide a statistical description of any experiment combining state preparations, transformations and measurements. The framework of GPTs can accommodate classical and quantum physics, as well as hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement or teleportation. Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.

L. Hardy introduced the concept of GPT in 2001, in an attempt to re-derive quantum theory from basic physical principles. Although Hardy's work was very influential (see the follow-ups below), one of his axioms was regarded as unsatisfactory: it stipulated that, of all the physical theories compatible with the rest of the axioms, one should choose the simplest one. The work of Dakic and Brukner eliminated this “axiom of simplicity” and provided a reconstruction of quantum theory based on three physical principles. This was followed by the more rigorous reconstruction of Masanes and Müller.

Axioms common to these three reconstructions are:

  • The subspace axiom: systems which can store the same amount of information are physically equivalent.
  • Local tomography: to characterize the state of a composite system it is enough to conduct measurements at each part.
  • Reversibility: for any two extremal states [i.e., states which are not statistical mixtures of other states], there exists a reversible physical transformation that maps one into the other.

An alternative GPT reconstruction proposed by Chiribella, D'Ariano and Perinotti  around the same time is also based on the

  • Purification axiom: for any state ρ_A of a physical system A there exists a bipartite physical system AB and an extremal state (or purification) |ψ⟩_AB such that ρ_A is the restriction of |ψ⟩_AB to system A. In addition, any two such purifications of ρ_A can be mapped into one another via a reversible physical transformation on system B.

The use of purification to characterize quantum theory has been criticized on the grounds that it also applies in the Spekkens toy model.

Against the success of the GPT approach, it can be objected that all such works just recover finite-dimensional quantum theory. In addition, none of the previous axioms can be experimentally falsified unless the measurement apparatuses are assumed to be tomographically complete.

Categorical quantum mechanics or process theories

Categorical Quantum Mechanics (CQM) or Process Theories are a general framework to describe physical theories, with an emphasis on processes and their compositions. It was pioneered by Samson Abramsky and Bob Coecke. Besides its influence in quantum foundations, most notably the use of a diagrammatic formalism, CQM also plays an important role in quantum technologies, most notably in the form of ZX-calculus. It also has been used to model theories outside of physics, for example the DisCoCat compositional natural language meaning model.

The framework of black boxes

In the black box or device-independent framework, an experiment is regarded as a black box where the experimentalist introduces an input (the type of experiment) and obtains an output (the outcome of the experiment). Experiments conducted by two or more parties in separate labs are hence described by their statistical correlations alone.

From Bell's theorem, we know that classical and quantum physics predict different sets of allowed correlations. It is expected, therefore, that far-from-quantum physical theories should predict correlations beyond the quantum set. In fact, there exist instances of theoretical non-quantum correlations which, a priori, do not seem physically implausible. The aim of device-independent reconstructions is to show that all such supra-quantum examples are precluded by a reasonable physical principle.

The physical principles proposed so far include no-signalling, Non-Trivial Communication Complexity, No-Advantage for Nonlocal computation, Information Causality, Macroscopic Locality, and Local Orthogonality. All these principles limit the set of possible correlations in non-trivial ways. Moreover, they are all device-independent: this means that they can be falsified under the assumption that we can decide if two or more events are space-like separated. The drawback of the device-independent approach is that, even when taken together, all the afore-mentioned physical principles do not suffice to single out the set of quantum correlations. In other words: all such reconstructions are partial.

Interpretations of quantum theory

An interpretation of quantum theory is a correspondence between the elements of its mathematical formalism and physical phenomena. For instance, in the pilot wave theory, the quantum wave function is interpreted as a field that guides the particle trajectory and evolves with it via a system of coupled differential equations. Most interpretations of quantum theory stem from the desire to solve the quantum measurement problem.

Extensions of quantum theory

In an attempt to reconcile quantum and classical physics, or to identify non-classical models with a dynamical causal structure, some modifications of quantum theory have been proposed.

Collapse models

Collapse models posit the existence of natural processes which periodically localize the wave-function. Such theories provide an explanation to the nonexistence of superpositions of macroscopic objects, at the cost of abandoning unitarity and exact energy conservation.

Quantum measure theory

In Sorkin's quantum measure theory (QMT), physical systems are not modeled via unitary rays and Hermitian operators, but through a single matrix-like object, the decoherence functional. The entries of the decoherence functional determine the feasibility to experimentally discriminate between two or more different sets of classical histories, as well as the probabilities of each experimental outcome. In some models of QMT the decoherence functional is further constrained to be positive semidefinite (strong positivity). Even under the assumption of strong positivity, there exist models of QMT which generate stronger-than-quantum Bell correlations.

Acausal quantum processes

The formalism of process matrices starts from the observation that, given the structure of quantum states, the set of feasible quantum operations follows from positivity considerations. Namely, for any linear map from states to probabilities one can find a physical system where this map corresponds to a physical measurement. Likewise, any linear transformation that maps composite states to states corresponds to a valid operation in some physical system. In view of this trend, it is reasonable to postulate that any high-order map from quantum instruments (namely, measurement processes) to probabilities should also be physically realizable. Any such map is termed a process matrix. As shown by Oreshkov et al., some process matrices describe situations where the notion of global causality breaks.

The starting point of this claim is the following mental experiment: two parties, Alice and Bob, enter a building and end up in separate rooms. The rooms have ingoing and outgoing channels from which a quantum system periodically enters and leaves the room. While those systems are in the lab, Alice and Bob are able to interact with them in any way; in particular, they can measure some of their properties.

Since Alice and Bob's interactions can be modeled by quantum instruments, the statistics they observe when they apply one instrument or another are given by a process matrix. As it turns out, there exist process matrices which would guarantee that the measurement statistics collected by Alice and Bob are incompatible with Alice interacting with her system at the same time as, before, or after Bob, or any convex combination of these three situations. Such processes are called acausal.

Wave function collapse

From Wikipedia, the free encyclopedia
Particle impacts during a double-slit experiment. The total interference pattern represents the original wave function, while each particle impact represents an individual wave function collapse.

In various interpretations of quantum mechanics, wave function collapse, also called reduction of the state vector, occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.

In the Copenhagen interpretation, wave function collapse connects quantum to classical models, with a special role for the observer. By contrast, objective-collapse theories propose an origin in physical processes. In the many-worlds interpretation, collapse does not exist; all wave function outcomes occur while quantum decoherence accounts for the appearance of collapse.

Historically, Werner Heisenberg was the first to use the idea of wave function reduction to explain quantum measurement.

Mathematical description

In quantum mechanics each measurable physical quantity of a quantum system is called an observable which, for example, could be the position x and the momentum p but also the energy E, components of spin (s_z), and so on. The observable acts as a linear function on the states of the system; its eigenvectors correspond to the quantum states (i.e. eigenstates) and the eigenvalues to the possible values of the observable. The collection of eigenstate/eigenvalue pairs represents all possible values of the observable. Writing |φ_i⟩ for an eigenstate and o_i for the corresponding observed value, any arbitrary state of the quantum system can be expressed as a vector using bra–ket notation:

|ψ⟩ = Σ_i c_i |φ_i⟩.

The kets |φ_i⟩ specify the different available quantum "alternatives", i.e., particular quantum states.

The wave function is a specific representation of a quantum state. Wave functions can therefore always be expressed as eigenstates of an observable though the converse is not necessarily true.

Collapse

To account for the experimental result that repeated measurements of a quantum system give the same results, the theory postulates a "collapse" or "reduction of the state vector" upon observation, abruptly converting an arbitrary state into a single-component eigenstate of the observable:

|ψ⟩ = Σ_i c_i |φ_i⟩  →  |φ_j⟩,

where the arrow represents a measurement of the observable corresponding to the φ basis. For any single event, only one eigenvalue is measured, chosen randomly from among the possible values.

Meaning of the expansion coefficients

The complex coefficients c_i in the expansion of a quantum state in terms of eigenstates |φ_i⟩ can be written as a (complex) overlap of the corresponding eigenstate and the quantum state:

c_i = ⟨φ_i|ψ⟩.

They are called the probability amplitudes. The square modulus |c_i|² is the probability that a measurement of the observable yields the eigenstate |φ_i⟩. The sum of the probabilities over all possible outcomes must be one:

Σ_i |c_i|² = 1.
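The recipe above (expansion coefficients, Born-rule probabilities, collapse onto one eigenstate) can be illustrated numerically. The following sketch is not from the article; it uses the Pauli X matrix purely as an example observable.

```python
import numpy as np

rng = np.random.default_rng(0)

# State |psi> of a single qubit and an example observable (Pauli X).
psi = np.array([1, 0], dtype=complex)                # |0>
X = np.array([[0, 1], [1, 0]], dtype=complex)

eigenvalues, eigenvectors = np.linalg.eigh(X)        # eigenstates |phi_i> (columns)
amplitudes = eigenvectors.conj().T @ psi             # c_i = <phi_i|psi>
probabilities = np.abs(amplitudes) ** 2              # Born rule: P_i = |c_i|^2
assert np.isclose(probabilities.sum(), 1.0)          # probabilities sum to one

# Simulated collapse: pick one eigenstate at random with probability P_i,
# then replace |psi> by that eigenstate.
outcome = rng.choice(len(eigenvalues), p=probabilities)
psi_after = eigenvectors[:, outcome]

print("measured eigenvalue:", eigenvalues[outcome])
print("post-measurement state:", psi_after)
```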

As examples, individual counts in a double-slit experiment with electrons appear at random locations on the detector; after many counts are summed the distribution shows a wave interference pattern. In a Stern–Gerlach experiment with silver atoms, each particle appears in one of two areas unpredictably, but after many events the two areas accumulate equal numbers of counts.

This statistical aspect of quantum measurements differs fundamentally from classical mechanics. In quantum mechanics the only information we have about a system is its wave function and measurements of its wave function can only give statistical information.

Terminology

The two terms "reduction of the state vector" (or "state reduction" for short) and "wave function collapse" are used to describe the same concept. A quantum state is a mathematical description of a quantum system; a quantum state vector uses Hilbert space vectors for the description. Reduction of the state vector replaces the full state vector with a single eigenstate of the observable.

The term "wave function" is typically used for a different mathematical representation of the quantum state, one that uses spatial coordinates also called the "position representation". When the wave function representation is used, the "reduction" is called "wave function collapse".

The measurement problem

The Schrödinger equation describes quantum systems but does not describe their measurement. Solutions to the equation include all possible observable values for measurements, but measurements only result in one definite outcome. This difference is called the measurement problem of quantum mechanics. To predict measurement outcomes from quantum solutions, the orthodox interpretation of quantum theory postulates wave function collapse and uses the Born rule to compute the probable outcomes. Despite the widespread quantitative success of these postulates, scientists remain dissatisfied and have sought more detailed physical models. Rather than suspending the Schrödinger equation during the process of measurement, the measurement apparatus should be included and governed by the laws of quantum mechanics.

Physical approaches to collapse

Quantum theory offers no dynamical description of the "collapse" of the wave function. Viewed as a statistical theory, no description is expected. As Fuchs and Peres put it, "collapse is something that happens in our description of the system, not to the system itself".

Various interpretations of quantum mechanics attempt to provide a physical model for collapse. Three treatments of collapse can be found among the common interpretations. The first group includes hidden-variable theories like de Broglie–Bohm theory; here random outcomes only result from unknown values of hidden variables. Results from tests of Bell's theorem show that these variables would need to be non-local. The second group models measurement as quantum entanglement between the quantum state and the measurement apparatus. This results in a simulation of classical statistics called quantum decoherence. This group includes the many-worlds interpretation and consistent histories models. The third group postulates an additional, but as yet undetected, physical basis for the randomness; this group includes for example the objective-collapse interpretations. While models in all groups have contributed to better understanding of quantum theory, no alternative explanation for individual events has emerged as more useful than collapse followed by statistical prediction with the Born rule.

The significance ascribed to the wave function varies from interpretation to interpretation and even within an interpretation (such as the Copenhagen interpretation). If the wave function merely encodes an observer's knowledge of the universe, then the wave function collapse corresponds to the receipt of new information. This is somewhat analogous to the situation in classical physics, except that the classical "wave function" does not necessarily obey a wave equation. If the wave function is physically real, in some sense and to some extent, then the collapse of the wave function is also seen as a real process, to the same extent.

Quantum decoherence

Quantum decoherence explains why a system interacting with an environment transitions from being a pure state, exhibiting superpositions, to a mixed state, an incoherent combination of classical alternatives. This transition is fundamentally reversible, as the combined state of system and environment is still pure, but for all practical purposes irreversible in the same sense as in the second law of thermodynamics: the environment is a very large and complex quantum system, and it is not feasible to reverse their interaction. Decoherence is thus very important for explaining the classical limit of quantum mechanics, but cannot explain wave function collapse, as all classical alternatives are still present in the mixed state, and wave function collapse selects only one of them.
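A minimal numerical illustration of this point (not from the article): entangling a system qubit with a single environment qubit and tracing the environment out removes the off-diagonal coherences of the system's density matrix, yet both classical alternatives remain on the diagonal, i.e., no single outcome is selected.

```python
import numpy as np

# System qubit in superposition (|0> + |1>)/sqrt(2); its pure-state density
# matrix has nonzero off-diagonal "coherence" terms.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(np.outer(plus, plus.conj()))        # all four entries equal 0.5

# A CNOT-like coupling to an environment qubit produces the entangled joint
# state (|00> + |11>)/sqrt(2): system plus environment is still pure.
joint = np.zeros(4, dtype=complex)
joint[0b00] = joint[0b11] = 1 / np.sqrt(2)
rho_joint = np.outer(joint, joint.conj())

# Tracing out the environment leaves the system in a mixed state: the diagonal
# (the classical alternatives) survives, the off-diagonal coherences vanish.
rho_sys = rho_joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho_sys)                            # diag(0.5, 0.5), off-diagonals zero
```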

The form of decoherence known as environment-induced superselection proposes that when a quantum system interacts with the environment, the superpositions apparently reduce to mixtures of classical alternatives. The combined wave function of the system and environment continues to obey the Schrödinger equation throughout this apparent collapse. However, this is not enough to explain actual wave function collapse, as decoherence does not reduce the state to a single eigenstate.

History

The concept of wavefunction collapse was introduced by Werner Heisenberg in his 1927 paper on the uncertainty principle, "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik", and incorporated into the mathematical formulation of quantum mechanics by John von Neumann, in his 1932 treatise Mathematische Grundlagen der Quantenmechanik. Heisenberg did not try to specify exactly what the collapse of the wavefunction meant. However, he emphasized that it should not be understood as a physical process. Niels Bohr never mentions wave function collapse in his published work, but he repeatedly cautioned that we must give up a "pictorial representation". Despite the differences between Bohr and Heisenberg, their views are often grouped together as the "Copenhagen interpretation", of which wave function collapse is regarded as a key feature.

John von Neumann's influential 1932 work Mathematical Foundations of Quantum Mechanics took a more formal approach, developing an "ideal" measurement scheme that postulated that there were two processes of wave function change:

  1. The probabilistic, non-unitary, non-local, discontinuous change brought about by observation and measurement (state reduction or collapse).
  2. The deterministic, unitary, continuous time evolution of an isolated system that obeys the Schrödinger equation.

In 1957 Hugh Everett III proposed a model of quantum mechanics that dropped von Neumann's first postulate. Everett observed that the measurement apparatus was also a quantum system and its quantum interaction with the system under observation should determine the results. He proposed that the discontinuous change is instead a splitting of a wave function representing the universe. While Everett's approach rekindled interest in foundational quantum mechanics, it left core issues unresolved. Two key issues relate to the origin of the observed classical results: what causes quantum systems to appear classical, and what makes outcomes occur with the probabilities given by the Born rule.

Beginning in 1970, H. Dieter Zeh sought a detailed quantum decoherence model for the discontinuous change without postulating collapse. Further work by Wojciech H. Zurek in 1980 led eventually to a large number of papers on many aspects of the concept. Decoherence assumes that every quantum system interacts quantum mechanically with its environment and that such interaction is not separable from the system, a concept called an "open system". Decoherence has been shown to work very quickly and within a minimal environment, but as yet it has not succeeded in providing a detailed model replacing the collapse postulate of orthodox quantum mechanics.

By explicitly dealing with the interaction of object and measuring instrument, von Neumann described a quantum mechanical measurement scheme consistent with wave function collapse. However, he did not prove the necessity of such a collapse. Von Neumann's projection postulate was conceived based on experimental evidence available during the 1930s, in particular Compton scattering. Later work refined the notion of measurements into the more easily discussed first kind, which will give the same value when immediately repeated, and the second kind, which may give different values when repeated.

Naïve realism (psychology)

From Wikipedia, the free encyclopedia

In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.

Naïve realism provides a theoretical basis for several other cognitive biases, which are systematic errors when it comes to thinking and making decisions. These include the false consensus effect, actor–observer bias, bias blind spot, and fundamental attribution error, among others.

The term, as it is used in psychology today, was coined by social psychologist Lee Ross and his colleagues in the 1990s. It is related to the philosophical concept of naïve realism, which is the idea that our senses allow us to perceive objects directly and without any intervening processes. Social psychologists in the mid-20th century argued against this stance and proposed instead that perception is inherently subjective.

Several prominent social psychologists have studied naïve realism experimentally, including Lee Ross, Andrew Ward, Dale Griffin, Emily Pronin, Thomas Gilovich, Robert Robinson, and Dacher Keltner. In 2010, the Handbook of Social Psychology recognized naïve realism as one of "four hard-won insights about human perception, thinking, motivation and behavior that ... represent important, indeed foundational, contributions of social psychology."

Main assumptions

Lee Ross and fellow psychologist Andrew Ward have outlined three interrelated assumptions, or "tenets", that make up naïve realism. They argue that these assumptions are supported by a long line of thinking in social psychology, along with several empirical studies. According to their model, people:

  • Believe that they see the world objectively and without bias.
  • Expect that others will come to the same conclusions, so long as they are exposed to the same information and interpret it in a rational manner.
  • Assume that others who do not share the same views must be ignorant, irrational, or biased.

History of the concept

Naïve realism follows from a subjectivist tradition in modern social psychology, which traces its roots back to one of the field's founders, German-American psychologist Kurt Lewin. Lewin's ideas were strongly informed by Gestalt psychology, a 20th-century school of thought which focused on examining psychological phenomena in context, as parts of a whole.

From the 1920s through the 1940s, Lewin developed an approach for studying human behavior which he called field theory. Field theory proposes that a person's behavior is a function of the person and the environment. Lewin considered a person's psychological environment, or "life space", to be subjective and thus distinct from physical reality.

During this time period, subjectivist ideas also propagated throughout other areas of psychology. For example, the developmental psychologist Jean Piaget argued that children view the world through an egocentric lens, and they have trouble separating their own beliefs from the beliefs of others.

In the 1940s and 1950s, early pioneers in social psychology applied the subjectivist view to the field of social perception. In 1948, psychologists David Krech and Richard Crutchfield argued that people perceive and interpret the world according to their "own needs, own connotations, own personality, own previously formed cognitive patterns".

Social psychologist Gustav Ichheiser expanded on this idea, noting how biases in person perception lead to misunderstandings in social relations. According to Ichheiser, "We tend to resolve our perplexity arising out of the experience that other people see the world differently than we see it ourselves by declaring that these others, in consequence of some basic intellectual and moral defect, are unable to see things 'as they really are' and to react to them 'in a normal way'. We thus imply, of course, that things are in fact as we see them, and that our ways are the normal ways."

Solomon Asch, a prominent social psychologist who was also brought up in the Gestalt tradition, argued that people disagree because they base their judgments on different construals, or ways of looking at various issues. However, they are under the illusion that their judgments about the social world are objective. "This attitude, which has been aptly described as naive realism, sees no problem in the fact of perception or knowledge of the surroundings. Things are what they appear to be; they have just the qualities that they reveal to sight and touch," he wrote in his textbook Social Psychology in 1952. "This attitude does not, however, describe the actual conditions of our knowledge of the surroundings."

Experimental evidence

"They saw a game"

In a seminal study in social psychology, published in 1954, students from Dartmouth and Princeton watched a video of a heated football game between the two schools. Though they watched the same footage, fans from the two schools perceived the game very differently. The Princeton students "saw" the Dartmouth team make twice as many infractions as their own team, and twice as many infractions as the Dartmouth students saw. Dartmouth students viewed the game as evenly matched in violence, with both sides to blame. The study showed that the two groups perceived the same event subjectively: each side believed it saw the event objectively and that the other side's perception was blinded by bias.

False consensus effect

A 1977 study conducted by Ross and colleagues provided early evidence for a cognitive bias called the false consensus effect, which is the tendency for people to overestimate the extent to which others share the same views. This bias has been cited as supporting the first two tenets of naïve realism. In the study, students were asked whether they would wear a sandwich-board sign, which said "Eat At Joe's" on it, around campus. Then they were asked to indicate whether they thought other students were likely to wear the sign, and what they thought about students who were either willing to wear it or not. The researchers found that students who agreed to wear the sign thought that the majority of students would wear the sign, and they thought that refusing to wear the sign was more revealing of their peers' personal attributes. Conversely, students who declined to wear the sign thought that most other students would also refuse, and that accepting the invitation was more revealing of certain personality traits.

Hostile media effect

A phenomenon referred to as the hostile media effect demonstrates that partisans can view neutral events subjectively according to their own needs and values, and assume that those who interpret the event differently are biased. For a study in 1985, pro-Israeli and pro-Arab students were asked to watch real news coverage of the 1982 Sabra and Shatila massacre, a mass killing of Palestinian refugees (Vallone, Lee Ross and Lepper, 1985). Researchers found that partisans from both sides perceived the coverage as being biased in favor of the opposite viewpoint, and believed that the people in charge of the news program held the ideological views of the opposite side.

"Musical tapping" study

More empirical evidence for naïve realism came from psychologist Elizabeth Newton's "musical tapping study" in 1990. For the study, participants were designated either as "tappers" or as "listeners". The tappers were told to tap out the rhythm of a well-known song, while the "listeners" were asked to try to identify the song. While tappers expected that listeners would guess the tune around 50 percent of the time, the listeners were able to identify it only around 2.5 percent of the time. This provided support for a failure in perspective-taking on the side of the tappers, and an overestimation of the extent to which others would share in "hearing" the song as it was tapped.

Wall Street Game

In 2004, Ross, Liberman, and Samuels asked dorm resident advisors to nominate students to participate in a study, and to indicate whether those students were likely to cooperate or defect in the first round of the classic decision-making game called the Prisoner's Dilemma. The game was introduced to subjects in one of two ways: it was either referred to as the "Wall Street Game" or as the "Community Game". The researchers found that students in the "Community Game" condition were twice as likely to cooperate, and that it did not seem to make a difference whether students had previously been categorized as "cooperators" or "defectors". This experiment demonstrated that the game's label had more influence on how the students played than the subjects' personality traits did. Furthermore, the study showed that the dorm advisors did not make sufficient allowances for subjective interpretations of the game.

Consequences

Naïve realism causes people to exaggerate differences between themselves and others. Psychologists believe that it can spark and exacerbate conflict, as well as create barriers to negotiation through several different mechanisms.

Bias blind spot

One consequence of naïve realism is referred to as the bias blind spot, which is the ability to recognize cognitive and motivational biases in others while failing to recognize the impact of bias on the self. In a study conducted by Pronin, Lin, and Ross (2002), Stanford students completed a questionnaire about various biases in social judgment. The participants indicated how susceptible they thought they were to these biases compared to the average student. The researchers found that the participants consistently believed that they were less likely to be biased than their peers. In a follow-up study, students answered questions about their personal attributes (e.g. how considerate they were) compared to those of other students. The majority of students saw themselves as falling above average on most traits, which provided support for a cognitive bias known as the better-than-average effect. The students then were told that 70 to 80 percent of people fall prey to this bias. When asked about the accuracy of their self-assessments, 63 percent of the students argued that their ratings had been objective, while 13 percent of students indicated they thought their ratings had been too modest.

Fig. 1. Actual views (top), "circle's" perception of views (middle), "triangle's" perception of views (bottom). (Modeled after similar illustrations found in Robinson et al., 1995, and Ross & Ward, 1996.)

False polarization

When an individual does not share our views, the third tenet of naïve realism attributes this discrepancy to three possibilities. The individual either has been exposed to a different set of information, is lazy or unable to come to a rational conclusion, or is under a distorting influence such as bias or self-interest. This gives rise to a phenomenon called false polarization, which involves interpreting others' views as more extreme than they really are, and leads to a perception of greater intergroup differences (see Fig. 1). People assume that they perceive the issue objectively, carefully considering it from multiple views, while the other side processes information in top-down fashion. For instance, in a study conducted by Robinson et al. in 1996, pro-life and pro-choice partisans greatly overestimated the extremity of the views of the opposite side, and also overestimated the influence of ideology on others in their own group.

Reactive devaluation

The assumption that others' views are more extreme than they actually are can create a barrier to conflict resolution. In a sidewalk survey conducted in the 1980s, pedestrians evaluated a nuclear disarmament proposal (Stillinger et al., 1991). One group of participants was told that the proposal was made by American President Ronald Reagan, while others thought the proposal came from Soviet leader Mikhail Gorbachev. The researchers found that 90 percent of the participants who thought the proposal was from Reagan supported it, while only 44 percent in the Gorbachev group indicated their support. This provided support for a phenomenon called reactive devaluation, which involves dismissing a concession from an adversary on the assumption that the concession is either motivated by self-interest or less valuable.

Observer (quantum physics)

From Wikipedia, the free encyclopedia

Some interpretations of quantum mechanics posit a central role for an observer of a quantum phenomenon. The quantum mechanical observer is tied to the issue of observer effect, where a measurement necessarily requires interacting with the physical object being measured, affecting its properties through the interaction. The term "observable" has gained a technical meaning, denoting a Hermitian operator that represents a measurement.

Foundation

The theoretical foundation of the concept of measurement in quantum mechanics is a contentious issue deeply connected to the many interpretations of quantum mechanics. A key focus point is that of wave function collapse, for which several popular interpretations assert that measurement causes a discontinuous change into an eigenstate of the operator associated with the quantity that was measured, a change which is not time-reversible.

More explicitly, the superposition principle (ψ = Σ_n a_n ψ_n) of quantum physics dictates that for a wave function ψ, a measurement will leave the quantum system in a state corresponding to one of the m possible eigenvalues f_n, n = 1, 2, ..., m, of the operator F, which acts in the space spanned by the eigenfunctions ψ_n, n = 1, 2, ..., m.

Once one has measured the system, one knows its current state; and this prevents it from being in one of its other states ⁠— it has apparently decohered from them without prospects of future strong quantum interference. This means that the type of measurement one performs on the system affects the end-state of the system.

An experimentally studied situation related to this is the quantum Zeno effect, in which a quantum state would decay if left alone, but does not decay because of its continuous observation. The dynamics of a quantum system under continuous observation are described by a quantum stochastic master equation known as the Belavkin equation. Further studies have shown that even observing the results after the photon is produced leads to collapsing the wave function and loading a back-history as shown by delayed choice quantum eraser.

When discussing the wave function ψ which describes the state of a system in quantum mechanics, one should be cautious of a common misconception that assumes that the wave function ψ amounts to the same thing as the physical object it describes. This flawed concept must then require existence of an external mechanism, such as a measuring instrument, that lies outside the principles governing the time evolution of the wave function ψ, in order to account for the so-called "collapse of the wave function" after a measurement has been performed. But the wave function ψ is not a physical object like, for example, an atom, which has an observable mass, charge and spin, as well as internal degrees of freedom. Instead, ψ is an abstract mathematical function that contains all the statistical information that an observer can obtain from measurements of a given system. In this case, there is no real mystery in that this mathematical form of the wave function ψ must change abruptly after a measurement has been performed.

A consequence of Bell's theorem is that measurement on one of two entangled particles can appear to have a nonlocal effect on the other particle. Additional problems related to decoherence arise when the observer is modeled as a quantum system.

Description

The Copenhagen interpretation, which is the most widely accepted interpretation of quantum mechanics among physicists, posits that an "observer" or a "measurement" is merely a physical process. One of the founders of the Copenhagen interpretation, Werner Heisenberg, wrote:

Of course the introduction of the observer must not be misunderstood to imply that some kind of subjective features are to be brought into the description of nature. The observer has, rather, only the function of registering decisions, i.e., processes in space and time, and it does not matter whether the observer is an apparatus or a human being; but the registration, i.e., the transition from the "possible" to the "actual," is absolutely necessary here and cannot be omitted from the interpretation of quantum theory.

Niels Bohr, also a founder of the Copenhagen interpretation, wrote:

all unambiguous information concerning atomic objects is derived from the permanent marks such as a spot on a photographic plate, caused by the impact of an electron left on the bodies which define the experimental conditions. Far from involving any special intricacy, the irreversible amplification effects on which the recording of the presence of atomic objects rests rather remind us of the essential irreversibility inherent in the very concept of observation. The description of atomic phenomena has in these respects a perfectly objective character, in the sense that no explicit reference is made to any individual observer and that therefore, with proper regard to relativistic exigencies, no ambiguity is involved in the communication of information.

Likewise, Asher Peres stated that "observers" in quantum physics are

similar to the ubiquitous "observers" who send and receive light signals in special relativity. Obviously, this terminology does not imply the actual presence of human beings. These fictitious physicists may as well be inanimate automata that can perform all the required tasks, if suitably programmed.

Critics of the special role of the observer also point out that observers can themselves be observed, leading to paradoxes such as that of Wigner's friend; and that it is not clear how much consciousness is required. As John Bell inquired, "Was the wave function waiting to jump for thousands of millions of years until a single-celled living creature appeared? Or did it have to wait a little longer for some highly qualified measurer—with a PhD?"

Anthropocentric interpretation

The prominence of seemingly subjective or anthropocentric ideas like "observer" in the early development of the theory has been a continuing source of disquiet and philosophical dispute. A number of new-age religious or philosophical views give the observer a more special role, or place constraints on who or what can be an observer. As an example of such claims, Fritjof Capra declared, "The crucial feature of atomic physics is that the human observer is not only necessary to observe the properties of an object, but is necessary even to define these properties." There is no credible peer-reviewed research that backs such claims.

Confusion with uncertainty principle

The uncertainty principle has been frequently confused with the observer effect, evidently even by its originator, Werner Heisenberg. The uncertainty principle in its standard form describes how precisely it is possible to measure the position and momentum of a particle at the same time. If the precision in measuring one quantity is increased, the precision in measuring the other decreases.

An alternative version of the uncertainty principle, more in the spirit of an observer effect, fully accounts for the disturbance the observer has on a system and the error incurred, although this is not how the term "uncertainty principle" is most commonly used in practice.

Mind–body problem

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Mind%E2%80%93body_problem
...