
Friday, July 4, 2025

Quantum foundations

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Quantum_foundations

Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, reformulate it and even propose new generalizations thereof. Unlike other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition. While they lead to the right experimental predictions, they do not come with a mental picture of the world in which they fit.

There exist different approaches to resolve this conceptual gap:

  • First, one can contrast quantum physics with classical physics: by identifying scenarios, such as Bell experiments, where quantum theory radically deviates from classical predictions, one hopes to gain physical insight into the structure of quantum physics.
  • Second, one can attempt to find a re-derivation of the quantum formalism in terms of operational axioms.
  • Third, one can search for a full correspondence between the mathematical elements of the quantum framework and physical phenomena: any such correspondence is called an interpretation.
  • Fourth, one can renounce quantum theory altogether and propose a different model of the world.

Research in quantum foundations is structured along these roads.

Non-classical features of quantum theory

Quantum nonlocality

Two or more separate parties conducting measurements over a quantum state can observe correlations which cannot be explained with any local hidden variable theory. Whether this should be regarded as proving that the physical world itself is "nonlocal" is a topic of debate, but the terminology of "quantum nonlocality" is commonplace. Nonlocality research efforts in quantum foundations focus on determining the exact limits that classical or quantum physics enforces on the correlations observed in a Bell experiment or more complex causal scenarios. This research program has so far provided a generalization of Bell's theorem that allows falsifying all classical theories with a superluminal, yet finite, hidden influence.
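As a minimal numerical sketch of such a Bell (CHSH) experiment, the Python fragment below (using NumPy, with the standard optimal measurement settings for the singlet state as an illustrative choice) evaluates the CHSH combination of correlators and obtains 2√2 ≈ 2.83, above the bound of 2 obeyed by every local hidden variable theory.

import numpy as np

# Pauli matrices and the two-qubit singlet state, (|01> - |10>)/sqrt(2).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(A, B, state):
    # Expectation value <A (x) B> in the given two-qubit state.
    return np.real(state.conj() @ np.kron(A, B) @ state)

# Measurement settings that maximize the CHSH value for the singlet.
A0, A1 = Z, X
B0 = -(Z + X) / np.sqrt(2)
B1 = (X - Z) / np.sqrt(2)

S = (corr(A0, B0, singlet) + corr(A0, B1, singlet)
     + corr(A1, B0, singlet) - corr(A1, B1, singlet))
print(S)  # ~2.828 = 2*sqrt(2), above the local hidden variable bound of 2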

Quantum contextuality

Nonlocality can be understood as an instance of quantum contextuality. A situation is contextual when the value of an observable depends on the context in which it is measured (namely, on which other observables are being measured as well). The original definition of measurement contextuality can be extended to state preparations and even general physical transformations.

Epistemic models for the quantum wave-function

A physical property is epistemic when it represents our knowledge or beliefs about the value of a second, more fundamental feature. The probability that an event occurs is an example of an epistemic property. In contrast, a non-epistemic or ontic variable captures the notion of a “real” property of the system under consideration.

There is an ongoing debate on whether the wave-function represents the epistemic state of a yet-to-be-discovered ontic variable or whether, on the contrary, it is a fundamental entity. Under some physical assumptions, the Pusey–Barrett–Rudolph (PBR) theorem demonstrates the inconsistency of quantum states as epistemic states, in the sense above. Note that, in QBism and Copenhagen-type views, quantum states are still regarded as epistemic, not with respect to some ontic variable, but with respect to one's expectations about future experimental outcomes. The PBR theorem does not exclude such epistemic views on quantum states.

Axiomatic reconstructions

Some of the counter-intuitive aspects of quantum theory, as well as the difficulty of extending it, follow from the fact that its defining axioms lack a physical motivation. An active area of research in quantum foundations is therefore to find alternative formulations of quantum theory which rely on physically compelling principles. Those efforts come in two flavors, depending on the desired level of description of the theory: the so-called Generalized Probabilistic Theories approach and the Black boxes approach.

The framework of generalized probabilistic theories

Generalized Probabilistic Theories (GPTs) are a general framework to describe the operational features of arbitrary physical theories. Essentially, they provide a statistical description of any experiment combining state preparations, transformations and measurements. The framework of GPTs can accommodate classical and quantum physics, as well as hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement or teleportation. Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.

L. Hardy introduced the concept of GPT in 2001, in an attempt to re-derive quantum theory from basic physical principles. Although Hardy's work was very influential (see the follow-ups below), one of his axioms was regarded as unsatisfactory: it stipulated that, of all the physical theories compatible with the rest of the axioms, one should choose the simplest one. The work of Dakic and Brukner eliminated this “axiom of simplicity” and provided a reconstruction of quantum theory based on three physical principles. This was followed by the more rigorous reconstruction of Masanes and Müller.

Axioms common to these three reconstructions are:

  • The subspace axiom: systems which can store the same amount of information are physically equivalent.
  • Local tomography: to characterize the state of a composite system, it is enough to conduct measurements on each of its parts.
  • Reversibility: for any two extremal states [i.e., states which are not statistical mixtures of other states], there exists a reversible physical transformation that maps one into the other.

An alternative GPT reconstruction, proposed by Chiribella, D'Ariano and Perinotti around the same time, is also based on the

  • Purification axiom: for any state ρA of a physical system A there exists a bipartite physical system AB and an extremal state (or purification) |ψ⟩AB such that ρA is the restriction of |ψ⟩AB to system A. In addition, any two such purifications of ρA can be mapped into one another via a reversible physical transformation on system B.

The use of purification to characterize quantum theory has been criticized on the grounds that it also applies in the Spekkens toy model.
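As an illustrative sketch of the purification axiom in the simplest case, the fragment below takes an arbitrary example mixed qubit state ρA, builds a purification from its spectral decomposition, and checks that tracing out the auxiliary system B recovers ρA.

import numpy as np

rho_A = np.array([[0.7, 0.0], [0.0, 0.3]], dtype=complex)  # an example mixed state of system A

# Spectral decomposition rho_A = sum_k p_k |k><k| gives the purification
# |psi>_AB = sum_k sqrt(p_k) |k>_A |k>_B.
p, vecs = np.linalg.eigh(rho_A)
psi_AB = sum(np.sqrt(p[k]) * np.kron(vecs[:, k], np.eye(2)[k]) for k in range(2))

# Tracing out system B recovers rho_A.
rho_full = np.outer(psi_AB, psi_AB.conj()).reshape(2, 2, 2, 2)
rho_A_recovered = np.trace(rho_full, axis1=1, axis2=3)
print(np.allclose(rho_A_recovered, rho_A))  # True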

Against the success of the GPT approach it can be objected that all such works recover only finite-dimensional quantum theory. In addition, none of the previous axioms can be experimentally falsified unless the measurement apparatuses are assumed to be tomographically complete.

Categorical quantum mechanics or process theories

Categorical Quantum Mechanics (CQM) or Process Theories are a general framework to describe physical theories, with an emphasis on processes and their compositions. It was pioneered by Samson Abramsky and Bob Coecke. Besides its influence in quantum foundations, most notably the use of a diagrammatic formalism, CQM also plays an important role in quantum technologies, most notably in the form of the ZX-calculus. It has also been used to model theories outside of physics, for example the DisCoCat compositional natural language meaning model.

The framework of black boxes

In the black box or device-independent framework, an experiment is regarded as a black box where the experimentalist introduces an input (the type of experiment) and obtains an output (the outcome of the experiment). Experiments conducted by two or more parties in separate labs are hence described by their statistical correlations alone.

From Bell's theorem, we know that classical and quantum physics predict different sets of allowed correlations. It is expected, therefore, that far-from-quantum physical theories should predict correlations beyond the quantum set. In fact, there exist instances of theoretical non-quantum correlations which, a priori, do not seem physically implausible. The aim of device-independent reconstructions is to show that all such supra-quantum examples are precluded by a reasonable physical principle.

The physical principles proposed so far include no-signalling, Non-Trivial Communication Complexity, No-Advantage for Nonlocal Computation, Information Causality, Macroscopic Locality, and Local Orthogonality. All these principles limit the set of possible correlations in non-trivial ways. Moreover, they are all device-independent: this means that they can be falsified under the assumption that we can decide whether two or more events are space-like separated. The drawback of the device-independent approach is that, even when taken together, all the aforementioned physical principles do not suffice to single out the set of quantum correlations. In other words, all such reconstructions are partial.
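To illustrate what a supra-quantum correlation looks like, the sketch below implements the Popescu–Rohrlich (PR) box, a hypothetical no-signalling correlation that reaches the algebraic maximum CHSH value of 4, beyond the quantum (Tsirelson) bound of 2√2; principles such as Information Causality are designed to exclude precisely this kind of behavior.

import itertools

def pr_box(a, b, x, y):
    # P(a, b | x, y): outputs a, b in {0, 1} satisfy a XOR b = x AND y, uniformly at random.
    return 0.5 if (a ^ b) == (x & y) else 0.0

def E(x, y):
    # Correlator <A_x B_y> with outcomes mapped to +1/-1.
    return sum((-1) ** (a ^ b) * pr_box(a, b, x, y)
               for a, b in itertools.product((0, 1), repeat=2))

S = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)
print(S)  # 4.0: allowed by no-signalling alone, but excluded by quantum theory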

Interpretations of quantum theory

An interpretation of quantum theory is a correspondence between the elements of its mathematical formalism and physical phenomena. For instance, in the pilot wave theory, the quantum wave function is interpreted as a field that guides the particle trajectory and evolves with it via a system of coupled differential equations. Most interpretations of quantum theory stem from the desire to solve the quantum measurement problem.

Extensions of quantum theory

In an attempt to reconcile quantum and classical physics, or to identify non-classical models with a dynamical causal structure, some modifications of quantum theory have been proposed.

Collapse models

Collapse models posit the existence of natural processes which periodically localize the wave-function. Such theories provide an explanation for the nonexistence of superpositions of macroscopic objects, at the cost of abandoning unitarity and exact energy conservation.

Quantum measure theory

In Sorkin's quantum measure theory (QMT), physical systems are not modeled via unitary rays and Hermitian operators, but through a single matrix-like object, the decoherence functional. The entries of the decoherence functional determine the feasibility of experimentally discriminating between two or more different sets of classical histories, as well as the probabilities of each experimental outcome. In some models of QMT the decoherence functional is further constrained to be positive semidefinite (strong positivity). Even under the assumption of strong positivity, there exist models of QMT which generate stronger-than-quantum Bell correlations.

Acausal quantum processes

The formalism of process matrices starts from the observation that, given the structure of quantum states, the set of feasible quantum operations follows from positivity considerations. Namely, for any linear map from states to probabilities one can find a physical system where this map corresponds to a physical measurement. Likewise, any linear transformation that maps composite states to states corresponds to a valid operation in some physical system. In view of this trend, it is reasonable to postulate that any high-order map from quantum instruments (namely, measurement processes) to probabilities should also be physically realizable. Any such map is termed a process matrix. As shown by Oreshkov et al., some process matrices describe situations where the notion of global causality breaks.

The starting point of this claim is the following thought experiment: two parties, Alice and Bob, enter a building and end up in separate rooms. The rooms have ingoing and outgoing channels from which a quantum system periodically enters and leaves the room. While those systems are inside the rooms, Alice and Bob are able to interact with them in any way; in particular, they can measure some of their properties.

Since Alice and Bob's interactions can be modeled by quantum instruments, the statistics they observe when they apply one instrument or another are given by a process matrix. As it turns out, there exist process matrices which would guarantee that the measurement statistics collected by Alice and Bob are incompatible with Alice interacting with her system at the same time as Bob, before Bob, or after Bob, or with any convex combination of these three situations. Such processes are called acausal.

Wave function collapse

From Wikipedia, the free encyclopedia
Particle impacts during a double-slit experiment. The total interference pattern represents the original wave function, while each particle impact represents an individual wave function collapse.

In various interpretations of quantum mechanics, wave function collapse, also called reduction of the state vector, occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.

In the Copenhagen interpretation, wave function collapse connects quantum to classical models, with a special role for the observer. By contrast, objective-collapse theories propose an origin in physical processes. In the many-worlds interpretation, collapse does not exist; all wave function outcomes occur, while quantum decoherence accounts for the appearance of collapse.

Historically, Werner Heisenberg was the first to use the idea of wave function reduction to explain quantum measurement.

Mathematical description

In quantum mechanics each measurable physical quantity of a quantum system is called an observable which, for example, could be the position x and the momentum p, but also the energy E, components of spin (sz), and so on. The observable acts as a linear function on the states of the system; its eigenvectors correspond to the quantum states (i.e. eigenstates) and its eigenvalues to the possible values of the observable. The collection of eigenstate/eigenvalue pairs represents all possible values of the observable. Writing |φi⟩ for an eigenstate and ci for the corresponding complex coefficient, any arbitrary state of the quantum system can be expressed as a vector using bra–ket notation: |ψ⟩ = Σi ci|φi⟩. The kets |φi⟩ specify the different available quantum "alternatives", i.e., particular quantum states.

The wave function is a specific representation of a quantum state. Wave functions can therefore always be expressed as eigenstates of an observable, though the converse is not necessarily true.

Collapse

To account for the experimental result that repeated measurements of a quantum system give the same results, the theory postulates a "collapse" or "reduction of the state vector" upon observation, abruptly converting an arbitrary state into a single-component eigenstate of the observable:

|ψ⟩ = Σi ci|φi⟩ → |φk⟩

where the arrow represents a measurement of the observable corresponding to the |φi⟩ basis, yielding the eigenvalue associated with |φk⟩. For any single event, only one eigenvalue is measured, chosen randomly from among the possible values.

Meaning of the expansion coefficients

The complex coefficients {ci} in the expansion of a quantum state in terms of eigenstates |φi⟩ can be written as the (complex) overlap of the corresponding eigenstate and the quantum state: ci = ⟨φi|ψ⟩. They are called the probability amplitudes. The square modulus |ci|² is the probability that a measurement of the observable yields the eigenstate |φi⟩. The sum of the probabilities over all possible outcomes must be one: Σi |ci|² = 1.
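A minimal sketch of these two postulates, assuming an illustrative three-outcome observable with amplitudes chosen by hand: the Born rule turns the amplitudes into outcome probabilities, one outcome is sampled, and the state collapses to the corresponding eigenstate, so that an immediate repetition returns the same result.

import numpy as np

rng = np.random.default_rng(0)

c = np.array([0.6, 0.8j, 0.0])           # probability amplitudes c_i = <phi_i|psi>
probs = np.abs(c) ** 2                    # Born rule: P(i) = |c_i|^2
assert np.isclose(probs.sum(), 1.0)       # the amplitudes are normalized

i = rng.choice(len(c), p=probs / probs.sum())  # one measurement event picks one outcome
post_state = np.zeros_like(c)
post_state[i] = 1.0                       # collapse: the state becomes the eigenstate |phi_i>

# Immediately repeating the same measurement now returns outcome i with certainty.
print(i, np.abs(post_state) ** 2)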

As examples, individual counts in a double-slit experiment with electrons appear at random locations on the detector; after many counts are summed, the distribution shows a wave interference pattern. In a Stern–Gerlach experiment with silver atoms, each particle appears in one of two areas unpredictably, but over many events the two areas accumulate equal numbers of counts.

This statistical aspect of quantum measurements differs fundamentally from classical mechanics. In quantum mechanics the only information we have about a system is its wave function and measurements of its wave function can only give statistical information.

Terminology

The two terms "reduction of the state vector" (or "state reduction" for short) and "wave function collapse" are used to describe the same concept. A quantum state is a mathematical description of a quantum system; a quantum state vector uses Hilbert space vectors for the description. Reduction of the state vector replaces the full state vector with a single eigenstate of the observable.

The term "wave function" is typically used for a different mathematical representation of the quantum state, one that uses spatial coordinates also called the "position representation". When the wave function representation is used, the "reduction" is called "wave function collapse".

The measurement problem

The Schrödinger equation describes quantum systems but does not describe their measurement. Solutions to the equation include all possible observable values for measurements, but measurements only result in one definite outcome. This difference is called the measurement problem of quantum mechanics. To predict measurement outcomes from quantum solutions, the orthodox interpretation of quantum theory postulates wave function collapse and uses the Born rule to compute the probable outcomes. Despite the widespread quantitative success of these postulates, scientists remain dissatisfied and have sought more detailed physical models. Rather than suspending the Schrödinger equation during the process of measurement, the measurement apparatus should be included and governed by the laws of quantum mechanics.

Physical approaches to collapse

Quantum theory offers no dynamical description of the "collapse" of the wave function. Viewed as a statistical theory, no description is expected. As Fuchs and Peres put it, "collapse is something that happens in our description of the system, not to the system itself".

Various interpretations of quantum mechanics attempt to provide a physical model for collapse. Three treatments of collapse can be found among the common interpretations. The first group includes hidden-variable theories like de Broglie–Bohm theory; here random outcomes only result from unknown values of hidden variables. Results from tests of Bell's theorem show that these variables would need to be non-local. The second group models measurement as quantum entanglement between the quantum state and the measurement apparatus. This results in a simulation of classical statistics called quantum decoherence. This group includes the many-worlds interpretation and consistent-histories models. The third group postulates an additional, but as yet undetected, physical basis for the randomness; this group includes, for example, the objective-collapse interpretations. While models in all groups have contributed to better understanding of quantum theory, no alternative explanation for individual events has emerged as more useful than collapse followed by statistical prediction with the Born rule.

The significance ascribed to the wave function varies from interpretation to interpretation and even within an interpretation (such as the Copenhagen interpretation). If the wave function merely encodes an observer's knowledge of the universe, then the wave function collapse corresponds to the receipt of new information. This is somewhat analogous to the situation in classical physics, except that the classical "wave function" does not necessarily obey a wave equation. If the wave function is physically real, in some sense and to some extent, then the collapse of the wave function is also seen as a real process, to the same extent.

Quantum decoherence

Quantum decoherence explains why a system interacting with an environment transitions from being a pure state, exhibiting superpositions, to a mixed state, an incoherent combination of classical alternatives. This transition is fundamentally reversible, as the combined state of system and environment is still pure, but for all practical purposes irreversible in the same sense as in the second law of thermodynamics: the environment is a very large and complex quantum system, and it is not feasible to reverse their interaction. Decoherence is thus very important for explaining the classical limit of quantum mechanics, but cannot explain wave function collapse, as all classical alternatives are still present in the mixed state, and wave function collapse selects only one of them.
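A small sketch of this point: entangling a qubit that starts in a superposition with a single 'environment' qubit and then tracing out the environment yields a mixed state whose off-diagonal coherences vanish, while both classical alternatives remain present (the two-qubit model is an illustrative simplification of a large environment).

import numpy as np

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)     # system in a pure superposition
rho_pure = np.outer(plus, plus.conj())
print(rho_pure.round(2))         # off-diagonal terms 0.5: coherence present

# The environment "records" the system state: (|0>|e0> + |1>|e1>)/sqrt(2) with orthogonal e0, e1.
e0, e1 = np.eye(2, dtype=complex)
joint = (np.kron(np.array([1, 0], dtype=complex), e0)
         + np.kron(np.array([0, 1], dtype=complex), e1)) / np.sqrt(2)

rho_joint = np.outer(joint, joint.conj()).reshape(2, 2, 2, 2)
rho_system = np.trace(rho_joint, axis1=1, axis2=3)       # trace out the environment
print(rho_system.round(2))       # diagonal 0.5, 0.5: a mixed state, coherence gone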

The form of decoherence known as environment-induced superselection proposes that when a quantum system interacts with the environment, the superpositions apparently reduce to mixtures of classical alternatives. The combined wave function of the system and environment continues to obey the Schrödinger equation throughout this apparent collapse. More importantly, this is not enough to explain actual wave function collapse, as decoherence does not reduce the state to a single eigenstate.

History

The concept of wavefunction collapse was introduced by Werner Heisenberg in his 1927 paper on the uncertainty principle, "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik", and incorporated into the mathematical formulation of quantum mechanics by John von Neumann, in his 1932 treatise Mathematische Grundlagen der Quantenmechanik. Heisenberg did not try to specify exactly what the collapse of the wavefunction meant. However, he emphasized that it should not be understood as a physical process. Niels Bohr never mentions wave function collapse in his published work, but he repeatedly cautioned that we must give up a "pictorial representation". Despite the differences between Bohr and Heisenberg, their views are often grouped together as the "Copenhagen interpretation", of which wave function collapse is regarded as a key feature.

John von Neumann's influential 1932 work Mathematical Foundations of Quantum Mechanics took a more formal approach, developing an "ideal" measurement scheme that postulated that there were two processes of wave function change:

  1. The probabilistic, non-unitary, non-local, discontinuous change brought about by observation and measurement (state reduction or collapse).
  2. The deterministic, unitary, continuous time evolution of an isolated system that obeys the Schrödinger equation.

In 1957 Hugh Everett III proposed a model of quantum mechanics that dropped von Neumann's first postulate. Everett observed that the measurement apparatus was also a quantum system and that its quantum interaction with the system under observation should determine the results. He proposed that the discontinuous change is instead a splitting of a wave function representing the universe. While Everett's approach rekindled interest in foundational quantum mechanics, it left core issues unresolved. Two key issues relate to the origin of the observed classical results: what causes quantum systems to appear classical, and what causes them to resolve with the probabilities given by the Born rule.

Beginning in 1970, H. Dieter Zeh sought a detailed quantum decoherence model for the discontinuous change without postulating collapse. Further work by Wojciech H. Zurek in 1980 led eventually to a large number of papers on many aspects of the concept. Decoherence assumes that every quantum system interacts quantum mechanically with its environment and that such interaction is not separable from the system, a concept called an "open system". Decoherence has been shown to work very quickly and within a minimal environment, but as yet it has not succeeded in providing a detailed model replacing the collapse postulate of orthodox quantum mechanics.

By explicitly dealing with the interaction of object and measuring instrument, von Neumann described a quantum mechanical measurement scheme consistent with wave function collapse. However, he did not prove the necessity of such a collapse. Von Neumann's projection postulate was conceived based on experimental evidence available during the 1930s, in particular Compton scattering. Later work refined the notion of measurements into measurements of the first kind, which give the same value when immediately repeated, and measurements of the second kind, which give different values when repeated.

Naïve realism (psychology)

From Wikipedia, the free encyclopedia

In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.

Naïve realism provides a theoretical basis for several other cognitive biases, which are systematic errors when it comes to thinking and making decisions. These include the false consensus effect, actor–observer bias, bias blind spot, and fundamental attribution error, among others.

The term, as it is used in psychology today, was coined by social psychologist Lee Ross and his colleagues in the 1990s. It is related to the philosophical concept of naïve realism, which is the idea that our senses allow us to perceive objects directly and without any intervening processes. Social psychologists in the mid-20th century argued against this stance and proposed instead that perception is inherently subjective.

Several prominent social psychologists have studied naïve realism experimentally, including Lee Ross, Andrew Ward, Dale Griffin, Emily Pronin, Thomas Gilovich, Robert Robinson, and Dacher Keltner. In 2010, the Handbook of Social Psychology recognized naïve realism as one of "four hard-won insights about human perception, thinking, motivation and behavior that ... represent important, indeed foundational, contributions of social psychology."

Main assumptions

Lee Ross and fellow psychologist Andrew Ward have outlined three interrelated assumptions, or "tenets", that make up naïve realism. They argue that these assumptions are supported by a long line of thinking in social psychology, along with several empirical studies. According to their model, people:

  • Believe that they see the world objectively and without bias.
  • Expect that others will come to the same conclusions, so long as they are exposed to the same information and interpret it in a rational manner.
  • Assume that others who do not share the same views must be ignorant, irrational, or biased.

History of the concept

Naïve realism follows from a subjectivist tradition in modern social psychology, which traces its roots back to one of the field's founders, German-American psychologist Kurt Lewin. Lewin's ideas were strongly informed by Gestalt psychology, a 20th-century school of thought which focused on examining psychological phenomena in context, as parts of a whole.

From the 1920s through the 1940s, Lewin developed an approach for studying human behavior which he called field theory. Field theory proposes that a person's behavior is a function of the person and the environment. Lewin considered a person's psychological environment, or "life space", to be subjective and thus distinct from physical reality.

During this time period, subjectivist ideas also propagated throughout other areas of psychology. For example, the developmental psychologist Jean Piaget argued that children view the world through an egocentric lens, and they have trouble separating their own beliefs from the beliefs of others.

In the 1940s and 1950s, early pioneers in social psychology applied the subjectivist view to the field of social perception. In 1948, psychologists David Kretch and Richard Krutchfield argued that people perceive and interpret the world according to their "own needs, own connotations, own personality, own previously formed cognitive patterns".

Social psychologist Gustav Ichheiser expanded on this idea, noting how biases in person perception lead to misunderstandings in social relations. According to Ichheiser, "We tend to resolve our perplexity arising out of the experience that other people see the world differently than we see it ourselves by declaring that these others, in consequence of some basic intellectual and moral defect, are unable to see things 'as they really are' and to react to them 'in a normal way'. We thus imply, of course, that things are in fact as we see them, and that our ways are the normal ways."

Solomon Asch, a prominent social psychologist who was also brought up in the Gestalt tradition, argued that people disagree because they base their judgments on different construals, or ways of looking at various issues. However, they are under the illusion that their judgments about the social world are objective. "This attitude, which has been aptly described as naive realism, sees no problem in the fact of perception or knowledge of the surroundings. Things are what they appear to be; they have just the qualities that they reveal to sight and touch," he wrote in his textbook Social Psychology in 1952. "This attitude, does not, however, describe the actual conditions of our knowledge of the surroundings."

Experimental evidence

"They saw a game"

In a seminal study in social psychology, which was published in a paper in 1954, students from Dartmouth and Princeton watched a video of a heated football game between the two schools. Though they looked at the same footage, fans from both schools perceived the game very differently. The Princeton students "saw" the Dartmouth team make twice as many infractions as their own team, and twice as many infractions as the Dartmouth students saw. Dartmouth students viewed the game as evenly matched in violence, with both sides to blame. This study revealed that the two groups perceived the same event subjectively. Each group believed that it saw the event objectively and that the other side's perception of the event was blinded by bias.

False consensus effect

A 1977 study conducted by Ross and colleagues provided early evidence for a cognitive bias called the false consensus effect, which is the tendency for people to overestimate the extent to which others share the same views. This bias has been cited as supporting the first two tenets of naïve realism. In the study, students were asked whether they would wear a sandwich-board sign, which said "Eat At Joe's" on it, around campus. Then they were asked to indicate whether they thought other students were likely to wear the sign, and what they thought about students who were either willing to wear it or not. The researchers found that students who agreed to wear the sign thought that the majority of students would wear the sign, and they thought that refusing to wear the sign was more revealing of their peers' personal attributes. Conversely, students who declined to wear the sign thought that most other students would also refuse, and that accepting the invitation was more revealing of certain personality traits.

Hostile media effect

A phenomenon referred to as the hostile media effect demonstrates that partisans can view neutral events subjectively according to their own needs and values, and assume that those who interpret the event differently are biased. For a study in 1985, pro-Israeli and pro-Arab students were asked to watch real news coverage of the 1982 Sabra and Shatila massacre, a mass killing of Palestinian refugees (Vallone, Lee Ross and Lepper, 1985). Researchers found that partisans from both sides perceived the coverage as being biased in favor of the opposite viewpoint, and believed that the people in charge of the news program held the ideological views of the opposite side.

"Musical tapping" study

More empirical evidence for naïve realism came from psychologist Elizabeth Newton's "musical tapping study" in 1990. For the study, participants were designated either as "tappers" or as "listeners". The tappers were told to tap out the rhythm of a well-known song, while the "listeners" were asked to try to identify the song. While tappers expected that listeners would guess the tune around 50 percent of the time, the listeners were able to identify it only around 2.5 percent of the time. This provided support for a failure in perspective-taking on the side of the tappers, and an overestimation of the extent to which others would share in "hearing" the song as it was tapped.

Wall Street Game

In 2004, Ross, Liberman, and Samuels asked dorm resident advisors to nominate students to participate in a study, and to indicate whether those students were likely to cooperate or defect in the first round of the classic decision-making game known as the Prisoner's Dilemma. The game was introduced to subjects in one of two ways: it was either referred to as the "Wall Street Game" or as the "Community Game". The researchers found that students in the "Community Game" condition were twice as likely to cooperate, and that it did not seem to make a difference whether students had previously been categorized as "cooperators" or "defectors". This experiment demonstrated that the game's label had a stronger effect on how the students played than the students' personality traits did. Furthermore, the study showed that the dorm advisors did not make sufficient allowances for subjective interpretations of the game.

Consequences

Naïve realism causes people to exaggerate differences between themselves and others. Psychologists believe that it can spark and exacerbate conflict, as well as create barriers to negotiation through several different mechanisms.

Bias blind spot

One consequence of naïve realism is referred to as the bias blind spot, the tendency to recognize cognitive and motivational biases in others while failing to recognize the impact of bias on the self. In a study conducted by Pronin, Lin, and Ross (2002), Stanford students completed a questionnaire about various biases in social judgment. The participants indicated how susceptible they thought they were to these biases compared to the average student. The researchers found that the participants consistently believed that they were less likely to be biased than their peers. In a follow-up study, students answered questions about their personal attributes (e.g. how considerate they were) compared to those of other students. The majority of students saw themselves as falling above average on most traits, which provided support for a cognitive bias known as the better-than-average effect. The students then were told that 70 to 80 percent of people fall prey to this bias. When asked about the accuracy of their self-assessments, 63 percent of the students argued that their ratings had been objective, while 13 percent of students indicated they thought their ratings had been too modest.

Fig. 1. Actual views (top), "circle's" perception of views (middle), "triangle's" perception of views (bottom). (Modeled after similar illustrations found in Robinson et al., 1995, and Ross & Ward, 1996.)

False polarization

When an individual does not share our views, the third tenet of naïve realism attributes this discrepancy to three possibilities: the individual has been exposed to a different set of information, is lazy or unable to come to a rational conclusion, or is under a distorting influence such as bias or self-interest. This gives rise to a phenomenon called false polarization, which involves interpreting others' views as more extreme than they really are, and leads to a perception of greater intergroup differences (see Fig. 1). People assume that they perceive the issue objectively, carefully considering it from multiple views, while the other side processes information in a top-down fashion. For instance, in a study conducted by Robinson et al. in 1996, pro-life and pro-choice partisans greatly overestimated the extremity of the views of the opposite side, and also overestimated the influence of ideology on others in their own group.

Reactive devaluation

The assumption that others' views are more extreme than they really are can create a barrier to conflict resolution. In a sidewalk survey conducted in the 1980s, pedestrians evaluated a nuclear disarmament proposal (Stillinger et al., 1991). One group of participants was told that the proposal was made by American President Ronald Reagan, while others were told that the proposal came from Soviet leader Mikhail Gorbachev. The researchers found that 90 percent of the participants who thought the proposal was from Reagan supported it, while only 44 percent in the Gorbachev group indicated their support. This provided support for a phenomenon called reactive devaluation, which involves dismissing a concession from an adversary on the assumption that the concession is either motivated by self-interest or of lesser value.

Observer (quantum physics)

From Wikipedia, the free encyclopedia

Some interpretations of quantum mechanics posit a central role for an observer of a quantum phenomenon. The quantum mechanical observer is tied to the issue of observer effect, where a measurement necessarily requires interacting with the physical object being measured, affecting its properties through the interaction. The term "observable" has gained a technical meaning, denoting a Hermitian operator that represents a measurement.

Foundation

The theoretical foundation of the concept of measurement in quantum mechanics is a contentious issue deeply connected to the many interpretations of quantum mechanics. A key focus point is that of wave function collapse, for which several popular interpretations assert that measurement causes a discontinuous change into an eigenstate of the operator associated with the quantity that was measured, a change which is not time-reversible.

More explicitly, the superposition principle (ψ = Σn an ψn) of quantum physics dictates that for a wave function ψ, a measurement will leave the quantum system in one of the eigenfunctions ψn, n = 1, 2, ..., m, of the operator F̂, yielding one of the m possible eigenvalues fn, n = 1, 2, ..., m.

Once one has measured the system, one knows its current state; and this prevents it from being in one of its other states ⁠— it has apparently decohered from them without prospects of future strong quantum interference. This means that the type of measurement one performs on the system affects the end-state of the system.

An experimentally studied situation related to this is the quantum Zeno effect, in which a quantum state would decay if left alone, but does not decay because of its continuous observation. The dynamics of a quantum system under continuous observation are described by a quantum stochastic master equation known as the Belavkin equation. Further studies have shown that even observing the results after the photon is produced leads to collapse of the wave function and a retroactively determined history, as shown by the delayed-choice quantum eraser.

When discussing the wave function ψ which describes the state of a system in quantum mechanics, one should be cautious of a common misconception that the wave function ψ amounts to the same thing as the physical object it describes. This flawed view then requires the existence of an external mechanism, such as a measuring instrument, lying outside the principles governing the time evolution of the wave function ψ, in order to account for the so-called "collapse of the wave function" after a measurement has been performed. But the wave function ψ is not a physical object like, for example, an atom, which has an observable mass, charge and spin, as well as internal degrees of freedom. Instead, ψ is an abstract mathematical function that contains all the statistical information that an observer can obtain from measurements of a given system. In this case, there is no real mystery in the fact that this mathematical form of the wave function ψ must change abruptly after a measurement has been performed.

A consequence of Bell's theorem is that measurement on one of two entangled particles can appear to have a nonlocal effect on the other particle. Additional problems related to decoherence arise when the observer is modeled as a quantum system.

Description

The Copenhagen interpretation, which is the most widely accepted interpretation of quantum mechanics among physicists, posits that an "observer" or a "measurement" is merely a physical process. One of the founders of the Copenhagen interpretation, Werner Heisenberg, wrote:

Of course the introduction of the observer must not be misunderstood to imply that some kind of subjective features are to be brought into the description of nature. The observer has, rather, only the function of registering decisions, i.e., processes in space and time, and it does not matter whether the observer is an apparatus or a human being; but the registration, i.e., the transition from the "possible" to the "actual," is absolutely necessary here and cannot be omitted from the interpretation of quantum theory.

Niels Bohr, also a founder of the Copenhagen interpretation, wrote:

all unambiguous information concerning atomic objects is derived from the permanent marks such as a spot on a photographic plate, caused by the impact of an electron left on the bodies which define the experimental conditions. Far from involving any special intricacy, the irreversible amplification effects on which the recording of the presence of atomic objects rests rather remind us of the essential irreversibility inherent in the very concept of observation. The description of atomic phenomena has in these respects a perfectly objective character, in the sense that no explicit reference is made to any individual observer and that therefore, with proper regard to relativistic exigencies, no ambiguity is involved in the communication of information.

Likewise, Asher Peres stated that "observers" in quantum physics are

similar to the ubiquitous "observers" who send and receive light signals in special relativity. Obviously, this terminology does not imply the actual presence of human beings. These fictitious physicists may as well be inanimate automata that can perform all the required tasks, if suitably programmed.

Critics of the special role of the observer also point out that observers can themselves be observed, leading to paradoxes such as that of Wigner's friend; and that it is not clear how much consciousness is required. As John Bell inquired, "Was the wave function waiting to jump for thousands of millions of years until a single-celled living creature appeared? Or did it have to wait a little longer for some highly qualified measurer—with a PhD?"

Anthropocentric interpretation

The prominence of seemingly subjective or anthropocentric ideas like "observer" in the early development of the theory has been a continuing source of disquiet and philosophical dispute. A number of new-age religious or philosophical views give the observer a more special role, or place constraints on who or what can be an observer. As an example of such claims, Fritjof Capra declared, "The crucial feature of atomic physics is that the human observer is not only necessary to observe the properties of an object, but is necessary even to define these properties." There is no credible peer-reviewed research that backs such claims.

Confusion with uncertainty principle

The uncertainty principle has been frequently confused with the observer effect, evidently even by its originator, Werner Heisenberg. The uncertainty principle in its standard form describes how precisely it is possible to measure the position and momentum of a particle at the same time. If the precision in measuring one quantity is increased, the precision in measuring the other decreases.

An alternative version of the uncertainty principle, more in the spirit of an observer effect, fully accounts for the disturbance the observer has on a system and the error incurred, although this is not how the term "uncertainty principle" is most commonly used in practice.

Quantum state

From Wikipedia, the free encyclopedia
 
In quantum physics, a quantum state is a mathematical entity that embodies the knowledge of a quantum system. Quantum mechanics specifies the construction, evolution, and measurement of a quantum state. The result is a prediction for the system represented by the state. Knowledge of the quantum state, and the rules for the system's evolution in time, exhausts all that can be known about a quantum system.

Quantum states may be defined differently for different kinds of systems or problems. Two broad categories are:

  • wave functions describing quantum systems using position or momentum variables, and
  • the more abstract vector quantum states.

Historical, educational, and application-focused problems typically feature wave functions; modern professional physics uses the abstract vector states. In both categories, quantum states divide into pure versus mixed states, or into coherent states and incoherent states. Categories with special properties include stationary states for time independence and quantum vacuum states in quantum field theory.

From the states of classical mechanics

As a tool for physics, quantum states grew out of states in classical mechanics. A classical dynamical state consists of a set of dynamical variables with well-defined real values at each instant of time. For example, the state of a cannon ball would consist of its position and velocity. The state values evolve under equations of motion and thus remain strictly determined. If we know the position of a cannon and the exit velocity of its projectiles, then we can use equations containing the force of gravity to predict the trajectory of a cannon ball precisely.

Similarly, quantum states consist of sets of dynamical variables that evolve under equations of motion. However, the values derived from quantum states are complex numbers, quantized, limited by uncertainty relations, and only provide a probability distribution for the outcomes for a system. These constraints alter the nature of quantum dynamic variables. For example, the quantum state of an electron in a double-slit experiment would consist of complex values over the detection region and, when squared, only predict the probability distribution of electron counts across the detector.

Role in quantum mechanics

The process of describing a quantum system with quantum mechanics begins with identifying a set of variables defining the quantum state of the system. The set will contain compatible and incompatible variables. Simultaneous measurement of a complete set of compatible variables prepares the system in a unique state. The state then evolves deterministically according to the equations of motion. Subsequent measurement of the state produces a sample from a probability distribution predicted by the quantum mechanical operator corresponding to the measurement.

The fundamentally statistical or probabilistic nature of quantum measurements changes the role of quantum states in quantum mechanics compared to classical states in classical mechanics. In classical mechanics, the initial state of one or more bodies is measured; the state evolves according to the equations of motion; measurements of the final state are compared to predictions. In quantum mechanics, ensembles of identically prepared quantum states evolve according to the equations of motion, and many repeated measurements are compared to predicted probability distributions.

Measurements

Measurements, macroscopic operations on quantum states, filter the state. Whatever the input quantum state might be, repeated identical measurements give consistent values. For this reason, measurements 'prepare' quantum states for experiments, placing the system in a partially defined state. Subsequent measurements may either further prepare the system – these are compatible measurements – or they may alter the state, redefining it – these are called incompatible or complementary measurements. For example, we may measure the momentum of a state along a given axis any number of times and get the same result, but if we measure the position after once measuring the momentum, subsequent measurements of momentum are changed. The quantum state appears unavoidably altered by incompatible measurements. This is known as the uncertainty principle.
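A minimal sketch of compatible versus incompatible measurements, using spin components along z and x as an illustrative stand-in for momentum and position: repeated measurements of the same observable agree, while an intervening incompatible measurement randomizes later results.

import numpy as np

rng = np.random.default_rng(1)

Z = np.array([[1, 0], [0, -1]], dtype=complex)   # spin along z (eigenvalues +1/-1)
X = np.array([[0, 1], [1, 0]], dtype=complex)    # spin along x: incompatible with Z

def measure(obs, state):
    # Projective measurement: sample an eigenvalue, collapse to its eigenvector.
    vals, vecs = np.linalg.eigh(obs)
    probs = np.abs(vecs.conj().T @ state) ** 2
    k = rng.choice(len(vals), p=probs / probs.sum())
    return vals[k], vecs[:, k]

state = np.array([1, 0], dtype=complex)          # start in the +1 eigenstate of Z

r1, state = measure(Z, state)                    # repeated Z measurements: always +1
r2, state = measure(Z, state)
rx, state = measure(X, state)                    # incompatible measurement disturbs the state
r3, state = measure(Z, state)                    # now +1 or -1, each with probability 1/2
print(r1, r2, rx, r3)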

Eigenstates and pure states

The quantum state after a measurement is in an eigenstate corresponding to that measurement and the value measured. Other aspects of the state may be unknown. Repeating the measurement will not alter the state. In some cases, compatible measurements can further refine the state, causing it to be an eigenstate corresponding to all these measurements. A full set of compatible measurements produces a pure state. Any state that is not pure is called a mixed state as discussed in more depth below.

The eigenstate solutions to the Schrödinger equation can be formed into pure states. Experiments rarely produce pure states. Therefore statistical mixtures of solutions must be compared to experiments.

Representations

The same physical quantum state can be expressed mathematically in different ways called representations. The position wave function is one representation often seen first in introductions to quantum mechanics. The equivalent momentum wave function is another wave function based representation. Representations are analogous to coordinate systems or similar mathematical devices like parametric equations. Selecting a representation will make some aspects of a problem easier at the cost of making other things difficult.

In formal quantum mechanics (see § Formalism in quantum physics below) the theory develops in terms of an abstract 'vector space', avoiding any particular representation. This allows many elegant concepts of quantum mechanics to be expressed and applied even in cases where no classical analog exists.

Wave function representations

Wave functions represent quantum states, particularly when they are functions of position or of momentum. Historically, definitions of quantum states used wavefunctions before the more formal methods were developed. The wave function is a complex-valued function of any complete set of commuting or compatible degrees of freedom. For example, one set could be the spatial coordinates of an electron. Preparing a system by measuring the complete set of compatible observables produces a pure quantum state. More commonly, incomplete preparation produces a mixed quantum state. Wave function solutions of Schrödinger's equations of motion for operators corresponding to measurements can readily be expressed as pure states; they must be combined with statistical weights matching experimental preparation to compute the expected probability distribution.

Pure states of wave functions

Probability densities for the electron of a hydrogen atom in different quantum states.

Numerical or analytic solutions in quantum mechanics can be expressed as pure states. These solution states, called eigenstates, are labeled with quantized values, typically quantum numbers. For example, when dealing with the energy spectrum of the electron in a hydrogen atom, the relevant pure states are identified by the principal quantum number n, the angular momentum quantum number ℓ, the magnetic quantum number m, and the spin z-component sz. For another example, if the spin of an electron is measured in any direction, e.g. with a Stern–Gerlach experiment, there are two possible results: up or down. A pure state here is represented by a two-dimensional complex vector (α, β) with a length of one; that is, with |α|² + |β|² = 1, where |α| and |β| are the absolute values of α and β.

The postulates of quantum mechanics state that pure states, at a given time t, correspond to vectors in a separable complex Hilbert space, while each measurable physical quantity (such as the energy or momentum of a particle) is associated with a mathematical operator called the observable. The operator serves as a linear function that acts on the states of the system. The eigenvalues of the operator correspond to the possible values of the observable. For example, it is possible to observe a particle with a momentum of 1 kg⋅m/s if and only if one of the eigenvalues of the momentum operator is 1 kg⋅m/s. The corresponding eigenvector (which physicists call an eigenstate) with eigenvalue 1 kg⋅m/s would be a quantum state with a definite, well-defined value of momentum of 1 kg⋅m/s, with no quantum uncertainty. If its momentum were measured, the result is guaranteed to be 1 kg⋅m/s.
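As a small illustration of this correspondence (using a spin observable along an arbitrary direction rather than momentum), the eigenvalues of the operator are the only possible measurement results, and an eigenstate has a definite value with zero quantum uncertainty.

import numpy as np

theta = 0.7                                              # an arbitrary measurement direction
S_n = np.array([[np.cos(theta), np.sin(theta)],
                [np.sin(theta), -np.cos(theta)]], dtype=complex)  # spin along n (units of hbar/2)

vals, vecs = np.linalg.eigh(S_n)
print(vals)                                              # [-1, 1]: the only possible outcomes

psi = vecs[:, 1]                                         # eigenstate with eigenvalue +1
mean = np.real(psi.conj() @ S_n @ psi)
var = np.real(psi.conj() @ S_n @ S_n @ psi) - mean ** 2
print(mean, var)                                         # 1.0, 0.0: definite value, no spread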

On the other hand, a pure state described as a superposition of multiple different eigenstates does in general have quantum uncertainty for the given observable. Using bra–ket notation, this linear combination of eigenstates can be represented as: |Ψ⟩ = Σn cn|φn⟩. The coefficient cn that corresponds to a particular state in the linear combination is a complex number, thus allowing interference effects between states. The coefficients are time dependent. How a quantum state changes in time is governed by the time evolution operator.
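A brief sketch of this time dependence under an illustrative two-level Hamiltonian: the coefficients, and hence the outcome probabilities, change in time according to U(t) = exp(−iHt/ħ).

import numpy as np

hbar = 1.0
H = np.array([[0.0, 0.5], [0.5, 1.0]], dtype=complex)    # an illustrative two-level Hamiltonian
psi0 = np.array([1, 0], dtype=complex)                    # initial state

evals, evecs = np.linalg.eigh(H)                           # diagonalize H once
for t in (0.0, 1.0, 2.0):
    # Time evolution operator U(t) = exp(-i H t / hbar) via the spectral decomposition of H.
    U = evecs @ np.diag(np.exp(-1j * evals * t / hbar)) @ evecs.conj().T
    psi_t = U @ psi0
    print(t, np.round(np.abs(psi_t) ** 2, 3))              # time-dependent outcome probabilities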

Mixed states of wave functions

A mixed quantum state corresponds to a probabilistic mixture of pure states; however, different distributions of pure states can generate equivalent (i.e., physically indistinguishable) mixed states. A mixture of quantum states is again a quantum state.

A mixed state for electron spins, in the density-matrix formulation, has the structure of a 2 × 2 matrix that is Hermitian and positive semi-definite, and has trace 1. A more complicated case is given (in bra–ket notation) by the singlet state, which exemplifies quantum entanglement: |ψ⟩ = (1/√2)(|↑↓⟩ − |↓↑⟩), which involves superposition of joint spin states for two particles with spin 1/2. The singlet state satisfies the property that if the particles' spins are measured along the same direction then either the spin of the first particle is observed up and the spin of the second particle is observed down, or the first one is observed down and the second one is observed up, both possibilities occurring with equal probability.
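A short numerical check of this anticorrelation property, with measurement directions chosen arbitrarily in the x–z plane for illustration: the correlation of the two spins along any common axis is −1, so the outcomes are always opposite.

import numpy as np

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # (|up,down> - |down,up>)/sqrt(2)

def spin(theta):
    # Spin observable along a direction at angle theta in the x-z plane (eigenvalues +1/-1).
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]], dtype=complex)

for theta in (0.0, 0.4, 1.3):
    A = spin(theta)
    corr = np.real(singlet.conj() @ np.kron(A, A) @ singlet)
    print(theta, corr)    # -1.0 every time: the two outcomes are always opposite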

A pure quantum state can be represented by a ray in a projective Hilbert space over the complex numbers, while mixed states are represented by density matrices, which are positive semidefinite operators that act on Hilbert spaces. The Schrödinger–HJW theorem classifies the multitude of ways to write a given mixed state as a convex combination of pure states. Before a particular measurement is performed on a quantum system, the theory gives only a probability distribution for the outcome, and the form that this distribution takes is completely determined by the quantum state and the linear operators describing the measurement. Probability distributions for different measurements exhibit tradeoffs exemplified by the uncertainty principle: a state that implies a narrow spread of possible outcomes for one experiment necessarily implies a wide spread of possible outcomes for another.

Statistical mixtures of states are a different type of linear combination. A statistical mixture of states is a statistical ensemble of independent systems. Statistical mixtures represent the degree of knowledge, whilst the uncertainty within quantum mechanics is fundamental. Mathematically, a statistical mixture is not a combination using complex coefficients, but rather a combination using real-valued, positive probabilities p1, p2, ... of different states |Φ1⟩, |Φ2⟩, .... A number pk represents the probability of a randomly selected system being in the state |Φk⟩. Unlike the linear combination case, each system is in a definite eigenstate.
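A small sketch contrasting the two kinds of combination, with illustrative example states: an equal statistical mixture of spin-up and spin-down is indistinguishable from the coherent superposition (|↑⟩ + |↓⟩)/√2 under a z measurement, but an x measurement separates them.

import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)

rho_mix = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ket1, ket1.conj())  # statistical mixture
rho_sup = np.outer(plus, plus.conj())                                             # coherent superposition

Z = np.diag([1.0, -1.0]).astype(complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

for name, obs in (("Z", Z), ("X", X)):
    print(name,
          np.real(np.trace(rho_mix @ obs)),   # mixture:       <Z> = 0, <X> = 0
          np.real(np.trace(rho_sup @ obs)))   # superposition: <Z> = 0, <X> = 1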

The expectation value of an observable A is a statistical mean of measured values of the observable. It is this mean, and the distribution of probabilities, that is predicted by physical theories.

There is no state that is simultaneously an eigenstate for all observables. For example, we cannot prepare a state such that both the position measurement Q(t) and the momentum measurement P(t) (at the same time t) are known exactly; at least one of them will have a range of possible values. This is the content of the Heisenberg uncertainty relation.

Moreover, in contrast to classical mechanics, it is unavoidable that performing a measurement on the system generally changes its state. More precisely: After measuring an observable A, the system will be in an eigenstate of A; thus the state has changed, unless the system was already in that eigenstate. This expresses a kind of logical consistency: If we measure A twice in the same run of the experiment, the measurements being directly consecutive in time, then they will produce the same results. This has some strange consequences, however, as follows.

Consider two incompatible observables, A and B, where A corresponds to a measurement earlier in time than B. Suppose that the system is in an eigenstate of B at the experiment's beginning. If we measure only B, all runs of the experiment will yield the same result. If we measure first A and then B in the same run of the experiment, the system will transfer to an eigenstate of A after the first measurement, and we will generally notice that the results of B are statistical. Thus: Quantum mechanical measurements influence one another, and the order in which they are performed is important.

Another feature of quantum states becomes relevant if we consider a physical system that consists of multiple subsystems; for example, an experiment with two particles rather than one. Quantum physics allows for certain states, called entangled states, that show certain statistical correlations between measurements on the two particles which cannot be explained by classical theory. For details, see Quantum entanglement. These entangled states lead to experimentally testable properties (Bell's theorem) that allow us to distinguish between quantum theory and alternative classical (non-quantum) models.

Schrödinger picture vs. Heisenberg picture

One can take the observables to be dependent on time, while the state σ was fixed once at the beginning of the experiment. This approach is called the Heisenberg picture. (This approach was taken in the later part of the discussion above, with time-varying observables P(t), Q(t).) One can, equivalently, treat the observables as fixed, while the state of the system depends on time; that is known as the Schrödinger picture. (This approach was taken in the earlier part of the discussion above, with a time-varying state $|\psi(t)\rangle$.) Conceptually (and mathematically), the two approaches are equivalent; choosing one of them is a matter of convention.
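
The equivalence can be summarized in a single identity: with $U(t)$ the unitary time-evolution operator, $\langle \psi(t)|A|\psi(t)\rangle = \langle \psi(0)|A(t)|\psi(0)\rangle$, where $|\psi(t)\rangle = U(t)|\psi(0)\rangle$ (Schrödinger picture) and $A(t) = U^\dagger(t)\,A\,U(t)$ (Heisenberg picture), so all measurable predictions coincide.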

Both viewpoints are used in quantum theory. While non-relativistic quantum mechanics is usually formulated in terms of the Schrödinger picture, the Heisenberg picture is often preferred in a relativistic context, that is, for quantum field theory. Compare with Dirac picture.

Formalism in quantum physics

Pure states as rays in a complex Hilbert space

Quantum physics is most commonly formulated in terms of linear algebra, as follows. Any given system is identified with some finite- or infinite-dimensional Hilbert space. The pure states correspond to vectors of norm 1. Thus the set of all pure states corresponds to the unit sphere in the Hilbert space, because the unit sphere is defined as the set of all vectors with norm 1.

Multiplying a pure state by a scalar is physically inconsequential (as long as the state is considered by itself). If a vector in a complex Hilbert space $H$ can be obtained from another vector by multiplying by some non-zero complex number, the two vectors in $H$ are said to correspond to the same ray in the projective Hilbert space $\mathbf{P}(H)$. Note that although the word ray is used, properly speaking, a point in the projective Hilbert space corresponds to a line passing through the origin of the Hilbert space, rather than a half-line, or ray in the geometrical sense.
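
A quick numerical check that multiplying a state by a global phase changes nothing observable, i.e., that both vectors lie on the same ray (a sketch assuming NumPy; the particular state and phase are arbitrary):

```python
import numpy as np

psi = np.array([1, 1j]) / np.sqrt(2)        # some normalized state
psi_rot = np.exp(1j * 0.73) * psi           # the same state multiplied by a phase

# Measurement probabilities in the computational basis are unchanged
print(np.abs(psi) ** 2)        # [0.5 0.5]
print(np.abs(psi_rot) ** 2)    # [0.5 0.5]

# The two vectors differ, but |<psi|psi_rot>| = 1, so they define the same ray
print(abs(np.vdot(psi, psi_rot)))  # 1.0
```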

Spin

The angular momentum has the same dimension (M·L²·T⁻¹) as the Planck constant and, at quantum scale, behaves as a discrete degree of freedom of a quantum system. Most particles possess a kind of intrinsic angular momentum that does not appear at all in classical mechanics and arises from Dirac's relativistic generalization of the theory. Mathematically it is described with spinors. In non-relativistic quantum mechanics the group representations of the Lie group SU(2) are used to describe this additional freedom. For a given particle, the choice of representation (and hence the range of possible values of the spin observable) is specified by a non-negative number S that, in units of the reduced Planck constant ħ, is either an integer (0, 1, 2, ...) or a half-integer (1/2, 3/2, 5/2, ...). For a massive particle with spin S, its spin quantum number m always assumes one of the 2S + 1 possible values in the set $\{-S, -S+1, \ldots, +S-1, +S\}$.

As a consequence, the quantum state of a particle with spin is described by a vector-valued wave function with values in $\mathbb{C}^{2S+1}$. Equivalently, it is represented by a complex-valued function of four variables: one discrete quantum number variable (for the spin) is added to the usual three continuous variables (for the position in space).

Many-body states and particle statistics

The quantum state of a system of N particles, each potentially with spin, is described by a complex-valued function with four variables per particle, corresponding to 3 spatial coordinates and spin, e.g. $|\psi(\mathbf{r}_1, m_1; \ldots; \mathbf{r}_N, m_N)\rangle$.

Here, the spin variables $m_\nu$ assume values from the set $\{-S_\nu, -S_\nu + 1, \ldots, +S_\nu - 1, +S_\nu\}$, where $S_\nu$ is the spin of the νth particle; $S_\nu = 0$ for a particle that does not exhibit spin.

The treatment of identical particles is very different for bosons (particles with integer spin) versus fermions (particles with half-integer spin). The above N-particle function must either be symmetrized (in the bosonic case) or anti-symmetrized (in the fermionic case) with respect to the particle numbers. If not all N particles are identical, but some of them are, then the function must be (anti)symmetrized separately over the variables corresponding to each group of identical particles, according to its statistics (bosonic or fermionic).

Electrons are fermions with S = 1/2; photons (quanta of light) are bosons with S = 1 (although in the vacuum they are massless and can't be described with Schrödinger mechanics).

When symmetrization or anti-symmetrization is unnecessary, N-particle spaces of states can be obtained simply by tensor products of one-particle spaces, to which we will return later.
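
A sketch of these constructions for two spin-1/2 particles (assuming NumPy; the basis ordering and the explicit swap matrix are illustrative choices):

```python
import numpy as np

# One-particle spin-1/2 states
up = np.array([1, 0]); down = np.array([0, 1])

# Distinguishable particles: plain tensor product of one-particle states
product = np.kron(up, down)                              # |up>|down>

# Two identical fermions: antisymmetrize over the particle labels
antisym = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

# Exchange (swap) operator on the two-particle basis |uu>, |ud>, |du>, |dd>
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

print(np.allclose(SWAP @ antisym, -antisym))   # True: antisymmetric under exchange
print(np.allclose(SWAP @ product, product))    # False: the plain product is not
```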

Basis states of one-particle systems

A state $|\psi\rangle$ belonging to a separable complex Hilbert space $H$ can always be expressed uniquely as a linear combination of elements of an orthonormal basis of $H$. Using bra–ket notation, this means any state can be written as $|\psi\rangle = \sum_k c_k |k\rangle$ with complex coefficients $c_k = \langle k|\psi\rangle$ and basis elements $|k\rangle$. In this case, the normalization condition translates to $\sum_k |c_k|^2 = 1$. In physical terms, $|\psi\rangle$ has been expressed as a quantum superposition of the "basis states" $|k\rangle$, i.e., the eigenstates of an observable. In particular, if said observable is measured on the normalized state $|\psi\rangle$, then $|c_k|^2$ is the probability that the result of the measurement is the eigenvalue associated with $|k\rangle$.
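
A small sketch of this expansion and of reading measurement probabilities off the coefficients (assuming NumPy; the observable and the state are arbitrary examples):

```python
import numpy as np

# An arbitrary Hermitian observable on a 3-dimensional Hilbert space
A = np.array([[2, 1, 0],
              [1, 0, 1],
              [0, 1, -1]], dtype=float)
eigvals, eigvecs = np.linalg.eigh(A)      # columns of eigvecs form an orthonormal basis

psi = np.array([1, 2j, -1]) / np.sqrt(6)  # some normalized state

c = eigvecs.conj().T @ psi                # coefficients c_k = <k|psi>
probs = np.abs(c) ** 2                    # Born rule: P(outcome = a_k) = |c_k|^2

print(np.allclose(probs.sum(), 1.0))      # True: normalization of |psi>
print(dict(zip(np.round(eigvals, 3), np.round(probs, 3))))
```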

In general, the expression for probability always consists of a relation between the quantum state and a portion of the spectrum of the dynamical variable (i.e. random variable) being observed. For example, the situation above describes the discrete case, as the eigenvalues belong to the point spectrum. Likewise, the wave function is just the eigenfunction of the Hamiltonian operator with corresponding eigenvalue(s) E, the energy of the system.

An example of the continuous case is given by the position operator. The probability measure for a system in the state $|\psi\rangle$ is given by $P(a \leq x \leq b) = \int_a^b |\psi(x)|^2 \, dx$, where $|\psi(x)|^2$ is the probability density function for finding the particle at a given position. These examples emphasize the distinction in characteristics between the state and the observable. That is, whereas $|\psi\rangle$ is a pure state belonging to $H$, the (generalized) eigenvectors of the position operator do not belong to $H$.

Pure states vs. bound states

Though closely related, pure states are not the same as bound states belonging to the pure point spectrum of an observable with no quantum uncertainty. A particle is said to be in a bound state if it remains localized in a bounded region of space for all times. A pure state $|\phi(t)\rangle$ is called a bound state if and only if for every $\varepsilon > 0$ there is a compact set $K \subset \mathbb{R}^3$ such that $\int_K |\phi(\mathbf{r}, t)|^2 \, \mathrm{d}^3 r \geq 1 - \varepsilon$ for all $t \geq 0$. The integral represents the probability that the particle is found in the bounded region $K$ at any time $t$. If this probability remains arbitrarily close to 1, the particle is said to remain in $K$.

For example, non-normalizable solutions of the free Schrödinger equation can be expressed as functions that are normalizable, using wave packets. These wave packets belong to the pure point spectrum of a corresponding projection operator which, mathematically speaking, constitutes an observable. However, they are not bound states.

Superposition of pure states

As mentioned above, quantum states may be superposed. If $|\alpha\rangle$ and $|\beta\rangle$ are two kets corresponding to quantum states, the ket $c_\alpha |\alpha\rangle + c_\beta |\beta\rangle$ is also a quantum state of the same system. Both $c_\alpha$ and $c_\beta$ can be complex numbers; their relative amplitude and relative phase will influence the resulting quantum state.

Writing the superposed state using $c_\alpha = |c_\alpha| e^{i\theta_\alpha}$ and $c_\beta = |c_\beta| e^{i\theta_\beta}$, and extracting the common phase factor, gives $e^{i\theta_\alpha}\left(|c_\alpha|\,|\alpha\rangle + |c_\beta| e^{i(\theta_\beta - \theta_\alpha)}\,|\beta\rangle\right)$. The overall phase factor in front has no physical effect. Only the relative phase $\theta_\beta - \theta_\alpha$ affects the physical nature of the superposition.

One example of superposition is the double-slit experiment, in which superposition leads to quantum interference. Another example of the importance of relative phase is Rabi oscillations, where the relative phase of two states varies in time due to the Schrödinger equation. The resulting superposition ends up oscillating back and forth between two different states.
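
A short numerical illustration of the role of relative phase: the family of states $(|0\rangle + e^{i\varphi}|1\rangle)/\sqrt{2}$ gives identical probabilities in the computational basis for every $\varphi$, yet the probability of the $|+\rangle$ outcome, which is what interference experiments probe, depends on $\varphi$ (a sketch assuming NumPy):

```python
import numpy as np

ket0 = np.array([1, 0]); ket1 = np.array([0, 1])
ketp = (ket0 + ket1) / np.sqrt(2)           # |+>

for phi in (0, np.pi / 2, np.pi):
    psi = (ket0 + np.exp(1j * phi) * ket1) / np.sqrt(2)
    p_z = np.abs(psi) ** 2                  # computational-basis probabilities
    p_plus = abs(np.vdot(ketp, psi)) ** 2   # probability of the |+> outcome
    print(phi, p_z, round(p_plus, 3))       # p_z is always [0.5 0.5]; p_plus varies
```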

Mixed states

A pure quantum state is a state which can be described by a single ket vector, as described above. A mixed quantum state is a statistical ensemble of pure states (see Quantum statistical mechanics).

Mixed states arise in quantum mechanics in two different situations: first, when the preparation of the system is not fully known, and thus one must deal with a statistical ensemble of possible preparations; and second, when one wants to describe a physical system which is entangled with another, as its state cannot be described by a pure state. In the first case, there could theoretically be another person who knows the full history of the system and could therefore describe the same system as a pure state; in this case, the density matrix is simply used to represent the limited knowledge of a quantum state. In the second case, however, the existence of quantum entanglement theoretically prevents the existence of complete knowledge about the subsystem, and it is impossible for any person to describe the subsystem of an entangled pair as a pure state.

Mixed states inevitably arise from pure states when, for a composite quantum system $H_1 \otimes H_2$ with an entangled state on it, the part $H_2$ is inaccessible to the observer. The state of the part $H_1$ is then expressed as the partial trace over $H_2$.

A mixed state cannot be described with a single ket vector. Instead, it is described by its associated density matrix (or density operator), usually denoted ρ. Density matrices can describe both mixed and pure states, treating them on the same footing. Moreover, a mixed quantum state on a given quantum system described by a Hilbert space $H$ can always be represented as the partial trace of a pure quantum state (called a purification) on a larger bipartite system $H \otimes K$, for a sufficiently large Hilbert space $K$.
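
A sketch of how such a mixed state arises in practice: tracing out the second qubit of the entangled singlet state leaves the first qubit in the maximally mixed state (assuming NumPy; the explicit two-qubit partial-trace helper is an illustrative implementation):

```python
import numpy as np

# Singlet state of two qubits in the basis |00>, |01>, |10>, |11>
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)
rho_AB = np.outer(singlet, singlet.conj())       # pure two-qubit density matrix

def partial_trace_B(rho):
    """Trace out the second qubit of a 4x4 two-qubit density matrix."""
    r = rho.reshape(2, 2, 2, 2)                  # indices: (a, b, a', b')
    return np.einsum('abcb->ac', r)              # sum over b = b'

rho_A = partial_trace_B(rho_AB)
print(rho_A)                        # 0.5 * identity: the maximally mixed state
print(np.trace(rho_A @ rho_A))      # 0.5 < 1, so the reduced state is mixed
```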

The density matrix describing a mixed state is defined to be an operator of the form $\rho = \sum_s p_s |\psi_s\rangle \langle \psi_s|$, where $p_s$ is the fraction of the ensemble in each pure state $|\psi_s\rangle$. The density matrix can be thought of as a way of using the one-particle formalism to describe the behavior of many similar particles by giving a probability distribution (or ensemble) of states that these particles can be found in.

A simple criterion for checking whether a density matrix is describing a pure or mixed state is that the trace of ρ2 is equal to 1 if the state is pure, and less than 1 if the state is mixed. Another, equivalent, criterion is that the von Neumann entropy is 0 for a pure state, and strictly positive for a mixed state.
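
Both criteria are straightforward to check numerically (a sketch assuming NumPy; the small eigenvalue cutoff merely guards the logarithm against rounding):

```python
import numpy as np

def purity(rho):
    return np.trace(rho @ rho).real

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]        # 0 * log 0 is taken to be 0
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1, 0], [0, 0]], dtype=float)    # |0><0|
mixed = np.array([[0.5, 0], [0, 0.5]])            # maximally mixed qubit

print(purity(pure), von_neumann_entropy(pure))    # 1.0, 0.0
print(purity(mixed), von_neumann_entropy(mixed))  # 0.5, ~0.693 (= ln 2)
```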

The rules for measurement in quantum mechanics are particularly simple to state in terms of density matrices. For example, the ensemble average (expectation value) of a measurement corresponding to an observable A is given by $\langle A \rangle = \sum_s p_s \langle \psi_s | A | \psi_s \rangle = \sum_s \sum_i p_s \, a_i \, |\langle \alpha_i | \psi_s \rangle|^2 = \operatorname{tr}(\rho A)$, where $|\alpha_i\rangle$ and $a_i$ are eigenkets and eigenvalues, respectively, for the operator A, and "tr" denotes trace. It is important to note that two types of averaging are occurring: one (the sum over $i$) is the usual quantum expectation value of the observable when the system is in the state $|\psi_s\rangle$, and the other (the sum over $s$) is a statistical (so-called incoherent) average with the probabilities $p_s$ that the system is in those states.

Mathematical generalizations

States can be formulated in terms of observables, rather than as vectors in a vector space. These are positive normalized linear functionals on a C*-algebra, or sometimes other classes of algebras of observables. See State on a C*-algebra and Gelfand–Naimark–Segal construction for more details.
