A 2010 study has been claimed to provide preliminary supporting evidence for quantum Darwinism, with scars of a quantum dot "becoming a family of mother-daughter states" indicating that they could "stabilize into multiple pointer states"; a similar scenario has also been suggested for perturbation-induced scarring in disordered quantum dots (see scars).
However, the claimed evidence is also subject to the circularity
criticism raised by Ruth Kastner (see Implications below): the de
facto phenomenon of decoherence that underlies the claims of quantum
Darwinism may not arise in unitary-only dynamics. Thus, even if
decoherence does occur, this does not show that macroscopic pointer
states naturally emerge without some form of collapse.
Implications
Along with Zurek's related theory of envariance (invariance due to quantum entanglement), quantum Darwinism seeks to explain how the classical world emerges from the quantum world and proposes to answer the quantum measurement problem, the main interpretational challenge
for quantum theory. The measurement problem arises because the quantum
state vector, the source of all knowledge concerning quantum systems,
evolves according to the Schrödinger equation into a linear superposition of different states, predicting paradoxical situations such as "Schrödinger's cat";
situations never experienced in our classical world. Quantum theory has
traditionally treated this problem as resolved by a non-unitary transformation of the state vector
into a definite state at the time of measurement, providing an
extremely accurate means of predicting, in the form of a probability
for each possible measurement value, the definite state that will be
observed. The physical nature of the transition from the
quantum superposition of states to the definite classical state measured
is not explained by the traditional theory but is usually assumed as an
axiom and was at the basis of the debate between Niels Bohr and Albert Einstein concerning the completeness of quantum theory.
Quantum Darwinism seeks to explain the transition of quantum
systems from the vast potentiality of superposed states to the greatly
reduced set of pointer states as a selection process, einselection,
imposed on the quantum system through its continuous interactions with
the environment. All quantum interactions, including measurements but
much more typically interactions with the environment (such as the
sea of photons in which all quantum systems are immersed), lead to decoherence,
the manifestation of the quantum system in a particular basis
dictated by the nature of the interaction in which the quantum system is
involved. In the case of interactions with its environment, Zurek and
his collaborators have shown that a preferred basis into which a quantum
system will decohere is the pointer basis underlying predictable
classical states. It is in this sense that the pointer states of
classical reality are selected from quantum reality and exist in the
macroscopic realm in a state able to undergo further evolution. However,
the 'einselection' program depends on assuming a particular division of
the universal quantum state into 'system' + 'environment', with the
different degrees of freedom of the environment posited as having
mutually random phases. This phase randomness does not arise from within
the quantum state of the universe on its own, and Ruth Kastner
has pointed out that this limits the explanatory power of the Quantum
Darwinism program. Zurek replies to Kastner's criticism in Classical selection and quantum Darwinism.
As a quantum system's interactions with its environment result
in the recording of many redundant copies of information regarding its
pointer states, this information is available to numerous observers, who are able
to achieve consensual agreement concerning their information about the
quantum state. This aspect of einselection, called by Zurek 'Environment
as a Witness', results in the potential for objective knowledge.
Darwinian significance
Perhaps of equal significance to the light this theory shines on quantum
explanations is its identification of a Darwinian process operating as
the selective mechanism establishing our classical reality. As numerous
researchers have made clear, any system employing a Darwinian process
will evolve. As argued by the thesis of Universal Darwinism, Darwinian processes are not confined to biology but all follow the simple Darwinian algorithm:
Reproduction/Heredity: the ability to make copies and thereby produce descendants.
Selection: a process that preferentially selects one trait over
another, leading to that trait becoming more numerous after sufficient
generations.
Variation: differences in heritable traits that affect "fitness", or
the ability to survive and reproduce, leading to differential survival.
Quantum Darwinism appears to conform to this algorithm and thus is aptly named:
Numerous copies are made of pointer states.
Successive interactions between pointer states and their environment
reveal them to evolve, and those states survive which conform to the
predictions of classical physics within the macroscopic world. This
happens in a continuous, predictable manner; that is, descendants inherit
many of their traits from ancestor states.
There is no analogue of the Variation principle of "simple Darwinism",
since pointer states do not mutate and selection operates only among
the pointer states already preferred by the environment
(e.g. location states).
From this view quantum Darwinism provides a Darwinian explanation at
the basis of our reality, explaining the unfolding or evolution of our
classical macroscopic world.
Quantum decoherence
Quantum decoherence is the loss of quantum coherence.
Quantum decoherence has been studied to understand how quantum systems
convert to systems which can be explained by classical mechanics.
Beginning out of attempts to extend the understanding of quantum
mechanics, the theory has developed in several directions and
experimental studies have confirmed some of its key aspects. Quantum
computing relies on quantum coherence and is the primary practical
application of the concept.
Concept
In quantum mechanics, particles such as electrons are described by a wave function,
a mathematical representation of the quantum state of a system; a
probabilistic interpretation of the wave function is used to explain
various quantum effects. As long as there exists a definite phase
relation between different states, the system is said to be coherent. A
definite phase relationship is necessary to perform quantum computing on quantum information encoded in quantum states. Coherence is preserved under the laws of quantum physics.
If a quantum system were perfectly isolated, it would maintain
coherence indefinitely, but it would be impossible to manipulate or
investigate it. If it is not perfectly isolated, for example during a
measurement, coherence is shared with the environment and appears to be
lost with time, a process called quantum decoherence or environmental
decoherence. As a result of this process, quantum behavior is apparently
lost, just as energy appears to be lost by friction in classical
mechanics.
Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath),[1]
since every system is loosely coupled with the energetic state of its
surroundings. Viewed in isolation, the system's dynamics are non-unitary (although the combined system plus environment evolves in a unitary fashion).[2] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with—or transferring it to—the surroundings.
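To make this concrete, here is a minimal numerical sketch (an illustrative toy model, not any specific published calculation): a qubit in superposition becomes entangled with a single environment qubit through a hypothetical controlled rotation. The joint evolution is unitary throughout, yet tracing out the environment shows the system's coherence shrinking as the environment acquires a record of the system's state.

```python
# A minimal sketch: a qubit starting in a superposition entangles with one
# environment qubit. The joint evolution is unitary, but the reduced
# (traced-over) dynamics are not: the off-diagonal (coherence) term shrinks
# by the overlap of the two conditional environment states.
import numpy as np

# System qubit in (|0> + |1>)/sqrt(2); environment qubit in |0>.
psi_sys = np.array([1.0, 1.0]) / np.sqrt(2)
psi_env = np.array([1.0, 0.0])
joint = np.kron(psi_sys, psi_env)  # 4-dimensional joint state vector

def controlled_rotation(theta):
    """Rotate the environment qubit by theta only when the system is |1>.
    theta = 0: no record is made; theta = pi/2: the conditional environment
    states become orthogonal (a perfect record)."""
    c, s = np.cos(theta), np.sin(theta)
    u = np.eye(4)
    u[2:, 2:] = np.array([[c, -s], [s, c]])  # acts on the |1>_sys block only
    return u

def reduced_density_matrix(joint_vec):
    """Trace out the environment (the second tensor factor)."""
    m = joint_vec.reshape(2, 2)   # indices: (system, environment)
    return m @ m.conj().T         # rho_sys = Tr_env |psi><psi|

for theta in [0.0, np.pi / 4, np.pi / 2]:
    rho = reduced_density_matrix(controlled_rotation(theta) @ joint)
    print(f"theta={theta:.2f}  off-diagonal={rho[0, 1].real:+.3f}")
# 0.500 at theta=0, ~0.354 at pi/4, 0.000 at pi/2: the more information the
# environment acquires, the less coherence remains in the system alone.
```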
History and interpretation
Relation to interpretation of quantum mechanics
An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum physics might correspond to experienced reality.
Decoherence calculations can be done in any interpretation of quantum
mechanics, since those calculations are an application of the standard
mathematical tools of quantum theory. However, the subject of
decoherence has been closely related to the problem of interpretation
throughout its history.
Decoherence has been used to understand the possibility of the collapse of the wave function in quantum mechanics. Decoherence does not generate actual wave-function collapse. It only provides a framework for apparent
wave-function collapse, as the quantum nature of the system "leaks"
into the environment. That is, components of the wave function are
decoupled from a coherent system and acquire phases from their immediate surroundings. A total superposition of the global or universal wavefunction still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue.
With respect to the measurement problem, decoherence provides an explanation for the transition of the system to a mixture of states
that seem to correspond to those states observers perceive. Moreover,
observation indicates that this mixture looks like a proper quantum ensemble in a measurement situation, as the measurements lead to the "realization" of precisely one state in the "ensemble".
The philosophical views of Werner Heisenberg and Niels Bohr have often been grouped together as the "Copenhagen interpretation", despite significant divergences between them on important points.
In 1955, Heisenberg suggested that the interaction of a system with its
surrounding environment would eliminate quantum interference effects.
However, Heisenberg did not provide a detailed account of how this might
transpire, nor did he make explicit the importance of entanglement in
the process.
Origin of the concepts
Nevill Mott's solution to the iconic Mott problem in 1929 is considered in retrospect to be the first quantum decoherence work. It was cited by the first modern theoretical treatment.
Although he did not use the term, the concept of quantum decoherence was first introduced in 1951 by the American physicist David Bohm,
who called it the "destruction of interference in the process of
measurement". Bohm later used decoherence to handle the measurement
process in the de Broglie-Bohm interpretation of quantum theory.
The significance of decoherence was further highlighted in 1970 by the German physicist H. Dieter Zeh, and it has been a subject of active research since the 1980s. Decoherence has been developed into a complete framework, but there is controversy as to whether it solves the measurement problem, as the founders of decoherence theory admit in their seminal papers.
The study of decoherence as a proper subject began in 1970, with H. Dieter Zeh's paper "On the Interpretation of Measurement in Quantum Theory".
Zeh regarded the wavefunction as a physical entity, rather than a
calculational device or a compendium of statistical information (as is
typical for Copenhagen-type interpretations), and he proposed that it
should evolve unitarily, in accord with the Schrödinger equation, at all
times. Zeh was initially unaware of Hugh Everett III's earlier work,
which also proposed a universal wavefunction evolving unitarily; he
revised his paper to reference Everett after learning of Everett's
"relative-state interpretation" through an article by Bryce DeWitt. (DeWitt was the one who termed Everett's proposal the many-worlds interpretation,
by which name it is commonly known.) For Zeh, the question of how to
interpret quantum mechanics was of key importance, and an interpretation
along the lines of Everett's was the most natural. Partly because of a
general disinterest among physicists in interpretational questions,
Zeh's work remained comparatively neglected until the early 1980s, when
two papers by Wojciech Zurek
invigorated the subject. Unlike Zeh's publications, Zurek's articles
were fairly agnostic about interpretation, focusing instead on specific
problems of density-matrix dynamics. Zurek's interest in decoherence
stemmed from furthering Bohr's analysis of the double-slit experiment in
his reply to the Einstein–Podolsky–Rosen paradox, work he had undertaken with Bill Wootters, and he has since argued that decoherence brings a kind of rapprochement between Everettian and Copenhagen-type views.
Decoherence does not claim to provide a mechanism for some actual
wave-function collapse; rather it puts forth a reasonable framework for
the appearance of wave-function collapse. The quantum nature of the
system is simply "leaked" into the environment so that a total
superposition of the wave function still exists, but exists—at least for
all practical purposes—beyond the realm of measurement.
By definition, the claim that a merged but unmeasurable wave function
still exists cannot be proven experimentally. Decoherence is needed to
understand why a quantum system begins to obey classical probability
rules after interacting with its environment (due to the suppression of
the interference terms when applying Born's probability rules to the
system).
Criticism of the adequacy of decoherence theory to solve the measurement problem has been expressed by Anthony Leggett.
Mechanisms
To
examine how decoherence operates, an "intuitive" model is presented
below. The model requires some familiarity with quantum theory basics.
Analogies are made between visualizable classical phase spaces and Hilbert spaces. A more rigorous derivation in Dirac notation shows how decoherence destroys interference effects and the "quantum nature" of systems. Next, the density matrix approach is presented for perspective.
Phase-space picture
An N-particle system can be represented in non-relativistic quantum mechanics by a wave function $\psi(x_1, x_2, \dots, x_N)$, where each $x_i$ is a point in 3-dimensional space. This has analogies with the classical phase space. A classical phase space contains a real-valued function in 6N
dimensions (each particle contributes 3 spatial coordinates and 3
momenta). A "quantum" phase space, on the other hand,
involves a complex-valued function on a 3N-dimensional space. The positions and momenta are represented by operators that do not commute, and $\psi$ lives in the mathematical structure of a Hilbert space. Aside from these differences, however, the rough analogy holds.
Different previously isolated, non-interacting systems occupy
different phase spaces. Alternatively we can say that they occupy
different lower-dimensional subspaces in the phase space of the joint system. The effective dimensionality of a system's phase space is the number of degrees of freedom present, which—in non-relativistic models—is 6 times the number of a system's free particles. For a macroscopic
system this will be a very large dimensionality. When two systems (the
environment being one system) start to interact, though, their
associated state vectors are no longer constrained to the subspaces.
Instead the combined state vector time-evolves a path through the
"larger volume", whose dimensionality is the sum of the dimensions of
the two subspaces. The extent to which two vectors interfere with each
other is a measure of how "close" they are to each other (formally,
their overlap or Hilbert-space scalar product) in the phase space.
When a system couples to an external environment, the dimensionality of,
and hence "volume" available to, the joint state vector increases
enormously. Each environmental degree of freedom contributes an extra
dimension.
The original system's wave function can be expanded in many
different ways as a sum of elements in a quantum superposition. Each
expansion corresponds to a projection of the wave vector onto a basis.
The basis can be chosen at will. If the expansion is chosen so that the
resulting basis elements interact with the environment in an
element-specific way, such elements will, with overwhelming
probability, be rapidly separated from each other by their natural
unitary time evolution along their own independent paths. After a very
short interaction, there is almost no chance of further interference.
The process is effectively irreversible.
The different elements effectively become "lost" from each other in the
expanded phase space created by coupling with the environment. In phase
space, this decoupling is monitored through the Wigner quasi-probability distribution. The original elements are said to have decohered.
The environment has effectively selected out those expansions or
decompositions of the original state vector that decohere (or lose phase
coherence) with each other. This is called "environmentally-induced
superselection", or einselection. The decohered elements of the system no longer exhibit quantum interference between each other, as in a double-slit experiment. Any elements that decohere from each other via environmental interactions are said to be quantum-entangled with the environment. The converse is not true: not all entangled states are decohered from each other.
Any measuring device or apparatus acts as an environment, since
at some stage along the measuring chain, it has to be large enough to be
read by humans. It must possess a very large number of hidden degrees
of freedom. In effect, the interactions may be considered to be quantum measurements.
As a result of an interaction, the wave functions of the system and the
measuring device become entangled with each other. Decoherence happens
when different portions of the system's wave function become entangled
in different ways with the measuring device. For two einselected
elements of the entangled system's state to interfere, both the original
system and the measuring device in both elements must significantly
overlap, in the scalar product sense. If the measuring device has many
degrees of freedom, it is very unlikely for this to happen.
As a consequence, the system behaves as a classical statistical ensemble of the different elements rather than as a single coherent quantum superposition
of them. From the perspective of each ensemble member's measuring
device, the system appears to have irreversibly collapsed onto a state
with a precise value for the measured attributes, relative to that
element. This provides one explanation of how the Born rule coefficients
effectively act as probabilities in accordance with the measurement postulate,
thus constituting a solution to the quantum measurement problem.
Dirac notation
Using Dirac notation, let the system initially be in the state
$$|\psi\rangle = \sum_i |i\rangle \langle i|\psi\rangle,$$
where the $|i\rangle$ form an einselected basis (environmentally induced selected eigenbasis), and let the environment initially be in the state $|\epsilon\rangle$. The vector basis of the combination of the system and the environment consists of the tensor products
of the basis vectors of the two subsystems. Thus, before any
interaction between the two subsystems, the joint state can be written
as
$$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle,$$
where $|i\rangle |\epsilon\rangle$ is shorthand for the tensor product $|i\rangle \otimes |\epsilon\rangle$.
There are two extremes in the way the system can interact with its
environment: either (1) the system loses its distinct identity and
merges with the environment (e.g. photons in a cold, dark cavity get
converted into molecular excitations within the cavity walls), or (2)
the system is not disturbed at all, even though the environment is
disturbed (e.g. the idealized non-disturbing measurement). In general,
an interaction is a mixture of these two extremes, which we examine below.
System absorbed by environment
If the environment absorbs the system, each element of the total system's basis interacts with the environment such that
$$|i\rangle |\epsilon\rangle \quad \text{evolves into} \quad |\epsilon_i\rangle,$$
and so
$$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle \quad \text{evolves into} \quad |\text{after}\rangle = \sum_i |\epsilon_i\rangle \langle i|\psi\rangle.$$
The unitarity of time evolution demands that the total state basis remains orthonormal, i.e. the scalar or inner products of the basis vectors must vanish, since $\langle i|j\rangle = \delta_{ij}$:
$$\langle \epsilon_i|\epsilon_j\rangle = \delta_{ij}.$$
This orthonormality of the environment states is the defining characteristic required for einselection.
System not disturbed by environment
In an idealized measurement, the system disturbs the environment, but is
itself undisturbed by the environment. In this case, each element of the
basis interacts with the environment such that
$$|i\rangle |\epsilon\rangle \quad \text{evolves into} \quad |i\rangle |\epsilon_i\rangle,$$
where $\langle i|j\rangle = \delta_{ij}$ was used. Additionally, decoherence requires, by virtue of the large number of hidden degrees of freedom in the environment, that
$$\langle \epsilon_i|\epsilon_j\rangle \approx \delta_{ij}.$$
As before, this is the defining characteristic for decoherence to become einselection. The approximation becomes more exact as the number of environmental degrees of freedom affected increases.
Note that if the system basis $\{|i\rangle\}$ were not an einselected basis, then the last condition is trivial, since the disturbed environment is not a function of $i$, and we have the trivial disturbed-environment basis $\{|\epsilon_j\rangle\} = \{|\epsilon'\rangle\}$.
This would correspond to the system basis being degenerate with respect
to the environmentally defined measurement observable. For a complex
environmental interaction (which would be expected for a typical
macroscale interaction) a non-einselected basis would be hard to define.
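The statement that the approximation improves with the number of environmental degrees of freedom can be made vivid with a standard back-of-the-envelope model (the per-qubit overlap cos θ below is an assumed, illustrative value, not taken from any particular system): even if each of N environment qubits carries only a faint record of the system's state, the overlap of the two conditional environment states falls off exponentially in N.

```python
# A toy estimate of the einselection condition <eps_i|eps_j> ~ delta_ij.
# Assume each environment qubit ends up with overlap cos(theta) between its
# two conditional states (theta small: a weak record per qubit). For N
# independently imprinted qubits the total overlap is cos(theta)**N.
import numpy as np

theta = 0.1  # assumed per-qubit imprint strength (illustrative)
for n_env in [1, 10, 100, 1000]:
    overlap = np.cos(theta) ** n_env
    print(f"N = {n_env:4d}   <eps_0|eps_1> = {overlap:.3e}")
# Even a weak per-qubit record makes the joint environment states
# effectively orthogonal once N is large: einselection's defining condition.
```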
Loss of interference and the transition from quantum to classical probabilities
The utility of decoherence lies in its application to the analysis of
probabilities, before and after environmental interaction, and in
particular to the vanishing of quantum interference terms after decoherence has occurred. If we ask what is the probability of observing the system making a transition from $|\psi\rangle$ to $|\phi\rangle$ before $|\psi\rangle$ has interacted with its environment, then application of the Born probability rule states that the transition probability is the squared modulus of the scalar product of the two states:
$$\operatorname{prob}_{\text{before}}(\psi \to \phi) = |\langle\psi|\phi\rangle|^2 = \Big|\sum_i \psi_i^* \phi_i\Big|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i,$$
where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, etc.
The above expansion of the transition probability has terms that involve $i \neq j$; these can be thought of as representing interference
between the different basis elements or quantum alternatives. This is a
purely quantum effect and represents the non-additivity of the
probabilities of quantum alternatives.
To calculate the probability of observing the system making a quantum leap from $|\psi\rangle$ to $|\phi\rangle$ after $|\psi\rangle$ has interacted with its environment, application of the Born probability rule states that we must sum over all the relevant possible states $|\epsilon_j\rangle$ of the environment before squaring the modulus:
$$\operatorname{prob}_{\text{after}}(\psi \to \phi) = \sum_j \Big|\sum_i \psi_i^* \phi_i \langle\epsilon_i|\epsilon_j\rangle\Big|^2.$$
The internal summation vanishes when we apply the decoherence/einselection condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$, and the formula simplifies to
$$\operatorname{prob}_{\text{after}}(\psi \to \phi) \approx \sum_j |\psi_j^* \phi_j|^2.$$
If we compare this with the formula we derived before the environment
introduced decoherence, we can see that the effect of decoherence has
been to move the summation sign from inside the modulus sign to outside. As a result, all the cross terms, or quantum-interference terms,
have vanished from the transition-probability calculation. The decoherence has irreversibly converted quantum behaviour (additive probability amplitudes) to classical behaviour (additive probabilities). However, Ballentine
shows that the significant impact of decoherence in reducing interference
need not have significance for the transition of quantum systems to
classical limits.
In terms of density matrices, the loss of interference effects
corresponds to the diagonalization of the "environmentally traced-over" density matrix.
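The before/after formulas above can be checked numerically. The following sketch (illustrative only; it assumes perfectly orthogonal environment records) evaluates both expressions for random states and confirms that summing over environment states removes the cross terms.

```python
# Numeric check of the before/after transition probabilities under the
# idealized-measurement model: prob_before includes interference cross
# terms; summing over orthogonal environment records removes them.
import numpy as np

rng = np.random.default_rng(0)
d = 4  # system dimension

def random_state(n):
    v = rng.normal(size=n) + 1j * rng.normal(size=n)
    return v / np.linalg.norm(v)

psi, phi = random_state(d), random_state(d)

# Before interaction: standard Born rule, interference included.
prob_before = abs(np.vdot(psi, phi)) ** 2

# After interaction: an environment state |eps_i> is recorded per basis
# element; perfect einselection means <eps_i|eps_j> = delta_ij.
eps = np.eye(d)  # perfectly orthogonal environment records (assumption)
prob_after = sum(
    abs(sum(np.conj(psi[i]) * phi[i] * np.vdot(eps[i], eps[j])
            for i in range(d))) ** 2
    for j in range(d)
)
print(f"before: {prob_before:.4f}")  # contains the cross terms
print(f"after : {prob_after:.4f}")   # equals sum_i |psi_i|^2 |phi_i|^2
print(f"check : {np.sum(np.abs(psi)**2 * np.abs(phi)**2):.4f}")
```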
Density-matrix approach
The effect of decoherence on density matrices is essentially the decay or rapid vanishing of the off-diagonal elements of the partial trace of the joint system's density matrix, i.e. the trace, with respect to any environmental basis, of the density matrix of the combined system and its environment. The decoherence irreversibly converts the "averaged" or "environmentally traced-over" density matrix from a pure state to a reduced mixture; it is this that gives the appearance of wave-function collapse. Again, this is called "environmentally induced superselection", or einselection. The advantage of taking the partial trace is that this procedure is indifferent to the environmental basis chosen.
Initially, the density matrix of the combined system can be denoted as
$$\rho = |\text{before}\rangle\langle\text{before}| = |\psi\rangle\langle\psi| \otimes |\epsilon\rangle\langle\epsilon|,$$
where $|\epsilon\rangle$ is the state of the environment.
Then if the transition happens before any interaction takes place
between the system and the environment, the environment subsystem has no
part and can be traced out, leaving the reduced density matrix for the system:
$$\rho_{\text{sys}} = \operatorname{Tr}_{\text{env}}(\rho) = |\psi\rangle\langle\psi|.$$
Now the transition probability will be given as
$$\operatorname{prob}_{\text{before}}(\psi \to \phi) = \langle\phi|\rho_{\text{sys}}|\phi\rangle = |\langle\psi|\phi\rangle|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i,$$
where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, etc.
Now consider the case in which the transition takes place after the interaction of the system with the environment. The combined density matrix will be
$$\rho = |\text{after}\rangle\langle\text{after}| = \sum_{i,j} \psi_i \psi_j^*\, |i\rangle\langle j| \otimes |\epsilon_i\rangle\langle\epsilon_j|.$$
To get the reduced density matrix of the system, we trace out the environment and employ the decoherence/einselection condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$, and we see that the off-diagonal terms vanish (a result obtained by Erich Joos and H. D. Zeh in 1985):
$$\rho_{\text{sys}} = \operatorname{Tr}_{\text{env}}(\rho) = \sum_{i,j} \psi_i \psi_j^* \langle\epsilon_j|\epsilon_i\rangle\, |i\rangle\langle j| \approx \sum_i |\psi_i|^2\, |i\rangle\langle i|.$$
Similarly, the final reduced density matrix after the transition will be
$$\rho'_{\text{sys}} = \sum_j |\phi_j|^2\, |j\rangle\langle j|.$$
The transition probability will then be given as
$$\operatorname{prob}_{\text{after}}(\psi \to \phi) = \operatorname{Tr}\!\big(\rho'_{\text{sys}}\, \rho_{\text{sys}}\big) \approx \sum_i |\psi_i|^2 |\phi_i|^2,$$
which has no contribution from the interference terms $\sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i$.
The density-matrix approach has been combined with the Bohmian approach to yield a reduced-trajectory approach, taking into account the system reduced density matrix and the influence of the environment.
Operator-sum representation
Consider a system S and environment (bath) B, which are closed and can be treated quantum-mechanically. Let $\mathcal{H}_S$ and $\mathcal{H}_B$ be the system's and bath's Hilbert spaces respectively. Then the Hamiltonian for the combined system is
$$\hat{H} = \hat{H}_S \otimes \hat{I}_B + \hat{I}_S \otimes \hat{H}_B + \hat{H}_I,$$
where $\hat{H}_S$, $\hat{H}_B$ are the system and bath Hamiltonians respectively, $\hat{H}_I$ is the interaction Hamiltonian between the system and bath, and $\hat{I}_S$, $\hat{I}_B$ are the identity operators on the system and bath Hilbert spaces respectively. The time-evolution of the density operator of this closed system is unitary and, as such, is given by
$$\rho(t) = \hat{U}(t)\, \rho(0)\, \hat{U}^\dagger(t),$$
where the unitary operator is $\hat{U}(t) = e^{-i\hat{H}t/\hbar}$. If the system and bath are not entangled initially, then we can write $\rho(0) = \rho_S(0) \otimes \rho_B(0)$. Therefore, the evolution of the system becomes
$$\rho(t) = \hat{U}(t)\, \big[\rho_S(0) \otimes \rho_B(0)\big]\, \hat{U}^\dagger(t).$$
The system–bath interaction Hamiltonian can be written in a general form as
$$\hat{H}_I = \sum_i \hat{S}_i \otimes \hat{B}_i,$$
where $\hat{S}_i \otimes \hat{B}_i$ is the operator acting on the combined system–bath Hilbert space, and $\hat{S}_i$, $\hat{B}_i$
are the operators that act on the system and bath respectively. This
coupling of the system and bath is the cause of decoherence in the
system alone. To see this, a partial trace is performed over the bath to give a description of the system alone:
$$\rho_S(t) = \operatorname{Tr}_B\!\big[\hat{U}(t)\, \big[\rho_S(0) \otimes \rho_B(0)\big]\, \hat{U}^\dagger(t)\big].$$
$\rho_S(t)$ is called the reduced density matrix
and gives information about the system only. If the bath is written in
terms of its set of orthogonal basis kets, that is, if it has been
initially diagonalized, then $\rho_B(0) = \sum_j a_j |j\rangle\langle j|$. Computing the partial trace with respect to this (computational) basis gives
$$\rho_S(t) = \sum_l \hat{A}_l\, \rho_S(0)\, \hat{A}_l^\dagger,$$
where the $\hat{A}_l$ are defined as the Kraus operators and are represented as (the index $l$ combines indices $k$ and $j$):
$$\hat{A}_l = \sqrt{a_j}\, \langle k|\hat{U}|j\rangle.$$
This is known as the operator-sum representation (OSR). A condition on the Kraus operators can be obtained by using the fact that $\operatorname{Tr}[\rho_S(t)] = 1$; this then gives
$$\sum_l \hat{A}_l^\dagger \hat{A}_l = \hat{I}_S.$$
This restriction determines whether decoherence will occur or not in
the OSR. In particular, when there is more than one term present in the
sum for $\rho_S(t)$, then the dynamics of the system will be non-unitary, and hence decoherence will take place.
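As a concrete illustration of the OSR, the following sketch (a toy one-qubit system and one-qubit bath, with an interaction chosen purely for illustration) builds the Kraus operators from a joint unitary and checks the completeness condition.

```python
# Sketch: Kraus operators A_l = sqrt(a_j) <k|U|j> for a 1-qubit system and
# 1-qubit bath, verifying sum_l A_l^dagger A_l = I.
import numpy as np

theta = 0.3
c, s = np.cos(theta), np.sin(theta)
# Assumed joint unitary on (system, bath): the bath rotates only when the
# system is |1> (an illustrative choice, not a unique model).
U = np.eye(4, dtype=complex)
U[2:, 2:] = np.array([[c, -s], [s, c]])

bath_probs = [1.0, 0.0]     # bath starts in |0><0|, i.e. a_0 = 1, a_1 = 0
U4 = U.reshape(2, 2, 2, 2)  # indices: (sys_out, bath_out=k, sys_in, bath_in=j)

kraus = []
for j, a_j in enumerate(bath_probs):
    for k in range(2):
        kraus.append(np.sqrt(a_j) * U4[:, k, :, j])  # 2x2 system operator

completeness = sum(A.conj().T @ A for A in kraus)
print(np.allclose(completeness, np.eye(2)))  # True: sum_l A_l^† A_l = I

# More than one non-zero Kraus operator => non-unitary reduced dynamics:
rho0 = 0.5 * np.ones((2, 2), dtype=complex)       # |+><+|, maximal coherence
rho1 = sum(A @ rho0 @ A.conj().T for A in kraus)  # OSR evolution
print(rho1.round(3))  # off-diagonals have shrunk by cos(theta)
```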
Semigroup approach
A more general consideration for the existence of decoherence in a quantum system is given by the master equation, which determines how the density matrix of the system alone evolves in time (see also the Belavkin equation for the evolution under continuous measurement). This uses the Schrödinger picture, where evolution of the state (represented by its density matrix) is considered. The master equation is
$$\frac{d\rho_S(t)}{dt} = -\frac{i}{\hbar}\big[\tilde{H}_S, \rho_S(t)\big] + L_D[\rho_S(t)],$$
where $\tilde{H}_S$ is the system Hamiltonian along with a (possible) unitary contribution from the bath, and $L_D$ is the Lindblad decohering term. The Lindblad decohering term is represented as
$$L_D[\rho_S(t)] = \frac{1}{2} \sum_{\alpha,\beta = 1}^{M} b_{\alpha\beta} \Big( \big[\mathbf{F}_\alpha, \rho_S(t)\, \mathbf{F}_\beta^\dagger\big] + \big[\mathbf{F}_\alpha\, \rho_S(t), \mathbf{F}_\beta^\dagger\big] \Big).$$
The $\{\mathbf{F}_\alpha\}$ are basis operators for the M-dimensional space of bounded operators that act on the system Hilbert space and are the error generators. The matrix elements $b_{\alpha\beta}$ represent the elements of a positive semi-definite Hermitian matrix; they characterize the decohering processes and, as such, are called the noise parameters.
The semigroup approach is particularly nice, because it distinguishes
between the unitary and decohering (non-unitary) processes, which is not
the case with the OSR. In particular, the non-unitary dynamics are
represented by $L_D$, whereas the unitary dynamics of the state are represented by the usual Heisenberg commutator. Note that when $b_{\alpha\beta} = 0$,
the dynamical evolution of the system is unitary. The conditions for
the evolution of the system density matrix to be described by the master
equation are:[2]
the evolution of the system density matrix is determined by a one-parameter semigroup
the evolution is "completely positive" (i.e. probabilities are preserved)
the system and bath density matrices are initially decoupled
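A minimal sketch of the master equation above, assuming pure dephasing (vanishing system Hamiltonian and a single error generator F = σ_z with noise parameter γ; an illustrative special case, not the general form), shows the characteristic semigroup behaviour: populations stay fixed while coherences decay.

```python
# Pure-dephasing Lindblad equation for one qubit: with F = sigma_z and
# F†F = I, the decohering term reduces to gamma * (F rho F† - rho), which
# damps only the off-diagonal elements.
import numpy as np

sz = np.diag([1.0, -1.0]).astype(complex)

def lindblad_rhs(rho, gamma):
    # d(rho)/dt = (gamma/2) * ([F rho, F†] + [F, rho F†])
    #           = gamma * (F rho F† - rho)   since sigma_z† sigma_z = I
    return gamma * (sz @ rho @ sz.conj().T - rho)

rho = 0.5 * np.ones((2, 2), dtype=complex)  # |+><+|, maximal coherence
gamma, dt = 0.5, 0.01
for step in range(301):
    if step % 100 == 0:
        print(f"t={step * dt:.1f}  coherence={rho[0, 1].real:.4f}")
    rho = rho + dt * lindblad_rhs(rho, gamma)  # forward-Euler integration
# The populations remain 1/2 while rho_01 decays roughly as exp(-2*gamma*t):
# decoherence without any energy exchange.
```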
Non-unitary modelling examples
Decoherence can be modelled as a non-unitary
process by which a system couples with its environment (although the
combined system plus environment evolves in a unitary fashion). Thus the dynamics of the system alone, treated in isolation, are non-unitary and, as such, are represented by irreversible transformations acting on the system's Hilbert space.
Since the system's dynamics are represented by irreversible
transformations, any information present in the quantum system can
be lost to the environment or heat bath.
Alternatively, the decay of quantum information caused by the coupling
of the system to the environment is referred to as decoherence.
Thus decoherence is the process by which information of a quantum
system is altered by the system's interaction with its environment
(which together form a closed system), creating an entanglement
between the system and heat bath (environment). As such, since the
system is entangled with its environment in some unknown way, a
description of the system by itself cannot be made without also
referring to the environment (i.e. without also describing the state of
the environment).
Rotational decoherence
Consider a system of N qubits that is coupled to a bath symmetrically. Suppose this system of N qubits undergoes a rotation around the eigenstates $\{|{\uparrow}\rangle, |{\downarrow}\rangle\}$ of $\hat{J}_z$. Then under such a rotation, a random phase $\phi$ will be created between the eigenstates $|0\rangle$, $|1\rangle$ of $\hat{J}_z$. Thus these basis qubits $|0\rangle$ and $|1\rangle$ will transform in the following way:
$$|0\rangle \to |0\rangle, \qquad |1\rangle \to e^{i\phi}|1\rangle.$$
This transformation is performed by the rotation operator
$$R_z(\phi) = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\phi} \end{pmatrix}.$$
Since any qubit in this space can be expressed in terms of the basis
qubits, then all such qubits will be transformed under this rotation.
Consider the $j$th qubit in a pure state $|\psi_j\rangle = a|0\rangle + b|1\rangle$, where $|a|^2 + |b|^2 = 1$. Before application of the rotation this state is:
$$\rho_j = |\psi_j\rangle\langle\psi_j| = |a|^2\, |0\rangle\langle 0| + a b^*\, |0\rangle\langle 1| + a^* b\, |1\rangle\langle 0| + |b|^2\, |1\rangle\langle 1|.$$
This state will decohere, since it is not "encoded" with (dependent upon) the dephasing factor $e^{i\phi}$. This can be seen by examining the density matrix averaged over the random phase $\phi$:
$$\rho_j = \int_{-\infty}^{\infty} R_z(\phi)\, |\psi_j\rangle\langle\psi_j|\, R_z^\dagger(\phi)\, p(\phi)\, d\phi,$$
where $p(\phi)$ is a probability measure of the random phase $\phi$. Although not entirely necessary, let us assume for simplicity that this is given by the Gaussian distribution $p(\phi) = \tfrac{1}{\sqrt{4\pi\tau}}\, e^{-\phi^2/(4\tau)}$, where $\tau$ represents the spread of the random phase. Then the density matrix computed as above is
$$\rho_j = \begin{pmatrix} |a|^2 & a b^*\, e^{-\tau} \\ a^* b\, e^{-\tau} & |b|^2 \end{pmatrix}.$$
Observe that the off-diagonal elements—the coherence terms—decay as the spread of the random phase, $\tau$,
increases over time (which is a realistic expectation). Thus the
density matrices for each qubit of the system become indistinguishable
over time. This means that no measurement can distinguish between the
qubits, thus creating decoherence between the various qubit states. In
particular, this dephasing process causes the qubits to collapse to one
of the pure states in $\{|0\rangle, |1\rangle\}$. This is why this type of decoherence process is called collective dephasing, because the mutual phases between all qubits of the N-qubit system are destroyed.
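The exponential factor e^(−τ) derived above can be verified by direct Monte Carlo averaging over the random phase; the sketch below assumes the same Gaussian phase distribution (variance 2τ, so that the mean of e^(iφ) is e^(−τ)).

```python
# Monte Carlo check of collective dephasing: average the rotated state over
# Gaussian random phases and compare the surviving off-diagonal element with
# the predicted factor exp(-tau).
import numpy as np

rng = np.random.default_rng(1)
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)  # amplitudes of the state a|0> + b|1>

for tau in [0.1, 0.5, 2.0]:
    # Gaussian random phase with variance 2*tau, so <exp(i*phi)> = exp(-tau).
    phases = rng.normal(0.0, np.sqrt(2 * tau), size=200_000)
    mean_phase = np.exp(1j * phases).mean()        # estimate of <e^{i phi}>
    rho_01 = a * np.conj(b) * np.conj(mean_phase)  # averaged off-diagonal
    print(f"tau={tau:.1f}  |rho_01|={abs(rho_01):.4f}  "
          f"predicted |a b*| exp(-tau)={abs(a * np.conj(b)) * np.exp(-tau):.4f}")
```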
Depolarizing
Depolarizing is a non-unitary transformation on a quantum system which maps
pure states to mixed states. This is a non-unitary process because any
transformation that reverses this process will map states out of their
respective Hilbert space, thus not preserving positivity (i.e. the
original probabilities
are mapped to negative probabilities, which is not allowed). The
2-dimensional case of such a transformation would consist of mapping
pure states on the surface of the Bloch sphere
to mixed states within the Bloch sphere. This would contract the Bloch
sphere by some finite amount and the reverse process would expand the
Bloch sphere, which cannot happen.
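A small sketch of the 2-dimensional depolarizing channel makes the Bloch-sphere contraction explicit; it assumes the standard parametrization ρ → (1 − p)ρ + p I/2, chosen here for illustration.

```python
# Depolarizing channel on one qubit: with probability p the state is
# replaced by the maximally mixed state I/2, contracting the Bloch vector
# by the factor (1 - p). Reversing this would require a Bloch vector longer
# than 1, i.e. a non-positive "state" -- hence the process is irreversible.
import numpy as np

I2 = np.eye(2, dtype=complex)
pauli = [np.array([[0, 1], [1, 0]], complex),     # sigma_x
         np.array([[0, -1j], [1j, 0]], complex),  # sigma_y
         np.array([[1, 0], [0, -1]], complex)]    # sigma_z

def depolarize(rho, p):
    return (1 - p) * rho + p * I2 / 2

def bloch_vector(rho):
    return np.real([np.trace(rho @ s) for s in pauli])

rho = 0.5 * (I2 + pauli[0])  # pure state |+><+|, Bloch vector (1, 0, 0)
for p in [0.0, 0.3, 0.6, 1.0]:
    r = bloch_vector(depolarize(rho, p))
    print(f"p={p:.1f}  Bloch vector length = {np.linalg.norm(r):.2f}")
# The length shrinks from 1 toward 0: pure states on the sphere's surface
# are mapped to mixed states strictly inside it.
```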
Dissipation
Dissipation is a decohering process by which the populations
of quantum states are changed due to entanglement with a bath. An
example of this would be a quantum system that can exchange its energy
with a bath through the interaction Hamiltonian. If the system is not in its ground state
and the bath is at a temperature lower than the system's, then
the system will give off energy to the bath, and thus higher-energy
eigenstates of the system Hamiltonian will decohere to the ground state
after cooling and, as such, will all be non-degenerate. Since the states are no longer degenerate, they are distinguishable, and thus this process is irreversible (non-unitary).
Timescales
Decoherence
represents an extremely fast process for macroscopic objects, since
these are interacting with many microscopic objects, with an enormous
number of degrees of freedom in their natural environment. The process
is needed if we are to understand why we tend not to observe quantum
behavior in everyday macroscopic objects and why we do see classical
fields emerge from the properties of the interaction between matter and
radiation for large amounts of matter. The time taken for off-diagonal
components of the density matrix to effectively vanish is called the decoherence time. It is typically extremely short for everyday, macroscale processes.
A modern basis-independent definition of the decoherence time relies on
the short-time behavior of the fidelity between the initial and the
time-dependent state or, equivalently, the decay of the purity.
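As an illustration of this purity-based diagnostic, the sketch below (assuming the pure-dephasing model from the semigroup section, used here only as a convenient solvable case) tracks Tr ρ² for a qubit that starts in a pure superposition.

```python
# Basis-independent decoherence diagnostic: the purity Tr(rho^2) starts at 1
# for a pure state and decays toward 1/2 as coherence is lost; its short-time
# decay rate sets a decoherence timescale without picking a basis.
import numpy as np

def dephased_rho(t, gamma):
    # Reduced state of a qubit starting in |+><+| under pure dephasing:
    # the off-diagonals carry the factor exp(-2*gamma*t) (cf. the Lindblad
    # sketch above).
    coh = 0.5 * np.exp(-2 * gamma * t)
    return np.array([[0.5, coh], [coh, 0.5]])

gamma = 1.0
for t in [0.0, 0.25, 0.5, 1.0, 3.0]:
    rho = dephased_rho(t, gamma)
    purity = np.trace(rho @ rho).real
    print(f"t={t:.2f}  purity={purity:.4f}")
# Analytically, purity = 1/2 + (1/2) exp(-4*gamma*t); the initial decay of
# the purity (or of the fidelity with the initial state) defines the
# decoherence time.
```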
Mathematical details
Assume for the moment that the system in question consists of a subsystem A being studied and the "environment" $\epsilon$, and the total Hilbert space is the tensor product of a Hilbert space $\mathcal{H}_A$ describing A and a Hilbert space $\mathcal{H}_\epsilon$ describing $\epsilon$, that is,
$$\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_\epsilon.$$
This is a reasonably good approximation in the case where A and $\epsilon$ are relatively independent (e.g. there is nothing like parts of A mixing with parts of $\epsilon$
or conversely). The point is, the interaction with the environment is
for all practical purposes unavoidable (e.g. even a single excited atom
in a vacuum would emit a photon, which would then go off). Let's say
this interaction is described by a unitary transformation $U$ acting upon $\mathcal{H}$. Assume that the initial state of the environment is $|\text{in}\rangle$, and the initial state of A is the superposition state
$$c_1 |\psi_1\rangle + c_2 |\psi_2\rangle,$$
where $|\psi_1\rangle$ and $|\psi_2\rangle$ are orthogonal, and there is no entanglement initially. Also, choose an orthonormal basis $\{|e_i\rangle\}$ for $\mathcal{H}_A$.
(This could be a "continuously indexed basis" or a mixture of
continuous and discrete indexes, in which case we would have to use a rigged Hilbert space
and be more careful about what we mean by orthonormal, but that's an
inessential detail for expository purposes.) Then, we can expand
$U\big(|\psi_1\rangle \otimes |\text{in}\rangle\big)$ and $U\big(|\psi_2\rangle \otimes |\text{in}\rangle\big)$
uniquely as
$$\sum_i |e_i\rangle \otimes |f_{1i}\rangle$$
and
$$\sum_i |e_i\rangle \otimes |f_{2i}\rangle,$$
respectively. One thing to realize is that the environment contains a
huge number of degrees of freedom, a good number of them interacting
with each other all the time. This makes the following assumption
reasonable in a handwaving way, which can be shown to be true in some
simple toy models. Assume that there exists a basis $\{|e_i\rangle\}$ for $\mathcal{H}_A$ such that $|f_{1i}\rangle$ and $|f_{1j}\rangle$ are all approximately orthogonal to a good degree if $i \neq j$, and the same thing for $|f_{2i}\rangle$ and $|f_{2j}\rangle$, and also for $|f_{1i}\rangle$ and $|f_{2j}\rangle$ for any $i$ and $j$ (the decoherence property).
This often turns out to be true (as a reasonable conjecture) in the position basis because how A interacts with the environment would often depend critically upon the position of the objects in A. Then, if we take the partial trace over the environment, we would find the density state is approximately described by
$$\rho_A \approx \sum_i \big( |c_1|^2 \langle f_{1i}|f_{1i}\rangle + |c_2|^2 \langle f_{2i}|f_{2i}\rangle \big)\, |e_i\rangle\langle e_i|;$$
that is, we have a diagonal mixed state, there is no constructive or destructive interference, and the "probabilities" add up classically. The time it takes for U(t) (the unitary operator as a function of time) to display the decoherence property is called the decoherence time.
Experimental observations
Quantitative measurement
The decoherence rate depends on a number of factors, including
temperature and uncertainty in position, and many experiments have
tried to measure it as a function of the external environment.
The process of a quantum superposition being gradually obliterated by decoherence was quantitatively measured for the first time by Serge Haroche and his co-workers at the École Normale Supérieure in Paris in 1996. Their approach involved sending individual rubidium
atoms, each in a superposition of two states, through a
microwave-filled cavity. The two quantum states both cause shifts in the
phase of the microwave field, but by different amounts, so that the
field itself is also put into a superposition of two states. Due to
photon scattering on cavity-mirror imperfections, the cavity field loses
phase coherence to the environment. Haroche and his colleagues measured
the resulting decoherence via correlations between the states of pairs
of atoms sent through the cavity with various time delays between the
atoms.
Decoherence represents a challenge for the practical realization of quantum computers,
since such machines are expected to rely heavily on the undisturbed
evolution of quantum coherences. Simply put, they require that the
coherence of states be preserved and that decoherence be managed, in
order to actually perform quantum computation. The preservation of
coherence, and mitigation of decoherence effects, are thus related to
the concept of quantum error correction.
In August 2020, scientists reported that ionizing radiation from environmental radioactive materials and cosmic rays may substantially limit the coherence times of qubits
if they are not shielded adequately, which may be critical for realizing
fault-tolerant superconducting quantum computers in the future.