
Wednesday, December 18, 2024

Von Neumann entropy

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Von_Neumann_entropy

In physics, the von Neumann entropy, named after John von Neumann, is an extension of the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is

S = −tr(ρ ln ρ),

where tr denotes the trace and ln denotes the (natural) matrix logarithm. If the density matrix ρ is written in a basis of its eigenvectors |1〉, |2〉, |3〉, ... as

ρ = Σj ηj |j〉〈j|,

then the von Neumann entropy is merely

S = −Σj ηj ln ηj.

In this form, S can be seen as the information theoretic Shannon entropy.

The von Neumann entropy is also used in different forms (conditional entropies, relative entropies, etc.) in the framework of quantum information theory to characterize the entropy of entanglement.

Background

John von Neumann established a rigorous mathematical framework for quantum mechanics in his 1932 work Mathematical Foundations of Quantum Mechanics. In it, he provided a theory of measurement, where the usual notion of wave-function collapse is described as an irreversible process (the so-called von Neumann or projective measurement).

The density matrix was introduced, with different motivations, by von Neumann and by Lev Landau. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector. On the other hand, von Neumann introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements.

The density matrix formalism, thus developed, extended the tools of classical statistical mechanics to the quantum domain. In the classical framework, the probability distribution and partition function of the system allow us to compute all possible thermodynamic quantities. Von Neumann introduced the density matrix to play the same role in the context of quantum states and operators in a complex Hilbert space. The knowledge of the statistical density matrix operator would allow us to compute all average quantum entities in a conceptually similar, but mathematically different, way.

Let us suppose we have a set of wave functions |Ψ〉 that depend parametrically on a set of quantum numbers n1, n2, ..., nN. The natural variable which we have is the amplitude with which a particular wavefunction of the basic set participates in the actual wavefunction of the system. Let us denote the square of this amplitude by p(n1, n2, ..., nN). The goal is to turn this quantity p into the classical density function in phase space. We have to verify that p goes over into the density function in the classical limit, and that it has ergodic properties. After checking that p(n1, n2, ..., nN) is a constant of motion, an ergodic assumption for the probabilities p(n1, n2, ..., nN) makes p a function of the energy only.

After this procedure, one finally arrives at the density matrix formalism when seeking a form where p(n1, n2, ..., nN) is invariant with respect to the representation used. In the form it is written, it will only yield the correct expectation values for quantities which are diagonal with respect to the quantum numbers n1, n2, ..., nN.

Expectation values of operators which are not diagonal involve the phases of the quantum amplitudes. Suppose we encode the quantum numbers n1, n2, ..., nN into the single index i or j. Then our wave function has the form

|Ψ〉 = Σi ai |ψi〉.

The expectation value of an operator B which is not diagonal in these wave functions is then

〈B〉 = Σi,j ai* aj 〈ψi|B|ψj〉.

The role which was originally reserved for the quantities |ai|² is thus taken over by the density matrix of the system S:

〈ψj|ρ|ψi〉 = aj ai*.

Therefore, 〈B〉 reads

〈B〉 = tr(ρ B).

The invariance of the above expression is described by matrix theory. The trace is invariant under cyclic permutations, and both the matrices ρ and B can be transformed into whatever basis is convenient, typically a basis of the eigenvectors. By cyclic permutations of the matrix product, it can be seen that an identity matrix will arise and so the trace will not be affected by the change in basis. In this framework the expectation value of a quantum operator, described by a matrix, is obtained by taking the trace of the product of the density operator and the operator (the Hilbert scalar product between operators). The matrix formalism here arose in the statistical mechanics framework, although it applies as well to finite quantum systems, which is usually the case, where the state of the system cannot be described by a pure state but only by a statistical operator of the above form. Mathematically, ρ is a positive-semidefinite Hermitian matrix with unit trace.

Definition

Given the density matrix ρ, von Neumann defined the entropy as

S(ρ) = −tr(ρ ln ρ),

which is a proper extension of the Gibbs entropy (up to a factor kB) and the Shannon entropy to the quantum case. To compute S(ρ) it is convenient (see logarithm of a matrix) to compute the eigendecomposition ρ = Σj ηj |j〉〈j|. The von Neumann entropy is then given by

S(ρ) = −Σj ηj ln ηj.
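
For illustration, a minimal numerical sketch of this prescription (assuming NumPy is available; the example matrices are arbitrary choices): diagonalize ρ and sum −ηj ln ηj over the nonzero eigenvalues.

    import numpy as np

    def von_neumann_entropy(rho, tol=1e-12):
        # S(rho) = -tr(rho ln rho), computed from the eigenvalues of rho
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > tol]               # treat 0 ln 0 as 0
        return float(-np.sum(evals * np.log(evals)))

    pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # a pure state: S = 0
    mixed = np.eye(2) / 2                        # maximally mixed qubit: S = ln 2
    print(von_neumann_entropy(pure), von_neumann_entropy(mixed))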

Properties

Some properties of the von Neumann entropy:

  • S(ρ) is zero if and only if ρ represents a pure state.
  • S(ρ) is maximal and equal to ln N for a maximally mixed state, N being the dimension of the Hilbert space.
  • S(ρ) is invariant under changes in the basis of ρ, that is, S(ρ) = S(UρU†), with U a unitary transformation.
  • S(ρ) is concave, that is, given a collection of positive numbers λi which sum to unity (Σi λi = 1) and density operators ρi, we have
S(Σi λi ρi) ≥ Σi λi S(ρi).
  • S(ρ) satisfies the bound
S(Σi λi ρi) ≤ Σi λi S(ρi) − Σi λi ln λi,
where equality is achieved if the ρi have orthogonal support, and as before ρi are density operators and λi is a collection of positive numbers which sum to unity (Σi λi = 1).
  • S(ρ) is additive for independent systems. Given two density matrices ρA, ρB describing independent systems A and B, we have
S(ρA ⊗ ρB) = S(ρA) + S(ρB).
  • S(ρ) is strongly subadditive for any three systems A, B, and C:
S(ρABC) + S(ρB) ≤ S(ρAB) + S(ρBC).
This automatically means that S(ρ) is subadditive:
S(ρAC) ≤ S(ρA) + S(ρC).

Below, the concept of subadditivity is discussed, followed by its generalization to strong subadditivity.

Subadditivity

If ρA, ρB are the reduced density matrices of the general state ρAB, then

|S(ρA) − S(ρB)| ≤ S(ρAB) ≤ S(ρA) + S(ρB).

This right hand inequality is known as subadditivity. The two inequalities together are sometimes known as the triangle inequality. They were proved in 1970 by Huzihiro Araki and Elliott H. Lieb. While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case, i.e., it is possible that S(ρAB) = 0, while S(ρA) = S(ρB) > 0.

Intuitively, this can be understood as follows: In quantum mechanics, the entropy of the joint system can be less than the sum of the entropy of its components because the components may be entangled. For instance, as seen explicitly, the Bell state of two spin-½s,

|ψ〉 = (|↑↓〉 + |↓↑〉)/√2,
is a pure state with zero entropy, but each spin has maximum entropy when considered individually in its reduced density matrix. The entropy in one spin can be "cancelled" by being correlated with the entropy of the other. The left-hand inequality can be roughly interpreted as saying that entropy can only be cancelled by an equal amount of entropy.

If system A and system B have different amounts of entropy, the smaller can only partially cancel the greater, and some entropy must be left over. Likewise, the right-hand inequality can be interpreted as saying that the entropy of a composite system is maximized when its components are uncorrelated, in which case the total entropy is just a sum of the sub-entropies. This may be more intuitive in the phase-space formulation, instead of the Hilbert space one, where the von Neumann entropy amounts to minus the expected value of the ⋆-logarithm of the Wigner function, −∫ f ⋆ log⋆ f  dx dp, up to an offset shift. Up to this normalization offset shift, the entropy is majorized by that of its classical limit.
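
For concreteness, a self-contained numerical sketch (assuming NumPy; the basis labeling ↑ → 0, ↓ → 1 is ours) builds this Bell state, traces out one spin, and confirms that the joint entropy vanishes while each reduced state has entropy ln 2.

    import numpy as np

    def S(rho):
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)))

    # Bell state (|ud> + |du>)/sqrt(2), written as a 4-component vector.
    psi = np.zeros(4)
    psi[1] = psi[2] = 1 / np.sqrt(2)
    rho_AB = np.outer(psi, psi)

    # Reduced state of spin A: partial trace over spin B.
    rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

    print(S(rho_AB))   # ~0: the joint state is pure
    print(S(rho_A))    # ~0.693 = ln 2: one spin alone is maximally mixed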

Strong subadditivity

The von Neumann entropy is also strongly subadditive. Given three Hilbert spaces, A, B, C,

S(ρABC) + S(ρB) ≤ S(ρAB) + S(ρBC).

This is a more difficult theorem and was proved first by J. Kiefer in 1959 and independently by Elliott H. Lieb and Mary Beth Ruskai in 1973, using a matrix inequality of Elliott H. Lieb proved in 1973. By using the proof technique that establishes the left side of the triangle inequality above, one can show that the strong subadditivity inequality is equivalent to the following inequality:

S(ρA) + S(ρC) ≤ S(ρAB) + S(ρBC),

when ρAB, etc. are the reduced density matrices of a density matrix ρABC. If we apply ordinary subadditivity to the left side of this inequality, and consider all permutations of A, B, C, we obtain the triangle inequality for ρABC: Each of the three numbers S(ρAB), S(ρBC), S(ρAC) is less than or equal to the sum of the other two.

Canonical ensemble

Theorem. The canonical distribution is the unique maximum of the Helmholtz free entropy F(ρ) = S(ρ) − β tr(ρH), which has the solution ρ = e^(−βH)/Z in the eigenbasis of the Hamiltonian operator H. This state has free entropy F = ln Z, where Z = tr e^(−βH) is the partition function.

Equivalently, the canonical distribution is the unique maximum of the entropy under an energy-expectation constraint:

maximize S(ρ) subject to tr(ρH) = E.
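
As a numerical illustration of this variational statement (a sketch assuming NumPy; the 4×4 Hamiltonian is a random placeholder), one can build the Gibbs state in the eigenbasis of H and check that its free entropy equals ln Z and bounds that of other states.

    import numpy as np

    beta = 0.7
    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4))
    H = (A + A.T) / 2                          # an arbitrary Hermitian "Hamiltonian"

    # Gibbs state rho = exp(-beta H) / Z, built in the eigenbasis of H.
    e, U = np.linalg.eigh(H)
    w = np.exp(-beta * e)
    Z = w.sum()
    rho_gibbs = (U * (w / Z)) @ U.T

    def free_entropy(rho):
        # S(rho) - beta * tr(rho H)
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)) - beta * np.trace(rho @ H))

    print(free_entropy(rho_gibbs), np.log(Z))        # both equal ln Z
    print(free_entropy(np.eye(4) / 4) <= np.log(Z))  # True: any other state does worse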

Coarse graining

Since, for a pure state, the density matrix is idempotent, ρ = ρ2, the entropy S(ρ) for it vanishes. Thus, if the system is finite (finite-dimensional matrix representation), the entropy S(ρ) quantifies the departure of the system from a pure state. In other words, it codifies the degree of mixing of the state describing a given finite system.

Measurement decoheres a quantum system into something noninterfering and ostensibly classical; so, e.g., the vanishing entropy of a pure state |Ψ〉 = (|0〉 + |1〉)/√2, corresponding to a density matrix

ρ = ½ (|0〉〈0| + |0〉〈1| + |1〉〈0| + |1〉〈1|),

increases to S = ln 2 for the measurement outcome mixture

ρ = ½ (|0〉〈0| + |1〉〈1|),
as the quantum interference information is erased.

However, if the measuring device is also quantum mechanical, and it starts at a pure state as well, then the joint system of device-system is just a larger quantum system. Since it starts at a pure state, it ends up with a pure state as well, and so the von Neumann entropy never increases. The problem can be resolved by using the idea of coarse graining.

Concretely, let the system be a qubit, and let the measuring device be another qubit. The measuring device starts at the |0〉 state. The measurement process is a CNOT gate, so that we have |00〉 → |00〉 and |10〉 → |11〉. That is, if the system starts at the pure |1〉 state, then after measuring, the measurement device is also at the pure |1〉 state.

Now if the system starts at the (|0〉 + |1〉)/√2 state, then after measurement, the joint system is in the Bell state (|00〉 + |11〉)/√2. The vN entropy of the joint system is still 0, since it is still a pure state. However, if we coarse grain the system by measuring the vN entropy of just the device, then of just the system qubit, and add them together, we get ln 2 + ln 2 = 2 ln 2.

By subadditivity, S(ρA) + S(ρB) ≥ S(ρAB), that is, any way to coarse-grain the entire system into parts would equal or increase the vN entropy.
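
A short numerical version of this coarse-graining example (a sketch assuming NumPy; the variable names are ours) builds the CNOT, applies it to the (|0〉 + |1〉)/√2 ⊗ |0〉 input, and compares the joint entropy with the sum of the coarse-grained entropies.

    import numpy as np

    def S(rho):
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)))

    # System qubit in (|0> + |1>)/sqrt(2), device qubit in |0>.
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    zero = np.array([1.0, 0.0])
    psi_in = np.kron(plus, zero)

    # CNOT with the system as control and the device as target.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)
    psi_out = CNOT @ psi_in                      # the Bell state (|00> + |11>)/sqrt(2)
    rho = np.outer(psi_out, psi_out)

    rho4 = rho.reshape(2, 2, 2, 2)
    rho_sys = np.trace(rho4, axis1=1, axis2=3)   # trace out the device
    rho_dev = np.trace(rho4, axis1=0, axis2=2)   # trace out the system

    print(S(rho))                       # 0: the joint state is still pure
    print(S(rho_sys) + S(rho_dev))      # 2 ln 2: the coarse-grained entropy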

Quantum thermodynamics

From Wikipedia, the free encyclopedia

Quantum thermodynamics is the study of the relations between two independent physical theories: thermodynamics and quantum mechanics. The two independent theories address the physical phenomena of light and matter. In 1905, Albert Einstein argued that the requirement of consistency between thermodynamics and electromagnetism leads to the conclusion that light is quantized, obtaining the relation E = hν. This paper marked the dawn of quantum theory. Within a few decades quantum theory became established with an independent set of rules. Currently quantum thermodynamics addresses the emergence of thermodynamic laws from quantum mechanics. It differs from quantum statistical mechanics in its emphasis on dynamical processes out of equilibrium. In addition, there is a quest for the theory to be relevant for a single individual quantum system.

Dynamical view

There is an intimate connection of quantum thermodynamics with the theory of open quantum systems. Quantum mechanics inserts dynamics into thermodynamics, giving a sound foundation to finite-time thermodynamics. The main assumption is that the entire world is a large closed system, and therefore, time evolution is governed by a unitary transformation generated by a global Hamiltonian. For the combined system-bath scenario, the global Hamiltonian can be decomposed into

H = HS + HB + HSB,

where HS is the system Hamiltonian, HB is the bath Hamiltonian and HSB is the system-bath interaction.

The state of the system is obtained from a partial trace over the combined system and bath:

ρS(t) = trB(ρSB(t)).

Reduced dynamics is an equivalent description of the system dynamics utilizing only system operators. Assuming the Markov property for the dynamics, the basic equation of motion for an open quantum system is the Lindblad equation (GKLS):

dρS/dt = −(i/ħ)[HS, ρS] + LD(ρS),

where HS is a (Hermitian) Hamiltonian part and

LD(ρS) = Σn ( Vn ρS Vn† − ½ (ρS Vn†Vn + Vn†Vn ρS) )

is the dissipative part describing implicitly, through system operators, the influence of the bath on the system. The Markov property imposes that the system and bath are uncorrelated at all times: ρSB = ρS ⊗ ρB. The L-GKS equation is unidirectional and leads any initial state ρS to a steady state solution which is an invariant of the equation of motion, dρS/dt = 0.
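
As a minimal illustration of this structure (a sketch assuming NumPy; the single damping operator V = √γ |0〉〈1| and the numerical rates are arbitrary choices), one can integrate the L-GKS equation for a single qubit with a crude Euler scheme and watch it relax to its steady state.

    import numpy as np

    hbar = 1.0
    omega, gamma = 1.0, 0.2
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    sm = np.array([[0, 1], [0, 0]], dtype=complex)    # lowering operator |0><1|

    H_S = 0.5 * omega * sz                            # system Hamiltonian
    V = np.sqrt(gamma) * sm                           # a single, arbitrary damping channel

    def lindblad_rhs(rho):
        # drho/dt = -(i/hbar)[H_S, rho] + V rho V+ - (1/2){V+ V, rho}
        unitary = (-1j / hbar) * (H_S @ rho - rho @ H_S)
        dissip = V @ rho @ V.conj().T - 0.5 * (V.conj().T @ V @ rho + rho @ V.conj().T @ V)
        return unitary + dissip

    rho = np.array([[0, 0], [0, 1]], dtype=complex)   # start in the excited state |1><1|
    dt = 0.01
    for _ in range(5000):                             # crude Euler integration to t = 50
        rho = rho + dt * lindblad_rhs(rho)

    print(np.round(rho.real, 3))                      # ~[[1, 0], [0, 0]]: the steady state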

The Heisenberg picture supplies a direct link to quantum thermodynamic observables. The dynamics of a system observable represented by the operator O has the form:

dO/dt = (i/ħ)[HS, O] + LD*(O) + ∂O/∂t,

where LD* is the adjoint (Heisenberg picture) dissipator and the possibility that the operator O is explicitly time-dependent is included.

Emergence of time derivative of first law of thermodynamics

When HS is chosen to be the operator O, the first law of thermodynamics emerges:

dE/dt = 〈∂HS/∂t〉 + 〈LD*(HS)〉,

where power is interpreted as

P = 〈∂HS/∂t〉

and the heat current

J = 〈LD*(HS)〉.

Additional conditions have to be imposed on the dissipator to be consistent with thermodynamics.

First, the invariant ρS(∞) should become an equilibrium Gibbs state. This implies that the dissipator LD should commute with the unitary part generated by HS. In addition, an equilibrium state is stationary and stable. This assumption is used to derive the Kubo-Martin-Schwinger stability criterion for thermal equilibrium, i.e. the KMS state.

A unique and consistent approach is obtained by deriving the generator, LD, in the weak system-bath coupling limit. In this limit, the interaction energy can be neglected. This approach represents a thermodynamic idealization: it allows energy transfer, while keeping a tensor product separation between the system and bath, i.e., a quantum version of an isothermal partition.

Markovian behavior involves a rather complicated cooperation between system and bath dynamics. This means that in phenomenological treatments, one cannot combine arbitrary system Hamiltonians, HS, with a given L-GKS generator. This observation is particularly important in the context of quantum thermodynamics, where it is tempting to study Markovian dynamics with an arbitrary control Hamiltonian. Erroneous derivations of the quantum master equation can easily lead to a violation of the laws of thermodynamics.

An external perturbation modifying the Hamiltonian of the system will also modify the heat flow. As a result, the L-GKS generator has to be renormalized. For a slow change, one can adopt the adiabatic approach and use the instantaneous system Hamiltonian to derive LD. An important class of problems in quantum thermodynamics is periodically driven systems. Periodic quantum heat engines and power-driven refrigerators fall into this class.

A reexamination of the time-dependent heat current expression using quantum transport techniques has been proposed.

A derivation of consistent dynamics beyond the weak coupling limit has been suggested.

Phenomenological formulations of irreversible quantum dynamics consistent with the second law and implementing the geometric idea of "steepest entropy ascent" or "gradient flow" have been suggested to model relaxation and strong coupling.

Emergence of the second law

The second law of thermodynamics is a statement on the irreversibility of dynamics or, equivalently, on the breaking of time-reversal symmetry (T-symmetry). This should be consistent with the empirical direct definition: heat will flow spontaneously from a hot source to a cold sink.

From a static viewpoint, for a closed quantum system, the 2nd law of thermodynamics is a consequence of the unitary evolution. In this approach, one accounts for the entropy change before and after a change in the entire system. A dynamical viewpoint is based on local accounting for the entropy changes in the subsystems and the entropy generated in the baths.

Entropy

In thermodynamics, entropy is related to the amount of energy of a system that can be converted into mechanical work in a concrete process. In quantum mechanics, this translates to the ability to measure and manipulate the system based on the information gathered by measurement. An example is the case of Maxwell’s demon, which has been resolved by Leó Szilárd.

The entropy of an observable is associated with the complete projective measurement of an observable, 〈A〉, where the operator A has a spectral decomposition:

A = Σα α Pα,

where Pα are the projection operators of the eigenvalue α. The probability of outcome α is pα = tr(ρ Pα). The entropy associated with the observable 〈A〉 is the Shannon entropy with respect to the possible outcomes:

SA = −Σα pα ln pα.

The most significant observable in thermodynamics is the energy, represented by the Hamiltonian operator H, and its associated energy entropy, SE.

John von Neumann suggested singling out the most informative observable to characterize the entropy of the system. This invariant is obtained by minimizing the entropy with respect to all possible observables. The most informative observable operator commutes with the state of the system. The entropy of this observable is termed the von Neumann entropy and is equal to

SvN = −tr(ρ ln ρ).

As a consequence, SvN ≤ SA for all observables A. At thermal equilibrium the energy entropy is equal to the von Neumann entropy: SE = SvN.

SvN is invariant to a unitary transformation changing the state. The von Neumann entropy is additive only for a system state that is composed of a tensor product of its subsystems:

SvN(ρA ⊗ ρB) = SvN(ρA) + SvN(ρB).
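
A small numerical check (a sketch assuming NumPy; the qubit state and the choice of a σz measurement are arbitrary) compares the Shannon entropy of a projective measurement with the von Neumann entropy of the same state, illustrating SvN ≤ SA.

    import numpy as np

    def vn_entropy(rho):
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log(p)))

    # An arbitrary mixed qubit state, not diagonal in the measurement basis.
    rho = np.array([[0.7, 0.3], [0.3, 0.3]])

    # Projective measurement of sigma_z: projectors onto |0> and |1>.
    projectors = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
    p = np.array([np.trace(P @ rho) for P in projectors])
    shannon = float(-np.sum(p * np.log(p)))

    print(shannon)            # entropy of the sigma_z measurement record
    print(vn_entropy(rho))    # von Neumann entropy: never larger than the above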

Clausius version of the II-law

No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature.

This statement for N-coupled heat baths in steady state becomes

Σn Jn/Tn ≤ 0,

where Jn is the heat current flowing from the n-th bath, held at temperature Tn, into the system.

A dynamical version of the II-law can be proven, based on Spohn's inequality:

tr( (Lρ)(ln ρ − ln ρst) ) ≤ 0,

which is valid for any L-GKS generator L with a stationary state ρst.

Consistency with thermodynamics can be employed to verify quantum dynamical models of transport. For example, local models for networks where local L-GKS equations are connected through weak links have been thought to violate the second law of thermodynamics. In 2018 it was shown that, by correctly taking into account all work and energy contributions in the full system, local master equations are fully consistent with the second law of thermodynamics.

Quantum and thermodynamic adiabatic conditions and quantum friction

Thermodynamic adiabatic processes have no entropy change. Typically, an external control modifies the state. A quantum version of an adiabatic process can be modeled by an externally controlled time-dependent Hamiltonian H(t). If the system is isolated, the dynamics are unitary, and therefore, S(ρ) is a constant. A quantum adiabatic process is defined by the energy entropy SE being constant. The quantum adiabatic condition is therefore equivalent to no net change in the population of the instantaneous energy levels. This implies that the Hamiltonian should commute with itself at different times: [H(t), H(t′)] = 0.

When the adiabatic conditions are not fulfilled, additional work is required to reach the final control value. For an isolated system, this work is recoverable, since the dynamics is unitary and can be reversed. In this case, quantum friction can be suppressed using shortcuts to adiabaticity as demonstrated in the laboratory using a unitary Fermi gas in a time-dependent trap.

The coherence stored in the off-diagonal elements of the density operator carries the required information to recover the extra energy cost and reverse the dynamics. Typically, this energy is not recoverable, due to interaction with a bath that causes energy dephasing. The bath, in this case, acts like a measuring apparatus of energy. This lost energy is the quantum version of friction.

Emergence of the dynamical version of the third law of thermodynamics

There are seemingly two independent formulations of the third law of thermodynamics. Both were originally stated by Walther Nernst. The first formulation is known as the Nernst heat theorem, and can be phrased as:

  • The entropy of any pure substance in thermodynamic equilibrium approaches zero as the temperature approaches zero.

The second formulation is dynamical, known as the unattainability principle:

  • It is impossible by any procedure, no matter how idealized, to reduce any assembly to absolute zero temperature in a finite number of operations.

At steady state the second law of thermodynamics implies that the total entropy production is non-negative.

When the cold bath approaches the absolute zero temperature, it is necessary to eliminate the entropy production divergence at the cold side when Tc → 0, therefore

σc ∝ Tc^α,  α ≥ 0.

For α = 0 the fulfillment of the second law depends on the entropy production of the other baths, which should compensate for the negative entropy production of the cold bath.

The first formulation of the third law modifies this restriction. Instead of α ≥ 0, the third law imposes α > 0, guaranteeing that at absolute zero the entropy production at the cold bath is zero: σc = 0. This requirement leads to the scaling condition of the heat current Jc ∝ Tc^(α+1).

The second formulation, known as the unattainability principle, can be rephrased as:

  • No refrigerator can cool a system to absolute zero temperature at finite time.

The dynamics of the cooling process is governed by the equation

Jc(Tc(t)) = −cV(Tc) dTc/dt,

where cV(Tc) is the heat capacity of the bath. Taking Jc ∝ Tc^(α+1) and cV ∝ Tc^η with η ≥ 0, we can quantify this formulation by evaluating the characteristic exponent ζ of the cooling process,

dTc/dt ∝ −Tc^ζ,  ζ = α + 1 − η,  as Tc → 0.

This equation introduces the relation between the characteristic exponents ζ and α. When ζ < 1, the bath is cooled to zero temperature in a finite time, which implies a violation of the third law. It is apparent from the last equation that the unattainability principle is more restrictive than the Nernst heat theorem.
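
As a rough numerical illustration of the last equation (a sketch in plain Python; the rate constant, initial temperature and cutoff time are arbitrary choices), integrating dTc/dt = −k Tc^ζ shows that ζ < 1 reaches Tc = 0 in finite time, while ζ ≥ 1 only approaches it asymptotically.

    def cool(zeta, T0=1.0, k=1.0, dt=1e-3, t_max=20.0):
        # Integrate dT/dt = -k * T**zeta until T reaches zero or t_max is exceeded.
        T, t = T0, 0.0
        while T > 0.0 and t < t_max:
            T -= dt * k * T**zeta
            t += dt
        return t, max(T, 0.0)

    print(cool(zeta=0.5))   # hits T = 0 at a finite time (about t = 2 here)
    print(cool(zeta=1.5))   # still above zero at t_max: cooling is only asymptotic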

Typicality as a source of emergence of thermodynamic phenomena

The basic idea of quantum typicality is that the vast majority of all pure states featuring a common expectation value of some generic observable at a given time will yield very similar expectation values of the same observable at any later time. This is meant to apply to Schrödinger type dynamics in high dimensional Hilbert spaces. As a consequence individual dynamics of expectation values are then typically well described by the ensemble average.

The quantum ergodic theorem, originated by John von Neumann, is a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what is termed normal typicality, i.e. the statement that, for typical large systems, every initial wave function ψ0 from an energy shell is 'normal': it evolves in such a way that, for most t, ψt is macroscopically equivalent to the micro-canonical density matrix.

Resource theory

The second law of thermodynamics can be interpreted as quantifying state transformations which are statistically unlikely, so that they become effectively forbidden. The second law typically applies to systems composed of many interacting particles; quantum thermodynamics resource theory is a formulation of thermodynamics in the regime where it can be applied to a small number of particles interacting with a heat bath. For processes which are cyclic or very close to cyclic, the second law for microscopic systems takes on a very different form than it does at the macroscopic scale, imposing not just one constraint on what state transformations are possible, but an entire family of constraints. These second laws are not only relevant for small systems, but also apply to individual macroscopic systems interacting via long-range interactions, which only satisfy the ordinary second law on average. By making precise the definition of thermal operations, the laws of thermodynamics take on a form with the first law defining the class of thermal operations, the zeroth law emerging as a unique condition ensuring the theory is nontrivial, and the remaining laws being a monotonicity property of generalised free energies.

Engineered reservoirs

The nanoscale allows for the preparation of quantum systems in physical states without classical analogs. There, complex out-of-equilibrium scenarios may be produced by the initial preparation of either the working substance or the reservoirs of quantum particles, the latter dubbed "engineered reservoirs".

There are different forms of engineered reservoirs. Some of them involve subtle quantum coherence or correlation effects, while others rely solely on nonthermal classical probability distribution functions. Interesting phenomena may emerge from the use of engineered reservoirs such as efficiencies greater than the Otto limit, violations of Clausius inequalities, or simultaneous extraction of heat and work from the reservoirs.

Sneaker wave

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Sneaker_wave ...