Saturday, June 29, 2024

Quantum decoherence

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Quantum_decoherence
In classical scattering of a target body by environmental photons, the motion of the target body will not be changed by the scattered photons on the average. In quantum scattering, the interaction between the scattered photons and the superposed target body will cause them to be entangled, thereby delocalizing the phase coherence from the target body to the whole system, rendering the interference pattern unobservable.

Quantum decoherence is the loss of quantum coherence. Quantum decoherence has been studied to understand how quantum systems convert to systems that can be explained by classical mechanics. Beginning with attempts to extend the understanding of quantum mechanics, the theory has developed in several directions, and experimental studies have confirmed some of the key issues. Quantum computing relies on quantum coherence and is the primary practical application of the concept.

Concept

In quantum mechanics, particles such as electrons are described by a wave function, a mathematical representation of the quantum state of a system; a probabilistic interpretation of the wave function is used to explain various quantum effects. As long as there exists a definite phase relation between different states, the system is said to be coherent. A definite phase relationship is necessary to perform quantum computing on quantum information encoded in quantum states. Coherence is preserved under the laws of quantum physics.

If a quantum system were perfectly isolated, it would maintain coherence indefinitely, but it would be impossible to manipulate or investigate it. If it is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time ─ a process called quantum decoherence or environmental decoherence. As a result of this process, quantum behavior is apparently lost, just as energy appears to be lost by friction in classical mechanics.

Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath),[1] since every system is loosely coupled with the energetic state of its surroundings. Viewed in isolation, the system's dynamics are non-unitary (although the combined system plus environment evolves in a unitary fashion).[2] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with—or transferring it to—the surroundings.

History and interpretation

Relation to interpretation of quantum mechanics

An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum physics might correspond to experienced reality. Decoherence calculations can be done in any interpretation of quantum mechanics, since those calculations are an application of the standard mathematical tools of quantum theory. However, the subject of decoherence has been closely related to the problem of interpretation throughout its history.

Decoherence has been used to understand the possibility of the collapse of the wave function in quantum mechanics. Decoherence does not generate actual wave-function collapse. It only provides a framework for apparent wave-function collapse, as the quantum nature of the system "leaks" into the environment. That is, components of the wave function are decoupled from a coherent system and acquire phases from their immediate surroundings. A total superposition of the global or universal wavefunction still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue.

With respect to the measurement problem, decoherence provides an explanation for the transition of the system to a mixture of states that seem to correspond to those states observers perceive. Moreover, observation indicates that this mixture looks like a proper quantum ensemble in a measurement situation, as the measurements lead to the "realization" of precisely one state in the "ensemble".

The philosophical views of Werner Heisenberg and Niels Bohr have often been grouped together as the "Copenhagen interpretation", despite significant divergences between them on important points. In 1955, Heisenberg suggested that the interaction of a system with its surrounding environment would eliminate quantum interference effects. However, Heisenberg did not provide a detailed account of how this might transpire, nor did he make explicit the importance of entanglement in the process.

Origin of the concepts

Nevill Mott's solution to the iconic Mott problem in 1929 is considered in retrospect to be the first quantum decoherence work. It was cited by the first modern theoretical treatment.

Although he did not use the term, the concept of quantum decoherence was first introduced in 1951 by the American physicist David Bohm, who called it the "destruction of interference in the process of measurement". Bohm later used decoherence to handle the measurement process in the de Broglie-Bohm interpretation of quantum theory.

The significance of decoherence was further highlighted in 1970 by the German physicist H. Dieter Zeh, and it has been a subject of active research since the 1980s. Decoherence has been developed into a complete framework, but there is controversy as to whether it solves the measurement problem, as the founders of decoherence theory admit in their seminal papers.

The study of decoherence as a proper subject began in 1970, with H. Dieter Zeh's paper "On the Interpretation of Measurement in Quantum Theory". Zeh regarded the wavefunction as a physical entity, rather than a calculational device or a compendium of statistical information (as is typical for Copenhagen-type interpretations), and he proposed that it should evolve unitarily, in accord with the Schrödinger equation, at all times. Zeh was initially unaware of Hugh Everett III's earlier work, which also proposed a universal wavefunction evolving unitarily; he revised his paper to reference Everett after learning of Everett's "relative-state interpretation" through an article by Bryce DeWitt. (DeWitt was the one who termed Everett's proposal the many-worlds interpretation, by which name it is commonly known.) For Zeh, the question of how to interpret quantum mechanics was of key importance, and an interpretation along the lines of Everett's was the most natural.

Partly because of a general disinterest among physicists in interpretational questions, Zeh's work remained comparatively neglected until the early 1980s, when two papers by Wojciech Zurek invigorated the subject. Unlike Zeh's publications, Zurek's articles were fairly agnostic about interpretation, focusing instead on specific problems of density-matrix dynamics. Zurek's interest in decoherence stemmed from furthering Bohr's analysis of the double-slit experiment in his reply to the Einstein–Podolsky–Rosen paradox, work he had undertaken with Bill Wootters, and he has since argued that decoherence brings a kind of rapprochement between Everettian and Copenhagen-type views.

Decoherence does not claim to provide a mechanism for some actual wave-function collapse; rather it puts forth a reasonable framework for the appearance of wave-function collapse. The quantum nature of the system is simply "leaked" into the environment so that a total superposition of the wave function still exists, but exists—at least for all practical purposes—beyond the realm of measurement. By definition, the claim that a merged but unmeasurable wave function still exists cannot be proven experimentally. Decoherence is needed to understand why a quantum system begins to obey classical probability rules after interacting with its environment (due to the suppression of the interference terms when applying Born's probability rules to the system).

Criticism of the adequacy of decoherence theory to solve the measurement problem has been expressed by Anthony Leggett.

Mechanisms

To examine how decoherence operates, an "intuitive" model is presented below. The model requires some familiarity with quantum theory basics. Analogies are made between visualizable classical phase spaces and Hilbert spaces. A more rigorous derivation in Dirac notation shows how decoherence destroys interference effects and the "quantum nature" of systems. Next, the density matrix approach is presented for perspective.

Phase-space picture

An N-particle system can be represented in non-relativistic quantum mechanics by a wave function ψ(x₁, x₂, …, x_N), where each x_i is a point in 3-dimensional space. This has analogies with the classical phase space. A classical phase space contains a real-valued function in 6N dimensions (each particle contributes 3 spatial coordinates and 3 momenta). A "quantum" phase space, on the other hand, involves a complex-valued function on a 3N-dimensional space. The position and momenta are represented by operators that do not commute, and ψ lives in the mathematical structure of a Hilbert space. Aside from these differences, however, the rough analogy holds.

Different previously isolated, non-interacting systems occupy different phase spaces. Alternatively we can say that they occupy different lower-dimensional subspaces in the phase space of the joint system. The effective dimensionality of a system's phase space is the number of degrees of freedom present, which—in non-relativistic models—is 6 times the number of a system's free particles. For a macroscopic system this will be a very large dimensionality. When two systems (the environment being one system) start to interact, though, their associated state vectors are no longer constrained to the subspaces. Instead the combined state vector traces a path through the "larger volume", whose dimensionality is the sum of the dimensions of the two subspaces. The extent to which two vectors interfere with each other is a measure of how "close" they are to each other (formally, their overlap or Hilbert-space scalar product) in the phase space. When a system couples to an external environment, the dimensionality of, and hence "volume" available to, the joint state vector increases enormously. Each environmental degree of freedom contributes an extra dimension.

The original system's wave function can be expanded in many different ways as a sum of elements in a quantum superposition. Each expansion corresponds to a projection of the wave vector onto a basis. The basis can be chosen at will. If an expansion is chosen in which the resulting basis elements interact with the environment in an element-specific way, then such elements will—with overwhelming probability—be rapidly separated from each other by their natural unitary time evolution along their own independent paths. After a very short interaction, there is almost no chance of further interference. The process is effectively irreversible. The different elements effectively become "lost" from each other in the expanded phase space created by coupling with the environment. In phase space, this decoupling is monitored through the Wigner quasi-probability distribution. The original elements are said to have decohered. The environment has effectively selected out those expansions or decompositions of the original state vector that decohere (or lose phase coherence) with each other. This is called "environmentally-induced superselection", or einselection. The decohered elements of the system no longer exhibit quantum interference between each other, as in a double-slit experiment. Any elements that decohere from each other via environmental interactions are said to be quantum-entangled with the environment. The converse is not true: not all entangled states are decohered from each other.

Any measuring device or apparatus acts as an environment, since at some stage along the measuring chain, it has to be large enough to be read by humans. It must possess a very large number of hidden degrees of freedom. In effect, the interactions may be considered to be quantum measurements. As a result of an interaction, the wave functions of the system and the measuring device become entangled with each other. Decoherence happens when different portions of the system's wave function become entangled in different ways with the measuring device. For two einselected elements of the entangled system's state to interfere, both the original system and the measuring device in both elements must significantly overlap, in the scalar-product sense. If the measuring device has many degrees of freedom, it is very unlikely for this to happen.

As a consequence, the system behaves as a classical statistical ensemble of the different elements rather than as a single coherent quantum superposition of them. From the perspective of each ensemble member's measuring device, the system appears to have irreversibly collapsed onto a state with a precise value for the measured attributes, relative to that element. This provides one explanation of how the Born rule coefficients effectively act as probabilities as per the measurement postulate, constituting a solution to the quantum measurement problem.

Dirac notation

Using Dirac notation, let the system initially be in the state

|ψ⟩ = Σ_i |i⟩ ⟨i|ψ⟩,

where the |i⟩ form an einselected basis (an environmentally induced selected eigenbasis), and let the environment initially be in the state |ε⟩. The vector basis of the combination of the system and the environment consists of the tensor products of the basis vectors of the two subsystems. Thus, before any interaction between the two subsystems, the joint state can be written as

|before⟩ = Σ_i |i, ε⟩ ⟨i|ψ⟩,

where |i, ε⟩ is shorthand for the tensor product |i⟩ ⊗ |ε⟩. There are two extremes in the way the system can interact with its environment: either (1) the system loses its distinct identity and merges with the environment (e.g. photons in a cold, dark cavity get converted into molecular excitations within the cavity walls), or (2) the system is not disturbed at all, even though the environment is disturbed (e.g. the idealized non-disturbing measurement). In general, an interaction is a mixture of these two extremes, which we examine below.

System absorbed by environment

If the environment absorbs the system, each element of the total system's basis interacts with the environment such that

|i, ε⟩ evolves into |ε_i⟩,

and so

|before⟩ = Σ_i |i, ε⟩ ⟨i|ψ⟩ evolves into |after⟩ = Σ_i |ε_i⟩ ⟨i|ψ⟩.

The unitarity of time evolution demands that the total state basis remains orthonormal, i.e. the scalar or inner products of the basis vectors must vanish, since ⟨i|j⟩ = δ_ij:

⟨ε_i|ε_j⟩ = δ_ij.

This orthonormality of the environment states is the defining characteristic required for einselection.

System not disturbed by environment

In an idealized measurement, the system disturbs the environment, but is itself undisturbed by the environment. In this case, each element of the basis interacts with the environment such that

|i, ε⟩ evolves into the product |i, ε_i⟩ = |i⟩ ⊗ |ε_i⟩,

and so

|before⟩ = Σ_i |i, ε⟩ ⟨i|ψ⟩ evolves into |after⟩ = Σ_i |i, ε_i⟩ ⟨i|ψ⟩.

In this case, unitarity demands that

⟨i, ε_i | j, ε_j⟩ = ⟨i|j⟩ ⟨ε_i|ε_j⟩ = δ_ij ⟨ε_i|ε_i⟩ = δ_ij,

where ⟨ε_i|ε_i⟩ = 1 was used. Additionally, decoherence requires, by virtue of the large number of hidden degrees of freedom in the environment, that

⟨ε_i|ε_j⟩ ≈ δ_ij.

As before, this is the defining characteristic for decoherence to become einselection. The approximation becomes more exact as the number of environmental degrees of freedom affected increases.

Note that if the system basis |i⟩ were not an einselected basis, then the last condition is trivial, since the disturbed environment is not a function of i, and we have the trivial disturbed-environment basis |ε_j⟩ = |ε′⟩. This would correspond to the system basis being degenerate with respect to the environmentally defined measurement observable. For a complex environmental interaction (which would be expected for a typical macroscale interaction) a non-einselected basis would be hard to define.
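The approach of ⟨ε_i|ε_j⟩ toward δ_ij with growing environment size can be illustrated numerically. The following sketch (the product-state environment model, per-qubit rotation angle, and function name are illustrative assumptions, not from the source) treats each environment qubit as picking up a slightly different rotation depending on the system state, so the overlap of two environment "records" falls off as cos(Δθ)^n:

```python
import numpy as np

def env_record(theta: float, n: int) -> np.ndarray:
    """Product state of n environment qubits, each rotated by theta from |0>."""
    qubit = np.array([np.cos(theta), np.sin(theta)])
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, qubit)
    return state

# Two environment "records" of the system states |0> and |1>: each of the
# n environment qubits is kicked by a slightly different angle.
n = 12
eps0 = env_record(0.0, n)
eps1 = env_record(0.1, n)
print(eps0 @ eps1)         # cos(0.1)**12, still close to 1 for a tiny environment

# For a macroscopic number of affected degrees of freedom the overlap is
# effectively zero, which is the einselection condition <eps_i|eps_j> ~ delta_ij:
print(np.cos(0.1) ** 1e6)  # underflows to 0.0
```

Even a weak per-degree-of-freedom coupling thus makes distinct environment records effectively orthogonal once the environment is large.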

Loss of interference and the transition from quantum to classical probabilities

The utility of decoherence lies in its application to the analysis of probabilities, before and after environmental interaction, and in particular to the vanishing of quantum interference terms after decoherence has occurred. If we ask what is the probability of observing the system making a transition from |ψ⟩ to |φ⟩ before |ψ⟩ has interacted with its environment, then application of the Born probability rule states that the transition probability is the squared modulus of the scalar product of the two states:

prob_before(ψ → φ) = |⟨ψ|φ⟩|² = |Σ_i ψ_i* φ_i|² = Σ_i |ψ_i* φ_i|² + Σ_{i ≠ j} ψ_i* ψ_j φ_j* φ_i,

where ψ_i = ⟨i|ψ⟩, ψ_i* = ⟨ψ|i⟩, and φ_i = ⟨i|φ⟩, etc.

The above expansion of the transition probability has terms with i ≠ j; these can be thought of as representing interference between the different basis elements or quantum alternatives. This is a purely quantum effect and represents the non-additivity of the probabilities of quantum alternatives.

To calculate the probability of observing the system making a quantum leap from |ψ⟩ to |φ⟩ after |ψ⟩ has interacted with its environment, application of the Born probability rule states that we must sum over all the relevant possible states |ε_j⟩ of the environment before squaring the modulus:

prob_after(ψ → φ) = Σ_j |(⟨φ| ⊗ ⟨ε_j|) |after⟩|² = Σ_j |Σ_i ψ_i φ_i* ⟨ε_j|ε_i⟩|².

The internal summation vanishes when we apply the decoherence/einselection condition ⟨ε_j|ε_i⟩ ≈ δ_ij, and the formula simplifies to

prob_after(ψ → φ) = Σ_i |ψ_i* φ_i|².

If we compare this with the formula we derived before the environment introduced decoherence, we can see that the effect of decoherence has been to move the summation sign Σ_i from inside the modulus sign to outside. As a result, all the cross- or quantum interference terms

Σ_{i ≠ j} ψ_i* ψ_j φ_j* φ_i

have vanished from the transition-probability calculation. The decoherence has irreversibly converted quantum behaviour (additive probability amplitudes) to classical behaviour (additive probabilities). However, Ballentine shows that the significant impact of decoherence to reduce interference need not have significance for the transition of quantum systems to classical limits.
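The movement of the summation from inside the modulus to outside can be checked directly. A minimal NumPy sketch (the basis dimension and random amplitudes are illustrative assumptions): the coherent transition probability differs from the decohered one by exactly the i ≠ j cross terms.

```python
import numpy as np

rng = np.random.default_rng(0)
# Normalized amplitudes psi_i = <i|psi> and phi_i = <i|phi> in a 4-dim basis
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
phi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)
phi /= np.linalg.norm(phi)

terms = np.conj(psi) * phi                 # the products psi_i^* phi_i
p_before = abs(terms.sum()) ** 2           # sum inside the modulus (coherent)
p_after = (abs(terms) ** 2).sum()          # sum outside the modulus (decohered)

# The difference is exactly the sum of the i != j interference terms
cross = sum(terms[i] * np.conj(terms[j])
            for i in range(4) for j in range(4) if i != j)
print(p_before - p_after, cross.real)      # equal; the total cross term is real
```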

In terms of density matrices, the loss of interference effects corresponds to the diagonalization of the "environmentally traced-over" density matrix.

Density-matrix approach

The effect of decoherence on density matrices is essentially the decay or rapid vanishing of the off-diagonal elements of the partial trace of the joint system's density matrix, i.e. the trace, with respect to any environmental basis, of the density matrix of the combined system and its environment. The decoherence irreversibly converts the "averaged" or "environmentally traced-over" density matrix from a pure state to a reduced mixture; it is this that gives the appearance of wave-function collapse. Again, this is called "environmentally induced superselection", or einselection. The advantage of taking the partial trace is that this procedure is indifferent to the environmental basis chosen.
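The partial trace itself is mechanical. In this NumPy sketch (the two-level system, perfectly orthogonal environment records, and helper name are illustrative assumptions), tracing the environment out of an entangled system–environment pure state leaves a diagonal reduced density matrix:

```python
import numpy as np

def partial_trace_env(rho: np.ndarray, d_sys: int, d_env: int) -> np.ndarray:
    """Trace the environment out of a (d_sys*d_env)-dimensional density matrix."""
    return np.trace(rho.reshape(d_sys, d_env, d_sys, d_env), axis1=1, axis2=3)

c = np.array([0.6, 0.8j])                  # system amplitudes, |c0|^2 + |c1|^2 = 1
basis = np.eye(2)
# Entangled state after einselection: sum_i c_i |i>|eps_i>, with orthogonal eps_i
psi = sum(c[i] * np.kron(basis[i], basis[i]) for i in range(2))
rho = np.outer(psi, psi.conj())            # pure state of system + environment
print(partial_trace_env(rho, 2, 2).real)   # diag(0.36, 0.64): off-diagonals gone
```

The joint state is pure, but the reduced matrix is a classical-looking mixture, which is the einselection effect described above.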

Initially, the density matrix of the combined system can be denoted as

ρ = |before⟩ ⟨before| = |ψ⟩ ⟨ψ| ⊗ |ε⟩ ⟨ε|,

where |ε⟩ is the state of the environment. Then if the transition happens before any interaction takes place between the system and the environment, the environment subsystem has no part and can be traced out, leaving the reduced density matrix for the system:

ρ_sys = Tr_env(ρ) = |ψ⟩ ⟨ψ|.

Now the transition probability will be given as

prob_before(ψ → φ) = ⟨φ| ρ_sys |φ⟩ = |⟨ψ|φ⟩|² = Σ_i |ψ_i* φ_i|² + Σ_{i ≠ j} ψ_i* ψ_j φ_j* φ_i,

where ψ_i = ⟨i|ψ⟩ and φ_i = ⟨i|φ⟩, etc.

Now the case when the transition takes place after the interaction of the system with the environment. The combined density matrix will be

ρ = |after⟩ ⟨after| = Σ_{i,j} ψ_i ψ_j* |i, ε_i⟩ ⟨j, ε_j|.

To get the reduced density matrix of the system, we trace out the environment and employ the decoherence/einselection condition ⟨ε_i|ε_j⟩ ≈ δ_ij, and we see that the off-diagonal terms vanish (a result obtained by Erich Joos and H. D. Zeh in 1985):

ρ_sys = Tr_env(Σ_{i,j} ψ_i ψ_j* |i, ε_i⟩ ⟨j, ε_j|) = Σ_i |ψ_i|² |i⟩ ⟨i|.

Similarly, the final reduced density matrix after the transition will be

|φ⟩ ⟨φ|.

The transition probability will then be given as

prob_after(ψ → φ) = Tr(ρ_sys |φ⟩ ⟨φ|) = Σ_i |ψ_i|² |φ_i|²,

which has no contribution from the interference terms

Σ_{i ≠ j} ψ_i ψ_j* φ_i* φ_j.
The density-matrix approach has been combined with the Bohmian approach to yield a reduced-trajectory approach, taking into account the system reduced density matrix and the influence of the environment.

Operator-sum representation

Consider a system S and environment (bath) B, which are closed and can be treated quantum-mechanically. Let ℋ_S and ℋ_B be the system's and bath's Hilbert spaces respectively. Then the Hamiltonian for the combined system is

H = H_S ⊗ I_B + I_S ⊗ H_B + H_I,

where H_S and H_B are the system and bath Hamiltonians respectively, H_I is the interaction Hamiltonian between the system and bath, and I_S and I_B are the identity operators on the system and bath Hilbert spaces respectively. The time-evolution of the density operator of this closed system is unitary and, as such, is given by

ρ(t) = U(t) ρ(0) U†(t),

where the unitary operator is U(t) = e^(−iHt/ħ). If the system and bath are not entangled initially, then we can write ρ(0) = ρ_S(0) ⊗ ρ_B(0). Therefore, the evolution of the system becomes

ρ(t) = U(t) [ρ_S(0) ⊗ ρ_B(0)] U†(t).

The system–bath interaction Hamiltonian can be written in a general form as

H_I = Σ_i S_i ⊗ B_i,

where S_i ⊗ B_i is the operator acting on the combined system–bath Hilbert space, and S_i and B_i are the operators that act on the system and bath respectively. This coupling of the system and bath is the cause of decoherence in the system alone. To see this, a partial trace is performed over the bath to give a description of the system alone:

ρ_S(t) = Tr_B { U(t) [ρ_S(0) ⊗ ρ_B(0)] U†(t) }.

ρ_S(t) is called the reduced density matrix and gives information about the system only. If the bath is written in terms of its set of orthogonal basis kets, that is, if it has been initially diagonalized, then ρ_B(0) = Σ_j λ_j |j⟩ ⟨j|. Computing the partial trace with respect to this (computational) basis gives

ρ_S(t) = Σ_l A_l ρ_S(0) A_l†,

where the A_l are defined as the Kraus operators and are represented as (the index l combines the indices k and j)

A_l = √λ_j ⟨k| U(t) |j⟩.

This is known as the operator-sum representation (OSR). A condition on the Kraus operators can be obtained by using the fact that Tr[ρ_S(t)] = 1; this then gives

Σ_l A_l† A_l = I_S.

This restriction determines whether decoherence will occur or not in the OSR. In particular, when there is more than one term present in the sum for ρ_S(t), then the dynamics of the system will be non-unitary, and hence decoherence will take place.
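The more-than-one-Kraus-operator criterion can be made concrete with the standard phase-damping channel (the particular channel and damping strength are illustrative assumptions, not taken from the source):

```python
import numpy as np

gamma = 0.3  # dephasing strength of the bath coupling (illustrative)
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]])
K1 = np.array([[0.0, 0.0], [0.0, np.sqrt(gamma)]])

# Completeness condition: sum_l K_l^dagger K_l = I (trace preservation)
assert np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2))

rho = 0.5 * np.ones((2, 2))  # |+><+|, maximal coherence
rho_out = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T
print(rho_out)  # populations unchanged; coherences shrunk by sqrt(1 - gamma)
```

Because two Kraus operators appear in the sum, the map is non-unitary: the populations are untouched while the off-diagonal coherences are suppressed.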

Semigroup approach

A more general consideration for the existence of decoherence in a quantum system is given by the master equation, which determines how the density matrix of the system alone evolves in time (see also the Belavkin equation for the evolution under continuous measurement). This uses the Schrödinger picture, where evolution of the state (represented by its density matrix) is considered. The master equation is

d/dt ρ_S(t) = −(i/ħ) [H̃_S, ρ_S(t)] + L_D[ρ_S(t)],

where H̃_S = H_S + Δ is the system Hamiltonian H_S along with a (possible) unitary contribution Δ from the bath, and L_D is the Lindblad decohering term. The Lindblad decohering term is represented as

L_D[ρ_S(t)] = ½ Σ_{α,β=1}^{M} b_{αβ} ( [F_α, ρ_S(t) F_β†] + [F_α ρ_S(t), F_β†] ).

The {F_α} are basis operators for the M-dimensional space of bounded operators that act on the system Hilbert space ℋ_S and are the error generators. The matrix elements b_{αβ} represent the elements of a positive semi-definite Hermitian matrix; they characterize the decohering processes and, as such, are called the noise parameters. The semigroup approach is particularly nice, because it distinguishes between the unitary and decohering (non-unitary) processes, which is not the case with the OSR. In particular, the non-unitary dynamics are represented by L_D, whereas the unitary dynamics of the state are represented by the usual Heisenberg commutator. Note that when L_D[ρ_S(t)] = 0, the dynamical evolution of the system is unitary. The conditions for the evolution of the system density matrix to be described by the master equation are:[2]

  1. the evolution of the system density matrix is determined by a one-parameter semigroup
  2. the evolution is "completely positive" (i.e. probabilities are preserved)
  3. the system and bath density matrices are initially decoupled
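A minimal numerical sketch of such a master equation (the single σ_z error generator, the rate, and the explicit Euler integration are illustrative assumptions): with one dephasing Lindblad operator, the off-diagonal elements of ρ_S decay while the populations stay fixed.

```python
import numpy as np

kappa = 0.5                                  # decohering rate (illustrative)
L = np.sqrt(kappa) * np.diag([1.0, -1.0])    # single error generator ~ sigma_z
H = np.zeros((2, 2))                         # no coherent part: pure dephasing
rho = 0.5 * np.ones((2, 2), dtype=complex)   # |+><+|, maximal coherence

dt, steps = 0.001, 2000                      # Euler-integrate to t = 2
for _ in range(steps):
    comm = -1j * (H @ rho - rho @ H)         # unitary (commutator) part
    lind = (L @ rho @ L.conj().T
            - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    rho = rho + dt * (comm + lind)

print(rho.real)  # diagonal stays 0.5; rho01 decays ~ 0.5*exp(-2*kappa*t)
```

For this generator the coherence obeys dρ01/dt = −2κ ρ01, so the numerical decay can be checked against the exponential solution.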

Non-unitary modelling examples

Decoherence can be modelled as a non-unitary process by which a system couples with its environment (although the combined system plus environment evolves in a unitary fashion). Thus the dynamics of the system alone, treated in isolation, are non-unitary and, as such, are represented by irreversible transformations acting on the system's Hilbert space ℋ. Since the system's dynamics are represented by irreversible transformations, any information present in the quantum system can be lost to the environment or heat bath. Alternatively, the decay of quantum information caused by the coupling of the system to the environment is referred to as decoherence. Thus decoherence is the process by which information of a quantum system is altered by the system's interaction with its environment (the two together forming a closed system), creating an entanglement between the system and heat bath (environment). As such, since the system is entangled with its environment in some unknown way, a description of the system by itself cannot be made without also referring to the environment (i.e. without also describing the state of the environment).

Rotational decoherence

Consider a system of N qubits that is coupled to a bath symmetrically. Suppose this system of N qubits undergoes a rotation around the eigenstates {|0⟩, |1⟩} of J_z. Then under such a rotation, a random phase φ will be created between the eigenstates |0⟩, |1⟩ of J_z. Thus these basis qubits |0⟩ and |1⟩ will transform in the following way:

|0⟩ → |0⟩,   |1⟩ → e^(iφ) |1⟩.

This transformation is performed by the rotation operator

R_z(φ) = [ 1   0
           0   e^(iφ) ].

Since any qubit in this space can be expressed in terms of the basis qubits, all such qubits will be transformed under this rotation. Consider the jth qubit in a pure state |ψ_j⟩ = a|0⟩ + b|1⟩, where |a|² + |b|² = 1. Before application of the rotation this state is:

ρ_j = |ψ_j⟩ ⟨ψ_j| = [ |a|²    a b*
                      a* b    |b|² ].

This state will decohere, since the phase φ is random, i.e. the state is not "encoded" with (dependent upon) the dephasing factor e^(iφ). This can be seen by examining the density matrix averaged over the random phase φ:

ρ_j = ∫ R_z(φ) |ψ_j⟩ ⟨ψ_j| R_z†(φ) p(φ) dφ,

where p(φ) is a probability measure of the random phase φ. Although not entirely necessary, let us assume for simplicity that this is given by the Gaussian distribution p(φ) = (1/√(4πα)) e^(−φ²/(4α)), where α represents the spread of the random phase. Then the density matrix computed as above is

ρ_j = [ |a|²            a b* e^(−α)
        a* b e^(−α)     |b|² ].

Observe that the off-diagonal elements (the coherence terms) decay as the spread of the random phase, α, increases over time (which is a realistic expectation). Thus the density matrices for each qubit of the system become indistinguishable over time. This means that no measurement can distinguish between the qubits, thus creating decoherence between the various qubit states. In particular, this dephasing process causes the qubits to collapse to one of the pure states in {|0⟩, |1⟩}. This is why this type of decoherence process is called collective dephasing, because the mutual phases between all qubits of the N-qubit system are destroyed.
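The e^(−α) suppression of the coherence terms can be reproduced by Monte Carlo averaging over the random phase (the amplitudes, spread, sample count, and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
a = b = 1 / np.sqrt(2)     # qubit a|0> + b|1> (illustrative amplitudes)
alpha = 0.4                # spread of the random phase

# A Gaussian p(phi) with spread alpha corresponds to variance 2*alpha
phi = rng.normal(0.0, np.sqrt(2 * alpha), size=200_000)
rho01 = a * np.conj(b) * np.mean(np.exp(-1j * phi))  # averaged coherence term
print(abs(rho01), a * b * np.exp(-alpha))            # both ~ 0.5*exp(-0.4)
```

The sample average of e^(−iφ) converges to e^(−α), matching the off-diagonal factor in the averaged density matrix.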

Depolarizing

Depolarizing is a non-unitary transformation on a quantum system which maps pure states to mixed states. This is a non-unitary process because any transformation that reverses this process will map states out of their respective Hilbert space thus not preserving positivity (i.e. the original probabilities are mapped to negative probabilities, which is not allowed). The 2-dimensional case of such a transformation would consist of mapping pure states on the surface of the Bloch sphere to mixed states within the Bloch sphere. This would contract the Bloch sphere by some finite amount and the reverse process would expand the Bloch sphere, which cannot happen.
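The Bloch-sphere contraction can be sketched directly (the channel parameter and helper names are illustrative assumptions):

```python
import numpy as np

def depolarize(rho: np.ndarray, p: float) -> np.ndarray:
    """Depolarizing channel: keep rho with prob 1-p, replace by I/2 with prob p."""
    return (1 - p) * rho + p * np.eye(2) / 2

def bloch_vector(rho: np.ndarray) -> np.ndarray:
    """(x, y, z) Bloch coordinates of a single-qubit density matrix."""
    return np.array([2 * rho[0, 1].real,
                     -2 * rho[0, 1].imag,
                     (rho[0, 0] - rho[1, 1]).real])

rho = np.array([[0.5, 0.5], [0.5, 0.5]])   # pure |+>: Bloch vector (1, 0, 0)
print(np.linalg.norm(bloch_vector(rho)))                   # 1.0, on the sphere
print(np.linalg.norm(bloch_vector(depolarize(rho, 0.3))))  # 0.7, strictly inside
```

The channel scales every Bloch vector by 1 − p; inverting it would scale vectors by 1/(1 − p) and push some outside the sphere, which is exactly the positivity violation described above.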

Dissipation

Dissipation is a decohering process by which the populations of quantum states are changed due to entanglement with a bath. An example of this would be a quantum system that can exchange its energy with a bath through the interaction Hamiltonian. If the system is not in its ground state and the bath is at a temperature lower than that of the system, then the system will give off energy to the bath, and thus higher-energy eigenstates of the system Hamiltonian will decohere to the ground state after cooling and, as such, will all be non-degenerate. Since the states are no longer degenerate, they are distinguishable, and thus this process is irreversible (non-unitary).

Timescales

Decoherence represents an extremely fast process for macroscopic objects, since these are interacting with many microscopic objects, with an enormous number of degrees of freedom in their natural environment. The process is needed if we are to understand why we tend not to observe quantum behavior in everyday macroscopic objects and why we do see classical fields emerge from the properties of the interaction between matter and radiation for large amounts of matter. The time taken for off-diagonal components of the density matrix to effectively vanish is called the decoherence time. It is typically extremely short for everyday, macroscale processes. A modern basis-independent definition of the decoherence time relies on the short-time behavior of the fidelity between the initial and the time-dependent state or, equivalently, the decay of the purity.

Mathematical details

Assume for the moment that the system in question consists of a subsystem A being studied and the "environment" E, and the total Hilbert space is the tensor product of a Hilbert space ℋ_A describing A and a Hilbert space ℋ_E describing E, that is,

ℋ = ℋ_A ⊗ ℋ_E.

This is a reasonably good approximation in the case where A and E are relatively independent (e.g. there is nothing like parts of A mixing with parts of E or conversely). The point is, the interaction with the environment is for all practical purposes unavoidable (e.g. even a single excited atom in a vacuum would emit a photon, which would then go off). Let's say this interaction is described by a unitary transformation U acting upon ℋ. Assume that the initial state of the environment is |in⟩, and the initial state of A is the superposition state

a |ψ₁⟩ + b |ψ₂⟩,

where |ψ₁⟩ and |ψ₂⟩ are orthogonal, and there is no entanglement initially. Also, choose an orthonormal basis {|e_i⟩} for ℋ_E. (This could be a "continuously indexed basis" or a mixture of continuous and discrete indexes, in which case we would have to use a rigged Hilbert space and be more careful about what we mean by orthonormal, but that's an inessential detail for expository purposes.) Then, we can expand

U(|ψ₁⟩ ⊗ |in⟩)

and

U(|ψ₂⟩ ⊗ |in⟩)

uniquely as

Σ_i |e_i⟩ |f_1i⟩

and

Σ_i |e_i⟩ |f_2i⟩,

respectively. One thing to realize is that the environment contains a huge number of degrees of freedom, a good number of them interacting with each other all the time. This makes the following assumption reasonable in a handwaving way, which can be shown to be true in some simple toy models. Assume that there exists a basis {|e_i⟩} for ℋ_E such that ⟨f_1i|f_1j⟩ and ⟨f_2i|f_2j⟩ are all approximately zero to a good degree if i ≠ j, and also ⟨f_1i|f_2j⟩ ≈ 0 for any i and j (the decoherence property).

This often turns out to be true (as a reasonable conjecture) in the position basis because how A interacts with the environment would often depend critically upon the position of the objects in A. Then, if we take the partial trace over the environment, we would find the density state is approximately described by

ρ_A ≈ Σ_i ( |a|² |f_1i⟩ ⟨f_1i| + |b|² |f_2i⟩ ⟨f_2i| ),

that is, we have a diagonal mixed state, there is no constructive or destructive interference, and the "probabilities" add up classically. The time it takes for U(t) (the unitary operator as a function of time) to display the decoherence property is called the decoherence time.

Experimental observations

Quantitative measurement

The decoherence rate depends on a number of factors, including temperature or uncertainty in position, and many experiments have tried to measure it depending on the external environment.

The process of a quantum superposition being gradually obliterated by decoherence was quantitatively measured for the first time by Serge Haroche and his co-workers at the École Normale Supérieure in Paris in 1996. Their approach involved sending individual rubidium atoms, each in a superposition of two states, through a microwave-filled cavity. The two quantum states both cause shifts in the phase of the microwave field, but by different amounts, so that the field itself is also put into a superposition of two states. Due to photon scattering on cavity-mirror imperfections, the cavity field loses phase coherence to the environment. Haroche and his colleagues measured the resulting decoherence via correlations between the states of pairs of atoms sent through the cavity with various time delays between the atoms.

In July 2011, researchers from the University of British Columbia and the University of California, Santa Barbara, showed that applying high magnetic fields to single-molecule magnets suppressed two of three known sources of decoherence. They were able to measure the dependence of decoherence on temperature and magnetic field strength.

Applications

Decoherence represents a challenge for the practical realization of quantum computers, since such machines are expected to rely heavily on the undisturbed evolution of quantum coherences. Simply put, they require that the coherence of states be preserved and that decoherence be managed in order to actually perform quantum computation. The preservation of coherence and the mitigation of decoherence effects are thus related to the concept of quantum error correction.
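The principle behind error correction, redundant encoding plus a recovery step, can be illustrated with the classical analogue of the three-qubit bit-flip repetition code. This is only a sketch of the redundancy idea (real quantum codes extract error syndromes without measuring the data qubits directly):

```python
# Classical analogue of the 3-qubit bit-flip repetition code: redundant
# encoding lets any single flipped bit be corrected by majority vote.

def encode(bit):
    # One logical bit -> three physical bits.
    return [bit, bit, bit]

def flip(codeword, i):
    # Inject a single bit-flip error at position i.
    out = list(codeword)
    out[i] ^= 1
    return out

def decode(codeword):
    # Majority vote recovers the logical bit despite one error.
    return 1 if sum(codeword) >= 2 else 0

# Any single flip is corrected:
for i in range(3):
    assert decode(flip(encode(1), i)) == 1
    assert decode(flip(encode(0), i)) == 0
```

Two simultaneous flips defeat the vote, which is why lowering the physical error rate (e.g. by shielding) remains essential even with error correction.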

In August 2020, scientists reported that ionizing radiation from environmental radioactive materials and cosmic rays may substantially limit the coherence times of qubits if they are not adequately shielded, which may be critical for realizing fault-tolerant superconducting quantum computers in the future.

Quantum programming

From Wikipedia, the free encyclopedia

Quantum programming is the process of designing or assembling sequences of instructions, called quantum circuits, using gates, switches, and operators to manipulate a quantum system for a desired outcome or results of a given experiment. Quantum circuit algorithms can be implemented on integrated circuits, conducted with instrumentation, or written in a programming language for use with a quantum computer or a quantum processor.

With quantum processor based systems, quantum programming languages help express quantum algorithms using high-level constructs. The field is deeply rooted in the open-source philosophy and as a result most of the quantum software discussed in this article is freely available as open-source software.

Quantum computers, such as those based on the KLM protocol, a linear optical quantum computing (LOQC) model, use quantum algorithms (circuits) implemented with electronics, integrated circuits, instrumentation, sensors, and/or by other physical means.

Other circuits designed for experimentation related to quantum systems can be instrumentation and sensor based.

Quantum instruction sets

Quantum instruction sets are used to turn higher level algorithms into physical instructions that can be executed on quantum processors. Sometimes these instructions are specific to a given hardware platform, e.g. ion traps or superconducting qubits.
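To make the idea concrete, here is a minimal, self-contained Python sketch of an interpreter executing a two-instruction program against a two-qubit statevector. The instruction names ("h", "cnot") and the interpreter itself are our own illustration, not any real instruction set:

```python
import math

# Toy interpreter: executes a tiny instruction list against a 2-qubit
# statevector, showing how instructions map onto state updates.

def apply_h(state, q):
    # Hadamard on qubit q: |0> -> (|0>+|1>)/sqrt(2), |1> -> (|0>-|1>)/sqrt(2).
    s = math.sqrt(0.5)
    new = [0j] * len(state)
    for i, a in enumerate(state):
        flipped = i ^ (1 << q)
        if (i >> q) & 1 == 0:
            new[i] += s * a
            new[flipped] += s * a
        else:
            new[flipped] += s * a
            new[i] -= s * a
    return new

def apply_cnot(state, ctrl, tgt):
    # Swap amplitudes of basis states whose control bit is 1 and that
    # differ only in the target bit.
    new = list(state)
    for i in range(len(state)):
        if ((i >> ctrl) & 1) and not ((i >> tgt) & 1):
            j = i ^ (1 << tgt)
            new[i], new[j] = state[j], state[i]
    return new

program = [("h", 0), ("cnot", 0, 1)]  # prepares a Bell state from |00>

state = [1 + 0j, 0j, 0j, 0j]  # |00>, little-endian bit order
for op, *args in program:
    state = apply_h(state, *args) if op == "h" else apply_cnot(state, *args)

# state now has amplitude ~0.707 on |00> and |11>, 0 elsewhere.
```

A hardware-specific instruction set would replace these in-memory updates with pulse sequences or gate operations native to the platform, but the program-as-instruction-list shape is the same.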

cQASM

cQASM,[3] also known as common QASM, is a hardware-agnostic quantum assembly language that guarantees interoperability between quantum compilation and simulation tools. It was introduced by the QCA Lab at TU Delft.

Quil

Quil is an instruction set architecture for quantum computing that first introduced a shared quantum/classical memory model. It was introduced by Robert Smith, Michael Curtis, and William Zeng in A Practical Quantum Instruction Set Architecture. Many quantum algorithms (including quantum teleportation, quantum error correction, simulation, and optimization algorithms) require a shared memory architecture.

OpenQASM

OpenQASM is the intermediate representation introduced by IBM for use with Qiskit and the IBM Q Experience.
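OpenQASM programs are plain text. The snippet below holds a hand-written Bell-state circuit in standard OpenQASM 2.0 syntax inside a Python string (no toolchain is needed to read it; the structural check at the end is only a trivial illustration):

```python
# A Bell-state program in OpenQASM 2.0 syntax, the textual format that
# Qiskit can import and export.
BELL_QASM = """OPENQASM 2.0;
include "qelib1.inc";
qreg q[2];
creg c[2];
h q[0];
cx q[0],q[1];
measure q -> c;
"""

# Trivial structural check: every statement ends with a semicolon.
statements = [line for line in BELL_QASM.splitlines() if line]
assert all(line.endswith(";") for line in statements)
```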

Blackbird

Blackbird is a quantum instruction set and intermediate representation used by Xanadu Quantum Technologies and Strawberry Fields. It is designed to represent continuous-variable quantum programs that can run on photonic quantum hardware.

Quantum software development kits

Quantum software development kits provide collections of tools to create and manipulate quantum programs. They also provide the means to simulate the quantum programs or prepare them to be run using cloud-based quantum devices and self-hosted quantum devices.

SDKs with access to quantum processors

The following software development kits can be used to run quantum circuits on prototype quantum devices, as well as on simulators.

Perceval

An open-source project created by Quandela for designing photonic quantum circuits and developing quantum algorithms, based on Python. Simulations are run either on the user's own computer or on the cloud. Perceval is also used to connect to Quandela's cloud-based photonic quantum processor.

Ocean

An open-source suite of tools developed by D-Wave. Written mostly in the Python programming language, it enables users to formulate problems in the Ising model and quadratic unconstrained binary optimization (QUBO) formats. Results can be obtained by submitting to an online quantum computer in Leap (D-Wave's real-time Quantum Application Environment), to customer-owned machines, or to classical samplers.
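A QUBO instance is just a set of coefficients on pairs of binary variables whose quadratic energy is to be minimized. The sketch below uses our own dictionary encoding and a brute-force search standing in for a real sampler, to show what such a formulation looks like:

```python
import itertools

# Minimal QUBO sketch: a dict mapping variable pairs to coefficients,
# with energy sum(Q[u, v] * x_u * x_v) minimized over binary variables.

def qubo_energy(sample, Q):
    return sum(coef * sample[u] * sample[v] for (u, v), coef in Q.items())

def brute_force_minimum(Q):
    # Exhaustive search over all 2^n assignments (a stand-in for a sampler).
    variables = sorted({v for pair in Q for v in pair})
    best = None
    for bits in itertools.product((0, 1), repeat=len(variables)):
        sample = dict(zip(variables, bits))
        energy = qubo_energy(sample, Q)
        if best is None or energy < best[1]:
            best = (sample, energy)
    return best

# Max-cut on a triangle, in QUBO form: each edge (u, v) contributes
# 2*x_u*x_v - x_u - x_v, i.e. -1 exactly when the edge is cut.
Q = {("a", "a"): -2, ("b", "b"): -2, ("c", "c"): -2,
     ("a", "b"): 2, ("a", "c"): 2, ("b", "c"): 2}

sample, energy = brute_force_minimum(Q)  # energy -2: two edges cut
```

On real hardware the same dictionary would be handed to a sampler instead of enumerated, but the problem encoding is identical.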


ProjectQ

An Open Source project developed at the Institute for Theoretical Physics at ETH, which uses the Python programming language to create and manipulate quantum circuits. Results are obtained either using a simulator, or by sending jobs to IBM quantum devices.

Qrisp

Qrisp is an open-source project coordinated by the Eclipse Foundation and developed in Python by Fraunhofer FOKUS. It is a high-level programming language for creating and compiling quantum algorithms. Its structured programming model enables scalable development and maintenance. The expressive syntax is based on variables instead of qubits, with QuantumVariable as the core class, and on functions instead of gates. Additional tools, such as a performant simulator and automatic uncomputation, complement the extensive framework. Furthermore, it is platform independent, since it offers alternative compilation of elementary functions down to the circuit level, based on device-specific gate sets.

Qiskit

An open-source project developed by IBM. Quantum circuits are created and manipulated using Python. Results are obtained either using simulators that run on the user's own device, simulators provided by IBM, or prototype quantum devices provided by IBM. As well as the ability to create programs using basic quantum operations, higher-level tools for algorithms and benchmarking are available within specialized packages. Qiskit is based on the OpenQASM standard for representing quantum circuits. It also supports pulse-level control of quantum systems via the Qiskit Pulse standard.

Qibo

An open-source full-stack API for quantum simulation, quantum hardware control, and calibration, developed by multiple research laboratories, including QRC, CQT, and INFN. Qibo is a modular framework that includes multiple backends for quantum simulation and hardware control. The project aims to provide a platform-agnostic quantum hardware control framework with drivers for multiple instruments and tools for quantum calibration, characterization, and validation. The framework focuses on self-hosted quantum devices by simplifying the software development required in labs.

Forest

An open source project developed by Rigetti, which uses the Python programming language to create and manipulate quantum circuits. Results are obtained either using simulators or prototype quantum devices provided by Rigetti. As well as the ability to create programs using basic quantum operations, higher level algorithms are available within the Grove package. Forest is based on the Quil instruction set.

t|ket>

A quantum programming environment and optimizing compiler developed by Cambridge Quantum Computing that targets simulators and several quantum hardware back-ends, released in December 2018.

Strawberry Fields

An open-source Python library developed by Xanadu Quantum Technologies for designing, simulating, and optimizing continuous variable (CV) quantum optical circuits. Three simulators are provided - one in the Fock basis, one using the Gaussian formulation of quantum optics, and one using the TensorFlow machine learning library. Strawberry Fields is also the library for executing programs on Xanadu's quantum photonic hardware.

PennyLane

An open-source Python library developed by Xanadu Quantum Technologies for differentiable programming of quantum computers. PennyLane provides users the ability to create models using TensorFlow, NumPy, or PyTorch, and connect them with quantum computer backends available from IBMQ, Google Quantum, Rigetti, Quantinuum and Alpine Quantum Technologies.

Quantum Development Kit

A project developed by Microsoft as part of the .NET Framework. Quantum programs can be written and run within Visual Studio and VS Code using the quantum programming language Q#. Programs developed in the QDK can be run on Microsoft's Azure Quantum and on quantum computers from Quantinuum, IonQ, and Pasqal.

Cirq

An Open Source project developed by Google, which uses the Python programming language to create and manipulate quantum circuits. Programs written in Cirq can be run on IonQ, Pasqal, Rigetti, and Alpine Quantum Technologies.

Quantum programming languages

There are two main groups of quantum programming languages: imperative quantum programming languages and functional quantum programming languages.

Imperative languages

The most prominent representatives of the imperative languages are QCL, LanQ and Q|SI>.

Ket

Ket is an open-source embedded language designed to facilitate quantum programming, leveraging the familiar syntax and simplicity of Python. It serves as an integral component of the Ket Quantum Programming Platform, seamlessly integrating with a Rust runtime library and a quantum simulator. Maintained by Quantuloop, the project emphasizes accessibility and versatility for researchers and developers. The following example demonstrates the implementation of a Bell state using Ket:

from ket import *
a, b = quant(2) # Allocate two quantum bits
H(a) # Put qubit `a` in a superposition
cnot(a, b) # Entangle the two qubits in the Bell state
m_a = measure(a) # Measure qubit `a`, collapsing qubit `b` as well
m_b = measure(b) # Measure qubit `b`
# Assert that the measurement of both qubits will always be equal
assert m_a.value == m_b.value

QCL

Quantum Computation Language (QCL) is one of the first implemented quantum programming languages. The most important feature of QCL is the support for user-defined operators and functions. Its syntax resembles the syntax of the C programming language and its classical data types are similar to primitive data types in C. One can combine classical code and quantum code in the same program.

Quantum pseudocode

Quantum pseudocode, proposed by E. Knill, is the first formalized language for the description of quantum algorithms. It was introduced together with, and is tightly connected to, a model of quantum machine called the Quantum Random Access Machine (QRAM).

Q#

A language developed by Microsoft to be used with the Quantum Development Kit.

Q|SI>

Q|SI> is a platform embedded in the .NET language supporting quantum programming in a quantum extension of the while-language. The platform includes a compiler for the quantum while-language and a chain of tools for the simulation of quantum computation, optimisation of quantum circuits, termination analysis of quantum programs, and verification of quantum programs.

Q language

Q Language is the second implemented imperative quantum programming language. Q Language was implemented as an extension of C++ programming language. It provides classes for basic quantum operations like QHadamard, QFourier, QNot, and QSwap, which are derived from the base class Qop. New operators can be defined using C++ class mechanism.

Quantum memory is represented by class Qreg.

Qreg x1; // 1-qubit quantum register with initial value 0
Qreg x2(2,0); // 2-qubit quantum register with initial value 0

The computation process is executed using a provided simulator. Noisy environments can be simulated using parameters of the simulator.

qGCL

Quantum Guarded Command Language (qGCL) was defined by P. Zuliani in his PhD thesis. It is based on Guarded Command Language created by Edsger Dijkstra.

It can be described as a language of quantum programs specification.

QMASM

Quantum Macro Assembler (QMASM) is a low-level language specific to quantum annealers such as the D-Wave.

Scaffold

Scaffold is a C-like language that compiles to QASM and OpenQASM. It is built on top of the LLVM Compiler Infrastructure to perform optimizations on Scaffold code before generating a specified instruction set.

Silq

Silq is a high-level programming language for quantum computing with a strong static type system, developed at ETH Zürich.

LQP

The Logic of Quantum Programs (LQP) is a dynamic quantum logic, capable of expressing important features of quantum measurements and unitary evolutions of multi-partite states, and provides logical characterizations of various forms of entanglement. The logic has been used to specify and verify the correctness of various protocols in quantum computation.

Functional languages

Efforts are underway to develop functional programming languages for quantum computing. Functional programming languages are well-suited for reasoning about programs. Examples include Selinger's QPL, and the Haskell-like language QML by Altenkirch and Grattage. Higher-order quantum programming languages, based on lambda calculus, have been proposed by van Tonder, Selinger and Valiron and by Arrighi and Dowek.

QFC and QPL

QFC and QPL are two closely related quantum programming languages defined by Peter Selinger. They differ only in their syntax: QFC uses a flow chart syntax, whereas QPL uses a textual syntax. These languages have classical control flow but can operate on quantum or classical data. Selinger gives a denotational semantics for these languages in a category of superoperators.

QML

QML is a Haskell-like quantum programming language by Altenkirch and Grattage. Unlike Selinger's QPL, this language takes duplication, rather than discarding, of quantum information as a primitive operation. Duplication in this context is understood to be the operation that maps |φ⟩ to |φ⟩ ⊗ |φ⟩, and is not to be confused with the impossible operation of cloning; the authors claim it is akin to how sharing is modeled in classical languages. QML also introduces both classical and quantum control operators, whereas most other languages rely on classical control.
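The distinction between duplicating basis states and cloning arbitrary states can be seen in a few lines of Python. This is our own sketch of a CNOT onto a fresh |0⟩ ancilla, which copies |0⟩ and |1⟩ faithfully but entangles, rather than clones, a superposition:

```python
s = 2 ** -0.5

def duplicate(amps):
    """amps: {0: a0, 1: a1}, a one-qubit state. CNOT onto a fresh |0>
    ancilla: a0|0> + a1|1>  ->  a0|00> + a1|11>."""
    return {(b, b): a for b, a in amps.items()}

print(duplicate({0: 1.0}))       # {(0, 0): 1.0}: the basis state |0> is copied
print(duplicate({0: s, 1: s}))   # a Bell state, NOT (|0>+|1>)(|0>+|1>)/2
```

A true clone of (|0⟩+|1⟩)/√2 would put weight on all four basis states (|00⟩, |01⟩, |10⟩, |11⟩); the output here has weight only on |00⟩ and |11⟩, which is why no-cloning is not violated.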

An operational semantics for QML is given in terms of quantum circuits, while a denotational semantics is presented in terms of superoperators, and these are shown to agree. Both the operational and denotational semantics have been implemented (classically) in Haskell.

LIQUi|>

LIQUi|> (pronounced liquid) is a quantum simulation extension on the F# programming language. It is currently being developed by the Quantum Architectures and Computation Group (QuArC) part of the StationQ efforts at Microsoft Research. LIQUi|> seeks to allow theorists to experiment with quantum algorithm design before physical quantum computers are available for use.

It includes a programming language, optimization and scheduling algorithms, and quantum simulators. LIQUi|> can be used to translate a quantum algorithm written in the form of a high-level program into the low-level machine instructions for a quantum device.

Quantum lambda calculi

Quantum lambda calculi are extensions of the classical lambda calculus introduced by Alonzo Church and Stephen Cole Kleene in the 1930s. The purpose of quantum lambda calculi is to extend quantum programming languages with a theory of higher-order functions.

The first attempt to define a quantum lambda calculus was made by Philip Maymin in 1996. His lambda-q calculus is powerful enough to express any quantum computation. However, this language can efficiently solve NP-complete problems, and therefore appears to be strictly stronger than the standard quantum computational models (such as the quantum Turing machine or the quantum circuit model). Therefore, Maymin's lambda-q calculus is probably not implementable on a physical device.

In 2003, André van Tonder defined an extension of the lambda calculus suitable for proving correctness of quantum programs. He also provided an implementation in the Scheme programming language.

In 2004, Selinger and Valiron defined a strongly typed lambda calculus for quantum computation with a type system based on linear logic.

Quipper

Quipper was published in 2013. It is implemented as an embedded language, using Haskell as the host language. For this reason, quantum programs written in Quipper are written in Haskell using the provided libraries. For example, the following code implements preparation of a superposition:

import Quipper

spos :: Bool -> Circ Qubit
spos b = do q <- qinit b
            r <- hadamard q
            return r

Natural aristocracy

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Natural_aristocracy ...