Quantum programming is the process of designing or assembling sequences of instructions, called quantum circuits, using gates, switches, and operators to manipulate a quantum system toward the desired outcome or result of a given experiment. Quantum circuit algorithms can be implemented on integrated circuits, conducted with instrumentation, or written in a programming language for use with a quantum computer or quantum processor.
With quantum-processor-based systems, quantum programming languages help express quantum algorithms using high-level constructs. The field is deeply rooted in the open-source philosophy, and as a result most of the quantum software discussed in this article is freely available as open-source software.
Quantum computers, such as those based on the KLM protocol, a linear optical quantum computing (LOQC) model, use quantum algorithms (circuits) implemented with electronics, integrated circuits, instrumentation, sensors, or other physical means.
Other circuits designed for experimentation related to quantum systems can be instrumentation and sensor based.
Quantum instruction sets
Quantum
instruction sets are used to turn higher level algorithms into physical
instructions that can be executed on quantum processors. Sometimes
these instructions are specific to a given hardware platform, e.g. ion traps or superconducting qubits.
cQASM
cQASM,[3] also known as common QASM, is a hardware-agnostic quantum assembly language that guarantees interoperability between quantum compilation and simulation tools. It was introduced by the QCA Lab at TU Delft.
Quil
is an instruction set architecture for quantum computing that first
introduced a shared quantum/classical memory model. It was introduced by
Robert Smith, Michael Curtis, and William Zeng in A Practical Quantum Instruction Set Architecture. Many quantum algorithms (including quantum teleportation, quantum error correction, simulation, and optimization algorithms) require a shared memory architecture.
Blackbird is a quantum instruction set and intermediate representation used by Xanadu Quantum Technologies and Strawberry Fields. It is designed to represent continuous-variable quantum programs that can run on photonic quantum hardware.
Quantum software development kits
Quantum software development kits provide collections of tools to create and manipulate quantum programs. They also provide the means to simulate the quantum programs or prepare them to be run using cloud-based quantum devices and self-hosted quantum devices.
SDKs with access to quantum processors
The following software development kits can be used to run quantum circuits on prototype quantum devices, as well as on simulators.
Perceval
An open-source project created by Quandela for designing photonic quantum circuits and developing quantum algorithms, based on Python. Simulations are run either on the user's own computer or on the cloud. Perceval is also used to connect to Quandela's cloud-based photonic quantum processor.
Ocean
An open-source suite of tools developed by D-Wave. Written mostly in the Python programming language, it enables users to formulate problems in the Ising model and quadratic unconstrained binary optimization (QUBO) formats.
Results can be obtained by submitting to an online quantum computer in
Leap, D-Wave's real-time Quantum Application Environment, customer-owned
machines, or classical samplers.
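As a rough illustration of the QUBO format that Ocean targets, the sketch below enumerates a tiny two-variable QUBO in plain Python; the function name `qubo_energy` and the brute-force search are illustrative stand-ins for what a D-Wave sampler would do at scale.

```python
from itertools import product

# Illustrative QUBO: minimize x0 + x1 - 2*x0*x1, whose minima are the
# assignments where x0 == x1. Keys (i, j) give the coefficient of x_i*x_j.
Q = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}

def qubo_energy(sample, Q):
    # Energy = sum over (i, j) of Q[(i, j)] * x_i * x_j
    return sum(coef * sample[i] * sample[j] for (i, j), coef in Q.items())

# Brute-force enumeration stands in for submitting to a quantum annealer.
best = min(product((0, 1), repeat=2), key=lambda s: qubo_energy(s, Q))
print(best, qubo_energy(best, Q))
```

On real hardware, the same dictionary-of-coefficients formulation would be handed to an Ocean sampler rather than enumerated exhaustively.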
ProjectQ
An open-source project developed at the Institute for Theoretical Physics at ETH, which uses the Python programming language to create and manipulate quantum circuits. Results are obtained either using a simulator, or by sending jobs to IBM quantum devices.
Qrisp
Qrisp is an open-source project coordinated by the Eclipse Foundation and developed in Python by Fraunhofer FOKUS.
Qrisp is a high-level programming language for creating and compiling
quantum algorithms. Its structured programming model enables scalable
development and maintenance. The expressive syntax is based on variables
instead of qubits, with the QuantumVariable as core class, and
functions instead of gates. Additional tools, such as a performant
simulator and automatic uncomputation, complement the extensive
framework.
Furthermore, it is platform independent, since it offers alternative
compilation of elementary functions down to the circuit level, based on
device-specific gate sets.
Qiskit
An open-source project developed by IBM. Quantum circuits are created and manipulated using Python. Results are obtained either using simulators that run on the user's own device, simulators provided by IBM, or prototype quantum devices provided by IBM. In addition to the ability to create programs using basic quantum operations, higher-level tools for algorithms and benchmarking are available within specialized packages. Qiskit is based on the OpenQASM standard for representing quantum circuits. It also supports pulse-level control of quantum systems via the QiskitPulse standard.
Qibo
An open-source full-stack API for quantum simulation, quantum hardware control and calibration, developed by multiple research laboratories, including QRC, CQT and INFN. Qibo is a modular framework that includes multiple backends for quantum simulation and hardware control. The project aims to provide a platform-agnostic quantum hardware control framework with drivers for multiple instruments, together with tools for quantum calibration, characterization and validation. The framework focuses on self-hosted quantum devices by simplifying the software development required in labs.
Forest
An open source project developed by Rigetti, which uses the Python programming
language to create and manipulate quantum circuits. Results are
obtained either using simulators or prototype quantum devices provided
by Rigetti. As well as the ability to create programs using basic
quantum operations, higher level algorithms are available within the
Grove package. Forest is based on the Quil instruction set.
t|ket>
A quantum programming environment and optimizing compiler developed by Cambridge Quantum Computing, released in December 2018, that targets simulators and several quantum hardware back-ends.
Strawberry Fields
An open-source Python library developed by Xanadu Quantum Technologies for designing, simulating, and optimizing continuous-variable (CV) quantum optical circuits. Three simulators are provided: one in the Fock basis, one using the Gaussian formulation of quantum optics, and one using the TensorFlow machine learning library. Strawberry Fields is also the library for executing programs on Xanadu's quantum photonic hardware.
Quantum Development Kit
A project developed by Microsoft as part of the .NET Framework. Quantum programs can be written and run within Visual Studio and VSCode using the quantum programming language Q#. Programs developed in the QDK can be run on Microsoft's Azure Quantum, and run on quantum computers from Quantinuum, IonQ, and Pasqal.
There are two main groups of quantum programming languages: imperative quantum programming languages and functional quantum programming languages.
Imperative languages
The most prominent representatives of the imperative languages are QCL, LanQ and Q|SI>.
Ket
Ket
is an open-source embedded language designed to facilitate quantum
programming, leveraging the familiar syntax and simplicity of Python. It
serves as an integral component of the Ket Quantum Programming
Platform,
seamlessly integrating with a Rust runtime library and a quantum
simulator. Maintained by Quantuloop, the project emphasizes
accessibility and versatility for researchers and developers. The
following example demonstrates the implementation of a Bell state using Ket:
from ket import *

a, b = quant(2)   # Allocate two quantum bits
H(a)              # Put qubit `a` in a superposition
cnot(a, b)        # Entangle the two qubits in the Bell state
m_a = measure(a)  # Measure qubit `a`, collapsing qubit `b` as well
m_b = measure(b)  # Measure qubit `b`

# Assert that the measurement of both qubits will always be equal
assert m_a.value == m_b.value
Quantum Computation Language (QCL) is one of the first implemented quantum programming languages. The most important feature of QCL is the support for user-defined operators and functions. Its syntax resembles the syntax of the C programming language and its classical data types are similar to primitive data types in C. One can combine classical code and quantum code in the same program.
Quantum pseudocode
Quantum pseudocode proposed by E. Knill is the first formalized language for description of quantum algorithms. It was introduced together with, and is tightly connected to, a model of quantum machine called the Quantum Random Access Machine (QRAM).
Q|SI> is a platform embedded in the .NET language that supports quantum programming in a quantum extension of the while-language. This platform includes a compiler of the quantum while-language
and a chain of tools for the simulation of quantum computation,
optimisation of quantum circuits, termination analysis of quantum
programs, and verification of quantum programs.
Q language
Q Language is the second implemented imperative quantum programming language.
Q Language was implemented as an extension of the C++ programming language.
It provides classes for basic quantum operations like QHadamard,
QFourier, QNot, and QSwap, which are derived from the base class Qop.
New operators can be defined using C++ class mechanism.
Quantum memory is represented by class Qreg.
Qreg x1;       // 1-qubit quantum register with initial value 0
Qreg x2(2,0);  // 2-qubit quantum register with initial value 0
The computation process is executed using a provided simulator. Noisy environments can be simulated using parameters of the simulator. Q Language can be described as a specification language for quantum programs.
QMASM
Quantum Macro Assembler (QMASM) is a low-level language specific to quantum annealers such as the D-Wave.
Scaffold
Scaffold is a C-like language that compiles to QASM and OpenQASM. It is built on top of the LLVM Compiler Infrastructure to perform optimizations on Scaffold code before generating a specified instruction set.
Silq
Silq is a high-level programming language for quantum computing with a strong static type system, developed at ETH Zürich.
LQP
The Logic of
Quantum Programs (LQP) is a dynamic quantum logic, capable of expressing
important features of quantum measurements and unitary evolutions of
multi-partite states, and provides logical characterizations of various
forms of entanglement. The logic has been used to specify and verify the
correctness of various protocols in quantum computation.
Functional languages
Efforts are underway to develop functional programming languages for quantum computing. Functional programming languages are well-suited for reasoning about programs. Examples include Selinger's QPL, and the Haskell-like language QML by Altenkirch and Grattage. Higher-order quantum programming languages, based on lambda calculus, have been proposed by van Tonder, Selinger and Valiron and by Arrighi and Dowek.
QFC and QPL
QFC
and QPL are two closely related quantum programming languages defined
by Peter Selinger. They differ only in their syntax: QFC uses a flow
chart syntax, whereas QPL uses a textual syntax. These languages have
classical control flow but can operate on quantum or classical data.
Selinger gives a denotational semantics for these languages in a
category of superoperators.
QML
QML is a Haskell-like quantum programming language by Altenkirch and Grattage. Unlike Selinger's QPL, this language takes duplication, rather than discarding, of quantum information as a primitive operation. Duplication in this context is understood to be the operation that maps α|0⟩ + β|1⟩ to α|00⟩ + β|11⟩, and is not to be confused with the impossible operation of cloning; the authors claim it is akin to how sharing is modeled in classical languages. QML also introduces both classical and quantum control operators, whereas most other languages rely on classical control.
LIQUi|> (pronounced liquid) is a quantum simulation extension on the F# programming language. It is currently being developed by the Quantum Architectures and Computation Group (QuArC)
part of the StationQ efforts at Microsoft Research. LIQUi|> seeks to
allow theorists to experiment with quantum algorithm design before
physical quantum computers are available for use.
It includes a programming language, optimization and scheduling
algorithms, and quantum simulators. LIQUi|> can be used to translate a
quantum algorithm written in the form of a high-level program into the
low-level machine instructions for a quantum device.
The first attempt to define a quantum lambda calculus was made by Philip Maymin in 1996.
His lambda-q calculus is powerful enough to express any quantum computation. However, this language can efficiently solve NP-complete problems, and therefore appears to be strictly stronger than the standard quantum computational models (such as the quantum Turing machine or the quantum circuit model). Therefore, Maymin's lambda-q calculus is probably not implementable on a physical device.
In 2003, André van Tonder defined an extension of the lambda calculus suitable for proving correctness of quantum programs. He also provided an implementation in the Scheme programming language.
In 2004, Selinger and Valiron defined a strongly typed lambda calculus for quantum computation with a type system based on linear logic.
Quipper was published in 2013. It is implemented as an embedded language, using Haskell as the host language. For this reason, quantum programs written in Quipper are written in Haskell using the provided circuit-construction libraries, which include, for example, functions for preparing superposition states.
A quantum computer is a computer that exploits quantum mechanical phenomena. On small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior using specialized hardware. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer. In particular, a large-scale quantum computer could break widely used encryption schemes and aid physicists in performing physical simulations; however, the current state of the art is largely experimental and impractical, with several obstacles to useful applications.
The basic unit of information in quantum computing, the qubit (or "quantum bit"), serves the same function as the bit in classical computing. However, unlike a classical bit, which can be in one of two states (a binary), a qubit can exist in a superposition of its two "basis" states, which loosely means that it is in both states simultaneously. When measuring a qubit, the result is a probabilistic output of a classical bit. If a quantum computer manipulates the qubit in a particular way, wave interference effects can amplify the desired measurement results. The design of quantum algorithms involves creating procedures that allow a quantum computer to perform calculations efficiently and quickly.
Physically engineering high-quality qubits has proven challenging. If a physical qubit is not sufficiently isolated from its environment, it suffers from quantum decoherence, introducing noise
into calculations. National governments have invested heavily in
experimental research that aims to develop scalable qubits with longer
coherence times and lower error rates. Two of the most promising
technologies are superconductors (which isolate an electrical current by eliminating electrical resistance) and ion traps (which confine a single atomic particle using electromagnetic fields).
In principle, a classical computer can solve the same
computational problems as a quantum computer, given enough time. Quantum
advantage comes in the form of time complexity rather than computability, and quantum complexity theory
shows that some quantum algorithms are exponentially more efficient
than the best known classical algorithms. A large-scale quantum computer
could in theory solve computational problems unsolvable by a classical
computer in any reasonable amount of time. While claims of such quantum supremacy have drawn significant attention to the discipline, near-term practical use cases remain limited.
As physicists applied quantum mechanical models to computational problems and swapped digital bits for qubits, the fields of quantum mechanics and computer science began to converge.
In 1980, Paul Benioff introduced the quantum Turing machine, which uses quantum theory to describe a simplified computer.
When digital computers became faster, physicists faced an exponential increase in overhead when simulating quantum dynamics, prompting Yuri Manin and Richard Feynman to independently suggest that hardware based on quantum phenomena might be more efficient for computer simulation. In a 1984 paper, Charles Bennett and Gilles Brassard applied quantum theory to cryptography protocols and demonstrated that quantum key distribution could enhance information security.
Peter Shor built on these results with his 1994 algorithms for breaking the widely used RSA and Diffie–Hellman encryption protocols, which drew significant attention to the field of quantum computing.
In 1996, Grover's algorithm established a quantum speedup for the widely applicable unstructured search problem. The same year, Seth Lloyd proved that quantum computers could simulate quantum systems without the exponential overhead present in classical simulations, validating Feynman's 1982 conjecture.
Over the years, experimentalists have constructed small-scale quantum computers using trapped ions and superconductors.
In 1998, a two-qubit quantum computer demonstrated the feasibility of the technology, and subsequent experiments have increased the number of qubits and reduced error rates.
In 2019, Google AI and NASA announced that they had achieved quantum supremacy with a 54-qubit machine, performing a computation that is infeasible for any classical computer. However, the validity of this claim is still being actively researched.
The threshold theorem shows how increasing the number of qubits can mitigate errors, yet fully fault-tolerant quantum computing remains "a rather distant dream". According to some researchers, noisy intermediate-scale quantum (NISQ) machines may have specialized uses in the near future, but noise in quantum gates limits their reliability.
Investment in quantum computing research has increased in the public and private sectors.
As one consulting firm summarized,
...investment dollars are pouring in, and quantum-computing start-ups are proliferating.... While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage.
From a business-management perspective, the potential applications of quantum computing fall into four major categories: cybersecurity; data analytics and artificial intelligence; optimization and simulation; and data management and searching.
In December 2023, physicists reported for the first time the entanglement of individual molecules, which may have significant applications in quantum computing. Also in December 2023, scientists at Harvard University successfully created "quantum circuits" that correct errors more efficiently than alternative methods, potentially removing a major obstacle to practical quantum computers. The Harvard research team was supported by MIT, QuEra Computing, Caltech, and Princeton University, and funded by DARPA's Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) program. Research efforts are ongoing to jumpstart quantum computing through topological and photonic approaches as well.
As physicist Charlie Bennett describes the relationship between quantum and classical computers,
A classical computer is a quantum
computer ... so we shouldn't be asking about "where do quantum speedups
come from?" We should say, "well, all computers are quantum. ... Where
do classical slowdowns come from?"
Quantum information
Just as the bit is the basic concept of classical information theory, the qubit is the fundamental unit of quantum information. The same term qubit
is used to refer to an abstract mathematical model and to any physical
system that is represented by that model. A classical bit, by
definition, exists in either of two physical states, which can be
denoted 0 and 1. A qubit is also described by a state, and two states
often written |0⟩ and |1⟩, serve as the quantum counterparts of the classical states 0 and 1. However, the quantum states |0⟩ and |1⟩ belong to a vector space, meaning that they can be multiplied by constants and added together, and the result is again a valid quantum state. Such a combination is known as a superposition of |0⟩ and |1⟩.
A two-dimensional vector mathematically represents a qubit state. Physicists typically use Dirac notation for quantum mechanical linear algebra, writing |ψ⟩ ('ket psi') for a vector labeled ψ. Because a qubit is a two-state system, any qubit state takes the form α|0⟩ + β|1⟩, where |0⟩ and |1⟩ are the standard basis states, and α and β are the probability amplitudes, which are in general complex numbers. If either α or β is zero, the qubit is effectively a classical bit; when both are nonzero, the qubit is in superposition. Such a quantum state vector acts similarly to a (classical) probability vector, with one key difference: unlike probabilities, probability amplitudes are not necessarily positive numbers. Negative amplitudes allow for destructive wave interference.
When a qubit is measured in the standard basis, the result is a classical bit.
The Born rule describes the norm-squared correspondence between amplitudes and probabilities: when measuring a qubit α|0⟩ + β|1⟩, the state collapses to |0⟩ with probability |α|², or to |1⟩ with probability |β|².
Any valid qubit state has amplitudes α and β such that |α|² + |β|² = 1.
As an example, measuring the qubit 1/√2|0⟩ + 1/√2|1⟩ would produce either |0⟩ or |1⟩ with equal probability.
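The Born rule can be sampled classically; the following sketch (the `measure` helper is illustrative, not a library function) draws measurement outcomes for the equal superposition 1/√2|0⟩ + 1/√2|1⟩:

```python
import random

# Born rule: measuring alpha|0> + beta|1> yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measure(alpha, beta, rng=random.random):
    norm = abs(alpha) ** 2 + abs(beta) ** 2
    assert abs(norm - 1) < 1e-9, "state must be normalized"
    return 0 if rng() < abs(alpha) ** 2 else 1

amp = 2 ** -0.5  # the equal-superposition state (|0> + |1>)/sqrt(2)
counts = [measure(amp, amp) for _ in range(10_000)]
print(sum(counts) / len(counts))  # close to 0.5
```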
Each additional qubit doubles the dimension of the state space.
As an example, the vector 1/√2|00⟩ + 1/√2|01⟩ represents a two-qubit state, a tensor product of the qubit |0⟩ with the qubit 1/√2|0⟩ + 1/√2|1⟩.
This vector inhabits a four-dimensional vector space spanned by the basis vectors |00⟩, |01⟩, |10⟩, and |11⟩.
The Bell state 1/√2|00⟩ + 1/√2|11⟩ is impossible to decompose into the tensor product of two individual qubits—the two qubits are entangled because their probability amplitudes are correlated.
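Whether a two-qubit state factors as a tensor product can be checked numerically: arranging the four amplitudes as a 2×2 matrix, the state is a product state exactly when that matrix has rank 1, i.e. when a00·a11 = a01·a10. A small sketch (the helper name `is_product_state` is illustrative):

```python
import math

# Amplitudes ordered (a00, a01, a10, a11) over the basis |00>, |01>, |10>, |11>.
def is_product_state(a00, a01, a10, a11, tol=1e-12):
    # Rank-1 condition for the 2x2 amplitude matrix [[a00, a01], [a10, a11]].
    return abs(a00 * a11 - a01 * a10) < tol

s = 1 / math.sqrt(2)
print(is_product_state(s, s, 0, 0))  # |0> tensor (|0>+|1>)/sqrt(2): True
print(is_product_state(s, 0, 0, s))  # Bell state 1/sqrt(2)(|00>+|11>): False
```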
In general, the vector space for an n-qubit system is 2^n-dimensional, and this makes it challenging for a classical computer to simulate a quantum one: representing a 100-qubit system requires storing 2^100 classical values.
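The exponential growth is easy to quantify: storing one complex amplitude per basis state (say 16 bytes each), an n-qubit state vector needs 2^n amplitudes. A quick sketch (the function name is illustrative):

```python
# Bytes needed to store a dense n-qubit state vector, assuming one
# complex amplitude (16 bytes) per basis state -- 2**n amplitudes total.
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

print(state_vector_bytes(10))   # 16384 bytes: trivial
print(state_vector_bytes(30))   # ~17 GB: near the limit of a workstation
print(state_vector_bytes(100))  # astronomically large
```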
The state of a one-qubit quantum memory can be manipulated by applying quantum logic gates, analogous to how classical memory can be manipulated with classical logic gates. One important gate for both classical and quantum computation is the NOT gate, which can be represented by the matrix

X = [ 0 1
      1 0 ].

Mathematically, the application of such a logic gate to a quantum state vector is modelled with matrix multiplication. Thus X|0⟩ = |1⟩ and X|1⟩ = |0⟩.
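The matrix action of the NOT gate can be checked directly; representing |0⟩ and |1⟩ as the column vectors (1, 0) and (0, 1), applying the gate is one matrix-vector product:

```python
# The NOT (X) gate as a 2x2 matrix acting on amplitude vectors.
X = [[0, 1],
     [1, 0]]

def apply_gate(gate, state):
    # Matrix-vector product: out[i] = sum_j gate[i][j] * state[j]
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

ket0 = [1, 0]  # |0>
ket1 = [0, 1]  # |1>
assert apply_gate(X, ket0) == ket1  # X|0> = |1>
assert apply_gate(X, ket1) == ket0  # X|1> = |0>
```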
The mathematics of single qubit gates can be extended to operate on
multi-qubit quantum memories in two important ways. One way is simply to
select a qubit and apply that gate to the target qubit while leaving
the remainder of the memory unaffected. Another way is to apply the gate
to its target only if another part of the memory is in a desired state.
These two choices can be illustrated using another example. The possible states of a two-qubit quantum memory are |00⟩, |01⟩, |10⟩, and |11⟩.
The controlled NOT (CNOT) gate can then be represented using the following matrix:

CNOT = [ 1 0 0 0
         0 1 0 0
         0 0 0 1
         0 0 1 0 ].

As a mathematical consequence of this definition, CNOT|00⟩ = |00⟩, CNOT|01⟩ = |01⟩, CNOT|10⟩ = |11⟩, and CNOT|11⟩ = |10⟩. In other words, the CNOT applies a NOT gate (X from before) to the second qubit if and only if the first qubit is in the state |1⟩. If the first qubit is |0⟩, nothing is done to either qubit.
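The CNOT rules can likewise be verified by matrix-vector multiplication over the four two-qubit basis vectors:

```python
# CNOT as a 4x4 matrix over the two-qubit basis |00>, |01>, |10>, |11>.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

def apply_gate(gate, state):
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

basis = {
    "00": [1, 0, 0, 0], "01": [0, 1, 0, 0],
    "10": [0, 0, 1, 0], "11": [0, 0, 0, 1],
}
# The target (second) qubit flips only when the control (first) qubit is 1.
assert apply_gate(CNOT, basis["00"]) == basis["00"]
assert apply_gate(CNOT, basis["01"]) == basis["01"]
assert apply_gate(CNOT, basis["10"]) == basis["11"]
assert apply_gate(CNOT, basis["11"]) == basis["10"]
```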
In summary, quantum computation can be described as a network of quantum logic gates and measurements. However, any measurement can be deferred to the end of quantum computation, though this deferment may come at a computational cost, so most quantum circuits depict a network consisting only of quantum logic gates and no measurements.
Quantum parallelism
Quantum parallelism
is the heuristic that quantum computers can be thought of as evaluating
a function for multiple input values simultaneously. This can be
achieved by preparing a quantum system in a superposition of input
states, and applying a unitary transformation that encodes the function
to be evaluated. The resulting state encodes the function's output
values for all input values in the superposition, allowing for the
computation of multiple outputs simultaneously. This property is key to
the speedup of many quantum algorithms. However, "parallelism" in this
sense is insufficient to speed up a computation, because the measurement
at the end of the computation gives only one value. To be useful, a
quantum algorithm must also incorporate some other conceptual
ingredient.
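The standard way to make function evaluation unitary is the oracle U_f|x⟩|y⟩ = |x⟩|y ⊕ f(x)⟩, which permutes basis states and is therefore reversible. The sketch below (the `uf_apply` helper is illustrative) applies such an oracle to an equal superposition of all inputs, so the resulting state carries f(x) for every x at once, even though a single measurement would reveal only one branch:

```python
# Apply U_f |x>|y> = |x>|y XOR f(x)> to a state given as a dict mapping
# basis labels (x, y) to amplitudes. U_f permutes basis states, so it is
# reversible (unitary), unlike evaluating f destructively.
def uf_apply(f, state):
    out = {}
    for (x, y), amp in state.items():
        key = (x, y ^ f(x))
        out[key] = out.get(key, 0) + amp
    return out

f = lambda x: x % 2  # example function with a one-bit output
# Equal superposition over the four two-bit inputs, output register |0>.
state = {(x, 0): 0.5 for x in range(4)}
result = uf_apply(f, state)
# Every f(x) is now encoded in the output register of some branch.
print(result)
```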
There are a number of models of computation for quantum computing, distinguished by the basic elements into which the computation is decomposed.
Gate array
A quantum gate array decomposes computation into a sequence of few-qubit quantum gates.
Any quantum computation (which is, in the above formalism, any unitary matrix of size 2^n × 2^n over n qubits) can be represented as a network of quantum logic gates from a fairly small family of gates. A choice of gate family that enables this construction is known as a universal gate set, since a computer that can run such circuits is a universal quantum computer.
One common such set includes all single-qubit gates as well as the CNOT
gate from above. This means any quantum computation can be performed by
executing a sequence of single-qubit gates together with CNOT gates.
Though this gate set is infinite, it can be replaced with a finite gate
set by appealing to the Solovay-Kitaev theorem.
Neuromorphic quantum computing (abbreviated as 'n.quantum computing') is an unconventional type of computing that uses neuromorphic computing to perform quantum operations. It has been suggested that quantum algorithms, which are algorithms that run on a realistic model of quantum computation, can be computed equally efficiently with neuromorphic quantum computing. Both traditional quantum computing and neuromorphic quantum computing are physics-based unconventional approaches to computation and do not follow the von Neumann architecture. They both construct a system (a circuit) that represents the physical problem at hand, and then leverage the respective physical properties of the system to seek the “minimum”. Neuromorphic quantum computing and quantum computing share similar physical properties during computation.
A quantum Turing machine is the quantum analog of a Turing machine. All of these models of computation—quantum circuits, one-way quantum computation, adiabatic quantum computation, and topological quantum computation—have
been shown to be equivalent to the quantum Turing machine; given a
perfect implementation of one such quantum computer, it can simulate all
the others with no more than polynomial overhead. This equivalence need
not hold for practical quantum computers, since the overhead of
simulation may be too large to be practical.
Quantum cryptography and cybersecurity
Quantum computing has significant potential applications in the
fields of cryptography and cybersecurity. Quantum cryptography, which
relies on the principles of quantum mechanics, offers the possibility of
secure communication channels that are resistant to eavesdropping.
Quantum key distribution (QKD) protocols, such as BB84, enable the
secure exchange of cryptographic keys between parties, ensuring the
confidentiality and integrity of communication. Moreover, quantum random
number generators (QRNGs) can produce high-quality random numbers,
which are essential for secure encryption.
However, quantum computing also poses challenges to traditional
cryptographic systems. Shor's algorithm, a quantum algorithm for integer
factorization, could potentially break widely used public-key
cryptography schemes like RSA, which rely on the difficulty of factoring
large numbers. Post-quantum cryptography, which involves the
development of cryptographic algorithms that are resistant to attacks by
both classical and quantum computers, is an active area of research
aimed at addressing this concern.
Ongoing research in quantum cryptography and post-quantum
cryptography is crucial for ensuring the security of communication and
data in the face of evolving quantum computing capabilities. Advances in
these fields, such as the development of new QKD protocols, the
improvement of QRNGs, and the standardization of post-quantum
cryptographic algorithms, will play a key role in maintaining the
integrity and confidentiality of information in the quantum era.
Quantum cryptography enables new ways to transmit data securely; for example, quantum key distribution uses entangled quantum states to establish secure cryptographic keys. When a sender and receiver exchange quantum states, they can guarantee that an adversary
does not intercept the message, as any unauthorized eavesdropper would
disturb the delicate quantum system and introduce a detectable change. With appropriate cryptographic protocols, the sender and receiver can thus establish shared private information resistant to eavesdropping.
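The classical bookkeeping of a QKD protocol like BB84 can be sketched in a few lines. This toy model (the function name `bb84_sift` is illustrative) omits the quantum channel, eavesdropping detection, and error correction, showing only the basis-sifting step in which positions with mismatched measurement bases are discarded:

```python
import random

# Toy BB84 sifting: sender encodes random bits in random bases, receiver
# measures in random bases; after publicly comparing bases, both keep
# only the positions where the bases happened to match.
def bb84_sift(n, rng=None):
    rng = rng or random.Random(0)
    bits = [rng.randint(0, 1) for _ in range(n)]     # sender's raw key bits
    a_bases = [rng.randint(0, 1) for _ in range(n)]  # sender's bases
    b_bases = [rng.randint(0, 1) for _ in range(n)]  # receiver's bases
    return [b for b, x, y in zip(bits, a_bases, b_bases) if x == y]

key = bb84_sift(16)
print(len(key), key)  # on average about half the positions survive
```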
Modern fiber-optic cables
can transmit quantum information over relatively short distances.
Ongoing experimental research aims to develop more reliable hardware
(such as quantum repeaters), hoping to scale this technology to
long-distance quantum networks
with end-to-end entanglement. Theoretically, this could enable novel
technological applications, such as distributed quantum computing and
enhanced quantum sensing.
Algorithms
Progress in finding quantum algorithms typically focuses on this quantum circuit model, though exceptions like the quantum adiabatic algorithm exist. Quantum algorithms can be roughly categorized by the type of speedup achieved over corresponding classical algorithms.
Quantum algorithms that offer more than a polynomial speedup over the best-known classical algorithm include Shor's algorithm for factoring and the related quantum algorithms for computing discrete logarithms, solving Pell's equation, and more generally solving the hidden subgroup problem for abelian finite groups. These algorithms depend on the primitive of the quantum Fourier transform.
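Only the order-finding step of Shor's algorithm is quantum; the rest is classical number theory. Assuming the quantum subroutine has returned the order r of a modulo N, nontrivial factors are recovered with two gcd computations (the helper name `factors_from_order` is illustrative):

```python
import math

# Given the multiplicative order r of a modulo N (the quantum step in
# Shor's algorithm), try to recover nontrivial factors of N classically.
def factors_from_order(a, r, N):
    if r % 2 != 0:
        return None          # odd order: retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None          # trivial square root: retry with a different a
    return math.gcd(x - 1, N), math.gcd(x + 1, N)

# Example: 7 has order 4 modulo 15, since 7**4 = 2401 = 1 (mod 15).
print(factors_from_order(7, 4, 15))  # (3, 5)
```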
No mathematical proof has been found that shows that an equally fast classical algorithm cannot be discovered, but evidence suggests that this is unlikely. Certain oracle problems like Simon's problem and the Bernstein–Vazirani problem do give provable speedups, though this is in the quantum query model, which is a restricted model where lower bounds are much easier to prove and which doesn't necessarily translate to speedups for practical problems.
Other problems, including the simulation of quantum physical
processes from chemistry and solid-state physics, the approximation of
certain Jones polynomials, and the quantum algorithm for linear systems of equations have quantum algorithms appearing to give super-polynomial speedups and are BQP-complete. Because these problems are BQP-complete, an equally fast classical algorithm for them would imply that no quantum algorithm gives a super-polynomial speedup, which is believed to be unlikely.
Some quantum algorithms, like Grover's algorithm and amplitude amplification, give polynomial speedups over corresponding classical algorithms.
Though these algorithms give a comparatively modest quadratic speedup, they are widely applicable and thus give speedups for a wide range of problems.
Since chemistry and nanotechnology rely on understanding quantum
systems, and such systems are impossible to simulate in an efficient
manner classically, quantum simulation may be an important application of quantum computing.
Quantum simulation could also be used to simulate the behavior of atoms and particles under unusual conditions, such as the reactions inside a collider.
In June 2023, IBM computer scientists reported that a quantum computer
produced better results for a physics problem than a conventional
supercomputer.
About 2% of the annual global energy output is used for nitrogen fixation to produce ammonia for the Haber process
in the agricultural fertilizer industry (even though naturally
occurring organisms also produce ammonia). Quantum simulations might be
used to understand this process and increase the energy efficiency of
production. It is expected that an early use of quantum computing will be modeling that improves the efficiency of the Haber–Bosch process by the mid-2020s, although some have predicted it will take longer.
A notable application of quantum computation is for attacks on cryptographic systems that are currently in use. Integer factorization, which underpins the security of public key cryptographic
systems, is believed to be computationally infeasible with an ordinary
computer for large integers if they are the product of a few prime numbers (e.g., products of two 300-digit primes). By comparison, a quantum computer could solve this problem exponentially faster using Shor's algorithm to find its factors. This ability would allow a quantum computer to break many of the cryptographic systems in use today, in the sense that there would be a polynomial time (in the number of digits of the integer) algorithm for solving the problem. In particular, most of the popular public key ciphers are based on the difficulty of factoring integers or the discrete logarithm problem, both of which can be solved by Shor's algorithm. As a result, the RSA, Diffie–Hellman, and elliptic curve Diffie–Hellman
algorithms could be broken. These are used to protect secure Web pages,
encrypted email, and many other types of data. Breaking these would
have significant ramifications for electronic privacy and security.
Identifying cryptographic systems that may be secure against
quantum algorithms is an actively researched topic under the field of post-quantum cryptography.
Some public-key algorithms are based on problems other than the integer
factorization and discrete logarithm problems to which Shor's algorithm
applies, like the McEliece cryptosystem based on a problem in coding theory. Lattice-based cryptosystems are also not known to be broken by quantum computers, and finding a polynomial time algorithm for solving the dihedral hidden subgroup problem, which would break many lattice-based cryptosystems, is a well-studied open problem. It has been proven that applying Grover's algorithm to break a symmetric (secret key) algorithm by brute force requires time equal to roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case,
meaning that symmetric key lengths are effectively halved: AES-256
would have the same security against an attack using Grover's algorithm
that AES-128 has against classical brute-force search (see Key size).
The most well-known example of a problem that allows for a polynomial quantum speedup is unstructured search, which involves finding a marked item out of a list of n items in a database. This can be solved by Grover's algorithm using O(√n) queries to the database, quadratically fewer than the Ω(n) queries required for classical algorithms. In this case, the advantage
is not only provable but also optimal: it has been shown that Grover's
algorithm gives the maximal possible probability of finding the desired
element for any number of oracle lookups. Many examples of provable
quantum speedups for query problems are based on Grover's algorithm,
including Brassard, Høyer, and Tapp's algorithm for finding collisions in two-to-one functions, and Farhi, Goldstone, and Gutmann's algorithm for evaluating NAND trees.
Problems that can be efficiently addressed with Grover's algorithm have the following properties:
There is no searchable structure in the collection of possible answers,
The number of possible answers to check is the same as the number of inputs to the algorithm, and
There exists a boolean function that evaluates each input and determines whether it is the correct answer.
For problems with all these properties, the running time of Grover's
algorithm on a quantum computer scales as the square root of the number
of inputs (or elements in the database), as opposed to the linear
scaling of classical algorithms. A general class of problems to which
Grover's algorithm can be applied is a Boolean satisfiability problem, where the database through which the algorithm iterates is that of all possible answers. An example and possible application of this is a password cracker that attempts to guess a password. Breaking symmetric ciphers with this algorithm is of interest to government agencies.
Quantum annealing
Quantum annealing
relies on the adiabatic theorem to undertake calculations. A system is
placed in the ground state for a simple Hamiltonian, which slowly
evolves to a more complicated Hamiltonian whose ground state represents
the solution to the problem in question. The adiabatic theorem states
that if the evolution is slow enough the system will stay in its ground
state at all times through the process. Adiabatic optimization may be helpful for solving computational biology problems.
Since quantum computers can produce outputs that classical computers
cannot produce efficiently, and since quantum computation is
fundamentally linear algebraic, some express hope in developing quantum
algorithms that can speed up machine learning tasks.
For example, the HHL algorithm, named after its discoverers Harrow, Hassidim, and Lloyd, is believed to provide speedup over classical counterparts. Some research groups have recently explored the use of quantum annealing hardware for training Boltzmann machines and deep neural networks.
Deep generative chemistry models emerge as powerful tools to expedite drug discovery.
However, the immense size and complexity of the structural space of all
possible drug-like molecules pose significant obstacles, which could be
overcome in the future by quantum computers. Quantum computers are
naturally good for solving complex quantum many-body problems
and thus may be instrumental in applications involving quantum
chemistry. Therefore, one can expect that quantum-enhanced generative
models including quantum GANs may eventually be developed into ultimate generative chemistry algorithms.
Engineering
As of 2023, classical computers outperform quantum computers for all
real-world applications. While current quantum computers may speed up
solutions to particular mathematical problems, they give no
computational advantage for practical tasks. Scientists and engineers
are exploring multiple technologies for quantum computing hardware and
hope to develop scalable quantum architectures, but serious obstacles
remain.
Challenges
There are a number of technical challenges in building a large-scale quantum computer. Physicist David DiVincenzo has listed these requirements for a practical quantum computer:
Physically scalable to increase the number of qubits
Qubits that can be initialized to arbitrary values
Quantum gates that are faster than decoherence time
A universal gate set
Qubits that can be read easily
The control of multi-qubit systems requires the generation and
coordination of a large number of electrical signals with tight and
deterministic timing resolution. This has led to the development of quantum controllers
that enable interfacing with the qubits. Scaling these systems to
support a growing number of qubits is an additional challenge.
Decoherence
One of the greatest challenges involved with constructing quantum computers is controlling or removing quantum decoherence.
This usually means isolating the system from its environment as
interactions with the external world cause the system to decohere.
However, other sources of decoherence also exist. Examples include the
quantum gates, and the lattice vibrations and background thermonuclear
spin of the physical system used to implement the qubits. Decoherence is
irreversible, as it is effectively non-unitary, and is usually
something that should be highly controlled, if not avoided. Decoherence
times for candidate systems, in particular the transverse relaxation time T2 (for NMR and MRI technology also called the dephasing time), typically range between nanoseconds and seconds at low temperature. Currently, some quantum computers require their qubits to be cooled to 20 millikelvin (usually using a dilution refrigerator) in order to prevent significant decoherence. A 2020 study argues that ionizing radiation such as cosmic rays can nevertheless cause certain systems to decohere within milliseconds.
As a result, time-consuming tasks may render some quantum
algorithms inoperable, as attempting to maintain the state of qubits for
a long enough duration will eventually corrupt the superpositions.
These issues are more difficult for optical approaches as the
timescales are orders of magnitude shorter and an often-cited approach
to overcoming them is optical pulse shaping.
Error rates are typically proportional to the ratio of operating time
to decoherence time, hence any operation must be completed much more
quickly than the decoherence time.
As described by the threshold theorem, if the error rate is small enough, it is thought to be possible to use quantum error correction
to suppress errors and decoherence. This allows the total calculation
time to be longer than the decoherence time if the error correction
scheme can correct errors faster than decoherence introduces them. An
often-cited figure for the required error rate in each gate for
fault-tolerant computation is 10^−3, assuming the noise is depolarizing.
Meeting this scalability condition is possible for a wide range
of systems. However, the use of error correction brings with it the cost
of a greatly increased number of required qubits. The number required
to factor integers using Shor's algorithm is still polynomial, and
thought to be between L and L^2, where L is the number of digits in the number to be factored; error correction algorithms would inflate this figure by an additional factor of L. For a 1000-bit number, this implies a need for about 10^4 bits without error correction. With error correction, the figure would rise to about 10^7 bits. Computation time is about L^2, or about 10^7 steps; at 1 MHz, this takes about 10 seconds. However, the encoding and error-correction overheads increase the size of a real fault-tolerant quantum computer by several orders of magnitude. Careful estimates show that at least 3 million physical qubits would be needed to factor a 2,048-bit integer in 5 months on a fully error-corrected trapped-ion quantum computer. In terms of the number of physical qubits, to date this remains the lowest estimate for a practically useful integer factorization problem of 1,024 bits or larger.
Physicist John Preskill coined the term quantum supremacy
to describe the engineering feat of demonstrating that a programmable
quantum device can solve a problem beyond the capabilities of
state-of-the-art classical computers. The problem need not be useful, so some view the quantum supremacy test only as a potential future benchmark.
In October 2019, Google AI Quantum, with the help of NASA, became
the first to claim to have achieved quantum supremacy by performing
calculations on the Sycamore quantum computer more than 3,000,000 times faster than they could be done on Summit, generally considered the world's fastest computer. This claim has been subsequently challenged: IBM has stated that Summit can perform samples much faster than claimed,
and researchers have since developed better algorithms for the sampling problem used to claim quantum supremacy, substantially reducing the gap between Sycamore and classical supercomputers, and even beating it.
In December 2020, a group at USTC implemented a type of boson sampling on 76 photons with a photonic quantum computer, Jiuzhang, to demonstrate quantum supremacy. The authors claim that a classical contemporary supercomputer would
require a computational time of 600 million years to generate the number
of samples their quantum processor can generate in 20 seconds.
Claims of quantum supremacy have generated hype around quantum computing, but they are based on contrived benchmark tasks that do not directly imply useful real-world applications.
In January 2024, a study published in Physical Review Letters
provided direct verification of quantum supremacy experiments by
computing exact amplitudes for experimentally generated bitstrings using
a new-generation Sunway supercomputer, demonstrating a significant leap
in simulation capability built on a multiple-amplitude tensor network
contraction algorithm. This development underscores the evolving
landscape of quantum computing, highlighting both the progress and the
complexities involved in validating quantum supremacy claims.
Skepticism
Despite high hopes for quantum computing, significant progress in hardware, and optimism about future applications, a 2023 Nature spotlight article summarised current quantum computers as being "For now, [good for] absolutely nothing".
The article elaborated that quantum computers are yet to be more useful
or efficient than conventional computers in any case, though it also
argued that in the long term such computers are likely to be useful. A
2023 Communications of the ACM article
found that current quantum computing algorithms are "insufficient for
practical quantum advantage without significant improvements across the
software/hardware stack". It argues that the most promising candidates
for achieving speedup with quantum computers are "small-data problems",
for example in chemistry and materials science. However, the article
also concludes that a large range of the potential applications it
considered, such as machine learning, "will not achieve quantum
advantage with current quantum algorithms in the foreseeable future",
and it identified I/O constraints that make speedup unlikely for "big
data problems, unstructured linear systems, and database search based on
Grover's algorithm".
This state of affairs can be traced to several current and long-term considerations.
Conventional computer hardware and algorithms are not only
optimized for practical tasks, but are still improving rapidly,
particularly GPU accelerators.
Current quantum computing hardware generates only a limited amount of entanglement before getting overwhelmed by noise.
Quantum algorithms
provide speedup over conventional algorithms only for some tasks, and matching these tasks with practical applications has proved challenging.
Some promising tasks and applications require resources far beyond those
available today. In particular, processing large amounts of non-quantum data is a challenge for quantum computers.
Some promising algorithms have been "dequantized", i.e., their non-quantum analogues with similar complexity have been found.
If quantum error correction
is used to scale quantum computers to practical applications, its
overhead may undermine speedup offered by many quantum algorithms.
Complexity analysis of algorithms sometimes makes abstract
assumptions that do not hold in applications. For example, input data
may not already be available encoded in quantum states, and "oracle
functions" used in Grover's algorithm often have internal structure that can be exploited for faster algorithms.
In particular, building computers with large numbers of qubits may be
futile if those qubits are not connected well enough and cannot maintain a sufficiently high degree of entanglement for a long time. When
trying to outperform conventional computers, quantum computing
researchers often look for new tasks that can be solved on quantum
computers, but this leaves the possibility that efficient non-quantum
techniques will be developed in response, as seen for quantum supremacy demonstrations. Therefore, it is desirable to prove lower bounds on the complexity of the best possible non-quantum algorithms (which may be unknown) and show that some quantum algorithms asymptotically improve upon those bounds.
Some researchers have expressed skepticism that scalable quantum
computers could ever be built, typically because of the issue of
maintaining coherence at large scales, but also for other reasons.
Bill Unruh doubted the practicality of quantum computers in a paper published in 1994. Paul Davies argued that a 400-qubit computer would even come into conflict with the cosmological information bound implied by the holographic principle. Skeptics like Gil Kalai doubt that quantum supremacy will ever be achieved. Physicist Mikhail Dyakonov has expressed skepticism of quantum computing as follows:
"So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be... about 10^300... Could we ever learn to control the more than 10^300 continuously variable parameters defining the quantum state of such a system? My answer is simple. No, never."
A practical quantum computer must use a physical system as a programmable quantum register. Researchers are exploring several technologies as candidates for reliable qubit implementations. Superconductors and trapped ions are some of the most developed proposals, but experimentalists are considering other hardware possibilities as well.
Any computational problem solvable by a classical computer is also solvable by a quantum computer.
Intuitively, this is because it is believed that all physical
phenomena, including the operation of classical computers, can be
described using quantum mechanics, which underlies the operation of quantum computers.
Conversely, any problem solvable by a quantum computer is also
solvable by a classical computer. It is possible to simulate both
quantum and classical computers manually with just some paper and a pen,
if given enough time. More formally, any quantum computer can be
simulated by a Turing machine. In other words, quantum computers provide no additional power over classical computers in terms of computability. This means that quantum computers cannot solve undecidable problems like the halting problem, and the existence of quantum computers does not disprove the Church–Turing thesis.
While quantum computers cannot solve any problems that classical
computers cannot already solve, it is suspected that they can solve
certain problems faster than classical computers. For instance, it is
known that quantum computers can efficiently factor integers, while this is not believed to be the case for classical computers.
The class of problems that can be efficiently solved by a quantum computer with bounded error is called BQP,
for "bounded error, quantum, polynomial time". More formally, BQP is
the class of problems that can be solved by a polynomial-time quantum Turing machine with an error probability of at most 1/3. As a class of probabilistic problems, BQP is the quantum counterpart to BPP ("bounded error, probabilistic, polynomial time"), the class of problems that can be solved by polynomial-time probabilistic Turing machines with bounded error. It is known that BPP ⊆ BQP and it is widely suspected that BQP ⊋ BPP, which intuitively would mean that quantum computers are more powerful than classical computers in terms of time complexity.
The exact relationship of BQP to P, NP, and PSPACE is not known. However, it is known that P ⊆ BQP ⊆ PSPACE;
that is, all problems that can be efficiently solved by a deterministic
classical computer can also be efficiently solved by a quantum
computer, and all problems that can be efficiently solved by a quantum
computer can also be solved by a deterministic classical computer with
polynomial space resources. It is further suspected that BQP is a strict
superset of P, meaning there are problems that are efficiently solvable
by quantum computers that are not efficiently solvable by deterministic
classical computers. For instance, integer factorization and the discrete logarithm problem
are known to be in BQP and are suspected to be outside of P. On the
relationship of BQP to NP, little is known beyond the fact that some NP
problems that are believed not to be in P are also in BQP (integer
factorization and the discrete logarithm problem are both in NP, for
example). It is suspected that NP ⊈ BQP;
that is, it is believed that there are efficiently checkable problems
that are not efficiently solvable by a quantum computer. As a direct
consequence of this belief, it is also suspected that BQP is disjoint
from the class of NP-complete problems (if an NP-complete problem were in BQP, then it would follow from NP-hardness that all problems in NP are in BQP).