Wednesday, February 18, 2026

Introduction to quantum mechanics

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Introduction_to_quantum_mechanics

Quantum mechanics is the study of matter and matter's interactions with energy on the scale of atomic and subatomic particles. By contrast, classical physics explains matter and energy only on a scale familiar to human experience, including the behavior of astronomical bodies such as the Moon. Classical physics is still used in much of modern science and technology. However, towards the end of the 19th century, scientists discovered phenomena in both the large (macro) and the small (micro) worlds that classical physics could not explain. The desire to resolve inconsistencies between observed phenomena and classical theory led to a revolution in physics, a shift in the original scientific paradigm: the development of quantum mechanics.

Many aspects of quantum mechanics yield unexpected, counterintuitive results. These aspects can seem paradoxical because they describe behavior quite different from that seen at larger scales. In the words of quantum physicist Richard Feynman, quantum mechanics deals with "nature as she is—absurd". Features of quantum mechanics often defy simple explanations in everyday language. One example is the uncertainty principle: precise measurements of position cannot be combined with precise measurements of velocity. Another example is entanglement: a measurement made on one particle (such as an electron that is measured to have spin 'up') will correlate with a measurement on a second particle (an electron that will be found to have spin 'down') if the two particles have a shared history. This holds even if it is impossible for the result of the first measurement to have been transmitted to the second particle before the second measurement takes place.

Quantum mechanics helps people understand chemistry, because it explains how atoms interact with each other and form molecules. Many remarkable phenomena can be explained using quantum mechanics, like superfluidity. For example, if liquid helium cooled to a temperature near absolute zero is placed in a container, it spontaneously flows up and over the rim of its container; this is an effect which cannot be explained by classical physics.

History

James Clerk Maxwell's unification of the equations governing electricity, magnetism, and light in the late 19th century led to experiments on the interaction of light and matter. Some of these experiments had aspects which could not be explained until quantum mechanics emerged in the early part of the 20th century.

Evidence of quanta from the photoelectric effect

The seeds of the quantum revolution appear in the discovery by J.J. Thomson in 1897 that cathode rays were not continuous but "corpuscles" (electrons). Electrons had been named just six years earlier as part of the emerging theory of atoms. In 1900, Max Planck, unconvinced by the atomic theory, discovered that he needed discrete entities like atoms or electrons to explain black-body radiation.

Black-body radiation intensity vs color and temperature. The rainbow bar represents visible light; 5000 K objects are "white hot" by mixing differing colors of visible light. To the right is the invisible infrared. Classical theory (black curve for 5000 K) fails to predict the colors; the other curves are correctly predicted by quantum theories.

Very hot – red hot or white hot – objects look similar when heated to the same temperature. This look results from a common curve of light intensity at different frequencies (colors), which is called black-body radiation. White hot objects have intensity across many colors in the visible range. The frequencies just below the visible range are infrared light, which also transmits heat. Continuous wave theories of light and matter cannot explain the black-body radiation curve. Planck spread the heat energy among individual "oscillators" of an undefined character but with discrete energy capacity; this model explained black-body radiation.
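The failure of the continuous theory can be made concrete numerically. The sketch below compares Planck's quantized formula for spectral radiance with the classical Rayleigh–Jeans prediction at 5000 K (the "white hot" temperature in the figure). The constants are rounded standard values; this is an illustration, not from the article itself.

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(nu, T):
    """Planck spectral radiance B(nu, T), W * sr^-1 * m^-2 * Hz^-1."""
    return (2 * H * nu**3 / C**2) / (math.exp(H * nu / (KB * T)) - 1)

def rayleigh_jeans(nu, T):
    """Classical prediction; grows without bound as frequency increases."""
    return 2 * nu**2 * KB * T / C**2

T = 5000  # kelvin
for nu in (1e13, 1e14, 1e15):  # infrared through ultraviolet
    print(f"{nu:.0e} Hz: Planck={planck(nu, T):.3e}  classical={rayleigh_jeans(nu, T):.3e}")
```

At low (infrared) frequencies the two formulas nearly agree, but toward the ultraviolet the classical curve diverges wildly from observation while Planck's discrete oscillators reproduce the measured curve.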

At the time, electrons, atoms, and discrete oscillators were all exotic ideas to explain exotic phenomena. But in 1905 Albert Einstein proposed that light was also corpuscular, consisting of "energy quanta", in contradiction to the established science of light as a continuous wave, stretching back a hundred years to Thomas Young's work on diffraction.

Einstein's revolutionary proposal started by reanalyzing Planck's black-body theory, arriving at the same conclusions by using the new "energy quanta". Einstein then showed how energy quanta connected to Thomson's electron. In 1902, Philipp Lenard directed light from an arc lamp onto freshly cleaned metal plates housed in an evacuated glass tube. He measured the electric current coming off the metal plate, at higher and lower intensities of light and for different metals. Lenard showed that the amount of current – the number of electrons – depended on the intensity of the light, but that the velocity of these electrons did not depend on intensity. This is the photoelectric effect. The continuous wave theories of the time predicted that more light intensity would accelerate the same amount of current to higher velocity, contrary to this experiment. Einstein's energy quanta explained the increase in current instead: one electron is ejected for each quantum, so more quanta mean more electrons.

Einstein then predicted that the electron velocity would increase in direct proportion to the light frequency above a fixed value that depended upon the metal. Here the idea is that energy in energy-quanta depends upon the light frequency; the energy transferred to the electron comes in proportion to the light frequency. The type of metal gives a barrier, the fixed value, that the electrons must climb over to exit their atoms, to be emitted from the metal surface and be measured.
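Einstein's prediction is now written as E = hf − φ, where φ (the work function) is the fixed, metal-dependent barrier described above. A minimal sketch, with a work function of 2.3 eV assumed as roughly representative of a metal like sodium:

```python
H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19  # joules per electronvolt

def photoelectron_energy_ev(frequency_hz, work_function_ev):
    """Maximum kinetic energy of an ejected electron: E = h*f - phi.
    Returns None when the photon is below threshold (no emission)."""
    e = H * frequency_hz / EV - work_function_ev
    return e if e > 0 else None

PHI = 2.3  # assumed work function in eV, roughly that of sodium
print(photoelectron_energy_ev(4.0e14, PHI))  # red light: below threshold, no electrons
print(photoelectron_energy_ev(1.0e15, PHI))  # ultraviolet: electrons emitted
```

Below the threshold frequency no electrons are emitted regardless of intensity; above it, electron energy rises in direct proportion to frequency, exactly the behavior Millikan later verified.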

Ten years elapsed before Millikan's definitive experiment verified Einstein's prediction. During that time many scientists rejected the revolutionary idea of quanta. But Planck's and Einstein's concept was in the air and soon began to affect other physics and quantum theories.

Quantization of bound electrons in atoms

Experiments with light and matter in the late 1800s uncovered a reproducible but puzzling regularity. When light was shone through purified gases, certain frequencies (colors) did not pass. These dark absorption 'lines' followed a distinctive pattern: the gaps between the lines decreased steadily. By 1889, the Rydberg formula predicted the lines for hydrogen gas using only a constant number and the integers to index the lines. The origin of this regularity was unknown. Solving this mystery would eventually become the first major step toward quantum mechanics.

Throughout the 19th century evidence grew for the atomic nature of matter. With Thomson's discovery of the electron in 1897, scientists began the search for a model of the interior of the atom. Thomson proposed negative electrons swimming in a pool of positive charge. Between 1908 and 1911, Rutherford showed that the positive part was only 1/3000th of the diameter of the atom.

Models of "planetary" electrons orbiting a nuclear "Sun" were proposed, but they could not explain why the electron does not simply fall into the positive charge. In 1913 Niels Bohr and Ernest Rutherford connected the new atom models to the mystery of the Rydberg formula: the orbital radii of the electrons were constrained, and the resulting energy differences matched the energy differences in the absorption lines. This meant that absorption and emission of light from atoms was energy quantized: only specific energies that matched the difference in orbital energy would be emitted or absorbed.

Trading one mystery – the regular pattern of the Rydberg formula – for another mystery – constraints on electron orbits – might not seem like a big advance, but the new atom model summarized many other experimental findings. The quantization of the photoelectric effect and now the quantization of the electron orbits set the stage for the final revolution.

Throughout both the early and the modern era of quantum mechanics, the concept that classical mechanics must be valid macroscopically constrained possible quantum models. This concept was formalized by Bohr in 1923 as the correspondence principle. It requires quantum theory to converge to classical limits. A related concept is Ehrenfest's theorem, which shows that the average values obtained from quantum mechanics (e.g. position and momentum) obey classical laws.

Quantization of spin

Stern–Gerlach experiment: Silver atoms travelling through an inhomogeneous magnetic field, and being deflected up or down depending on their spin; (1) furnace, (2) beam of silver atoms, (3) inhomogeneous magnetic field, (4) classically expected result, (5) observed result

In 1922 Otto Stern and Walther Gerlach demonstrated that the magnetic properties of silver atoms defy classical explanation, work that contributed to Stern's 1943 Nobel Prize in Physics. They fired a beam of silver atoms through a magnetic field. According to classical physics, the atoms should have emerged in a spray, with a continuous range of directions. Instead, the beam separated into two, and only two, diverging streams of atoms. Unlike the other quantum effects known at the time, this striking result involves the state of a single atom. In 1927, Thomas Erwin Phipps and John Bellamy Taylor obtained a similar, but less pronounced effect using hydrogen atoms in their ground state, thereby eliminating any doubts that may have been caused by the use of silver atoms.

In 1924, Wolfgang Pauli called it "two-valuedness not describable classically" and associated it with electrons in the outermost shell. The experiments led to the formulation of a theory describing this property as arising from the spin of the electron, proposed in 1925 by Samuel Goudsmit and George Uhlenbeck under the advice of Paul Ehrenfest.

Quantization of matter

In 1924 Louis de Broglie proposed that electrons in an atom are constrained not in "orbits" but as standing waves. In detail his solution did not work, but his hypothesis – that the electron "corpuscle" moves in the atom as a wave – spurred Erwin Schrödinger to develop a wave equation for electrons; when applied to hydrogen the Rydberg formula was accurately reproduced.
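De Broglie's hypothesis assigns every moving particle a wavelength λ = h/(mv). A quick numeric sketch shows why the wave nature matters for electrons but is invisible for everyday objects (the speeds chosen below are illustrative values, not from the article):

```python
H = 6.626e-34    # Planck constant, J*s
M_E = 9.109e-31  # electron mass, kg

def de_broglie_wavelength(mass_kg, speed_m_s):
    """de Broglie wavelength: lambda = h / (m * v)."""
    return H / (mass_kg * speed_m_s)

# An electron at about 1% of the speed of light: a wavelength comparable
# to atomic spacing, which is why electrons diffract off crystals.
print(de_broglie_wavelength(M_E, 3.0e6))   # a few tenths of a nanometre

# A 0.15 kg ball at 40 m/s: a wavelength absurdly smaller than any
# measurable length, so no wave effects appear at everyday scales.
print(de_broglie_wavelength(0.15, 40.0))
```

The electron's wavelength lands near the spacing of atoms in a crystal, which is exactly the regime probed by the diffraction experiments described next.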

Example original electron diffraction photograph from the laboratory of G. P. Thomson, recorded 1925–1927

Max Born's 1924 paper "Zur Quantenmechanik" was the first use of the words "quantum mechanics" in print. His later work included developing quantum collision models; in a footnote to a 1926 paper he proposed the Born rule connecting theoretical models to experiment.

In 1927 at Bell Labs, Clinton Davisson and Lester Germer fired slow-moving electrons at a crystalline nickel target, which produced a diffraction pattern indicating the wave nature of the electron; the theory was fully explained by Hans Bethe. In a similar experiment, George Paget Thomson and Alexander Reid fired electrons at thin celluloid foils and later metal films and, observing rings, independently discovered the matter-wave nature of electrons.

Further developments

In 1928 Paul Dirac published his relativistic wave equation simultaneously incorporating relativity, predicting anti-matter, and providing a complete theory for the Stern–Gerlach result. These successes launched a new fundamental understanding of our world at small scale: quantum mechanics.

Planck and Einstein started the revolution with quanta that broke down the continuous models of matter and light. Twenty years later "corpuscles" like electrons came to be modeled as continuous waves. This result came to be called wave-particle duality, one iconic idea along with the uncertainty principle that sets quantum mechanics apart from older models of physics.

Quantum radiation, quantum fields

In 1923 Compton demonstrated that the Planck-Einstein energy quanta from light also had momentum; three years later the "energy quanta" got a new name "photon". Despite its role in almost all stages of the quantum revolution, no explicit model for light quanta existed until 1927 when Paul Dirac began work on a quantum theory of radiation that became quantum electrodynamics. Over the following decades this work evolved into quantum field theory, the basis for modern quantum optics and particle physics.

Wave–particle duality

The concept of wave–particle duality says that neither the classical concept of "particle" nor of "wave" can fully describe the behavior of quantum-scale objects, either photons or matter. Wave–particle duality is an example of the principle of complementarity in quantum physics. An elegant example of wave-particle duality is the double-slit experiment.

The diffraction pattern produced when light is shone through one slit (top) and the interference pattern produced by two slits (bottom). Both patterns show oscillations due to the wave nature of light. The double slit pattern is more dramatic.

In the double-slit experiment, as originally performed by Thomas Young in 1803, and then Augustin Fresnel a decade later, a beam of light is directed through two narrow, closely spaced slits, producing an interference pattern of light and dark bands on a screen. The same behavior can be demonstrated in water waves: the double-slit experiment was seen as a demonstration of the wave nature of light.

Variations of the double-slit experiment have been performed using electrons, atoms, and even large molecules, and the same type of interference pattern is seen. Thus it has been demonstrated that all matter possesses wave characteristics.

If the source intensity is turned down, the same interference pattern will slowly build up, one "count" or particle (e.g. photon or electron) at a time. The quantum system acts as a wave when passing through the double slits, but as a particle when it is detected. This is a typical feature of quantum complementarity: a quantum system acts as a wave in an experiment to measure its wave-like properties, and like a particle in an experiment to measure its particle-like properties. The point on the detector screen where any individual particle shows up is the result of a random process. However, the distribution pattern of many individual particles mimics the diffraction pattern produced by waves.

Uncertainty principle

Werner Heisenberg at the age of 26. Heisenberg won the Nobel Prize in Physics in 1932 for the work he did in the late 1920s.

Suppose it is desired to measure the position and speed of an object—for example, a car going through a radar speed trap. It can be assumed that the car has a definite position and speed at a particular moment in time. How accurately these values can be measured depends on the quality of the measuring equipment. If the precision of the measuring equipment is improved, it provides a result closer to the true value. It might be assumed that the speed of the car and its position could be operationally defined and measured simultaneously, as precisely as might be desired.

In 1927, Heisenberg proved that this last assumption is not correct. Quantum mechanics shows that certain pairs of physical properties, for example, position and speed, cannot be simultaneously measured, nor defined in operational terms, to arbitrary precision: the more precisely one property is measured, or defined in operational terms, the less precisely can the other be thus treated. This statement is known as the uncertainty principle. The uncertainty principle is not only a statement about the accuracy of our measuring equipment but, more deeply, is about the conceptual nature of the measured quantities—the assumption that the car had simultaneously defined position and speed does not work in quantum mechanics. On a scale of cars and people, these uncertainties are negligible, but when dealing with atoms and electrons they become critical.

Heisenberg gave, as an illustration, the measurement of the position and momentum of an electron using a photon of light. In measuring the electron's position, the higher the frequency of the photon, the more accurate is the measurement of the position of the impact of the photon with the electron, but the greater is the disturbance of the electron. This is because from the impact with the photon, the electron absorbs a random amount of energy, rendering the measurement obtained of its momentum increasingly uncertain, for one is necessarily measuring its post-impact disturbed momentum from the collision products and not its original momentum (momentum which should be simultaneously measured with position). With a photon of lower frequency, the disturbance (and hence uncertainty) in the momentum is less, but so is the accuracy of the measurement of the position of the impact.

At the heart of the uncertainty principle is the fact that for any mathematical analysis in the position and velocity domains, achieving a sharper (more precise) curve in the position domain can only be done at the expense of a more gradual (less precise) curve in the velocity domain, and vice versa. More sharpness in the position domain requires contributions from more frequencies in the velocity domain to create the narrower curve, and vice versa. It is a fundamental tradeoff inherent in any such related or complementary measurements, but it is only really noticeable at the smallest (Planck) scale, near the size of elementary particles.

The uncertainty principle shows mathematically that the product of the uncertainty in the position and momentum of a particle (momentum is velocity multiplied by mass) could never be less than a certain value, and that this value is related to the Planck constant.
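In its standard modern form the bound reads Δx · Δp ≥ ħ/2. The sketch below turns that inequality into a minimum velocity spread for a given position uncertainty, showing why the effect is enormous for an electron confined to an atom but negligible for everyday objects (the example masses and sizes are illustrative assumptions):

```python
HBAR = 1.055e-34  # reduced Planck constant, J*s

def min_velocity_uncertainty(mass_kg, position_uncertainty_m):
    """Lower bound on velocity spread from Heisenberg's relation
    dx * dp >= hbar / 2, with momentum p = m * v."""
    return HBAR / (2 * mass_kg * position_uncertainty_m)

# Electron confined to an atom-sized region (~1e-10 m): the minimum
# velocity spread is hundreds of kilometres per second.
print(min_velocity_uncertainty(9.109e-31, 1e-10))

# A 1 kg object located to within a micrometre: the bound is so tiny
# it could never be observed.
print(min_velocity_uncertainty(1.0, 1e-6))
```

This is the quantitative version of the claim above: on the scale of cars and people the uncertainties are negligible, while for electrons in atoms they dominate.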

Wave function collapse

Wave function collapse means that a measurement has forced or converted a quantum (probabilistic or potential) state into a definite measured value. This phenomenon is only seen in quantum mechanics rather than classical mechanics.

For example, before a photon actually "shows up" on a detection screen it can be described only with a set of probabilities for where it might show up. When it does appear, for instance in the CCD of an electronic camera, the time and space where it interacted with the device are known within very tight limits. However, the photon has disappeared in the process of being captured (measured), and its quantum wave function has disappeared with it. In its place, some macroscopic physical change in the detection screen has appeared, e.g., an exposed spot in a sheet of photographic film, or a change in electric potential in some cell of a CCD.

Eigenstates and eigenvalues

Because of the uncertainty principle, statements about both the position and momentum of particles can assign only a probability that the position or momentum has some numerical value. Therefore, it is necessary to formulate clearly the difference between the state of something indeterminate, such as an electron in a probability cloud, and the state of something having a definite value. When an object can definitely be "pinned-down" in some respect, it is said to possess an eigenstate.

In the Stern–Gerlach experiment discussed above, the quantum model predicts two possible values of spin for the atom compared to the magnetic axis. These two eigenstates are named arbitrarily 'up' and 'down'. The quantum model predicts these states will be measured with equal probability, but no intermediate values will be seen. This is what the Stern–Gerlach experiment shows.

The eigenstates of spin about the vertical axis are not simultaneously eigenstates of spin about the horizontal axis, so this atom has an equal probability of being found to have either value of spin about the horizontal axis. As described in the section above, measuring the spin about the horizontal axis can allow an atom that was spin up to become spin down: measuring its spin about the horizontal axis collapses its wave function into one of the eigenstates of this measurement, which means it is no longer in an eigenstate of spin about the vertical axis, so it can take either value.
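The measurement statistics just described can be sketched with a simple simulation. This is not a full quantum model (it only captures the two cases in the text: repeating a measurement along the same axis, and measuring along a perpendicular axis, which gives 50/50 and collapses the state):

```python
import random

def measure(state, axis, rng):
    """Projective spin measurement for a spin-1/2 atom, Stern-Gerlach style.
    state is (axis, value). Measuring along the same axis repeats the
    previous value; measuring along a perpendicular axis yields up/down
    at 50/50, and the state collapses to the new eigenstate."""
    prev_axis, prev_value = state
    if axis == prev_axis:
        return (axis, prev_value), prev_value
    value = rng.choice(['up', 'down'])
    return (axis, value), value

rng = random.Random(42)
flipped = 0
for _ in range(10000):
    state = ('z', 'up')                  # prepared spin-up on the vertical axis
    state, _ = measure(state, 'x', rng)  # horizontal measurement erases that
    state, v = measure(state, 'z', rng)  # vertical again: 50/50, may be down
    if v == 'down':
        flipped += 1
print(flipped / 10000)  # close to 0.5
```

Roughly half the atoms that started spin up end spin down after the intervening horizontal measurement, which is the collapse behavior described in the paragraph above.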

The Pauli exclusion principle

Wolfgang Pauli

In 1924, Wolfgang Pauli proposed a new quantum degree of freedom (or quantum number), with two possible values, to resolve inconsistencies between observed molecular spectra and the predictions of quantum mechanics. In particular, the spectrum of atomic hydrogen had a doublet, or pair of lines differing by a small amount, where only one line was expected. Pauli formulated his exclusion principle, stating, "There cannot exist an atom in such a quantum state that two electrons within [it] have the same set of quantum numbers."

A year later, Uhlenbeck and Goudsmit identified Pauli's new degree of freedom with the property called spin whose effects were observed in the Stern–Gerlach experiment.

Dirac wave equation

Paul Dirac (1902–1984)

In 1928, Paul Dirac extended the Pauli equation, which described spinning electrons, to account for special relativity. The result was a theory that dealt properly with events, such as the speed at which an electron orbits the nucleus, occurring at a substantial fraction of the speed of light. By using the simplest electromagnetic interaction, Dirac was able to predict the value of the magnetic moment associated with the electron's spin and found the experimentally observed value, which was too large to be that of a spinning charged sphere governed by classical physics. He was able to solve for the spectral lines of the hydrogen atom and to reproduce from physical first principles Sommerfeld's successful formula for the fine structure of the hydrogen spectrum.

Dirac's equations sometimes yielded a negative value for energy, for which he proposed a novel solution: he posited the existence of an antielectron and a dynamical vacuum. This led to the many-particle quantum field theory.

Quantum entanglement

In quantum physics, a group of particles can interact or be created together in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. This is known as quantum entanglement.

An early landmark in the study of entanglement was the Einstein–Podolsky–Rosen (EPR) paradox, a thought experiment proposed by Albert Einstein, Boris Podolsky and Nathan Rosen which argues that the description of physical reality provided by quantum mechanics is incomplete. In a 1935 paper titled "Can Quantum-Mechanical Description of Physical Reality be Considered Complete?", they argued for the existence of "elements of reality" that were not part of quantum theory, and speculated that it should be possible to construct a theory containing these hidden variables.

The thought experiment involves a pair of particles prepared in what would later become known as an entangled state. Einstein, Podolsky, and Rosen pointed out that, in this state, if the position of the first particle were measured, the result of measuring the position of the second particle could be predicted. If instead the momentum of the first particle were measured, then the result of measuring the momentum of the second particle could be predicted. They argued that no action taken on the first particle could instantaneously affect the other, since this would involve information being transmitted faster than light, which is forbidden by the theory of relativity. They invoked a principle, later known as the "EPR criterion of reality", positing that: "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of reality corresponding to that quantity." From this, they inferred that the second particle must have a definite value of both position and of momentum prior to either quantity being measured. But quantum mechanics considers these two observables incompatible and thus does not associate simultaneous values for both to any system. Einstein, Podolsky, and Rosen therefore concluded that quantum theory does not provide a complete description of reality. In the same year, Erwin Schrödinger used the word "entanglement" and declared: "I would not call that one but rather the characteristic trait of quantum mechanics."

The Irish physicist John Stewart Bell carried the analysis of quantum entanglement much further. He deduced that if measurements are performed independently on the two separated particles of an entangled pair, then the assumption that the outcomes depend upon hidden variables within each half implies a mathematical constraint on how the outcomes on the two measurements are correlated. This constraint would later be named the Bell inequality. Bell then showed that quantum physics predicts correlations that violate this inequality. Consequently, the only way that hidden variables could explain the predictions of quantum physics is if they are "nonlocal", which is to say that somehow the two particles are able to interact instantaneously no matter how widely they ever become separated. Performing experiments like those that Bell suggested, physicists have found that nature obeys quantum mechanics and violates Bell inequalities. In other words, the results of these experiments are incompatible with any local hidden variable theory.

Quantum field theory

The idea of quantum field theory began in the late 1920s with British physicist Paul Dirac, when he attempted to quantize the energy of the electromagnetic field, just as, in quantum mechanics, the energy of an electron in the hydrogen atom was quantized. Quantization is a procedure for constructing a quantum theory starting from a classical theory.

Merriam-Webster defines a field in physics as "a region or space in which a given effect (such as magnetism) exists". Other effects that manifest themselves as fields are gravitation and static electricity. In 2008, physicist Richard Hammond wrote:

Sometimes we distinguish between quantum mechanics (QM) and quantum field theory (QFT). QM refers to a system in which the number of particles is fixed, and the fields (such as the electromagnetic field) are continuous classical entities. QFT ... goes a step further and allows for the creation and annihilation of particles ...

He added, however, that quantum mechanics is often used to refer to "the entire notion of quantum view".

In 1931, Dirac proposed the existence of particles that later became known as antimatter. Dirac shared the Nobel Prize in Physics for 1933 with Schrödinger "for the discovery of new productive forms of atomic theory".

Quantum electrodynamics

Quantum electrodynamics (QED) is the name of the quantum theory of the electromagnetic force. Understanding QED begins with understanding electromagnetism. Electromagnetism can be called "electrodynamics" because it is a dynamic interaction between electrical and magnetic forces. Electromagnetism begins with the electric charge.

Electric charges are the sources of, and create, electric fields. An electric field is a field that exerts a force on any particles that carry electric charges, at any point in space. This includes the electron, proton, and even quarks, among others. As a force is exerted, electric charges move, a current flows, and a magnetic field is produced. The changing magnetic field, in turn, causes electric current (often moving electrons). The physical description of interacting charged particles, electrical currents, electrical fields, and magnetic fields is called electromagnetism.

In 1928 Paul Dirac produced a relativistic quantum theory of electromagnetism. This was the progenitor to modern quantum electrodynamics, in that it had essential ingredients of the modern theory. However, the problem of unsolvable infinities developed in this relativistic quantum theory. Years later, renormalization largely solved this problem. Initially viewed as a provisional, suspect procedure by some of its originators, renormalization eventually was embraced as an important and self-consistent tool in QED and other fields of physics. Also, in the late 1940s Feynman diagrams provided a way to make predictions with QED by finding a probability amplitude for each possible way that an interaction could occur. The diagrams showed in particular that the electromagnetic force is the exchange of photons between interacting particles.

The Lamb shift is an example of a quantum electrodynamics prediction that has been experimentally verified. It is an effect whereby the quantum nature of the electromagnetic field makes the energy levels in an atom or ion deviate slightly from what they would otherwise be. As a result, spectral lines may shift or split.

Similarly, within a freely propagating electromagnetic wave, the current can also be just an abstract displacement current, instead of involving charge carriers. In QED, its full description makes essential use of short-lived virtual particles. There, QED again validates an earlier, rather mysterious concept.

Standard Model

The Standard Model of particle physics is the quantum field theory that describes three of the four known fundamental forces (electromagnetic, weak and strong interactions – excluding gravity) in the universe and classifies all known elementary particles. It was developed in stages throughout the latter half of the 20th century, through the work of many scientists worldwide, with the current formulation being finalized in the mid-1970s upon experimental confirmation of the existence of quarks. Since then, proof of the top quark (1995), the tau neutrino (2000), and the Higgs boson (2012) have added further credence to the Standard Model. In addition, the Standard Model has predicted various properties of weak neutral currents and the W and Z bosons with great accuracy.

Although the Standard Model is believed to be theoretically self-consistent and has demonstrated success in providing experimental predictions, it leaves some physical phenomena unexplained and so falls short of being a complete theory of fundamental interactions. For example, it does not fully explain baryon asymmetry, incorporate the full theory of gravitation as described by general relativity, or account for the universe's accelerating expansion as possibly described by dark energy. The model does not contain any viable dark matter particle that possesses all of the required properties deduced from observational cosmology. It also does not incorporate neutrino oscillations and their non-zero masses. Accordingly, it is used as a basis for building more exotic models that incorporate hypothetical particles, extra dimensions, and elaborate symmetries (such as supersymmetry) to explain experimental results at variance with the Standard Model, such as the existence of dark matter and neutrino oscillations.

Interpretations

The physical measurements, equations, and predictions pertinent to quantum mechanics are all consistent and hold a very high level of confirmation. However, the question of what these abstract models say about the underlying nature of the real world has received competing answers. These interpretations are widely varying and sometimes somewhat abstract. For instance, the Copenhagen interpretation states that before a measurement, statements about a particle's properties are completely meaningless, while the many-worlds interpretation describes the existence of a multiverse made up of every possible universe.

Light behaves in some aspects like particles and in other aspects like waves. Matter—the "stuff" of the universe consisting of particles such as electrons and atoms—exhibits wavelike behavior too. Some light sources, such as neon lights, give off only certain specific frequencies of light, a small set of distinct pure colors determined by neon's atomic structure. Quantum mechanics shows that light, along with all other forms of electromagnetic radiation, comes in discrete units, called photons, and predicts its spectral energies (corresponding to pure colors), and the intensities of its light beams. A single photon is a quantum, or smallest observable particle, of the electromagnetic field. A partial photon is never experimentally observed.

More broadly, quantum mechanics shows that many properties of objects, such as position, speed, and angular momentum, that appeared continuous in the zoomed-out view of classical mechanics, turn out to be (in the very tiny, zoomed-in scale of quantum mechanics) quantized. Such properties of elementary particles are required to take on one of a set of small, discrete allowable values, and since the gap between these values is also small, the discontinuities are only apparent at very tiny (atomic) scales.

Applications

Everyday applications

The relationship between the frequency of electromagnetic radiation and the energy of each photon is why ultraviolet light can cause sunburn, but visible or infrared light cannot. A photon of ultraviolet light delivers a high amount of energy—enough to contribute to cellular damage such as occurs in a sunburn. A photon of infrared light delivers less energy—only enough to warm one's skin. So, an infrared lamp can warm a large surface, perhaps large enough to keep people comfortable in a cold room, but it cannot give anyone a sunburn.
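The relationship can be made concrete with a short calculation. A photon's energy is E = hf = hc/λ; the wavelengths below are typical illustrative values, not figures from any cited source.

```python
# Photon energy E = h*f = h*c/lambda. Wavelengths below are typical
# illustrative values for UV and far-infrared light.
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Energy carried by one photon of the given wavelength, in eV."""
    return h * c / (wavelength_m * eV)

uv = photon_energy_ev(300e-9)   # ultraviolet, ~300 nm
ir = photon_energy_ev(10e-6)    # far infrared, ~10 um

print(f"UV photon: {uv:.2f} eV")   # a few eV: enough to damage molecules
print(f"IR photon: {ir:.3f} eV")   # ~0.1 eV: only enough to warm the skin
```

The UV photon comes out roughly thirty times more energetic than the far-infrared one, which is the whole difference between a sunburn and a warm room.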

Technological applications

Applications of quantum mechanics include the laser, the transistor, the electron microscope, and magnetic resonance imaging. A special class of quantum mechanical applications is related to macroscopic quantum phenomena such as superfluid helium and superconductors. The study of semiconductors led to the invention of the diode and the transistor, which are indispensable for modern electronics.

In even a simple light switch, quantum tunneling is absolutely vital, as otherwise the electrons in the electric current could not penetrate the potential barrier made up of a layer of oxide. Flash memory chips found in USB drives also use quantum tunneling, to erase their memory cells.
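As a rough illustration of why barrier thickness matters so much, the probability of tunneling through a rectangular barrier falls off exponentially with its width, T ≈ exp(−2κd). The sketch below uses a standard WKB-style estimate with made-up but plausible barrier numbers, not measured device parameters.

```python
import math

# Rough WKB estimate of electron tunneling through a rectangular barrier:
# T ~ exp(-2*kappa*d), with kappa = sqrt(2*m*(V - E))/hbar.
# Barrier height and widths are illustrative, not measured values.
hbar = 1.055e-34   # J*s
m_e = 9.109e-31    # electron mass, kg
eV = 1.602e-19     # joules per electronvolt

def tunneling_probability(barrier_ev, energy_ev, width_m):
    kappa = math.sqrt(2 * m_e * (barrier_ev - energy_ev) * eV) / hbar
    return math.exp(-2 * kappa * width_m)

p_thin = tunneling_probability(1.0, 0.0, 1e-9)    # ~1 nm oxide layer
p_thick = tunneling_probability(1.0, 0.0, 5e-9)   # ~5 nm oxide layer
print(p_thin, p_thick)
```

The exponential dependence on width is the point: a few extra nanometres of oxide suppress the tunneling current by many orders of magnitude.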

Double-slit experiment

From Wikipedia, the free encyclopedia
Photons or matter (like electrons) produce an interference pattern when two slits are used.
 
Light from a green laser passing through two slits 0.1 millimeter wide and 0.4 millimeter apart

In modern physics, the double-slit experiment demonstrates that light and matter can exhibit behavior associated with both classical particles and classical waves. This type of experiment was first described by Thomas Young in 1801 when making his case for the wave behavior of visible light. In 1927, Davisson and Germer and, independently, George Paget Thomson and his research student Alexander Reid demonstrated that electrons show the same behavior, which was later extended to atoms and molecules.

The experiment belongs to a general class of "double path" experiments, in which a wave is split into two separate waves (the wave is typically made of many photons and better referred to as a wave front, not to be confused with the wave properties of the individual photon) that later combine into a single wave. Changes in the path-lengths of both waves result in a phase shift, creating an interference pattern. Another version is the Mach–Zehnder interferometer, which splits the beam with a beam splitter.

In the basic version of this experiment, a coherent light source, such as a laser beam, illuminates a plate pierced by two parallel slits, and the light passing through the slits is observed on a screen behind the plate. The wave nature of light causes the light waves passing through the two slits to interfere, producing bright and dark bands on the screen – a result that would not be expected if light consisted of classical particles. However, the light is always found to be absorbed at the screen at discrete points, as individual particles (not waves); the interference pattern appears via the varying density of these particle hits on the screen. Furthermore, versions of the experiment that include detectors at the slits find that each detected photon passes through one slit (as would a classical particle), and not through both slits (as would a wave). However, such experiments demonstrate that particles do not form the interference pattern if one detects which slit they pass through. These results demonstrate the principle of wave–particle duality.

Other atomic-scale entities, such as electrons, are found to exhibit the same behavior when fired towards a double slit. Additionally, the detection of individual discrete impacts is observed to be inherently probabilistic, which is inexplicable using classical mechanics.

The experiment can be done with entities much larger than electrons and photons, although it becomes more difficult as size increases. The largest entities for which the double-slit experiment has been performed were molecules that each comprised 2000 atoms (whose total mass was 25,000 daltons).

The double-slit experiment (and its variations) has become a classic for its clarity in expressing the central puzzles of quantum mechanics. Richard Feynman called it "a phenomenon which is impossible […] to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery [of quantum mechanics]."

Overview

Same double-slit assembly (0.7 mm between slits); in top image, one slit is closed. In the single-slit image, a diffraction pattern (the faint spots on either side of the main band) forms due to the nonzero width of the slit. This diffraction pattern is also seen in the double-slit image, but with many smaller interference fringes.

If one illuminates two parallel slits, the light from the two slits again interferes. Here the interference is a more pronounced pattern with a series of alternating light and dark bands. The width of the bands is a property of the frequency of the illuminating light.

If light consisted strictly of ordinary or classical particles, and these particles were fired in a straight line through a slit and allowed to strike a screen on the other side, we would expect to see a pattern corresponding to the size and shape of the slit. However, when this "single-slit experiment" is actually performed, the pattern on the screen is a diffraction pattern in which the light is spread out. The smaller the slit, the greater the angle of spread. The top portion of the image shows the central portion of the pattern formed when a red laser illuminates a slit and, if one looks carefully, two faint side bands. More bands can be seen with a more highly refined apparatus. The wave theory of light explains the pattern as being the result of the interference of light waves from the slit.

Feynman was fond of saying that all of quantum mechanics can be gleaned from carefully thinking through the implications of this single experiment. He also proposed (as a thought experiment) that if detectors were placed before each slit, the interference pattern would disappear.

History

Fig. 442 in a 1807 book about Young's lectures with a caption by Young that reads: “the manner in which two portions of coloured light, admitted through two small apertures, produce light and dark stripes or fringes by their interference.”

In 1801, Thomas Young presented a famous paper to the Royal Society entitled "On the Theory of Light and Colours" which explained interference phenomena like Newton's rings in terms of wave interference. The first published account of what Young called his 'general law' of interference appeared in January 1802, in his book A Syllabus of a Course of Lectures on Natural and Experimental Philosophy:

But the general law, by which all these appearances are governed, may be very easily deduced from the interference of two coincident undulations, which either cooperate, or destroy each other, in the same manner as two musical notes produce an alternate intension and remission, in the beating of an imperfect unison.

In 1803, Young followed up with demonstrations of optical interference using sunlight, pinholes and cards. These demonstrations played a crucial role in the understanding of the wave theory of light, challenging the corpuscular theory of light proposed by Isaac Newton, which had been the accepted model of light propagation in the 17th and 18th centuries. As part of his discussion of these experiments, he described a double-slit experiment; there is some question as to whether he ever actually performed one. The later discovery of the photoelectric effect demonstrated that under different circumstances, light can behave as if it is composed of discrete particles. These seemingly contradictory discoveries, now called wave–particle duality, made it necessary to go beyond classical physics and take into account the quantum nature of light.

A low-intensity double-slit experiment was first performed by G. I. Taylor in 1909, by reducing the level of incident light until photon emission/absorption events were mostly non-overlapping. A slit interference experiment was not performed with anything other than light until 1961, when Claus Jönsson of the University of Tübingen performed it with coherent electron beams and multiple slits. In 1974, the Italian physicists Pier Giorgio Merli, Gian Franco Missiroli, and Giulio Pozzi performed a related experiment using single electrons from a coherent source and a biprism beam splitter, showing the statistical nature of the buildup of the interference pattern, as predicted by quantum theory. In 2002, the single-electron version of the experiment was voted "the most beautiful experiment" by readers of Physics World. Since that time a number of related experiments have been published, with a little controversy.

In 2012, Stefano Frabboni and co-workers sent single electrons onto nanofabricated slits (about 100 nm wide) and, by detecting the transmitted electrons with a single-electron detector, they could show the build-up of a double-slit interference pattern. Many related experiments involving the coherent interference have been performed; they are the basis of modern electron diffraction, microscopy and high resolution imaging.

In 2018, single particle interference was demonstrated for antimatter in the Positron Laboratory (L-NESS, Politecnico di Milano) of Rafael Ferragut in Como (Italy), by a group led by Marco Giammarchi.

Variations of the experiment

Interference from individual particles

An important version of this experiment involves single particle detection. Illuminating the double-slit with a low intensity results in single particles being detected as white dots on the screen. Remarkably, however, an interference pattern emerges when these particles are allowed to build up one by one (see the image below).

Experimental electron double-slit diffraction pattern
Electron diffraction pattern
Final result of dot-by-dot build up of diffraction. Across the middle of the image, the intensity alternates from high to low, showing interference in the signal from the two slits.
 
Dots slowly filling an interference pattern.
Movie of the pattern being built up dot by dot.

This demonstrates the wave–particle duality, which states that all matter exhibits both wave and particle properties: the particle is measured as a single pulse at a single position, while the modulus squared of the wave describes the probability of detecting the particle at a specific place on the screen, giving a statistical interference pattern. This phenomenon has been shown to occur with photons, electrons, atoms, and even some molecules: with buckminsterfullerene (C₆₀) in 2001, with two molecules of 430 atoms (C₆₀(C₁₂F₂₅)₁₀ and C₁₆₈H₉₄F₁₅₂O₈N₄S₄) in 2011, and with molecules of up to 2000 atoms in 2019. In addition to interference patterns built up from single particles, up to four entangled photons can also show interference patterns.
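The statistical build-up can be sketched numerically. In the toy simulation below (the slit geometry is illustrative, not taken from any of the cited experiments), each simulated detection is a single random screen position drawn from the double-slit probability density; fringes appear only in the accumulated counts, never in any single dot.

```python
import math
import random

# Each simulated detection is one random screen position drawn from the
# double-slit probability density
#   p(x) ∝ cos^2(pi*d*x/(lam*z)) * sinc^2(pi*b*x/(lam*z)).
# Geometry (slit separation d, slit width b, screen distance z) is illustrative.
random.seed(0)
lam, d, b, z = 0.6e-6, 0.5e-3, 0.05e-3, 1.0

def sinc(x):
    return 1.0 if x == 0 else math.sin(x) / x

def density(x):
    u = math.pi * x / (lam * z)
    return math.cos(u * d) ** 2 * sinc(u * b) ** 2   # always <= 1

def sample_hit(half_width=5e-3):
    # rejection sampling over the screen region [-half_width, half_width]
    while True:
        x = random.uniform(-half_width, half_width)
        if random.random() < density(x):
            return x

hits = [sample_hit() for _ in range(5000)]

# Histogram the hits into 50 bins of 0.2 mm; fringes spaced lam*z/d = 1.2 mm
# emerge only in the accumulated counts.
counts = [0] * 50
for x in hits:
    counts[min(49, int((x + 5e-3) / 2e-4))] += 1
print(counts)
```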

Mach–Zehnder interferometer

Light in Mach–Zehnder interferometer produces interference (wave-like behavior) even when being detected one photon at a time (particle-like behavior).

The Mach–Zehnder interferometer can be seen as a simplified version of the double-slit experiment. Instead of propagating through free space after the two slits, and hitting any position in an extended screen, in the interferometer the photons can only propagate via two paths, and hit two discrete photodetectors. This makes it possible to describe it via simple linear algebra in dimension 2, rather than differential equations.

A photon emitted by the laser hits the first beam splitter and is then in a superposition between the two possible paths. In the second beam splitter these paths interfere, causing the photon to hit the photodetector on the right with probability one, and the photodetector on the bottom with probability zero.[51] Blocking one of the paths, or equivalently detecting the presence of a photon on a path eliminates interference between the paths: both photodetectors will be hit with probability 1/2. This indicates that after the first beam splitter the photon does not take one path or another, but rather exists in a quantum superposition of the two paths.
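A minimal sketch of this dimension-2 description, using the common 50/50 beam-splitter convention B = (1/√2)[[1, i], [i, 1]] (the phase convention is a choice; which detector ends up with probability one depends on it):

```python
import math

# State: complex amplitudes over the two paths (equivalently, the two
# output ports). A 50/50 beam splitter acts as the unitary
#   B = (1/sqrt(2)) * [[1, 1j], [1j, 1]]
# (one common convention; other conventions swap which detector fires).
inv_sqrt2 = 1 / math.sqrt(2)

def beam_splitter(state):
    a, b = state
    return (inv_sqrt2 * (a + 1j * b), inv_sqrt2 * (1j * a + b))

# Photon enters on path 0 and traverses both beam splitters:
state = beam_splitter(beam_splitter((1 + 0j, 0 + 0j)))
probs_interfere = [abs(a) ** 2 for a in state]
print(probs_interfere)   # all probability at one detector (~[0, 1])

# Blocking path 1 after the first splitter removes the interference:
state = beam_splitter((1 + 0j, 0 + 0j))
state = (state[0], 0 + 0j)               # amplitude on path 1 is absorbed
state = beam_splitter(state)
probs_blocked = [abs(a) ** 2 for a in state]
print(probs_blocked)   # ~[0.25, 0.25]: each detector fires half the time
                       # the photon is not absorbed by the block
```

Two applications of B concentrate all the probability at one detector, while removing the path-1 amplitude leaves the remaining probability split evenly, just as described above.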

"Which-way" experiments and the principle of complementarity

A well-known thought experiment predicts that if particle detectors are positioned at the slits, showing through which slit a photon goes, the interference pattern will disappear. This which-way experiment illustrates the complementarity principle that photons can behave as either particles or waves, but cannot be observed as both at the same time. Despite the importance of this thought experiment in the history of quantum mechanics (for example, see the discussion on Einstein's version of this experiment), technically feasible realizations of this experiment were not proposed until the 1970s. (Naive implementations of the textbook thought experiment are not possible because photons cannot be detected without absorbing the photon.) Currently, multiple experiments have been performed illustrating various aspects of complementarity.

An experiment performed in 1987 produced results that demonstrated that partial information could be obtained regarding which path a particle had taken without destroying the interference altogether. This "wave-particle trade-off" takes the form of an inequality relating the visibility of the interference pattern and the distinguishability of the which-way paths.
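This inequality is often written V² + D² ≤ 1 (the Englert–Greenberger–Yasin relation), with equality for pure states. A toy model, assuming the which-path information is carried by a marker whose two states overlap by an amount s:

```python
import math

# Toy model of the trade-off: the slit leaves a which-path marker in state
# m1 or m2; their overlap s = <m1|m2> sets how much wave behavior survives.
# Fringe visibility V = |s|; distinguishability D = sqrt(1 - |s|^2); for a
# pure marker state, V^2 + D^2 = 1 exactly.
def tradeoff(overlap):
    V = abs(overlap)              # fringes survive to the extent the marker states overlap
    D = math.sqrt(1 - V ** 2)     # paths are knowable to the extent the states differ
    return V, D

for s in (1.0, 0.8, 0.5, 0.0):
    V, D = tradeoff(s)
    print(f"overlap={s:.1f}  V={V:.2f}  D={D:.2f}  V^2+D^2={V*V + D*D:.2f}")
```

Full overlap (s = 1) gives perfect fringes and no path knowledge; zero overlap gives perfect path knowledge and no fringes, with a continuous trade-off in between.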

Delayed choice and quantum eraser variations

Wheeler's Delayed Choice Experiment
A diagram of Wheeler's delayed choice experiment, showing the principle of determining the path of the photon after it passes through the slit

Wheeler's delayed-choice experiments demonstrate that extracting "which path" information after a particle passes through the slits can seem to retroactively alter its previous behavior at the slits.

Quantum eraser experiments demonstrate that wave behavior can be restored by erasing or otherwise making permanently unavailable the "which path" information.

A simple do-it-at-home illustration of the quantum eraser phenomenon was given in an article in Scientific American. If one sets polarizers before each slit with their axes orthogonal to each other, the interference pattern will be eliminated. The polarizers can be considered as introducing which-path information to each beam. Introducing a third polarizer in front of the detector with an axis of 45° relative to the other polarizers "erases" this information, allowing the interference pattern to reappear. This can also be accounted for by considering the light to be a classical wave, and also when using circular polarizers and single photons. Implementations of the polarizers using entangled photon pairs have no classical explanation.
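The classical-wave account of this polarizer arrangement can be sketched with Jones calculus. The model below uses hypothetical unit amplitudes and is a minimal sketch, not a description of the Scientific American setup:

```python
import cmath
import math

# Jones-vector model (Ex, Ey): slit 1 carries horizontal polarization,
# slit 2 vertical, with a path-difference phase phi between them.
# Amplitudes are hypothetical unit values.
def screen_intensity(phi, polarizer_angle=None):
    ex = 1 / math.sqrt(2)                       # slit 1: horizontal
    ey = cmath.exp(1j * phi) / math.sqrt(2)     # slit 2: vertical
    if polarizer_angle is not None:
        c, s = math.cos(polarizer_angle), math.sin(polarizer_angle)
        amp = c * ex + s * ey                   # project onto the polarizer axis
        ex, ey = amp * c, amp * s
    return abs(ex) ** 2 + abs(ey) ** 2

phis = [2 * math.pi * i / 100 for i in range(100)]
marked = [screen_intensity(p) for p in phis]                               # orthogonal polarizations
erased = [screen_intensity(p, polarizer_angle=math.pi / 4) for p in phis]  # 45-degree "eraser"

print(max(marked) - min(marked))   # ~0: no fringes while which-path info is present
print(max(erased) - min(erased))   # large: fringes return once the info is erased
```

With orthogonal polarizations the cross term vanishes and the intensity is flat in the phase; projecting both beams onto a common 45° axis restores the cross term and the fringes.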

Weak measurement

In a highly publicized experiment in 2012, researchers claimed to have identified the path each particle had taken without any adverse effects at all on the interference pattern generated by the particles. In order to do this, they used a setup such that particles coming to the screen were not from a point-like source, but from a source with two intensity maxima. However, commentators such as Svensson have pointed out that there is in fact no conflict between the weak measurements performed in this variant of the double-slit experiment and the Heisenberg uncertainty principle. Weak measurement followed by post-selection did not allow simultaneous position and momentum measurements for each individual particle, but rather allowed measurement of the average trajectory of the particles that arrived at different positions. In other words, the experimenters were creating a statistical map of the full trajectory landscape.

Other variations

A laboratory double-slit assembly; distance between top posts is approximately 2.5 cm (one inch).
Near-field intensity distribution patterns for plasmonic slits with equal widths (A) and non-equal widths (B)

In 1967, Pfleegor and Mandel demonstrated two-source interference using two separate lasers as light sources.

It was shown experimentally in 1972 that in a double-slit system where only one slit was open at any time, interference was nonetheless observed provided the path difference was such that the detected photon could have come from either slit. The experimental conditions were such that the photon density in the system was much less than 1.

In 1991, Carnal and Mlynek performed the classic Young's double slit experiment with metastable helium atoms passing through micrometer-scale slits in gold foil.

In 1999, a quantum interference experiment (using a diffraction grating, rather than two slits) was successfully performed with buckyball molecules (each of which comprises 60 carbon atoms). A buckyball is large enough (diameter about 0.7 nm, nearly half a million times larger than a proton) to be seen in an electron microscope.

In 2002, an electron field emission source was used to demonstrate the double-slit experiment. In this experiment, a coherent electron wave was emitted from two closely located emission sites on the needle apex, which acted as double slits, splitting the wave into two coherent electron waves in a vacuum. The interference pattern between the two electron waves could then be observed. In 2017, researchers performed the double-slit experiment using light-induced field electron emitters. With this technique, emission sites can be optically selected on a scale of ten nanometers. By selectively deactivating (closing) one of the two emissions (slits), researchers were able to show that the interference pattern disappeared.

In 2005, E. R. Eliel presented an experimental and theoretical study of the optical transmission of a thin metal screen perforated by two subwavelength slits, separated by many optical wavelengths. The total intensity of the far-field double-slit pattern is shown to be reduced or enhanced as a function of the wavelength of the incident light beam.

In 2012, researchers at the University of Nebraska–Lincoln performed the double-slit experiment with electrons as described by Richard Feynman, using new instruments that allowed control of the transmission of the two slits and the monitoring of single-electron detection events. Electrons were fired by an electron gun and passed through one or two slits of 62 nm wide × 4 μm tall.

In 2013, a quantum interference experiment (using diffraction gratings, rather than two slits) was successfully performed with molecules that each comprised 810 atoms (whose total mass was over 10,000 daltons). The record was raised to 2000 atoms (25,000 amu) in 2019.

In 2025, scientists at the Massachusetts Institute of Technology performed a version of the double-slit experiment using atoms trapped in an optical lattice and forming a Mott insulator. After being released from the optical lattice, these ultra-cold atoms have Heisenberg-uncertainty-limited momentum and position. Single photons initially scatter from the atoms coherently and form an optical interference pattern. As the wavepackets of the released atoms expand, the interference pattern disappears, confirming incoherent scattering. The experiment demonstrates wave interference using only quantum atoms and photons.

Hydrodynamic pilot wave analogs

Hydrodynamic analogs have been developed that can recreate various aspects of quantum mechanical systems, including single-particle interference through a double-slit. A silicone oil droplet, bouncing along the surface of a liquid, self-propels via resonant interactions with its own wave field. The droplet gently sloshes the liquid with every bounce. At the same time, ripples from past bounces affect its course. The droplet's interaction with its own ripples, which form what is known as a pilot wave, causes it to exhibit behaviors previously thought to be peculiar to elementary particles – including behaviors customarily taken as evidence that elementary particles are spread through space like waves, without any specific location, until they are measured.

Behaviors mimicked via this hydrodynamic pilot-wave system include quantum single particle diffraction, tunneling, quantized orbits, orbital level splitting, spin, and multimodal statistics. It is also possible to infer uncertainty relations and exclusion principles. Videos are available illustrating various features of this system. (See the External links.)

However, more complicated systems that involve two or more particles in superposition are not amenable to such a simple, classically intuitive explanation. Accordingly, no hydrodynamic analog of entanglement has been developed. Nevertheless, optical analogs are possible.

Double-slit experiment on time

In 2023, an experiment recreated an interference pattern in time. A pump laser pulse shone on a screen coated in indium tin oxide (ITO) altered the properties of the electrons within the material through the Kerr effect, switching it from transparent to reflective for around 200 femtoseconds. A subsequent probe laser beam hitting the ITO screen then saw this temporary change in optical properties as a slit in time, and two such pulses as a double slit, with a phase difference between frequency components that adds destructively or constructively, producing an interference pattern. Similar results have been obtained classically with water waves.

Classical wave-optics formulation

Two-slit diffraction pattern with an incident plane wave
Photo of the double-slit interference of sunlight
Two slits are illuminated by a plane wave, showing the path difference.

Much of the behaviour of light can be modelled using classical wave theory. The Huygens–Fresnel principle is one such model; it states that each point on a wavefront generates a secondary wavelet, and that the disturbance at any subsequent point can be found by summing the contributions of the individual wavelets at that point. This summation needs to take into account the phase as well as the amplitude of the individual wavelets. Only the intensity of a light field can be measured—this is proportional to the square of the amplitude.

In the double-slit experiment, the two slits are illuminated by the quasi-monochromatic light of a single laser. If the width of the slits is small enough (much less than the wavelength of the laser light), the slits diffract the light into cylindrical waves. These two cylindrical wavefronts are superimposed, and the amplitude, and therefore the intensity, at any point in the combined wavefronts depends on both the magnitude and the phase of the two wavefronts. The difference in phase between the two waves is determined by the difference in the distance travelled by the two waves.

If the viewing distance is large compared with the separation of the slits (the far field), the phase difference can be found using the geometry shown in the figure below right. The path difference between two waves travelling at an angle θ is given by:

    d sin θ

where d is the distance between the two slits. When the two waves are in phase, i.e. the path difference is equal to an integral number of wavelengths, the summed amplitude, and therefore the summed intensity, is maximal; when they are in anti-phase, i.e. the path difference is equal to half a wavelength, one and a half wavelengths, etc., the two waves cancel and the summed intensity is zero. This effect is known as interference. The interference fringe maxima occur at angles

    d sin θn = nλ,  n = 0, 1, 2, …

where λ is the wavelength of the light. The angular spacing of the fringes, θf, is given by

    θf ≈ λ / d

The spacing of the fringes at a distance z from the slits is given by

    w = zλ / d

For example, if two slits are separated by 0.5 mm (d), and are illuminated with a 0.6 μm wavelength laser (λ), then at a distance of 1 m (z), the spacing of the fringes will be 1.2 mm.
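The worked example corresponds directly to the far-field fringe-spacing formula w = zλ/d:

```python
# Fringe spacing in the far field: w = z * lam / d.
lam = 0.6e-6   # wavelength, m
d = 0.5e-3     # slit separation, m
z = 1.0        # screen distance, m

w = z * lam / d
print(f"fringe spacing: {w * 1e3:.1f} mm")   # prints "fringe spacing: 1.2 mm"
```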

If the width of the slits b is appreciable compared to the wavelength, the Fraunhofer diffraction equation is needed to determine the intensity of the diffracted light:

    I(θ) = I₀ cos²(πd sin θ / λ) sinc²(πb sin θ / λ)

where the sinc function is defined as sinc(x) = sin(x)/x for x ≠ 0, and sinc(0) = 1.

This is illustrated in the figure above, where the first pattern is the diffraction pattern of a single slit, given by the sinc function in this equation, and the second figure shows the combined intensity of the light diffracted from the two slits, where the cos function represents the fine structure, and the coarser structure represents diffraction by the individual slits as described by the sinc function.

Similar calculations for the near field can be made by applying the Fresnel diffraction equation, which implies that as the plane of observation gets closer to the plane in which the slits are located, the diffraction patterns associated with each slit decrease in size, so that the area in which interference occurs is reduced, and may vanish altogether when there is no overlap in the two diffracted patterns.

Path-integral formulation

The double-slit experiment can illustrate the path integral formulation of quantum mechanics provided by Feynman. The path integral formulation replaces the classical notion of a single, unique trajectory for a system, with a sum over all possible trajectories. The trajectories are added together by using functional integration.

Each path is considered equally likely, and thus contributes the same amount. However, the phase of this contribution at any given point along the path is determined by the action along the path:

    A(x(t)) = e^{iS(x(t))/ħ}

where S(x(t)) is the classical action, the time integral of the Lagrangian along the path. All these contributions are then added together, and the magnitude of the final result is squared, to get the probability distribution for the position of a particle:

    p(x) ∝ | Σ over all paths of e^{iS(x(t))/ħ} |²

As is always the case when calculating probability, the results must then be normalized by imposing:

    ∫ p(x) dx = 1 over all space

The probability distribution of the outcome is the normalized square of the norm of the superposition, over all paths from the point of origin to the final point, of waves propagating proportionally to the action along each path. The differences in the cumulative action along the different paths (and thus the relative phases of the contributions) produces the interference pattern observed by the double-slit experiment. Feynman stressed that his formulation is merely a mathematical description, not an attempt to describe a real process that we can measure.
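A minimal numerical sketch of this sum over paths keeps only the two straight-line paths through the slits: for a free particle at fixed speed, the phase e^{iS/ħ} along a straight path reduces to e^{ikL}, with L the path length. The geometry values below are illustrative.

```python
import cmath
import math

# Two-path sum: amplitude at screen position x is the sum of exp(i*k*L)
# over the straight-line path through each slit, with L the path length.
# Wavelength, slit separation, and screen distance are illustrative.
lam = 0.6e-6
k = 2 * math.pi / lam
d, z = 0.5e-3, 1.0

def probability(x):
    """Unnormalized |sum over the two paths of exp(i*k*L)|^2."""
    amp = 0j
    for slit_y in (d / 2, -d / 2):
        L = math.sqrt(z ** 2 + (x - slit_y) ** 2)
        amp += cmath.exp(1j * k * L)
    return abs(amp) ** 2

print(probability(0.0))      # central maximum: the two phases agree, ~4
print(probability(0.6e-3))   # first dark fringe at x = lam*z/(2*d): ~0
```

The difference in cumulative phase k·ΔL between the two paths reproduces the familiar bright and dark fringes.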

Interpretations of the experiment

Like the Schrödinger's cat thought experiment, the double-slit experiment is often used to highlight the differences and similarities between the various interpretations of quantum mechanics.

Standard quantum physics

The standard interpretation of the double slit experiment is that the pattern is a wave phenomenon, representing interference between two probability amplitudes, one for each slit. Low intensity experiments demonstrate that the pattern is filled in one particle detection at a time. Any change to the apparatus designed to detect a particle at a particular slit alters the probability amplitudes and the interference disappears. This interpretation is independent of any conscious observer.

Complementarity

Niels Bohr interpreted quantum experiments like the double-slit experiment using the concept of complementarity. In Bohr's view quantum systems are not classical, but measurements can only give classical results. Certain pairs of classical properties will never be observed in a quantum system simultaneously: the interference pattern of waves in the double slit experiment will disappear if particles are detected at the slits. Modern quantitative versions of the concept allow for a continuous tradeoff between the visibility of the interference fringes and the probability of particle detection at a slit.

Copenhagen interpretation

The Copenhagen interpretation is a collection of views about the meaning of quantum mechanics, stemming from the work of Niels Bohr, Werner Heisenberg, Max Born, and others. The term "Copenhagen interpretation" was apparently coined by Heisenberg during the 1950s to refer to ideas developed in the 1925–1927 period, glossing over his disagreements with Bohr. Consequently, there is no definitive historical statement of what the interpretation entails. Features common across versions of the Copenhagen interpretation include the idea that quantum mechanics is intrinsically indeterministic, with probabilities calculated using the Born rule, and some form of complementarity principle. Moreover, the act of "observing" or "measuring" an object is irreversible, and no truth can be attributed to an object, except according to the results of its measurement. In the Copenhagen interpretation, complementarity means a particular experiment can demonstrate particle behavior (passing through a definite slit) or wave behavior (interference), but not both at the same time. In a Copenhagen-type view, the question of which slit a particle travels through has no meaning when there is no detector.

Relational interpretation

According to the relational interpretation of quantum mechanics, first proposed by Carlo Rovelli, observations such as those in the double-slit experiment result specifically from the interaction between the observer (measuring device) and the object being observed (physically interacted with), not any absolute property possessed by the object. In the case of an electron, if it is initially "observed" at a particular slit, then the observer–particle (photon–electron) interaction includes information about the electron's position. This partially constrains the particle's eventual location at the screen. If it is "observed" (measured with a photon) not at a particular slit but rather at the screen, then there is no "which path" information as part of the interaction, so the electron's "observed" position on the screen is determined strictly by its probability function. This makes the resulting pattern on the screen the same as if each individual electron had passed through both slits.

Many-worlds interpretation

As with Copenhagen, there are multiple variants of the many-worlds interpretation. The unifying theme is that physical reality is identified with a wavefunction, and this wavefunction always evolves unitarily, i.e., following the Schrödinger equation with no collapses. Consequently, there are many parallel universes, which only interact with each other through interference. David Deutsch argues that the way to understand the double-slit experiment is that in each universe the particle travels through a specific slit, but its motion is affected by interference with particles in other universes, and this interference creates the observable fringes. David Wallace, another advocate of the many-worlds interpretation, writes that in the familiar setup of the double-slit experiment the two paths are not sufficiently separated for a description in terms of parallel universes to make sense.

De Broglie–Bohm theory

An alternative to the standard understanding of quantum mechanics, the De Broglie–Bohm theory states that particles also have precise locations at all times, and that their velocities are defined by the wave-function. So while a single particle will travel through one particular slit in the double-slit experiment, the so-called "pilot wave" that influences it will travel through both. The two slit de Broglie–Bohm trajectories were first calculated by Chris Dewdney while working with Chris Philippidis and Basil Hiley at Birkbeck College (London). Despite being deterministic, the de Broglie–Bohm theory produces the same statistical results as standard quantum mechanics under the "quantum equilibrium hypothesis", which requires the initially prepared particle positions to be randomly distributed according to the modulus squared of the initially prepared wave function. Although de Broglie–Bohm theory overcomes many of the conceptual difficulties of quantum mechanics in the non-relativistic setting, it is incompatible with Lorentz invariance.
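A rough one-dimensional sketch of the guidance idea: the particle velocity is v = (ħ/m) Im(ψ′/ψ), evaluated along the trajectory. The toy model below uses units ħ = m = 1 and illustrative numbers; it is not the Philippidis–Dewdney–Hiley computation.

```python
import cmath
import math

# Guidance equation v = (hbar/m) * Im(psi'(x)/psi(x)), with psi a
# superposition of two freely spreading Gaussian packets, one per slit.
# Units hbar = m = 1; packet width, slit separation, and step sizes are
# illustrative toy-model numbers.
hbar = m = 1.0
sigma, d = 1.0, 8.0

def psi(x, t):
    s = sigma ** 2 * (1 + 1j * hbar * t / (2 * m * sigma ** 2))
    return sum(cmath.exp(-(x - x0) ** 2 / (4 * s)) for x0 in (d / 2, -d / 2))

def velocity(x, t, h=1e-5):
    dpsi = (psi(x + h, t) - psi(x - h, t)) / (2 * h)   # numerical derivative
    return (hbar / m) * (dpsi / psi(x, t)).imag

# Euler-integrate one trajectory starting just above the upper slit. A
# trajectory that starts in the upper half never crosses the symmetry axis.
x, t, dt = d / 2 + 0.1, 0.0, 0.01
for _ in range(2000):
    x += velocity(x, t) * dt
    t += dt
print(x)   # the particle is carried outward as the packets spread and interfere
```

The trajectory is deterministic; the quantum statistics arise only from randomness in the starting position, distributed as |ψ|² per the quantum equilibrium hypothesis.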

Bohmian trajectories
Trajectories of particles in De Broglie–Bohm theory in the double-slit experiment
 
100 trajectories guided by the wave function. In De Broglie–Bohm's theory, a particle is represented, at any time, by a wave function and a position (center of mass). This is a kind of augmented reality compared to the standard interpretation.
 
Numerical simulation of the double-slit experiment with electrons. Figure on the left: evolution (from left to right) of the intensity of the electron beam at the exit of the slits (left) up to the detection screen located 10 cm after the slits (right). The higher the intensity, the more the color is light blue – Figure in the center: impacts of the electrons observed on the screen – Figure on the right: intensity of the electrons in the far field approximation (on the screen). Numerical data from Claus Jönsson's experiment (1961). Photons, atoms and molecules follow a similar evolution.
