Sunday, October 3, 2021

Antimatter

From Wikipedia, the free encyclopedia

In modern physics, antimatter is defined as matter composed of the antiparticles (or "partners") of the corresponding particles in "ordinary" matter. Minuscule numbers of antiparticles are generated daily at particle accelerators—total production has been only a few nanograms—and in natural processes like cosmic ray collisions and some types of radioactive decay, but only a tiny fraction of these have successfully been bound together in experiments to form anti-atoms. No macroscopic amount of antimatter has ever been assembled due to the extreme cost and difficulty of production and handling.

Theoretically, a particle and its anti-particle (for example, a proton and an antiproton) have the same mass, but opposite electric charge, and other differences in quantum numbers. For example, a proton has positive charge while an antiproton has negative charge.

A collision between any particle and its anti-particle partner leads to their mutual annihilation, giving rise to various proportions of intense photons (gamma rays), neutrinos, and sometimes less-massive particle–antiparticle pairs. The majority of the total energy of annihilation emerges in the form of ionizing radiation. If surrounding matter is present, the energy content of this radiation will be absorbed and converted into other forms of energy, such as heat or light. The amount of energy released is usually proportional to the total mass of the collided matter and antimatter, in accordance with the notable mass–energy equivalence equation, E = mc².

Antimatter particles bind with each other to form antimatter, just as ordinary particles bind to form normal matter. For example, a positron (the antiparticle of the electron) and an antiproton (the antiparticle of the proton) can form an antihydrogen atom. The nuclei of antihelium have been artificially produced, albeit with difficulty, and are the most complex anti-nuclei so far observed. Physical principles indicate that complex antimatter atomic nuclei are possible, as well as anti-atoms corresponding to the known chemical elements.

There is strong evidence that the observable universe is composed almost entirely of ordinary matter, as opposed to an equal mixture of matter and antimatter. This asymmetry of matter and antimatter in the visible universe is one of the great unsolved problems in physics. The process by which this inequality between matter and antimatter particles developed is called baryogenesis.

[Image: There are some 500 terrestrial gamma-ray flashes daily. The red dots show those spotted by the Fermi Gamma-ray Space Telescope in 2010; the blue areas indicate where conditions for terrestrial gamma-ray flashes can occur.]

[Video: How scientists used the Fermi Gamma-ray Space Telescope's gamma-ray detector to uncover bursts of antimatter from thunderstorms.]

Definitions

Antimatter particles can be defined by their negative baryon number or lepton number, while "normal" (non-antimatter) matter particles have a positive baryon or lepton number. These two classes of particles are the antiparticle partners of each other. A "positron" is the antimatter equivalent of the "electron".

The term contra-terrene led to the initialism "C.T." and the science fiction term "seetee", as used in such novels as Seetee Ship.

Conceptual history

The idea of negative matter appears in past theories of matter that have now been abandoned. Using the once popular vortex theory of gravity, the possibility of matter with negative gravity was discussed by William Hicks in the 1880s. Between the 1880s and the 1890s, Karl Pearson proposed the existence of "squirts" and sinks of the flow of aether. The squirts represented normal matter and the sinks represented negative matter. Pearson's theory required a fourth dimension for the aether to flow from and into.

The term antimatter was first used by Arthur Schuster in two rather whimsical letters to Nature in 1898. He hypothesized antiatoms, as well as whole antimatter solar systems, and discussed the possibility of matter and antimatter annihilating each other. Schuster's ideas were not a serious theoretical proposal, merely speculation, and like the previous ideas, differed from the modern concept of antimatter in that his antimatter possessed negative gravity.

The modern theory of antimatter began in 1928, with a paper by Paul Dirac. Dirac realised that his relativistic version of the Schrödinger wave equation for electrons predicted the possibility of antielectrons. These were discovered by Carl D. Anderson in 1932 and named positrons from "positive electron". Although Dirac did not himself use the term antimatter, its use follows on naturally enough from antielectrons, antiprotons, etc. A complete periodic table of antimatter was envisaged by Charles Janet in 1929.

The Feynman–Stueckelberg interpretation states that antimatter and antiparticles are regular particles traveling backward in time.

Notation

One way to denote an antiparticle is by adding a bar over the particle's symbol. For example, the proton and antiproton are denoted as p and p̄, respectively. The same rule applies if one were to address a particle by its constituent components. A proton is made up of u u d quarks, so an antiproton must therefore be formed from ū ū d̄ antiquarks. Another convention is to distinguish particles by positive and negative electric charge. Thus, the electron and positron are denoted simply as e⁻ and e⁺ respectively. To prevent confusion, however, the two conventions are never mixed.

Properties

Theorized anti-gravitational properties of antimatter are currently being tested at the AEgIS experiment at CERN. Antimatter coming into contact with matter annihilates both, releasing the combined mass as energy. Further research is needed to study the possible gravitational effects between matter and antimatter, and between antimatter and antimatter. Such research is difficult because the two annihilate on contact, and because capturing and containing antimatter is itself still challenging.

There are compelling theoretical reasons to believe that, aside from the fact that antiparticles have different signs on all charges (such as electric and baryon charges), matter and antimatter have exactly the same properties. This means a particle and its corresponding antiparticle must have identical masses and decay lifetimes (if unstable). It also implies that, for example, a star made up of antimatter (an "antistar") will shine just like an ordinary star. This idea was tested experimentally in 2016 by the ALPHA experiment, which measured the transition between the two lowest energy states of antihydrogen. The results, which are identical to those of hydrogen, confirmed the validity of quantum mechanics for antimatter.

Origin and asymmetry

Most matter observable from the Earth seems to be made of matter rather than antimatter. If antimatter-dominated regions of space existed, the gamma rays produced in annihilation reactions along the boundary between matter and antimatter regions would be detectable.

Antiparticles are created everywhere in the universe where high-energy particle collisions take place. High-energy cosmic rays impacting Earth's atmosphere (or any other matter in the Solar System) produce minute quantities of antiparticles in the resulting particle jets, which are immediately annihilated by contact with nearby matter. They may similarly be produced in regions like the center of the Milky Way and other galaxies, where very energetic celestial events occur (principally the interaction of relativistic jets with the interstellar medium). The presence of the resulting antimatter is detectable by the two gamma rays produced every time positrons annihilate with nearby matter. The frequency and wavelength of the gamma rays indicate that each carries 511 keV of energy (that is, the rest mass of an electron multiplied by c²).
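As a quick sanity check of that figure (a back-of-the-envelope sketch, not part of the original article; the constants are standard values), multiplying the electron rest mass by c² and converting joules to electronvolts recovers the 511 keV line:

    # Electron rest energy in keV: should come out near the 511 keV annihilation line.
    m_e = 9.109e-31      # electron rest mass, kg
    c = 2.998e8          # speed of light, m/s
    eV = 1.602e-19       # joules per electronvolt

    E_keV = m_e * c**2 / eV / 1e3
    print(f"Electron rest energy: {E_keV:.0f} keV")  # prints ~511 keV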

Observations by the European Space Agency's INTEGRAL satellite may explain the origin of a giant antimatter cloud surrounding the galactic center. The observations show that the cloud is asymmetrical and matches the pattern of X-ray binaries (binary star systems containing black holes or neutron stars), mostly on one side of the galactic center. While the mechanism is not fully understood, it is likely to involve the production of electron–positron pairs, as ordinary matter gains kinetic energy while falling into a stellar remnant.

Antimatter may exist in relatively large amounts in far-away galaxies due to cosmic inflation in the primordial time of the universe. Antimatter galaxies, if they exist, are expected to have the same chemistry and absorption and emission spectra as normal-matter galaxies, and their astronomical objects would be observationally identical, making them difficult to distinguish. NASA is trying to determine if such galaxies exist by looking for X-ray and gamma-ray signatures of annihilation events in colliding superclusters.

In October 2017, scientists working on the BASE experiment at CERN reported a measurement of the antiproton magnetic moment to a precision of 1.5 parts per billion. It is consistent with the most precise measurement of the proton magnetic moment (also made by BASE in 2014), which supports the hypothesis of CPT symmetry. This measurement represents the first time that a property of antimatter is known more precisely than the equivalent property in matter.

Antimatter quantum interferometry was first demonstrated at the L-NESS Laboratory of R. Ferragut in Como, Italy, by a group led by M. Giammarchi.

Natural production

Positrons are produced naturally in β+ decays of naturally occurring radioactive isotopes (for example, potassium-40) and in interactions of gamma quanta (emitted by radioactive nuclei) with matter. Antineutrinos are another kind of antiparticle created by natural radioactivity (β− decay). Many different kinds of antiparticles are also produced by (and contained in) cosmic rays. In January 2011, research by the American Astronomical Society discovered antimatter (positrons) originating above thunderstorm clouds; positrons are produced in terrestrial gamma-ray flashes created by electrons accelerated by strong electric fields in the clouds. Antiprotons have also been found to exist in the Van Allen Belts around the Earth by the PAMELA module.

Antiparticles are also produced in any environment with a sufficiently high temperature (mean particle energy greater than the pair production threshold). It is hypothesized that during the period of baryogenesis, when the universe was extremely hot and dense, matter and antimatter were continually produced and annihilated. The presence of remaining matter, and absence of detectable remaining antimatter, is called baryon asymmetry. The exact mechanism that produced this asymmetry during baryogenesis remains an unsolved problem. One of the necessary conditions for this asymmetry is the violation of CP symmetry, which has been experimentally observed in the weak interaction.

Recent observations indicate that black holes and neutron stars produce vast amounts of positron–electron plasma via their jets.

Observation in cosmic rays

Satellite experiments have found evidence of positrons and a few antiprotons in primary cosmic rays, amounting to less than 1% of the particles in primary cosmic rays. This antimatter cannot all have been created in the Big Bang, but is instead attributed to production by cyclic processes at high energies. For instance, electron–positron pairs may be formed in pulsars, as the rotation cycle of a magnetized neutron star shears such pairs from the star's surface. Therein the antimatter forms a wind that crashes upon the ejecta of the progenitor supernovae. This weathering takes place as "the cold, magnetized relativistic wind launched by the star hits the non-relativistically expanding ejecta, a shock wave system forms in the impact: the outer one propagates in the ejecta, while a reverse shock propagates back towards the star." The former ejection of matter in the outer shock wave and the latter production of antimatter in the reverse shock wave are steps in a space weather cycle.

Preliminary results from the presently operating Alpha Magnetic Spectrometer (AMS-02) on board the International Space Station show that positrons in the cosmic rays arrive with no directionality, and with energies that range from 10 GeV to 250 GeV. In September 2014, new results with almost twice as much data were presented in a talk at CERN and published in Physical Review Letters. A new measurement of the positron fraction up to 500 GeV was reported, showing that the positron fraction peaks at a maximum of about 16% of total electron+positron events, around an energy of 275 ± 32 GeV. At higher energies, up to 500 GeV, the ratio of positrons to electrons begins to fall again. The absolute flux of positrons also begins to fall before 500 GeV, but peaks at energies far higher than electron energies, which peak about 10 GeV. These results have been suggested to be due to positron production in annihilation events of massive dark matter particles.

Cosmic ray antiprotons also have a much higher energy than their normal-matter counterparts (protons). They arrive at Earth with a characteristic energy maximum of 2 GeV, indicating their production in a fundamentally different process from cosmic ray protons, which on average have only one-sixth of the energy.

There is an ongoing search for larger antimatter nuclei, such as antihelium nuclei (that is, anti-alpha particles), in cosmic rays. The detection of natural antihelium could imply the existence of large antimatter structures such as an antistar. A prototype of AMS-02, designated AMS-01, was flown into space aboard the Space Shuttle Discovery on STS-91 in June 1998. By not detecting any antihelium at all, the AMS-01 established an upper limit of 1.1×10⁻⁶ for the antihelium-to-helium flux ratio. AMS-02 revealed in December 2016 that it had discovered a few signals consistent with antihelium nuclei amidst several billion helium nuclei. The result remains to be verified, and the team is currently trying to rule out contamination.

Artificial production

Positrons

Positrons were reported in November 2008 to have been generated by Lawrence Livermore National Laboratory in larger numbers than by any previous synthetic process. A laser drove electrons through a gold target's nuclei, which caused the incoming electrons to emit energy quanta that decayed into both matter and antimatter. Positrons were detected at a higher rate and in greater density than ever previously detected in a laboratory. Previous experiments made smaller quantities of positrons using lasers and paper-thin targets; new simulations showed that short bursts of ultra-intense lasers and millimeter-thick gold are a far more effective source.

Antiprotons, antineutrons, and antinuclei

The existence of the antiproton was experimentally confirmed in 1955 by University of California, Berkeley physicists Emilio Segrè and Owen Chamberlain, for which they were awarded the 1959 Nobel Prize in Physics. An antiproton consists of two up antiquarks and one down antiquark (ū ū d̄). The properties of the antiproton that have been measured all match the corresponding properties of the proton, with the exception of the antiproton having opposite electric charge and magnetic moment from the proton. Shortly afterwards, in 1956, the antineutron was discovered in proton–proton collisions at the Bevatron (Lawrence Berkeley National Laboratory) by Bruce Cork and colleagues.

In addition to antibaryons, anti-nuclei consisting of multiple bound antiprotons and antineutrons have been created. These are typically produced at energies far too high to form antimatter atoms (with bound positrons in place of electrons). In 1965, a group of researchers led by Antonino Zichichi reported production of nuclei of antideuterium at the Proton Synchrotron at CERN. At roughly the same time, observations of antideuterium nuclei were reported by a group of American physicists at the Alternating Gradient Synchrotron at Brookhaven National Laboratory.

Antihydrogen atoms

Antimatter facilities
Low Energy Antiproton Ring (LEAR) (1982–1996)
  • Antiproton Accumulator – antiproton production
  • Antiproton Collector – decelerated and stored antiprotons
Antimatter Factory (2000–present)
  • Antiproton Decelerator (AD) – decelerates antiprotons
  • Extra Low Energy Antiproton ring (ELENA) – decelerates antiprotons received from the AD

In 1995, CERN announced that it had successfully brought into existence nine hot antihydrogen atoms by implementing the SLAC/Fermilab concept during the PS210 experiment. The experiment was performed using the Low Energy Antiproton Ring (LEAR), and was led by Walter Oelert and Mario Macri. Fermilab soon confirmed the CERN findings by producing approximately 100 antihydrogen atoms at their facilities. The antihydrogen atoms created during PS210 and subsequent experiments (at both CERN and Fermilab) were extremely energetic and were not well suited to study. To overcome this hurdle, and to gain a better understanding of antihydrogen, two collaborations were formed in the late 1990s, namely, ATHENA and ATRAP.

In 1999, CERN activated the Antiproton Decelerator, a device capable of decelerating antiprotons from 3500 MeV to 5.3 MeV – still too "hot" to produce study-effective antihydrogen, but a huge leap forward. In late 2002 the ATHENA project announced that they had created the world's first "cold" antihydrogen. The ATRAP project released similar results very shortly thereafter. The antiprotons used in these experiments were cooled by decelerating them with the Antiproton Decelerator, passing them through a thin sheet of foil, and finally capturing them in a Penning–Malmberg trap. The overall cooling process is workable, but highly inefficient; approximately 25 million antiprotons leave the Antiproton Decelerator and roughly 25,000 make it to the Penning–Malmberg trap, which is about 1/1000 or 0.1% of the original amount.

The antiprotons are still hot when initially trapped. To cool them further, they are mixed into an electron plasma. The electrons in this plasma cool via cyclotron radiation, and then sympathetically cool the antiprotons via Coulomb collisions. Eventually, the electrons are removed by the application of short-duration electric fields, leaving the antiprotons with energies less than 100 meV. While the antiprotons are being cooled in the first trap, a small cloud of positrons is captured from radioactive sodium in a Surko-style positron accumulator. This cloud is then recaptured in a second trap near the antiprotons. Manipulations of the trap electrodes then tip the antiprotons into the positron plasma, where some combine with positrons to form antihydrogen. This neutral antihydrogen is unaffected by the electric and magnetic fields used to trap the charged positrons and antiprotons, and within a few microseconds the antihydrogen hits the trap walls, where it annihilates. Some hundreds of millions of antihydrogen atoms have been made in this fashion.

In 2005, ATHENA disbanded and some of the former members (along with others) formed the ALPHA Collaboration, which is also based at CERN. The ultimate goal of this endeavour is to test CPT symmetry through comparison of the atomic spectra of hydrogen and antihydrogen (see hydrogen spectral series).

In 2016 a new antiproton decelerator and cooler called ELENA (Extra Low ENergy Antiproton decelerator) was built. It takes the antiprotons from the Antiproton Decelerator and cools them to 90 keV, which is "cold" enough to study. More than one hundred antiprotons can be captured per second, a huge improvement, but it would still take several thousand years to make a nanogram of antimatter.

Most of the sought-after high-precision tests of the properties of antihydrogen could only be performed if the antihydrogen were trapped, that is, held in place for a relatively long time. While antihydrogen atoms are electrically neutral, the spins of their component particles produce a magnetic moment. These magnetic moments can interact with an inhomogeneous magnetic field; some of the antihydrogen atoms can be attracted to a magnetic minimum. Such a minimum can be created by a combination of mirror and multipole fields. Antihydrogen can be trapped in such a magnetic minimum (minimum-B) trap; in November 2010, the ALPHA collaboration announced that they had so trapped 38 antihydrogen atoms for about a sixth of a second. This was the first time that neutral antimatter had been trapped.

On 26 April 2011, ALPHA announced that they had trapped 309 antihydrogen atoms, some for as long as 1,000 seconds (about 17 minutes). This was longer than neutral antimatter had ever been trapped before. ALPHA has used these trapped atoms to initiate research into the spectral properties of the antihydrogen.

The biggest limiting factor in the large-scale production of antimatter is the availability of antiprotons. Recent data released by CERN states that, when fully operational, their facilities are capable of producing ten million antiprotons per minute. Assuming a 100% conversion of antiprotons to antihydrogen, it would take 100 billion years to produce 1 gram or 1 mole of antihydrogen (approximately 6.02×10²³ atoms of antihydrogen).
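The 100-billion-year figure follows from simple division; here is a sketch of the arithmetic (assuming the stated rate of ten million antiprotons per minute and one antiproton per antihydrogen atom):

    # Time to accumulate one mole of antihydrogen at CERN's stated production rate.
    N_A = 6.02214e23          # atoms in one mole (≈ 1 gram of antihydrogen)
    rate_per_min = 1.0e7      # antiprotons per minute, assuming 100% conversion

    minutes = N_A / rate_per_min
    years = minutes / (60 * 24 * 365.25)
    print(f"{years:.2e} years")   # ~1.1e11, i.e. about 100 billion years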

Antihelium

Antihelium-3 nuclei (³He̅) were first observed in the 1970s in proton–nucleus collision experiments at the Institute for High Energy Physics by Yuri Prokoshkin's group (Protvino near Moscow, USSR) and later created in nucleus–nucleus collision experiments. Nucleus–nucleus collisions produce antinuclei through the coalescence of antiprotons and antineutrons created in these reactions. In 2011, the STAR detector reported the observation of artificially created antihelium-4 nuclei (anti-alpha particles, ⁴He̅) from such collisions.

The Alpha Magnetic Spectrometer on the International Space Station has, as of 2021, recorded eight events that seem to indicate the detection of antihelium-3.

Preservation

Antimatter cannot be stored in a container made of ordinary matter because antimatter reacts with any matter it touches, annihilating itself and an equal amount of the container. Antimatter in the form of charged particles can be contained by a combination of electric and magnetic fields, in a device called a Penning trap. This device cannot, however, contain antimatter that consists of uncharged particles, for which atomic traps are used. In particular, such a trap may use the dipole moment (electric or magnetic) of the trapped particles. At high vacuum, the matter or antimatter particles can be trapped and cooled with slightly off-resonant laser radiation using a magneto-optical trap or magnetic trap. Small particles can also be suspended with optical tweezers, using a highly focused laser beam.

In 2011, CERN scientists were able to preserve antihydrogen for approximately 17 minutes. The record for storing antiparticles is currently held by the TRAP experiment at CERN: antiprotons were kept in a Penning trap for 405 days. A proposal was made in 2018 to develop containment technology advanced enough to contain a billion antiprotons in a portable device to be driven to another lab for further experimentation.

Cost

Scientists claim that antimatter is the costliest material to make. In 2006, Gerald Smith estimated $250 million could produce 10 milligrams of positrons (equivalent to $25 billion per gram); in 1999, NASA gave a figure of $62.5 trillion per gram of antihydrogen. This is because production is difficult (only very few antiprotons are produced in reactions in particle accelerators) and because there is higher demand for other uses of particle accelerators. According to CERN, it has cost a few hundred million Swiss francs to produce about 1 billionth of a gram (the amount used so far for particle/antiparticle collisions). By comparison, the cost of the Manhattan Project, which produced the first atomic weapon, was estimated at $23 billion in 2007 dollars.

Several studies funded by the NASA Institute for Advanced Concepts are exploring whether it might be possible to use magnetic scoops to collect the antimatter that occurs naturally in the Van Allen belt of the Earth, and ultimately, the belts of gas giants, like Jupiter, hopefully at a lower cost per gram.

Uses

Medical

Matter–antimatter reactions have practical applications in medical imaging, such as positron emission tomography (PET). In positive beta decay, a nuclide loses surplus positive charge by emitting a positron (in the same event, a proton becomes a neutron, and a neutrino is also emitted). Nuclides with surplus positive charge are easily made in a cyclotron and are widely generated for medical use. Antiprotons have also been shown within laboratory experiments to have the potential to treat certain cancers, in a similar method currently used for ion (proton) therapy.

Fuel

Isolated and stored antimatter could be used as a fuel for interplanetary or interstellar travel as part of antimatter-catalyzed nuclear pulse propulsion or another antimatter rocket. Since the energy density of antimatter is higher than that of conventional fuels, an antimatter-fueled spacecraft would have a higher thrust-to-weight ratio than a conventional spacecraft.

If matter–antimatter collisions resulted only in photon emission, the entire rest mass of the particles would be converted to kinetic energy. The energy per unit mass (9×10¹⁶ J/kg) is about 10 orders of magnitude greater than chemical energies, about 3 orders of magnitude greater than the nuclear potential energy that can be liberated, today, using nuclear fission (about 200 MeV per fission reaction or 8×10¹³ J/kg), and about 2 orders of magnitude greater than the best possible results expected from fusion (about 6.3×10¹⁴ J/kg for the proton–proton chain). The reaction of 1 kg of antimatter with 1 kg of matter would produce 1.8×10¹⁷ J (180 petajoules) of energy (by the mass–energy equivalence formula, E = mc²), or the rough equivalent of 43 megatons of TNT – slightly less than the yield of the 27,000 kg Tsar Bomba, the largest thermonuclear weapon ever detonated.
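These figures follow directly from E = mc²; a short sketch of the arithmetic (the constants are standard values, and the TNT conversion is the conventional 4.184×10⁹ J per ton):

    # Energy from annihilating 1 kg of antimatter with 1 kg of matter.
    c = 2.998e8                    # speed of light, m/s
    E = 2.0 * c**2                 # 2 kg of rest mass converted to energy, joules
    tnt_ton = 4.184e9              # joules per ton of TNT

    print(f"{E:.2e} J")                       # ~1.8e17 J (180 PJ)
    print(f"{E / tnt_ton / 1e6:.0f} Mt TNT")  # ~43 megatons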

Not all of that energy can be utilized by any realistic propulsion technology because of the nature of the annihilation products. While electron–positron reactions result in gamma ray photons, these are difficult to direct and use for thrust. In reactions between protons and antiprotons, their energy is converted largely into relativistic neutral and charged pions. The neutral pions decay almost immediately (with a lifetime of 85 attoseconds) into high-energy photons, but the charged pions decay more slowly (with a lifetime of 26 nanoseconds) and can be deflected magnetically to produce thrust.

Charged pions ultimately decay into a combination of neutrinos (carrying about 22% of the energy of the charged pions) and unstable charged muons (carrying about 78% of the charged pion energy), with the muons then decaying into a combination of electrons, positrons and neutrinos (cf. muon decay; the neutrinos from this decay carry about 2/3 of the energy of the muons, meaning that from the original charged pions, the total fraction of their energy converted to neutrinos by one route or another would be about 0.22 + (2/3)⋅0.78 = 0.74).

Weapons

Antimatter has been considered as a trigger mechanism for nuclear weapons. A major obstacle is the difficulty of producing antimatter in large enough quantities, and there is no evidence that it will ever be feasible. Nonetheless, the U.S. Air Force funded studies of the physics of antimatter in the Cold War, and began considering its possible use in weapons, not just as a trigger, but as the explosive itself.

Dark matter

From Wikipedia, the free encyclopedia

Dark matter is a hypothetical form of matter thought to account for approximately 85% of the matter in the universe. Its presence is implied in a variety of astrophysical observations, including gravitational effects that cannot be explained by accepted theories of gravity unless more matter is present than can be seen. For this reason, most experts think that dark matter is abundant in the universe and that it has had a strong influence on its structure and evolution. Dark matter is called dark because it does not appear to interact with the electromagnetic field, which means it does not absorb, reflect or emit electromagnetic radiation, and is therefore difficult to detect.

Primary evidence for dark matter comes from calculations showing that many galaxies would fly apart, or that they would not have formed or would not move as they do, if they did not contain a large amount of unseen matter. Other lines of evidence include observations in gravitational lensing and in the cosmic microwave background, along with astronomical observations of the observable universe's current structure, the formation and evolution of galaxies, mass location during galactic collisions, and the motion of galaxies within galaxy clusters. In the standard Lambda-CDM model of cosmology, the total mass–energy of the universe contains 5% ordinary matter and energy, 27% dark matter and 68% of a form of energy known as dark energy. Thus, dark matter constitutes 85% of total mass, while dark energy plus dark matter constitute 95% of total mass–energy content.

Because dark matter has not yet been observed directly, if it exists, it must barely interact with ordinary baryonic matter and radiation, except through gravity. Most dark matter is thought to be non-baryonic in nature; it may be composed of some as-yet undiscovered subatomic particles. The primary candidate for dark matter is some new kind of elementary particle that has not yet been discovered, in particular, weakly interacting massive particles (WIMPs). Many experiments to directly detect and study dark matter particles are being actively undertaken, but none have yet succeeded. Dark matter is classified as "cold", "warm", or "hot" according to its velocity (more precisely, its free streaming length). Current models favor a cold dark matter scenario, in which structures emerge by gradual accumulation of particles.

Although the existence of dark matter is generally accepted by the scientific community, some astrophysicists, intrigued by certain observations which are not well-explained by standard dark matter, argue for various modifications of the standard laws of general relativity, such as modified Newtonian dynamics, tensor–vector–scalar gravity, or entropic gravity. These models attempt to account for all observations without invoking supplemental non-baryonic matter.

History

Early history

The hypothesis of dark matter has an elaborate history. In a talk given in 1884, Lord Kelvin estimated the number of dark bodies in the Milky Way from the observed velocity dispersion of the stars orbiting around the center of the galaxy. By using these measurements, he estimated the mass of the galaxy, which he determined to be different from the mass of visible stars. Lord Kelvin thus concluded "many of our stars, perhaps a great majority of them, may be dark bodies". In 1906 Henri Poincaré in "The Milky Way and Theory of Gases" used the term "dark matter", or "matière obscure" in French, in discussing Kelvin's work.

The first to suggest the existence of dark matter using stellar velocities was Dutch astronomer Jacobus Kapteyn in 1922. Fellow Dutchman and radio astronomy pioneer Jan Oort also hypothesized the existence of dark matter in 1932. Oort was studying stellar motions in the local galactic neighborhood and found the mass in the galactic plane must be greater than what was observed, but this measurement was later determined to be erroneous.

In 1933, Swiss astrophysicist Fritz Zwicky, who studied galaxy clusters while working at the California Institute of Technology, made a similar inference. Zwicky applied the virial theorem to the Coma Cluster and obtained evidence of unseen mass he called dunkle Materie ('dark matter'). Zwicky estimated its mass based on the motions of galaxies near its edge and compared that to an estimate based on its brightness and number of galaxies. He estimated the cluster had about 400 times more mass than was visually observable. The gravity effect of the visible galaxies was far too small for such fast orbits, thus mass must be hidden from view. Based on these conclusions, Zwicky inferred some unseen matter provided the mass and associated gravitation attraction to hold the cluster together. Zwicky's estimates were off by more than an order of magnitude, mainly due to an obsolete value of the Hubble constant; the same calculation today shows a smaller fraction, using greater values for luminous mass. Nonetheless, Zwicky did correctly conclude from his calculation that the bulk of the matter was dark.

Further indications the mass-to-light ratio was not unity came from measurements of galaxy rotation curves. In 1939, Horace W. Babcock reported the rotation curve for the Andromeda nebula (known now as the Andromeda Galaxy), which suggested the mass-to-luminosity ratio increases radially. He attributed it to either light absorption within the galaxy or modified dynamics in the outer portions of the spiral and not to the missing matter he had uncovered. Following Babcock's 1939 report of unexpectedly rapid rotation in the outskirts of the Andromeda galaxy and a mass-to-light ratio of 50, in 1940 Jan Oort discovered and wrote about the large non-visible halo of NGC 3115.

1970s

Vera Rubin, Kent Ford, and Ken Freeman's work in the 1960s and 1970s provided further strong evidence, also using galaxy rotation curves. Rubin and Ford worked with a new spectrograph to measure the velocity curve of edge-on spiral galaxies with greater accuracy. This result was confirmed in 1978. An influential paper presented Rubin and Ford's results in 1980. They showed most galaxies must contain about six times as much dark as visible mass; thus, by around 1980 the apparent need for dark matter was widely recognized as a major unsolved problem in astronomy.

At the same time Rubin and Ford were exploring optical rotation curves, radio astronomers were making use of new radio telescopes to map the 21 cm line of atomic hydrogen in nearby galaxies. The radial distribution of interstellar atomic hydrogen (H-I) often extends to much larger galactic radii than those accessible by optical studies, extending the sampling of rotation curves – and thus of the total mass distribution – to a new dynamical regime. Early mapping of Andromeda with the 300 foot telescope at Green Bank and the 250 foot dish at Jodrell Bank already showed the H-I rotation curve did not trace the expected Keplerian decline. As more sensitive receivers became available, Morton Roberts and Robert Whitehurst were able to trace the rotational velocity of Andromeda to 30 kpc, far beyond the optical measurements. Illustrating the advantage of tracing the gas disk at large radii, Figure 16 of that paper combines the optical data (the cluster of points at radii of less than 15 kpc with a single point further out) with the H-I data between 20–30 kpc, exhibiting the flatness of the outer galaxy rotation curve; the solid curve peaking at the center is the optical surface density, while the other curve shows the cumulative mass, still rising linearly at the outermost measurement.

In parallel, the use of interferometric arrays for extragalactic H-I spectroscopy was being developed. In 1972, David Rogstad and Seth Shostak published H-I rotation curves of five spirals mapped with the Owens Valley interferometer; the rotation curves of all five were very flat, suggesting very large values of mass-to-light ratio in the outer parts of their extended H-I disks.

A stream of observations in the 1980s supported the presence of dark matter, including gravitational lensing of background objects by galaxy clusters, the temperature distribution of hot gas in galaxies and clusters, and the pattern of anisotropies in the cosmic microwave background. According to consensus among cosmologists, dark matter is composed primarily of a not yet characterized type of subatomic particle. The search for this particle, by a variety of means, is one of the major efforts in particle physics.

Technical definition

In standard cosmology, matter is anything whose energy density scales with the inverse cube of the scale factor, i.e., ρ ∝ a⁻³. This is in contrast to radiation, which scales as the inverse fourth power of the scale factor, ρ ∝ a⁻⁴, and a cosmological constant, which is independent of a. These scalings can be understood intuitively: for an ordinary particle in a cubical box, doubling the length of the sides of the box decreases the density (and hence energy density) by a factor of 8 (= 2³). For radiation, the energy density decreases by a factor of 16 (= 2⁴), because in addition to the volume dilution, the expansion redshifts each photon, reducing its energy in proportion to the scale factor. A cosmological constant, as an intrinsic property of space, has a constant energy density regardless of the volume under consideration.
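One consequence worth making explicit (a small sketch; the two density parameters are standard present-day values, assumed here for illustration): because matter dilutes as a⁻³ and radiation as a⁻⁴, their ratio grows in proportion to a, which fixes the epoch of matter–radiation equality.

    # rho_m ∝ a^-3 and rho_r ∝ a^-4, so rho_m / rho_r ∝ a.
    # Equality: Omega_m * a_eq**-3 == Omega_r * a_eq**-4  =>  a_eq = Omega_r / Omega_m
    Omega_m = 0.31       # present-day matter density parameter (assumed)
    Omega_r = 9.2e-5     # present-day radiation density parameter (assumed)

    a_eq = Omega_r / Omega_m
    print(f"a_eq ≈ {a_eq:.1e}, redshift z_eq ≈ {1 / a_eq - 1:.0f}")  # z ≈ 3400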

In principle, "dark matter" means all components of the universe which are not visible but still obey ρ ∝ a⁻³. In practice, the term "dark matter" is often used to mean only the non-baryonic component of dark matter, i.e., excluding "missing baryons". Context will usually indicate which meaning is intended.

Observational evidence

This artist's impression shows the expected distribution of dark matter in the Milky Way galaxy as a blue halo of material surrounding the galaxy.

Galaxy rotation curves

Rotation curve of a typical spiral galaxy: predicted (A) and observed (B). Dark matter can explain the 'flat' appearance of the velocity curve out to a large radius.

The arms of spiral galaxies rotate around the galactic center. The luminous mass density of a spiral galaxy decreases as one goes from the center to the outskirts. If luminous mass were all the matter, then we could model the galaxy as a point mass in the centre with test masses orbiting around it, similar to the Solar System. From Kepler's laws, it is expected that the rotation velocities will decrease with distance from the center (v ∝ r^(−1/2)), similar to the Solar System. This is not observed. Instead, the galaxy rotation curve remains flat as distance from the center increases.

If Kepler's laws are correct, then the obvious way to resolve this discrepancy is to conclude the mass distribution in spiral galaxies is not similar to that of the Solar System. In particular, there is a lot of non-luminous matter (dark matter) in the outskirts of the galaxy.
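To make the discrepancy concrete, here is a small illustrative sketch (all numbers are assumed for illustration, not fitted to any real galaxy): a central point mass predicts v ∝ r^(−1/2), while a flat curve at roughly 220 km/s implies an enclosed mass M(r) = v²r/G that grows linearly with radius.

    # Keplerian expectation vs. a flat rotation curve, and the mass each implies.
    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_lum = 2.0e41       # assumed luminous mass, kg (~1e11 solar masses)
    kpc = 3.086e19       # metres per kiloparsec

    for r_kpc in (5, 10, 20, 40):
        r = r_kpc * kpc
        v_kepler = math.sqrt(G * M_lum / r)   # point-mass (Keplerian) prediction
        v_flat = 2.2e5                        # observed flat curve, ~220 km/s
        M_enclosed = v_flat**2 * r / G        # mass needed to sustain the flat curve
        print(f"r = {r_kpc:2d} kpc: Keplerian v = {v_kepler / 1e3:4.0f} km/s, "
              f"flat-curve mass = {M_enclosed / M_lum:4.1f} x luminous")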

Velocity dispersions

Stars in bound systems must obey the virial theorem. The theorem, together with the measured velocity distribution, can be used to measure the mass distribution in a bound system, such as elliptical galaxies or globular clusters. With some exceptions, velocity dispersion estimates of elliptical galaxies do not match the predicted velocity dispersion from the observed mass distribution, even assuming complicated distributions of stellar orbits.
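Schematically, the estimate follows a standard order-of-magnitude form of the virial theorem (a textbook relation, not from the article; σ is the measured line-of-sight velocity dispersion and R a characteristic radius of the system):

    M \approx \frac{\sigma^2 R}{G}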

As with galaxy rotation curves, the obvious way to resolve the discrepancy is to postulate the existence of non-luminous matter.

Galaxy clusters

Galaxy clusters are particularly important for dark matter studies since their masses can be estimated in three independent ways:

  • From the scatter in radial velocities of the galaxies within clusters
  • From X-rays emitted by hot gas in the clusters. From the X-ray energy spectrum and flux, the gas temperature and density can be estimated, hence giving the pressure; assuming pressure and gravity balance determines the cluster's mass profile (see the estimator after this list).
  • Gravitational lensing (usually of more distant galaxies) can measure cluster masses without relying on observations of dynamics (e.g., velocity).
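For the X-ray method, a standard textbook form of the hydrostatic-equilibrium mass estimator (assuming spherical symmetry; μ is the mean molecular weight and m_p the proton mass) is:

    M(<r) = -\frac{k_B T(r)\, r}{G \mu m_p}\left(\frac{d\ln\rho_{\mathrm{gas}}}{d\ln r} + \frac{d\ln T}{d\ln r}\right)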

Generally, these three methods are in reasonable agreement that dark matter outweighs visible matter by approximately 5 to 1.

Gravitational lensing

Strong gravitational lensing as observed by the Hubble Space Telescope in Abell 1689 indicates the presence of dark matter; the lensing arcs are visible around the cluster.
 
Models of rotating disc galaxies in the present day (left) and ten billion years ago (right). In the present-day galaxy, dark matter – shown in red – is more concentrated near the center and it rotates more rapidly (effect exaggerated).
 
Dark matter map for a patch of sky based on gravitational lensing analysis of a Kilo-Degree survey.

One of the consequences of general relativity is that massive objects (such as a cluster of galaxies) lying between a more distant source (such as a quasar) and an observer should act as a lens to bend the light from this source. The more massive an object, the more lensing is observed.

Strong lensing is the observed distortion of background galaxies into arcs when their light passes through such a gravitational lens. It has been observed around many distant clusters including Abell 1689. By measuring the distortion geometry, the mass of the intervening cluster can be obtained. In the dozens of cases where this has been done, the mass-to-light ratios obtained correspond to the dynamical dark matter measurements of clusters. Lensing can lead to multiple copies of an image. By analyzing the distribution of multiple image copies, scientists have been able to deduce and map the distribution of dark matter around the MACS J0416.1-2403 galaxy cluster.

Weak gravitational lensing investigates minute distortions of galaxies, using statistical analyses from vast galaxy surveys. By examining the apparent shear deformation of the adjacent background galaxies, the mean distribution of dark matter can be characterized. The mass-to-light ratios correspond to dark matter densities predicted by other large-scale structure measurements. Dark matter does not bend light itself; mass (in this case the mass of the dark matter) bends spacetime. Light follows the curvature of spacetime, resulting in the lensing effect.

In May 2021, a new detailed dark matter map was revealed by the Dark Energy Survey Collaboration. In addition, using a machine learning method, the map revealed previously undiscovered filamentary structures connecting galaxies.

Cosmic microwave background

Although both dark matter and ordinary matter are matter, they do not behave in the same way. In particular, in the early universe, ordinary matter was ionized and interacted strongly with radiation via Thomson scattering. Dark matter does not interact directly with radiation, but it does affect the CMB by its gravitational potential (mainly on large scales), and by its effects on the density and velocity of ordinary matter. Ordinary and dark matter perturbations, therefore, evolve differently with time and leave different imprints on the cosmic microwave background (CMB).

The cosmic microwave background is very close to a perfect blackbody but contains very small temperature anisotropies of a few parts in 100,000. A sky map of anisotropies can be decomposed into an angular power spectrum, which is observed to contain a series of acoustic peaks at near-equal spacing but different heights. The series of peaks can be predicted for any assumed set of cosmological parameters by modern computer codes such as CMBFAST and CAMB, and matching theory to data, therefore, constrains cosmological parameters. The first peak mostly shows the density of baryonic matter, while the third peak relates mostly to the density of dark matter; together, the peaks constrain the density of matter and the density of atoms.
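As a sketch of how such codes are driven in practice (assuming the CAMB Python wrapper; the parameter values below are illustrative, roughly Planck-like, and not taken from the article):

    # Sketch: compute a theoretical CMB TT power spectrum with the CAMB wrapper.
    import camb

    pars = camb.set_params(H0=67.5,       # Hubble constant, km/s/Mpc
                           ombh2=0.022,   # physical baryon density
                           omch2=0.122,   # physical cold dark matter density
                           ns=0.965,      # scalar spectral index
                           As=2.1e-9)     # scalar amplitude
    results = camb.get_results(pars)
    powers = results.get_cmb_power_spectra(pars, CMB_unit='muK')
    cl_tt = powers['total'][:, 0]         # TT spectrum; the acoustic peaks live here
    print(cl_tt[200:230])                 # values around the first peak (l ~ 220)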

The CMB anisotropy was first discovered by COBE in 1992, though it had too coarse a resolution to detect the acoustic peaks. After the discovery of the first acoustic peak by the balloon-borne BOOMERanG experiment in 2000, the power spectrum was precisely observed by WMAP in 2003–2012, and even more precisely by the Planck spacecraft in 2013–2015. The results support the Lambda-CDM model.

The observed CMB angular power spectrum provides powerful evidence in support of dark matter, as its precise structure is well fitted by the Lambda-CDM model, but difficult to reproduce with any competing model such as modified Newtonian dynamics (MOND).

Structure formation

Mass map
3-D map of the large-scale distribution of dark matter, reconstructed from measurements of weak gravitational lensing with the Hubble Space Telescope.

Structure formation refers to the period after the Big Bang when density perturbations collapsed to form stars, galaxies, and clusters. Prior to structure formation, the Friedmann solutions to general relativity describe a homogeneous universe. Later, small anisotropies gradually grew and condensed the homogeneous universe into stars, galaxies and larger structures. Ordinary matter is affected by radiation, which is the dominant component of the universe at very early times. As a result, its density perturbations are washed out and unable to condense into structure. If there were only ordinary matter in the universe, there would not have been enough time for density perturbations to grow into the galaxies and clusters currently seen.

Dark matter provides a solution to this problem because it is unaffected by radiation. Therefore, its density perturbations can grow first. The resulting gravitational potential acts as an attractive potential well for ordinary matter collapsing later, speeding up the structure formation process.

Bullet Cluster

If dark matter does not exist, then the next most likely explanation must be that general relativity – the prevailing theory of gravity – is incorrect and should be modified. The Bullet Cluster, the result of a recent collision of two galaxy clusters, provides a challenge for modified gravity theories because its apparent center of mass is far displaced from the baryonic center of mass. Standard dark matter models can easily explain this observation, but modified gravity has a much harder time, especially since the observational evidence is model-independent.

Type Ia supernova distance measurements

Type Ia supernovae can be used as standard candles to measure extragalactic distances, which can in turn be used to measure how fast the universe has expanded in the past. Data indicate the universe is expanding at an accelerating rate, the cause of which is usually ascribed to dark energy. Since observations indicate the universe is almost flat, it is expected the total energy density of everything in the universe should sum to 1 (Ω_tot ≈ 1). The measured dark energy density is Ω_Λ ≈ 0.690; the observed ordinary (baryonic) matter energy density is Ω_b ≈ 0.0482 and the energy density of radiation is negligible. This leaves a missing Ω_dm ≈ 0.258 which nonetheless behaves like matter (see technical definition section above) – dark matter.
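The bookkeeping is a one-line subtraction (the small radiation term and rounding account for the slight difference from the quoted 0.258):

    \Omega_{dm} \approx \Omega_{tot} - \Omega_\Lambda - \Omega_b \approx 1 - 0.690 - 0.048 \approx 0.26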

Sky surveys and baryon acoustic oscillations

Baryon acoustic oscillations (BAO) are fluctuations in the density of the visible baryonic matter (normal matter) of the universe on large scales. These are predicted to arise in the Lambda-CDM model due to acoustic oscillations in the photon–baryon fluid of the early universe, and can be observed in the cosmic microwave background angular power spectrum. BAOs set up a preferred length scale for baryons. As the dark matter and baryons clumped together after recombination, the effect is much weaker in the galaxy distribution in the nearby universe, but is detectable as a subtle (≈1 percent) preference for pairs of galaxies to be separated by 147 Mpc, compared to those separated by 130–160 Mpc. This feature was predicted theoretically in the 1990s and then discovered in 2005, in two large galaxy redshift surveys, the Sloan Digital Sky Survey and the 2dF Galaxy Redshift Survey. Combining the CMB observations with BAO measurements from galaxy redshift surveys provides a precise estimate of the Hubble constant and the average matter density in the Universe. The results support the Lambda-CDM model.

Redshift-space distortions

Large galaxy redshift surveys may be used to make a three-dimensional map of the galaxy distribution. These maps are slightly distorted because distances are estimated from observed redshifts; the redshift contains a contribution from the galaxy's so-called peculiar velocity in addition to the dominant Hubble expansion term. On average, superclusters are expanding more slowly than the cosmic mean due to their gravity, while voids are expanding faster than average. In a redshift map, galaxies in front of a supercluster have excess radial velocities towards it and have redshifts slightly higher than their distance would imply, while galaxies behind the supercluster have redshifts slightly lower than their distance would imply. This effect causes superclusters to appear squashed in the radial direction, and likewise voids are stretched. Their angular positions are unaffected. This effect is not detectable for any one structure since the true shape is not known, but can be measured by averaging over many structures. It was predicted quantitatively by Nick Kaiser in 1987, and first decisively measured in 2001 by the 2dF Galaxy Redshift Survey. Results are in agreement with the Lambda-CDM model.
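In the linear regime, Kaiser's prediction is usually summarized by a compact formula (standard notation, not from the article: μ is the cosine of the angle between the wavevector and the line of sight, and β ≈ f/b combines the growth rate f with the galaxy bias b):

    P_s(k, \mu) = \left(1 + \beta \mu^2\right)^2 P_r(k)

where P_s and P_r are the redshift-space and real-space power spectra.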

Lyman-alpha forest

In astronomical spectroscopy, the Lyman-alpha forest is the sum of the absorption lines arising from the Lyman-alpha transition of neutral hydrogen in the spectra of distant galaxies and quasars. Lyman-alpha forest observations can also constrain cosmological models. These constraints agree with those obtained from WMAP data.

Theoretical classifications

Composition

There are various hypotheses about what dark matter could consist of, as set out in the table below.

Unsolved problem in physics:

What is dark matter? How was it generated?

Some dark matter hypotheses
  • Light bosons: quantum chromodynamics axions, axion-like particles, fuzzy cold dark matter
  • Neutrinos: Standard Model neutrinos, sterile neutrinos
  • Weak scale: supersymmetry, extra dimensions, little Higgs, effective field theory, simplified models
  • Other particles: weakly interacting massive particles (WIMPs), self-interacting dark matter, superfluid vacuum theory
  • Macroscopic: primordial black holes, massive compact halo objects (MACHOs), macroscopic dark matter (Macros)
  • Modified gravity (MOG): modified Newtonian dynamics (MOND), tensor–vector–scalar gravity (TeVeS), entropic gravity

Dark matter can refer to any substance which interacts predominantly via gravity with visible matter (e.g., stars and planets). Hence in principle it need not be composed of a new type of fundamental particle but could, at least in part, be made up of standard baryonic matter, such as protons or neutrons. However, for the reasons outlined below, most scientists think the dark matter is dominated by a non-baryonic component, which is likely composed of a currently unknown fundamental particle (or similar exotic state).

Fermi-LAT observations of dwarf galaxies provide new insights on dark matter.

Baryonic matter

Baryons (protons and neutrons) make up ordinary stars and planets. However, baryonic matter also encompasses less common non-primordial black holes, neutron stars, faint old white dwarfs and brown dwarfs, collectively known as massive compact halo objects (MACHOs), which can be hard to detect.

However, multiple lines of evidence suggest the majority of dark matter is not made of baryons:

  • Sufficient diffuse, baryonic gas or dust would be visible when backlit by stars.
  • The theory of Big Bang nucleosynthesis predicts the observed abundance of the chemical elements. If there are more baryons, then there should also be more helium, lithium and heavier elements synthesized during the Big Bang. Agreement with observed abundances requires that baryonic matter makes up between 4% and 5% of the universe's critical density. In contrast, large-scale structure and other observations indicate that the total matter density is about 30% of the critical density.
  • Astronomical searches for gravitational microlensing in the Milky Way found at most only a small fraction of the dark matter may be in dark, compact, conventional objects (MACHOs, etc.); the excluded range of object masses is from half the Earth's mass up to 30 solar masses, which covers nearly all the plausible candidates.
  • Detailed analysis of the small irregularities (anisotropies) in the cosmic microwave background: observations by WMAP and Planck indicate that around five-sixths of the total matter is in a form that interacts with ordinary matter or photons only through gravitational effects.

Non-baryonic matter

Candidates for non-baryonic dark matter are hypothetical particles such as axions, sterile neutrinos, weakly interacting massive particles (WIMPs), gravitationally-interacting massive particles (GIMPs), supersymmetric particles, geons, or primordial black holes. The three neutrino types already observed are indeed abundant, and dark, and matter, but because their individual masses – however uncertain they may be – are almost certainly too tiny, they can only supply a small fraction of dark matter, due to limits derived from large-scale structure and high-redshift galaxies.

Unlike baryonic matter, nonbaryonic matter did not contribute to the formation of the elements in the early universe (Big Bang nucleosynthesis) and so its presence is revealed only via its gravitational effects, or weak lensing. In addition, if the particles of which it is composed are supersymmetric, they can undergo annihilation interactions with themselves, possibly resulting in observable by-products such as gamma rays and neutrinos (indirect detection).

Dark matter aggregation and dense dark matter objects

If dark matter is composed of weakly-interacting particles, then an obvious question is whether it can form objects equivalent to planets, stars, or black holes. Historically, the answer has been it cannot, because of two factors:

It lacks an efficient means to lose energy
Ordinary matter forms dense objects because it has numerous ways to lose energy. Losing energy would be essential for object formation, because a particle that gains energy during compaction or falling "inward" under gravity, and cannot lose it any other way, will heat up and increase velocity and momentum. Dark matter appears to lack a means to lose energy, simply because it does not interact strongly through anything except gravity. The virial theorem suggests that such a particle would not stay bound to the gradually forming object – as the object began to form and compact, the dark matter particles within it would speed up and tend to escape.
 
It lacks a range of interactions needed to form structures
Ordinary matter interacts in many different ways, which allows the matter to form more complex structures. For example, stars form through gravity, but the particles within them interact and can emit energy in the form of neutrinos and electromagnetic radiation through fusion when they become energetic enough. Protons and neutrons can bind via the strong interaction and then form atoms with electrons largely through electromagnetic interaction. There is no evidence that dark matter is capable of such a wide variety of interactions, since it seems to only interact through gravity (and possibly through some means no stronger than the weak interaction, although until dark matter is better understood, this is only speculation).

In 2015–2017, the idea that dense dark matter was composed of primordial black holes made a comeback following results of gravitational wave measurements which detected the merger of intermediate-mass black holes. Black holes with about 30 solar masses are not predicted to form by either stellar collapse (typically less than 15 solar masses) or by the merger of black holes in galactic centers (millions or billions of solar masses). It was proposed that the intermediate-mass black holes causing the detected merger formed in the hot dense early phase of the universe due to denser regions collapsing. A later survey of about a thousand supernovae detected no gravitational lensing events, when about eight would be expected if intermediate-mass primordial black holes above a certain mass range accounted for the majority of dark matter.

The possibility that atom-sized primordial black holes account for a significant fraction of dark matter was ruled out by measurements of positron and electron fluxes outside the Sun's heliosphere by the Voyager 1 spacecraft. Tiny black holes are theorized to emit Hawking radiation. However the detected fluxes were too low and did not have the expected energy spectrum, suggesting that tiny primordial black holes are not widespread enough to account for dark matter. Nonetheless, research and theories proposing dense dark matter accounts for dark matter continue as of 2018, including approaches to dark matter cooling, and the question remains unsettled. In 2019, the lack of microlensing effects in observations of Andromeda suggested that tiny black holes do not exist.

However, there still exists a largely unconstrained mass range smaller than that which can be limited by optical microlensing observations, where primordial black holes may account for all dark matter.

Free streaming length

Dark matter can be divided into cold, warm, and hot categories. These categories refer to velocity rather than an actual temperature, indicating how far corresponding objects moved due to random motions in the early universe, before they slowed due to cosmic expansion – this is an important distance called the free streaming length (FSL). Primordial density fluctuations smaller than this length get washed out as particles spread from overdense to underdense regions, while larger fluctuations are unaffected; therefore this length sets a minimum scale for later structure formation.

The categories are set with respect to the size of a protogalaxy (an object that later evolves into a dwarf galaxy): dark matter particles are classified as cold, warm, or hot according to whether their FSL is much smaller than (cold), similar to (warm), or much larger than (hot) a protogalaxy. Mixtures of the above are also possible: a theory of mixed dark matter was popular in the mid-1990s, but was rejected following the discovery of dark energy.

Cold dark matter leads to bottom-up structure formation, with galaxies forming first and galaxy clusters at a later stage, while hot dark matter would result in a top-down formation scenario with large matter aggregations forming early and later fragmenting into separate galaxies; the latter is excluded by high-redshift galaxy observations.

Fluctuation spectrum effects

These categories also correspond to fluctuation spectrum effects and the interval following the Big Bang at which each type became non-relativistic. Davis et al. wrote in 1985:

Candidate particles can be grouped into three categories on the basis of their effect on the fluctuation spectrum (Bond et al. 1983). If the dark matter is composed of abundant light particles which remain relativistic until shortly before recombination, then it may be termed "hot". The best candidate for hot dark matter is a neutrino ... A second possibility is for the dark matter particles to interact more weakly than neutrinos, to be less abundant, and to have a mass of order 1 keV. Such particles are termed "warm dark matter", because they have lower thermal velocities than massive neutrinos ... there are at present few candidate particles which fit this description. Gravitinos and photinos have been suggested (Pagels and Primack 1982; Bond, Szalay and Turner 1982) ... Any particles which became nonrelativistic very early, and so were able to diffuse a negligible distance, are termed "cold" dark matter (CDM). There are many candidates for CDM including supersymmetric particles.

— M. Davis, G. Efstathiou, C.S. Frenk, and S.D.M. White, The evolution of large-scale structure in a universe dominated by cold dark matter

Alternative definitions

Another approximate dividing line is that warm dark matter became non-relativistic when the universe was approximately 1 year old and one millionth of its present size, during the radiation-dominated era (photons and neutrinos), with a photon temperature of 2.7 million kelvins. Standard physical cosmology gives the particle horizon size as 2ct (speed of light multiplied by time) in the radiation-dominated era, thus 2 light-years. A region of this size would expand to 2 million light-years today (absent structure formation). The actual FSL is approximately 5 times this length, since it continues to grow slowly as particle velocities decrease inversely with the scale factor after the particles become non-relativistic. In this example the FSL would correspond to 10 million light-years, or 3 megaparsecs, today, around the size of a region containing an average large galaxy.
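The order-of-magnitude arithmetic in this paragraph can be checked directly. The sketch below simply chains the numbers quoted above (2 light-years at one year, a 10^6 expansion factor, and a factor of roughly 5 of later growth):

```python
# Sketch reproducing the order-of-magnitude estimate in the text:
# a particle that becomes non-relativistic at t ~ 1 year has travelled
# roughly the particle-horizon distance 2ct, which cosmic expansion
# stretches to a comoving free-streaming length today.
t_years = 1.0                # time at which the particle slows (yr)
horizon_ly = 2.0 * t_years   # particle horizon ~ 2ct = 2 light-years
expansion = 1.0e6            # universe was 1/10^6 of its present size
growth_after = 5.0           # slow growth after becoming non-relativistic

fsl_ly = horizon_ly * expansion * growth_after   # ~10^7 light-years
fsl_mpc = fsl_ly / 3.262e6                       # 1 Mpc = 3.262e6 ly
print(f"FSL today ~ {fsl_ly:.1e} ly = {fsl_mpc:.1f} Mpc")
# -> ~1e7 light-years, about 3 Mpc, matching the figure quoted above.
```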

The 2.7 million K photon temperature gives a typical photon energy of 250 electronvolts, thereby setting a typical mass scale for warm dark matter: particles much more massive than this, such as GeV–TeV mass WIMPs, would become non-relativistic much earlier than one year after the Big Bang and thus have FSLs much smaller than a protogalaxy, making them cold. Conversely, much lighter particles, such as neutrinos with masses of only a few eV, have FSLs much larger than a protogalaxy, thus qualifying them as hot.
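A minimal sketch of this classification follows, assuming the 2.7 million K dividing temperature quoted above; the factor-of-100 margins between the cold/warm/hot regimes are a choice made here for illustration, not taken from the text:

```python
# Sketch of the mass-scale argument: the typical photon energy k_B * T
# at T = 2.7 million K sets the dividing line near a few hundred eV.
K_B_EV = 8.617e-5            # Boltzmann constant in eV/K

def classify(mass_ev, t_kelvin=2.7e6):
    """Rough cold/warm/hot label from mass vs. thermal energy at t ~ 1 yr."""
    e_typ = K_B_EV * t_kelvin          # ~230 eV, "250 eV" after rounding
    if mass_ev > 100 * e_typ:          # illustrative factor-of-100 margins
        return "cold (non-relativistic well before 1 year)"
    if mass_ev < e_typ / 100:
        return "hot (relativistic long after 1 year)"
    return "warm (non-relativistic near 1 year)"

for name, mass_ev in [("100 GeV WIMP", 1e11),
                      ("1 keV sterile neutrino", 1e3),
                      ("0.1 eV neutrino", 0.1)]:
    print(f"{name:24s} -> {classify(mass_ev)}")
```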

Cold dark matter

Cold dark matter offers the simplest explanation for most cosmological observations. It is dark matter composed of constituents with an FSL much smaller than a protogalaxy. This is the focus of dark matter research, as hot dark matter does not seem capable of supporting galaxy or galaxy cluster formation, and most particle candidates became non-relativistic early.

The constituents of cold dark matter are unknown. Possibilities range from large objects like MACHOs (such as black holes and preon stars) or RAMBOs (such as clusters of brown dwarfs) to new particles such as WIMPs and axions.

Studies of Big Bang nucleosynthesis and gravitational lensing convinced most cosmologists that MACHOs cannot make up more than a small fraction of dark matter. According to A. Peter: "... the only really plausible dark-matter candidates are new particles."

The 1997 DAMA/NaI experiment and its successor DAMA/LIBRA in 2013 claimed to directly detect dark matter particles passing through the Earth, but many researchers remain skeptical, as negative results from similar experiments seem incompatible with the DAMA results.

Many supersymmetric models offer dark matter candidates in the form of the WIMPy lightest supersymmetric particle (LSP). Separately, heavy sterile neutrinos exist in non-supersymmetric extensions to the Standard Model that explain the small neutrino mass through the seesaw mechanism.

Warm dark matter

Warm dark matter comprises particles with an FSL comparable to the size of a protogalaxy. Predictions based on warm dark matter are similar to those for cold dark matter on large scales, but with fewer small-scale density perturbations. This reduces the predicted abundance of dwarf galaxies and may lead to lower density of dark matter in the central parts of large galaxies. Some researchers consider this a better fit to observations. A challenge for this model is the lack of particle candidates with the required mass of ≈ 300 eV to 3000 eV.

No known particles can be categorized as warm dark matter. A postulated candidate is the sterile neutrino: a heavier, slower form of neutrino that does not interact through the weak force, unlike other neutrinos. Some modified gravity theories, such as scalar–tensor–vector gravity, require "warm" dark matter to make their equations work.

Hot dark matter

Hot dark matter consists of particles whose FSL is much larger than the size of a protogalaxy. The neutrino qualifies as such a particle. Neutrinos were discovered independently, long before the hunt for dark matter: they were postulated in 1930 and detected in 1956. A neutrino's mass is less than 10−6 that of an electron. Neutrinos interact with normal matter only via gravity and the weak force, making them difficult to detect (the weak force only works over a small distance, so a neutrino triggers a weak force event only if it hits a nucleus head-on). This makes them "weakly interacting slender particles" (WISPs), as opposed to WIMPs.

The three known flavours of neutrinos are the electron, muon, and tau neutrinos. Their masses are slightly different. Neutrinos oscillate among the flavours as they move. It is hard to determine an exact upper bound on the collective average mass of the three neutrinos (or for any of the three individually). For example, if the average neutrino mass were over 50 eV/c2 (less than 10−5 of the mass of an electron), the universe would collapse. CMB data and other methods indicate that their average mass probably does not exceed 0.3 eV/c2. Thus, observed neutrinos cannot explain dark matter.
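To see why these bounds rule neutrinos out as the dark matter, one can use the standard relic-density relation Omega_nu * h^2 = (sum of the three masses) / 93.14 eV from textbook cosmology; the Hubble parameter h ≈ 0.674 used below is an assumption, not a figure from the text:

```python
# Sketch of why light neutrinos cannot be all of dark matter: the
# standard relic-density relation Omega_nu * h^2 = (sum of masses) /
# 93.14 eV gives only a small fraction of the observed matter density
# for sub-eV masses. h ~ 0.674 is assumed here.
H = 0.674                     # dimensionless Hubble parameter (assumed)

def omega_nu(sum_masses_ev):
    """Fraction of the critical density in relic neutrinos."""
    return sum_masses_ev / 93.14 / H**2

for avg_mass_ev in (0.3, 50.0):        # the two bounds quoted in the text
    frac = omega_nu(3 * avg_mass_ev)   # three flavours
    print(f"avg mass {avg_mass_ev:5.1f} eV -> Omega_nu ~ {frac:.3f}")
# ~0.02 for 0.3 eV, far below the ~0.26 attributed to dark matter,
# while 50 eV would overclose the universe (the collapse argument).
```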

Because galaxy-size density fluctuations get washed out by free-streaming, hot dark matter implies the first objects that can form are huge supercluster-size pancakes, which then fragment into galaxies. Deep-field observations show instead that galaxies formed first, followed by clusters and superclusters as galaxies clump together.

Detection of dark matter particles

If dark matter is made up of subatomic particles, then millions, possibly billions, of such particles must pass through every square centimeter of the Earth each second. Many experiments aim to test this hypothesis. Although WIMPs are popular search candidates, the Axion Dark Matter Experiment (ADMX) searches for axions. Other candidates are heavy hidden-sector particles that interact with ordinary matter only via gravity.

These experiments can be divided into two classes: direct detection experiments, which search for the scattering of dark matter particles off atomic nuclei within a detector; and indirect detection experiments, which look for the products of dark matter particle annihilations or decays.

Direct detection

Direct detection experiments aim to observe low-energy recoils (typically a few keV) of nuclei induced by interactions with particles of dark matter, which (in theory) are passing through the Earth. After such a recoil, the nucleus emits energy in the form of scintillation light or phonons, which can be picked up by sensitive detection apparatus. To do this effectively, it is crucial to maintain a low background, and so such experiments operate deep underground to reduce the interference from cosmic rays. Examples of underground laboratories with direct detection experiments include the Stawell mine, the Soudan mine, the SNOLAB underground laboratory at Sudbury, the Gran Sasso National Laboratory, the Canfranc Underground Laboratory, the Boulby Underground Laboratory, the Deep Underground Science and Engineering Laboratory and the China Jinping Underground Laboratory.
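For a sense of why the recoils are only a few keV, elastic-scattering kinematics gives a maximum recoil energy E_R = 2 μ² v² / m_N, with μ the WIMP–nucleus reduced mass; the WIMP mass, target nucleus, and halo speed in this sketch are illustrative assumptions:

```python
# Sketch of the recoil-energy scale: for elastic scattering, the maximum
# nuclear recoil energy is E_R = 2 * mu^2 * v^2 / m_N, with mu the
# WIMP-nucleus reduced mass. Masses and speed below are illustrative.
C_KM_S = 2.998e5              # speed of light, km/s

def max_recoil_kev(m_wimp_gev, m_nucleus_gev, v_km_s):
    mu = m_wimp_gev * m_nucleus_gev / (m_wimp_gev + m_nucleus_gev)
    beta = v_km_s / C_KM_S                     # v/c
    e_r_gev = 2.0 * mu**2 * beta**2 / m_nucleus_gev
    return e_r_gev * 1e6                       # GeV -> keV

# 100 GeV WIMP on a xenon nucleus (A = 131, ~122 GeV) at ~230 km/s:
print(f"{max_recoil_kev(100.0, 122.0, 230.0):.1f} keV max recoil")
# -> roughly 30 keV at most; typical recoils are a few keV, as stated.
```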

These experiments mostly use either cryogenic or noble liquid detector technologies. Cryogenic detectors, operating at temperatures below 100 mK, detect the heat produced when a particle hits an atom in a crystal absorber such as germanium. Noble liquid detectors detect scintillation produced by a particle collision in liquid xenon or argon. Cryogenic detector experiments include CDMS, CRESST, EDELWEISS, and EURECA. Noble liquid experiments include ZEPLIN, XENON, DEAP, ArDM, WARP, DarkSide, PandaX, and LUX, the Large Underground Xenon experiment. Both of these techniques depend strongly on their ability to distinguish background particles (which predominantly scatter off electrons) from dark matter particles (which scatter off nuclei). Other experiments include SIMPLE and PICASSO.

To date there has been no well-established claim of dark matter detection from a direct detection experiment; instead, such experiments have placed strong upper limits on the interaction cross section of dark matter particles with nucleons as a function of particle mass. The DAMA/NaI and more recent DAMA/LIBRA experimental collaborations have detected an annual modulation in the rate of events in their detectors, which they claim is due to dark matter. This follows from the expectation that as the Earth orbits the Sun, the velocity of the detector relative to the dark matter halo will vary by a small amount. This claim is so far unconfirmed and contradicted by negative results from other experiments such as LUX, SuperCDMS and XENON100.
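A minimal sketch of the annual-modulation signature follows; only the cosine form and the early-June phase follow from the orbital argument above, while the rate values are placeholders, not DAMA's measurements:

```python
# Minimal sketch of an annual-modulation signature: the Earth's orbital
# velocity adds to or subtracts from the Sun's motion through the halo,
# so the expected event rate follows
#   R(t) = R0 + Sm * cos(2*pi*(t - t0)/T),
# peaking near the start of June. R0 and Sm are placeholder values.
import math

R0 = 1.00        # unmodulated rate (arbitrary units, hypothetical)
SM = 0.02        # modulation amplitude (hypothetical few-percent effect)
T = 365.25       # period: one year, in days
T0 = 152.5       # phase: ~June 2, when Earth's and Sun's velocities align

def rate(day_of_year):
    return R0 + SM * math.cos(2.0 * math.pi * (day_of_year - T0) / T)

for day, label in [(152, "early June (max)"), (335, "early December (min)")]:
    print(f"day {day:3d} {label:20s} rate = {rate(day):.3f}")
```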

A special case of direct detection experiments covers those with directional sensitivity. This is a search strategy based on the motion of the Solar System around the Galactic Center. A low-pressure time projection chamber makes it possible to access information on recoiling tracks and constrain WIMP-nucleus kinematics. WIMPs coming from the direction in which the Sun travels (approximately towards Cygnus) may then be separated from background, which should be isotropic. Directional dark matter experiments include DMTPC, DRIFT, Newage and MIMAC.

Indirect detection

Collage of six cluster collisions with dark matter maps. The clusters were observed in a study of how dark matter in clusters of galaxies behaves when the clusters collide.
 
Video about the potential gamma-ray detection of dark matter annihilation around supermassive black holes.

Indirect detection experiments search for the products of the self-annihilation or decay of dark matter particles in outer space. For example, in regions of high dark matter density (e.g., the centre of our galaxy) two dark matter particles could annihilate to produce gamma rays or Standard Model particle–antiparticle pairs. Alternatively, if a dark matter particle is unstable, it could decay into Standard Model (or other) particles. These processes could be detected indirectly through an excess of gamma rays, antiprotons or positrons emanating from high density regions in our galaxy or others. A major difficulty inherent in such searches is that various astrophysical sources can mimic the signal expected from dark matter, and so multiple signals are likely required for a conclusive discovery.
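The standard estimate behind such searches scales as Phi = <sigma v> * N_gamma * J / (8 * pi * m_chi^2), where the "J-factor" integrates the squared dark matter density along the line of sight; every parameter value in this sketch is an illustrative assumption:

```python
# Hedged sketch of the annihilation-flux estimate used in indirect
# searches: Phi = <sigma v> * N_gamma * J / (8 * pi * m_chi^2), where J
# integrates the squared dark matter density along the line of sight.
# All parameter values below are illustrative assumptions.
import math

SIGMA_V = 3e-26      # cm^3/s, canonical thermal-relic cross section
M_CHI = 100.0        # GeV, assumed WIMP mass
N_GAMMA = 10.0       # photons per annihilation (illustrative)
J_FACTOR = 1e22      # GeV^2 cm^-5, rough Galactic Center value (assumed)

flux = SIGMA_V * N_GAMMA * J_FACTOR / (8.0 * math.pi * M_CHI**2)
print(f"Phi ~ {flux:.1e} photons / cm^2 / s")
# The rho^2 dependence inside J is why searches target high-density
# regions such as the Galactic Center and dwarf spheroidal galaxies.
```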

A few of the dark matter particles passing through the Sun or Earth may scatter off atoms and lose energy. Thus dark matter may accumulate at the center of these bodies, increasing the chance of collision/annihilation. This could produce a distinctive signal in the form of high-energy neutrinos. Such a signal would be strong indirect evidence of WIMP dark matter. High-energy neutrino telescopes such as AMANDA, IceCube and ANTARES are searching for this signal. The detection by LIGO in September 2015 of gravitational waves opens the possibility of observing dark matter in a new way, particularly if it is in the form of primordial black holes.

Many experimental searches have been undertaken to look for such emission from dark matter annihilation or decay, examples of which follow. The Energetic Gamma Ray Experiment Telescope observed more gamma rays in 2008 than expected from the Milky Way, but scientists concluded this was most likely due to incorrect estimation of the telescope's sensitivity.

The Fermi Gamma-ray Space Telescope is searching for similar gamma rays. In April 2012, an analysis of previously available data from its Large Area Telescope instrument produced statistical evidence of a 130 GeV signal in the gamma radiation coming from the center of the Milky Way. WIMP annihilation was seen as the most probable explanation.

At higher energies, ground-based gamma-ray telescopes have set limits on the annihilation of dark matter in dwarf spheroidal galaxies and in clusters of galaxies.

The PAMELA experiment (launched in 2006) detected excess positrons. They could be from dark matter annihilation or from pulsars. No excess antiprotons were observed.

In 2013 results from the Alpha Magnetic Spectrometer on the International Space Station indicated excess high-energy cosmic rays which could be due to dark matter annihilation.

Collider searches for dark matter

An alternative approach to the detection of dark matter particles in nature is to produce them in a laboratory. Experiments with the Large Hadron Collider (LHC) may be able to detect dark matter particles produced in collisions of the LHC proton beams. Because a dark matter particle should have negligible interactions with normal visible matter, it may be detected indirectly as (large amounts of) missing energy and momentum that escape the detectors, provided other (non-negligible) collision products are detected. Constraints on dark matter also exist from the LEP experiment using a similar principle, but probing the interaction of dark matter particles with electrons rather than quarks. Any discovery from collider searches must be corroborated by discoveries in the indirect or direct detection sectors to prove that the particle discovered is, in fact, dark matter.
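A toy sketch of the missing-momentum signature: momentum transverse to the beam is conserved, so the negative vector sum of the visible products' transverse momenta estimates what escaped unseen. The event below is invented for illustration:

```python
# Sketch of the missing-momentum idea behind collider searches: if the
# visible collision products do not balance in the transverse plane,
# the recoil is attributed to invisible particles. Toy event; the
# object momenta are invented for illustration.
import math

# (px, py) of reconstructed visible objects in GeV (hypothetical event)
visible = [(45.0, 12.0), (-20.0, 30.0), (-5.0, -8.0)]

sum_px = sum(px for px, _ in visible)
sum_py = sum(py for _, py in visible)
met = math.hypot(sum_px, sum_py)        # missing transverse momentum
print(f"missing pT = {met:.1f} GeV, direction = "
      f"{math.degrees(math.atan2(-sum_py, -sum_px)):.0f} deg")
# A large, consistent excess of such imbalance, beyond what neutrinos
# explain, would be the collider-side hint of dark matter production.
```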

Alternative hypotheses

Because dark matter has not yet been conclusively identified, many other hypotheses have emerged aiming to explain the observational phenomena that dark matter was conceived to explain. The most common method is to modify general relativity. General relativity is well-tested on solar system scales, but its validity on galactic or cosmological scales has not been well proven. A suitable modification to general relativity can conceivably eliminate the need for dark matter. The best-known theories of this class are MOND and its relativistic generalization tensor-vector-scalar gravity (TeVeS), f(R) gravity, negative mass, dark fluid, and entropic gravity. Alternative theories abound.

A problem with alternative hypotheses is that observational evidence for dark matter comes from so many independent approaches (see the "observational evidence" section above). Explaining any individual observation is possible but explaining all of them in the absence of dark matter is very difficult. Nonetheless, there have been some scattered successes for alternative hypotheses, such as a 2016 test of gravitational lensing in entropic gravity and a 2020 measurement of a unique MOND effect.

The prevailing opinion among most astrophysicists is that while modifications to general relativity can conceivably explain part of the observational evidence, there is probably enough data to conclude there must be some form of dark matter present in the Universe.

In popular culture

Dark matter is mentioned in works of fiction, where it is usually attributed extraordinary physical or magical properties. Such descriptions are often inconsistent with the hypothesized properties of dark matter in physics and cosmology.

Operator (computer programming)

From Wikipedia, the free encyclopedia