
Monday, July 21, 2025

Antiparticle

From Wikipedia, the free encyclopedia
Illustration of electric charge of particles (left) and antiparticles (right). From top to bottom: electron/positron, proton/antiproton, neutron/antineutron (sizes not to scale).

In particle physics, every type of particle of "ordinary" matter (as opposed to antimatter) is associated with an antiparticle with the same mass but with opposite physical charges (such as electric charge). For example, the antiparticle of the electron is the positron (also known as an antielectron). While the electron has a negative electric charge, the positron has a positive electric charge, and is produced naturally in certain types of radioactive decay. The opposite is also true: the antiparticle of the positron is the electron.

Some particles, such as the photon, are their own antiparticle. Otherwise, for each pair of antiparticle partners, one is designated as the normal particle (the one that occurs in the matter we ordinarily interact with in daily life), and the other (usually given the prefix "anti-") is designated the antiparticle.

Particle–antiparticle pairs can annihilate each other, producing photons; since the charges of the particle and antiparticle are opposite, total charge is conserved. For example, the positrons produced in natural radioactive decay quickly annihilate themselves with electrons, producing pairs of gamma rays, a process exploited in positron emission tomography.

The laws of nature are very nearly symmetrical with respect to particles and antiparticles. For example, an antiproton and a positron can form an antihydrogen atom, which is believed to have the same properties as a hydrogen atom. This leads to the question of why the formation of matter after the Big Bang resulted in a universe consisting almost entirely of matter, rather than a half-and-half mixture of matter and antimatter. The discovery of charge parity (CP) violation helped to shed light on this problem by showing that this symmetry, originally thought to be perfect, is only approximate. Even so, the question remains open, and no explanation proposed so far is fully satisfactory.

Because charge is conserved, it is not possible to create an antiparticle without either destroying another particle of the same charge (as is for instance the case when antiparticles are produced naturally via beta decay or the collision of cosmic rays with Earth's atmosphere), or by the simultaneous creation of both a particle and its antiparticle (pair production), which can occur in particle accelerators such as the Large Hadron Collider at CERN.

Particles and their antiparticles have equal and opposite charges, so that an uncharged particle also gives rise to an uncharged antiparticle. In many cases, the antiparticle and the particle coincide: pairs of photons, Z0 bosons, π0 mesons, and hypothetical gravitons and some hypothetical WIMPs all self-annihilate. However, electrically neutral particles need not be identical to their antiparticles: for example, the neutron and antineutron are distinct.

History

Experiment

In 1932, soon after the prediction of positrons by Paul Dirac, Carl D. Anderson found that cosmic-ray collisions produced these particles in a cloud chamber – a particle detector in which moving electrons (or positrons) leave behind trails as they move through the gas. The electric charge-to-mass ratio of a particle can be measured by observing the radius of curling of its cloud-chamber track in a magnetic field. Positrons, because of the direction that their paths curled, were at first mistaken for electrons travelling in the opposite direction. Positron tracks trace the same helical path as electron tracks but curl in the opposite sense with respect to the magnetic field, because the two particles have charge-to-mass ratios of equal magnitude but opposite sign.
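The curvature measurement described above can be sketched numerically via r = p / (|q| B); the momentum and field strength below are illustrative assumptions, not values from Anderson's experiment:

```python
# Back-of-envelope sketch: radius of curvature of a charged-particle
# track in a magnetic field, r = p / (|q| B). The 1 MeV/c momentum and
# 1 T field are illustrative assumptions.

E_CHARGE = 1.602176634e-19   # elementary charge, C
MEV_PER_C = 5.344286e-22     # 1 MeV/c expressed in kg*m/s

def track_radius(p_mev_c, b_tesla, charge_units=1):
    """Radius (m) of a helical track: r = p / (|q| B)."""
    p_si = p_mev_c * MEV_PER_C
    return p_si / (abs(charge_units) * E_CHARGE * b_tesla)

# An electron and a positron with the same momentum curl with the same
# radius but in opposite senses; only the sign of the charge differs.
r = track_radius(1.0, 1.0)   # 1 MeV/c particle in a 1 T field
print(f"track radius = {r * 100:.2f} cm")
```

The radius fixes |q|/m given the speed, which is why Anderson could not distinguish a positron from an electron moving the opposite way from curvature alone.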

The antiproton and antineutron were found by Emilio Segrè and Owen Chamberlain in 1955 at the University of California, Berkeley. Since then, the antiparticles of many other subatomic particles have been created in particle accelerator experiments. In recent years, complete atoms of antimatter have been assembled out of antiprotons and positrons, collected in electromagnetic traps.

Dirac hole theory

... the development of quantum field theory made the interpretation of antiparticles as holes unnecessary, even though it lingers on in many textbooks.

Solutions of the Dirac equation contain negative energy quantum states. As a result, an electron could always radiate energy and fall into a negative energy state. Even worse, it could keep radiating infinite amounts of energy because there were infinitely many negative energy states available. To prevent this unphysical situation from happening, Dirac proposed that a "sea" of negative-energy electrons fills the universe, already occupying all of the lower-energy states so that, due to the Pauli exclusion principle, no other electron could fall into them. Sometimes, however, one of these negative-energy particles could be lifted out of this Dirac sea to become a positive-energy particle. But, when lifted out, it would leave behind a hole in the sea that would act exactly like a positive-energy electron with a reversed charge. These holes were interpreted as "negative-energy electrons" by Paul Dirac and mistakenly identified with protons in his 1930 paper A Theory of Electrons and Protons. However, these "negative-energy electrons" turned out to be positrons, and not protons.

This picture implied an infinite negative charge for the universe – a problem of which Dirac was aware. Dirac tried to argue that we would perceive this as the normal state of zero charge. Another difficulty was the difference in masses of the electron and the proton. Dirac tried to argue that this was due to the electromagnetic interactions with the sea, until Hermann Weyl proved that hole theory was completely symmetric between negative and positive charges. Dirac also predicted a reaction e− + p+ → γ + γ, where an electron and a proton annihilate to give two photons. Robert Oppenheimer and Igor Tamm, however, proved that this would cause ordinary matter to disappear too fast. A year later, in 1931, Dirac modified his theory and postulated the positron, a new particle of the same mass as the electron. The discovery of this particle the next year removed the last two objections to his theory.

Within Dirac's theory, the problem of infinite charge of the universe remains. Some bosons also have antiparticles, but since bosons do not obey the Pauli exclusion principle (only fermions do), hole theory does not work for them. A unified interpretation of antiparticles is now available in quantum field theory, which solves both these problems by describing antimatter as negative energy states of the same underlying matter field, i.e. particles moving backwards in time.

Elementary antiparticles

Antiquarks
Generation | Name              | Symbol | Spin | Charge (e) | Mass (MeV/c2)  | Observed
1          | up antiquark      | u̅      | 1/2  | −2/3       | 2.2 +0.6/−0.4  | Yes
1          | down antiquark    | d̅      | 1/2  | +1/3       | 4.6 +0.5/−0.4  | Yes
2          | charm antiquark   | c̅      | 1/2  | −2/3       | 1280 ± 30      | Yes
2          | strange antiquark | s̅      | 1/2  | +1/3       | 96 +8/−4       | Yes
3          | top antiquark     | t̅      | 1/2  | −2/3       | 173100 ± 600   | Yes
3          | bottom antiquark  | b̅      | 1/2  | +1/3       | 4180 +40/−30   | Yes
Antileptons
Generation | Name                  | Symbol | Spin | Charge (e) | Mass (MeV/c2)  | Observed
1          | positron              | e+     | 1/2  | +1         | 0.511          | Yes
1          | electron antineutrino | ν̅e     | 1/2  | 0          | < 0.0000022    | Yes
2          | antimuon              | μ+     | 1/2  | +1         | 105.7          | Yes
2          | muon antineutrino     | ν̅μ     | 1/2  | 0          | < 0.170        | Yes
3          | antitau               | τ+     | 1/2  | +1         | 1776.86 ± 0.12 | Yes
3          | tau antineutrino      | ν̅τ     | 1/2  | 0          | < 15.5         | Yes
Antibosons
Name         | Symbol | Spin | Charge (e) | Mass (GeV/c2)  | Interaction mediated | Observed
anti W boson | W+     | 1    | +1         | 80.385 ± 0.015 | weak interaction     | Yes

Composite antiparticles


Class      | Subclass   | Name        | Symbol | Spin | Charge (e) | Mass (MeV/c2)   | Mass (kg)                  | Observed
Antihadron | Antibaryon | Antiproton  | p̅      | 1/2  | −1         | 938.27208943(29) | 1.67262192595(52)×10−27   | Yes
Antihadron | Antibaryon | Antineutron | n̅      | 1/2  | 0          | 939.56542194(48) | ?                         | Yes

Particle–antiparticle annihilation

Feynman diagram of a kaon oscillating into an antikaon via a virtual pion pair. An example of a virtual pion pair that influences the propagation of a kaon, causing a neutral kaon to mix with the antikaon. This is an example of renormalization in quantum field theory – the field theory being necessary because of the change in particle number.

If a particle and antiparticle are in the appropriate quantum states, then they can annihilate each other and produce other particles. Reactions such as e− + e+ → γγ (the two-photon annihilation of an electron–positron pair) are an example. The single-photon annihilation of an electron–positron pair, e− + e+ → γ, cannot occur in free space because it is impossible to conserve energy and momentum together in this process. However, in the Coulomb field of a nucleus the translational invariance is broken and single-photon annihilation may occur. The reverse reaction (in free space, without an atomic nucleus) is also impossible for this reason. In quantum field theory, this process is allowed only as an intermediate quantum state for times short enough that the violation of energy conservation can be accommodated by the uncertainty principle. This opens the way for virtual pair production or annihilation in which a one-particle quantum state may fluctuate into a two-particle state and back. These processes are important in the vacuum state and renormalization of a quantum field theory. It also opens the way for neutral particle mixing through processes such as the one pictured here, which is a complicated example of mass renormalization.

Properties

Quantum states of a particle and an antiparticle are interchanged by the combined application of charge conjugation C, parity P and time reversal T. C and P are linear, unitary operators; T is antilinear and antiunitary. If |p, σ, n⟩ denotes the quantum state of a particle n with momentum p and spin J whose component in the z-direction is σ, then one has

CPT |p, σ, n⟩ = (−1)^(J−σ) |p, −σ, n̄⟩,

where n̄ denotes the charge-conjugate state, that is, the antiparticle. In particular a massive particle and its antiparticle transform under the same irreducible representation of the Poincaré group, which means the antiparticle has the same mass and the same spin.

If C, P and T can be defined separately on the particles and antiparticles, then

T |p, σ, n⟩ ∝ |−p, −σ, n⟩,
CP |p, σ, n⟩ ∝ |−p, σ, n̄⟩,
CPT |p, σ, n⟩ ∝ |p, −σ, n̄⟩,

where the proportionality sign indicates that there might be a phase on the right-hand side.

As C anticommutes with the charges, CQ = −QC, particle and antiparticle have opposite electric charges q and −q.

Quantum field theory

This section draws upon the ideas, language and notation of canonical quantization of a quantum field theory.

One may try to quantize an electron field without mixing the annihilation and creation operators by writing

ψ(x) = Σ_k u_k(x) a_k,

where we use the symbol k to denote the quantum numbers p and σ of the previous section and the sign of the energy, E(k), and a_k denotes the corresponding annihilation operator. Of course, since we are dealing with fermions, we have to have the operators satisfy canonical anti-commutation relations. However, if one now writes down the Hamiltonian

H = Σ_k E(k) a_k† a_k,

then one sees immediately that the expectation value of H need not be positive. This is because E(k) can have any sign whatsoever, and the combination of creation and annihilation operators has expectation value 1 or 0.

So one has to introduce the charge-conjugate antiparticle field, with its own creation and annihilation operators satisfying the relations

b_(k′) = a_k†,  b_(k′)† = a_k,

where k′ has the same p, and opposite σ and sign of the energy. Then one can rewrite the field in the form

ψ(x) = Σ_(k: E(k)>0) u_k(x) a_k + Σ_(k: E(k)<0) u_k(x) b_(k′)†,

where the first sum is over positive energy states and the second over those of negative energy. The energy becomes

H = Σ_(k: E(k)>0) E(k) a_k† a_k + Σ_(k: E(k)<0) |E(k)| b_(k′)† b_(k′) + E0,

where E0 is an infinite negative constant. The vacuum state is defined as the state with no particle or antiparticle, i.e., a_k |0⟩ = 0 and b_(k′) |0⟩ = 0 for all k. Then the energy of the vacuum is exactly E0. Since all energies are measured relative to the vacuum, H is positive definite. Analysis of the properties of a_k and b_k shows that one is the annihilation operator for particles and the other for antiparticles. This is the case of a fermion.
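The sign problem and its resolution can be illustrated with a toy single-mode model; this is my own matrix sketch, not the article's notation:

```python
import numpy as np

# Toy sketch: a single fermionic mode represented by 2x2 matrices,
# showing the canonical anticommutator {a, a_dag} = 1 and that the
# number operator a_dag a has eigenvalues 0 and 1.
a = np.array([[0.0, 1.0],
              [0.0, 0.0]])      # annihilation operator
adag = a.conj().T               # creation operator

anticomm = a @ adag + adag @ a
assert np.allclose(anticomm, np.eye(2))   # {a, a_dag} = 1

n = adag @ a                     # number operator
print(np.linalg.eigvalsh(n))     # eigenvalues 0 and 1

# With H = E(k) a_dag a and E(k) < 0, the occupied state has negative
# energy. Relabelling the negative-energy mode via b_dag = a flips the
# number operator (b_dag b = 1 - a_dag a), making the spectrum bounded
# below once the constant is absorbed into E0.
```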

This approach is due to Vladimir Fock, Wendell Furry and Robert Oppenheimer. If one quantizes a real scalar field, then one finds that there is only one kind of annihilation operator; therefore, real scalar fields describe neutral bosons. Since complex scalar fields admit two different kinds of annihilation operators, which are related by conjugation, such fields describe charged bosons.

Feynman–Stückelberg interpretation

By considering the propagation of the negative energy modes of the electron field backward in time, Ernst Stückelberg reached a pictorial understanding of the fact that the particle and antiparticle have equal mass m and spin J but opposite charges q. This allowed him to rewrite perturbation theory precisely in the form of diagrams. Richard Feynman later gave an independent systematic derivation of these diagrams from a particle formalism, and they are now called Feynman diagrams. Each line of a diagram represents a particle propagating either backward or forward in time. In Feynman diagrams, anti-particles are shown traveling backwards in time relative to normal matter, and vice versa. This technique is the most widespread method of computing amplitudes in quantum field theory today.

Since this picture was first developed by Stückelberg, and acquired its modern form in Feynman's work, it is called the Feynman–Stückelberg interpretation of antiparticles to honor both scientists.

Strong interaction

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Strong_interaction
An animation of color confinement, a property of the strong interaction. If energy is supplied to the quarks as shown, the gluon tube connecting quarks elongates until it reaches a point where it "snaps" and the energy added to the system results in the formation of a quark–antiquark pair. Thus single quarks are never seen in isolation.
An animation of the strong interaction between a proton and a neutron, mediated by pions. The colored small double circles inside are gluons.

In nuclear physics and particle physics, the strong interaction, also called the strong force or strong nuclear force, is one of the four known fundamental interactions. It confines quarks into protons, neutrons, and other hadron particles, and also binds neutrons and protons to create atomic nuclei, where it is called the nuclear force.

Most of the mass of a proton or neutron is the result of the strong interaction energy; the individual quarks provide only about 1% of the mass of a proton. At the range of 10^−15 m (1 femtometer, slightly more than the radius of a nucleon), the strong force is approximately 100 times as strong as electromagnetism, 10^6 times as strong as the weak interaction, and 10^38 times as strong as gravitation.

In the context of atomic nuclei, the force binds protons and neutrons together to form a nucleus and is called the nuclear force (or residual strong force). Because the force is mediated by massive, short-lived mesons on this scale, the residual strong interaction obeys a distance-dependent behavior between nucleons that is quite different from when it is acting to bind quarks within hadrons. There are also differences in the binding energies of the nuclear force with regard to nuclear fusion versus nuclear fission. Nuclear fusion accounts for most energy production in the Sun and other stars. Nuclear fission allows for decay of radioactive elements and isotopes, although it is often mediated by the weak interaction. Artificially, the energy associated with the nuclear force is partially released in nuclear power and nuclear weapons, both in uranium- or plutonium-based fission weapons and in fusion weapons like the hydrogen bomb.

History

Before 1971, physicists were uncertain as to how the atomic nucleus was bound together. It was known that the nucleus was composed of protons and neutrons and that protons possessed positive electric charge, while neutrons were electrically neutral. By the understanding of physics at that time, positive charges would repel one another and the positively charged protons should cause the nucleus to fly apart. However, this was never observed. New physics was needed to explain this phenomenon.

A stronger attractive force was postulated to explain how the atomic nucleus was bound despite the protons' mutual electromagnetic repulsion. This hypothesized force was called the strong force, which was believed to be a fundamental force that acted on the protons and neutrons that make up the nucleus.

In 1964, Murray Gell-Mann, and separately George Zweig, proposed that baryons, which include protons and neutrons, and mesons were composed of elementary particles. Zweig called the elementary particles "aces" while Gell-Mann called them "quarks"; the theory came to be called the quark model. The strong attraction between nucleons was the side-effect of a more fundamental force that bound the quarks together into protons and neutrons. The theory of quantum chromodynamics explains that quarks carry what is called a color charge, although it has no relation to visible color. Quarks with unlike color charge attract one another as a result of the strong interaction, and the particle that mediates this was called the gluon.

Behavior of the strong interaction

The strong interaction is observable at two ranges, and mediated by different force carriers in each one. On a scale less than about 0.8 fm (roughly the radius of a nucleon), the force is carried by gluons and holds quarks together to form protons, neutrons, and other hadrons. On a larger scale, up to about 3 fm, the force is carried by mesons and binds nucleons (protons and neutrons) together to form the nucleus of an atom. In the former context, it is often known as the color force, and is so strong that if hadrons are struck by high-energy particles, they produce jets of massive particles instead of emitting their constituents (quarks and gluons) as freely moving particles. This property of the strong force is called color confinement.

Two layers of strong interaction
Interaction      | Range    | What is bound | Force carrier | Result
Strong           | < 0.8 fm | quarks        | gluon         | hadrons
Residual strong  | 1–3 fm   | hadrons       | mesons        | nuclei

Within hadrons

The fundamental couplings of the strong interaction, from left to right: (a) gluon radiation, (b) gluon splitting and (c,d) gluon self-coupling.

The word strong is used since the strong interaction is the "strongest" of the four fundamental forces. At a distance of 10^−15 m, its strength is around 100 times that of the electromagnetic force, some 10^6 times as great as that of the weak force, and about 10^38 times that of gravitation.

The strong force is described by quantum chromodynamics (QCD), a part of the Standard Model of particle physics. Mathematically, QCD is a non-abelian gauge theory based on a local (gauge) symmetry group called SU(3).
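The non-abelian character of SU(3) mentioned above can be verified directly; a small sketch using the standard Gell-Mann matrix conventions (an illustration, not part of the article):

```python
import numpy as np

# Sketch: the first three Gell-Mann matrices of SU(3) and a check of the
# non-abelian commutator [l1, l2] = 2i * l3 (structure constant f_123 = 1).
l1 = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=complex)
l2 = np.array([[0, -1j, 0], [1j, 0, 0], [0, 0, 0]], dtype=complex)
l3 = np.array([[1, 0, 0], [0, -1, 0], [0, 0, 0]], dtype=complex)

comm = l1 @ l2 - l2 @ l1
assert np.allclose(comm, 2j * l3)   # non-zero commutator: SU(3) is non-abelian

# The generators are traceless and Hermitian, as gauge-group generators must be.
for lam in (l1, l2, l3):
    assert abs(np.trace(lam)) < 1e-12
    assert np.allclose(lam, lam.conj().T)
```

The non-vanishing commutator is what makes gluons self-interacting, unlike photons in the abelian U(1) theory of electromagnetism.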

The force carrier particle of the strong interaction is the gluon, a massless gauge boson. Gluons are thought to interact with quarks and other gluons by way of a type of charge called color charge. Color charge is analogous to electromagnetic charge, but it comes in three types (±red, ±green, and ±blue) rather than one, which results in different rules of behavior. These rules are described by quantum chromodynamics (QCD), the theory of quark–gluon interactions. Unlike the photon in electromagnetism, which is neutral, the gluon carries a color charge. Quarks and gluons are the only fundamental particles that carry non-vanishing color charge, and hence they participate in strong interactions only with each other. The strong force is the expression of the gluon interaction with other quark and gluon particles.

All quarks and gluons in QCD interact with each other through the strong force. The strength of interaction is parameterized by the strong coupling constant. This strength is modified by the gauge color charge of the particle, a group-theoretical property.

The strong force acts between quarks. Unlike all other forces (electromagnetic, weak, and gravitational), the strong force does not diminish in strength with increasing distance between pairs of quarks. After a limiting distance (about the size of a hadron) has been reached, it remains at a strength of about 10,000 N, no matter how much farther the quarks are separated. As the separation between the quarks grows, the energy added to the pair creates new pairs of matching quarks between the original two; hence it is impossible to isolate quarks. The explanation is that the amount of work done against a force of 10,000 N is enough to create particle–antiparticle pairs within a very short distance. The energy added to the system by pulling two quarks apart would create a pair of new quarks that will pair up with the original ones. In QCD, this phenomenon is called color confinement; as a result, only hadrons, not individual free quarks, can be observed. The failure of all experiments that have searched for free quarks is considered to be evidence of this phenomenon.
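The work-against-10,000-N argument is easy to make quantitative; a back-of-envelope sketch (the separations chosen below are illustrative assumptions):

```python
# Back-of-envelope sketch using the ~10,000 N figure from the text:
# the work done pulling two quarks apart quickly exceeds the energy
# needed to create a light quark-antiquark pair.

FORCE_N = 1.0e4                      # roughly constant "string tension" force
EV_PER_J = 1.0 / 1.602176634e-19     # joules -> electronvolts

def work_mev(separation_fm):
    """Work (MeV) done against a constant 10 kN force over the separation."""
    joules = FORCE_N * separation_fm * 1e-15
    return joules * EV_PER_J / 1e6

# Stretching the gluon tube by ~2 fm already supplies ~125 MeV, of the
# order of the energy needed to materialize light quark-antiquark pairs,
# so the "string" snaps into new hadrons instead of freeing a quark.
print(f"{work_mev(2.0):.0f} MeV")
```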

The elementary quark and gluon particles involved in a high energy collision are not directly observable. The interaction produces jets of newly created hadrons that are observable. Those hadrons are created, as a manifestation of mass–energy equivalence, when sufficient energy is deposited into a quark–quark bond, as when a quark in one proton is struck by a very fast quark of another impacting proton during a particle accelerator experiment. However, quark–gluon plasmas have been observed.

Between hadrons

A Feynman diagram (shown by the animation in the lead) with the individual quark constituents shown, to illustrate how the fundamental strong interaction gives rise to the nuclear force. Straight lines are quarks, while multi-colored loops are gluons (the carriers of the fundamental force).

While color confinement implies that the strong force acts without distance-diminishment between pairs of quarks in compact collections of bound quarks (hadrons), at distances approaching or greater than the radius of a proton, a residual force (described below) remains. It manifests as a force between the "colorless" hadrons, and is known as the nuclear force or residual strong force (and historically as the strong nuclear force).

The nuclear force acts between hadrons, known as mesons and baryons. This "residual strong force", acting indirectly, transmits gluons that form part of the virtual π and ρ mesons, which, in turn, transmit the force between nucleons that holds the nucleus (beyond hydrogen-1 nucleus) together.

The residual strong force is thus a minor residuum of the strong force that binds quarks together into protons and neutrons. This same force is much weaker between neutrons and protons, because it is mostly neutralized within them, in the same way that electromagnetic forces between neutral atoms (van der Waals forces) are much weaker than the electromagnetic forces that hold electrons in association with the nucleus, forming the atoms.

Unlike the strong force itself, the residual strong force diminishes rapidly with distance, falling off approximately exponentially, though there is no simple closed-form expression known for this; see Yukawa potential. The rapid decrease with distance of the attractive residual force, together with the less rapid decrease of the repulsive electromagnetic force acting between protons within a nucleus, causes the instability of larger atomic nuclei, such as all those with atomic numbers larger than 82 (the element lead).
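The exponential fall-off can be sketched with the Yukawa form V(r) = −g² e^(−r/r0)/r; the range (set by the pion Compton wavelength, about 1.4 fm) and the coupling below are illustrative assumptions:

```python
import math

# Sketch: Yukawa potential V(r) = -g^2 * exp(-r/r0) / r. The range
# r0 ~ 1.4 fm (pion Compton wavelength) and unit coupling are
# illustrative assumptions, not fitted values.

R0_FM = 1.4  # range set by the pion mass, fm

def yukawa(r_fm, g2=1.0):
    """Yukawa potential in arbitrary units (g2 is an assumed coupling)."""
    return -g2 * math.exp(-r_fm / R0_FM) / r_fm

# Strength at 1 fm versus 3 fm: the residual force dies off fast,
# which is why the nuclear force is effectively short-ranged.
ratio = yukawa(1.0) / yukawa(3.0)
print(f"V(1 fm) / V(3 fm) = {ratio:.1f}")
```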

Although the nuclear force is weaker than the strong interaction itself, it is still highly energetic: transitions produce gamma rays. The mass of a nucleus is significantly different from the summed masses of the individual nucleons. This mass defect is due to the potential energy associated with the nuclear force. Differences between mass defects power nuclear fusion and nuclear fission.
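The mass-defect arithmetic above can be illustrated for helium-4; the masses below are standard tabulated values, quoted here for illustration only:

```python
# Sketch of the mass-defect arithmetic for helium-4: the bound nucleus
# weighs less than its free constituents, and the difference is the
# binding energy released when it forms.

U_TO_MEV = 931.494          # 1 atomic mass unit in MeV/c^2
M_PROTON_U = 1.007276       # proton mass, u
M_NEUTRON_U = 1.008665      # neutron mass, u
M_HE4_NUCLEUS_U = 4.001506  # helium-4 nuclear mass, u

def binding_energy_mev():
    """Binding energy = (sum of free nucleon masses - nuclear mass) * c^2."""
    defect_u = 2 * M_PROTON_U + 2 * M_NEUTRON_U - M_HE4_NUCLEUS_U
    return defect_u * U_TO_MEV

print(f"He-4 binding energy = {binding_energy_mev():.1f} MeV")  # ~28.3 MeV
```

Differences between such binding energies per nucleon across the periodic table are what make fusion of light nuclei and fission of heavy nuclei energetically favorable.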

Unification

The so-called Grand Unified Theories (GUT) aim to describe the strong interaction and the electroweak interaction as aspects of a single force, similarly to how the electromagnetic and weak interactions were unified by the Glashow–Weinberg–Salam model into electroweak interaction. The strong interaction has a property called asymptotic freedom, wherein the strength of the strong force diminishes at higher energies (or temperatures). The theorized energy where its strength becomes equal to the electroweak interaction is the grand unification energy. However, no Grand Unified Theory has yet been successfully formulated to describe this process, and Grand Unification remains an unsolved problem in physics.

If GUT is correct, after the Big Bang and during the electroweak epoch of the universe, the electroweak force separated from the strong force. Accordingly, a grand unification epoch is hypothesized to have existed prior to this.

Physics beyond the Standard Model

From Wikipedia, the free encyclopedia
 

Problems with the Standard Model

Despite being the most successful theory of particle physics to date, the Standard Model is not perfect. A large share of the published output of theoretical physicists consists of proposals for various forms of "Beyond the Standard Model" new physics that would modify the Standard Model in ways subtle enough to be consistent with existing data, yet address its imperfections materially enough to predict non-Standard Model outcomes of new experiments that can be proposed.

The Standard Model of elementary particles + hypothetical Graviton

Phenomena not explained

The Standard Model is inherently an incomplete theory. There are fundamental physical phenomena in nature that the Standard Model does not adequately explain:

  • Gravity. The standard model does not explain gravity. The approach of simply adding a graviton to the Standard Model does not recreate what is observed experimentally without other modifications, as yet undiscovered, to the Standard Model. Moreover, the Standard Model is widely considered to be incompatible with the most successful theory of gravity to date, general relativity.
  • Dark matter. Assuming that general relativity and Lambda CDM are true, cosmological observations tell us the standard model explains about 5% of the mass-energy present in the universe. About 26% should be dark matter (the remaining 69% being dark energy) which would behave just like other matter, but which only interacts weakly (if at all) with the Standard Model fields. Yet the Standard Model does not supply any fundamental particles that are good dark matter candidates.
  • Dark energy. As mentioned, the remaining 69% of the universe's energy should consist of the so-called dark energy, a constant energy density for the vacuum. Attempts to explain dark energy in terms of vacuum energy of the standard model lead to a mismatch of 120 orders of magnitude.
  • Neutrino oscillations. According to the Standard Model, neutrinos do not oscillate. However, experiments and astronomical observations have shown that neutrino oscillation does occur. These are typically explained by postulating that neutrinos have mass. Neutrinos do not have mass in the Standard Model, and mass terms for the neutrinos can be added to the Standard Model by hand, but these lead to new theoretical problems. For example, the mass terms need to be extraordinarily small and it is not clear if the neutrino masses would arise in the same way that the masses of other fundamental particles do in the Standard Model. There are also other extensions of the Standard Model for neutrino oscillations which do not assume massive neutrinos, such as Lorentz-violating neutrino oscillations.
  • Matter–antimatter asymmetry. The universe is made out of mostly matter. However, the standard model predicts that matter and antimatter should have been created in (almost) equal amounts if the initial conditions of the universe did not involve disproportionate matter relative to antimatter. The Standard Model can incorporate baryogenesis through sphalerons in a thermodynamic imbalance during the early universe, though the amount of net baryons (and leptons) thus created may not be sufficient to account for the present baryon asymmetry. Thus, there might be no mechanism in the Standard Model to sufficiently explain this asymmetry.

Experimental results not explained

No experimental result is accepted as definitively contradicting the Standard Model at the 5 σ level, widely considered to be the threshold of a discovery in particle physics. Because every experiment contains some degree of statistical and systematic uncertainty, and the theoretical predictions themselves are also almost never calculated exactly and are subject to uncertainties in measurements of the fundamental constants of the Standard Model (some of which are tiny and others of which are substantial), it is to be expected that some of the hundreds of experimental tests of the Standard Model will deviate from it to some extent, even if there were no new physics to be discovered.

At any given moment there are several experimental results standing that significantly differ from a Standard Model-based prediction. In the past, many of these discrepancies have been found to be statistical flukes or experimental errors that vanish as more data has been collected, or when the same experiments were conducted more carefully. On the other hand, any physics beyond the Standard Model would necessarily first appear in experiments as a statistically significant difference between an experiment and the theoretical prediction. The task is to determine which is the case.

In each case, physicists seek to determine if a result is merely a statistical fluke or experimental error on the one hand, or a sign of new physics on the other. More statistically significant results cannot be mere statistical flukes but can still result from experimental error or inaccurate estimates of experimental precision. Frequently, experiments are tailored to be more sensitive to experimental results that would distinguish the Standard Model from theoretical alternatives.

Some of the most notable examples include the following:

  • B meson decay etc. – results from a BaBar experiment may suggest a surplus over Standard Model predictions of a type of particle decay (B → D(*) τ ν̄τ). In this, an electron and positron collide, resulting in a B meson and an antimatter B meson, which then decays into a D meson and a tau lepton as well as a tau antineutrino. While the level of certainty of the excess (3.4 σ in statistical jargon) is not enough to declare a break from the Standard Model, the results are a potential sign of something amiss and are likely to affect existing theories, including those attempting to deduce the properties of Higgs bosons. In 2015, LHCb reported observing a 2.1 σ excess in the same ratio of branching fractions. The Belle experiment also reported an excess. In 2017 a meta-analysis of all available data reported a cumulative 5 σ deviation from the Standard Model.
  • Neutron lifetime puzzle – free neutrons are not stable but decay after some time. The two methods currently used to measure this lifetime ("bottle" versus "beam") give values that disagree beyond their error margins: the bottle method yields a lifetime roughly 10 seconds shorter than the beam method. The problem may be resolved by accounting for neutron scattering, which shortens the lifetime of the neutrons involved. This effect arises in the bottle method and depends on the shape of the bottle, so it might be a systematic error specific to that method.
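A minimal sketch of how independent measurements can accumulate significance, as in the 2017 meta-analysis mentioned above. This uses Stouffer's method for combining z-scores; treating the quoted significances as independent, equally weighted Gaussian z-scores is a simplifying assumption (the actual meta-analysis used a full likelihood fit over more inputs):

```python
import math

def stouffer(z_scores):
    # Stouffer's method: combined z-score for independent, equally weighted tests
    return sum(z_scores) / math.sqrt(len(z_scores))

# The BaBar and LHCb excesses quoted in the text, treated as independent z-scores
combined = stouffer([3.4, 2.1])
print(f"combined significance ~ {combined:.2f} sigma")
```

Two individually inconclusive excesses in the same channel can thus approach the discovery threshold when combined, which is why such anomalies are tracked across experiments.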

Theoretical predictions not observed

Observation at particle colliders of all of the fundamental particles predicted by the Standard Model has been confirmed. The Higgs boson is predicted by the Standard Model's explanation of the Higgs mechanism, which describes how the weak SU(2) gauge symmetry is broken and how fundamental particles obtain mass; it was the last particle predicted by the Standard Model to be observed. On July 4, 2012, CERN scientists using the Large Hadron Collider announced the discovery of a particle consistent with the Higgs boson, with a mass of about 126 GeV/c2. A Higgs boson was confirmed to exist on March 14, 2013, although efforts to confirm that it has all of the properties predicted by the Standard Model are ongoing.

A few hadrons (i.e. composite particles made of quarks) whose existence is predicted by the Standard Model, but which can be produced only at very high energies and at very low rates, have not yet been definitively observed, and "glueballs" (i.e. composite particles made of gluons) have also not yet been definitively observed. Some very rare particle decays predicted by the Standard Model have also not yet been definitively observed because insufficient data is available to make a statistically significant observation.

Unexplained relations

  • Koide formula – an unexplained empirical equation remarked upon by Yoshio Koide in 1981, and later by others. It relates the masses of the three charged leptons: Q = (me + mμ + mτ) / (√me + √mμ + √mτ)² = 2/3. The Standard Model does not predict lepton masses (they are free parameters of the theory). However, the value of the Koide formula being equal to 2/3 within experimental errors of the measured lepton masses suggests the existence of a theory which is able to predict lepton masses.
  • The CKM matrix, if interpreted as a rotation matrix in a 3-dimensional vector space, "rotates" a vector composed of the square roots of the down-type quark masses into a vector of the square roots of the up-type quark masses, up to vector lengths, a result due to Kohzo Nishida.
  • The sum of squares of the Yukawa couplings of all Standard Model fermions is approximately 0.984, which is very close to 1. To put it another way, the sum of squares of fermion masses is very close to half of squared Higgs vacuum expectation value. This sum is dominated by the top quark.
  • The sum of squares of boson masses (that is, W, Z, and Higgs bosons) is also very close to half of squared Higgs vacuum expectation value, the ratio is approximately 1.004.
  • Consequently, the sum of squared masses of all Standard Model particles is very close to the squared Higgs vacuum expectation value, the ratio is approximately 0.994.
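These relations are easy to check numerically. The sketch below uses approximate measured masses (charged leptons in MeV, bosons and the Higgs vacuum expectation value in GeV); the exact values and their uncertainties are glossed over here:

```python
import math

# Charged lepton masses in MeV (approximate measured values)
m_e, m_mu, m_tau = 0.511, 105.658, 1776.86

# Koide's relation: Q should be very close to 2/3
Q = (m_e + m_mu + m_tau) / (math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau)) ** 2
print(f"Koide Q = {Q:.6f} (2/3 = {2/3:.6f})")

# Boson masses and Higgs vacuum expectation value in GeV (approximate)
m_W, m_Z, m_H, v = 80.38, 91.19, 125.25, 246.22
ratio = (m_W**2 + m_Z**2 + m_H**2) / (v**2 / 2)
print(f"sum of squared boson masses / (v^2 / 2) = {ratio:.3f}")
```

With these inputs the Koide ratio lands within a few parts in a million of 2/3, and the boson-mass sum reproduces the quoted ratio of roughly 1.004, though neither calculation says anything about whether the agreement is meaningful.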

It is unclear if these empirical relationships represent any underlying physics; according to Koide, the rule he discovered "may be an accidental coincidence".

Theoretical problems

Some features of the standard model are added in an ad hoc way. These are not problems per se (i.e. the theory works fine with the ad hoc insertions), but they imply a lack of understanding. These contrived features have motivated theorists to look for more fundamental theories with fewer parameters. Some of the contrivances are:

  • Hierarchy problem – the standard model introduces particle masses through a process known as spontaneous symmetry breaking, caused by the Higgs field. Within the standard model, the mass of the Higgs particle receives very large quantum corrections due to the presence of virtual particles (mostly virtual top quarks). These corrections are much larger than the actual mass of the Higgs, which means that the bare mass parameter of the Higgs in the standard model must be fine-tuned in such a way that it almost completely cancels the quantum corrections. This level of fine-tuning is deemed unnatural by many theorists. The problem cannot be formulated in the strict context of the Standard Model, in which the Higgs mass cannot be calculated. In a sense, the problem amounts to the worry that a future theory of fundamental particles, in which the Higgs boson mass will be calculable, should not have excessive fine-tunings.
  • Number of parameters – the standard model depends on 19 numerical parameters. Their values are known from experiment, but the origin of the values is unknown. Some theorists have tried to find relations between different parameters, for example between the masses of particles in different generations, or to calculate particle masses, such as in asymptotic safety scenarios.
  • Quantum triviality – suggests that it may not be possible to create a consistent quantum field theory involving elementary scalar Higgs particles. This is sometimes called the Landau pole problem. A possible solution is that the renormalized value could go to zero as the cut-off is removed, meaning that the bare value is completely screened by quantum fluctuations.
  • Strong CP problem – it can be argued theoretically that the standard model should contain a term in the strong interaction that breaks CP symmetry, causing slightly different interaction rates for matter vs. antimatter. Experimentally, however, no such violation has been found, implying that the coefficient of this term – if any – would be suspiciously close to zero.
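The scale of the fine-tuning in the hierarchy problem can be illustrated with an order-of-magnitude sketch. The one-loop top-quark correction to the squared Higgs mass grows quadratically with the cutoff Λ; taking Λ at the Planck scale is a common illustrative choice, not a prediction, and O(1) factors are ignored:

```python
import math

m_H = 125.0          # Higgs mass in GeV
y_t = 1.0            # top Yukawa coupling (approximately 1)
cutoff = 1.2e19      # assumed cutoff: the Planck scale, in GeV

# One-loop quadratic correction to the squared Higgs mass, up to O(1) factors:
#   delta_m2 ~ (y_t^2 / 16 pi^2) * cutoff^2
delta_m2 = (y_t**2 / (16 * math.pi**2)) * cutoff**2

# Cancellation required between the bare mass term and the correction
tuning = m_H**2 / delta_m2
print(f"correction ~ {delta_m2:.2e} GeV^2, observed m_H^2 = {m_H**2:.2e} GeV^2")
print(f"required cancellation: about 1 part in {1/tuning:.1e}")
```

The bare parameter and the correction would have to cancel to roughly one part in 10^32, which is the quantitative content of the "unnatural" complaint.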

Additional experimental results

Research from experimental data on the cosmological constant, LIGO noise, and pulsar timing suggests that it is very unlikely that there are any new particles with masses much higher than those which can be found in the standard model or at the Large Hadron Collider. However, this research has also indicated that quantum gravity or perturbative quantum field theory will become strongly coupled before 1 PeV, leading to other new physics in the TeV range.

Grand unified theories

The standard model has three gauge symmetries: the colour SU(3), the weak isospin SU(2), and the weak hypercharge U(1) symmetry, corresponding to the three fundamental forces. Due to renormalization the coupling constants of each of these symmetries vary with the energy at which they are measured. Around 10^16 GeV these couplings become approximately equal. This has led to speculation that above this energy the three gauge symmetries of the standard model are unified in one single gauge symmetry with a simple gauge group and just one coupling constant. Below this energy the symmetry is spontaneously broken to the standard model symmetries. Popular choices for the unifying group are the special unitary group in five dimensions SU(5) and the special orthogonal group in ten dimensions SO(10).
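The running of the couplings can be sketched with the one-loop renormalization-group equations, under which each inverse coupling evolves linearly in the logarithm of the energy. The beta coefficients below are the standard one-loop SM values (with GUT normalization for the hypercharge), and the starting inverse couplings at the Z mass are approximate:

```python
import math

M_Z = 91.19  # Z boson mass in GeV
# Approximate inverse couplings alpha_i^-1 at M_Z (GUT-normalized U(1))
alpha_inv_MZ = [59.0, 29.6, 8.45]
# One-loop SM beta coefficients b_i for U(1), SU(2), SU(3)
b = [41 / 10, -19 / 6, -7]

def run(alpha_inv, b_i, mu):
    # One-loop running: alpha^-1(mu) = alpha^-1(M_Z) - (b_i / 2 pi) ln(mu / M_Z)
    return alpha_inv - (b_i / (2 * math.pi)) * math.log(mu / M_Z)

mu = 1e16  # GeV, near the putative GUT scale
at_gut = [run(a, bi, mu) for a, bi in zip(alpha_inv_MZ, b)]
print([f"{x:.1f}" for x in at_gut])
# The three inverse couplings, far apart at M_Z, approach one another;
# in the SM alone they nearly, but not exactly, meet.
```

The near-miss in the pure Standard Model (supersymmetric spectra make the meeting much closer) is one of the circumstantial arguments often cited for physics at the GUT scale.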

Theories that unify the standard model symmetries in this way are called Grand Unified Theories (or GUTs), and the energy scale at which the unified symmetry is broken is called the GUT scale. Generically, grand unified theories predict the creation of magnetic monopoles in the early universe, and instability of the proton. Neither of these have been observed, and this absence of observation puts limits on the possible GUTs.
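The proton-decay constraint can be seen from a dimensional estimate: decay mediated by a gauge boson of mass M_X gives a lifetime scaling as M_X^4 / (α_GUT² m_p^5). The numbers below (GUT scale, unified coupling) are illustrative round values, and O(1) hadronic matrix elements are ignored:

```python
# Order-of-magnitude proton lifetime estimate in a generic GUT
M_X = 1e16           # assumed GUT-scale gauge boson mass in GeV
alpha_gut = 1 / 40   # assumed unified coupling at the GUT scale
m_p = 0.938          # proton mass in GeV
hbar = 6.58e-25      # GeV * s, converts GeV^-1 to seconds
sec_per_year = 3.15e7

tau_gev = M_X**4 / (alpha_gut**2 * m_p**5)   # lifetime in GeV^-1
tau_years = tau_gev * hbar / sec_per_year
print(f"tau_p ~ {tau_years:.1e} years")
```

Because the lifetime scales as the fourth power of M_X, non-observation of proton decay at the 10^34-year level directly pushes up the allowed GUT scale, which is how the absence of observation constrains possible GUTs.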

Supersymmetry

Supersymmetry extends the Standard Model by adding another class of symmetries to the Lagrangian. These symmetries exchange fermionic particles with bosonic ones. Such a symmetry predicts the existence of supersymmetric particles, abbreviated as sparticles, which include the sleptons, squarks, neutralinos and charginos. Each particle in the Standard Model would have a superpartner whose spin differs by 1/2 from the ordinary particle. Due to the breaking of supersymmetry, the sparticles are much heavier than their ordinary counterparts; they are so heavy that existing particle colliders may not be powerful enough to produce them.

Neutrinos

In the standard model, neutrinos cannot spontaneously change flavor. Measurements however indicated that neutrinos do spontaneously change flavor, in what is called neutrino oscillations.

Neutrino oscillations are usually explained using massive neutrinos. In the standard model, neutrinos have exactly zero mass, as the standard model only contains left-handed neutrinos. With no suitable right-handed partner, it is impossible to add a renormalizable mass term to the standard model. These measurements only give the mass differences between the different flavours. The best constraint on the absolute mass of the neutrinos comes from precision measurements of tritium decay, providing an upper limit of 2 eV, which makes them at least five orders of magnitude lighter than the other particles in the standard model. This necessitates an extension of the standard model, which would need to explain not only how neutrinos get their mass but also why the mass is so small.
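In the standard two-flavour approximation, the oscillation probability depends only on the mass-squared difference, the mixing angle, and the ratio of baseline to energy. The parameter values below are typical atmospheric-sector figures, used purely for illustration:

```python
import math

def osc_prob(sin2_2theta, dm2_ev2, L_km, E_gev):
    # Two-flavour oscillation probability:
    #   P = sin^2(2 theta) * sin^2(1.27 * dm^2 [eV^2] * L [km] / E [GeV])
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Roughly atmospheric-sector parameters (illustrative values):
# near-maximal mixing and dm^2 ~ 2.5e-3 eV^2
P = osc_prob(sin2_2theta=1.0, dm2_ev2=2.5e-3, L_km=500, E_gev=1.0)
print(f"P(mu -> tau) ~ {P:.3f}")
```

Note that the probability depends only on the mass-squared *difference*, which is why oscillation experiments alone cannot fix the absolute neutrino mass scale mentioned above.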

One approach to add masses to the neutrinos, the so-called seesaw mechanism, is to add right-handed neutrinos and have these couple to left-handed neutrinos with a Dirac mass term. The right-handed neutrinos have to be sterile, meaning that they do not participate in any of the standard model interactions. Because they have no charges, the right-handed neutrinos can act as their own anti-particles, and have a Majorana mass term. Like the other Dirac masses in the standard model, the neutrino Dirac mass is expected to be generated through the Higgs mechanism, and is therefore unpredictable. The standard model fermion masses differ by many orders of magnitude; the Dirac neutrino mass has at least the same uncertainty. On the other hand, the Majorana mass for the right-handed neutrinos does not arise from the Higgs mechanism, and is therefore expected to be tied to some energy scale of new physics beyond the standard model, for example the Planck scale. Therefore, any process involving right-handed neutrinos will be suppressed at low energies. The correction due to these suppressed processes effectively gives the left-handed neutrinos a mass that is inversely proportional to the right-handed Majorana mass, a mechanism known as the seesaw. The presence of heavy right-handed neutrinos thereby explains both the small mass of the left-handed neutrinos and the absence of the right-handed neutrinos in observations. However, due to the uncertainty in the Dirac neutrino masses, the right-handed neutrino masses can lie almost anywhere. For example, they could be as light as a keV and serve as dark matter, they could have a mass in the LHC energy range and lead to observable lepton number violation, or they could be near the GUT scale, linking the right-handed neutrinos to the possibility of a grand unified theory.
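The seesaw scaling described above, m_light ≈ m_D² / M_R, is easy to see numerically. Taking a Dirac mass at the electroweak scale and a Majorana mass near a GUT-like scale (both illustrative choices, since, as the text notes, these masses are essentially unconstrained):

```python
# Type-I seesaw estimate of the light neutrino mass
m_dirac = 100.0   # assumed Dirac mass in GeV (electroweak scale)
M_R = 1e15        # assumed right-handed Majorana mass in GeV

m_light_gev = m_dirac**2 / M_R   # seesaw formula: m ~ m_D^2 / M_R
m_light_ev = m_light_gev * 1e9   # convert GeV to eV
print(f"light neutrino mass ~ {m_light_ev:.2f} eV")
```

A heavy Majorana scale naturally pushes the light mass far below the other fermion masses, landing near the sub-eV scale suggested by oscillation data; that is the appeal of the mechanism.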

The mass terms mix neutrinos of different generations. This mixing is parameterized by the PMNS matrix, which is the neutrino analogue of the CKM quark mixing matrix. Unlike the quark mixing, which is almost minimal, the mixing of the neutrinos appears to be almost maximal. This has led to various speculations about symmetries between the various generations that could explain the mixing patterns. The mixing matrix could also contain several complex phases that break CP invariance, although there has been no experimental probe of these. These phases could potentially create a surplus of leptons over anti-leptons in the early universe, a process known as leptogenesis. This asymmetry could then at a later stage be converted into an excess of baryons over anti-baryons, and explain the matter–antimatter asymmetry in the universe.

The light neutrinos are disfavored as an explanation for the observation of dark matter, based on considerations of large-scale structure formation in the early universe. Simulations of structure formation show that they are too hot – that is, their kinetic energy is large compared to their mass – while formation of structures similar to the galaxies in our universe requires cold dark matter. The simulations show that neutrinos can at best explain a few percent of the missing mass in dark matter. However, the heavy, sterile, right-handed neutrinos are a possible candidate for a dark matter WIMP.

There are however other explanations for neutrino oscillations which do not necessarily require neutrinos to have masses, such as Lorentz-violating neutrino oscillations.

Preon models

Several preon models have been proposed to address the unsolved problem concerning the fact that there are three generations of quarks and leptons. Preon models generally postulate some additional new particles which are further postulated to be able to combine to form the quarks and leptons of the standard model. One of the earliest preon models was the Rishon model.

To date, no preon model is widely accepted or fully verified.

Theories of everything

Theoretical physics continues to strive toward a theory of everything, a theory that fully explains and links together all known physical phenomena, and predicts the outcome of any experiment that could be carried out in principle.

In practical terms the immediate goal in this regard is to develop a theory which would unify the Standard Model with General Relativity in a theory of quantum gravity. Additional features, such as overcoming conceptual flaws in either theory or accurate prediction of particle masses, would be desired. The challenges in putting together such a theory are not just conceptual – they include the experimental aspects of the very high energies needed to probe exotic realms.

Several notable attempts in this direction are supersymmetry, loop quantum gravity, and string theory.

Supersymmetry

Loop quantum gravity

Theories of quantum gravity such as loop quantum gravity and others are thought by some to be promising candidates for the mathematical unification of quantum field theory and general relativity, requiring less drastic changes to existing theories. However, recent work places stringent limits on the putative effects of quantum gravity on the speed of light, and disfavours some current models of quantum gravity.

String theory

Extensions, revisions, replacements, and reorganizations of the Standard Model exist in an attempt to correct for these and other issues. String theory is one such reinvention, and many theoretical physicists think that such theories are the next theoretical step toward a true Theory of Everything.

Among the numerous variants of string theory, M-theory, whose mathematical existence was first proposed at a String Conference in 1995 by Edward Witten, is believed by many to be a proper "ToE" candidate, notably by physicists Brian Greene and Stephen Hawking. Though a full mathematical description is not yet known, solutions to the theory exist for specific cases. Recent works have also proposed alternate string models, some of which lack the various harder-to-test features of M-theory (e.g. the existence of Calabi–Yau manifolds, many extra dimensions, etc.) including works by well-published physicists such as Lisa Randall.
