
Saturday, December 7, 2019

Scientific wager

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Scientific_wager
 
A scientific wager is a wager whose outcome is settled by the scientific method. Such wagers typically consist of an offer to pay a certain sum of money upon the scientific proof or disproof of some currently uncertain statement. Some wagers have specific date restrictions for collection, but many are open-ended. Wagers occasionally exert a powerful galvanizing effect on society and the scientific community.

Notable scientists who have made scientific wagers include Stephen Hawking and Richard Feynman. The Stanford Linear Accelerator Center keeps an open book containing about 35 bets in particle physics dating back to 1980; many remain unresolved.

Notable scientific wagers

  • In 1870, Alfred Russel Wallace bet a flat-Earth theorist named John Hampden that he could prove the flat Earth hypothesis incorrect. The sum staked was £500 (equivalent to about £47000 in present-day terms[1]). A test (now known as the Bedford Level experiment) involving a stretch of the Old Bedford River, in Norfolk, was agreed on: Wallace measured the curvature of the canal's surface using two markers separated by about 5 kilometres (3.1 mi) and suspended at equal heights above the water's surface. Using a telescope mounted 5 km from one of the markers, Wallace established that the nearer one appeared to be the higher of the two. An independent referee agreed that this showed the Earth's surface to curve away from the telescope, and so Wallace won his money. However, Hampden never accepted the result and made increasingly unpleasant threats to Wallace.
  • In 1975, cosmologist Stephen Hawking bet fellow physicist Kip Thorne that Cygnus X-1 would turn out not to be a black hole; the stakes were a subscription to Penthouse magazine for Thorne against four years of Private Eye for Hawking. In 1990, Hawking acknowledged that he had lost the bet. Hawking's explanation for his position was that if black holes didn't actually exist, much of his research would be incorrect, but at least he'd have the consolation of winning the bet.
  • In 1978, chess International Master David Levy won £1250 from four artificial intelligence experts by never losing a match to a chess program in a ten-year span from 1968 to 1978.
  • In 1980, biologist Paul R. Ehrlich bet economist Julian Lincoln Simon that the price of a portfolio of $200 of each of five mineral commodities (copper, chromium, nickel, tin, and tungsten) would rise over the next 10 years. He lost, and paid the amount the total price had declined: $576.07.
  • In 1997, Stephen Hawking and Kip Thorne made a bet with John Preskill on the ultimate resolution of the apparent contradiction between Hawking radiation resulting in loss of information, and a requirement of quantum mechanics that information cannot be destroyed. Hawking and Thorne bet that information must be lost in a black hole; Preskill bet that it must not. The formal wager was: "When an initial pure quantum state undergoes gravitational collapse to form a black hole, the final state at the end of black hole evaporation will always be a pure quantum state". The stake was an encyclopaedia of the winner's choice, from which "information can be recovered at will". Hawking conceded the bet in 2004, giving a baseball encyclopaedia to John Preskill. Thorne has not formally conceded. See: Thorne-Hawking-Preskill bet
  • In 2005, British climate scientist James Annan proposed bets with global warming denialists concerning whether future temperatures would increase. Two Russian solar physicists, Galina Mashnich and Vladimir Bashkirtsev, accepted a US$10,000 wager that the average global temperature during 2012–2017 would be lower than during 1998–2003. Annan had earlier challenged Richard Lindzen directly: Lindzen had been willing to bet that global temperatures would drop over the next 20 years, but Annan says Lindzen wanted odds of 50–1 against falling temperatures, while Lindzen says he asked for 2–1 odds against a temperature rise of over 0.4 °C. Annan and other proponents of global warming state that they have challenged other denialists to bets over global warming that were not accepted, including Annan's attempt in 2005 to accept a bet offered by Patrick Michaels in 1998 that temperatures would be cooler after ten years. Annan made a bet in 2011 with Dr David Whitehouse that the Met Office temperature record would set a new annual high by the end of the year; Annan was declared to have lost on January 13, 2012.
  • In 2005, The Guardian columnist George Monbiot challenged Myron Ebell of the Competitive Enterprise Institute to a £5,000 bet on global warming versus global cooling.
  • In 2012, Stephen Hawking lost $100 to Gordon Kane of the University of Michigan because of the Higgs boson discovery.
  • Zvi Bern has won many bets connected to quantum gravity.
  • On July 8, 2009, at a FQXi conference in the Azores, Antony Garrett Lisi made a public bet with Frank Wilczek that superparticles would not be detected by July 8, 2015. On Aug 16, 2016, after agreeing to a one-year delay to allow for more data collection from the Large Hadron Collider, Frank Wilczek conceded the superparticle bet to Lisi.
  • In 2000, roughly 40 physicists made a bet about the existence of supersymmetry, to be settled in 2011; because the LHC was delayed, the bet was extended to 2016. As of summer 2016 there had been no sign of superparticles, and the losers each delivered "good cognac at a price not less than $100" to the winners.
  • Also in 2016 David Gross lost a separate wager about supersymmetry, but he continues to believe in the theory.
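The Bedford Level wager above comes down to simple geometry: over a chord of length d on a sphere of radius R, the surface bulges above the chord's midpoint by roughly d²/(8R). A minimal sketch of that calculation (Earth's mean radius and the ~5 km span are the only inputs; the function name is my own):

```python
import math

def mid_chord_bulge(d_m: float, R_m: float = 6_371_000.0) -> float:
    """Height of a sphere's surface above the midpoint of a chord of length d.

    For d much smaller than R this is well approximated by d^2 / (8R).
    """
    return d_m ** 2 / (8 * R_m)

# The ~5 km stretch of the Old Bedford River used by Wallace:
bulge = mid_chord_bulge(5_000)
print(f"Expected bulge over 5 km: {bulge:.2f} m")  # roughly 0.5 m
```

Half a metre of curvature is easily resolved through a telescope, which is why the middle marker appeared higher than the far one and the referee could rule for Wallace.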

Black hole thermodynamics

From Wikipedia, the free encyclopedia
 
In physics, black hole thermodynamics is the area of study that seeks to reconcile the laws of thermodynamics with the existence of black-hole event horizons. As the study of the statistical mechanics of black-body radiation led to the advent of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle.

An artist's depiction of two black holes merging, a process in which the laws of thermodynamics are upheld
 

Overview

The second law of thermodynamics requires that black holes have entropy. If black holes carried no entropy, it would be possible to violate the second law by throwing mass into the black hole. The increase of the entropy of the black hole more than compensates for the decrease of the entropy carried by the object that was swallowed. 

Starting from theorems proved by Stephen Hawking, Jacob Bekenstein conjectured that the black-hole entropy was proportional to the area of its event horizon divided by the Planck area. In 1973 Bekenstein suggested ln 2/(8π) as the constant of proportionality, asserting that if the constant was not exactly this, it must be very close to it. The next year, in 1974, Hawking showed that black holes emit thermal Hawking radiation corresponding to a certain temperature (the Hawking temperature). Using the thermodynamic relationship between energy, temperature and entropy, Hawking was able to confirm Bekenstein's conjecture and fix the constant of proportionality at 1/4:

    S_BH = k_B A / (4 ℓ_P²)

where A is the area of the event horizon, k_B is Boltzmann's constant, and ℓ_P = √(Għ/c³) is the Planck length. This is often referred to as the Bekenstein–Hawking formula; the subscript BH stands either for "black hole" or for "Bekenstein–Hawking". The black-hole entropy is proportional to the area A of its event horizon. The fact that the black-hole entropy is also the maximal entropy allowed by the Bekenstein bound (wherein the bound becomes an equality) was the main observation that led to the holographic principle. This area relationship was later generalized to arbitrary regions via the Ryu–Takayanagi formula, which relates the entanglement entropy of a boundary conformal field theory to a specific surface in its dual gravitational theory.
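To get a feel for the scale of the Bekenstein–Hawking entropy, here is a minimal numerical sketch for a one-solar-mass Schwarzschild black hole (SI constants are rounded; the function names are my own):

```python
import math

# Physical constants (SI, rounded)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J s
M_sun = 1.989e30     # solar mass, kg

def horizon_area(M: float) -> float:
    """Area of a Schwarzschild horizon, A = 4*pi*r_s^2 with r_s = 2GM/c^2."""
    r_s = 2 * G * M / c**2
    return 4 * math.pi * r_s**2

def bh_entropy_over_kB(M: float) -> float:
    """Bekenstein-Hawking entropy in units of k_B: S/k_B = A / (4 * l_P^2)."""
    l_P2 = hbar * G / c**3   # Planck length squared
    return horizon_area(M) / (4 * l_P2)

print(f"S/k_B for a solar-mass black hole ~ {bh_entropy_over_kB(M_sun):.2e}")  # of order 1e77
```

The result, of order 10^77, dwarfs the thermal entropy of any ordinary astrophysical object of the same mass, which is one way to see why black holes saturate the Bekenstein bound.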

Although Hawking's calculations gave further thermodynamic evidence for black-hole entropy, until 1995 no one was able to make a controlled calculation of black-hole entropy based on statistical mechanics, which associates entropy with a large number of microstates. In fact, the so-called "no-hair" theorems appeared to suggest that black holes could have only a single microstate. The situation changed in 1995 when Andrew Strominger and Cumrun Vafa calculated the correct Bekenstein–Hawking entropy of a supersymmetric black hole in string theory, using methods based on D-branes and string duality. Their calculation was followed by many similar computations of the entropy of large classes of other extremal and near-extremal black holes, and the result always agreed with the Bekenstein–Hawking formula. However, for the Schwarzschild black hole, viewed as the most far-from-extremal black hole, the relationship between micro- and macrostates has not been characterized. Efforts to develop an adequate answer within the framework of string theory continue.

In loop quantum gravity (LQG) it is possible to give a geometrical interpretation to the microstates: these are the quantum geometries of the horizon. LQG offers a geometric explanation of the finiteness of the entropy and of its proportionality to the area of the horizon. From the covariant formulation of the full quantum theory (spinfoam) it is possible to derive the correct relation between energy and area (the first law), the Unruh temperature, and the distribution that yields the Hawking entropy. The calculation makes use of the notion of a dynamical horizon and is done for non-extremal black holes. The calculation of the Bekenstein–Hawking entropy from the point of view of LQG has also been discussed.

The laws of black hole mechanics

The four laws of black hole mechanics are physical properties that black holes are believed to satisfy. The laws, analogous to the laws of thermodynamics, were discovered by Brandon Carter, Stephen Hawking, and James Bardeen.

Statement of the laws

The laws of black-hole mechanics are expressed in geometrized units (G = c = 1).

The zeroth law

The horizon has constant surface gravity κ for a stationary black hole.

The first law

For perturbations of stationary black holes, the change of energy is related to the changes of area, angular momentum, and electric charge by

    dE = (κ / 8π) dA + Ω dJ + Φ dQ

where E is the energy, κ is the surface gravity, A is the horizon area, Ω is the angular velocity, J is the angular momentum, Φ is the electrostatic potential and Q is the electric charge.
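For an uncharged, non-rotating (Schwarzschild) black hole the first law reduces to dE = (κ/8π) dA, since dJ = dQ = 0. In geometrized units E = M, the horizon area is A = 16πM², and κ = 1/(4M), so the relation can be verified with a finite difference (a toy sketch; the names are my own):

```python
import math

def area(M: float) -> float:
    """Horizon area of a Schwarzschild black hole, geometrized units: A = 16*pi*M^2."""
    return 16 * math.pi * M**2

def surface_gravity(M: float) -> float:
    """Surface gravity kappa = 1/(4M), geometrized units."""
    return 1 / (4 * M)

M, dM = 1.0, 1e-6
dA = area(M + dM) - area(M)
lhs = dM                                        # dE = dM for a Schwarzschild hole
rhs = surface_gravity(M) / (8 * math.pi) * dA   # (kappa / 8*pi) dA
print(abs(lhs - rhs) / lhs)  # tiny: the first law holds to first order in dM
```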

The second law

The horizon area A is, assuming the weak energy condition, a non-decreasing function of time: dA/dt ≥ 0.
This "law" was superseded by Hawking's discovery that black holes radiate, which causes both the black hole's mass and the area of its horizon to decrease over time. 

The third law

It is not possible to form a black hole with vanishing surface gravity. That is, κ = 0 cannot be achieved. 

Discussion of the laws

The zeroth law

The zeroth law is analogous to the zeroth law of thermodynamics, which states that the temperature is constant throughout a body in thermal equilibrium. It suggests that the surface gravity κ is analogous to temperature: T constant for a normal system in thermal equilibrium is analogous to κ constant over the horizon of a stationary black hole. 

The first law

The left side, dE, is the change in energy (proportional to mass). Although the first term, (κ/8π) dA, does not have an immediately obvious physical interpretation, the second and third terms on the right side represent changes in energy due to rotation and electromagnetism. Analogously, the first law of thermodynamics is a statement of energy conservation, which contains on its right side the term T dS.

The second law

The second law is the statement of Hawking's area theorem. Analogously, the second law of thermodynamics states that the change in entropy of an isolated system is greater than or equal to zero for a spontaneous process, suggesting a link between entropy and the area of a black-hole horizon. Taken naively, however, black holes would allow the second law of thermodynamics to be violated: matter falling in carries its entropy out of the exterior universe, decreasing the entropy outside. Generalizing the second law to the sum of the black-hole entropy and the outside entropy shows that the second law of thermodynamics is not violated in a system that includes the universe beyond the horizon. 

The generalized second law of thermodynamics (GSL) was needed to keep the second law valid: on its own, the ordinary second law loses its usefulness once entropy disappears behind black-hole horizons. The GSL restores the law's applicability because the combined total of black-hole entropy and ordinary exterior entropy can still be tracked. Its validity can be established by studying examples, such as a system carrying entropy falling into a larger, stationary black hole, and establishing upper and lower bounds for the increase in the black-hole entropy and the entropy of the system, respectively. The GSL also holds for theories of gravity such as Einstein gravity, Lovelock gravity, and braneworld gravity, because the conditions required to apply it can be met in each.

However, on the topic of black-hole formation, the question becomes whether the generalized second law remains valid, and if so, whether it can be proved valid for all situations. Because black-hole formation is not a stationary process, proving that the GSL holds is difficult. Proving it in general would require quantum-statistical mechanics, because the GSL is both a quantum and a statistical law; a complete treatment of this kind does not yet exist, so the GSL can only be assumed to be useful in general, as well as for prediction. For example, one can use the GSL to predict that, for a cold, non-rotating assembly of nucleons, S_BH > S, where S_BH is the entropy of the black hole and S is the sum of the ordinary entropy.

The third law

Extremal black holes have vanishing surface gravity. Stating that κ cannot go to zero is analogous to the third law of thermodynamics, which states that the entropy of a system at absolute zero is a well-defined constant. This is because a system at zero temperature exists in its ground state. Furthermore, ΔS will reach zero at zero temperature, but S itself will also reach zero, at least for perfect crystalline substances. No experimentally verified violations of the laws of thermodynamics are known yet.

Interpretation of the laws

The four laws of black-hole mechanics suggest that one should identify the surface gravity of a black hole with temperature and the area of the event horizon with entropy, at least up to some multiplicative constants. If one considers black holes only classically, then they have zero temperature and, by the no-hair theorem, zero entropy, and the laws of black-hole mechanics remain an analogy. However, when quantum-mechanical effects are taken into account, one finds that black holes emit thermal radiation (Hawking radiation) at a temperature

    T_H = κ / 2π    (in units with G = c = ħ = k_B = 1).

From the first law of black-hole mechanics, this determines the multiplicative constant of the Bekenstein–Hawking entropy, which in the same units is

    S_BH = A / 4.
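In natural units a Schwarzschild black hole has κ = 1/(4M), hence T_H = 1/(8πM) and S_BH = A/4 = 4πM², and the thermodynamic identity dS = dE/T can be checked numerically (a small finite-difference sketch; the variable names are my own):

```python
import math

def T_hawking(M: float) -> float:
    """Hawking temperature of a Schwarzschild hole: T = kappa/(2*pi) = 1/(8*pi*M)."""
    return 1 / (8 * math.pi * M)

def S_bh(M: float) -> float:
    """Bekenstein-Hawking entropy S = A/4 = 4*pi*M^2 (natural units)."""
    return 4 * math.pi * M**2

M, dM = 2.0, 1e-7
dS_numeric = (S_bh(M + dM) - S_bh(M)) / dM   # dS/dM by finite difference
dS_thermo = 1 / T_hawking(M)                 # dS/dE = 1/T, with E = M
print(abs(dS_numeric - dS_thermo) / dS_thermo)  # ~0: the two agree
```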

Beyond black holes

Gary Gibbons and Hawking have shown that black-hole thermodynamics is more general than black holes—that cosmological event horizons also have an entropy and temperature. 

More fundamentally, 't Hooft and Susskind used the laws of black-hole thermodynamics to argue for a general holographic principle of nature, which asserts that consistent theories of gravity and quantum mechanics must be lower-dimensional. Though not yet fully understood in general, the holographic principle is central to theories like the AdS/CFT correspondence.

There are also connections between black-hole entropy and fluid surface tension.

Gerard 't Hooft

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Gerard_'t_Hooft
 
Born: July 5, 1946 (age 73), Den Helder, Netherlands
Nationality: Dutch
Alma mater: Utrecht University
Known for: Quantum field theory, quantum gravity, 't Hooft–Polyakov monopole, 't Hooft symbol, 't Hooft operator, holographic principle, renormalization, dimensional regularization
Awards: Heineman Prize (1979), Wolf Prize (1981), Lorentz Medal (1986), Spinoza Prize (1995), Franklin Medal (1995), Nobel Prize in Physics (1999), Lomonosov Gold Medal (2010)

Scientific career
Fields: Theoretical physics
Institutions: Utrecht University
Doctoral advisor: Martinus J. G. Veltman
Doctoral students: Robbert Dijkgraaf, Herman Verlinde

Gerardus (Gerard) 't Hooft (Dutch: [ˈɣeːrɑrt ət ˈɦoːft]; born July 5, 1946) is a Dutch theoretical physicist and professor at Utrecht University, the Netherlands. He shared the 1999 Nobel Prize in Physics with his thesis advisor Martinus J. G. Veltman "for elucidating the quantum structure of electroweak interactions".

His work concentrates on gauge theory, black holes, quantum gravity and fundamental aspects of quantum mechanics. His contributions to physics include a proof that gauge theories are renormalizable, dimensional regularization and the holographic principle.

Personal life

He is married to Albertha Schik (Betteke) and has two daughters, Saskia and Ellen.

Biography

Early life

Gerard 't Hooft was born in Den Helder on July 5, 1946, but grew up in The Hague. He was the middle child of a family of three. He comes from a family of scholars. His grandmother was a sister of Nobel prize laureate Frits Zernike, and was married to Pieter Nicolaas van Kampen, a well-known professor of zoology at Leiden University. His uncle Nico van Kampen was an (emeritus) professor of theoretical physics at Utrecht University, and while his mother did not opt for a scientific career because of her gender, she did marry a maritime engineer. Following in his family's footsteps, he showed interest in science at an early age. When his primary school teacher asked him what he wanted to be when he grew up, he boldly declared, "a man who knows everything."

After primary school Gerard attended the Dalton Lyceum, a school that applied the ideas of the Dalton Plan, an educational method that suited him well. He easily passed his science and mathematics courses, but struggled with his language courses. Nonetheless, he passed his classes in English, French, German, classical Greek and Latin. At the age of sixteen he earned a silver medal in the second Dutch Math Olympiad.

Education

After Gerard 't Hooft passed his high school exams in 1964, he enrolled in the physics program at Utrecht University. He opted for Utrecht instead of the much closer Leiden because his uncle was a professor there and he wanted to attend his lectures. Because he was so focused on science, his father insisted that he join the Utrechtsch Studenten Corps, a student association, in the hope that he would do something else besides studying. This worked to some extent: during his studies he was a coxswain with their rowing club "Triton" and organized a national congress for science students with their science discussion club "Christiaan Huygens". 

In the course of his studies he decided he wanted to go into what he perceived as the heart of theoretical physics, elementary particles. His uncle had grown to dislike the subject, and in particular its practitioners, so when it became time to write his 'doctoraalscriptie' (the Dutch equivalent of a master's thesis) in 1968, 't Hooft turned to the newly appointed professor Martinus Veltman, who specialized in Yang–Mills theory, a relatively fringe subject at the time because it was thought that such theories could not be renormalized. His assignment was to study the Adler–Bell–Jackiw anomaly, a mismatch in the theory of the decay of neutral pions: formal arguments forbade the decay into photons, whereas practical calculations and experiments showed that this was the primary form of decay. The resolution of the problem was completely unknown at the time, and 't Hooft was unable to provide one. 

In 1969, 't Hooft started on his doctoral research with Martinus Veltman as his advisor. He would work on the same subject Veltman was working on, the renormalization of Yang–Mills theories. In 1971 his first paper was published. In it he showed how to renormalize massless Yang–Mills fields, and was able to derive relations between amplitudes that would be generalized by Andrei Slavnov and John C. Taylor and become known as the Slavnov–Taylor identities.

The world took little notice, but Veltman was excited because he saw that the problem he had been working on was solved. A period of intense collaboration followed in which they developed the technique of dimensional regularization. Soon 't Hooft's second paper was ready to be published, in which he showed that Yang–Mills theories with massive fields due to spontaneous symmetry breaking could be renormalized. This paper earned them worldwide recognition, and would ultimately earn the pair the 1999 Nobel Prize in Physics. 

These two papers formed the basis of 't Hooft's dissertation, The Renormalization procedure for Yang–Mills Fields, and he obtained his PhD degree in 1972. In the same year he married his wife, Albertha A. Schik, a student of medicine in Utrecht.

Career

Gerard 't Hooft at Harvard
 
After obtaining his doctorate 't Hooft went to CERN in Geneva, where he had a fellowship. He further refined his methods for Yang–Mills theories with Veltman (who went back to Geneva). In this time he became interested in the possibility that the strong interaction could be described as a massless Yang–Mills theory, i.e. one of a type that he had just proved to be renormalizable and hence be susceptible to detailed calculation and comparison with experiment. 

According to 't Hooft's calculations, this type of theory possessed exactly the kind of scaling property (asymptotic freedom) that deep inelastic scattering experiments required. This was contrary to the prevailing perception of Yang–Mills theories at the time, namely that, like gravitation and electrodynamics, their intensity should decrease with increasing distance between the interacting particles; such conventional behaviour with distance could not explain the results of deep inelastic scattering, whereas 't Hooft's calculations could. 

When 't Hooft mentioned his results at a small conference at Marseilles in 1972, Kurt Symanzik urged him to publish this result; but 't Hooft did not, and the result was eventually rediscovered and published by Hugh David Politzer, David Gross, and Frank Wilczek in 1973, which led to their earning the 2004 Nobel Prize in Physics.

In 1974, 't Hooft returned to Utrecht, where he became assistant professor. In 1976, he was invited for a guest position at Stanford and a position at Harvard as Morris Loeb Lecturer. His eldest daughter, Saskia Anne, was born in Boston, while his second daughter, Ellen Marga, was born in 1978 after he had returned to Utrecht, where he was made full professor. In the academic year 1987–1988, 't Hooft spent a sabbatical in the Boston University Physics Department, along with Howard Georgi, Robert Jaffe and others, arranged by the then-new department chair Lawrence Sulak.

In 2007 't Hooft became editor-in-chief for Foundations of Physics, where he sought to distance the journal from the controversy of ECE theory. 't Hooft held the position until 2016.

On July 1, 2011 he was appointed Distinguished professor by Utrecht University.

Honors

In 1999 't Hooft shared the Nobel Prize in Physics with his thesis adviser Veltman for "elucidating the quantum structure of the electroweak interactions in physics". Before that time his work had already been recognized by other notable awards. In 1981, he was awarded the Wolf Prize, possibly the most prestigious prize in physics after the Nobel Prize. Five years later he received the Lorentz Medal, awarded every four years in recognition of the most important contributions in theoretical physics. In 1995, he was one of the first recipients of the Spinozapremie, the highest award available to scientists in the Netherlands. In the same year he was also honoured with a Franklin Medal.

Since his Nobel Prize, 't Hooft has received a slew of awards, honorary doctorates and honorary professorships. He was knighted commander in the Order of the Netherlands Lion, and officer in the French Legion of Honor. The asteroid 9491 Thooft has been named in his honor, and he has written a constitution for its future inhabitants.

He has been a member of the Royal Netherlands Academy of Arts and Sciences (KNAW) since 1982, and was made academy professor there in 2003. He is also a foreign member of many other science academies, including the French Académie des sciences, the American National Academy of Sciences, the American Academy of Arts and Sciences, and the Britain-and-Ireland-based Institute of Physics.

Research

't Hooft's research interests can be divided into three main directions: 'gauge theories in elementary particle physics', 'quantum gravity and black holes', and 'foundational aspects of quantum mechanics'.

Gauge theories in elementary particle physics

't Hooft is most famous for his contributions to the development of gauge theories in particle physics. The best known of these is the proof in his PhD thesis that Yang–Mills theories are renormalizable, for which he shared the 1999 Nobel Prize in Physics. For this proof he introduced (with his adviser Veltman) the technique of dimensional regularization.

After his PhD, he became interested in the role of gauge theories in the strong interaction, the leading theory of which is called quantum chromodynamics or QCD. Much of his research focused on the problem of color confinement in QCD, i.e. the observational fact that only color neutral particles are observed at low energies. This led him to the discovery that SU(N) gauge theories simplify in the large N limit, a fact which has proved important in the examination of the conjectured correspondence between string theories in an Anti-de Sitter space and conformal field theories in one lower dimension. By solving the theory in one space and one time dimension, 't Hooft was able to derive a formula for the masses of mesons.

He also studied the role of so-called instanton contributions in QCD. His calculation showed that these contributions lead to an interaction between light quarks at low energies not present in the normal theory. Studying instanton solutions of Yang–Mills theories, 't Hooft discovered that spontaneously breaking a theory with SU(N) symmetry to a U(1) symmetry will lead to the existence of magnetic monopoles. These monopoles are called 't Hooft–Polyakov monopoles, after Alexander Polyakov, who independently obtained the same result.

As another piece in the color confinement puzzle 't Hooft introduced 't Hooft operators, which are the magnetic dual of Wilson loops. Using these operators he was able to classify different phases of QCD, which form the basis of the QCD phase diagram.

In 1986, he was finally able to show that instanton contributions solve the Adler–Bell–Jackiw anomaly, the topic of his master's thesis.

Quantum gravity and black holes

When Veltman and 't Hooft moved to CERN after 't Hooft obtained his PhD, Veltman's attention was drawn to the possibility of applying their dimensional regularization techniques to the problem of quantizing gravity. Although it was known that perturbative quantum gravity was not completely renormalizable, they felt that important lessons were to be learned by studying the formal renormalization of the theory order by order. This work was continued by Stanley Deser and another PhD student of Veltman, Peter van Nieuwenhuizen, who later found patterns in the renormalization counterterms that led to the discovery of supergravity.

In the 1980s, 't Hooft's attention was drawn to the subject of gravity in 3 spacetime dimensions. Together with Deser and Jackiw he published an article in 1984 describing the dynamics of flat space where the only local degrees of freedom were propagating point defects. His attention returned to this model at various points in time, showing that Gott pairs would not cause causality violating timelike loops, and showing how the model could be quantized. More recently he proposed generalizing this piecewise flat model of gravity to 4 spacetime dimensions.

With Stephen Hawking's discovery of Hawking radiation from black holes, it appeared that the evaporation of these objects violated a fundamental property of quantum mechanics, unitarity. 't Hooft refused to accept this conclusion, known as the black hole information paradox, and assumed that it must be an artifact of Hawking's semi-classical treatment, one that should not appear in a full theory of quantum gravity. He proposed that it might be possible to study some of the properties of such a theory by assuming that it is unitary.

Using this approach he has argued that near a black hole, quantum fields could be described by a theory in a lower dimension. This led to the introduction of the holographic principle by him and Leonard Susskind.

Fundamental aspects of quantum mechanics

't Hooft has "deviating views on the physical interpretation of quantum theory". He believes that there could be a deterministic explanation underlying quantum mechanics. Using a speculative model he has argued that such a theory could avoid the usual Bell inequality arguments that would disallow such a local hidden variable theory. In 2016 he published a book length exposition of his ideas which, according to 't Hooft, has encountered mixed reactions.


Fundamental interaction

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Fundamental_interaction
 
In physics, the fundamental interactions, also known as fundamental forces, are the interactions that do not appear to be reducible to more basic interactions. There are four fundamental interactions known to exist: the gravitational and electromagnetic interactions, which produce significant long-range forces whose effects can be seen directly in everyday life, and the strong and weak interactions, which produce forces at minuscule, subatomic distances and govern nuclear interactions. Some scientists hypothesize that a fifth force might exist, but the hypotheses remain speculative.

Each of the known fundamental interactions can be described mathematically as a field. The gravitational force is attributed to the curvature of spacetime, described by Einstein's general theory of relativity. The other three are discrete quantum fields, and their interactions are mediated by elementary particles described by the Standard Model of particle physics.

Within the Standard Model, the strong interaction is carried by a particle called the gluon, and is responsible for quarks binding together to form hadrons, such as protons and neutrons. As a residual effect, it creates the nuclear force that binds the latter particles to form atomic nuclei. The weak interaction is carried by particles called W and Z bosons, and also acts on the nucleus of atoms, mediating radioactive decay. The electromagnetic force, carried by the photon, creates electric and magnetic fields, which are responsible for the attraction between orbital electrons and atomic nuclei which holds atoms together, as well as chemical bonding and electromagnetic waves, including visible light, and forms the basis for electrical technology. Although the electromagnetic force is far stronger than gravity, it tends to cancel itself out within large objects, so over large distances (on the scale of planets and galaxies), gravity tends to be the dominant force.

Many theoretical physicists believe these fundamental forces to be related and to become unified into a single force at very high energies on a minuscule scale, the Planck scale, but particle accelerators cannot produce the enormous energies required to experimentally probe this. Devising a common theoretical framework that would explain the relation between the forces in a single theory is perhaps the greatest goal of today's theoretical physicists. The weak and electromagnetic forces have already been unified with the electroweak theory of Sheldon Glashow, Abdus Salam, and Steven Weinberg for which they received the 1979 Nobel Prize in physics. Progress is currently being made in uniting the electroweak and strong fields within what is called a Grand Unified Theory (GUT). A bigger challenge is to find a way to quantize the gravitational field, resulting in a theory of quantum gravity (QG) which would unite gravity in a common theoretical framework with the other three forces. Some theories, notably string theory, seek both QG and GUT within one framework, unifying all four fundamental interactions along with mass generation within a theory of everything (ToE).

The four fundamental interactions of nature

(The weak and electromagnetic interactions are two aspects of the electroweak interaction; the strong interaction appears both as a fundamental force among quarks and as a residual nuclear force among hadrons.)

  • Gravitation: mediated by a particle not yet observed (the graviton is hypothesised); affects all particles; acts on mass and energy; bound states formed: planets, stars, solar systems, galaxies; strength relative to electromagnetism: 10^−41 (predicted) at the scale of quarks, 10^−36 (predicted) at the scale of protons/neutrons.
  • Weak: mediated by the W+, W− and Z0 bosons; affects left-handed fermions; acts on flavor; forms no bound states; relative strength: 10^−4 at the quark scale, 10^−7 at the proton/neutron scale.
  • Electromagnetic: mediated by the photon (γ); affects electrically charged particles; acts on electric charge; bound states formed: atoms, molecules; relative strength: 1 (the reference) at both scales.
  • Strong (fundamental): mediated by gluons; affects quarks and gluons; acts on color charge; bound states formed: hadrons; relative strength: 60 at the quark scale (not applicable to hadrons).
  • Strong (residual): mediated by π, ρ and ω mesons; affects hadrons; bound states formed: atomic nuclei; relative strength: 20 at the proton/neutron scale (not applicable to quarks).

History

Classical theory

In his 1687 theory, Isaac Newton postulated space as an infinite and unalterable physical structure existing before, within, and around all objects, while their states and relations unfold at a constant pace everywhere; thus absolute space and time. Finding that all objects bearing mass approach at a constant rate but collide with impact proportional to their masses, Newton inferred that matter exhibits an attractive force. His law of universal gravitation mathematically stated it to span the entire universe instantly (despite absolute time), or, if not actually a force, to be instant interaction among all objects (despite absolute space). As conventionally interpreted, Newton's theory of motion modelled a central force without a communicating medium. Thus Newton's theory violated the first principle of mechanical philosophy, as stated by Descartes: no action at a distance. Conversely, during the 1820s, when explaining magnetism, Michael Faraday inferred a field filling space and transmitting that force. Faraday conjectured that ultimately all forces unified into one.

In 1873, James Clerk Maxwell unified electricity and magnetism as effects of an electromagnetic field whose third consequence was light, travelling at constant speed in a vacuum. The electromagnetic field theory contradicted predictions of Newton's theory of motion, unless physical states of the luminiferous aether (presumed to fill all space, whether within matter or in a vacuum, and to manifest the electromagnetic field) aligned all phenomena and thereby held valid the Newtonian principle of relativity, or invariance.

The Standard Model

The Standard Model of elementary particles, with the fermions in the first three columns, the gauge bosons in the fourth column, and the Higgs boson in the fifth column
 
The Standard Model of particle physics was developed throughout the latter half of the 20th century. In the Standard Model, the electromagnetic, strong, and weak interactions associate with elementary particles, whose behaviours are modelled in quantum mechanics (QM). For predictive success with QM's probabilistic outcomes, particle physics conventionally models QM events across a field set to special relativity, altogether relativistic quantum field theory (QFT). Force particles, called gauge bosons (force carriers or messenger particles of underlying fields), interact with matter particles, called fermions. Everyday matter is atoms, composed of three fermion types: up-quarks and down-quarks constituting, as well as electrons orbiting, the atom's nucleus. Atoms interact, form molecules, and manifest further properties through electromagnetic interactions among their electrons absorbing and emitting photons, the electromagnetic field's force carrier, which if unimpeded traverse potentially infinite distance. Electromagnetism's QFT is quantum electrodynamics (QED).

The electromagnetic interaction was modelled with the weak interaction, whose force carriers are W and Z bosons, traversing a minuscule distance, in electroweak theory (EWT). Electroweak interaction would operate at such high temperatures as occurred soon after the presumed Big Bang but, as the early universe cooled, split into electromagnetic and weak interactions. The strong interaction, whose force carrier is the gluon, traversing a minuscule distance among quarks, is modeled in quantum chromodynamics (QCD). EWT, QCD, and the Higgs mechanism, whereby the Higgs field manifests Higgs bosons that interact with some quantum particles and thereby endow those particles with mass, comprise particle physics' Standard Model (SM). Predictions are usually made using calculational approximation methods, although such perturbation theory is inadequate to model some experimental observations (for instance, bound states and solitons). Still, physicists widely accept the Standard Model as science's most experimentally confirmed theory.

Beyond the Standard Model, some theorists work to unite the electroweak and strong interactions within a Grand Unified Theory (GUT). Some attempts at GUTs hypothesize "shadow" particles, such that every known matter particle associates with an undiscovered force particle, and vice versa, altogether supersymmetry (SUSY). Other theorists seek to quantize the gravitational field by modelling the behaviour of its hypothetical force carrier, the graviton, and so achieve quantum gravity (QG). One approach to QG is loop quantum gravity (LQG). Still other theorists seek both QG and GUT within one framework, reducing all four fundamental interactions to a Theory of Everything (ToE). The most prevalent aim at a ToE is string theory, although to model matter particles, it added SUSY to force particles, and so, strictly speaking, became superstring theory. Multiple, seemingly disparate superstring theories were unified on a backbone, M-theory. Theories beyond the Standard Model remain highly speculative, lacking strong experimental support.

Overview of the fundamental interactions

An overview of the various families of elementary and composite particles, and the theories describing their interactions. Fermions are on the left, and bosons are on the right.
 
In the conceptual model of fundamental interactions, matter consists of fermions, which carry properties called charges and spin ±1/2 (intrinsic angular momentum ±ħ/2, where ħ is the reduced Planck constant). They attract or repel each other by exchanging bosons.

The interaction of any pair of fermions in perturbation theory can then be modelled thus:
Two fermions go in → interaction by boson exchange → Two changed fermions go out.
The exchange of bosons always carries energy and momentum between the fermions, thereby changing their speed and direction. The exchange may also transport a charge between the fermions, changing the charges of the fermions in the process (e.g., turning them from one type of fermion to another). Since bosons carry one unit of angular momentum, the fermion's spin direction will flip from +1/2 to −1/2 (or vice versa) during such an exchange (in units of the reduced Planck constant).
Because an interaction results in fermions attracting and repelling each other, an older term for "interaction" is force.

According to the present understanding, there are four fundamental interactions or forces: gravitation, electromagnetism, the weak interaction, and the strong interaction. Their magnitude and behaviour vary greatly, as described in the table below. Modern physics attempts to explain every observed physical phenomenon by these fundamental interactions. Moreover, reducing the number of different interaction types is seen as desirable. Two cases in point are the unification of:
  • the electric and magnetic forces into electromagnetism;
  • the electromagnetic interaction and the weak interaction into the electroweak interaction.
Both magnitude ("relative strength") and "range", as given in the table, are meaningful only within a rather complex theoretical framework. The table below lists properties of a conceptual scheme that is still the subject of ongoing research.

Interaction      Current theory                 Mediators                 Relative strength  Long-distance behavior  Range (m)
Weak             Electroweak theory (EWT)       W and Z bosons            10^25              exponential decay       10^−18
Strong           Quantum chromodynamics (QCD)   gluons                    10^38              color confinement       10^−15
Electromagnetic  Quantum electrodynamics (QED)  photons                   10^36              1/r^2                   ∞
Gravitation      General relativity (GR)        gravitons (hypothetical)  1                  1/r^2                   ∞
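The "relative strength" entries can be sanity-checked with a back-of-the-envelope calculation. A minimal sketch, comparing the Coulomb and Newtonian forces between two protons (the distance cancels, since both forces fall off as 1/r²):

```python
# Ratio of the electrostatic to the gravitational force between two
# protons. CODATA constants in SI units.
k_e = 8.9875517923e9      # Coulomb constant, N·m²/C²
G   = 6.67430e-11         # gravitational constant, N·m²/kg²
e   = 1.602176634e-19     # elementary charge, C
m_p = 1.67262192e-27      # proton mass, kg

# Both forces scale as 1/r², so the separation drops out of the ratio.
ratio = (k_e * e**2) / (G * m_p**2)
print(f"{ratio:.2e}")     # ~1.2e36: electromagnetism vs. gravity
```

The result, about 10^36, matches the gap between the electromagnetic and gravitational rows in the table.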

The modern (perturbative) quantum mechanical view of the fundamental forces other than gravity is that particles of matter (fermions) do not directly interact with each other, but rather carry a charge, and exchange virtual particles (gauge bosons), which are the interaction carriers or force mediators. For example, photons mediate the interaction of electric charges, and gluons mediate the interaction of color charges.

The interactions

Gravity

Gravitation is by far the weakest of the four interactions at the atomic scale, where electromagnetic interactions dominate. But the idea that the weakness of gravity can easily be demonstrated by suspending a pin using a simple magnet (such as a refrigerator magnet) is fundamentally flawed. The only reason the magnet is able to hold the pin against the gravitational pull of the entire Earth is its relative proximity. There is clearly a short distance of separation between magnet and pin where a breaking point is reached, and due to the large mass of Earth this distance is quite small.

Thus gravitation is very important for macroscopic objects and over macroscopic distances for the following reasons. Gravitation:
  • Is the only interaction that acts on all particles having mass, energy and/or momentum
  • Has an infinite range, like electromagnetism but unlike strong and weak interaction
  • Cannot be absorbed, transformed, or shielded against
  • Always attracts and never repels (see function of geodesic equation in general relativity)
Even though electromagnetism is far stronger than gravitation, electrostatic attraction is not relevant for large celestial bodies, such as planets, stars, and galaxies, simply because such bodies contain equal numbers of protons and electrons and so have a net electric charge of zero. Nothing "cancels" gravity, since it is only attractive, unlike electric forces, which can be attractive or repulsive; every object having mass is subject to this purely attractive force. Therefore, only gravitation matters on the large-scale structure of the universe.

The long range of gravitation makes it responsible for such large-scale phenomena as the structure of galaxies and black holes and it retards the expansion of the universe. Gravitation also explains astronomical phenomena on more modest scales, such as planetary orbits, as well as everyday experience: objects fall; heavy objects act as if they were glued to the ground, and animals can only jump so high.

Gravitation was the first interaction to be described mathematically. In ancient times, Aristotle hypothesized that objects of different masses fall at different rates. During the Scientific Revolution, Galileo Galilei experimentally determined that this hypothesis is wrong under certain circumstances: neglecting the friction due to air resistance, and buoyancy forces if an atmosphere is present (compare a dropped air-filled balloon with a water-filled balloon), all objects accelerate toward the Earth at the same rate. Isaac Newton's law of universal gravitation (1687) was a good approximation of the behaviour of gravitation. Our present-day understanding of gravitation stems from Einstein's general theory of relativity of 1915, a more accurate description of gravitation (especially for cosmological masses and distances) in terms of the geometry of spacetime.

Merging general relativity and quantum mechanics (or quantum field theory) into a more general theory of quantum gravity is an area of active research. It is hypothesized that gravitation is mediated by a massless spin-2 particle called the graviton.

Although general relativity has been experimentally confirmed (at least for weak fields) on all but the smallest scales, there are rival theories of gravitation. Those taken seriously by the physics community all reduce to general relativity in some limit, and the focus of observational work is to establish limitations on what deviations from general relativity are possible.

Proposed extra dimensions could explain why the gravity force is so weak.

Electroweak interaction

Electromagnetism and weak interaction appear to be very different at everyday low energies. They can be modelled using two different theories. However, above unification energy, on the order of 100 GeV, they would merge into a single electroweak force.

Electroweak theory is very important for modern cosmology, particularly for how the universe evolved. Shortly after the Big Bang, when the temperature was above approximately 10^15 K, the electromagnetic force and the weak force were merged into a combined electroweak force.
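The ~10^15 K figure follows directly from the ~100 GeV unification energy via the relation E = k_B·T. A quick numerical check:

```python
# Converting the electroweak unification energy (~100 GeV) to a
# temperature via E = k_B * T, showing why unification corresponds
# to roughly 10^15 K.
k_B = 8.617333262e-5      # Boltzmann constant, eV/K
E_ew = 100e9              # ~100 GeV expressed in eV
T = E_ew / k_B
print(f"{T:.1e} K")       # ~1.2e15 K
```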

For contributions to the unification of the weak and electromagnetic interaction between elementary particles, Abdus Salam, Sheldon Glashow and Steven Weinberg were awarded the Nobel Prize in Physics in 1979.

Electromagnetism

Electromagnetism is the force that acts between electrically charged particles. This phenomenon includes the electrostatic force acting between charged particles at rest, and the combined effect of electric and magnetic forces acting between charged particles moving relative to each other. 

Electromagnetism has infinite range like gravity, but is vastly stronger than it, and therefore describes a number of macroscopic phenomena of everyday experience such as friction, rainbows, lightning, and all human-made devices using electric current, such as televisions, lasers, and computers. Electromagnetism fundamentally determines all macroscopic, and many atomic-level, properties of the chemical elements, including all chemical bonding.

In a four kilogram (~1 gallon) jug of water there are

    (4000 g / 18 g/mol) × 10 electrons per molecule × N_A × e ≈ 2.1 × 10^8 C

of total electron charge. Thus, if we place two such jugs a meter apart, the electrons in one of the jugs repel those in the other jug with a force of

    (1 / 4πε₀) × (2.1 × 10^8 C)² / (1 m)² ≈ 4.1 × 10^26 N.

This force is larger than the planet Earth would weigh if weighed on another Earth. The atomic nuclei in one jug also repel those in the other with the same force. However, these repulsive forces are canceled by the attraction of the electrons in jug A with the nuclei in jug B and the attraction of the nuclei in jug A with the electrons in jug B, resulting in no net force. Electromagnetic forces are tremendously stronger than gravity but cancel out so that for large bodies gravity dominates.
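The jug calculation above can be reproduced numerically; a short sketch using standard physical constants:

```python
# Order-of-magnitude check of the water-jug example: total electron
# charge in 4 kg of water, and the Coulomb repulsion between two such
# charges one metre apart.
N_A = 6.02214076e23       # Avogadro constant, 1/mol
e   = 1.602176634e-19     # elementary charge, C
k_e = 8.9875517923e9      # Coulomb constant, N·m²/C²

moles_h2o = 4000 / 18.0            # 4 kg of water, molar mass ~18 g/mol
electrons = moles_h2o * 10 * N_A   # 10 electrons per H2O molecule
Q = electrons * e                  # total electron charge, ~2.1e8 C

F = k_e * Q**2 / 1.0**2            # Coulomb force at 1 m separation
print(f"Q = {Q:.1e} C, F = {F:.1e} N")   # ~4e26 N
```

For comparison, the Earth's weight in its own surface gravity (M·g ≈ 6 × 10^24 kg × 9.8 m/s²) is about 6 × 10^25 N, so the computed repulsion indeed exceeds it.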

Electrical and magnetic phenomena have been observed since ancient times, but it was only in the 19th century that James Clerk Maxwell discovered that electricity and magnetism are two aspects of the same fundamental interaction. By 1864, Maxwell's equations had rigorously quantified this unified interaction. Maxwell's theory, restated using vector calculus, is the classical theory of electromagnetism, suitable for most technological purposes.

The constant speed of light in a vacuum (customarily denoted by a lowercase c) can be derived from Maxwell's equations, which are consistent with the theory of special relativity. Einstein's 1905 theory of special relativity, which flows from the observation that the speed of light is constant no matter how fast the observer is moving, showed that the theoretical result implied by Maxwell's equations has profound implications far beyond electromagnetism for the very nature of time and space.
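The derivation of c from Maxwell's equations reduces numerically to c = 1/√(μ₀ε₀); a quick check with SI values:

```python
# The speed of light from the vacuum permeability and permittivity,
# as it emerges from Maxwell's wave equation: c = 1/sqrt(mu0 * eps0).
import math
mu0  = 4 * math.pi * 1e-7   # vacuum permeability, N/A² (classical exact value)
eps0 = 8.8541878128e-12     # vacuum permittivity, F/m
c = 1 / math.sqrt(mu0 * eps0)
print(f"{c:.6e} m/s")       # ~2.998e8 m/s
```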

In another work that departed from classical electromagnetism, Einstein also explained the photoelectric effect by utilizing Max Planck's discovery that light is transmitted in 'quanta' of specific energy content based on its frequency, which we now call photons. Starting around 1927, Paul Dirac combined quantum mechanics with the relativistic theory of electromagnetism. Further work in the 1940s by Richard Feynman, Freeman Dyson, Julian Schwinger, and Sin-Itiro Tomonaga completed this theory, which is now called quantum electrodynamics, the revised theory of electromagnetism. Quantum electrodynamics and quantum mechanics provide a theoretical basis for electromagnetic behavior such as quantum tunneling, in which a certain percentage of electrically charged particles move in ways that would be impossible under the classical electromagnetic theory; this behavior is necessary for everyday electronic devices such as transistors to function.

Weak interaction

The weak interaction or weak nuclear force is responsible for some nuclear phenomena such as beta decay. Electromagnetism and the weak force are now understood to be two aspects of a unified electroweak interaction — this discovery was the first step toward the unified theory known as the Standard Model. In the theory of the electroweak interaction, the carriers of the weak force are the massive gauge bosons called the W and Z bosons. The weak interaction is the only known interaction which does not conserve parity; it is left-right asymmetric. The weak interaction even violates CP symmetry but does conserve CPT.

Strong interaction

The strong interaction, or strong nuclear force, is the most complicated interaction, mainly because of the way it varies with distance. At distances greater than about 10 femtometers, the strong force is practically unobservable; it effectively acts only within the atomic nucleus.

After the nucleus was discovered in 1911, it was clear that a new force, today known as the nuclear force, was needed to overcome the electrostatic repulsion, a manifestation of electromagnetism, of the positively charged protons. Otherwise, the nucleus could not exist. Moreover, the force had to be strong enough to squeeze the protons into a volume whose diameter is about 10^−15 m, much smaller than that of the entire atom. From the short range of this force, Hideki Yukawa predicted that it was associated with a massive particle, whose mass is approximately 100 MeV/c².
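Yukawa's mass estimate follows from the uncertainty relation: a force of range r is mediated by a particle of mass m ~ ħ/(rc), i.e. mc² ~ ħc/r. A minimal sketch, using the conventional value ħc ≈ 197.3 MeV·fm and a nuclear-force range of about 1.4 fm:

```python
# Yukawa's estimate of the nuclear-force mediator mass from its range:
# mc^2 ~ hbar*c / r.
hbar_c = 197.327          # MeV·fm
r = 1.4                   # fm, approximate range of the nuclear force
mc2 = hbar_c / r
print(f"{mc2:.0f} MeV")   # ~141 MeV, close to the pion mass (~140 MeV)
```

The result lands near the mass of the pion discovered in 1947, vindicating the prediction.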

The 1947 discovery of the pion ushered in the modern era of particle physics. Hundreds of hadrons were discovered from the 1940s to 1960s, and an extremely complicated theory of hadrons as strongly interacting particles was developed. While each of the approaches developed in that era offered deep insights, none led directly to a fundamental theory.

Murray Gell-Mann and, independently, George Zweig first proposed fractionally charged quarks in 1964. Throughout the 1960s, different authors considered theories similar to the modern fundamental theory of quantum chromodynamics (QCD) as simple models for the interactions of quarks. The first to hypothesize the gluons of QCD were Moo-Young Han and Yoichiro Nambu, who introduced the quark color charge and hypothesized that it might be associated with a force-carrying field. At that time, however, it was difficult to see how such a model could permanently confine quarks. Han and Nambu also assigned each quark color an integer electrical charge, so that the quarks were fractionally charged only on average, and they did not expect the quarks in their model to be permanently confined.

In 1971, Murray Gell-Mann and Harald Fritzsch proposed that the Han/Nambu color gauge field was the correct theory of the short-distance interactions of fractionally charged quarks. A little later, David Gross, Frank Wilczek, and David Politzer discovered that this theory had the property of asymptotic freedom, allowing them to make contact with experimental evidence. They concluded that QCD was the complete theory of the strong interactions, correct at all distance scales. The discovery of asymptotic freedom led most physicists to accept QCD since it became clear that even the long-distance properties of the strong interactions could be consistent with experiment if the quarks are permanently confined. 

Assuming that quarks are confined, Mikhail Shifman, Arkady Vainshtein and Valentine Zakharov were able to compute the properties of many low-lying hadrons directly from QCD, with only a few extra parameters to describe the vacuum. In 1980, Kenneth G. Wilson published computer calculations based on the first principles of QCD, establishing, to a level of confidence tantamount to certainty, that QCD will confine quarks. Since then, QCD has been the established theory of the strong interactions. 

QCD is a theory of fractionally charged quarks interacting by means of 8 bosonic particles called gluons. The gluons interact with each other, not just with the quarks, and at long distances the lines of force collimate into strings. In this way, the mathematical theory of QCD not only explains how quarks interact over short distances but also the string-like behavior, discovered by Chew and Frautschi, which they manifest over longer distances. 

Beyond the Standard Model

Numerous theoretical efforts have been made to systematize the existing four fundamental interactions on the model of electroweak unification.

Grand Unified Theories (GUTs) are proposals to show that the three fundamental interactions described by the Standard Model are all different manifestations of a single interaction with symmetries that break down and create separate interactions below some extremely high level of energy. GUTs are also expected to predict some of the relationships between constants of nature that the Standard Model treats as unrelated, as well as predicting gauge coupling unification for the relative strengths of the electromagnetic, weak, and strong forces (this was, for example, verified at the Large Electron–Positron Collider in 1991 for supersymmetric theories).

Theories of everything, which integrate GUTs with a quantum gravity theory, face a greater barrier, because none of the proposed quantum gravity theories, which include string theory, loop quantum gravity, and twistor theory, has secured wide acceptance. Some theories look for a graviton to complete the Standard Model list of force-carrying particles, while others, like loop quantum gravity, emphasize the possibility that time-space itself may have a quantum aspect to it.

Some theories beyond the Standard Model include a hypothetical fifth force, and the search for such a force is an ongoing line of experimental research in physics. In supersymmetric theories, there are particles that acquire their masses only through supersymmetry-breaking effects, and these particles, known as moduli, can mediate new forces. Another reason to look for new forces is the discovery that the expansion of the universe is accelerating (also known as dark energy), giving rise to a need to explain a nonzero cosmological constant, and possibly to other modifications of general relativity. Fifth forces have also been suggested to explain phenomena such as CP violation, dark matter, and dark flow.
