
Thursday, October 8, 2020

Theoretical physics

From Wikipedia, the free encyclopedia
 
Visual representation of a Schwarzschild wormhole. Wormholes have never been observed, but they are predicted to exist through mathematical models and scientific theory.

Theoretical physics is a branch of physics that employs mathematical models and abstractions of physical objects and systems to rationalize, explain and predict natural phenomena. This is in contrast to experimental physics, which uses experimental tools to probe these phenomena.

The advancement of science generally depends on the interplay between experimental studies and theory. In some cases, theoretical physics adheres to standards of mathematical rigour while giving little weight to experiments and observations. For example, while developing special relativity, Albert Einstein was concerned with the Lorentz transformation which left Maxwell's equations invariant, but was apparently uninterested in the Michelson–Morley experiment on Earth's drift through a luminiferous aether. Conversely, Einstein was awarded the Nobel Prize for explaining the photoelectric effect, previously an experimental result lacking a theoretical formulation.

Overview

A physical theory is a model of physical events. It is judged by the extent to which its predictions agree with empirical observations. The quality of a physical theory is also judged on its ability to make new predictions which can be verified by new observations. A physical theory differs from a mathematical theorem in that while both are based on some form of axioms, judgment of mathematical applicability is not based on agreement with any experimental results. A physical theory similarly differs from a mathematical theory, in the sense that the word "theory" has a different meaning in mathematical terms.

A physical theory involves one or more relationships between various measurable quantities. Archimedes realized that a ship floats by displacing its mass of water; Pythagoras understood the relation between the length of a vibrating string and the musical tone it produces. Other examples include entropy as a measure of the uncertainty regarding the positions and motions of unseen particles and the quantum mechanical idea that (action and) energy are not continuously variable.

Theoretical physics consists of several different approaches. In this regard, theoretical particle physics forms a good example. For instance: "phenomenologists" might employ (semi-) empirical formulas and heuristics to agree with experimental results, often without deep physical understanding. "Modelers" (also called "model-builders") often appear much like phenomenologists, but try to model speculative theories that have certain desirable features (rather than being guided by experimental data), or apply the techniques of mathematical modeling to physics problems. Some attempt to create approximate theories, called effective theories, because fully developed theories may be regarded as unsolvable or too complicated. Other theorists may try to unify, formalise, reinterpret or generalise extant theories, or create completely new ones altogether. Sometimes the vision provided by pure mathematical systems can provide clues to how a physical system might be modeled; e.g., the notion, due to Riemann and others, that space itself might be curved. Theoretical problems that need computational investigation are often the concern of computational physics.

Theoretical advances may consist in setting aside old, incorrect paradigms (e.g., aether theory of light propagation, caloric theory of heat, burning consisting of evolving phlogiston, or astronomical bodies revolving around the Earth) or may be an alternative model that provides answers that are more accurate or that can be more widely applied. In the latter case, a correspondence principle will be required to recover the previously known result. Sometimes though, advances may proceed along different paths. For example, an essentially correct theory may need some conceptual or factual revisions; atomic theory, first postulated millennia ago (by several thinkers in Greece and India), and the two-fluid theory of electricity are two cases in point. However, an exception to all the above is the wave–particle duality, a theory combining aspects of different, opposing models via the Bohr complementarity principle.

Relationship between mathematics and physics

Physical theories become accepted if they are able to make correct predictions and no (or few) incorrect ones. The theory should have, at least as a secondary objective, a certain economy and elegance (compare to mathematical beauty), a notion sometimes called "Occam's razor" after the 14th-century English philosopher William of Occam (or Ockham), in which the simpler of two theories that describe the same matter just as adequately is preferred (but conceptual simplicity may mean mathematical complexity). Theories are also more likely to be accepted if they connect a wide range of phenomena. Testing the consequences of a theory is part of the scientific method.

Physical theories can be grouped into three categories: mainstream theories, proposed theories and fringe theories.

History

Theoretical physics began at least 2,300 years ago with Pre-Socratic philosophy, and was continued by Plato and Aristotle, whose views held sway for a millennium. During the rise of medieval universities, the only acknowledged intellectual disciplines were the seven liberal arts: the Trivium (grammar, logic, and rhetoric) and the Quadrivium (arithmetic, geometry, music, and astronomy). During the Middle Ages and Renaissance, the concept of experimental science, the counterpoint to theory, began with scholars such as Ibn al-Haytham and Francis Bacon. As the Scientific Revolution gathered pace, the concepts of matter, energy, space, time and causality slowly began to acquire the form we know today, and other sciences spun off from the rubric of natural philosophy. Thus began the modern era of theory with the Copernican paradigm shift in astronomy, soon followed by Johannes Kepler's expressions for planetary orbits, which summarized the meticulous observations of Tycho Brahe; the works of these men (alongside Galileo's) can perhaps be considered to constitute the Scientific Revolution.

The great push toward the modern concept of explanation started with Galileo, one of the few physicists who was both a consummate theoretician and a great experimentalist. The analytic geometry and mechanics of Descartes were incorporated into the calculus and mechanics of Isaac Newton, another theoretician and experimentalist of the highest order, in his Principia Mathematica. It contained a grand synthesis of the work of Copernicus, Galileo and Kepler, as well as Newton's own theories of mechanics and gravitation, which held sway as worldviews until the early 20th century. Simultaneously, progress was also made in optics (in particular colour theory and the ancient science of geometrical optics), courtesy of Newton, Descartes and the Dutchmen Snell and Huygens. In the 18th and 19th centuries Joseph-Louis Lagrange, Leonhard Euler and William Rowan Hamilton would extend the theory of classical mechanics considerably. They picked up the interactive intertwining of mathematics and physics begun two millennia earlier by Pythagoras.

Among the great conceptual achievements of the 19th and 20th centuries were the consolidation of the idea of energy (as well as its global conservation) by the inclusion of heat, electricity and magnetism, and then light. The laws of thermodynamics, and most importantly the introduction of the singular concept of entropy began to provide a macroscopic explanation for the properties of matter. Statistical mechanics (followed by statistical physics and Quantum statistical mechanics) emerged as an offshoot of thermodynamics late in the 19th century. Another important event in the 19th century was the discovery of electromagnetic theory, unifying the previously separate phenomena of electricity, magnetism and light.


The pillars of modern physics, and perhaps the most revolutionary theories in the history of physics, have been relativity theory and quantum mechanics. Newtonian mechanics was subsumed under special relativity and Newton's gravity was given a kinematic explanation by general relativity. Quantum mechanics led to an understanding of blackbody radiation (which indeed was an original motivation for the theory) and of anomalies in the specific heats of solids, and finally to an understanding of the internal structures of atoms and molecules. Quantum mechanics soon gave way to the formulation of quantum field theory (QFT), begun in the late 1920s. In the aftermath of World War II, renewed interest and further progress revived QFT, which had stagnated after the early efforts. The same period also saw fresh attacks on the problems of superconductivity and phase transitions, as well as the first applications of QFT to theoretical condensed matter physics. The 1960s and 70s saw the formulation of the Standard Model of particle physics using QFT and progress in condensed matter physics (theoretical foundations of superconductivity and critical phenomena, among others), in parallel with the application of relativity to problems in astronomy and cosmology.

All of these achievements depended on theoretical physics as a moving force both to suggest experiments and to consolidate results, often by ingenious application of existing mathematics or, as in the case of Descartes and Newton (with Leibniz), by inventing new mathematics. Fourier's studies of heat conduction led to a new branch of mathematics: infinite, orthogonal series.

Modern theoretical physics attempts to unify theories and explain phenomena in further attempts to understand the Universe, from the cosmological to the elementary particle scale. Where experimentation cannot be done, theoretical physics still tries to advance through the use of mathematical models.

Mainstream theories

Mainstream theories (sometimes referred to as central theories) form the accepted body of scientific knowledge; they meet the usual scientific standards of repeatability, consistency with existing well-established science, and experimental support. There are also mainstream theories that are generally accepted solely because their effects explain a wide variety of data, even though the detection, explanation, and possible composition of what they describe remain subjects of debate.

Proposed theories

Proposed theories of physics are usually relatively new theories; they bring with them scientific approaches, means for determining the validity of models, and new types of reasoning used to arrive at the theory. However, some proposed theories have been around for decades and have eluded methods of discovery and testing. Proposed theories can include fringe theories in the process of becoming established (and, sometimes, gaining wider acceptance). Proposed theories usually have not been tested.

Fringe theories

Fringe theories include any new area of scientific endeavor in the process of becoming established, as well as some proposed theories and speculative sciences. They include physics fields and physical theories presented in accordance with known evidence, together with a body of associated predictions made according to the theory.

Some fringe theories go on to become a widely accepted part of physics. Other fringe theories end up being disproven. Some fringe theories are a form of protoscience and others are a form of pseudoscience. The falsification of the original theory sometimes leads to reformulation of the theory.

Thought experiments vs real experiments

"Thought" experiments are situations created in one's mind, asking a question akin to "suppose you are in this situation, assuming such is true, what would follow?". They are usually created to investigate phenomena that are not readily experienced in every-day situations. Famous examples of such thought experiments are Schrödinger's cat, the EPR thought experiment, simple illustrations of time dilation, and so on. These usually lead to real experiments designed to verify that the conclusion (and therefore the assumptions) of the thought experiments are correct. The EPR thought experiment led to the Bell inequalities, which were then tested to various degrees of rigor, leading to the acceptance of the current formulation of quantum mechanics and probabilism as a working hypothesis.

 

Sheldon Lee Glashow

From Wikipedia, the free encyclopedia
 
Born: December 5, 1932 (age 87)
Alma mater: Cornell University (A.B., 1954); Harvard University (Ph.D., 1959)
Known for: Electroweak theory; Georgi–Glashow model; GIM mechanism; Glashow resonance; Chiral color; Very special relativity; Trinification; Criticism of superstring theory
Spouse: Joan Shirley Alexander (m. 1972)
Children: 4
Awards: Nobel Prize in Physics (1979)
Fields: Theoretical physics
Institutions: Boston University; Harvard University; University of California, Berkeley
Thesis: The vector meson in elementary particle decays (1958)
Doctoral advisor: Julian Schwinger

Sheldon Lee Glashow (born December 5, 1932) is a Nobel Prize-winning American theoretical physicist. He is the Metcalf Professor of Mathematics and Physics at Boston University and Eugene Higgins Professor of Physics, Emeritus, at Harvard University, and is a member of the Board of Sponsors for the Bulletin of the Atomic Scientists.

Birth and education

Sheldon Lee Glashow was born in New York City, to Jewish immigrants from Russia, Bella (née Rubin) and Lewis Gluchovsky, a plumber. He graduated from Bronx High School of Science in 1950. Glashow was in the same graduating class as Steven Weinberg, whose own research, independent of Glashow's, would result in Glashow, Weinberg, and Abdus Salam sharing the 1979 Nobel Prize in Physics (see below). Glashow received a Bachelor of Arts degree from Cornell University in 1954 and a Ph.D. degree in physics from Harvard University in 1959 under Nobel-laureate physicist Julian Schwinger. Afterwards, Glashow became an NSF fellow at NORDITA and joined the University of California, Berkeley, where he was an associate professor from 1962 to 1966. He joined the Harvard physics department as a professor in 1966, and was named Eugene Higgins Professor of Physics in 1979; he became emeritus in 2000. Glashow has been a visiting scientist at CERN, and professor at Aix-Marseille University, MIT, Brookhaven Laboratory, Texas A&M, the University of Houston, and Boston University.

Research

In 1961, Glashow extended electroweak unification models due to Schwinger by including a short-range neutral current, the Z0. The resulting symmetry structure that Glashow proposed, SU(2) × U(1), forms the basis of the accepted theory of the electroweak interactions. For this discovery, Glashow, along with Steven Weinberg and Abdus Salam, was awarded the 1979 Nobel Prize in Physics.

In collaboration with James Bjorken, Glashow was the first to predict a fourth quark, the charm quark, in 1964. This was at a time when 4 leptons had been discovered but only 3 quarks proposed. The development of their work in 1970, the GIM mechanism, showed that the two quark pairs (d,s) and (u,c) would largely cancel out flavor-changing neutral currents, which had been observed experimentally at far lower levels than theoretically predicted on the basis of 3 quarks only. The prediction of the charm quark also removed a technical disaster for any quantum field theory with unequal numbers of quarks and leptons, an anomaly, in which classical field theory symmetries fail to carry over into the quantum theory.
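The cancellation can be seen compactly in the standard textbook form of the GIM argument, written in terms of the Cabibbo angle θ_C:

$$ d' = d\cos\theta_C + s\sin\theta_C, \qquad s' = -d\sin\theta_C + s\cos\theta_C, \qquad \bar d' d' + \bar s' s' = \bar d d + \bar s s . $$

The flavor-changing cross terms proportional to $\sin\theta_C\cos\theta_C\,(\bar d s + \bar s d)$ contributed by the two doublets are equal and opposite, so neutral currents do not change flavor at tree level; this is why a fourth quark completing the second doublet was required.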

In 1973, Glashow and Howard Georgi proposed the first grand unified theory. They discovered how to fit the gauge forces in the standard model into an SU(5) group, and the quarks and leptons into two simple representations. Their theory qualitatively predicted the general pattern of coupling constant running; with plausible assumptions, it gave rough mass ratios between third-generation leptons and quarks; and it was the first indication that the conservation of baryon number is inexact and that the proton is unstable. This work was the foundation for all future unifying work.

Glashow shared the 1977 J. Robert Oppenheimer Memorial Prize with Feza Gürsey.

Criticism of superstring theory

Glashow is a skeptic of superstring theory due to its lack of experimentally testable predictions. He had campaigned to keep string theorists out of the Harvard physics department, though the campaign failed. About ten minutes into "String's the Thing", the second episode of The Elegant Universe TV series, he describes superstring theory as a discipline distinct from physics, saying "...you may call it a tumor, if you will...".

Professor Glashow's KHC PY 101 Energy class, at Boston University's Kilachand Honors College (Spring 2011)

Personal life

Glashow is married to the former Joan Shirley Alexander. They have four children. Lynn Margulis was Joan's sister, making Carl Sagan his former brother-in-law. Daniel Kleitman, who was another doctoral student of Julian Schwinger, is also his brother-in-law, through Joan's other sister, Sharon.

In 2003 he was one of 22 Nobel Laureates who signed the Humanist Manifesto. Glashow has described himself as a "practising atheist" and he is a Democrat.

Glashow is one of the 20 American recipients of the Nobel Prize in Physics to sign a letter addressed to President George W. Bush in May 2008, urging him to "reverse the damage done to basic science research in the Fiscal Year 2008 Omnibus Appropriations Bill" by requesting additional emergency funding for the Department of Energy's Office of Science, the National Science Foundation, and the National Institute of Standards and Technology.

Theory of everything

From Wikipedia, the free encyclopedia

A theory of everything (TOE or ToE), final theory, ultimate theory, or master theory is a hypothetical single, all-encompassing, coherent theoretical framework of physics that fully explains and links together all physical aspects of the universe. Finding a TOE is one of the major unsolved problems in physics. String theory and M-theory have been proposed as theories of everything. Over the past few centuries, two theoretical frameworks have been developed that, together, most closely resemble a TOE. These two theories upon which all modern physics rests are general relativity and quantum mechanics. General relativity is a theoretical framework that only focuses on gravity for understanding the universe in regions of both large scale and high mass: stars, galaxies, clusters of galaxies, etc. On the other hand, quantum mechanics is a theoretical framework that only focuses on three non-gravitational forces for understanding the universe in regions of both small scale and low mass: sub-atomic particles, atoms, molecules, etc. Within the framework of quantum mechanics, the Standard Model was successfully developed; it describes the three non-gravitational forces – the strong nuclear, weak nuclear, and electromagnetic forces – as well as all observed elementary particles.

General relativity and quantum mechanics have been thoroughly proven in their separate fields of relevance. Since the usual domains of applicability of general relativity and quantum mechanics are so different, most situations require that only one of the two theories be used. However, the two theories are considered incompatible in regions of extremely small scale – the Planck scale – such as those that exist within a black hole or during the beginning stages of the universe (i.e., the moment immediately following the Big Bang). To resolve the incompatibility, a theoretical framework revealing a deeper underlying reality, unifying gravity with the other three interactions, must be discovered to harmoniously integrate the realms of general relativity and quantum mechanics into a seamless whole: the TOE is a single theory that, in principle, is capable of describing all phenomena in the universe.

In pursuit of this goal, quantum gravity has become one area of active research. One example is string theory, which evolved into a candidate for the TOE, but not without drawbacks (most notably, its lack of currently testable predictions) and controversy. String theory posits that at the beginning of the universe (up to 10⁻⁴³ seconds after the Big Bang), the four fundamental forces were once a single fundamental force. According to string theory, every particle in the universe, at its most microscopic level (Planck length), consists of varying combinations of vibrating strings (or strands) with preferred patterns of vibration. String theory further claims that it is through these specific oscillatory patterns of strings that a particle of unique mass and force charge is created (that is to say, the electron is a type of string that vibrates one way, while the up quark is a type of string vibrating another way, and so forth).

Name

Initially, the term theory of everything was used with an ironic reference to various overgeneralized theories. For example, a grandfather of Ijon Tichy – a character from a cycle of Stanisław Lem's science fiction stories of the 1960s – was known to work on the "General Theory of Everything". Physicist Harald Fritzsch used the term in his 1977 lectures in Varenna. Physicist John Ellis claims to have introduced the term into the technical literature in an article in Nature in 1986. Over time, the term stuck in popularizations of theoretical physics research.

Historical antecedents

Antiquity to 19th century

Ancient Babylonian astronomers studied the pattern of the Seven Classical Planets against the background of stars; their interest was to relate celestial movement to human events (astrology), and their goal was to predict events by recording them against a time measure and then looking for recurrent patterns. The debate between the universe having either a beginning or eternal cycles can be traced back to ancient Babylonia.

The natural philosophy of atomism appeared in several ancient traditions. In ancient Greek philosophy, the pre-Socratic philosophers speculated that the apparent diversity of observed phenomena was due to a single type of interaction, namely the motions and collisions of atoms. The concept of 'atom' proposed by Democritus was an early philosophical attempt to unify phenomena observed in nature. The concept of 'atom' also appeared in the Nyaya-Vaisheshika school of ancient Indian philosophy.

Archimedes was possibly the first philosopher to have described nature with axioms (or principles) and then deduce new results from them. Any "theory of everything" is similarly expected to be based on axioms and to deduce all observable phenomena from them.

Following earlier atomistic thought, the mechanical philosophy of the 17th century posited that all forces could be ultimately reduced to contact forces between the atoms, then imagined as tiny solid particles.

In the late 17th century, Isaac Newton's description of the long-distance force of gravity implied that not all forces in nature result from things coming into contact. Newton's work in his Mathematical Principles of Natural Philosophy dealt with this in a further example of unification, in this case unifying Galileo's work on terrestrial gravity, Kepler's laws of planetary motion and the phenomenon of tides by explaining these apparent actions at a distance under one single law: the law of universal gravitation.

In 1814, building on these results, Laplace famously suggested that a sufficiently powerful intellect could, if it knew the position and velocity of every particle at a given time, along with the laws of nature, calculate the position of any particle at any other time:

An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.

— Essai philosophique sur les probabilités, Introduction. 1814

Laplace thus envisaged a combination of gravitation and mechanics as a theory of everything. Modern quantum mechanics implies that uncertainty is inescapable, and thus that Laplace's vision has to be amended: a theory of everything must include gravitation and quantum mechanics. Even ignoring quantum mechanics, chaos theory is sufficient to guarantee that the future of any sufficiently complex mechanical or astronomical system is unpredictable.
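The unpredictability of sufficiently complex systems can be illustrated with a minimal numerical sketch of sensitive dependence on initial conditions, here using the logistic map as a standard toy model (the parameter values are illustrative only):

# Sensitive dependence on initial conditions: iterate the logistic map x -> r*x*(1-x)
# for two starting values that differ by one part in a billion.
r = 4.0                     # fully chaotic regime of the logistic map
x, y = 0.3, 0.3 + 1e-9      # nearly identical initial conditions
for step in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}:  |x - y| = {abs(x - y):.2e}")
# The separation grows from 1e-9 to order 1 within a few dozen iterations,
# so long-term prediction fails even though the rule is simple and deterministic.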

In 1820, Hans Christian Ørsted discovered a connection between electricity and magnetism, triggering decades of work that culminated in 1865, in James Clerk Maxwell's theory of electromagnetism. During the 19th and early 20th centuries, it gradually became apparent that many common examples of forces – contact forces, elasticity, viscosity, friction, and pressure – result from electrical interactions between the smallest particles of matter.

In his experiments of 1849–50, Michael Faraday was the first to search for a unification of gravity with electricity and magnetism. However, he found no connection.

In 1900, David Hilbert published a famous list of mathematical problems. In Hilbert's sixth problem, he challenged researchers to find an axiomatic basis to all of physics. In this problem he thus asked for what today would be called a theory of everything.

Early 20th century

In the late 1920s, the new quantum mechanics showed that the chemical bonds between atoms were examples of (quantum) electrical forces, justifying Dirac's boast that "the underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known".

After 1915, when Albert Einstein published the theory of gravity (general relativity), the search for a unified field theory combining gravity with electromagnetism began with renewed interest. In Einstein's day, the strong and weak forces had not yet been discovered, yet he found the potential unification of the two known distinct forces, gravity and electromagnetism, far more alluring. This launched his thirty-year voyage in search of the so-called "unified field theory" that he hoped would show that these two forces are really manifestations of one grand underlying principle. During these last few decades of his life, this quixotic quest isolated Einstein from the mainstream of physics.

Understandably, the mainstream was instead far more excited about the newly emerging framework of quantum mechanics. Einstein wrote to a friend in the early 1940s, "I have become a lonely old chap who is mainly known because he doesn't wear socks and who is exhibited as a curiosity on special occasions." Prominent contributors were Gunnar Nordström, Hermann Weyl, Arthur Eddington, David Hilbert, Theodor Kaluza, Oskar Klein (see Kaluza–Klein theory), and most notably, Albert Einstein and his collaborators. Einstein intensely searched for, but ultimately failed to find, a unifying theory. More than half a century later, Einstein's dream of discovering a unified theory has become the Holy Grail of modern physics.

Late 20th century and the nuclear interactions

In the twentieth century, the search for a unifying theory was interrupted by the discovery of the strong and weak nuclear forces (or interactions), which differ both from gravity and from electromagnetism. A further hurdle was the acceptance that in a TOE, quantum mechanics had to be incorporated from the start, rather than emerging as a consequence of a deterministic unified theory, as Einstein had hoped.

Gravity and electromagnetism could always peacefully coexist as entries in a list of classical forces, but for many years it seemed that gravity could not even be incorporated into the quantum framework, let alone unified with the other fundamental forces. For this reason, work on unification, for much of the twentieth century, focused on understanding the three "quantum" forces: electromagnetism and the weak and strong forces. The first two were combined in 1967–68 by Sheldon Glashow, Steven Weinberg, and Abdus Salam into the "electroweak" force. Electroweak unification is a broken symmetry: the electromagnetic and weak forces appear distinct at low energies because the particles carrying the weak force, the W and Z bosons, have non-zero masses of 80.4 GeV/c² and 91.2 GeV/c², whereas the photon, which carries the electromagnetic force, is massless. At higher energies Ws and Zs can be created easily and the unified nature of the force becomes apparent.
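The quoted masses fit the standard tree-level electroweak relations, in which the photon and Z boson are orthogonal mixtures of the neutral SU(2) and U(1) gauge fields, parameterized by the weak mixing angle θ_W:

$$ A_\mu = B_\mu\cos\theta_W + W^3_\mu\sin\theta_W, \qquad Z_\mu = -B_\mu\sin\theta_W + W^3_\mu\cos\theta_W, \qquad m_W = m_Z\cos\theta_W . $$

With the masses above, cos θ_W ≈ 80.4/91.2 ≈ 0.88, i.e. sin²θ_W ≈ 0.22, consistent with the measured value of the mixing angle.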

While the strong and electroweak forces peacefully coexist in the Standard Model of particle physics, they remain distinct. So far, the quest for a theory of everything is thus unsuccessful on two points: neither a unification of the strong and electroweak forces – which Laplace would have called 'contact forces' – nor a unification of these forces with gravitation has been achieved.

Modern physics

Conventional sequence of theories

A Theory of Everything would unify all the fundamental interactions of nature: gravitation, strong interaction, weak interaction, and electromagnetism. Because the weak interaction can transform elementary particles from one kind into another, the TOE should also yield a deep understanding of the various different kinds of possible particles. The usual assumed path of theories is given in the following graph, where each unification step leads one level up:





Theory of everything
  Quantum gravity
    Space curvature (general relativity)
  Electronuclear force (GUT)
    Strong interaction SU(3)
    Electroweak interaction SU(2) × U(1)Y
      Weak interaction SU(2)
      Electromagnetism U(1)EM
        Electricity
        Magnetism

(Space curvature underlies the standard model of cosmology; the strong and electroweak interactions together make up the standard model of particle physics.)

In this graph, electroweak unification occurs at around 100 GeV, grand unification is predicted to occur at 10¹⁶ GeV, and unification of the GUT force with gravity is expected at the Planck energy, roughly 10¹⁹ GeV.

Several Grand Unified Theories (GUTs) have been proposed to unify electromagnetism and the weak and strong forces. Grand unification would imply the existence of an electronuclear force; it is expected to set in at energies of the order of 10¹⁶ GeV, far greater than could be reached by any possible Earth-based particle accelerator. Although the simplest GUTs have been experimentally ruled out, the general idea, especially when linked with supersymmetry, remains a favorite candidate in the theoretical physics community. Supersymmetric GUTs seem plausible not only for their theoretical "beauty", but because they naturally produce large quantities of dark matter, and because the inflationary force may be related to GUT physics (although it does not seem to form an inevitable part of the theory). Yet GUTs are clearly not the final answer; both the current standard model and all proposed GUTs are quantum field theories which require the problematic technique of renormalization to yield sensible answers. This is usually regarded as a sign that these are only effective field theories, omitting crucial phenomena relevant only at very high energies.
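The 10¹⁶ GeV estimate comes from extrapolating the measured gauge couplings with the renormalization group; at one loop the running takes the standard form

$$ \alpha_i^{-1}(\mu) = \alpha_i^{-1}(M_Z) - \frac{b_i}{2\pi}\ln\frac{\mu}{M_Z}, $$

where the coefficients b_i depend only on the particle content. Starting from the approximate measured values α₁⁻¹ ≈ 59, α₂⁻¹ ≈ 30 and α₃⁻¹ ≈ 8.5 at the Z mass, the three inverse couplings approach one another (and, with supersymmetric particle content, very nearly meet) at around 10¹⁶ GeV.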

The final step in the graph requires resolving the separation between quantum mechanics and gravitation, often equated with general relativity. Numerous researchers concentrate their efforts on this specific step; nevertheless, no accepted theory of quantum gravity – and thus no accepted theory of everything – has emerged yet. It is usually assumed that the TOE will also solve the remaining problems of GUTs.

In addition to explaining the forces listed in the graph, a TOE may also explain the status of at least two candidate forces suggested by modern cosmology: an inflationary force and dark energy. Furthermore, cosmological experiments also suggest the existence of dark matter, supposedly composed of fundamental particles outside the scheme of the standard model. However, the existence of these forces and particles has not been proven.

String theory and M-theory

Since the 1990s, some physicists such as Edward Witten believe that 11-dimensional M-theory, which is described in some limits by one of the five perturbative superstring theories, and in another by the maximally-supersymmetric 11-dimensional supergravity, is the theory of everything. However, there is no widespread consensus on this issue.

A surprising property of string/M-theory is that extra dimensions are required for the theory's consistency. In this regard, string theory can be seen as building on the insights of the Kaluza–Klein theory, in which it was realized that applying general relativity to a five-dimensional universe (with one of them small and curled up) looks from the four-dimensional perspective like the usual general relativity together with Maxwell's electrodynamics. This lent credence to the idea of unifying gauge and gravity interactions, and to extra dimensions, but did not address the detailed experimental requirements. Another important property of string theory is its supersymmetry; supersymmetry and extra dimensions are the two main proposals for resolving the hierarchy problem of the standard model, which is (roughly) the question of why gravity is so much weaker than any other force. The extra-dimensional solution involves allowing gravity to propagate into the other dimensions while keeping other forces confined to a four-dimensional spacetime, an idea that has been realized with explicit stringy mechanisms.

Research into string theory has been encouraged by a variety of theoretical and experimental factors. On the experimental side, the particle content of the standard model supplemented with neutrino masses fits into a spinor representation of SO(10), a subgroup of E8 that routinely emerges in string theory, such as in heterotic string theory or (sometimes equivalently) in F-theory. String theory has mechanisms that may explain why fermions come in three hierarchical generations, and explain the mixing rates between quark generations. On the theoretical side, it has begun to address some of the key questions in quantum gravity, such as resolving the black hole information paradox, counting the correct entropy of black holes and allowing for topology-changing processes. It has also led to many insights in pure mathematics and in ordinary, strongly-coupled gauge theory due to the Gauge/String duality.

In the late 1990s, it was noted that one major hurdle in this endeavor is that the number of possible four-dimensional universes is incredibly large. The small, "curled up" extra dimensions can be compactified in an enormous number of different ways (one estimate is 10⁵⁰⁰), each of which leads to different properties for the low-energy particles and forces. This array of models is known as the string theory landscape.

One proposed solution is that many or all of these possibilities are realised in one or another of a huge number of universes, but that only a small number of them are habitable. Hence what we normally conceive as the fundamental constants of the universe are ultimately the result of the anthropic principle rather than dictated by theory. This has led to criticism of string theory, arguing that it cannot make useful (i.e., original, falsifiable, and verifiable) predictions and regarding it as a pseudoscience. Others disagree, and string theory remains an active topic of investigation in theoretical physics.

Loop quantum gravity

Current research on loop quantum gravity may eventually play a fundamental role in a TOE, but that is not its primary aim. Loop quantum gravity also introduces a lower bound on the possible length scales.

There have been recent claims that loop quantum gravity may be able to reproduce features resembling the Standard Model. So far only the first generation of fermions (leptons and quarks) with correct parity properties have been modelled by Sundance Bilson-Thompson using preons constituted of braids of spacetime as the building blocks. However, there is no derivation of the Lagrangian that would describe the interactions of such particles, nor is it possible to show that such particles are fermions, nor that the gauge groups or interactions of the Standard Model are realised. Utilization of quantum computing concepts made it possible to demonstrate that the particles are able to survive quantum fluctuations.

This model leads to an interpretation of electric and colour charge as topological quantities (electric as number and chirality of twists carried on the individual ribbons and colour as variants of such twisting for fixed electric charge).

Bilson-Thompson's original paper suggested that the higher-generation fermions could be represented by more complicated braidings, although explicit constructions of these structures were not given. The electric charge, colour, and parity properties of such fermions would arise in the same way as for the first generation. The model was expressly generalized for an infinite number of generations and for the weak force bosons (but not for photons or gluons) in a 2008 paper by Bilson-Thompson, Hackett, Kauffman and Smolin.

Other attempts

Among other attempts to develop a theory of everything is the theory of causal fermion systems, giving the two current physical theories (general relativity and quantum field theory) as limiting cases.

Another theory is called Causal Sets. Like some of the approaches mentioned above, its direct goal is not necessarily to achieve a TOE but primarily a working theory of quantum gravity, which might eventually include the standard model and become a candidate for a TOE. Its founding principle is that spacetime is fundamentally discrete and that the spacetime events are related by a partial order. This partial order has the physical meaning of the causality relations between relative past and future distinguishing spacetime events.

Outside the previously mentioned attempts there is Garrett Lisi's E8 proposal. This theory attempts to construct general relativity and the standard model within the Lie group E8. The theory does not provide a novel quantization procedure, and the author suggests its quantization might follow the loop quantum gravity approach mentioned above.

Causal dynamical triangulation does not assume any pre-existing arena (dimensional space), but rather attempts to show how the spacetime fabric itself evolves.

Christoph Schiller's Strand Model attempts to account for the gauge symmetry of the Standard Model of particle physics, U(1)×SU(2)×SU(3), with the three Reidemeister moves of knot theory by equating each elementary particle to a different tangle of one, two, or three strands (selectively a long prime knot or unknotted curve, a rational tangle, or a braided tangle respectively).

Another attempt may be related to ER=EPR, a conjecture in physics stating that entangled particles are connected by a wormhole (or Einstein–Rosen bridge).

Present status

At present, there is no candidate theory of everything that includes the standard model of particle physics and general relativity and that, at the same time, is able to calculate the fine-structure constant or the mass of the electron. Most particle physicists expect that the outcomes of the ongoing experiments – the search for new particles at the large particle accelerators and for dark matter – are needed in order to provide further input for a TOE.

Arguments against

In parallel to the intense search for a TOE, various scholars have seriously debated the possibility of its discovery.

Gödel's incompleteness theorem

A number of scholars claim that Gödel's incompleteness theorem suggests that any attempt to construct a TOE is bound to fail. Gödel's theorem, informally stated, asserts that any formal theory sufficient to express elementary arithmetical facts and strong enough for them to be proved is either inconsistent (both a statement and its denial can be derived from its axioms) or incomplete, in the sense that there is a true statement that can't be derived in the formal theory.

Stanley Jaki, in his 1966 book The Relevance of Physics, pointed out that, because any "theory of everything" will certainly be a consistent non-trivial mathematical theory, it must be incomplete. He claims that this dooms searches for a deterministic theory of everything.

Freeman Dyson has stated that "Gödel's theorem implies that pure mathematics is inexhaustible. No matter how many problems we solve, there will always be other problems that cannot be solved within the existing rules. […] Because of Gödel's theorem, physics is inexhaustible too. The laws of physics are a finite set of rules, and include the rules for doing mathematics, so that Gödel's theorem applies to them."

Stephen Hawking was originally a believer in the Theory of Everything, but after considering Gödel's Theorem, he concluded that one was not obtainable. "Some people will be very disappointed if there is not an ultimate theory that can be formulated as a finite number of principles. I used to belong to that camp, but I have changed my mind."

Jürgen Schmidhuber (1997) has argued against this view; he points out that Gödel's theorems are irrelevant for computable physics. In 2000, Schmidhuber explicitly constructed limit-computable, deterministic universes whose pseudo-randomness based on undecidable, Gödel-like halting problems is extremely hard to detect but does not at all prevent formal TOEs describable by very few bits of information.

Related critique was offered by Solomon Feferman, among others. Douglas S. Robertson offers Conway's game of life as an example: The underlying rules are simple and complete, but there are formally undecidable questions about the game's behaviors. Analogously, it may (or may not) be possible to completely state the underlying rules of physics with a finite number of well-defined laws, but there is little doubt that there are questions about the behavior of physical systems which are formally undecidable on the basis of those underlying laws.
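A minimal sketch of the complete rule set of Conway's Game of Life makes Robertson's point concrete: the rules below are short and fully specified, yet questions about the long-term behaviour of the patterns they generate can be formally undecidable.

from collections import Counter

# Conway's Game of Life: the complete underlying rules.
def step(live_cells):
    """live_cells is a set of (x, y) coordinates; returns the next generation."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbours,
    # or if it is currently alive and has exactly 2 live neighbours.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# Example: a "glider", which translates itself diagonally forever.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))   # the same shape, shifted by one cell diagonally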

Since most physicists would consider the statement of the underlying rules to suffice as the definition of a "theory of everything", most physicists argue that Gödel's Theorem does not mean that a TOE cannot exist. On the other hand, the scholars invoking Gödel's Theorem appear, at least in some cases, to be referring not to the underlying rules, but to the understandability of the behavior of all physical systems, as when Hawking mentions arranging blocks into rectangles, turning the computation of prime numbers into a physical question. This definitional discrepancy may explain some of the disagreement among researchers.

Fundamental limits in accuracy

No physical theory to date is believed to be precisely accurate. Instead, physics has proceeded by a series of "successive approximations" allowing more and more accurate predictions over a wider and wider range of phenomena. Some physicists believe that it is therefore a mistake to confuse theoretical models with the true nature of reality, and hold that the series of approximations will never terminate in the "truth". Einstein himself expressed this view on occasion. Following this view, we may reasonably hope for a theory of everything which self-consistently incorporates all currently known forces, but we should not expect it to be the final answer.

On the other hand, it is often claimed that, despite the apparently ever-increasing complexity of the mathematics of each new theory, in a deep sense associated with their underlying gauge symmetry and the number of dimensionless physical constants, the theories are becoming simpler. If this is the case, the process of simplification cannot continue indefinitely.

Lack of fundamental laws

There is a philosophical debate within the physics community as to whether a theory of everything deserves to be called the fundamental law of the universe. One view is the hard reductionist position that the TOE is the fundamental law and that all other theories that apply within the universe are a consequence of the TOE. Another view is that emergent laws, which govern the behavior of complex systems, should be seen as equally fundamental. Examples of emergent laws are the second law of thermodynamics and the theory of natural selection. The advocates of emergence argue that emergent laws, especially those describing complex or living systems are independent of the low-level, microscopic laws. In this view, emergent laws are as fundamental as a TOE.

The debates do not make the point at issue clear. Possibly the only issue at stake is the right to apply the high-status term "fundamental" to the respective subjects of research. A well-known debate over this took place between Steven Weinberg and Philip Anderson.

Impossibility of being "of everything"

Although the name "theory of everything" suggests the determinism of Laplace's quotation, this gives a very misleading impression. Determinism is frustrated by the probabilistic nature of quantum mechanical predictions, by the extreme sensitivity to initial conditions that leads to mathematical chaos, by the limitations due to event horizons, and by the extreme mathematical difficulty of applying the theory. Thus, although the current standard model of particle physics "in principle" predicts almost all known non-gravitational phenomena, in practice only a few quantitative results have been derived from the full theory (e.g., the masses of some of the simplest hadrons), and these results (especially the particle masses which are most relevant for low-energy physics) are less accurate than existing experimental measurements. The TOE would almost certainly be even harder to apply for the prediction of experimental results, and thus might be of limited use.

A motive for seeking a TOE, apart from the pure intellectual satisfaction of completing a centuries-long quest, is that prior examples of unification have predicted new phenomena, some of which (e.g., electrical generators) have proved of great practical importance. As in these prior examples of unification, the TOE would probably allow us to confidently define the domain of validity and residual error of low-energy approximations to the full theory.

The theories generally do not account for the apparent phenomena of consciousness or free will, which are instead often the subject of philosophy and religion.

Infinite number of onion layers

Frank Close regularly argues that the layers of nature may be like the layers of an onion, and that the number of layers might be infinite. This would imply an infinite sequence of physical theories.

Impossibility of calculation

Weinberg points out that calculating the precise motion of an actual projectile in the Earth's atmosphere is impossible. So how can we know we have an adequate theory for describing the motion of projectiles? Weinberg suggests that we know principles (Newton's laws of motion and gravitation) that work "well enough" for simple examples, like the motion of planets in empty space. These principles have worked so well on simple examples that we can be reasonably confident they will work for more complex examples. For example, although general relativity includes equations that do not have exact solutions, it is widely accepted as a valid theory because all of its equations with exact solutions have been experimentally verified. Likewise, a TOE must work for a wide range of simple examples in such a way that we can be reasonably confident it will work for every situation in physics.
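A numerical sketch illustrates Weinberg's point: a projectile with air resistance has no simple closed-form trajectory, yet a few lines of stepwise integration of Newton's laws reproduce its motion to any desired accuracy (the launch speed, angle, and drag coefficient below are illustrative values only).

import math

g = 9.81      # gravitational acceleration, m/s^2
k = 0.02      # quadratic drag coefficient per unit mass, 1/m (illustrative)
dt = 0.001    # integration time step, s

speed0, angle = 30.0, math.radians(45)           # illustrative launch conditions
x, y = 0.0, 0.0
vx, vy = speed0 * math.cos(angle), speed0 * math.sin(angle)

while y >= 0.0:                                  # integrate until the projectile lands
    v = math.hypot(vx, vy)
    ax, ay = -k * v * vx, -g - k * v * vy        # drag opposes the velocity
    vx, vy = vx + ax * dt, vy + ay * dt
    x, y = x + vx * dt, y + vy * dt

vacuum_range = speed0**2 * math.sin(2 * angle) / g
print(f"range with drag: {x:.1f} m; vacuum range: {vacuum_range:.1f} m")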

Millennium Run

From Wikipedia, the free encyclopedia

The Millennium Run, or Millennium Simulation (referring to its size), is a computer N-body simulation used to investigate how the distribution of matter in the Universe has evolved over time, in particular, how the observed population of galaxies was formed. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.

Overview

A basic scientific method for testing theories in cosmology is to evaluate their consequences for the observable parts of the universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, which are observed today. Light emitted from more distant matter must travel longer in order to reach Earth, so looking at distant objects is like looking further back in time. This means the evolution in time of the matter distribution in the universe can also be observed directly.

The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists from Germany, the United Kingdom, Canada, Japan and the United States. It starts at the epoch when the cosmic background radiation was emitted, about 379,000 years after the universe began. The cosmic background radiation has been studied by satellite experiments, and the observed inhomogeneities in the cosmic background serve as the starting point for following the evolution of the corresponding matter distribution. Using the physical laws expected to hold in the currently known cosmologies and simplified representations of the astrophysical processes observed to affect real galaxies, the initial distribution of matter is allowed to evolve, and the simulation's predictions for formation of galaxies and black holes are recorded.
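At its core the simulation is an N-body integration of gravity for the dark matter "particles". The sketch below shows the idea in its simplest direct-summation form; the actual Millennium Run used the far more efficient GADGET tree/mesh code, about 10¹⁰ particles, and an expanding cosmological background, none of which is reproduced here.

import numpy as np

def accelerations(pos, mass, softening=0.05):
    """Direct-summation gravitational accelerations in units with G = 1."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]      # pairwise separation vectors
    dist3 = (np.sum(diff**2, axis=-1) + softening**2) ** 1.5
    np.fill_diagonal(dist3, np.inf)                           # exclude self-interaction
    return np.sum(mass[np.newaxis, :, np.newaxis] * diff / dist3[:, :, np.newaxis], axis=1)

rng = np.random.default_rng(0)
n = 200
pos = rng.normal(size=(n, 3))       # random initial positions
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)
dt = 0.01

for _ in range(100):                # leapfrog (kick-drift-kick) time stepping
    vel += 0.5 * dt * accelerations(pos, mass)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos, mass)

print("final spread of the particle cloud:", pos.std())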

Since the completion of the Millennium Run simulation in 2005, a series of ever more sophisticated and higher fidelity simulations of the formation of the galaxy population have been built within its stored output and have been made publicly available over the internet. In addition to improving the treatment of the astrophysics of galaxy formation, recent versions have adjusted the parameters of the underlying cosmological model to reflect changing ideas about their precise values. To date (mid-2018) more than 950 published papers have made use of data from the Millennium Run, making it, at least by this measure, the highest impact astrophysical simulation of all time.

Size of the simulation

For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles". These are not particles in the particle physics sense – each "particle" represents approximately a billion solar masses of dark matter. The region of space simulated was a cube about 2 billion light years on a side. This volume was populated by about 20 million "galaxies". A supercomputer located in Garching, Germany executed the simulation, which used a version of the GADGET code, for more than a month. The output of the simulation needed about 25 terabytes of storage.
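A quick arithmetic check of the quoted numbers:

$$ 2160^3 = 1.008\times10^{10} \approx 10\ \text{billion particles}, \qquad 10^{10}\ \text{particles}\times \sim\!10^{9}\ M_\odot \approx 10^{19}\ M_\odot\ \text{of dark matter in the box.} $$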

First results

The Sloan Digital Sky Survey had challenged the current understanding of cosmology by finding black hole candidates in very bright quasars at large distances. This meant that they were created much earlier than initially expected. In successfully managing to produce quasars at early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the universe.

Millennium II

In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube (about 400 million light years on a side), with the same number of particles but with each particle representing 6.9 million solar masses. This is a rather harder numerical task since splitting the computational domain between processors becomes harder when dense clumps of matter are present. MS-II used 1.4 million CPU hours over 2048 cores (i.e. about a month) on the Power-6 computer at Garching; a simulation was also run with the same initial conditions and fewer particles to check that features in the higher-resolution run were also seen at lower resolution.
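The quoted runtime is self-consistent:

$$ \frac{1.4\times10^{6}\ \text{CPU-hours}}{2048\ \text{cores}} \approx 683\ \text{hours} \approx 28\ \text{days}, $$

i.e. roughly a month of wall-clock time, as stated.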

Millennium XXL

In 2010, the 'Millennium XXL' simulation (MXXL) was performed, this time using a much larger cube (over 13 billion light years on a side), and 6720³ particles each representing 7 billion times the mass of the Sun. The MXXL spans a cosmological volume 216 and 27,000 times the size of the Millennium and the MS-II simulation boxes, respectively. The simulation was run on JUROPA, one of the top 15 supercomputers in the world in 2010. It used more than 12,000 cores for the equivalent of 300 years of CPU time, 30 terabytes of RAM and generated more than 100 terabytes of data. Cosmologists use the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales and how the rarest and most massive structures in the universe came about.
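The volume ratios follow from the box edges quoted above:

$$ 216^{1/3} = 6, \qquad 27{,}000^{1/3} = 30, $$

so the MXXL box edge is 6 times the Millennium edge (about 2 billion light years × 6) and 30 times the MS-II edge (about 400 million light years × 30), both of order 12–13 billion light years, consistent with the rounded "over 13 billion light years" figure.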

Millennium Run Observatory

In 2012, the Millennium Run Observatory (MRObs) project was launched. The MRObs is a theoretical virtual observatory that integrates detailed predictions for the dark matter (from the Millennium simulations) and for the galaxies (from semi-analytical models) with a virtual telescope to synthesize artificial observations. Astrophysicists use these virtual observations to study how the predictions from the Millennium simulations compare to the real universe, to plan future observational surveys, and to calibrate the techniques used by astronomers to analyze real observations. A first set of virtual observations produced by the MRObs have been released to the astronomical community for analysis through the MRObs Web portal. The virtual universe can also be accessed through a new online tool, the MRObs browser, which allows users to interact with the Millennium Run Relational Database where the properties of millions of dark matter halos and their galaxies from the Millennium project are being stored. Upgrades to the MRObs framework, and its extension to other types of simulations, are currently being planned.
