
Thursday, October 8, 2020

Lambda-CDM model

From Wikipedia, the free encyclopedia

The ΛCDM (Lambda cold dark matter) or Lambda-CDM model is a parametrization of the Big Bang cosmological model in which the universe contains three major components: first, a cosmological constant denoted by Lambda (Greek Λ) and associated with dark energy; second, the postulated cold dark matter (abbreviated CDM); and third, ordinary matter. It is frequently referred to as the standard model of Big Bang cosmology because it is the simplest model that provides a reasonably good account of the following properties of the cosmos:

  • the existence and structure of the cosmic microwave background;
  • the large-scale structure in the distribution of galaxies;
  • the observed abundances of hydrogen (including deuterium), helium, and lithium;
  • the accelerating expansion of the universe observed in the light from distant galaxies and supernovae.

The model assumes that general relativity is the correct theory of gravity on cosmological scales. It emerged in the late 1990s as a concordance cosmology, after a period of time when disparate observed properties of the universe appeared mutually inconsistent, and there was no consensus on the makeup of the energy density of the universe.

The ΛCDM model can be extended by adding cosmological inflation, quintessence and other elements that are current areas of speculation and research in cosmology.

Some alternative models challenge the assumptions of the ΛCDM model. Examples of these are modified Newtonian dynamics, entropic gravity, modified gravity, theories of large-scale variations in the matter density of the universe, bimetric gravity, scale invariance of empty space, and decaying dark matter (DDM).

Overview

Lambda-CDM, accelerated expansion of the universe. The timeline in this schematic diagram extends from the Big Bang/inflation era 13.7 Byr ago to the present cosmological time.

Most modern cosmological models are based on the cosmological principle, which states that our observational location in the universe is not unusual or special; on a large-enough scale, the universe looks the same in all directions (isotropy) and from every location (homogeneity).

The model includes an expansion of metric space that is well documented both as the red shift of prominent spectral absorption or emission lines in the light from distant galaxies and as the time dilation in the light decay of supernova luminosity curves. Both effects are attributed to a Doppler shift in electromagnetic radiation as it travels across expanding space. Although this expansion increases the distance between objects that are not under shared gravitational influence, it does not increase the size of the objects (e.g. galaxies) in space. It also allows for distant galaxies to recede from each other at speeds greater than the speed of light; local expansion is less than the speed of light, but expansion summed across great distances can collectively exceed the speed of light.
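
As a rough numerical illustration of the last point (a minimal sketch; the Hubble constant below is an assumed round value, not a fitted one), the proper-distance form of Hubble's law, v = H0·d, gives recession speeds that exceed the speed of light beyond the Hubble radius:

```python
# Sketch: recession speed v = H0 * d under Hubble's law, showing that sufficiently
# distant objects recede faster than light. H0 here is an assumed round value.

H0 = 70.0                         # assumed Hubble constant, km/s per Mpc
c = 299_792.458                   # speed of light, km/s

hubble_radius = c / H0            # distance at which v = c (~4300 Mpc)

for d in (100, 1_000, hubble_radius, 10_000):    # distances in Mpc
    v = H0 * d
    print(f"d = {d:8.0f} Mpc -> v = {v:10.0f} km/s ({v / c:.2f} c)")
```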

The Greek letter Λ (lambda) represents the cosmological constant, which is currently associated with a vacuum energy or dark energy in empty space that is used to explain the contemporary accelerating expansion of space against the attractive effects of gravity. A cosmological constant has negative pressure, p = −ρc², which contributes to the stress-energy tensor that, according to the general theory of relativity, causes accelerating expansion. The fraction of the total energy density of our (flat or almost flat) universe that is dark energy, ΩΛ, is estimated to be 0.669 ± 0.038 based on the 2018 Dark Energy Survey results using Type Ia supernovae, or 0.6847 ± 0.0073 based on the 2018 release of Planck satellite data; that is, more than 68.3% (2018 estimate) of the mass-energy density of the universe.

Dark matter is postulated in order to account for gravitational effects observed in very large-scale structures (the "flat" rotation curves of galaxies; the gravitational lensing of light by galaxy clusters; and enhanced clustering of galaxies) that cannot be accounted for by the quantity of observed matter.

Cold dark matter as currently hypothesized is:

non-baryonic
It consists of matter other than protons and neutrons (and electrons, by convention, although electrons are not baryons).
cold
Its velocity is far less than the speed of light at the epoch of radiation-matter equality (thus neutrinos are excluded, being non-baryonic but not cold).
dissipationless
It cannot cool by radiating photons.
collisionless
The dark matter particles interact with each other and other particles only through gravity and possibly the weak force.

Dark matter constitutes about 26.5% of the mass-energy density of the universe. The remaining 4.9% comprises all ordinary matter observed as atoms, chemical elements, gas and plasma, the stuff of which visible planets, stars and galaxies are made. The great majority of ordinary matter in the universe is unseen, since visible stars and gas inside galaxies and clusters account for less than 10% of the ordinary matter contribution to the mass-energy density of the universe.

Also, the energy density includes a very small fraction (~0.01%) in cosmic microwave background radiation, and not more than 0.5% in relic neutrinos. Although very small today, these were much more important in the distant past, dominating over matter at redshift > 3200.

The model includes a single originating event, the "Big Bang", which was not an explosion but the abrupt appearance of expanding space-time containing radiation at temperatures of around 10¹⁵ K. This was immediately (within 10⁻²⁹ seconds) followed by an exponential expansion of space by a scale multiplier of 10²⁷ or more, known as cosmic inflation. The early universe remained hot (above 10,000 K) for several hundred thousand years, a state that is detectable as a residual cosmic microwave background, or CMB, a very low energy radiation emanating from all parts of the sky. The "Big Bang" scenario, with cosmic inflation and standard particle physics, is the only current cosmological model consistent with the observed continuing expansion of space, the observed distribution of lighter elements in the universe (hydrogen, helium, and lithium), and the spatial texture of minute irregularities (anisotropies) in the CMB radiation. Cosmic inflation also addresses the "horizon problem" in the CMB; indeed, it seems likely that the universe is larger than the observable particle horizon.

The model uses the Friedmann–Lemaître–Robertson–Walker metric, the Friedmann equations and the cosmological equations of state to describe the observable universe from right after the inflationary epoch to present and future.

Cosmic expansion history

The expansion of the universe is parameterized by a dimensionless scale factor $a = a(t)$ (with time $t$ counted from the birth of the universe), defined relative to the present day, so $a_0 = a(t_0) = 1$; the usual convention in cosmology is that subscript 0 denotes present-day values, so $t_0$ is the current age of the universe. The scale factor is related to the observed redshift $z$ of the light emitted at time $t_\mathrm{em}$ by

$$a(t_\mathrm{em}) = \frac{1}{1+z}\,.$$
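
As a quick check of this relation (a minimal sketch using only the formula above), the scale factor at emission follows directly from the observed redshift:

```python
# Sketch: scale factor a at the time of emission from the observed redshift z,
# using a(t_em) = 1 / (1 + z) with the present-day normalization a0 = 1.

def scale_factor_from_redshift(z: float) -> float:
    return 1.0 / (1.0 + z)

for z in (0.0, 1.0, 3.0, 1089.9):    # 1089.9 is roughly the redshift at decoupling
    print(f"z = {z:7.1f} -> a = {scale_factor_from_redshift(z):.5f}")
```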

The expansion rate is described by the time-dependent Hubble parameter, $H(t)$, defined as

$$H(t) \equiv \frac{\dot a}{a},$$

where $\dot a$ is the time-derivative of the scale factor. The first Friedmann equation gives the expansion rate in terms of the matter+radiation density $\rho$, the curvature $k$, and the cosmological constant $\Lambda$,

$$H^2 = \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{k c^2}{a^2} + \frac{\Lambda c^2}{3},$$

where, as usual, $c$ is the speed of light and $G$ is the gravitational constant. A critical density $\rho_\mathrm{crit}$ is the present-day density which gives zero curvature $k$, assuming the cosmological constant $\Lambda$ is zero, regardless of its actual value. Substituting these conditions into the Friedmann equation gives

$$\rho_\mathrm{crit} = \frac{3 H_0^2}{8 \pi G} = 1.878\,47 \times 10^{-26}\, h^2\ \mathrm{kg\,m^{-3}},$$

where $h \equiv H_0 / (100\ \mathrm{km\,s^{-1}\,Mpc^{-1}})$ is the reduced Hubble constant. If the cosmological constant were actually zero, the critical density would also mark the dividing line between eventual recollapse of the universe to a Big Crunch and unlimited expansion. For the Lambda-CDM model with a positive cosmological constant (as observed), the universe is predicted to expand forever regardless of whether the total density is slightly above or below the critical density, though other outcomes are possible in extended models where the dark energy is not constant but actually time-dependent.
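
A short sketch (standard SI constants; the Hubble constant is taken, for illustration, from the Planck parameter table later in this article) that evaluates the critical density formula above and reproduces the ~8.6 × 10⁻²⁷ kg/m³ quoted there:

```python
import math

# Sketch: critical density rho_crit = 3 H0^2 / (8 pi G).
# G and the Mpc-to-metre conversion are standard values; H0 is illustrative.

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.0857e22               # one megaparsec in metres

H0_km_s_per_Mpc = 67.74            # Hubble constant used for illustration
H0_si = H0_km_s_per_Mpc * 1e3 / MPC_IN_M      # convert to s^-1

rho_crit = 3 * H0_si**2 / (8 * math.pi * G)
print(f"rho_crit ~ {rho_crit:.2e} kg/m^3")    # ~8.6e-27 kg/m^3
```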

It is standard to define the present-day density parameter $\Omega_x$ for various species $x$ as the dimensionless ratio

$$\Omega_x \equiv \frac{\rho_x(t_0)}{\rho_\mathrm{crit}} = \frac{8 \pi G\, \rho_x(t_0)}{3 H_0^2},$$

where the subscript $x$ is one of $b$ for baryons, $c$ for cold dark matter, $\mathrm{rad}$ for radiation (photons plus relativistic neutrinos), and $\Lambda$ or $\mathrm{DE}$ for dark energy.

Since the densities of various species scale as different powers of $a$, e.g. $a^{-3}$ for matter etc., the Friedmann equation can be conveniently rewritten in terms of the various density parameters as

$$\frac{H(a)}{H_0} = \sqrt{\Omega_c\, a^{-3} + \Omega_b\, a^{-3} + \Omega_\mathrm{rad}\, a^{-4} + \Omega_k\, a^{-2} + \Omega_\mathrm{DE}\, a^{-3(1+w)}}\,,$$

where $w$ is the equation of state parameter of dark energy, and assuming negligible neutrino mass (significant neutrino mass requires a more complex equation). The various $\Omega$ parameters add up to $1$ by construction. In the general case this is integrated by computer to give the expansion history $a(t)$ and also observable distance-redshift relations for any chosen values of the cosmological parameters, which can then be compared with observations such as supernovae and baryon acoustic oscillations.
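
Since the general case is "integrated by computer", here is a minimal numerical sketch (flat universe, radiation and curvature neglected, illustrative Planck-like parameter values) of the dimensionless expansion rate $H(z)/H_0$ and the resulting comoving distance-redshift relation:

```python
import math

# Sketch: H(z)/H0 for a flat Lambda-CDM model, and the comoving distance
# D_C(z) = (c / H0) * integral_0^z dz' / E(z'), via a simple trapezoid rule.
# Parameter values are illustrative; radiation and curvature are neglected.

C_KM_S = 299_792.458
H0 = 67.74                           # km/s/Mpc
OMEGA_M, OMEGA_L = 0.3089, 0.6911    # flat: Omega_m + Omega_Lambda = 1

def E(z: float) -> float:
    """Dimensionless Hubble rate H(z)/H0."""
    return math.sqrt(OMEGA_M * (1.0 + z) ** 3 + OMEGA_L)

def comoving_distance_mpc(z: float, steps: int = 10_000) -> float:
    dz = z / steps
    integral = sum(0.5 * (1 / E(i * dz) + 1 / E((i + 1) * dz)) * dz for i in range(steps))
    return (C_KM_S / H0) * integral

for z in (0.5, 1.0, 2.0):
    print(f"z = {z}: D_C ~ {comoving_distance_mpc(z):.0f} Mpc")
```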

In the minimal 6-parameter Lambda-CDM model, it is assumed that the curvature $\Omega_k$ is zero and $w = -1$, so this simplifies to

$$H(a) = H_0 \sqrt{\Omega_m\, a^{-3} + \Omega_\mathrm{rad}\, a^{-4} + \Omega_\Lambda}\,.$$

Observations show that the radiation density is very small today, $\Omega_\mathrm{rad} \sim 10^{-4}$; if this term is neglected, the above has an analytic solution

$$a(t) = \left(\frac{\Omega_m}{\Omega_\Lambda}\right)^{1/3} \sinh^{2/3}\!\left(\tfrac{3}{2}\,\sqrt{\Omega_\Lambda}\, H_0\, t\right),$$

where this is fairly accurate for $a > 0.01$, or $t$ greater than roughly 10 million years. Solving $a(t_0) = 1$ for $t_0$ gives the present age of the universe in terms of the other parameters.
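
A small sketch (illustrative Planck-like parameter values, radiation neglected) that inverts the analytic solution above at $a = 1$ to obtain the present age, and checks the result:

```python
import math

# Sketch: a(t) = (Om/OL)^(1/3) * sinh^(2/3)( 1.5 * sqrt(OL) * H0 * t ) for matter + Lambda.
# Setting a(t0) = 1 gives t0 = 2 / (3 H0 sqrt(OL)) * asinh( sqrt(OL / Om) ).
# Parameter values are illustrative; radiation is neglected.

OM, OL = 0.3089, 0.6911
H0 = 67.74                                     # km/s/Mpc
MPC_IN_KM = 3.0857e19
GYR_IN_S = 3.156e16
H0_per_gyr = H0 / MPC_IN_KM * GYR_IN_S         # Hubble constant in 1/Gyr

t0 = 2 / (3 * H0_per_gyr * math.sqrt(OL)) * math.asinh(math.sqrt(OL / OM))
print(f"t0 ~ {t0:.2f} Gyr")                    # ~13.8 Gyr

def a_of_t(t_gyr: float) -> float:
    return (OM / OL) ** (1 / 3) * math.sinh(1.5 * math.sqrt(OL) * H0_per_gyr * t_gyr) ** (2 / 3)

print(f"a(t0) = {a_of_t(t0):.3f}")             # should come out as 1.000
```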

It follows that the transition from decelerating to accelerating expansion (the second derivative $\ddot a$ crossing zero) occurred when

$$a = \left(\frac{\Omega_m}{2\,\Omega_\Lambda}\right)^{1/3},$$

which evaluates to $a \approx 0.6$, or $z \approx 0.65$, for the best-fit parameters estimated from the Planck spacecraft.
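
With the same illustrative parameter values, the transition point follows in one line (a minimal sketch of the formula above):

```python
# Sketch: deceleration-to-acceleration transition at a = (Omega_m / (2 Omega_Lambda))^(1/3),
# with corresponding redshift z = 1/a - 1. Parameter values are illustrative.

OM, OL = 0.3089, 0.6911
a_accel = (OM / (2 * OL)) ** (1 / 3)
print(f"a ~ {a_accel:.2f}, z ~ {1 / a_accel - 1:.2f}")    # roughly a ~ 0.61, z ~ 0.65
```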

Historical development

The discovery of the cosmic microwave background (CMB) in 1964 confirmed a key prediction of the Big Bang cosmology. From that point on, it was generally accepted that the universe started in a hot, dense state and has been expanding over time. The rate of expansion depends on the types of matter and energy present in the universe, and in particular, whether the total density is above or below the so-called critical density.

During the 1970s, most attention focused on pure-baryonic models, but there were serious challenges explaining the formation of galaxies, given the small anisotropies in the CMB (upper limits at that time). In the early 1980s, it was realized that this could be resolved if cold dark matter dominated over the baryons, and the theory of cosmic inflation motivated models with critical density.

During the 1980s, most research focused on cold dark matter with critical density in matter, around 95% CDM and 5% baryons: these showed success at forming galaxies and clusters of galaxies, but problems remained; notably, the model required a Hubble constant lower than preferred by observations, and observations around 1988–1990 showed more large-scale galaxy clustering than predicted.

These difficulties sharpened with the discovery of CMB anisotropy by the Cosmic Background Explorer in 1992, and several modified CDM models, including ΛCDM and mixed cold and hot dark matter, came under active consideration through the mid-1990s. The ΛCDM model then became the leading model following the observations of accelerating expansion in 1998, and was quickly supported by other observations: in 2000, the BOOMERanG microwave background experiment measured the total (matter–energy) density to be close to 100% of critical, whereas in 2001 the 2dFGRS galaxy redshift survey measured the matter density to be near 25%; the large difference between these values supports a positive Λ or dark energy. Much more precise spacecraft measurements of the microwave background from WMAP in 2003–2010 and Planck in 2013–2015 have continued to support the model and pin down the parameter values, most of which are now constrained below 1 percent uncertainty.

There is currently active research into many aspects of the ΛCDM model, both to refine the parameters and possibly detect deviations. In addition, ΛCDM has no explicit physical theory for the origin or physical nature of dark matter or dark energy; the nearly scale-invariant spectrum of the CMB perturbations, and their image across the celestial sphere, are believed to result from very small thermal and acoustic irregularities at the point of recombination.

A large majority of astronomers and astrophysicists support the ΛCDM model or close relatives of it, but Milgrom, McGaugh, and Kroupa are leading critics, attacking the dark matter portions of the theory from the perspective of galaxy formation models and supporting the alternative modified Newtonian dynamics (MOND) theory. MOND requires a modification of the Einstein field equations and the Friedmann equations, as seen in proposals such as modified gravity theory (MOG theory) or tensor–vector–scalar gravity theory (TeVeS theory). Other proposals by theoretical astrophysicists of cosmological alternatives to Einstein's general relativity that attempt to account for dark energy or dark matter include f(R) gravity, scalar–tensor theories such as galileon theories, brane cosmologies, the DGP model, and massive gravity and its extensions such as bimetric gravity.

Successes

In addition to explaining pre-2000 observations, the model has made a number of successful predictions: notably the existence of the baryon acoustic oscillation feature, discovered in 2005 in the predicted location; and the statistics of weak gravitational lensing, first observed in 2000 by several teams. The polarization of the CMB, discovered in 2002 by DASI, is now a dramatic success: in the 2015 Planck data release, there are seven observed peaks in the temperature (TT) power spectrum, six peaks in the temperature-polarization (TE) cross spectrum, and five peaks in the polarization (EE) spectrum. The six free parameters can be well constrained by the TT spectrum alone, and then the TE and EE spectra can be predicted theoretically to few-percent precision with no further adjustments allowed: comparison of theory and observations shows an excellent match.

Challenges

Extensive searches for dark matter particles have so far shown no well-agreed detection; the dark energy may be almost impossible to detect in a laboratory, and its value is unnaturally small compared to naive theoretical predictions.

Comparison of the model with observations is very successful on large scales (larger than galaxies, up to the observable horizon), but may have some problems on sub-galaxy scales, possibly predicting too many dwarf galaxies and too much dark matter in the innermost regions of galaxies. This problem is called the "small scale crisis". These small scales are harder to resolve in computer simulations, so it is not yet clear whether the problem is the simulations, non-standard properties of dark matter, or a more radical error in the model.

It has been argued that the ΛCDM model is built upon a foundation of conventionalist stratagems, rendering it unfalsifiable in the sense defined by Karl Popper.

Parameters

Planck Collaboration Cosmological parameters

Independent parameters
  Physical baryon density parameter (Ωb h²): 0.02230 ± 0.00014
  Physical dark matter density parameter (Ωc h²): 0.1188 ± 0.0010
  Age of the universe (t0): (13.799 ± 0.021) × 10⁹ years
  Scalar spectral index (ns): 0.9667 ± 0.0040
  Curvature fluctuation amplitude, k0 = 0.002 Mpc⁻¹: (2.441 +0.088/−0.092) × 10⁻⁹
  Reionization optical depth (τ): 0.066 ± 0.012

Fixed parameters
  Total density parameter (Ωtot): 1
  Equation of state of dark energy (w): −1
  Tensor/scalar ratio (r): 0
  Running of spectral index: 0
  Sum of three neutrino masses: 0.06 eV/c²
  Effective number of relativistic degrees of freedom (Neff): 3.046

Calculated values
  Hubble constant (H0): 67.74 ± 0.46 km s⁻¹ Mpc⁻¹
  Baryon density parameter (Ωb): 0.0486 ± 0.0010
  Dark matter density parameter (Ωc): 0.2589 ± 0.0057
  Matter density parameter (Ωm): 0.3089 ± 0.0062
  Dark energy density parameter (ΩΛ): 0.6911 ± 0.0062
  Critical density (ρcrit): (8.62 ± 0.12) × 10⁻²⁷ kg/m³ [1]
  Present root-mean-square matter fluctuation averaged over a sphere of radius 8 h⁻¹ Mpc (σ8): 0.8159 ± 0.0086
  Redshift at decoupling (z∗): 1089.90 ± 0.23
  Age at decoupling (t∗): 377,700 ± 3,200 years
  Redshift of reionization, with uniform prior (zre): 8.5 +1.0/−1.1

The simple ΛCDM model is based on six parameters: physical baryon density parameter; physical dark matter density parameter; the age of the universe; scalar spectral index; curvature fluctuation amplitude; and reionization optical depth. In accordance with Occam's razor, six is the smallest number of parameters needed to give an acceptable fit to current observations; other possible parameters are fixed at "natural" values, e.g. total density parameter = 1.00, dark energy equation of state = −1. (See below for extended models that allow these to vary.)

The values of these six parameters are mostly not predicted by current theory (though, ideally, they may be related by a future "Theory of Everything"), except that most versions of cosmic inflation predict the scalar spectral index should be slightly smaller than 1, consistent with the estimated value 0.96. The parameter values, and uncertainties, are estimated using large computer searches to locate the region of parameter space providing an acceptable match to cosmological observations. From these six parameters, the other model values, such as the Hubble constant and the dark energy density, can be readily calculated.

Commonly, the set of observations fitted includes the cosmic microwave background anisotropy, the brightness/redshift relation for supernovae, and large-scale galaxy clustering including the baryon acoustic oscillation feature. Other observations, such as the Hubble constant, the abundance of galaxy clusters, weak gravitational lensing and globular cluster ages, are generally consistent with these, providing a check of the model, but are less precisely measured at present.

The parameter values listed above are from the Planck Collaboration Cosmological parameters 68% confidence limits for the base ΛCDM model from Planck CMB power spectra, in combination with lensing reconstruction and external data (BAO + JLA + H0).


  1. Calculated from h = H0 / (100 km s⁻¹ Mpc⁻¹) per ρcrit = 1.87847 × 10⁻²⁶ h² kg m⁻³.
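
As a hedged sketch of how the "calculated values" in the table follow from the independent parameters (using only the conversions stated in the notes here, with the Planck numbers quoted above):

```python
# Sketch: deriving some calculated values from the physical density parameters,
# using h = H0 / (100 km/s/Mpc) and rho_crit = 1.87847e-26 * h^2 kg/m^3 (note 1 above).

H0 = 67.74                          # km/s/Mpc, from the table
h = H0 / 100.0                      # reduced Hubble constant

omega_b_h2 = 0.02230                # physical baryon density parameter
omega_c_h2 = 0.1188                 # physical dark matter density parameter

omega_b = omega_b_h2 / h**2         # ~0.0486, baryon density parameter
omega_c = omega_c_h2 / h**2         # ~0.2589, dark matter density parameter
rho_crit = 1.87847e-26 * h**2       # ~8.62e-27 kg/m^3

print(f"Omega_b ~ {omega_b:.4f}, Omega_c ~ {omega_c:.4f}, rho_crit ~ {rho_crit:.2e} kg/m^3")
```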

Missing baryon problem

Extended models

Extended model parameters
  Total density parameter (Ωtot): 1.0023 +0.0056/−0.0054
  Equation of state of dark energy (w): −0.980 ± 0.053
  Tensor-to-scalar ratio (r): < 0.11, k0 = 0.002 Mpc⁻¹
  Running of the spectral index: −0.022 ± 0.020, k0 = 0.002 Mpc⁻¹
  Sum of three neutrino masses: < 0.58 eV/c²
  Physical neutrino density parameter (Ων h²): < 0.0062

Extended models allow one or more of the "fixed" parameters above to vary, in addition to the basic six; so these models join smoothly to the basic six-parameter model in the limit that the additional parameter(s) approach the default values. For example, possible extensions of the simplest ΛCDM model allow for spatial curvature (Ωtot may differ from 1); or quintessence rather than a cosmological constant, where the equation of state of dark energy is allowed to differ from −1. Cosmic inflation predicts tensor fluctuations (gravitational waves). Their amplitude is parameterized by the tensor-to-scalar ratio (denoted r), which is determined by the unknown energy scale of inflation. Other modifications allow hot dark matter in the form of neutrinos more massive than the minimal value, or a running spectral index; the latter is generally not favoured by simple cosmic inflation models.

Allowing additional variable parameter(s) will generally increase the uncertainties in the standard six parameters quoted above, and may also shift the central values slightly. The table above shows results for each of the possible "6+1" scenarios with one additional variable parameter; this indicates that, as of 2015, there is no convincing evidence that any additional parameter is different from its default value.

Some researchers have suggested that there is a running spectral index, but no statistically significant study has revealed one. Theoretical expectations suggest that the tensor-to-scalar ratio should be between 0 and 0.3, and the latest results are now within those limits.

  • The "physical baryon density parameter" Ωb h2 is the "baryon density parameter" Ωb multiplied by the square of the reduced Hubble constant h = H0 / (100 km s−1 Mpc−1). Likewise for the difference between "physical dark matter density parameter" and "dark matter density parameter".

  • A density ρx = Ωx ρcrit is expressed in terms of the critical density ρcrit, which is the total density of matter/energy needed for the universe to be spatially flat. Measurements indicate that the actual total density ρtot is very close if not equal to this value; see the extended model parameters above.

  • This is the minimal value allowed by solar and terrestrial neutrino oscillation experiments.

  • from the Standard Model of particle physics

  • Calculated from Ωb h² and h = H0 / (100 km s⁻¹ Mpc⁻¹).

  • Calculated from Ωc h² and h = H0 / (100 km s⁻¹ Mpc⁻¹).

Non-standard cosmology

    From Wikipedia, the free encyclopedia

    A non-standard cosmology is any physical cosmological model of the universe that was, or still is, proposed as an alternative to the then-current standard model of cosmology. The term non-standard is applied to any theory that does not conform to the scientific consensus. Because the term depends on the prevailing consensus, the meaning of the term changes over time. For example, hot dark matter would not have been considered non-standard in 1990, but would be in 2010. Conversely, a non-zero cosmological constant resulting in an accelerating universe would have been considered non-standard in 1990, but is part of the standard cosmology in 2010.

    Several major cosmological disputes have occurred throughout the history of cosmology. One of the earliest was the Copernican Revolution, which established the heliocentric model of the Solar System. More recent was the Great Debate of 1920, in the aftermath of which the Milky Way's status as but one of the Universe's many galaxies was established. From the 1940s to the 1960s, the astrophysical community was equally divided between supporters of the Big Bang theory and supporters of a rival steady state universe; this was eventually decided in favour of the Big Bang theory by advances in observational cosmology in the late 1960s. The current standard model of cosmology is the Lambda-CDM model, wherein the Universe is governed by General Relativity, began with a Big Bang and today is a nearly-flat universe that consists of approximately 5% baryons, 27% cold dark matter, and 68% dark energy.

    Lambda-CDM has been an extremely successful model, but retains some weaknesses (such as the dwarf galaxy problem). Research on extensions or modifications to Lambda-CDM, as well as fundamentally different models, is ongoing. Topics investigated include quintessence, Modified Newtonian Dynamics (MOND) and its relativistic generalization TeVeS, and warm dark matter.

    The Lambda-CDM model

    Before observational evidence was gathered, theorists developed frameworks based on what they understood to be the most general features of physics and philosophical assumptions about the universe. When Albert Einstein developed his general theory of relativity in 1915, this was used as a mathematical starting point for most cosmological theories. In order to arrive at a cosmological model, however, theoreticians needed to make assumptions about the nature of the largest scales of the universe. The assumptions that the current standard model of cosmology, Lambda-CDM, relies upon are:

    1. the universality of physical laws – that the laws of physics don't change from one place and time to another,
    2. the cosmological principle – that the universe is roughly homogeneous and isotropic in space though not necessarily in time, and
    3. the Copernican principle – that we are not observing the universe from a preferred locale.

    These assumptions when combined with General Relativity result in a universe that is governed by the Friedmann–Robertson–Walker metric (FRW metric). The FRW metric allows for a universe that is either expanding or contracting (as well as stationary but unstable universes). When Hubble's Law was discovered, most astronomers interpreted the law as a sign the universe is expanding. This implies the universe was smaller in the past, and therefore led to the following conclusions:

    1. the universe emerged from a hot, dense state at a finite time in the past,
    2. because the universe heats up as it contracts and cools as it expands, in the first moments that time existed as we know it, the temperatures were high enough for Big Bang nucleosynthesis to occur, and
    3. a cosmic microwave background pervading the entire universe should exist, which is a record of a phase transition that occurred when the atoms of the universe first formed.

    These features were derived by numerous individuals over a period of years; indeed it was not until the middle of the twentieth century that accurate predictions of the last feature and observations confirming its existence were made. Non-standard theories developed either by starting from different assumptions or by contradicting the features predicted by Lambda-CDM.

    History

    Modern physical cosmology as it is currently studied first emerged as a scientific discipline in the period after the Shapley–Curtis debate and discoveries by Edwin Hubble of a cosmic distance ladder, when astronomers and physicists had to come to terms with a universe that was of a much larger scale than the previously assumed galactic size. Theorists who successfully developed cosmologies applicable to the larger-scale universe are remembered today as the founders of modern cosmology. Among these scientists are Edward Arthur Milne, Willem de Sitter, Alexander Friedmann, Georges Lemaître, and Albert Einstein himself.

    After confirmation of Hubble's law by observation, the two most popular cosmological theories became the Steady State theory of Hoyle, Gold and Bondi, and the big bang theory of Ralph Alpher, George Gamow, and Robert Dicke, alongside a smattering of alternatives with small numbers of supporters. Since the discovery of the cosmic microwave background radiation (CMB) by Arno Penzias and Robert Wilson in 1965, most cosmologists concluded that observations were best explained by the big bang model. Proponents of the Steady State theory and other non-standard cosmologies were then tasked with providing an explanation for the phenomenon if they were to remain plausible. This led to original approaches including integrated starlight and cosmic iron whiskers, which were meant to provide a source for a pervasive, all-sky microwave background that was not due to an early universe phase transition.

    Artist's depiction of the WMAP spacecraft at the L2 point. Data gathered by this spacecraft has been successfully used to parametrize the features of standard cosmology, but complete analysis of the data in the context of any non-standard cosmology has not yet been achieved.

    Scepticism about the non-standard cosmologies' ability to explain the CMB caused interest in the subject to wane since then; however, there have been two periods in which interest in non-standard cosmology increased due to observational data that posed difficulties for the big bang. The first occurred in the late 1970s, when there were a number of unsolved problems, such as the horizon problem, the flatness problem, and the lack of magnetic monopoles, which challenged the big bang model. These issues were eventually resolved by cosmic inflation in the 1980s. This idea subsequently became part of the understanding of the big bang, although alternatives have been proposed from time to time. The second occurred in the mid-1990s, when observations of the ages of globular clusters and the primordial helium abundance apparently disagreed with the big bang. However, by the late 1990s, most astronomers had concluded that these observations did not challenge the big bang, and additional data from COBE and WMAP provided detailed quantitative measures consistent with standard cosmology.

    In the 1990s, a dawning of a "golden age of cosmology" was accompanied by a startling discovery that the expansion of the universe was, in fact, accelerating. Previous to this, it had been assumed that matter either in its visible or invisible dark matter form was the dominant energy density in the universe. This "classical" big bang cosmology was overthrown when it was discovered that nearly 70% of the energy in the universe was attributable to the cosmological constant, often referred to as "dark energy". This has led to the development of a so-called concordance ΛCDM model which combines detailed data obtained with new telescopes and techniques in observational astrophysics with an expanding, density-changing universe. Today, it is more common to find in the scientific literature proposals for "non-standard cosmologies" that actually accept the basic tenets of the big bang cosmology, while modifying parts of the concordance model. Such theories include alternative models of dark energy, such as quintessence, phantom energy and some ideas in brane cosmology; alternative models of dark matter, such as modified Newtonian dynamics; alternatives or extensions to inflation such as chaotic inflation and the ekpyrotic model; and proposals to supplement the universe with a first cause, such as the Hartle–Hawking boundary condition, the cyclic model, and the string landscape. There is no consensus about these ideas amongst cosmologists, but they are nonetheless active fields of academic inquiry.

    Today, heterodox non-standard cosmologies are generally considered unworthy of consideration by cosmologists while many of the historically significant nonstandard cosmologies are considered to have been falsified. The essentials of the big bang theory have been confirmed by a wide range of complementary and detailed observations, and no non-standard cosmologies have reproduced the range of successes of the big bang model. Speculations about alternatives are not normally part of research or pedagogical discussions, except as object lessons or for their historical importance. An open letter started by some remaining advocates of non-standard cosmology has affirmed that: "today, virtually all financial and experimental resources in cosmology are devoted to big bang studies...."

    Alternative gravity

    General relativity, upon which the FRW metric is based, is an extremely successful theory which has met every observational test so far. However, at a fundamental level it is incompatible with quantum mechanics, and by predicting singularities, it also predicts its own breakdown. Any alternative theory of gravity would imply immediately an alternative cosmological theory since current modeling is dependent on general relativity as a framework assumption. There are many different motivations to modify general relativity, such as to eliminate the need for dark matter or dark energy, or to avoid such paradoxes as the firewall.

    Machian universe

    Ernst Mach proposed that inertia was due to the gravitational effects of the mass distribution of the universe, an idea that led naturally to speculation about the cosmological implications of such a proposal. Carl Brans and Robert Dicke were able to successfully incorporate Mach's principle into general relativity, which admitted cosmological solutions that would imply a variable mass. The homogeneously distributed mass of the universe would result in a roughly scalar field that permeated the universe and would serve as a source for Newton's gravitational constant, yielding a scalar–tensor theory of gravity.

    MOND

    Modified Newtonian Dynamics (MOND) is a relatively modern proposal to explain the galaxy rotation problem based on a variation of Newton's Second Law of Dynamics at low accelerations. This would produce a large-scale variation of Newton's universal theory of gravity. A modification of Newton's theory would also imply a modification of general relativistic cosmology, inasmuch as Newtonian cosmology is the limit of Friedmann cosmology. While almost all astrophysicists today reject MOND in favor of dark matter, a small number of researchers continue to enhance it, recently incorporating Brans–Dicke theories into treatments that attempt to account for cosmological observations.

    TeVeS

    Tensor–vector–scalar gravity (TeVeS) is a proposed relativistic theory that is equivalent to Modified Newtonian dynamics (MOND) in the non-relativistic limit, which purports to explain the galaxy rotation problem without invoking dark matter. Originated by Jacob Bekenstein in 2004, it incorporates various dynamical and non-dynamical tensor fields, vector fields and scalar fields.

    The breakthrough of TeVeS over MOND is that it can explain the phenomenon of gravitational lensing, a cosmic optical illusion in which matter bends light, which has been confirmed many times. A recent preliminary finding is that it can explain structure formation without CDM, but requires a ~2 eV massive neutrino (these are also required to fit some clusters of galaxies, including the Bullet Cluster). However, other authors (see Slosar, Melchiorri and Silk) argue that TeVeS cannot explain cosmic microwave background anisotropies and structure formation at the same time, i.e. those models are ruled out at high significance.

    f(R) gravity

    f(R) gravity is a family of theories that modify general relativity by defining a different function of the Ricci scalar. The simplest case is just the function being equal to the scalar; this is general relativity. As a consequence of introducing an arbitrary function, there may be freedom to explain the accelerated expansion and structure formation of the Universe without adding unknown forms of dark energy or dark matter. Some functional forms may be inspired by corrections arising from a quantum theory of gravity. f(R) gravity was first proposed in 1970 by Hans Adolph Buchdahl (although φ was used rather than f for the name of the arbitrary function). It has become an active field of research following work by Starobinsky on cosmic inflation. A wide range of phenomena can be produced from this theory by adopting different functions; however, many functional forms can now be ruled out on observational grounds, or because of pathological theoretical problems.

    Steady State theories

    The Steady State theory extends the homogeneity assumption of the cosmological principle to reflect a homogeneity in time as well as in space. This "perfect cosmological principle" as it would come to be called asserted that the universe looks the same everywhere (on the large scale), the same as it always has and always will. This is in contrast to Lambda-CDM, in which the universe looked very different in the past and will look very different in the future. Steady State theory was proposed in 1948 by Fred Hoyle, Thomas Gold, Hermann Bondi and others. In order to maintain the perfect cosmological principle in an expanding universe, steady state cosmology had to posit a "matter-creation field" (the so-called C-field) that would insert matter into the universe in order to maintain a constant density.

    The debate between the Big Bang and the Steady State models continued for 15 years, with camps roughly evenly divided, until the discovery of the cosmic microwave background radiation. This radiation is a natural feature of the Big Bang model, which demands a "time of last scattering" at which photons decouple from baryonic matter. The Steady State model proposed that this radiation could be accounted for by so-called "integrated starlight", a background caused in part by Olbers' paradox in an infinite universe. In order to account for the uniformity of the background, steady state proponents posited a fog effect associated with microscopic iron particles that would scatter radio waves in such a manner as to produce an isotropic CMB. The proposed phenomenon was whimsically named "cosmic iron whiskers" and served as the thermalization mechanism. The Steady State theory did not have the horizon problem of the Big Bang, because it assumed an infinite amount of time was available for thermalizing the background.

    As more cosmological data began to be collected, cosmologists began to realize that the Big Bang correctly predicted the abundance of light elements observed in the cosmos. What was a coincidental ratio of hydrogen to deuterium and helium in the steady state model was a feature of the Big Bang model. Additionally, detailed measurements of the CMB since the 1990s with the COBE, WMAP and Planck observations indicated that the spectrum of the background was closer to a blackbody than any other source in nature. The best that integrated starlight models could predict was thermalization to the level of 10%, while the COBE satellite measured the deviation at one part in 10⁵. After this dramatic discovery, the majority of cosmologists became convinced that the steady state theory could not explain the observed CMB properties.

    Although the original steady state model is now considered to be contrary to observations (particularly the CMB) even by its one-time supporters, modifications of the steady state model have been proposed, including a model that envisions the universe as originating through many little bangs rather than one big bang (the so-called "quasi-steady state cosmology"). It supposes that the universe goes through periodic expansion and contraction phases, with a soft "rebound" in place of the Big Bang. Thus the Hubble Law is explained by the fact that the universe is currently in an expansion phase. Work continues on this model (most notably by Jayant V. Narlikar), although it has not gained widespread mainstream acceptance.

    Anisotropic universe

    Isotropy – the idea that the universe looks the same in all directions – is one of the core assumptions that enters into the FRW equations. In 2008, however, scientists working on Wilkinson Microwave Anisotropy Probe data claimed to have detected a 600–1000 km/s flow of clusters toward a 20-degree patch of sky between the constellations of Centaurus and Vela. They suggested that the motion may be a remnant of the influence of no-longer-visible regions of the universe prior to inflation. The detection is controversial, and other scientists have found that the universe is isotropic to a great degree.

    Exotic dark matter and dark energy

    In Lambda-CDM, dark matter is an extremely inert form of matter that does not interact with either ordinary matter (baryons) or light, but still exerts gravitational effects. To produce the large-scale structure we see today, dark matter is "cold" (the 'C' in Lambda-CDM), i.e. non-relativistic. Dark energy is an unknown form of energy that tends to accelerate the expansion of the universe. Neither dark matter nor dark energy has been conclusively identified, and their exact nature is the subject of intense study. For example, scientists have hypothesized that dark matter could decay into dark energy, or that both dark matter and dark energy are different facets of the same underlying fluid. Other theories that aim to explain one or the other, such as warm dark matter and quintessence, also fall into this category.

    Proposals based on observational skepticism

    As the observational cosmology began to develop, certain astronomers began to offer alternative speculations regarding the interpretation of various phenomena that occasionally became parts of non-standard cosmologies.

    Tired light

    Tired light theories challenge the common interpretation of Hubble's Law as a sign that the universe is expanding. The idea was proposed by Fritz Zwicky in 1929. The basic proposal amounted to light losing energy ("getting tired") due to the distance it traveled rather than any metric expansion or physical recession of sources from observers. A traditional explanation of this effect was to attribute a dynamical friction to photons; the photons' gravitational interactions with stars and other material would progressively reduce their momentum, thus producing a redshift. Other proposals for explaining how photons could lose energy included the scattering of light by intervening material in a process similar to observed interstellar reddening. However, all these processes would also tend to blur images of distant objects, and no such blurring has been detected.

    Traditional tired light has been found incompatible with the observed time dilation that is associated with the cosmological redshift. This idea is mostly remembered as a falsified alternative explanation for Hubble's law in most astronomy or cosmology discussions.

    Dirac large numbers hypothesis

    The Dirac large numbers hypothesis uses the ratio of the size of the visible universe to the radius of a quantum particle to predict the age of the universe. The coincidence of various ratios being close in order of magnitude may ultimately prove meaningless, or an indication of a deeper connection between concepts in a future theory of everything. Nevertheless, attempts to use such ideas have been criticized as numerology.

    Redshift periodicity and intrinsic redshifts

    Halton Arp in London, Oct 2000

    Some astrophysicists were unconvinced that the cosmological redshifts are caused by universal cosmological expansion. Skepticism and alternative explanations began appearing in the scientific literature in the 1960s. In particular, Geoffrey Burbidge, William Tifft and Halton Arp were all observational astrophysicists who proposed that there were inconsistencies in the redshift observations of galaxies and quasars. The first two were famous for suggesting that there were periodicities in the redshift distributions of galaxies and quasars. Subsequent statistical analyses of redshift surveys, however, have not confirmed the existence of these periodicities.

    During the quasar controversies of the 1970s, these same astronomers were also of the opinion that quasars exhibited high redshifts not due to their incredible distance but rather due to unexplained intrinsic redshift mechanisms that would cause the periodicities and cast doubt on the Big Bang.

    Arguments over how distant quasars were took the form of debates surrounding quasar energy production mechanisms, their light curves, and whether quasars exhibited any proper motion. Astronomers who believed quasars were not at cosmological distances argued that the Eddington luminosity set limits on how distant the quasars could be since the energy output required to explain the apparent brightness of cosmologically-distant quasars was far too high to be explainable by nuclear fusion alone. This objection was made moot by the improved models of gravity-powered accretion disks which for sufficiently dense material (such as black holes) can be more efficient at energy production than nuclear reactions. The controversy was laid to rest by the 1990s when evidence became available that observed quasars were actually the ultra-luminous cores of distant active galactic nuclei and that the major components of their redshift were in fact due to the Hubble flow.

    Throughout his career, Halton Arp maintained that there were anomalies in his observations of quasars and galaxies, and that those anomalies served as a refutation of the Big Bang. In particular, Arp pointed out examples of quasars that were close to the line of sight of (relatively) nearby active, mainly Seyfert galaxies. These objects are now classified under the term active galactic nuclei (AGN); Arp criticized the use of this term on the grounds that it is not empirical. He claimed that clusters of quasars were in alignment around the cores of these galaxies and that quasars, rather than being the cores of distant AGN, were actually much closer, star-like objects ejected from the centers of nearby galaxies with high intrinsic redshifts. Arp also contended that they gradually lost their non-cosmological redshift component and eventually evolved into full-fledged galaxies. This stands in stark contradiction to the accepted models of galaxy formation.

    The biggest problem with Arp's analysis is that today there are hundreds of thousands of quasars with known redshifts discovered by various sky surveys. The vast majority of these quasars are not correlated in any way with nearby AGN. Indeed, with improved observing techniques, a number of host galaxies have been observed around quasars which indicates that those quasars at least really are at cosmological distances and are not the kind of objects Arp proposes. Arp's analysis, according to most scientists, suffers from being based on small number statistics and hunting for peculiar coincidences and odd associations. Unbiased samples of sources, taken from numerous galaxy surveys of the sky show none of the proposed 'irregularities', nor that any statistically significant correlations exist.

    In addition, it is not clear what mechanism would be responsible for intrinsic redshifts or their gradual dissipation over time. It is also unclear how nearby quasars would explain some features in the spectrum of quasars which the standard model easily explains. In the standard cosmology, clouds of neutral hydrogen between the quasar and the earth create Lyman alpha absorption lines having different redshifts up to that of the quasar itself; this feature is called the Lyman-alpha forest. Moreover, in extreme quasars one can observe the absorption of neutral hydrogen which has not yet been reionized in a feature known as the Gunn–Peterson trough. Most cosmologists see this missing theoretical work as sufficient reason to explain the observations as either chance or error.

    Halton Arp proposed an explanation for his observations through a Machian "variable mass hypothesis". The variable-mass theory invokes constant matter creation from active galactic nuclei, which puts it into the class of steady-state theories. With the passing of Halton Arp, this cosmology has been relegated to the status of a dismissed theory.

    Plasma cosmology

    In 1965, Hannes Alfvén proposed a "plasma cosmology" theory of the universe based in part on scaling observations of space plasma physics and experiments on plasmas in terrestrial laboratories to cosmological scales orders of magnitude greater. Taking matter–antimatter symmetry as a starting point, Alfvén together with Oskar Klein proposed the Alfvén–Klein cosmology model: since most of the local universe was composed of matter and not antimatter, there may be large bubbles of matter and antimatter that would globally balance to equality. The difficulties with this model were apparent almost immediately. Matter–antimatter annihilation results in the production of high energy photons, which were not observed. While it was possible that the local "matter-dominated" cell was simply larger than the observable universe, this proposition did not lend itself to observational tests.

    Like the steady state theory, plasma cosmology includes a Strong Cosmological Principle which assumes that the universe is isotropic in time as well as in space. Matter is explicitly assumed to have always existed, or at least that it formed at a time so far in the past as to be forever beyond humanity's empirical methods of investigation.

    While plasma cosmology has never had the support of most astronomers or physicists, a small number of plasma researchers have continued to promote and develop the approach, and publish in special issues of the IEEE Transactions on Plasma Science. A few papers regarding plasma cosmology were published in other mainstream journals until the 1990s. Additionally, in 1991, Eric J. Lerner, an independent researcher in plasma physics and nuclear fusion, wrote a popular-level book supporting plasma cosmology called The Big Bang Never Happened. At that time there was renewed interest in the subject among the cosmological community, along with other non-standard cosmologies, due to anomalous results reported in 1987 by Andrew Lange and Paul Richards of UC Berkeley and Toshio Matsumoto of Nagoya University that indicated the cosmic microwave background might not have a blackbody spectrum. However, the final announcement (in April 1992) of COBE satellite data corrected the earlier contradiction of the Big Bang; the popularity of plasma cosmology has since fallen.

    Nucleosynthesis objections

    One of the major successes of the Big Bang theory has been to provide a prediction that corresponds to the observations of the abundance of light elements in the universe. Along with the explanation provided for Hubble's law and for the cosmic microwave background, this observation has proved very difficult for alternative theories to explain.

    Theories which assert that the universe has an infinite age, including many of the theories described above, fail to account for the abundance of deuterium in the cosmos, because deuterium easily undergoes nuclear fusion in stars and there are no known astrophysical processes other than the Big Bang itself that can produce it in large quantities. Hence the fact that deuterium is not an extremely rare component of the universe suggests that the universe has a finite age.

    Theories which assert that the universe has a finite life, but that the Big Bang did not happen, have problems with the abundance of helium-4. The observed amount of 4He is far larger than the amount that should have been created via stars or any other known process. By contrast, the abundance of 4He in Big Bang models is very insensitive to assumptions about baryon density, changing only a few percent as the baryon density changes by several orders of magnitude. The observed value of 4He is within the range calculated.

    Mathematical universe hypothesis

    From Wikipedia, the free encyclopedia
     
    In physics and cosmology, the mathematical universe hypothesis (MUH), also known as the ultimate ensemble theory, is a speculative "theory of everything" (TOE) proposed by cosmologist Max Tegmark.

    Description

    Tegmark's MUH is: Our external physical reality is a mathematical structure. That is, the physical universe is not merely described by mathematics, but is mathematics (specifically, a mathematical structure). Mathematical existence equals physical existence, and all structures that exist mathematically exist physically as well. Observers, including humans, are "self-aware substructures (SASs)". In any mathematical structure complex enough to contain such substructures, they "will subjectively perceive themselves as existing in a physically 'real' world".

    The theory can be considered a form of Pythagoreanism or Platonism in that it proposes the existence of mathematical entities; a form of mathematical monism in that it denies that anything exists except mathematical objects; and a formal expression of ontic structural realism.

    Tegmark claims that the hypothesis has no free parameters and is not observationally ruled out. Thus, he reasons, it is preferred over other theories-of-everything by Occam's Razor. Tegmark also considers augmenting the MUH with a second assumption, the computable universe hypothesis (CUH), which says that the mathematical structure that is our external physical reality is defined by computable functions.

    The MUH is related to Tegmark's categorization of four levels of the multiverse. This categorization posits a nested hierarchy of increasing diversity, with worlds corresponding to different sets of initial conditions (level 1), physical constants (level 2), quantum branches (level 3), and altogether different equations or mathematical structures (level 4).

    Reception

    Andreas Albrecht of Imperial College in London called it a "provocative" solution to one of the central problems facing physics. Although he "wouldn't dare" go so far as to say he believes it, he noted that "it's actually quite difficult to construct a theory where everything we see is all there is".

    Criticisms and responses

    Definition of the ensemble

    Jürgen Schmidhuber argues that "Although Tegmark suggests that '... all mathematical structures are a priori given equal statistical weight,' there is no way of assigning equal non-vanishing probability to all (infinitely many) mathematical structures." Schmidhuber puts forward a more restricted ensemble which admits only universe representations describable by constructive mathematics, that is, computer programs; e.g., the Global Digital Mathematics Library and Digital Library of Mathematical Functions, linked open data representations of formalized fundamental theorems intended to serve as building blocks for additional mathematical results. He explicitly includes universe representations describable by non-halting programs whose output bits converge after finite time, although the convergence time itself may not be predictable by a halting program, due to the undecidability of the halting problem.

    In response, Tegmark notes (sec. V.E) that a constructive mathematics formalized measure of free parameter variations of physical dimensions, constants, and laws over all universes has not yet been constructed for the string theory landscape either, so this should not be regarded as a "show-stopper".

    Consistency with Gödel's theorem

    It has also been suggested that the MUH is inconsistent with Gödel's incompleteness theorem. In a three-way debate between Tegmark and fellow physicists Piet Hut and Mark Alford, the "secularist" (Alford) states that "the methods allowed by formalists cannot prove all the theorems in a sufficiently powerful system... The idea that math is 'out there' is incompatible with the idea that it consists of formal systems."

    Tegmark's response in (sec VI.A.1) is to offer a new hypothesis "that only Gödel-complete (fully decidable) mathematical structures have physical existence. This drastically shrinks the Level IV multiverse, essentially placing an upper limit on complexity, and may have the attractive side effect of explaining the relative simplicity of our universe." Tegmark goes on to note that although conventional theories in physics are Gödel-undecidable, the actual mathematical structure describing our world could still be Gödel-complete, and "could in principle contain observers capable of thinking about Gödel-incomplete mathematics, just as finite-state digital computers can prove certain theorems about Gödel-incomplete formal systems like Peano arithmetic." In (sec. VII) he gives a more detailed response, proposing as an alternative to MUH the more restricted "Computable Universe Hypothesis" (CUH) which only includes mathematical structures that are simple enough that Gödel's theorem does not require them to contain any undecidable or uncomputable theorems. Tegmark admits that this approach faces "serious challenges", including (a) it excludes much of the mathematical landscape; (b) the measure on the space of allowed theories may itself be uncomputable; and (c) "virtually all historically successful theories of physics violate the CUH".

    Observability

    Stoeger, Ellis, and Kircher (sec. 7) note that in a true multiverse theory, "the universes are then completely disjoint and nothing that happens in any one of them is causally linked to what happens in any other one. This lack of any causal connection in such multiverses really places them beyond any scientific support". Ellis (p29) specifically criticizes the MUH, stating that an infinite ensemble of completely disconnected universes is "completely untestable, despite hopeful remarks sometimes made, see, e.g., Tegmark (1998)." Tegmark maintains that MUH is testable, stating that it predicts (a) that "physics research will uncover mathematical regularities in nature", and (b) by assuming that we occupy a typical member of the multiverse of mathematical structures, one could "start testing multiverse predictions by assessing how typical our universe is" (sec. VIII.C).

    Plausibility of radical Platonism

    The MUH is based on the Radical Platonist view that math is an external reality (sec V.C). However, Jannes argues that "mathematics is at least in part a human construction", on the basis that if it is an external reality, then it should be found in some other animals as well: "Tegmark argues that, if we want to give a complete description of reality, then we will need a language independent of us humans, understandable for non-human sentient entities, such as aliens and future supercomputers". Brian Greene ( p. 299) argues similarly: "The deepest description of the universe should not require concepts whose meaning relies on human experience or interpretation. Reality transcends our existence and so shouldn't, in any fundamental way, depend on ideas of our making."

    However, there are many non-human entities, plenty of which are intelligent, and many of which can apprehend, memorise, compare and even approximately add numerical quantities. Several animals have also passed the mirror test of self-consciousness. But a few surprising examples of mathematical abstraction notwithstanding (for example, chimpanzees can be trained to carry out symbolic addition with digits, or the report of a parrot understanding a "zero-like concept"), all examples of animal intelligence with respect to mathematics are limited to basic counting abilities. Jannes adds, "non-human intelligent beings should exist that understand the language of advanced mathematics. However, none of the non-human intelligent beings that we know of confirm the status of (advanced) mathematics as an objective language." In the paper "On Math, Matter and Mind" the secularist viewpoint examined argues (sec. VI.A) that math is evolving over time, there is "no reason to think it is converging to a definite structure, with fixed questions and established ways to address them", and also that "The Radical Platonist position is just another metaphysical theory like solipsism... In the end the metaphysics just demands that we use a different language for saying what we already knew." Tegmark responds (sec VI.A.1) that "The notion of a mathematical structure is rigorously defined in any book on Model Theory", and that non-human mathematics would only differ from our own "because we are uncovering a different part of what is in fact a consistent and unified picture, so math is converging in this sense." In his 2014 book on the MUH, Tegmark argues that the resolution is not that we invent the language of mathematics, but that we discover the structure of mathematics.

    Coexistence of all mathematical structures

    Don Page has argued (sec 4) that "At the ultimate level, there can be only one world and, if mathematical structures are broad enough to include all possible worlds or at least our own, there must be one unique mathematical structure that describes ultimate reality. So I think it is logical nonsense to talk of Level 4 in the sense of the co-existence of all mathematical structures." This means there can only be one mathematical corpus. Tegmark responds (sec. V.E) that "this is less inconsistent with Level IV than it may sound, since many mathematical structures decompose into unrelated substructures, and separate ones can be unified."

    Consistency with our "simple universe"

    Alexander Vilenkin comments (Ch. 19, p. 203) that "the number of mathematical structures increases with increasing complexity, suggesting that 'typical' structures should be horrendously large and cumbersome. This seems to be in conflict with the beauty and simplicity of the theories describing our world". He goes on to note (footnote 8, p. 222) that Tegmark's solution to this problem, the assigning of lower "weights" to the more complex structures (sec. V.B) seems arbitrary ("Who determines the weights?") and may not be logically consistent ("It seems to introduce an additional mathematical structure, but all of them are supposed to be already included in the set").

    Occam's razor

    Tegmark has been criticized as misunderstanding the nature and application of Occam's razor; Massimo Pigliucci reminds us that "Occam's razor is just a useful heuristic, it should never be used as the final arbiter to decide which theory is to be favored".

    Introduction to entropy

    From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Introduct...