
Saturday, October 23, 2021

Dark energy

From Wikipedia, the free encyclopedia

In physical cosmology and astronomy, dark energy is an unknown form of energy that affects the universe on the largest scales. The first observational evidence for its existence came from measurements of supernovae, which showed that the universe does not expand at a constant rate; rather, the expansion of the universe is accelerating. Understanding the evolution of the universe requires knowledge of its starting conditions and its composition. Prior to these observations, it was thought that all forms of matter and energy in the universe would only cause the expansion to slow down over time. Measurements of the cosmic microwave background suggest the universe began in a hot Big Bang, from which general relativity explains its evolution and the subsequent large-scale motion. Without introducing a new form of energy, there was no way to explain the observed acceleration. Since the 1990s, dark energy has been the most widely accepted explanation for the accelerated expansion. As of 2021, there are active areas of cosmology research aimed at understanding the fundamental nature of dark energy.

Assuming that the lambda-CDM model of cosmology is correct, the best current measurements indicate that dark energy contributes 68% of the total energy in the present-day observable universe. The mass–energy of dark matter and ordinary (baryonic) matter contributes 26% and 5%, respectively, and other components such as neutrinos and photons contribute a very small amount. The density of dark energy is very low (~ 7 × 10−30 g/cm3), much less than the density of ordinary matter or dark matter within galaxies. However, it dominates the mass–energy of the universe because it is uniform across space.
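
These figures can be cross-checked with a short calculation: for a flat universe the critical density is ρc = 3H0²/(8πG), and dark energy contributes about 68% of it. A minimal Python sketch, assuming an illustrative Hubble constant of 68 km/s/Mpc:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.0857e22   # one megaparsec in metres
H0 = 68e3 / MPC_IN_M   # Hubble constant ~68 km/s/Mpc, converted to s^-1

# Critical density of a flat universe: rho_c = 3 H0^2 / (8 pi G)
rho_crit = 3 * H0 ** 2 / (8 * math.pi * G)   # kg/m^3

# Dark energy at ~68% of the critical density
rho_de = 0.68 * rho_crit

print(f"critical density: {rho_crit:.2e} kg/m^3")
print(f"dark energy:      {rho_de:.2e} kg/m^3 = {rho_de / 1000:.1e} g/cm^3")
```

With these inputs the dark-energy density comes out near 6 × 10−30 g/cm3, the same order as the value quoted above; the exact number shifts with the assumed Hubble constant and dark-energy fraction.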

Two proposed forms of dark energy are the cosmological constant, representing a constant energy density filling space homogeneously, and scalar fields such as quintessence or moduli, dynamic quantities having energy densities that can vary in time and space. Contributions from scalar fields that are constant in space are usually also included in the cosmological constant. The cosmological constant can be formulated to be equivalent to the zero-point radiation of space i.e. the vacuum energy. Scalar fields that change in space can be difficult to distinguish from a cosmological constant because the change may be extremely slow.

Due to the toy model nature of concordance cosmology, some experts believe that a more accurate general relativistic treatment of the structures that exist on all scales in the real universe may do away with the need to invoke dark energy. Inhomogeneous cosmologies, which attempt to account for the back-reaction of structure formation on the metric, generally do not acknowledge any dark energy contribution to the energy density of the Universe.

History of discovery and previous speculation

Einstein's cosmological constant

The "cosmological constant" is a constant term that can be added to Einstein's field equation of general relativity. If considered as a "source term" in the field equation, it can be viewed as equivalent to the mass of empty space (which conceptually could be either positive or negative), or "vacuum energy".

The cosmological constant was first proposed by Einstein as a mechanism to obtain a solution of the gravitational field equation that would lead to a static universe, effectively using dark energy to balance gravity. Einstein gave the cosmological constant the symbol Λ (capital lambda). Einstein stated that the cosmological constant required that 'empty space takes the role of gravitating negative masses which are distributed all over the interstellar space'.

The mechanism was an example of fine-tuning, and it was later realized that Einstein's static universe would not be stable: local inhomogeneities would ultimately lead to either the runaway expansion or contraction of the universe. The equilibrium is unstable: if the universe expands slightly, then the expansion releases vacuum energy, which causes yet more expansion. Likewise, a universe which contracts slightly will continue contracting. These sorts of disturbances are inevitable, due to the uneven distribution of matter throughout the universe. Further, observations made by Edwin Hubble in 1929 showed that the universe appears to be expanding and not static at all. Einstein reportedly referred to his failure to predict the idea of a dynamic universe, in contrast to a static universe, as his greatest blunder.

Inflationary dark energy

Alan Guth and Alexei Starobinsky proposed in 1980 that a negative pressure field, similar in concept to dark energy, could drive cosmic inflation in the very early universe. Inflation postulates that some repulsive force, qualitatively similar to dark energy, resulted in an enormous and exponential expansion of the universe slightly after the Big Bang. Such expansion is an essential feature of most current models of the Big Bang. However, inflation must have occurred at a much higher energy density than the dark energy we observe today and is thought to have completely ended when the universe was just a fraction of a second old. It is unclear what relation, if any, exists between dark energy and inflation. Even after inflationary models became accepted, the cosmological constant was thought to be irrelevant to the current universe.

Nearly all inflation models predict that the total (matter+energy) density of the universe should be very close to the critical density. During the 1980s, most cosmological research focused on models with critical density in matter only, usually 95% cold dark matter (CDM) and 5% ordinary matter (baryons). These models were found to be successful at forming realistic galaxies and clusters, but some problems appeared in the late 1980s: in particular, the model required a value for the Hubble constant lower than preferred by observations, and the model under-predicted observations of large-scale galaxy clustering. These difficulties became stronger after the discovery of anisotropy in the cosmic microwave background by the COBE spacecraft in 1992, and several modified CDM models came under active study through the mid-1990s: these included the Lambda-CDM model and a mixed cold/hot dark matter model. The first direct evidence for dark energy came from supernova observations in 1998 of accelerated expansion in Riess et al. and in Perlmutter et al., and the Lambda-CDM model then became the leading model. Soon after, dark energy was supported by independent observations: in 2000, the BOOMERanG and Maxima cosmic microwave background (CMB) experiments observed the first acoustic peak in the CMB, showing that the total (matter+energy) density is close to 100% of critical density. Then in 2001, the 2dF Galaxy Redshift Survey gave strong evidence that the matter density is around 30% of critical. The large difference between these two supports a smooth component of dark energy making up the difference. Much more precise measurements from WMAP in 2003–2010 have continued to support the standard model and give more accurate measurements of the key parameters.

The term "dark energy", echoing Fritz Zwicky's "dark matter" from the 1930s, was coined by Michael Turner in 1998.

Change in expansion over time

[Figure: Diagram representing the accelerated expansion of the universe due to dark energy.]

High-precision measurements of the expansion of the universe are required to understand how the expansion rate changes over time and space. In general relativity, the evolution of the expansion rate is estimated from the curvature of the universe and the cosmological equation of state (the relationship between temperature, pressure, and combined matter, energy, and vacuum energy density for any region of space). Measuring the equation of state for dark energy is one of the biggest efforts in observational cosmology today. Adding the cosmological constant to cosmology's standard FLRW metric leads to the Lambda-CDM model, which has been referred to as the "standard model of cosmology" because of its precise agreement with observations.

As of 2013, the Lambda-CDM model is consistent with a series of increasingly rigorous cosmological observations, including the Planck spacecraft and the Supernova Legacy Survey. First results from the SNLS reveal that the average behavior (i.e., equation of state) of dark energy is consistent with Einstein's cosmological constant to a precision of 10%. Recent results from the Hubble Space Telescope Higher-Z Team indicate that dark energy has been present for at least 9 billion years and during the period preceding cosmic acceleration.

Nature

The nature of dark energy is more hypothetical than that of dark matter, and many things about it remain in the realm of speculation. Dark energy is thought to be very homogeneous and not very dense, and is not known to interact through any of the fundamental forces other than gravity. Since it is quite rarefied and diffuse—roughly 10−27 kg/m3—it is unlikely to be detectable in laboratory experiments. The reason dark energy can have such a profound effect on the universe, making up 68% of the universe's density in spite of being so dilute, is that it uniformly fills otherwise empty space.

Independently of its actual nature, dark energy would need to have a strong negative pressure to explain the observed acceleration of the expansion of the universe. According to general relativity, the pressure within a substance contributes to its gravitational attraction for other objects just as its mass density does. This happens because the physical quantity that causes matter to generate gravitational effects is the stress–energy tensor, which contains both the energy (or matter) density of a substance and its pressure. In the Friedmann–Lemaître–Robertson–Walker metric, it can be shown that a strong constant negative pressure (i.e., tension) in all the universe causes an acceleration in the expansion if the universe is already expanding, or a deceleration in contraction if the universe is already contracting. This accelerating expansion effect is sometimes labeled "gravitational repulsion".
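
In the FLRW framework this condition is captured by the acceleration equation, ä/a = −(4πG/3)(ρ + 3p/c²): a fluid with equation of state p = wρc² accelerates the expansion exactly when w < −1/3. A small sketch of this sign check:

```python
# Friedmann acceleration equation (flat FLRW):
#   a''/a = -(4 pi G / 3) * (rho + 3 p / c^2)
# With equation of state p = w * rho * c^2, the expansion accelerates
# exactly when (1 + 3w) < 0, i.e. w < -1/3.

def accelerates(w: float) -> bool:
    """True if a fluid with equation-of-state parameter w drives acceleration."""
    return 1 + 3 * w < 0

for name, w in [("matter", 0.0), ("radiation", 1 / 3),
                ("cosmological constant", -1.0), ("quintessence-like", -0.8)]:
    print(f"{name:22s} w = {w:+.2f} -> accelerates: {accelerates(w)}")
```

A cosmological constant (w = −1) comfortably satisfies the condition, while ordinary matter (w = 0) and radiation (w = 1/3) decelerate the expansion.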

Technical definition

In standard cosmology, there are three components of the universe: matter, radiation, and dark energy. Matter is anything whose energy density scales with the inverse cube of the scale factor, i.e., ρ ∝ a−3, while radiation is anything which scales as the inverse fourth power of the scale factor (ρ ∝ a−4). This can be understood intuitively: for an ordinary particle in a cube-shaped box, doubling the length of an edge of the box decreases the density (and hence energy density) by a factor of eight (2³). For radiation, the decrease in energy density is greater, because an increase in spatial distance also causes a redshift.

The final component is dark energy: anything that behaves as an intrinsic property of space, having a constant energy density regardless of the dimensions of the volume under consideration (ρ ∝ a0). Thus, unlike ordinary matter, it is not diluted by the expansion of space.
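
The three scaling rules can be summarized by a single formula, ρ(a) = ρ0 a^−3(1+w), with w = 0 for matter, w = 1/3 for radiation, and w = −1 for dark energy. A minimal sketch:

```python
def density(rho0: float, a: float, w: float) -> float:
    """Energy density at scale factor a for a fluid with constant w:
    rho(a) = rho0 * a**(-3 * (1 + w))."""
    return rho0 * a ** (-3 * (1 + w))

# Doubling the scale factor (the edge of the box in the example above):
for name, w in [("matter", 0.0), ("radiation", 1 / 3), ("dark energy", -1.0)]:
    print(f"{name:11s}: density falls by factor {1 / density(1.0, 2.0, w):g}")
# matter -> 8 (volume), radiation -> 16 (volume plus redshift), dark energy -> 1
```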

Evidence of existence

The evidence for dark energy is indirect but comes from three independent sources:

  • Distance measurements and their relation to redshift, which suggest the universe has expanded more in the latter half of its life.
  • The theoretical need for a type of additional energy that is not matter or dark matter to form the observationally flat universe (absence of any detectable global curvature).
  • Measures of large-scale wave patterns of mass density in the universe.

Supernovae

[Figure: A Type Ia supernova (bright spot on the bottom-left) near a galaxy.]

In 1998, the High-Z Supernova Search Team published observations of Type Ia ("one-A") supernovae. In 1999, the Supernova Cosmology Project followed by suggesting that the expansion of the universe is accelerating. The 2011 Nobel Prize in Physics was awarded to Saul Perlmutter, Brian P. Schmidt, and Adam G. Riess for their leadership in the discovery.

Since then, these observations have been corroborated by several independent sources. Measurements of the cosmic microwave background, gravitational lensing, and the large-scale structure of the cosmos, as well as improved measurements of supernovae, have been consistent with the Lambda-CDM model. Some people argue that the only indications for the existence of dark energy are observations of distance measurements and their associated redshifts. Cosmic microwave background anisotropies and baryon acoustic oscillations serve only to demonstrate that distances to a given redshift are larger than would be expected from a "dusty" Friedmann–Lemaître universe and the local measured Hubble constant.

Supernovae are useful for cosmology because they are excellent standard candles across cosmological distances. They allow researchers to measure the expansion history of the universe by looking at the relationship between the distance to an object and its redshift, which gives how fast it is receding from us. The relationship is roughly linear, according to Hubble's law. It is relatively easy to measure redshift, but finding the distance to an object is more difficult. Usually, astronomers use standard candles: objects for which the intrinsic brightness, or absolute magnitude, is known. This allows the object's distance to be measured from its actual observed brightness, or apparent magnitude. Type Ia supernovae are the best-known standard candles across cosmological distances because of their extreme and consistent luminosity.
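
A worked example of the standard-candle logic: the distance modulus relates apparent and absolute magnitude via m − M = 5 log10(d/10 pc), so the distance follows directly from the two magnitudes. A short sketch, assuming an illustrative peak absolute magnitude of about −19.3 for Type Ia supernovae:

```python
def luminosity_distance_pc(m: float, M: float) -> float:
    """Distance in parsecs from the distance modulus m - M = 5 log10(d / 10 pc)."""
    return 10 ** ((m - M + 5) / 5)

# Illustrative peak absolute magnitude for a Type Ia supernova
M_SN_IA = -19.3

for m_obs in (15.0, 20.0, 24.0):
    d = luminosity_distance_pc(m_obs, M_SN_IA)
    print(f"apparent mag {m_obs:4.1f} -> distance ~ {d / 1e6:8.1f} Mpc")
```

For an apparent magnitude of 20 this gives a distance of roughly 700 Mpc, deep into the range where cosmic acceleration leaves a measurable imprint.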

Recent observations of supernovae are consistent with a universe made up 71.3% of dark energy and 27.4% of a combination of dark matter and baryonic matter.

Cosmic microwave background

[Figure: Estimated division of total energy in the universe into matter, dark matter and dark energy based on five years of WMAP data.]

The existence of dark energy, in whatever form, is needed to reconcile the measured geometry of space with the total amount of matter in the universe. Measurements of cosmic microwave background (CMB) anisotropies indicate that the universe is close to flat. For the shape of the universe to be flat, the mass–energy density of the universe must be equal to the critical density. The total amount of matter in the universe (including baryons and dark matter), as measured from the CMB spectrum, accounts for only about 30% of the critical density. This implies the existence of an additional form of energy to account for the remaining 70%. The Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft seven-year analysis estimated a universe made up of 72.8% dark energy, 22.7% dark matter, and 4.5% ordinary matter. Work done in 2013 based on the Planck spacecraft observations of the CMB gave a more accurate estimate of 68.3% dark energy, 26.8% dark matter, and 4.9% ordinary matter.

Large-scale structure

The theory of large-scale structure, which governs the formation of structures in the universe (stars, quasars, galaxies and galaxy groups and clusters), also suggests that the density of matter in the universe is only 30% of the critical density.

A 2011 survey, the WiggleZ galaxy survey of more than 200,000 galaxies, provided further evidence towards the existence of dark energy, although the exact physics behind it remains unknown. The WiggleZ survey from the Australian Astronomical Observatory scanned the galaxies to determine their redshift. Then, by exploiting the fact that baryon acoustic oscillations have left regularly spaced voids roughly 150 Mpc in diameter, surrounded by galaxies, the voids could be used as standard rulers to estimate distances to galaxies as far as 2,000 Mpc (redshift 0.6), allowing an accurate estimate of the speeds of galaxies from their redshifts and distances. The data confirmed cosmic acceleration up to half of the age of the universe (7 billion years) and constrained its inhomogeneity to 1 part in 10. This provides a confirmation of cosmic acceleration independent of supernovae.

Late-time integrated Sachs–Wolfe effect

Accelerated cosmic expansion causes gravitational potential wells and hills to flatten as photons pass through them, producing cold spots and hot spots on the CMB aligned with vast supervoids and superclusters. This so-called late-time Integrated Sachs–Wolfe effect (ISW) is a direct signal of dark energy in a flat universe. It was reported at high significance in 2008 by Ho et al. and Giannantonio et al.

Observational Hubble constant data

A new approach to test evidence of dark energy through observational Hubble constant data (OHD) has gained significant attention in recent years.

The Hubble parameter, H(z), is measured as a function of cosmological redshift. OHD directly tracks the expansion history of the universe by taking passively evolving early-type galaxies as "cosmic chronometers"; in this way, the approach provides standard clocks in the universe. The core of the idea is the measurement of the differential age evolution of these cosmic chronometers as a function of redshift, which yields a direct estimate of the Hubble parameter, H(z) = −1/(1 + z) · dz/dt.

The reliance on a differential quantity, Δz/Δt, is appealing because it can minimize many common issues and systematic effects. Analyses of supernovae and baryon acoustic oscillations (BAO) are based on integrals of the Hubble parameter, whereas Δz/Δt measures it directly. For these reasons, this method has been widely used to examine the accelerated cosmic expansion and study properties of dark energy.
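
The chronometer relation H(z) = −1/(1+z) · dz/dt can be sketched numerically; the measurement values below are hypothetical, chosen only to illustrate the unit conversions:

```python
GYR_IN_S = 3.156e16     # seconds per gigayear
KM_PER_MPC = 3.0857e19  # kilometres per megaparsec

def hubble_from_chronometers(z: float, dz: float, dt_gyr: float) -> float:
    """H(z) in km/s/Mpc from a differential age measurement: two galaxy
    samples separated by dz in redshift and dt_gyr in age. The universe
    ages as z decreases, so dz/dt is negative and H comes out positive."""
    dz_dt = dz / (dt_gyr * GYR_IN_S)   # s^-1
    h_si = -dz_dt / (1 + z)            # s^-1
    return h_si * KM_PER_MPC           # km/s/Mpc

# Hypothetical measurement near z = 0.4: dz = -0.1 over 1.1 Gyr of aging.
print(f"H(z=0.4) ~ {hubble_from_chronometers(0.4, -0.1, 1.1):.0f} km/s/Mpc")
```

With these illustrative inputs the estimate lands in the 60–70 km/s/Mpc range expected at moderate redshift.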

Direct observation

An attempt to directly observe dark energy in a laboratory failed to detect a new force. Recently, it has been speculated that the currently unexplained excess observed in the XENON1T detector in Italy may have been caused by a chameleon model of dark energy.

Theories of dark energy

Dark energy's status as a hypothetical force with unknown properties makes it a very active target of research. The problem is attacked from a great variety of angles, such as modifying the prevailing theory of gravity (general relativity), attempting to pin down the properties of dark energy, and finding alternative ways to explain the observational data.

[Figure: The equation of state of dark energy versus redshift for four common models. A: CPL model; B: Jassal model; C: Barboza & Alcaniz model; D: Wetterich model.]

Cosmological constant

[Figure: Estimated distribution of matter and energy in the universe.]

The simplest explanation for dark energy is that it is an intrinsic, fundamental energy of space. This is the cosmological constant, usually represented by the Greek letter Λ (Lambda, hence Lambda-CDM model). Since energy and mass are related according to the equation E = mc2 , Einstein's theory of general relativity predicts that this energy will have a gravitational effect. It is sometimes called a vacuum energy because it is the energy density of empty space – the vacuum.

A major outstanding problem is that quantum field theories predict, from the energy of the quantum vacuum, a huge cosmological constant, about 120 orders of magnitude too large. This would need to be almost, but not exactly, cancelled by an equally large term of the opposite sign.

Some supersymmetric theories require a cosmological constant that is exactly zero. Also, it is unknown if there is a metastable vacuum state in string theory with a positive cosmological constant, and it has been conjectured by Ulf Danielsson et al. that no such state exists. This conjecture would not rule out other models of dark energy, such as quintessence, that could be compatible with string theory.

Quintessence

In quintessence models of dark energy, the observed acceleration of the scale factor is caused by the potential energy of a dynamical field, referred to as the quintessence field. Quintessence differs from the cosmological constant in that it can vary in space and time. In order for it not to clump and form structure like matter, the field must be very light so that it has a large Compton wavelength.

No evidence of quintessence is yet available, but it has not been ruled out either. It generally predicts a slightly slower acceleration of the expansion of the universe than the cosmological constant. Some scientists think that the best evidence for quintessence would come from violations of Einstein's equivalence principle and variation of the fundamental constants in space or time. Scalar fields are predicted by the Standard Model of particle physics and string theory, but an analogous problem to the cosmological constant problem (or the problem of constructing models of cosmological inflation) occurs: renormalization theory predicts that scalar fields should acquire large masses.

The coincidence problem asks why the acceleration of the Universe began when it did. If acceleration began earlier in the universe, structures such as galaxies would never have had time to form, and life, at least as we know it, would never have had a chance to exist. Proponents of the anthropic principle view this as support for their arguments. However, many models of quintessence have a so-called "tracker" behavior, which solves this problem. In these models, the quintessence field has a density which closely tracks (but is less than) the radiation density until matter–radiation equality, which triggers quintessence to start behaving as dark energy, eventually dominating the universe. This naturally sets the low energy scale of the dark energy.

In 2004, when scientists fit the evolution of dark energy with the cosmological data, they found that the equation of state had possibly crossed the cosmological constant boundary (w = −1) from above to below. A no-go theorem has been proven showing that this scenario requires models with at least two types of quintessence. This scenario is the so-called Quintom scenario.

Some special cases of quintessence are phantom energy, in which the energy density of quintessence actually increases with time, and k-essence (short for kinetic quintessence) which has a non-standard form of kinetic energy such as a negative kinetic energy. They can have unusual properties: phantom energy, for example, can cause a Big Rip.

Interacting dark energy

This class of theories attempts to come up with an all-encompassing theory of both dark matter and dark energy as a single phenomenon that modifies the laws of gravity at various scales. This could, for example, treat dark energy and dark matter as different facets of the same unknown substance, or postulate that cold dark matter decays into dark energy. Another class of theories unifying dark matter and dark energy comprises covariant theories of modified gravity. These theories alter the dynamics of spacetime such that the modified dynamics produce the effects that have been attributed to the presence of dark energy and dark matter. Dark energy could in principle interact not only with the rest of the dark sector, but also with ordinary matter. However, cosmology alone is not sufficient to effectively constrain the strength of the coupling between dark energy and baryons, so other indirect techniques or laboratory searches have to be adopted. A recent proposal speculates that the currently unexplained excess observed in the XENON1T detector in Italy may have been caused by a chameleon model of dark energy.

Variable dark energy models

The density of the dark energy might have varied in time during the history of the universe. Modern observational data allow us to estimate the present density of the dark energy. Using baryon acoustic oscillations, it is possible to investigate the effect of dark energy in the history of the Universe, and constrain parameters of the equation of state of dark energy. To that end, several models have been proposed. One of the most popular is the Chevallier–Polarski–Linder (CPL) model. Other common models include those of Barboza & Alcaniz (2008), Jassal et al. (2005), Wetterich (2004), and Oztas et al. (2018).
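
For illustration, the CPL model parametrizes the equation of state as w(a) = w0 + wa(1 − a), with scale factor a = 1/(1 + z); the cosmological constant is recovered for w0 = −1, wa = 0. A minimal sketch:

```python
def w_cpl(z: float, w0: float = -1.0, wa: float = 0.0) -> float:
    """Chevallier-Polarski-Linder equation of state:
    w(a) = w0 + wa * (1 - a), with a = 1 / (1 + z)."""
    a = 1 / (1 + z)
    return w0 + wa * (1 - a)

# A cosmological constant is the special case w0 = -1, wa = 0:
print(w_cpl(0.5))                    # -1.0 at every redshift
# An illustrative evolving model (hypothetical parameters):
print(w_cpl(0.5, w0=-0.9, wa=0.3))   # ~ -0.8
```

Fitting w0 and wa to supernova, BAO, and CMB data is how such models are constrained in practice; w0 = −1, wa = 0 remains consistent with current measurements.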

Observational skepticism

Some alternatives to dark energy, such as inhomogeneous cosmology, aim to explain the observational data by a more refined use of established theories. In this scenario, dark energy does not actually exist, and is merely a measurement artifact. For example, if we are located in an emptier-than-average region of space, the observed cosmic expansion rate could be mistaken for a variation in time, or acceleration. A different approach uses a cosmological extension of the equivalence principle to show how space might appear to be expanding more rapidly in the voids surrounding our local cluster. While weak, such effects considered cumulatively over billions of years could become significant, creating the illusion of cosmic acceleration, and making it appear as if we live in a Hubble bubble. Yet other possibilities are that the accelerated expansion of the universe is an illusion caused by our motion relative to the rest of the universe, or that the statistical methods employed were flawed. It has also been suggested that the anisotropy of the local Universe has been misrepresented as dark energy. This claim was quickly countered by others, including a paper by physicists D. Rubin and J. Heitlauf. A laboratory direct-detection attempt failed to detect any force associated with dark energy.

A study published in 2020 questioned the validity of the essential assumption that the luminosity of Type Ia supernovae does not vary with stellar population age, and suggests that dark energy may not actually exist. Lead researcher of the new study, Young-Wook Lee of Yonsei University, said "Our result illustrates that dark energy from SN cosmology, which led to the 2011 Nobel Prize in Physics, might be an artifact of a fragile and false assumption." Multiple issues with this paper were raised by other cosmologists, including Adam Riess, who won the 2011 Nobel Prize for the discovery of dark energy.

Other mechanisms driving acceleration

Modified gravity

The evidence for dark energy is heavily dependent on the theory of general relativity. Therefore, it is conceivable that a modification to general relativity also eliminates the need for dark energy. There are very many such theories, and research is ongoing. The measurement of the speed of gravity in the first gravitational wave measured by non-gravitational means (GW170817) ruled out many modified gravity theories as explanations to dark energy.

Astrophysicist Ethan Siegel states that, while such alternatives gain a lot of mainstream press coverage, almost all professional astrophysicists are confident that dark energy exists, and that none of the competing theories successfully explain observations to the same level of precision as standard dark energy.

Implications for the fate of the universe

Cosmologists estimate that the acceleration began roughly 5 billion years ago. Before that, it is thought that the expansion was decelerating, due to the attractive influence of matter. The density of dark matter in an expanding universe decreases more quickly than that of dark energy, and eventually the dark energy dominates. Specifically, when the volume of the universe doubles, the density of dark matter is halved, but the density of dark energy is nearly unchanged (it is exactly constant in the case of a cosmological constant).

Projections into the future can differ radically for different models of dark energy. For a cosmological constant, or any other model that predicts that the acceleration will continue indefinitely, the ultimate result will be that galaxies outside the Local Group will have a line-of-sight velocity that continually increases with time, eventually far exceeding the speed of light. This is not a violation of special relativity because the notion of "velocity" used here is different from that of velocity in a local inertial frame of reference, which is still constrained to be less than the speed of light for any massive object (see Uses of the proper distance for a discussion of the subtleties of defining any notion of relative velocity in cosmology). Because the Hubble parameter is decreasing with time, there can actually be cases where a galaxy that is receding from us faster than light does manage to emit a signal which reaches us eventually.

However, because of the accelerating expansion, it is projected that most galaxies will eventually cross a type of cosmological event horizon where any light they emit past that point will never be able to reach us at any time in the infinite future because the light never reaches a point where its "peculiar velocity" toward us exceeds the expansion velocity away from us (these two notions of velocity are also discussed in Uses of the proper distance). Assuming the dark energy is constant (a cosmological constant), the current distance to this cosmological event horizon is about 16 billion light years, meaning that a signal from an event happening at present would eventually be able to reach us in the future if the event were less than 16 billion light years away, but the signal would never reach us if the event were more than 16 billion light years away.
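
The quoted horizon distance can be reproduced approximately by integrating the expansion history; a sketch assuming a flat Lambda-CDM universe with illustrative parameters (H0 = 68 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7):

```python
import math

C_KM_S = 299792.458   # speed of light, km/s
H0 = 68.0             # Hubble constant, km/s/Mpc (illustrative)
OM, OL = 0.3, 0.7     # matter and dark-energy density fractions

def event_horizon_gly(n: int = 50_000) -> float:
    """Present-day distance to the cosmological event horizon, in billions
    of light years, via d = (c/H0) * integral_0^1 dx / sqrt(OM*x^3 + OL),
    where x = 1/a (trapezoidal rule with n intervals)."""
    total = 0.0
    for i in range(n + 1):
        x = i / n
        f = 1.0 / math.sqrt(OM * x ** 3 + OL)
        total += f / 2 if i in (0, n) else f
    integral = total / n
    d_mpc = (C_KM_S / H0) * integral        # megaparsecs
    return d_mpc * 3.262e6 / 1e9            # Mpc -> billions of light years

print(f"event horizon ~ {event_horizon_gly():.1f} billion light years")
```

With these inputs the integral gives roughly 16 billion light years, matching the figure above; the exact value depends on the assumed cosmological parameters.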

As galaxies approach the point of crossing this cosmological event horizon, the light from them will become more and more redshifted, to the point where the wavelength becomes too large to detect in practice and the galaxies appear to vanish completely. Planet Earth, the Milky Way, and the Local Group of which the Milky Way is a part, would all remain virtually undisturbed as the rest of the universe recedes and disappears from view. In this scenario, the Local Group would ultimately suffer heat death, just as was hypothesized for the flat, matter-dominated universe before measurements of cosmic acceleration.

There are other, more speculative ideas about the future of the universe. The phantom energy model of dark energy results in divergent expansion, which would imply that the effective force of dark energy continues growing until it dominates all other forces in the universe. Under this scenario, dark energy would ultimately tear apart all gravitationally bound structures, including galaxies and solar systems, and eventually overcome the electrical and nuclear forces to tear apart atoms themselves, ending the universe in a "Big Rip". On the other hand, dark energy might dissipate with time or even become attractive. Such uncertainties leave open the possibility that gravity may eventually prevail, leading to a universe that contracts in on itself in a "Big Crunch", or that there may even be a dark energy cycle, which implies a cyclic model of the universe in which every iteration (Big Bang then eventually a Big Crunch) takes about a trillion (10¹²) years. While none of these are supported by observations, they are not ruled out.

In philosophy of science

In philosophy of science, dark energy is an example of an "auxiliary hypothesis", an ad hoc postulate that is added to a theory in response to observations that falsify it. It has been argued that the dark energy hypothesis is a conventionalist hypothesis, that is, a hypothesis that adds no empirical content and hence is unfalsifiable in the sense defined by Karl Popper.

Eternal inflation


Eternal inflation is a hypothetical inflationary universe model, which is itself an outgrowth or extension of the Big Bang theory.

According to eternal inflation, the inflationary phase of the universe's expansion lasts forever throughout most of the universe. Because the regions expand exponentially rapidly, most of the volume of the universe at any given time is inflating. Eternal inflation, therefore, produces a hypothetically infinite multiverse, in which only an insignificant fractal volume ends inflation.

Paul Steinhardt, one of the original researchers of the inflationary model, introduced the first example of eternal inflation in 1983, and Alexander Vilenkin showed that it is generic.

Alan Guth's 2007 paper, "Eternal inflation and its implications", states that under reasonable assumptions "Although inflation is generically eternal into the future, it is not eternal into the past." Guth detailed what was known about the subject at the time, and demonstrated that eternal inflation was still considered the likely outcome of inflation, more than 20 years after eternal inflation was first introduced by Steinhardt.

Overview

Development of the theory

Inflation, or the inflationary universe theory, was originally developed as a way to overcome the few remaining problems with what was otherwise considered a successful theory of cosmology, the Big Bang model.

In 1979, Alan Guth introduced the inflationary model of the universe to explain why the universe is flat and homogeneous (which refers to the smooth distribution of matter and radiation on a large scale). The basic idea was that the universe underwent a period of rapidly accelerating expansion a few instants after the Big Bang. He offered a mechanism for causing the inflation to begin: false vacuum energy. Guth coined the term "inflation," and was the first to discuss the theory with other scientists worldwide.

Guth's original formulation was problematic, as there was no consistent way to bring an end to the inflationary epoch and end up with the hot, isotropic, homogeneous universe observed today. Although the false vacuum could decay into empty "bubbles" of "true vacuum" that expanded at the speed of light, the empty bubbles could not coalesce to reheat the universe, because they could not keep up with the remaining inflating universe.

In 1982, this "graceful exit problem" was solved independently by Andrei Linde and by Andreas Albrecht and Paul J. Steinhardt who showed how to end inflation without making empty bubbles and, instead, end up with a hot expanding universe. The basic idea was to have a continuous "slow-roll" or slow evolution from false vacuum to true without making any bubbles. The improved model was called "new inflation."

In 1983, Paul Steinhardt was the first to show that this "new inflation" does not have to end everywhere. Instead, it might only end in a finite patch or a hot bubble full of matter and radiation, and that inflation continues in most of the universe while producing hot bubble after hot bubble along the way. Alexander Vilenkin showed that when quantum effects are properly included, this is actually generic to all new inflation models.

Building on ideas introduced by Steinhardt and Vilenkin, Andrei Linde published an alternative model of inflation in 1986 that provided a detailed description of what has become known as chaotic inflation or eternal inflation.

Quantum fluctuations

New inflation does not produce a perfectly symmetric universe, due to quantum fluctuations during inflation. The fluctuations cause the energy and matter density to differ at different points in space.

Quantum fluctuations in the hypothetical inflaton field produce changes in the rate of expansion that are responsible for eternal inflation. Those regions with a higher rate of inflation expand faster and dominate the universe, despite the natural tendency of inflation to end in other regions. This allows inflation to continue forever, producing future-eternal inflation.

As a simplified example, suppose that during inflation the natural decay rate of the inflaton field is slow compared to the effect of quantum fluctuation. When a mini-universe inflates and "self-reproduces" into, say, twenty causally-disconnected mini-universes of equal size to the original, perhaps nine of the new mini-universes will have a larger, rather than smaller, average inflaton field value than the original, because they inflated from regions where quantum fluctuation pushed the inflaton value up by more than the slow decay brought it down. Originally there was one mini-universe with a given inflaton value; now there are nine mini-universes with a slightly larger value. (Of course, there are also eleven mini-universes where the inflaton value is slightly lower than it originally was.) Each mini-universe with the larger inflaton field value restarts a similar round of approximate self-reproduction within itself. (The mini-universes with lower inflaton values may also reproduce, unless their inflaton values are small enough that those regions drop out of inflation and cease self-reproducing.) This process continues indefinitely: nine high-inflaton mini-universes might become 81, then 729, and so on. Thus, there is eternal inflation.
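The bookkeeping in this example can be sketched as a toy calculation; the counts (twenty offspring, nine with a higher inflaton value) are the illustrative numbers from the text, not physical predictions:

```python
# Toy model of inflationary self-reproduction: each generation, every
# high-inflaton mini-universe spawns 20 causally-disconnected regions,
# of which (in this illustration) 9 fluctuate to a higher inflaton value.
OFFSPRING = 20
HIGHER = 9

def high_inflaton_count(generations: int) -> int:
    """Mini-universes still driving fast inflation after n generations."""
    count = 1
    for _ in range(generations):
        count *= HIGHER
    return count

print([high_inflaton_count(g) for g in range(1, 4)])  # [9, 81, 729]
```

Because the high-inflaton population multiplies every generation, the total inflating volume grows without bound even though any individual region eventually exits inflation.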

In 1980, Viatcheslav Mukhanov and Gennady Chibisov in the Soviet Union suggested, in the context of Alexei Starobinsky's model of modified gravity, that quantum fluctuations could be possible seeds for forming galaxies.

In the context of inflation, quantum fluctuations were first analyzed at the three-week 1982 Nuffield Workshop on the Very Early Universe at Cambridge University. The average strength of the fluctuations was first calculated by four groups working separately over the course of the workshop: Stephen Hawking; Starobinsky; Guth and So-Young Pi; and James M. Bardeen, Paul Steinhardt and Michael Turner.

The early calculations derived at the Nuffield Workshop only focused on the average fluctuations, whose magnitude is too small to affect inflation. However, beginning with the examples presented by Steinhardt and Vilenkin, the same quantum physics was later shown to produce occasional large fluctuations that increase the rate of inflation and keep inflation going eternally.

Further developments

In analyzing the Planck Satellite data from 2013, Anna Ijjas and Paul Steinhardt showed that the simplest textbook inflationary models were eliminated and that the remaining models require exponentially more tuned starting conditions, more parameters to be adjusted, and less inflation. Later Planck observations reported in 2015 confirmed these conclusions.

A 2014 paper by Kohli and Haslam called into question the viability of the eternal inflation theory, by analyzing Linde's chaotic inflation theory in which the quantum fluctuations are modeled as Gaussian white noise. They showed that in this popular scenario, eternal inflation in fact cannot be eternal, and the random noise leads to spacetime being filled with singularities. This was demonstrated by showing that solutions to the Einstein field equations diverge in a finite time. Their paper therefore concluded that the theory of eternal inflation based on random quantum fluctuations would not be a viable theory, and the resulting existence of a multiverse is "still very much an open question that will require much deeper investigation".

Inflation, eternal inflation, and the multiverse

In 1983, it was shown that inflation could be eternal, leading to a multiverse in which space is broken up into bubbles or patches whose properties differ from patch to patch spanning all physical possibilities.

Paul Steinhardt, who produced the first example of eternal inflation, eventually became a strong and vocal opponent of the theory. He argued that the multiverse represented a breakdown of the inflationary theory, because, in a multiverse, any outcome is equally possible, so inflation makes no predictions and, hence, is untestable. Consequently, he argued, inflation fails a key condition for a scientific theory.

Both Linde and Guth, however, continued to support the inflationary theory and the multiverse. Guth declared:

It's hard to build models of inflation that don't lead to a multiverse. It's not impossible, so I think there's still certainly research that needs to be done. But most models of inflation do lead to a multiverse, and evidence for inflation will be pushing us in the direction of taking the idea of a multiverse seriously.

According to Linde, "It's possible to invent models of inflation that do not allow a multiverse, but it's difficult. Every experiment that brings better credence to inflationary theory brings us much closer to hints that the multiverse is real."

In 2018, the late Stephen Hawking and Thomas Hertog published a paper in which the need for an infinite multiverse vanishes: Hawking described their theory as giving universes that are "reasonably smooth and globally finite". The theory uses the holographic principle to define an 'exit plane' from the timeless state of eternal inflation; the universes generated on this plane are described using a redefinition of the no-boundary wavefunction, and the theory in fact requires a boundary at the beginning of time. Stated simply, Hawking said that their findings "imply a significant reduction of the multiverse", which, as the University of Cambridge points out, makes the theory "predictive and testable" using gravitational wave astronomy.

 

False vacuum decay

A scalar field φ (which represents physical position) in a false vacuum. Note that the energy E is higher in the false vacuum than that in the true vacuum or ground state, but there is a barrier preventing the field from classically rolling down to the true vacuum. Therefore, the transition to the true vacuum must be stimulated by the creation of high-energy particles or through quantum-mechanical tunneling.

In quantum field theory, a false vacuum is a hypothetical vacuum that is stable, but not in the most stable state possible (it is metastable). It may last for a very long time in that state, but could eventually decay to the more stable state, an event known as false vacuum decay. The most common suggestion of how such a decay might happen in our universe is called bubble nucleation – if a small region of the universe by chance reached a more stable vacuum, this "bubble" (also called "bounce") would spread.

A false vacuum exists at a local minimum of energy and is therefore not fully stable, in contrast to a true vacuum, which exists at a global minimum and is stable.

Definition of true vs false vacuum

A vacuum is defined as a space with as little energy in it as possible. Despite the name, the vacuum still contains quantum fields. A true vacuum is stable because it sits at a global minimum of energy, and it is commonly assumed to coincide with the physical vacuum state we live in. However, it is possible that the physical vacuum state is instead a configuration of quantum fields at a local, but not the global, minimum of energy. This type of vacuum state is called a "false vacuum".
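The distinction can be pictured with a one-dimensional toy potential (a hypothetical tilted quartic, not the potential of any real field): the false vacuum is the shallower of two minima, separated from the deeper true vacuum by a barrier.

```python
# Hypothetical tilted double-well potential with two minima of unequal depth.
def V(phi: float) -> float:
    return (phi ** 2 - 1.0) ** 2 + 0.3 * phi  # the tilt breaks the symmetry

# Grid search for local minima: interior points lower than both neighbours.
xs = [i / 1000.0 for i in range(-2000, 2001)]
minima = [x for a, x, b in zip(xs, xs[1:], xs[2:]) if V(x) < V(a) and V(x) < V(b)]

true_vac = min(minima, key=V)   # global minimum of energy: the true vacuum
false_vac = max(minima, key=V)  # higher local minimum: the false vacuum
print(f"true vacuum near phi = {true_vac:+.2f}, false vacuum near phi = {false_vac:+.2f}")
```

A field sitting in the shallower minimum is classically stuck there; only a sufficiently energetic kick or quantum tunneling through the barrier, as described below, moves it to the deeper minimum.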

Implications

Existential threat

If our universe is in a false vacuum state rather than a true vacuum state, then the decay from the less stable false vacuum to the more stable true vacuum (called false vacuum decay) could have dramatic consequences. The effects could range from the complete cessation of existing fundamental forces, elementary particles, and the structures comprising them, to a subtle change in some cosmological parameters, depending mostly on the potential difference between the true and false vacuum. Some false vacuum decay scenarios are compatible with the survival of structures like galaxies and stars, or even biological life, while others involve the full destruction of baryonic matter or even immediate gravitational collapse of the universe, although in this more extreme case the likelihood of a "bubble" forming may be very low (i.e. false vacuum decay may be impossible).

A paper by Coleman and de Luccia, which attempted to include simple gravitational assumptions in these theories, noted that if this were an accurate representation of nature, then the resulting universe "inside the bubble" would appear to be extremely unstable and would almost immediately collapse:

In general, gravitation makes the probability of vacuum decay smaller; in the extreme case of very small energy-density difference, it can even stabilize the false vacuum, preventing vacuum decay altogether. We believe we understand this. For the vacuum to decay, it must be possible to build a bubble of total energy zero. In the absence of gravitation, this is no problem, no matter how small the energy-density difference; all one has to do is make the bubble big enough, and the volume/surface ratio will do the job. In the presence of gravitation, though, the negative energy density of the true vacuum distorts geometry within the bubble with the result that, for a small enough energy density, there is no bubble with a big enough volume/surface ratio. Within the bubble, the effects of gravitation are more dramatic. The geometry of space-time within the bubble is that of anti-de Sitter space, a space much like conventional de Sitter space except that its group of symmetries is O(3, 2) rather than O(4, 1). Although this space-time is free of singularities, it is unstable under small perturbations, and inevitably suffers gravitational collapse of the same sort as the end state of a contracting Friedmann universe. The time required for the collapse of the interior universe is on the order of ... microseconds or less.

The possibility that we are living in a false vacuum has never been a cheering one to contemplate. Vacuum decay is the ultimate ecological catastrophe; in the new vacuum there are new constants of nature; after vacuum decay, not only is life as we know it impossible, so is chemistry as we know it. However, one could always draw stoic comfort from the possibility that perhaps in the course of time the new vacuum would sustain, if not life as we know it, at least some structures capable of knowing joy. This possibility has now been eliminated.

The second special case is decay into a space of vanishing cosmological constant, the case that applies if we are now living in the debris of a false vacuum which decayed at some early cosmic epoch. This case presents us with less interesting physics and with fewer occasions for rhetorical excess than the preceding one. It is now the interior of the bubble that is ordinary Minkowski space ...

— Sidney Coleman and Frank De Luccia

In a 2005 paper published in Nature, as part of their investigation into global catastrophic risks, MIT physicist Max Tegmark and Oxford philosopher Nick Bostrom calculated the risk of the destruction of the Earth at less than 1 in 10⁹ per year from all natural (i.e. non-anthropogenic) events, including a transition to a lower vacuum state. They argue that, due to observer selection effects, we might underestimate the chances of being destroyed by vacuum decay because any information about this event would reach us only at the instant when we too were destroyed. This is in contrast to events like impacts, gamma-ray bursts, supernovae and hypernovae, whose frequencies we can measure directly.

Inflation

A number of theories suggest that cosmic inflation may be an effect of a false vacuum decaying into the true vacuum. The inflation itself may be the consequence of the Higgs field being trapped in a false vacuum state, with the Higgs self-coupling λ and its β_λ function very close to zero at the Planck scale. A future electron-positron collider would be able to provide the precise measurements of the top quark needed for such calculations.

Chaotic inflation theory suggests that the universe may be in either a false vacuum or a true vacuum state. Alan Guth, in his original proposal for cosmic inflation, proposed that inflation could end through quantum-mechanical bubble nucleation of the sort described above. It was soon understood that a homogeneous and isotropic universe could not be preserved through this violent tunneling process. This led Andrei Linde and, independently, Andreas Albrecht and Paul Steinhardt to propose "new inflation" or "slow-roll inflation", in which no tunnelling occurs and the inflationary scalar field instead rolls slowly down a gentle potential slope.

In 2014, researchers at the Chinese Academy of Sciences' Wuhan Institute of Physics and Mathematics suggested that the universe could have been spontaneously created from nothing (no space, time, nor matter) by quantum fluctuations of metastable false vacuum causing an expanding bubble of true vacuum.

Vacuum decay varieties

Electroweak vacuum decay

Electroweak vacuum stability landscape as estimated in 2012.

Electroweak vacuum stability landscape as estimated in 2018. T_RH is the grand unification energy. ξ is the degree of non-minimal coupling between fundamental forces.

The stability criterion for the electroweak interaction was first formulated in 1979 as a function of the masses of the theoretical Higgs boson and the heaviest fermion. The discovery of the top quark in 1995 and of the Higgs boson in 2012 allowed physicists to validate the criterion against experiment; since 2012, the electroweak interaction has therefore been considered the most promising candidate for a metastable fundamental force. The corresponding false vacuum hypothesis is called either 'electroweak vacuum instability' or 'Higgs vacuum instability'. The present false vacuum state is described as a de Sitter space, while the tentative true vacuum is an anti-de Sitter space.

The diagrams show the uncertainty ranges of the Higgs boson and top quark masses as oval-shaped lines. The underlying colors indicate whether the electroweak vacuum state is likely to be stable, merely long-lived, or completely unstable for a given combination of masses. The "electroweak vacuum decay" hypothesis has sometimes been misreported as the Higgs boson "ending" the universe. A Higgs boson mass of 125.18±0.16 GeV/c² is likely to be on the metastable side of the stable–metastable boundary (estimated in 2012 as 123.8–135.0 GeV). However, a definitive answer requires much more precise measurements of the top quark's pole mass; as of 2018, improved measurements of the Higgs boson and top quark masses have further reinforced the claim that the physical electroweak vacuum is in the metastable state. Nonetheless, new physics beyond the Standard Model could drastically change the stability landscape's division lines, rendering previous stability and metastability criteria incorrect.

If measurements of the Higgs boson and top quark suggest that our universe lies within a false vacuum of this kind, this would imply that, more likely than not within many billions of years, a bubble of true vacuum would nucleate, and its effects would propagate across the universe at nearly the speed of light from its origin in space-time.

Other decay modes

Bubble nucleation

When the false vacuum decays, the lower-energy true vacuum forms through a process known as bubble nucleation. In this process, instanton effects cause a bubble containing the true vacuum to appear. The walls of the bubble (or domain walls) have a positive surface tension, as energy is expended as the fields roll over the potential barrier to the true vacuum. The energy gained by converting volume to true vacuum varies as the cube of the bubble's radius, while the energy cost of the wall is proportional to the square of its radius, so there is a critical size at which the total energy of the bubble is zero; smaller bubbles tend to shrink, while larger bubbles tend to grow. To be able to nucleate, the bubble must overcome an energy barrier. The total energy of a bubble of radius r is

E(r) = 4πr²σ − (4/3)πr³ε

(Eq. 1)

where ε is the difference in energy between the true and false vacuums, σ is the unknown (possibly extremely large) surface tension of the domain wall, and r is the radius of the bubble. Maximizing Eq. 1 over r gives the critical radius

r_c = 2σ/ε

(Eq. 2)

at which the barrier height is E(r_c) = (16π/3)σ³/ε².
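The relations labeled Eq. 1 and Eq. 2 can be checked numerically. A minimal sketch with arbitrary illustrative values for ε and σ (both are unknown in nature):

```python
from math import pi

eps = 2.0    # illustrative energy-density difference, false minus true vacuum
sigma = 3.0  # illustrative surface tension of the domain wall

def E(r: float) -> float:
    """Total bubble energy: surface cost minus volume gain (Eq. 1)."""
    return 4 * pi * r ** 2 * sigma - (4.0 / 3.0) * pi * r ** 3 * eps

r_c = 2 * sigma / eps                        # critical radius (Eq. 2)
E_c = 16 * pi * sigma ** 3 / (3 * eps ** 2)  # barrier height E(r_c)

assert abs(E(r_c) - E_c) < 1e-9                    # barrier top sits at r_c
assert E(0.9 * r_c) < E_c and E(1.1 * r_c) < E_c   # r_c is a maximum of E
print(r_c, E_c)
```

The asserts confirm that the barrier height is the maximum of E(r) and that it occurs exactly at r_c = 2σ/ε: below r_c the wall cost dominates and the bubble shrinks, above it the volume gain wins and the bubble grows.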

The height of the potential barrier between true and false vacuum was estimated to be around 10²⁰ eV, based on theoretical calculations in 2020.

A bubble smaller than the critical size can overcome the potential barrier via quantum tunnelling of instantons to lower energy states. For a large potential barrier, the tunneling rate per unit volume of space is given in the semiclassical (thin-wall) approximation by

Γ/V ∼ exp(−27π²σ⁴ / (2ε³ℏ))

(Eq. 3)

where ℏ is the reduced Planck constant. As soon as a bubble of lower-energy vacuum grows beyond the critical radius defined by Eq. 2, the bubble's wall will begin to accelerate outward. Due to the typically large difference in energy between the false and true vacuums, the speed of the wall approaches the speed of light extremely quickly. The bubble does not produce any gravitational effects, because the negative energy density of the bubble interior is cancelled out by the positive kinetic energy of the wall.
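The exponential in Eq. 3 makes the decay rate extraordinarily sensitive to the wall tension and the vacuum energy gap. A sketch of just the suppression exponent, in units with ℏ = 1 (the values of σ and ε are arbitrary illustrations):

```python
from math import pi

HBAR = 1.0  # natural units

def decay_exponent(sigma: float, eps: float) -> float:
    """Thin-wall exponent S_E/hbar: rate per volume ~ exp(-S_E/hbar)."""
    return 27 * pi ** 2 * sigma ** 4 / (2 * eps ** 3 * HBAR)

# A 10% increase in surface tension raises the exponent by ~46% (1.1^4),
# suppressing the already-tiny rate by many more orders of magnitude.
for sigma in (1.0, 1.1):
    print(sigma, decay_exponent(sigma, 2.0))
```

This fourth-power dependence on σ (and inverse-cube dependence on ε) is why estimated false-vacuum lifetimes vary by hundreds of orders of magnitude as the underlying parameters shift within their uncertainties.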

Small bubbles of true vacuum can be inflated to the critical size by providing energy, although the required energy densities are several orders of magnitude larger than those attained in any natural or artificial process. It is also thought that certain environments can catalyze bubble formation by lowering the potential barrier.

Nucleation seeds

In a study in 2015, it was pointed out that the vacuum decay rate could be vastly increased in the vicinity of black holes, which would serve as a nucleation seed. According to this study, a potentially catastrophic vacuum decay could be triggered at any time by primordial black holes, should they exist. The authors note however that if primordial black holes cause a false vacuum collapse then it should have happened long before humans evolved on Earth. A subsequent study in 2017 indicated that the bubble would collapse into a primordial black hole rather than originate from it, either by ordinary collapse or by bending space in such a way that it breaks off into a new universe. In 2019, it was found that although small non-spinning black holes may increase true vacuum nucleation rate, rapidly spinning black holes will stabilize false vacuums to decay rates lower than expected for flat space-time. Proposed alternative nucleation seeds include cosmic strings and magnetic monopoles.

If particle collisions produce mini black holes, then energetic collisions such as those produced in the Large Hadron Collider (LHC) could trigger such a vacuum decay event, a scenario which has attracted the attention of the news media. It is likely to be unrealistic, because if such mini black holes could be created in collisions, they would also be created in the much more energetic collisions of cosmic radiation particles with planetary surfaces, or during the early life of the universe as tentative primordial black holes. Hut and Rees note that, because cosmic ray collisions have been observed at much higher energies than those produced in terrestrial particle accelerators, these experiments should not, at least for the foreseeable future, pose a threat to our current vacuum. Particle accelerators have reached energies of only approximately eight teraelectronvolts (8×10¹² eV). Cosmic ray collisions have been observed at and beyond energies of 5×10¹⁹ eV, six million times more powerful (the so-called Greisen–Zatsepin–Kuzmin limit), and cosmic rays in the vicinity of their origin may be more powerful yet. John Leslie has argued that if present trends continue, particle accelerators will exceed the energy given off in naturally occurring cosmic ray collisions by the year 2150. Fears of this kind were raised by critics of both the Relativistic Heavy Ion Collider and the Large Hadron Collider at the time of their respective proposals, and were determined to be unfounded by scientific inquiry.
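The "six million times" comparison is simple arithmetic; a quick check with the energies quoted in the text:

```python
# Back-of-envelope check of the cosmic-ray vs. accelerator comparison.
accelerator_ev = 8e12   # ~8 TeV, roughly the collision energies reached so far
cosmic_ray_ev = 5e19    # observed at and beyond the Greisen-Zatsepin-Kuzmin limit

ratio = cosmic_ray_ev / accelerator_ev
print(f"cosmic-ray collisions are ~{ratio:.1e} times more energetic")  # ~6.2e6
```

Since nature has run this far more energetic "experiment" countless times without triggering vacuum decay, collider energies many orders of magnitude lower are argued to be safe.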
