Stephen Hawking described it as the most spectacular failure of any physical theory in history. Can a new theory of information rescue cosmologists?
Original link: https://medium.com/the-physics-arxiv-blog/how-the-nature-of-information-could-resolve-one-of-the-great-paradoxes-of-cosmology-8c16fc714756
One of the biggest puzzles in science is the cosmological constant paradox. It arises when physicists attempt to calculate the energy density of empty space from first principles. Using quantum mechanics, the number they come up with is 10^94 g/cm^3.
And yet the observed energy density, calculated from the density of mass in the cosmos and the way the universe is expanding, is about 10^-27 g/cm^3. In other words, our best theory of the universe misses the mark by roughly 120 orders of magnitude.
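As a quick check on that headline figure, the two densities quoted above differ by about 121 orders of magnitude, which is conventionally rounded to roughly 120. A minimal sketch in Python:

```python
# Back-of-the-envelope check using the two densities quoted in the article (g/cm^3).
import math

rho_quantum = 1e94    # energy density predicted from quantum theory
rho_observed = 1e-27  # energy density inferred from observation

gap = math.log10(rho_quantum / rho_observed)
print(f"discrepancy: about {gap:.0f} orders of magnitude")  # ~121, usually quoted as ~120
```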
That’s left cosmologists somewhat red-faced. Indeed, Stephen Hawking has famously described this as the most spectacular failure of any physical theory in history. This huge discrepancy is all the more puzzling because quantum mechanics makes such accurate predictions in other circumstances. Just why it goes so badly wrong here is unknown.
Today, Chris Fields, an independent researcher formerly with New Mexico State University in Las Cruces, puts forward a simple explanation. His idea is that the discrepancy arises because large objects, such as planets and stars, behave classically rather than demonstrating quantum properties. And he’s provided some simple calculations to make his case.
One of the key properties of quantum objects is that they can exist in a superposition of states. When such an object is observed, or otherwise interacts with its environment, the superposition is destroyed and a single definite outcome emerges, a process known as quantum decoherence.
For example, a photon can be in a superposition of states that allows it to be in several places at the same time. However, as soon as the photon is observed, the superposition decoheres and the photon appears in one place.
This process of decoherence must apply to everything that has a specific position, says Fields, even to large objects such as stars, whose positions are known with respect to the cosmic microwave background, the echo of the big bang that fills the universe.
In fact, Fields argues that it is the interaction between the cosmic microwave background and all large objects in the universe that causes them to decohere, giving them the specific positions that astronomers observe.
But having a specific position carries an important consequence: there must be some information associated with this location in 3D space. If a location is unknown, then the amount of information must be small. But if it is known with precision, the information content is much higher.
And given that there are some 10^25 stars in the universe, that’s a lot of information. Fields calculates that encoding the location of each star to within 10 cubic kilometres requires some 10^93 bits.
That immediately leads to an entirely new way of determining the energy density of the cosmos. Back in 1961, the physicist Rolf Landauer argued that information is physical: erasing a single bit carries a minimum energy cost of kT ln 2, where k is Boltzmann’s constant and T is the temperature of the environment. The idea has gained considerable traction since then.
So Fields uses Landauer’s principle to calculate the energy associated with encoding the locations of all the stars in the universe. This turns out to be about 10^-30 g/cm^3, close to the observed density of the dark energy that is driving the universe’s accelerating expansion.
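The article doesn’t spell out Fields’ exact inputs, but the quoted figure is straightforward to reproduce approximately: charge each of the ~10^93 bits the Landauer minimum of kT ln 2 at the temperature of the cosmic microwave background, then spread the resulting energy over the observable universe. A rough sketch, assuming the standard CMB temperature and a comoving radius of about 4.4 x 10^26 m (not necessarily the exact values used in the paper):

```python
# Rough reproduction of the ~1e-30 g/cm^3 figure: the Landauer energy of
# ~1e93 bits at the CMB temperature, spread over the observable universe.
# The radius and temperature are standard values, not necessarily the
# exact inputs used in Fields' paper.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.998e8          # speed of light, m/s
T_cmb = 2.725        # CMB temperature, K
R = 4.4e26           # comoving radius of the observable universe, m (assumed)
n_bits = 1e93        # bit count quoted in the article

energy = n_bits * k_B * T_cmb * math.log(2)   # Landauer bound: kT ln 2 joules per bit
volume = (4.0 / 3.0) * math.pi * R**3         # volume of the observable universe, m^3
density_kg_m3 = energy / c**2 / volume        # equivalent mass density, kg/m^3
print(f"{density_kg_m3 * 1e-3:.1e} g/cm^3")   # ~8e-31, i.e. about 1e-30 g/cm^3
```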
But here’s the thing. That calculation requires the position of each star to be encoded only to within 10 km^3. Fields also asks how much information would be required to encode the positions of the stars at the much higher resolution associated with the Planck length. “Encoding 10^25 stellar positions at [the Planck length] would incur a free-energy cost ∼ 10^117 larger than that found here,” he says.
That difference is remarkably similar to the 120 orders of magnitude discrepancy between the observed energy density and the one calculated using quantum mechanics. Indeed, Fields suggests that the discrepancy arises because the standard quantum calculation implicitly assumes that positions are encoded all the way down to the Planck scale. “It seems reasonable to suggest that the discrepancy between these numbers may be due to the assumption that encoding classical information at [the Planck scale] can be considered physically meaningful.”
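Taking the article’s figures at face value, the two gaps line up to within a few orders of magnitude, which is the sense in which they are “remarkably similar”:

```python
# Comparing the two gaps quoted in the article, each expressed in orders of magnitude.
import math

qm_vs_observed = math.log10(1e94 / 1e-27)  # the "120 orders of magnitude" discrepancy (~121)
planck_vs_10km3 = 117                      # exponent of the factor quoted from Fields' paper

print(f"quantum vs observed: ~{qm_vs_observed:.0f} orders")
print(f"Planck-scale vs 10 km^3 encoding: ~{planck_vs_10km3} orders")
```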
That’s a fascinating result that raises important questions about the nature of reality. First, there is the hint in Fields’ ideas that information provides the ghostly bedrock on which the laws of physics are based. That’s an idea that has gained traction among other physicists too.
Then there is the role of energy. One important question is where this energy might have come from in the first place. The process of decoherence seems to create it from nothing.
Cosmologists generally overlook violations of the principle of conservation of energy. After all, the big bang itself is the biggest offender, so don’t expect much hand-wringing over this. But Fields’ approach also implies that a purely quantum universe would have an energy density of zero, since nothing would have a localised position. That’s bizarre.
Beyond this is the even deeper question of how the universe came to be classical at all, given that cosmologists would have us believe that the big bang was a quantum process. Fields suggests that it is the interaction between the cosmic microwave background and the rest of the universe that causes the quantum nature of the universe to decohere and become classical.
Perhaps. What is all too clear is that there are fundamental and fascinating problems still to solve, both in cosmology and in understanding the role that information plays in reality.
Ref: arxiv.org/abs/1502.03424: Is Dark Energy An Artifact Of Decoherence?