In quantum field theory, a false vacuum is a metastable sector of space that appears to be a perturbative vacuum but is unstable to instanton effects, which may tunnel the system to a lower-energy state. This tunneling can be triggered by quantum fluctuations or by the creation of high-energy particles. The false vacuum is a local minimum of the energy, but not the lowest-energy state, and it may remain stable for a very long time before decaying.
Stability and instability of the vacuum
Diagram showing the Higgs boson and top quark masses, which could indicate whether our universe is stable or a long-lived 'bubble'. The outer dotted contour shows the current measurement uncertainties; the inner ones show the precision expected after future physics programs are completed, though the true values could lie anywhere inside the outer contour.[1]
For decades, scientific models of our universe have included the possibility that it exists as a long-lived, but not completely stable, sector of space, which could at some time be destroyed by 'toppling' into a more stable vacuum state.[2][3][4][5][6] If the universe were indeed in such a false vacuum state, a catastrophic bubble of more stable "true vacuum" could theoretically nucleate at any time or place, expanding outward at the speed of light.[2][7] The Standard Model of particle physics opens the possibility of calculating, from the masses of the Higgs boson and the top quark, whether the universe's present electroweak vacuum state is likely to be stable or merely long-lived.[8][9] (This was sometimes misreported as the Higgs boson "ending" the universe.[13]) A 125–127 GeV Higgs mass seems to be extremely close to the boundary for stability (estimated in 2012 as 123.8–135.0 GeV[1]). However, a definitive answer requires much more precise measurements of the top quark's pole mass,[1] and new physics beyond the Standard Model could drastically change this picture.[14]
Implications
If measurements of these particles suggest that our universe lies within a false vacuum of this kind, it would imply that, more than likely in many billions of years,[15][Note 1] the universe could cease to exist as we know it, if a true vacuum happened to nucleate.[15]
This is because, if the Standard Model is correct, the particles and forces we observe in our universe exist as they do because of underlying quantum fields. Quantum fields can have states of differing stability: 'stable', 'unstable', or 'metastable' (meaning long-lived, but capable of being "toppled" in the right circumstances). If a more stable vacuum state were able to arise, then existing particles and forces would no longer arise as they do in the universe's present state. Different particles and forces would arise from (and be shaped by) whatever new quantum states emerged. The world we know depends upon these particles and forces, so if this happened, everything around us, from subatomic particles to galaxies, and all fundamental forces, would be reconstituted into new fundamental particles, forces, and structures. The universe would lose all of its present structures and become inhabited by new ones (depending upon the exact states involved) based upon the same quantum fields.[citation needed]
It would also have implications for other aspects of physics, and would suggest that the Higgs self-coupling λ and its β_λ function could be very close to zero at the Planck scale, with "intriguing" implications, including implications for theories of gravity and Higgs-based inflation.[1]:218 A future electron–positron collider would be able to provide the precise measurements of the top quark needed for such calculations.[1]
A study posted on the arXiv in March 2015 pointed out that the vacuum decay rate could be vastly increased in the vicinity of black holes, which would serve as nucleation seeds.[1] According to this study, a potentially catastrophic vacuum decay could be triggered at any time by primordial black holes, should they exist. The study also discussed whether tiny black holes potentially produced at the LHC could trigger such a vacuum decay event, but its results were not conclusive.
Vacuum metastability event
A hypothetical vacuum metastability event would be theoretically possible if our universe were part of a metastable (false) vacuum in the first place, an issue that was highly theoretical and far from resolved in 1982.[2] A false vacuum is one that appears stable, and is stable within certain limits and conditions, but is capable of being disrupted and entering a different, more stable state. If this were the case, a bubble of lower-energy vacuum could come to exist, by chance or otherwise, in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that we know without forewarning.[3] Chaotic inflation theory suggests that the universe may be in either a false vacuum or a true vacuum state.
A paper by Coleman and de Luccia, which attempted to include simple gravitational assumptions in these theories, noted that if this were an accurate representation of nature, then the resulting universe "inside the bubble" would appear to be extremely unstable and would almost immediately collapse:[3]
In general, gravitation makes the probability of vacuum decay smaller; in the extreme case of very small energy-density difference, it can even stabilize the false vacuum, preventing vacuum decay altogether. We believe we understand this. For the vacuum to decay, it must be possible to build a bubble of total energy zero. In the absence of gravitation, this is no problem, no matter how small the energy-density difference; all one has to do is make the bubble big enough, and the volume/surface ratio will do the job. In the presence of gravitation, though, the negative energy density of the true vacuum distorts geometry within the bubble with the result that, for a small enough energy density, there is no bubble with a big enough volume/surface ratio. Within the bubble, the effects of gravitation are more dramatic. The geometry of space-time within the bubble is that of anti-de Sitter space, a space much like conventional de Sitter space except that its group of symmetries is O(3, 2) rather than O(4, 1). Although this space-time is free of singularities, it is unstable under small perturbations, and inevitably suffers gravitational collapse of the same sort as the end state of a contracting Friedmann universe. The time required for the collapse of the interior universe is on the order of ... microseconds or less.
The possibility that we are living in a false vacuum has never been a cheering one to contemplate. Vacuum decay is the ultimate ecological catastrophe; in the new vacuum there are new constants of nature; after vacuum decay, not only is life as we know it impossible, so is chemistry as we know it. However, one could always draw stoic comfort from the possibility that perhaps in the course of time the new vacuum would sustain, if not life as we know it, at least some structures capable of knowing joy. This possibility has now been eliminated.
The second special case is decay into a space of vanishing cosmological constant, the case that applies if we are now living in the debris of a false vacuum which decayed at some early cosmic epoch. This case presents us with less interesting physics and with fewer occasions for rhetorical excess than the preceding one. It is now the interior of the bubble that is ordinary Minkowski space...
Such an event would be one possible doomsday event. It was used as a plot device in science-fiction stories in 1988 by Geoffrey A. Landis,[16] in 2000 by Stephen Baxter,[17] in 2002 by Greg Egan,[18] and in 2015 by Alastair Reynolds in his novel Poseidon's Wake.
In theory, either high enough energy concentrations or random chance could trigger the tunneling needed to set this event in motion. However, an immense number of ultra-high-energy particles and events have occurred in the history of our universe, dwarfing by many orders of magnitude anything at human disposal. Hut and Rees[19] note that, because we have observed cosmic ray collisions at much higher energies than those produced in terrestrial particle accelerators, these experiments should not, at least for the foreseeable future, pose a threat to our current vacuum. Particle accelerators have reached energies of only approximately eight teraelectronvolts (8×10¹² eV). Cosmic ray collisions have been observed at and beyond energies of 10¹⁸ eV, more than a hundred thousand times more energetic (the so-called Greisen–Zatsepin–Kuzmin limit), and other cosmic events may be more powerful yet. Against this, John Leslie has argued[20] that if present trends continue, particle accelerators will exceed the energy given off in naturally occurring cosmic ray collisions by the year 2150. Fears of this kind were raised by critics of both the Relativistic Heavy Ion Collider and the Large Hadron Collider at the time of their respective proposals, and were determined to be unfounded by scientific inquiry.
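The ratio of the two energy scales quoted above is easily computed (the figures below are the approximate values from the text, not precise measurements):

```python
# Approximate energies from the text, in electronvolts.
accelerator_ev = 8e12   # ~8 TeV, reached by terrestrial particle accelerators
cosmic_ray_ev = 1e18    # cosmic ray collisions observed at and beyond this energy

# Factor by which observed cosmic ray collisions exceed accelerator energies.
ratio = cosmic_ray_ev / accelerator_ev
print(f"factor: {ratio:.2e}")  # on the order of 10^5
```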
On the other hand, if the many-worlds interpretation of quantum mechanics is correct, the explanation for why there has been no vacuum decay yet, despite many high-energy particle collisions, changes entirely. In this case, the corresponding collisions did trigger the vacuum decay, and we are not observing it simply because every such event excludes any observers in its causal (light-cone) future, and there are always worlds exactly identical to such a world in everything except the decay event and its future cone. More generally, the same applies to any set of future light cones, that is, any causally closed patch of spacetime, as long as its content either destroys any observers or is eventually unremarkable. Observers' transition through the boundary of such a patch, such as the (potential) cone of the decay event, will have quantitatively altered probability distributions, but formally and subjectively it will be just the same quantum branching, qualitatively indistinguishable from any other ordinary moment of existence. If this is the case, fine-tuning is an active process, and therefore a vacuum metastability event will never happen.[21]
Bubble nucleation
In the theoretical physics of the false vacuum, the system moves to a lower energy state, either the true vacuum or another, lower-energy vacuum, through a process known as bubble nucleation.[4][5][22][23][24][25] Here, instanton effects cause a bubble to appear in which the fields take their true vacuum values inside; the interior of the bubble therefore has a lower energy. The walls of the bubble (or domain walls) have a surface tension, as energy is expended as the fields roll over the potential barrier to the lower energy vacuum. The most likely size of the bubble is determined in the semi-classical approximation to be such that the bubble has zero total change in energy: the decrease in energy from the true vacuum in the interior is compensated by the tension of the walls.
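In the thin-wall approximation this balance can be written out explicitly. Taking σ for the wall tension and ε for the energy-density difference between the vacua (standard notation, not introduced in the article itself), the zero-energy condition fixes the bubble radius:

```latex
E(R) \;=\; 4\pi R^{2}\sigma \;-\; \frac{4}{3}\pi R^{3}\varepsilon \;=\; 0
\quad\Longrightarrow\quad
R \;=\; \frac{3\sigma}{\varepsilon}
```

The first term is the cost of the wall, the second the gain from converting the interior to true vacuum.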
Joseph Lykken has said that study of the exact properties of the Higgs boson could shed light on the possibility of vacuum collapse.[26]
Expansion of bubble
Any increase in the size of the bubble decreases its potential energy, because the energy of the wall increases as the area of a sphere (proportional to R²) while the negative contribution of the interior increases more quickly, as the volume of a sphere (proportional to R³). Therefore, after the bubble is nucleated, it quickly begins expanding at very nearly the speed of light. The excess energy contributes to the very large kinetic energy of the walls. If two bubbles are nucleated and eventually collide, it is thought that particle production would occur where the walls collide.
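This competition between a wall term growing as R² and an interior term growing as R³ can be sketched numerically. The wall tension and energy-density difference below are arbitrary illustrative units, not physical values:

```python
import math

def bubble_energy(radius, sigma, epsilon):
    """Total energy of a thin-wall bubble of given radius:
    a wall cost growing as R^2 minus an interior gain growing as R^3."""
    wall = 4.0 * math.pi * radius ** 2 * sigma                # surface-tension cost
    interior = (4.0 / 3.0) * math.pi * radius ** 3 * epsilon  # true-vacuum gain
    return wall - interior

sigma, epsilon = 1.0, 1.0        # arbitrary units (illustrative only)
r_zero = 3.0 * sigma / epsilon   # radius at which the two terms balance

print(bubble_energy(1.0, sigma, epsilon) > 0)             # small bubble: wall cost dominates
print(abs(bubble_energy(r_zero, sigma, epsilon)) < 1e-9)  # zero total change in energy
print(bubble_energy(4.0, sigma, epsilon) < 0)             # past r_zero: growth lowers energy
```

Beyond the zero-energy radius, every further increment of radius lowers the total energy, which is why the nucleated bubble accelerates outward.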
The tunnelling rate is increased by increasing the energy difference between the two vacua and decreased by increasing the height or width of the barrier.
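Coleman's thin-wall result (a standard formula, with σ the wall tension and ε the energy difference between the vacua) makes this dependence explicit: the decay rate per unit volume is exponentially suppressed by a bounce action that grows with the tension and falls steeply as the energy gap widens:

```latex
\Gamma/V \;\sim\; A\, e^{-B/\hbar},
\qquad
B \;=\; \frac{27\pi^{2}\sigma^{4}}{2\,\varepsilon^{3}}
```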
Gravitational effects
The addition of gravity to the story leads to a considerably richer variety of phenomena. The key insight is that a false vacuum with positive potential energy density is a de Sitter vacuum, in which the potential energy acts as a cosmological constant and the universe undergoes the exponential expansion of de Sitter space. This leads to a number of interesting effects, first studied by Coleman and de Luccia.[3]
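In standard notation (with V_F the false-vacuum energy density and G Newton's constant), the exponential expansion follows from the Friedmann equation once the potential energy is read as an effective cosmological constant:

```latex
H^{2} \;=\; \frac{8\pi G}{3}\,V_{F}
\quad\Longrightarrow\quad
a(t) \;\propto\; e^{Ht},
\qquad
\Lambda_{\text{eff}} \;=\; 8\pi G\, V_{F}
```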
Development of theories
Alan Guth, in his original proposal for cosmic inflation,[27] proposed that inflation could end through quantum mechanical bubble nucleation of the sort described above. See History of chaotic inflation theory. It was soon understood that a homogeneous and isotropic universe could not be preserved through the violent tunneling process. This led Andrei Linde[28] and, independently, Andreas Albrecht and Paul Steinhardt,[29] to propose "new inflation" or "slow-roll inflation", in which no tunnelling occurs and the inflationary scalar field instead rolls down a gentle slope.