
Sunday, December 31, 2023

Time travel in fiction

From Wikipedia, the free encyclopedia
Poster for the 1960 film adaptation of H. G. Wells' 1895 novella The Time Machine

Time travel is a common theme in fiction, mainly since the late 19th century, and has been depicted in a variety of media, such as literature, television, film, and advertisements.

The concept of time travel by mechanical means was popularized in H. G. Wells' 1895 story, The Time Machine. In general, time travel stories focus on the consequences of traveling into the past or the future. The central premise for these stories often involves changing history, either intentionally or by accident, and the ways by which altering the past changes the future and creates an altered present or future for the time traveler upon their return home. In other instances, the premise is that the past cannot be changed or that the future is predetermined, and the protagonist's actions turn out to be either inconsequential or intrinsic to events as they originally unfolded. Some stories focus solely on the paradoxes and alternate timelines that come with time travel, rather than time traveling itself. They often provide some sort of social commentary, as time travel provides a "necessary distancing effect" that allows science fiction to address contemporary issues in metaphorical ways.

Mechanisms

Time travel in modern fiction is sometimes achieved by space and time warps, stemming from the scientific theory of general relativity. Stories from antiquity often featured time travel into the future through a time slip brought on by traveling or sleeping, or in other cases, time travel into the past through supernatural means, for example brought on by angels or spirits.

Time slip

A time slip is a plot device in fantasy and science fiction in which a person, or group of people, seem to travel through time by unknown means. The idea of a time slip was used in 19th-century fantasy, an early example being Washington Irving's 1819 "Rip Van Winkle", where the mechanism of time travel is an extraordinarily long sleep. Mark Twain's 1889 A Connecticut Yankee in King Arthur's Court had considerable influence on later writers. The first novel to include travel to both the past and the future, with a return to the present, is Charles Dickens's 1843 A Christmas Carol.

Time slip is one of the main plot devices of time travel stories, another being a time machine. The difference is that in time slip stories, the protagonist typically has no control over and no understanding of the process (which is often never explained at all) and is either left marooned in a past or future time and must make the best of it, or is eventually returned by a process as unpredictable and uncontrolled as the journey out. The plot device is also popular in children's literature. The 2011 film Midnight in Paris similarly presents time travel as occurring without an explained mechanism, as the director "eschews a 'realist' internal logic that might explain the time travel, while also foregoing experimental time distortion techniques, in favor of straightforward editing and a fantastical narrative set-up".

Communication from the future

In literature, communication from the future is a plot device in some science fiction and fantasy stories. Forrest J. Ackerman noted in his 1973 anthology of the best fiction of the year that "the theme of getting hold of tomorrow's newspaper is a recurrent one". An early example of this device can be found in H.G. Wells's 1932 short story "The Queer Story of Brownlow's Newspaper",[20][21] which tells the tale of a man who receives such a paper from 40 years in the future. The 1944 film It Happened Tomorrow also employs this device, with the protagonist receiving the next day's newspaper from an elderly colleague (who is possibly a ghost). Ackerman's anthology also highlights a 1972 short story by Robert Silverberg, "What We Learned From This Morning's Newspaper". In that story, a block of homeowners wake to discover that on November 22, they have received The New York Times for the coming December 1. As characters learn of future events affecting them through a newspaper delivered a week early, the ultimate effect is that this "so upsets the future that spacetime is destroyed". The television series Early Edition, similar to the film It Happened Tomorrow, also revolved around a character who daily received the next day's newspaper, and sought to change some event therein forecast to happen.

A newspaper from the future can be a fictional edition of a real newspaper, or an entirely fictional newspaper. John Buchan's 1932 novel The Gap in the Curtain is similarly premised on a group of people being enabled to see, for a moment, an item in The Times newspaper from one year in the future. During the Swedish general election of 2006, the Swedish liberal party used election posters which looked like news items, called Framtidens nyheter ("News of the future"), featuring a future Sweden that had become what the party wanted.

A communication from the future raises questions about the ability of humans to control their destiny. The visual novel Steins;Gate features characters sending short text messages backwards in time to avert disaster, only to find their problems are exacerbated due to not knowing how individuals in the past will actually utilize the information.

Precognition

Precognition has been explored as a form of time travel in fiction. Author J. B. Priestley wrote of it both in fiction and non-fiction, analysing testimonials of precognition and other "temporal anomalies" in his book Man and Time. His books include time travel to the future through dreaming, which upon waking results in memories from the future. Such memories, he writes, may also lead to the feeling of déjà vu, that the present events have already been experienced and are now being re-experienced. Infallible precognition, which describes the future as it truly is, may lead to causal loops, one form of which is explored in Newcomb's paradox. The film 12 Monkeys deals heavily with themes of predestination and the Cassandra complex, with a protagonist who, having travelled back in time, explains that he cannot change the past.

The protagonist of the short story "Story of Your Life" experiences life as a superimposition of the present and the totality of her life, future included, as a consequence of learning an alien language. This mental faculty is a speculative extrapolation of the Sapir–Whorf hypothesis.

Time loop

A "time loop" or "temporal loop" is a plot device in which periods of time are repeated and re-experienced by the characters, and there is often some hope of breaking out of the cycle of repetition. Time loops are sometimes referred to as causal loops, but these two concepts are distinct. Although similar, causal loops are unchanging and self-originating, whereas time loops are constantly resetting. In a time loop, when a certain condition is met, such as the death of a character or a clock reaching a certain time, the loop starts again, with one or more characters retaining the memories from the previous loop. Stories with time loops commonly center on the character learning from each successive loop through time.

Experiencing time in reverse

In some media, certain characters are presented as moving through time backwards. This is a very old concept, with some accounts asserting that the English mythological figure Merlin lived backwards, and appeared to be able to prophesy the future because for him it was a memory. This tradition has been reflected in certain modern fictional accounts of the character. In the Piers Anthony book Bearing an Hourglass, the second of eight books in the Incarnations of Immortality series, the character of Norton becomes the incarnation of Time and continues his life living backwards in time. The 2016 film Doctor Strange has the character use the Time Stone, one of the Infinity Stones in the Marvel Cinematic Universe, to reverse time, experiencing time backwards while doing so.

In the film Tenet, characters travel in time not by jumping to an earlier moment but by experiencing past reality in reverse, at normal speed, after passing through a 'turnstile' device, until they revert to the normal flow of time by passing through such a device again. In the meantime, two versions of the time traveller coexist (and must not meet, lest they mutually destruct): the one who had been 'traveling forward' (existing normally) until entering a turnstile, and the one traveling backward from the turnstile. The laws of thermodynamics are reversed for time-traveling people and objects, so that, for example, backward travel requires the use of a respirator. Objects left behind by time travellers obey 'reverse thermodynamics'; for example, bullets shot, or even simply deposited, while traveling backward fly back into (forward-traveling) guns.

Record

Protagonists do not travel in time but perceive other times through a record. Depending on the technology, they can minimally consult the record or maximally interact with it as a simulated reality that can deviate causally from the original timeline from the point of interaction. A record can be consulted multiple times, thus providing a time loop mechanism.

Philip K. Dick's novel The Man in the High Castle features books reporting on an alternate timeline. The TV series transposes the mechanism of the books to newsreels. Incidentally, the alternate timeline is the historic timeline, as opposed to the alternate history of the works, so that the records also function as meta-references to the timeline experienced by the authors and the consumers of the works.

The plot of the film Source Code features a simulated and time-looped reality based on the memories of a dead man.

Themes

Time paradox

The idea of changing the past is logically contradictory, creating situations like the grandfather paradox, where time travellers go back in time and change the past in a way that affects their own future, such as by killing their own grandparents. The engineer Paul J. Nahin states that "even though the consensus today is that the past cannot be changed, science fiction writers have used the idea of changing the past for good story effect". Time travel to the past and precognition without the ability to change events may result in causal loops.

The possibility of characters inadvertently or intentionally changing the past gave rise to the idea of "time police", people tasked with preventing such changes from occurring by themselves engaging in time travel to rectify such changes.

Alternative future, history, timelines, and dimensions

An alternative future or alternate future is a possible future that never comes to pass, typically when someone travels back into the past and alters it so that the events of the alternative future cannot occur, or when a communication from the future to the past effects a change that alters the future. Alternative histories may also exist "side by side", with the time traveller arriving in a different dimension as they change time.

Butterfly effect

The butterfly effect is the notion that small events can have large, widespread consequences. The term describes events observed in chaos theory, where a very small change in initial conditions results in vastly different outcomes. It was coined by the mathematician and meteorologist Edward Lorenz years after the phenomenon was first described.
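
The sensitivity to initial conditions described above can be illustrated numerically; a minimal sketch using the logistic map, a standard chaos-theory example (the map, the parameter r = 4, and the starting values are illustrative choices, not taken from the text):

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime
# (r = 4). This is a textbook chaos example, not Lorenz's weather model.

def logistic_map(x0, r=4.0, steps=60):
    """Return the trajectory of the logistic map started from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

traj_a = logistic_map(0.2)
traj_b = logistic_map(0.2 + 1e-7)  # a "butterfly-sized" perturbation

# The initially negligible gap grows roughly exponentially until it is
# as large as the values themselves.
gap = max(abs(a - b) for a, b in zip(traj_a, traj_b))
print(f"largest divergence over 60 steps: {gap:.3f}")
```

A perturbation of one part in ten million ends up producing a divergence on the order of the values themselves, which is the quantitative content of the "butterfly" metaphor.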

The butterfly effect has found its way into popular imagination. For example, in Ray Bradbury's 1952 short story "A Sound of Thunder", the killing of a single insect millions of years in the past drastically changes the world, and in the 2004 film The Butterfly Effect, the protagonist's small changes to his past result in extreme changes to his present.

Time tourism

A "distinct subgenre" of stories explores time travel as a means of tourism, with travelers curious to visit periods or events such as the Victorian Era or the Crucifixion of Christ, or to meet historical figures such as Abraham Lincoln or Ludwig van Beethoven. This theme can be approached from three directions: present-day tourists travelling to the past, tourists from the future visiting the present, and present-day travellers visiting the future. An early example of the first is Ray Bradbury's 1952 "A Sound of Thunder", in which the protagonists are big-game hunters who travel to the distant past to hunt dinosaurs. An early example of the second is Catherine L. Moore and Henry Kuttner's 1946 "Vintage Season". The third type appears in the second book of Douglas Adams' The Hitchhiker's Guide to the Galaxy series, The Restaurant at the End of the Universe, which, as the title indicates, features a restaurant existing at the end of the universe, where diners (especially the rich) arrive from all over the space-time continuum to watch the explosion of the universe replayed over and over.

Time war

The Encyclopedia of Science Fiction describes a time war as a fictional war that is "fought across time, usually with each side knowingly using time travel ... in an attempt to establish the ascendancy of one or another version of history". Time wars are also known as "change wars" and "temporal wars". Examples include Clifford D. Simak's 1951 Time and Again, Barrington J. Bayley's 1974 The Fall of Chronopolis, and Matthew Costello's 1990 Time of the Fox.

Ghost story

Researcher Barbara Bronlow wrote that traditional ghost stories are in effect an early form of time travel, since they depict living people of the present interacting with (dead) people of the past. She noted as an instance that Christopher Marlowe's Doctor Faustus calls up Helen of Troy, who rises from her grave to meet him.

Temporal paradox


A temporal paradox, time paradox, or time travel paradox is a paradox, apparent contradiction, or logical contradiction associated with the idea of time travel or other foreknowledge of the future. While the notion of time travel to the future complies with the current understanding of physics via relativistic time dilation, temporal paradoxes arise from circumstances involving hypothetical time travel to the past, and are often used to demonstrate its impossibility.
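
The future-directed travel mentioned here can be made quantitative. As a standard illustration, the elapsed proper time \(\Delta\tau\) on a clock moving at speed \(v\) relates to the elapsed time \(\Delta t\) of an inertial stay-at-home observer by

```latex
\Delta t = \frac{\Delta \tau}{\sqrt{1 - v^{2}/c^{2}}}
```

so for \(v\) approaching \(c\), a short journey for the traveller corresponds to an arbitrarily long interval for those who stayed behind, which is travel to the future in the sense used above.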

Types

Temporal paradoxes fall into three broad groups: bootstrap paradoxes, consistency paradoxes, and Newcomb's paradox. Bootstrap paradoxes violate causality by allowing future events to influence the past and cause themselves, or "bootstrapping", a name which derives from the idiom "pull oneself up by one's bootstraps". Consistency paradoxes, on the other hand, are those where future events influence the past to cause an apparent contradiction, exemplified by the grandfather paradox, where a person travels to the past to kill their grandfather. Newcomb's paradox stems from the apparent contradiction between the assumptions of free will and foreknowledge of future events. All of these are sometimes referred to individually as "causal loops". A "time loop" is also sometimes described as a causal loop, but although the two appear similar, causal loops are unchanging and self-originating, whereas time loops constantly reset.

Bootstrap paradox

A bootstrap paradox, also known as an information loop, an information paradox, an ontological paradox, or a "predestination paradox", is a paradox of time travel that occurs when any event (an action, a piece of information, an object, or a person) ultimately causes itself, as a consequence of either retrocausality or time travel.

Backward time travel would allow for information, people, or objects whose histories seem to "come from nowhere". Such causally looped events then exist in spacetime, but their origin cannot be determined. The notion of objects or information that are "self-existing" in this way is often viewed as paradoxical. Everett gives the movie Somewhere in Time as an example involving an object with no origin: an old woman gives a watch to a playwright who later travels back in time, meets the same woman when she was young, and gives her the same watch that she will later give to him.

Information paradox

A second class of temporal paradoxes is related to information being created from nothing.

Predestination paradox

Smeenk uses the term "predestination paradox" to refer specifically to situations in which a time traveler goes back in time to try to prevent some event in the past.

Grandfather paradox

Top: original billiard ball trajectory. Middle: the billiard ball emerges from the future, and delivers its past self a strike that averts the past ball from entering the time machine. Bottom: The billiard ball never enters the time machine, giving rise to the paradox, putting into question how its older self could ever emerge from the time machine and divert its course.

The consistency paradox or grandfather paradox occurs when the past is changed in any way, thus creating a contradiction. A common example is traveling to the past and interfering with the conception of one's ancestors (such as causing the death of a parent beforehand), thus preventing one's own conception. If the time traveler were never born, it would not be possible for them to undertake such an act in the first place; the ancestor would then live to produce the time traveler's line, and eventually the time traveler, so no consistent outcome can be predicted. Consistency paradoxes occur whenever changing the past is possible. A possible resolution is that a time traveller can do anything that did happen, but cannot do anything that did not happen; doing something that did not happen would result in a contradiction. This is referred to as the Novikov self-consistency principle.

Variants

The grandfather paradox encompasses any change to the past, and it is presented in many variations, including killing one's past self. Both the "retro-suicide paradox" and the "grandfather paradox" appeared in letters written to Amazing Stories in the 1920s. Another variant of the grandfather paradox is the "Hitler paradox" or "Hitler's murder paradox", in which the protagonist travels back in time to murder Adolf Hitler before he can instigate World War II and the Holocaust. Rather than necessarily physically preventing time travel, the action removes any reason for the travel, along with any knowledge that the reason ever existed.

Physicist John Garrison et al. give a variation of the paradox of an electronic circuit that sends a signal through a time machine to shut itself off, and receives the signal before it sends it.

Newcomb's paradox

Newcomb's paradox is a thought experiment showing an apparent contradiction between the expected utility principle and the strategic dominance principle.

The thought experiment is often extended to explore causality and free will by allowing for "perfect predictors": if perfect predictors of the future exist, for example if time travel exists as a mechanism for making perfect predictions, then perfect predictions appear to contradict free will because decisions apparently made with free will are already known to the perfect predictor. Predestination does not necessarily involve a supernatural power, and could be the result of other "infallible foreknowledge" mechanisms. Problems arising from infallibility and influencing the future are explored in Newcomb's paradox.
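
The tension between the two principles can be made concrete with the standard payoff table. A minimal sketch, assuming the conventional $1,000 / $1,000,000 amounts and an illustrative predictor accuracy (these specifics are textbook conventions, not from this article):

```python
# Newcomb's problem: choose the opaque box only ("one-box"), or both
# boxes ("two-box"). The predictor fills the opaque box with $1,000,000
# only if it predicted one-boxing; the transparent box always holds $1,000.

PAYOFFS = {
    # (choice, predictor's prediction) -> payoff in dollars
    ("one-box", "one-box"): 1_000_000,
    ("one-box", "two-box"): 0,
    ("two-box", "one-box"): 1_001_000,
    ("two-box", "two-box"): 1_000,
}

def expected_utility(choice, accuracy):
    """Expected payoff if the predictor is right with probability `accuracy`."""
    right = PAYOFFS[(choice, choice)]
    other = "two-box" if choice == "one-box" else "one-box"
    wrong = PAYOFFS[(choice, other)]
    return accuracy * right + (1 - accuracy) * wrong

# Expected utility favours one-boxing for an accurate predictor...
assert expected_utility("one-box", 0.99) > expected_utility("two-box", 0.99)
# ...yet two-boxing strictly dominates for each fixed prediction.
assert PAYOFFS[("two-box", "one-box")] > PAYOFFS[("one-box", "one-box")]
assert PAYOFFS[("two-box", "two-box")] > PAYOFFS[("one-box", "two-box")]
```

For any accuracy above roughly 50.05%, expected utility favours one-boxing, while for each fixed prediction two-boxing pays exactly $1,000 more; the two decision principles recommend opposite choices, which is the apparent contradiction.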

Proposed resolutions

Logical impossibility

Even without knowing whether time travel to the past is physically possible, it is possible to show using modal logic that changing the past results in a logical contradiction. If it is necessarily true that the past happened in a certain way, then it is false and impossible for the past to have occurred in any other way. A time traveler would not be able to change the past from the way it is; they would only act in a way that is already consistent with what necessarily happened.

Consideration of the grandfather paradox has led some to the idea that time travel is by its very nature paradoxical and therefore logically impossible. For example, the philosopher Bradley Dowden made this sort of argument in the textbook Logical Reasoning, arguing that the possibility of creating a contradiction rules out time travel to the past entirely. However, some philosophers and scientists believe that time travel into the past need not be logically impossible provided that there is no possibility of changing the past, as suggested, for example, by the Novikov self-consistency principle. Dowden revised his view after being convinced of this in an exchange with the philosopher Norman Swartz.

Illusory time

Consideration of the possibility of backward time travel in a hypothetical universe described by a Gödel metric led famed logician Kurt Gödel to assert that time might itself be a sort of illusion. He suggests something along the lines of the block time view, in which time is just another dimension like space, with all events at all times being fixed within this four-dimensional "block".

Physical impossibility

Sergey Krasnikov writes that these bootstrap paradoxes – information or an object looping through time – are the same; the primary apparent paradox is a physical system evolving into a state in a way that is not governed by its laws. He does not find these paradoxical and attributes problems regarding the validity of time travel to other factors in the interpretation of general relativity.

Self-sufficient loops

A 1992 paper by physicists Andrei Lossev and Igor Novikov labeled such items without origin as Jinn, with the singular term Jinnee. This terminology was inspired by the Jinn of the Quran, which are described as leaving no trace when they disappear. Lossev and Novikov allowed the term "Jinn" to cover both objects and information with a reflexive origin; they called the former "Jinn of the first kind" and the latter "Jinn of the second kind". They point out that an object making a circular passage through time must be identical whenever it is brought back to the past, as otherwise it would create an inconsistency. The second law of thermodynamics seems to require that an object tend to a lower energy state over its history, and objects that are identical at repeating points in their history seem to contradict this, but Lossev and Novikov argued that since the second law only requires entropy to increase in closed systems, a Jinnee could interact with its environment in such a way as to regain "lost" entropy. They emphasize that there is no "strict difference" between Jinn of the first and second kind. Krasnikov equivocates between "Jinn", "self-sufficient loops", and "self-existing objects", calling them "lions" or "looping or intruding objects", and asserts that they are no less physical than conventional objects, "which, after all, also could appear only from either infinity or a singularity."

Novikov self-consistency principle

The self-consistency principle developed by Igor Dmitriyevich Novikov expresses one view as to how backward time travel would be possible without the generation of paradoxes. According to this hypothesis, even though general relativity permits some exact solutions containing closed timelike curves that lead back to the same point in spacetime, physics in or near closed timelike curves (time machines) can only be consistent with the universal laws of physics, and thus only self-consistent events can occur. Anything a time traveler does in the past must have been part of history all along, and the time traveler can never do anything to prevent the trip back in time from happening, since this would represent an inconsistency. Novikov and his co-authors concluded that time travel need not lead to unresolvable paradoxes, regardless of what type of object is sent to the past.

Physicist Joseph Polchinski considered a potentially paradoxical situation involving a billiard ball that is fired into a wormhole at just the right angle such that it will be sent back in time and collides with its earlier self, knocking it off course, which would stop it from entering the wormhole in the first place. Kip Thorne referred to this problem as "Polchinski's paradox". Thorne and two of his students at Caltech, Fernando Echeverria and Gunnar Klinkhammer, went on to find a solution that avoided any inconsistencies, and found that there was more than one self-consistent solution, with slightly different angles for the glancing blow in each case. Later analysis by Thorne and Robert Forward showed that for certain initial trajectories of the billiard ball, there could be an infinite number of self-consistent solutions. It is plausible that there exist self-consistent extensions for every possible initial trajectory, although this has not been proven. The lack of constraints on initial conditions only applies to spacetime outside of the chronology-violating region of spacetime; the constraints on the chronology-violating region might prove to be paradoxical, but this is not yet known.

Novikov's views are not widely accepted. Visser views causal loops and Novikov's self-consistency principle as an ad hoc solution, and supposes that there are far more damaging implications of time travel. Krasnikov similarly finds no inherent fault in causal loops but finds other problems with time travel in general relativity. Another conjecture, the cosmic censorship hypothesis, suggests that every closed timelike curve passes through an event horizon, which prevents such causal loops from being observed.

Parallel universes

The interacting-multiple-universes approach is a variation of the many-worlds interpretation of quantum mechanics that involves time travelers arriving in a different universe than the one from which they came; it has been argued that, since travelers arrive in a different universe's history and not their history, this is not "genuine" time travel. Stephen Hawking has argued for the chronology protection conjecture, that even if the MWI is correct, we should expect each time traveler to experience a single self-consistent history so that time travelers remain within their world rather than traveling to a different one.

David Deutsch has proposed that quantum computation with a negative delay—backward time travel—produces only self-consistent solutions, and the chronology-violating region imposes constraints that are not apparent through classical reasoning. However Deutsch's self-consistency condition has been demonstrated as capable of being fulfilled to arbitrary precision by any system subject to the laws of classical statistical mechanics, even if it is not built up by quantum systems. Allen Everett has also argued that even if Deutsch's approach is correct, it would imply that any macroscopic object composed of multiple particles would be split apart when traveling back in time, with different particles emerging in different worlds.

Postulates of special relativity

In physics, Albert Einstein derived the theory of special relativity in 1905 from principles now called the postulates of special relativity. Einstein's formulation is said to require only two postulates, though his derivation implies a few more assumptions.

The idea that special relativity depended only on two postulates, both of which seemed to follow from the theory and experiment of the day, was one of the most compelling arguments for the correctness of the theory (Einstein 1912: "This theory is correct to the extent to which the two principles upon which it is based are correct. Since these seem to be correct to a great extent, ...").

Postulates of special relativity

1. First postulate (principle of relativity)

The laws of physics take the same form in all inertial frames of reference.

2. Second postulate (invariance of c)

As measured in any inertial frame of reference, light is always propagated in empty space with a definite velocity c that is independent of the state of motion of the emitting body. Or: the speed of light in free space has the same value c in all inertial frames of reference.

The two-postulate basis for special relativity is the one historically used by Einstein, and it is sometimes the starting point today. As Einstein himself later acknowledged, the derivation of the Lorentz transformation tacitly makes use of some additional assumptions, including spatial homogeneity, isotropy, and memorylessness. Hermann Minkowski also implicitly used both postulates when he introduced the Minkowski space formulation, even though he showed that c can be seen as a space-time constant, with the identification with the speed of light derived from optics.
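
Together with those tacit assumptions, the two postulates single out the Lorentz transformation. As a sketch, in standard configuration (relative velocity \(v\) along the shared \(x\)-axis) it reads

```latex
t' = \gamma\left(t - \frac{v x}{c^{2}}\right), \qquad
x' = \gamma\,(x - v t), \qquad
y' = y, \qquad z' = z,
\qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
```

Both postulates can be checked directly against this form: the transformation leaves the laws of physics form-invariant between inertial frames, and a signal with \(x = ct\) maps to \(x' = ct'\).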

Alternative derivations of special relativity

Historically, Hendrik Lorentz and Henri Poincaré (1892–1905) derived the Lorentz transformation from Maxwell's equations, which served to explain the negative result of all aether drift measurements. The luminiferous aether thereby becomes undetectable, in agreement with what Poincaré called the principle of relativity (see History of Lorentz transformations and Lorentz ether theory). A more modern example of deriving the Lorentz transformation from electrodynamics (without using the historical aether concept at all) was given by Richard Feynman.

George Francis FitzGerald already made an argument similar to Einstein's in 1889, in response to the Michelson-Morley experiment seeming to show both postulates to be true. He wrote that a length contraction is "almost the only hypothesis that can reconcile" the apparent contradictions. Lorentz independently came to similar conclusions, and later wrote "the chief difference being that Einstein simply postulates what we have deduced".

Following these derivations, many alternative derivations have been proposed, based on various sets of assumptions. It has often been argued (such as by Vladimir Ignatowski in 1910, or Philipp Frank and Hermann Rothe in 1911, and many others in subsequent years) that a formula equivalent to the Lorentz transformation, up to a nonnegative free parameter, follows from the relativity postulate alone, without first postulating the universal light speed. These formulations rely on the aforementioned additional assumptions such as isotropy. The numerical value of the parameter in these transformations can then be determined by experiment, just as the numerical values of the parameter pair c and the vacuum permittivity are left to be determined by experiment even when using Einstein's original postulates. Experiment rules out the validity of the Galilean transformations. Once the numerical values have been found, both Einstein's approach and the alternative approaches result in the same theory.
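
The one-parameter family that these relativity-postulate-only derivations arrive at can be sketched (in one spatial dimension, with \(K \ge 0\) the free parameter left to experiment) as

```latex
t' = \frac{t - K v x}{\sqrt{1 - K v^{2}}}, \qquad
x' = \frac{x - v t}{\sqrt{1 - K v^{2}}}.
```

Setting \(K = 0\) recovers the Galilean transformation, while experiment fixes \(K = 1/c^{2}\), reproducing the Lorentz transformation.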

Insufficiency of the two standard postulates

Einstein's 1905 derivation is not complete. A break in Einstein's logic occurs where, after having established "the law of the constancy of the speed of light" for empty space, he invokes the law in situations where space is no longer empty. For the derivation to apply to physical objects requires an additional postulate or "bridging hypothesis", that the geometry derived for empty space also applies when a space is populated. This would be equivalent to stating that we know that the introduction of matter into a region, and its relative motion, have no effect on lightbeam geometry.

Such a statement would be problematic, as Einstein rejected the notion that a process such as light-propagation could be immune to other factors (1914: "There can be no doubt that this principle is of far-reaching significance; and yet, I cannot believe in its exact validity. It seems to me unbelievable that the course of any process (e.g., that of the propagation of light in a vacuum) could be conceived of as independent of all other events in the world.")

Including this "bridge" as an explicit third postulate might also have damaged the theory's credibility, as refractive index and the Fizeau effect would have suggested that the presence and behaviour of matter does seem to influence light-propagation, contra the theory. If this bridging hypothesis had been stated as a third postulate, it could have been claimed that the third postulate (and therefore the theory) were falsified by the experimental evidence.

The 1905 system as "null theory"

Without a "bridging hypothesis" as a third postulate, the 1905 derivation is open to the criticism that its derived relationships may only apply in vacuo, that is, in the absence of matter.

The controversial suggestion that the 1905 theory, derived by assuming empty space, might only apply to empty space, appears in Edwin F. Taylor and John Archibald Wheeler's book "Spacetime Physics" (Box 3-1: "The Principle of Relativity Rests on Emptiness").

A similar suggestion that the reduction of GR geometry to SR's flat spacetime over small regions may be "unphysical" (because flat pointlike regions cannot contain matter capable of acting as physical observers) was acknowledged but rejected by Einstein in 1914 ("The equations of the new theory of relativity reduce to those of the original theory in the special case where the gμν can be considered constant ... the sole objection that can be raised against the theory is that the equations we have set up might, perhaps, be void of any physical content. But no one is likely to think in earnest that this objection is justified in the present case").

Einstein revisited the problem in 1919 ("It is by no means settled a priori that a limiting transition of this kind has any possible meaning. For if gravitational fields do play an essential part in the structure of the particles of matter, the transition to the limiting case of constant gμν would, for them, lose its justification, for indeed, with constant gμν there could not be any particles of matter.")

A further argument for unphysicality can be gleaned from Einstein's solution to the "hole problem" under general relativity, in which Einstein rejects the physicality of coordinate-system relationships in truly empty space.

Alternative relativistic models

Einstein's special theory is not the only theory that combines a form of lightspeed constancy with the relativity principle. A theory along the lines of that proposed by Heinrich Hertz (in 1890) allows for light to be fully dragged by all objects, giving local c-constancy for all physical observers. The logical possibility of a Hertzian theory shows that Einstein's two standard postulates (without the bridging hypothesis) are not sufficient to allow us to arrive uniquely at the solution of special relativity (although special relativity might be considered the most minimalist solution).

Einstein agreed that the Hertz theory was logically consistent ("It is on the basis of this hypothesis that Hertz developed an electrodynamics of moving bodies that is free of contradictions."), but dismissed it on the grounds of a poor agreement with the Fizeau result, leaving special relativity as the only remaining option. Given that SR was similarly unable to reproduce the Fizeau result without introducing additional auxiliary rules (to address the different behaviour of light in a particulate medium), this was perhaps not a fair comparison.

Mathematical formulation of the postulates

In the rigorous mathematical formulation of special relativity, we suppose that the universe exists on a four-dimensional spacetime M. Individual points in spacetime are known as events; physical objects in spacetime are described by worldlines (if the object is a point particle) or worldsheets (if the object is larger than a point). The worldline or worldsheet only describes the motion of the object; the object may also have several other physical characteristics such as energy-momentum, mass, charge, etc.

In addition to events and physical objects, there is a class of inertial frames of reference. Each inertial frame of reference provides a coordinate system for events in the spacetime M. Furthermore, this frame of reference also gives coordinates to all other physical characteristics of objects in the spacetime; for instance, it will provide coordinates for the momentum and energy of an object, coordinates for an electromagnetic field, and so forth.

We assume that given any two inertial frames of reference, there exists a coordinate transformation that converts the coordinates from one frame of reference to the coordinates in another frame of reference. This transformation not only provides a conversion for spacetime coordinates (t, x, y, z), but will also provide a conversion for all other physical coordinates, such as a conversion law for momentum and energy (E, p1, p2, p3), etc. (In practice, these conversion laws can be efficiently handled using the mathematics of tensors.)
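As a concrete sketch of such a conversion (restricted to one space dimension for brevity; all numbers are illustrative), the same boost matrix converts both the event coordinates (ct, x) and the energy-momentum components (E/c, p) between two frames, leaving the corresponding invariants unchanged:

```python
import math

def lorentz_matrix(beta):
    """2x2 boost matrix for velocity beta = v/c, in units where c = 1."""
    g = 1.0 / math.sqrt(1.0 - beta**2)
    return [[g, -g * beta], [-g * beta, g]]

def apply(m, vec):
    return [m[0][0] * vec[0] + m[0][1] * vec[1],
            m[1][0] * vec[0] + m[1][1] * vec[1]]

L = lorentz_matrix(0.6)

event = [3.0, 1.0]          # (ct, x) of an event in frame F
four_momentum = [5.0, 4.0]  # (E/c, p) of a particle in frame F

event_prime = apply(L, event)          # same matrix converts both objects
p_prime = apply(L, four_momentum)

# Each conversion preserves its invariant: the interval t^2 - x^2 for the
# event, and E^2/c^2 - p^2 (the rest mass squared) for the momentum.
s2  = event[0]**2 - event[1]**2
s2p = event_prime[0]**2 - event_prime[1]**2
m2  = four_momentum[0]**2 - four_momentum[1]**2
m2p = p_prime[0]**2 - p_prime[1]**2
print(abs(s2 - s2p) < 1e-9 and abs(m2 - m2p) < 1e-9)  # True
```

This is the tensor bookkeeping mentioned above in its simplest form: one transformation matrix, applied uniformly to every quantity that carries frame-dependent coordinates.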

We also assume that the universe obeys a number of physical laws. Mathematically, each physical law can be expressed with respect to the coordinates given by an inertial frame of reference by a mathematical equation (for instance, a differential equation) which relates the various coordinates of the various objects in the spacetime. A typical example is Maxwell's equations. Another is Newton's first law.

1. First Postulate (Principle of relativity)

Under transitions between inertial reference frames, the equations of all fundamental laws of physics stay form-invariant, while all the numerical constants entering these equations preserve their values. Thus, if a fundamental physical law is expressed with a mathematical equation in one inertial frame, it must be expressed by an identical equation in any other inertial frame, provided both frames are parameterised with charts of the same type. (The caveat on charts is relaxed, if we employ connections to write the law in a covariant form.)

2. Second Postulate (Invariance of c)

There exists an absolute constant 0 < c < ∞ with the following property. If A, B are two events which have coordinates (tA, xA, yA, zA) and (tB, xB, yB, zB) in one inertial frame F, and have coordinates (t′A, x′A, y′A, z′A) and (t′B, x′B, y′B, z′B) in another inertial frame F′, then

√((xB − xA)² + (yB − yA)² + (zB − zA)²) = c(tB − tA)

if and only if

√((x′B − x′A)² + (y′B − y′A)² + (z′B − z′A)²) = c(t′B − t′A).

Informally, the Second Postulate asserts that objects travelling at speed c in one reference frame will necessarily travel at speed c in all reference frames. This postulate is a subset of the postulates that underlie Maxwell's equations in the interpretation given to them in the context of special relativity. However, Maxwell's equations rely on several other postulates, some of which are now known to be false (e.g., Maxwell's equations cannot account for the quantum attributes of electromagnetic radiation).

The second postulate can be used to imply a stronger version of itself, namely that the spacetime interval is invariant under changes of inertial reference frame. In the above notation, this means that

c²(tB − tA)² − (xB − xA)² − (yB − yA)² − (zB − zA)² = c²(t′B − t′A)² − (x′B − x′A)² − (y′B − y′A)² − (z′B − z′A)²

for any two events A, B. This can in turn be used to deduce the transformation laws between reference frames; see Lorentz transformation.
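A quick numerical check of the invariance (a sketch in units where c = 1, with arbitrary illustrative events):

```python
import math

def boost(beta, t, x):
    """Lorentz boost of an event (t, x) by velocity beta, units with c = 1."""
    g = 1.0 / math.sqrt(1.0 - beta**2)
    return g * (t - beta * x), g * (x - beta * t)

def interval(a, b):
    """Spacetime interval (tB - tA)^2 - (xB - xA)^2 between events, c = 1."""
    return (b[0] - a[0])**2 - (b[1] - a[1])**2

A, B = (0.0, 0.0), (5.0, 3.0)   # two events as (t, x) in frame F

Ap = boost(0.8, *A)             # the same events in a frame moving at 0.8c
Bp = boost(0.8, *B)

# The interval is the same in both frames.
print(abs(interval(A, B) - interval(Ap, Bp)) < 1e-9)  # True
```

The individual coordinate differences change under the boost; only the combination above survives, which is exactly what singles out the Lorentz transformation.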

The postulates of special relativity can be expressed very succinctly using the mathematical language of pseudo-Riemannian manifolds. The second postulate is then an assertion that the four-dimensional spacetime M is a pseudo-Riemannian manifold equipped with a metric g of signature (1,3), which is given by the Minkowski metric when measured in each inertial reference frame. This metric is viewed as one of the physical quantities of the theory; thus it transforms in a certain manner when the frame of reference is changed, and it can be legitimately used in describing the laws of physics. The first postulate is an assertion that the laws of physics are invariant when represented in any frame of reference for which g is given by the Minkowski metric. One advantage of this formulation is that it is now easy to compare special relativity with general relativity, in which the same two postulates hold but the assumption that the metric is required to be Minkowski is dropped.

The theory of Galilean relativity is the limiting case of special relativity in the limit c → ∞ (which is sometimes referred to as the non-relativistic limit). In this theory, the first postulate remains unchanged, but the second postulate is modified to:

If A, B are two events which have coordinates (tA, xA, yA, zA) and (tB, xB, yB, zB) in one inertial frame F, and have coordinates (t′A, x′A, y′A, z′A) and (t′B, x′B, y′B, z′B) in another inertial frame F′, then tB − tA = t′B − t′A. Furthermore, if tA = tB, then

√((xB − xA)² + (yB − yA)² + (zB − zA)²) = √((x′B − x′A)² + (y′B − y′A)² + (z′B − z′A)²).

The physical theory given by classical mechanics and Newtonian gravity is consistent with Galilean relativity, but not with special relativity. Conversely, Maxwell's equations are not consistent with Galilean relativity unless one postulates the existence of a physical aether. In a surprising number of cases, the laws of physics in special relativity (such as the famous equation E = mc²) can be deduced by combining the postulates of special relativity with the hypothesis that the laws of special relativity approach the laws of classical mechanics in the non-relativistic limit.
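The non-relativistic limit can be checked numerically. In the sketch below (illustrative numbers), the Lorentz-transformed coordinates of an event approach the Galilean values t′ = t, x′ = x − vt as c is made large:

```python
import math

def lorentz(t, x, v, c):
    """Lorentz boost of an event (t, x) by velocity v, for light speed c."""
    g = 1.0 / math.sqrt(1.0 - (v / c)**2)
    return g * (t - v * x / c**2), g * (x - v * t)

t, x, v = 1.0, 2.0, 30.0  # an event and a boost velocity, arbitrary units

# As c grows, the result converges to the Galilean values (t, x - v*t).
for c in (1e3, 1e6, 1e9):
    tp, xp = lorentz(t, x, v, c)
    print(c, tp, xp)

# The Galilean prediction for comparison:
print(t, x - v * t)  # 1.0 -28.0
```

This is the sense in which Galilean relativity is the c → ∞ limiting case: every finite-velocity prediction of the Lorentz transformation approaches its Galilean counterpart.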

Cosmological constant problem

From Wikipedia, the free encyclopedia
 
In cosmology, the cosmological constant problem or vacuum catastrophe is the substantial disagreement between the observed values of vacuum energy density (the small value of the cosmological constant) and the much larger theoretical value of zero-point energy suggested by quantum field theory.

Depending on the Planck energy cutoff and other factors, the quantum vacuum energy contribution to the effective cosmological constant is calculated to be between 50 and as much as 120 orders of magnitude greater than observed, a state of affairs described by physicists as "the largest discrepancy between theory and experiment in all of science" and "the worst theoretical prediction in the history of physics".

History

The basic problem of a vacuum energy producing a gravitational effect was identified as early as 1916 by Walther Nernst. He predicted that the value had to be either zero or very small. In 1926, Wilhelm Lenz concluded that "If one allows waves of the shortest observed wavelengths λ ≈ 2 × 10⁻¹¹ cm, ... and if this radiation, converted to material density (u/c² ≈ 10⁶), contributed to the curvature of the observable universe – one would obtain a vacuum energy density of such a value that the radius of the observable universe would not reach even to the Moon."

After the development of quantum field theory in the 1940s, the first to address contributions of quantum fluctuations to the cosmological constant was Yakov Zel'dovich in the 1960s. In quantum mechanics, the vacuum itself should experience quantum fluctuations. In general relativity, those quantum fluctuations constitute energy that would add to the cosmological constant. However, this calculated vacuum energy density is many orders of magnitude bigger than the observed cosmological constant. Original estimates of the degree of mismatch were as high as 120 to 122 orders of magnitude; however, modern research suggests that, when Lorentz invariance is taken into account, the degree of mismatch is closer to 60 orders of magnitude.
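The size of the mismatch can be reproduced with a back-of-the-envelope calculation. The sketch below (an illustration only; the exponent depends entirely on the chosen cutoff) compares a naive Planck-cutoff vacuum energy density, of order c⁷/(ħG²), with the observed value quoted later in this article:

```python
import math

# Rounded CODATA-style constants, SI units.
hbar = 1.0546e-34   # J*s
G    = 6.674e-11    # m^3 kg^-1 s^-2
c    = 2.9979e8     # m/s

# Naive vacuum energy density with the cutoff at the Planck scale.
rho_planck = c**7 / (hbar * G**2)   # J/m^3

# Observed vacuum energy density (from the Planck 2015 figure below).
rho_observed = 5.4e-10              # J/m^3

orders = math.log10(rho_planck / rho_observed)
print(round(orders))  # 123
```

A lower cutoff, such as the electroweak scale, shrinks the exponent toward the ~50-60 orders of magnitude cited above, which is why the quoted mismatch spans such a wide range.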

With the development of inflationary cosmology in the 1980s, the problem became much more important: as cosmic inflation is driven by vacuum energy, differences in modeling vacuum energy lead to huge differences in the resulting cosmologies. Were the vacuum energy precisely zero, as was once believed, then the expansion of the universe would not accelerate as observed, according to the standard Λ-CDM model.

Cutoff dependence

The calculated vacuum energy is a positive, rather than negative, contribution to the cosmological constant because the existing vacuum has negative quantum-mechanical pressure, while in general relativity, the gravitational effect of negative pressure is a kind of repulsion. (Pressure here is defined as the flux of quantum-mechanical momentum across a surface.) Roughly, the vacuum energy is calculated by summing over all known quantum-mechanical fields, taking into account interactions and self-interactions between the ground states, and then removing all interactions below a minimum "cutoff" wavelength to reflect that existing theories break down and may fail to be applicable around the cutoff scale. Because the energy is dependent on how fields interact within the current vacuum state, the vacuum energy contribution would have been different in the early universe; for example, the vacuum energy would have been significantly different prior to electroweak symmetry breaking during the quark epoch.

Renormalization

The vacuum energy in quantum field theory can be set to any value by renormalization. This view treats the cosmological constant as simply another fundamental physical constant not predicted or explained by theory. Such a renormalization constant must be chosen very accurately because of the many-orders-of-magnitude discrepancy between theory and observation, and many theorists consider this ad-hoc constant as equivalent to ignoring the problem.

Estimated values

The vacuum energy density of the Universe based on 2015 measurements by the Planck collaboration is ρvac = 5.96×10⁻²⁷ kg/m³ ≈ 5.3566×10⁻¹⁰ J/m³ = 3.35 GeV/m³, or about 2.5×10⁻⁴⁷ GeV⁴ in geometrized units.

One assessment, made by Jérôme Martin of the Institut d'Astrophysique de Paris in 2012, placed the expected theoretical vacuum energy scale around 10⁸ GeV⁴, for a difference of about 55 orders of magnitude.
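The quoted figures are mutually consistent, as a short unit-conversion check shows (constants rounded; the 10⁸ GeV⁴ comparison uses Martin's estimate above):

```python
import math

c     = 2.9979e8      # m/s
GeV   = 1.602e-10     # J per GeV
hbarc = 1.9733e-16    # hbar*c in GeV*m

rho_kg  = 5.96e-27            # kg/m^3, the Planck 2015 value quoted above
rho_J   = rho_kg * c**2       # as an energy density, J/m^3
rho_GeV = rho_J / GeV         # in GeV/m^3
rho_nat = rho_GeV * hbarc**3  # in GeV^4 (natural units)

print(f"{rho_J:.2e} J/m^3")     # ~5.36e-10 J/m^3
print(f"{rho_GeV:.2f} GeV/m^3") # ~3.34 GeV/m^3
print(f"{rho_nat:.1e} GeV^4")   # ~2.6e-47 GeV^4

# Gap to the ~1e8 GeV^4 theoretical estimate:
orders = math.log10(1e8 / rho_nat)
print(round(orders))  # 55
```

Each line reproduces one of the quoted numbers to within rounding, including the 55-orders-of-magnitude gap.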

Proposed solutions

Some physicists propose an anthropic solution, and argue that we live in one region of a vast multiverse that has different regions with different vacuum energies. These anthropic arguments posit that only regions of small vacuum energy such as the one in which we live are reasonably capable of supporting intelligent life. Such arguments have existed in some form since at least 1981. Around 1987, Steven Weinberg estimated that the maximum allowable vacuum energy for gravitationally-bound structures to form is problematically large, even given the observational data available in 1987, and concluded the anthropic explanation appears to fail; however, more recent estimates by Weinberg and others, based on other considerations, find the bound to be closer to the actual observed level of dark energy. Anthropic arguments gradually gained credibility among many physicists after the discovery of dark energy and the development of the theoretical string theory landscape, but are still derided by a substantial skeptical portion of the scientific community as being problematic to verify. Proponents of anthropic solutions are themselves divided on multiple technical questions surrounding how to calculate the proportion of regions of the universe with various dark energy constants.

Other proposals involve modifying gravity to diverge from general relativity. These proposals face the hurdle that the results of observations and experiments so far have tended to be extremely consistent with general relativity and the ΛCDM model, and inconsistent with thus-far proposed modifications. In addition, some of the proposals are arguably incomplete, because they solve the "new" cosmological constant problem by proposing that the actual cosmological constant is exactly zero rather than a tiny number, but fail to solve the "old" cosmological constant problem of why quantum fluctuations seem to fail to produce substantial vacuum energy in the first place. Nevertheless, many physicists argue that, due in part to a lack of better alternatives, proposals to modify gravity should be considered "one of the most promising routes to tackling" the cosmological constant problem.

Bill Unruh and collaborators have argued that when the energy density of the quantum vacuum is modeled more accurately as a fluctuating quantum field, the cosmological constant problem does not arise. Going in a different direction, George F. R. Ellis and others have suggested that in unimodular gravity, the troublesome contributions simply do not gravitate.

Another argument, due to Stanley Brodsky and Robert Shrock, is that in light front quantization, the quantum field theory vacuum becomes essentially trivial. In the absence of vacuum expectation values, there is no contribution from QED, weak interactions, and QCD to the cosmological constant. It is thus predicted to be zero in a flat space-time. From light front quantization insight, the origin of the cosmological constant problem is traced back to unphysical non-causal terms in the standard calculation, which lead to an erroneously large value of the cosmological constant. 

In 2018, a mechanism for cancelling Λ out was proposed through the use of a symmetry-breaking potential in a Lagrangian formalism in which matter exhibits a non-vanishing pressure. The model assumes that standard matter provides a pressure which counterbalances the action due to the cosmological constant. Luongo and Muccino have shown that this mechanism permits taking the vacuum energy at the value quantum field theory predicts, while removing its huge magnitude through a counterbalancing term due to baryons and cold dark matter only.

In 1999, Andrew Cohen, David B. Kaplan and Ann Nelson proposed that correlations between the UV and IR cutoffs in effective quantum field theory are enough to reduce the theoretical cosmological constant down to the measured cosmological constant due to the CKN bound. In 2021, Nikita Blinov and Patrick Draper confirmed through the holographic principle that the CKN bound predicts the measured cosmological constant, all while maintaining the predictions of effective field theory in less extreme conditions.
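The scale such a Hubble-radius IR cutoff picks out can be sketched with an order-of-magnitude check (the constants and H₀ value are assumptions for illustration, and the factor-of-order-one prefactors of the CKN bound are ignored): a density of order the critical density 3H₀²c²/(8πG) is within a factor of order one of the measured dark-energy density.

```python
import math

G  = 6.674e-11           # m^3 kg^-1 s^-2
c  = 2.9979e8            # m/s
H0 = 67.7e3 / 3.0857e22  # Hubble constant in 1/s (67.7 km/s/Mpc assumed)

# Critical energy density of the universe, the scale a Hubble-radius
# cutoff naturally produces (prefactors of order one ignored).
rho_crit = 3 * H0**2 * c**2 / (8 * math.pi * G)   # J/m^3

# Measured vacuum energy density (Planck 2015 figure quoted above).
rho_obs = 5.36e-10                                # J/m^3

print(round(rho_obs / rho_crit, 2))  # ~0.69, the dark-energy fraction
```

That the ratio is of order one, rather than differing by dozens of orders of magnitude, is the sense in which such cutoff-correlation arguments "predict" the measured cosmological constant.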

Inequality (mathematics)

From Wikipedia, the free encyclopedia