
Wednesday, November 5, 2025

Many-worlds interpretation

From Wikipedia, the free encyclopedia
The quantum-mechanical "Schrödinger's cat" paradox according to the many-worlds interpretation. In this interpretation, every quantum event is a branch point; the cat is both alive and dead, even after the box is opened, but the "alive" and "dead" cats are in different branches of the multiverse, both of which are equally real, but which do not interact with each other.

The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wave function collapse. This implies that all possible outcomes of quantum measurements are physically realized in different "worlds". The evolution of reality as a whole in MWI is rigidly deterministic and local. Many-worlds is also called the relative state formulation or the Everett interpretation, after physicist Hugh Everett, who first proposed it in 1957. Bryce DeWitt popularized the formulation and named it many-worlds in the 1970s.

In modern versions of many-worlds, the subjective appearance of wave function collapse is explained by the mechanism of quantum decoherence. Decoherence approaches to interpreting quantum theory have been widely explored and developed since the 1970s. MWI is considered a mainstream interpretation of quantum mechanics, along with the other decoherence interpretations, the Copenhagen interpretation, and hidden variable theories such as Bohmian mechanics.

The many-worlds interpretation implies that there are many parallel, non-interacting worlds. It is one of a number of multiverse hypotheses in physics and philosophy. MWI views time as a many-branched tree, wherein every possible quantum outcome is realized. This is intended to resolve the measurement problem and thus some paradoxes of quantum theory, such as Wigner's friend, the EPR paradox and Schrödinger's cat, since every possible outcome of a quantum event exists in its own world.

Overview of the interpretation

The many-worlds interpretation's key idea is that the linear and unitary dynamics of quantum mechanics applies everywhere and at all times and so describes the whole universe. In particular, it models a measurement as a unitary transformation, a correlation-inducing interaction, between observer and object, without using a collapse postulate, and models observers as ordinary quantum-mechanical systems. This stands in contrast to the Copenhagen interpretation, in which a measurement is a "primitive" concept, not describable by unitary quantum mechanics; in the Copenhagen interpretation the universe is divided into a quantum and a classical domain, and the collapse postulate is central. In MWI, there is no division between classical and quantum: everything is quantum and there is no collapse. MWI's main conclusion is that the universe (or multiverse, in this context) is composed of a quantum superposition of a vast, perhaps uncountable, number of increasingly divergent, non-communicating parallel universes or quantum worlds. Sometimes dubbed Everett worlds, each is an internally consistent and actualized alternative history or timeline.

The many-worlds interpretation uses decoherence to explain the measurement process and the emergence of a quasi-classical world. Wojciech H. Zurek, one of decoherence theory's pioneers, said: "Under scrutiny of the environment, only pointer states remain unchanged. Other states decohere into mixtures of stable pointer states that can persist, and, in this sense, exist: They are einselected." Zurek emphasizes that his work does not depend on a particular interpretation.

The many-worlds interpretation shares many similarities with the decoherent histories interpretation, which also uses decoherence to explain the process of measurement or wave function collapse. MWI treats the other histories or worlds as real, since it regards the universal wave function as the "basic physical entity" or "the fundamental entity, obeying at all times a deterministic wave equation". The decoherent histories interpretation, on the other hand, needs only one of the histories (or worlds) to be real.

Several authors, including Everett, John Archibald Wheeler and David Deutsch, call many-worlds a theory or metatheory, rather than just an interpretation. Everett argued that it was the "only completely coherent approach to explaining both the contents of quantum mechanics and the appearance of the world." Deutsch dismissed the idea that many-worlds is an "interpretation", saying that to call it an interpretation "is like talking about dinosaurs as an 'interpretation' of fossil records".

Formulation

In his 1957 doctoral dissertation, Everett proposed that, rather than relying on external observation for analysis of isolated quantum systems, one could mathematically model an object, as well as its observers, as purely physical systems within the mathematical framework developed by Paul Dirac, John von Neumann, and others, discarding altogether the ad hoc mechanism of wave function collapse.

Relative state

Everett's original work introduced the concept of a relative state. Two (or more) subsystems, after a general interaction, become correlated, or as is now said, entangled. Everett noted that such entangled systems can be expressed as a sum of products of states, where the two or more subsystems are each in a state relative to each other. After a measurement or observation, one member of the pair (or triple, etc.) is the measured or observed object system, and another member is the measuring apparatus (which may include an observer) having recorded the state of the measured system. Each product of subsystem states in the overall superposition evolves over time independently of the other products. Once the subsystems interact, their states are correlated or entangled and can no longer be considered independent. In Everett's terminology, each subsystem state was now correlated with its relative state, since each subsystem must now be considered relative to the other subsystems with which it has interacted.

In the example of Schrödinger's cat, after the box is opened, the entangled system is the cat, the poison vial and the observer. One relative triple of states would be the alive cat, the unbroken vial and the observer seeing an alive cat. Another relative triple of states would be the dead cat, the broken vial and the observer seeing a dead cat.

In the example of a measurement of a continuous variable (e.g., position q), the object–observer system decomposes into a continuum of pairs of relative states: each object-system relative state is a Dirac delta function centered on a particular value of q, and the corresponding observer relative state represents an observer having recorded that value of q. After the measurement, the states within each relative pair are correlated with each other.
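Schematically, and in notation chosen here rather than Everett's own, this continuous decomposition can be written as:

```latex
% Post-measurement object--observer state for a position measurement.
% \phi(q) is the object's initial wavefunction; |O_q\rangle denotes an
% observer state that has recorded the value q (notation assumed here).
\lvert \Psi \rangle
\;=\;
\int \! dq \;\phi(q)\,
\lvert q \rangle_{\text{object}} \otimes \lvert O_q \rangle_{\text{observer}}
```

Each pair \((\lvert q \rangle, \lvert O_q \rangle)\) is one member of the continuum of relative-state pairs; the delta-function character of \(\lvert q \rangle\) reflects a perfectly sharp position record.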

In Everett's scheme, there is no collapse; instead, the Schrödinger equation, or its relativistic quantum field theory analog, holds all the time, everywhere. An observation or measurement is modeled by applying the wave equation to the entire system, comprising the object being observed and the observer. One consequence is that every observation causes the combined observer–object wavefunction to change into a quantum superposition of two or more non-interacting branches.

Thus the process of measurement or observation, or any correlation-inducing interaction, splits the system into sets of relative states, where each set of relative states, forming a branch of the universal wave function, is consistent within itself, and all future measurements (including by multiple observers) will confirm this consistency.

Renamed many-worlds

Everett had referred to the combined observer–object system as split by an observation, each split corresponding to the different or multiple possible outcomes of an observation. These splits generate a branching tree, where each branch is a set of all the states relative to each other. Bryce DeWitt popularized Everett's work with a series of publications calling it the Many Worlds Interpretation. Focusing on the splitting process, DeWitt introduced the term "world" to describe a single branch of that tree, which is a consistent history. All observations or measurements within any branch are consistent within themselves.

Since many observation-like events have happened and are constantly happening, Everett's model implies that there are an enormous and growing number of simultaneously existing states or "worlds".

Properties

MWI removes the observer-dependent role in the quantum measurement process by replacing wave function collapse with the established mechanism of quantum decoherence. As the observer's role lies at the heart of all "quantum paradoxes" such as the EPR paradox and von Neumann's "boundary problem", this provides a clearer and easier approach to their resolution.

Since the Copenhagen interpretation requires the existence of a classical domain beyond the one described by quantum mechanics, it has been criticized as inadequate for the study of cosmology. While there is no evidence that Everett was inspired by issues of cosmology, he developed his theory with the explicit goal of allowing quantum mechanics to be applied to the universe as a whole, hoping to stimulate the discovery of new phenomena. This hope has been realized in the later development of quantum cosmology.

MWI is a realist, deterministic and local theory. It achieves this by removing wave function collapse, which is indeterministic and nonlocal, from the deterministic and local equations of quantum theory.

MWI (like other, broader multiverse theories) provides a context for the anthropic principle, which may provide an explanation for the fine-tuned universe.

MWI depends crucially on the linearity of quantum mechanics, which underpins the superposition principle. If the final theory of everything is non-linear with respect to wavefunctions, then many-worlds is invalid. All quantum field theories are linear and compatible with the MWI, a point Everett emphasized as a motivation for the MWI. While quantum gravity or string theory may be non-linear in this respect, there is as yet no evidence of this.

Weingarten and Taylor & McCulloch have made separate proposals for how to define wavefunction branches in terms of quantum circuit complexity.

Alternative to wavefunction collapse

As with the other interpretations of quantum mechanics, the many-worlds interpretation is motivated by behavior that can be illustrated by the double-slit experiment. When particles of light (or anything else) pass through the double slit, a calculation assuming wavelike behavior of light can be used to identify where the particles are likely to be observed. Yet when the particles are observed in this experiment, they appear as particles (i.e., at definite places) and not as non-localized waves.

Some versions of the Copenhagen interpretation of quantum mechanics proposed a process of "collapse" in which an indeterminate quantum system would probabilistically collapse onto, or select, just one determinate outcome to "explain" this phenomenon of observation. Wave function collapse was widely regarded as artificial and ad hoc, so an alternative interpretation in which the behavior of measurement could be understood from more fundamental physical principles was considered desirable.

Everett's PhD work provided such an interpretation. He argued that for a composite system—such as a subject (the "observer" or measuring apparatus) observing an object (the "observed" system, such as a particle)—the claim that either the observer or the observed has a well-defined state is meaningless; in modern parlance, the observer and the observed have become entangled: we can only specify the state of one relative to the other, i.e., the state of the observer and the observed are correlated after the observation is made. This led Everett to derive from the unitary, deterministic dynamics alone (i.e., without assuming wave function collapse) the notion of a relativity of states.

Everett noticed that the unitary, deterministic dynamics alone entailed that after an observation is made each element of the quantum superposition of the combined subject–object wave function contains two "relative states": a "collapsed" object state and an associated observer who has observed the same collapsed outcome; what the observer sees and the state of the object have become correlated by the act of measurement or observation. The subsequent evolution of each pair of relative subject–object states proceeds with complete indifference as to the presence or absence of the other elements, as if wave function collapse has occurred, which has the consequence that later observations are always consistent with the earlier observations. Thus the appearance of the object's wave function's collapse has emerged from the unitary, deterministic theory itself. (This answered Einstein's early criticism of quantum theory: that the theory should define what is observed, not for the observables to define the theory.) Since the wave function appears to have collapsed then, Everett reasoned, there was no need to actually assume that it had collapsed. And so, invoking Occam's razor, he removed the postulate of wave function collapse from the theory.
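For a discrete observable, the post-observation state described here can be sketched as a sum of relative-state pairs (a schematic textbook form, not Everett's own notation):

```latex
% Unitary measurement interaction: each outcome s_i pairs a "collapsed"
% object state with an observer state O_i that has recorded it.
\lvert \text{ready} \rangle_{O} \otimes \sum_i c_i \lvert s_i \rangle
\;\xrightarrow{\;\hat{U}\;}\;
\sum_i c_i \, \lvert s_i \rangle \otimes \lvert O_i \rangle
```

Because the dynamics is linear, each term on the right evolves independently of the others, which is why later observations within a branch remain consistent with earlier ones.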

Testability

In 1985, David Deutsch proposed a variant of the Wigner's friend thought experiment as a test of many-worlds versus the Copenhagen interpretation. It consists of an experimenter (Wigner's friend) making a measurement on a quantum system in an isolated laboratory, and another experimenter (Wigner) who would make a measurement on the first one. According to the many-worlds theory, the first experimenter would end up in a macroscopic superposition of seeing one result of the measurement in one branch, and another result in another branch. The second experimenter could then interfere these two branches in order to test whether it is in fact in a macroscopic superposition or has collapsed into a single branch, as predicted by the Copenhagen interpretation. Since then Lockwood, Vaidman, and others have made similar proposals, which require placing macroscopic objects in a coherent superposition and interfering them, a task currently beyond experimental capability.

Probability and the Born rule

Since the many-worlds interpretation's inception, physicists have been puzzled about the role of probability in it. As put by Wallace, there are two facets to the question: the incoherence problem, which asks why we should assign probabilities at all to outcomes that are certain to occur in some worlds, and the quantitative problem, which asks why the probabilities should be given by the Born rule.

Everett tried to answer these questions in the paper that introduced many-worlds. To address the incoherence problem, he argued that an observer who makes a sequence of measurements on a quantum system will in general have an apparently random sequence of results in their memory, which justifies the use of probabilities to describe the measurement process. To address the quantitative problem, Everett proposed a derivation of the Born rule based on the properties that a measure on the branches of the wave function should have. His derivation has been criticized as relying on unmotivated assumptions. Since then several other derivations of the Born rule in the many-worlds framework have been proposed. There is no consensus on whether this has been successful.
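For reference, the Born rule at issue states, in its standard textbook form:

```latex
% Probability of outcome i for a state expanded in the measurement basis,
% |\psi\rangle = \sum_i c_i |i\rangle with \sum_i |c_i|^2 = 1:
P(i) \;=\; \lvert c_i \rvert^{2} \;=\; \lvert \langle i \mid \psi \rangle \rvert^{2}
```

The quantitative problem is to explain why the branch weights \(\lvert c_i \rvert^2\) should function as probabilities when every outcome occurs on some branch.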

Frequentism

DeWitt and Graham and Farhi et al., among others, have proposed derivations of the Born rule based on a frequentist interpretation of probability. They try to show that in the limit of infinitely many measurements, no worlds would have relative frequencies that did not match the probabilities given by the Born rule, but these derivations have been shown to be mathematically incorrect.

Decision theory

A decision-theoretic derivation of the Born rule was produced by David Deutsch (1999) and refined by Wallace and Saunders. They consider an agent who takes part in a quantum gamble: the agent makes a measurement on a quantum system, branches as a consequence, and each of the agent's future selves receives a reward that depends on the measurement result. The agent uses decision theory to evaluate the price they would pay to take part in such a gamble, and concludes that the price is given by the utility of the rewards weighted according to the Born rule. Some reviews have been positive, although these arguments remain highly controversial; some theoretical physicists have taken them as supporting the case for parallel universes. For example, a New Scientist story on a 2007 conference about Everettian interpretations quoted physicist Andy Albrecht as saying, "This work will go down as one of the most important developments in the history of science." In contrast, the philosopher Huw Price, also attending the conference, found the Deutsch–Wallace–Saunders approach fundamentally flawed.

Symmetries and invariance

In 2005, Zurek produced a derivation of the Born rule based on the symmetries of entangled states; Schlosshauer and Fine argue that Zurek's derivation is not rigorous, as it does not define what probability is and has several unstated assumptions about how it should behave.

In 2016, Charles Sebens and Sean M. Carroll, building on work by Lev Vaidman, proposed a similar approach based on self-locating uncertainty. In this approach, decoherence creates multiple identical copies of observers, who can assign credences to being on different branches using the Born rule. The Sebens–Carroll approach has been criticized by Adrian Kent, and Vaidman does not find it satisfactory.

Branch counting

In 2021, Simon Saunders produced a branch counting derivation of the Born rule. The crucial feature of this approach is to define the branches so that they all have the same magnitude or 2-norm. The ratios of the numbers of branches thus defined give the probabilities of the various outcomes of a measurement, in accordance with the Born rule.
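As a toy illustration of the idea (a sketch of my own; the two-outcome example and variable names are assumptions, not Saunders's construction): an outcome with a larger squared amplitude is decomposed into more sub-branches of equal magnitude, and counting those branches reproduces the Born-rule ratios.

```python
import math

# Toy state: |psi> = sqrt(1/3)|up> + sqrt(2/3)|down>,
# so the Born rule assigns probabilities 1/3 and 2/3.
amplitudes = {"up": math.sqrt(1 / 3), "down": math.sqrt(2 / 3)}

# Re-express each outcome as a bundle of fine-grained branches that all
# share the same 2-norm (here 1/sqrt(3)); the branch count per outcome
# is |amplitude|^2 divided by the common squared weight.
common_weight = math.sqrt(1 / 3)
branch_counts = {k: round((a / common_weight) ** 2) for k, a in amplitudes.items()}

total = sum(branch_counts.values())
probabilities = {k: n / total for k, n in branch_counts.items()}

print(branch_counts)   # {'up': 1, 'down': 2}
print(probabilities)   # up ~ 0.333, down ~ 0.667 — the Born-rule values
```

The design point is that once all branches are defined to have equal magnitude, "probability as branch ratio" and "probability as squared amplitude" coincide by construction.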

Preferred basis problem

As originally formulated by Everett and DeWitt, the many-worlds interpretation had a privileged role for measurements: they determined which basis of a quantum system would give rise to the eponymous worlds. Without this the theory was ambiguous, as a quantum state can equally well be described (e.g.) as having a well-defined position or as being a superposition of two delocalized states. The assumption is that the preferred basis to use is the one which assigns a unique measurement outcome to each world. This special role for measurements is problematic for the theory, as it contradicts Everett and DeWitt's goal of having a reductionist theory and undermines their criticism of the ill-defined measurement postulate of the Copenhagen interpretation. This is known today as the preferred basis problem.

The preferred basis problem has been solved, according to Saunders and Wallace, among others, by incorporating decoherence into the many-worlds theory. In this approach, the preferred basis does not have to be postulated, but rather is identified as the basis stable under environmental decoherence. In this way measurements no longer play a special role; rather, any interaction that causes decoherence causes the world to split. Since decoherence is never complete, there will always remain some infinitesimal overlap between two worlds, making it arbitrary whether a pair of worlds has split or not. Wallace argues that this is not problematic: it only shows that worlds are not a part of the fundamental ontology, but rather of the emergent ontology, where these approximate, effective descriptions are routine in the physical sciences. Since in this approach the worlds are derived, it follows that they must be present in any other interpretation of quantum mechanics that does not have a collapse mechanism, such as Bohmian mechanics.

This approach to deriving the preferred basis has been criticized as creating circularity with derivations of probability in the many-worlds interpretation, as decoherence theory depends on probability and probability depends on the ontology derived from decoherence. Wallace contends that decoherence theory depends not on probability but only on the notion that one is allowed to do approximations in physics.

History

MWI originated in Everett's Princeton University PhD thesis "The Theory of the Universal Wave Function", developed under his thesis advisor John Archibald Wheeler, a shorter summary of which was published in 1957 under the title "Relative State Formulation of Quantum Mechanics" (Wheeler contributed the title "relative state"; Everett originally called his approach the "Correlation Interpretation", where "correlation" refers to quantum entanglement). The phrase "many-worlds" is due to Bryce DeWitt, who was responsible for the wider popularization of Everett's theory, which had been largely ignored for a decade after publication in 1957.

Everett's proposal was not without precedent. In 1952, Erwin Schrödinger gave a lecture in Dublin in which at one point he jocularly warned his audience that what he was about to say might "seem lunatic". He went on to assert that while the Schrödinger equation seemed to be describing several different histories, they were "not alternatives but all really happen simultaneously". According to David Deutsch, this is the earliest known reference to many-worlds; Jeffrey A. Barrett describes it as indicating the similarity of "general views" between Everett and Schrödinger. Schrödinger's writings from the period also contain elements resembling the modal interpretation originated by Bas van Fraassen. Because Schrödinger subscribed to a kind of post-Machian neutral monism, in which "matter" and "mind" are only different aspects or arrangements of the same common elements, treating the wave function as physical and treating it as information became interchangeable.

Leon Cooper and Deborah Van Vechten developed a very similar approach before reading Everett's work. Zeh also came to the same conclusions as Everett before reading his work, then built a new theory of quantum decoherence based on these ideas.

According to people who knew him, Everett believed in the literal reality of the other quantum worlds. His son and wife reported that he "never wavered in his belief over his many-worlds theory". In their detailed review of Everett's work, Osnaghi, Freitas, and Freire Jr. note that Everett consistently used quotes around "real" to indicate a meaning within scientific practice.

Reception

MWI's initial reception was overwhelmingly negative, in the sense that it was ignored, with the notable exception of DeWitt. Wheeler made considerable efforts to formulate the theory in a way that would be palatable to Bohr, visited Copenhagen in 1956 to discuss it with him, and convinced Everett to visit as well, which happened in 1959. Nevertheless, Bohr and his collaborators completely rejected the theory. Everett had already left academia in 1957, never to return, and in 1980, Wheeler disavowed the theory.

Support

One of the strongest longtime advocates of MWI is David Deutsch. According to him, the single photon interference pattern observed in the double slit experiment can be explained by interference of photons in multiple universes. Viewed this way, the single photon interference experiment is indistinguishable from the multiple photon interference experiment. In a more practical vein, in one of the earliest papers on quantum computing, Deutsch suggested that parallelism that results from MWI could lead to "a method by which certain probabilistic tasks can be performed faster by a universal quantum computer than by any classical restriction of it". He also proposed that MWI will be testable (at least against "naive" Copenhagenism) when reversible computers become conscious via the reversible observation of spin.

Equivocal

Philosophers of science James Ladyman and Don Ross say that MWI could be true, but do not embrace it. They note that no quantum theory is yet empirically adequate for describing all of reality, given its lack of unification with general relativity, and so do not see a reason to regard any interpretation of quantum mechanics as the final word in metaphysics. They also suggest that the multiple branches may be an artifact of incomplete descriptions and of using quantum mechanics to represent the states of macroscopic objects. They argue that macroscopic objects are significantly different from microscopic objects in not being isolated from the environment, and that using quantum formalism to describe them lacks explanatory and descriptive power and accuracy.

Rejection

Some scientists consider some aspects of MWI to be unfalsifiable and hence unscientific because the multiple parallel universes are non-communicating, in the sense that no information can be passed between them.

Victor J. Stenger remarked that Murray Gell-Mann's published work explicitly rejects the existence of simultaneous parallel universes. Collaborating with James Hartle, Gell-Mann worked toward the development of a more "palatable" post-Everett quantum mechanics. Stenger thought it fair to say that most physicists find MWI too extreme, though it "has merit in finding a place for the observer inside the system being analyzed and doing away with the troublesome notion of wave function collapse".

Roger Penrose argues that the idea is flawed because it is based on an oversimplified version of quantum mechanics that does not account for gravity. In his view, applying conventional quantum mechanics to the universe implies the MWI, but the lack of a successful theory of quantum gravity negates the claimed universality of conventional quantum mechanics. According to Penrose, "the rules must change when gravity is involved". He further asserts that gravity helps anchor reality and "blurry" events have only one allowable outcome: "electrons, atoms, molecules, etc., are so minute that they require almost no amount of energy to maintain their gravity, and therefore their overlapping states. They can stay in that state forever, as described in standard quantum theory". On the other hand, "in the case of large objects, the duplicate states disappear in an instant due to the fact that these objects create a large gravitational field".

Philosopher of science Robert P. Crease says that MWI is "one of the most implausible and unrealistic ideas in the history of science" because it means that everything conceivable happens. Science writer Philip Ball calls MWI's implications fantasies, since "beneath their apparel of scientific equations or symbolic logic, they are acts of imagination, of 'just supposing'".

Theoretical physicist Gerard 't Hooft also dismisses the idea: "I do not believe that we have to live with the many-worlds interpretation. Indeed, it would be a stupendous number of parallel worlds, which are only there because physicists couldn't decide which of them is real."

Asher Peres was an outspoken critic of MWI. A section of his 1993 textbook had the title Everett's interpretation and other bizarre theories. Peres argued that the various many-worlds interpretations merely shift the arbitrariness or vagueness of the collapse postulate to the question of when "worlds" can be regarded as separate, and that no objective criterion for that separation can actually be formulated.

Polls

A poll of 72 "leading quantum cosmologists and other quantum field theorists" conducted before 1991 by L. David Raub showed 58% agreement with "Yes, I think MWI is true".

Max Tegmark reports the result of a "highly unscientific" poll taken at a 1997 quantum mechanics workshop. According to Tegmark, "The many worlds interpretation (MWI) scored second, comfortably ahead of the consistent histories and Bohm interpretations."

In response to Sean M. Carroll's statement "As crazy as it sounds, most working physicists buy into the many-worlds theory", Michael Nielsen counters: "at a quantum computing conference at Cambridge in 1998, a many-worlder surveyed the audience of approximately 200 people ... Many-worlds did just fine, garnering support on a level comparable to, but somewhat below, Copenhagen and decoherence." But Nielsen notes that it seemed most attendees found it to be a waste of time: Peres "got a huge and sustained round of applause…when he got up at the end of the polling and asked 'And who here believes the laws of physics are decided by a democratic vote?'"

A 2005 poll of fewer than 40 students and researchers taken after a course on the Interpretation of Quantum Mechanics at the Institute for Quantum Computing, University of Waterloo, found "Many Worlds (and decoherence)" to be the least favored.

A 2011 poll of 33 participants at an Austrian conference on quantum foundations found 6 endorsed MWI, 8 "Information-based/information-theoretical", and 14 Copenhagen; the authors remark that MWI received a similar percentage of votes as in Tegmark's 1997 poll.

Speculative implications

DeWitt has said that Everett, Wheeler, and Graham "do not in the end exclude any element of the superposition. All the worlds are there, even those in which everything goes wrong and all the statistical laws break down." Tegmark affirmed that absurd or highly unlikely events are rare but inevitable under MWI: "Things inconsistent with the laws of physics will never happen—everything else will ... it's important to keep track of the statistics, since even if everything conceivable happens somewhere, really freak events happen only exponentially rarely." David Deutsch speculates in his book The Beginning of Infinity that some fiction, such as alternate history, could occur somewhere in the multiverse, as long as it is consistent with the laws of physics. According to Ladyman and Ross, many seemingly physically plausible but unrealized possibilities, such as those discussed in other scientific fields, generally have no counterparts in other branches, because they are in fact incompatible with the universal wave function. According to Carroll, human decision-making, contrary to common misconceptions, is best thought of as a classical process, not a quantum one, because it works on the level of neurochemistry rather than fundamental particles. Human decisions do not cause the world to branch into equally realized outcomes; even for subjectively difficult decisions, the "weight" of realized outcomes is almost entirely concentrated in a single branch.

Quantum suicide is a thought experiment in quantum mechanics and the philosophy of physics that can purportedly distinguish between the Copenhagen interpretation of quantum mechanics and the many-worlds interpretation by a variation of the Schrödinger's cat thought experiment, from the cat's point of view. Quantum immortality refers to the subjective experience of surviving quantum suicide. Most experts believe the experiment would not work in the real world, because the world with the surviving experimenter has a lower "measure" than the world before the experiment, making it less likely that the experimenter will experience their survival.

Time in physics

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Time_in_physics
 
Time
Common symbols: t
SI unit: second (s)
Other units: see unit of time
Dimension: T
Foucault's pendulum in the Panthéon of Paris can measure time as well as demonstrate the rotation of Earth.

In physics, time is defined by its measurement: time is what a clock reads. In classical, non-relativistic physics, it is a scalar quantity (often denoted by the symbol t) and, like length, mass, and charge, is usually described as a fundamental quantity. Time can be combined mathematically with other physical quantities to derive other concepts such as motion, kinetic energy and time-dependent fields. Timekeeping is a complex of technological and scientific issues, and part of the foundation of recordkeeping.

Markers of time

Before there were clocks, time was measured by those physical processes which were understandable to each epoch of civilization:

  • the first appearance (see: heliacal rising) of Sirius to mark the flooding of the Nile each year
  • the periodic succession of night and day, seemingly eternally
  • the position on the horizon of the first appearance of the sun at dawn
  • the position of the sun in the sky
  • the marking of the moment of noontime during the day
  • the length of the shadow cast by a gnomon

Eventually, it became possible to characterize the passage of time with instrumentation, using operational definitions. Simultaneously, our conception of time has evolved, as shown below.

Unit of measurement of time

In the International System of Units (SI), the unit of time is the second (symbol: s). It has been defined since 1967 as "the duration of 9192631770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom", and is an SI base unit. This definition is based on the operation of a caesium atomic clock. These clocks became practical for use as primary reference standards after about 1955, and have been in use ever since.
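The definition can be read operationally: an elapsed second is simply a count of 9192631770 caesium periods. A minimal sketch of that reading (the names are illustrative, not from any standard library):

```python
CAESIUM_HZ = 9_192_631_770  # defining frequency of the Cs-133 hyperfine transition

def periods_to_seconds(n_periods):
    """Elapsed time is the counted number of periods divided by the defining frequency."""
    return n_periods / CAESIUM_HZ

print(periods_to_seconds(9_192_631_770))  # 1.0 -- one SI second
```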

State of the art in timekeeping

The UTC timestamp in use worldwide is an atomic time standard. The relative accuracy of such a time standard is currently on the order of 10⁻¹⁵ (corresponding to 1 second in approximately 30 million years). The smallest time step considered theoretically observable is called the Planck time, which is approximately 5.391×10⁻⁴⁴ seconds – many orders of magnitude below the resolution of current time standards.
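The quoted figure is easy to sanity-check: a relative accuracy of 10⁻¹⁵ means roughly one second of drift per 10¹⁵ elapsed seconds.

```python
relative_accuracy = 1e-15
drift_interval_s = 1 / relative_accuracy   # seconds elapsed per second of drift
seconds_per_year = 365.25 * 24 * 3600      # Julian year
years = drift_interval_s / seconds_per_year
print(round(years / 1e6, 1))               # ~31.7 million years
```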

The caesium atomic clock became practical after 1950, when advances in electronics enabled reliable measurement of the microwave frequencies it generates. As further advances occurred, atomic clock research has progressed to ever-higher frequencies, which can provide higher accuracy and higher precision. Clocks based on these techniques have been developed, but are not yet in use as primary reference standards.

Conceptions of time

Andromeda galaxy (M31) is two million light-years away. Thus we are viewing M31's light from two million years ago, a time before humans existed on Earth.

Galileo, Newton, and most people up until the 20th century thought that time was the same for everyone everywhere. This is the basis for timelines, where time is a parameter. The modern understanding of time is based on Einstein's theory of relativity, in which rates of time run differently depending on relative motion, and space and time are merged into spacetime, where we live on a world line rather than a timeline. In this view time is a coordinate. According to the prevailing cosmological model of the Big Bang theory, time itself began as part of the entire Universe about 13.8 billion years ago.

Regularities in nature

In order to measure time, one can record the number of occurrences (events) of some periodic phenomenon. The regular recurrences of the seasons and the motions of the sun, moon and stars were noted and tabulated for millennia, before the laws of physics were formulated. The sun was the arbiter of the flow of time, but time was known only to the hour for millennia; hence the gnomon was used across most of the world, especially Eurasia, and at least as far southward as the jungles of Southeast Asia.

In particular, the astronomical observatories maintained for religious purposes became accurate enough to ascertain the regular motions of the stars, and even some of the planets.

At first, timekeeping was done by hand by priests, and then for commerce, with watchmen to note time as part of their duties. The tabulation of the equinoxes, the sandglass, and the water clock became more and more accurate, and finally reliable. For ships at sea, marine sandglasses were used. These devices allowed sailors to call the hours, and to calculate sailing velocity.

Mechanical clocks

Richard of Wallingford (1292–1336), abbot of St. Albans Abbey, famously built a mechanical clock as an astronomical orrery about 1330.

By the time of Richard of Wallingford, the use of ratchets and gears allowed the towns of Europe to create mechanisms to display the time on their respective town clocks; by the time of the scientific revolution, the clocks became miniaturized enough for families to share a personal clock, or perhaps a pocket watch. At first, only kings could afford them. Pendulum clocks were widely used in the 18th and 19th century. They have largely been replaced in general use by quartz and digital clocks. Atomic clocks can theoretically keep accurate time for millions of years. They are appropriate for standards and scientific use.

Galileo: the flow of time

In 1583, Galileo Galilei (1564–1642) discovered that a pendulum's harmonic motion has a constant period, which he learned by timing, against his pulse, the swaying of a lamp at mass in the cathedral of Pisa.
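Galileo's observation corresponds to the small-angle pendulum law: the period depends on the pendulum's length and local gravity, not on the amplitude of the swing. A minimal sketch (the function name is illustrative):

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

print(round(pendulum_period(1.0), 2))  # ~2.01 s for a 1-metre pendulum
```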

In his Two New Sciences (1638), Galileo used a water clock to measure the time taken for a bronze ball to roll a known distance down an inclined plane; this clock was:

...a large vessel of water placed in an elevated position; to the bottom of this vessel was soldered a pipe of small diameter giving a thin jet of water, which we collected in a small glass during the time of each descent, whether for the whole length of the channel or for a part of its length; the water thus collected was weighed, after each descent, on a very accurate balance; the differences and ratios of these weights gave us the differences and ratios of the times, and this with such accuracy that although the operation was repeated many, many times, there was no appreciable discrepancy in the results.

Galileo's experimental setup to measure the literal flow of time, in order to describe the motion of a ball, preceded Isaac Newton's statement in his Principia, "I do not define time, space, place and motion, as being well known to all."

The Galilean transformations assume that time is the same for all reference frames.
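A Galilean boost can be sketched in a few lines; note that the time coordinate passes through unchanged, which is exactly the assumption that relativity later overturned (the function name is illustrative):

```python
def galilean_boost(x, t, v):
    """Galilean transformation: x' = x - v*t, t' = t (time is absolute)."""
    return x - v * t, t

# An event at x = 10 m, t = 2 s, seen from a frame moving at 3 m/s:
print(galilean_boost(10, 2, 3))  # (4, 2)
```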

Newtonian physics: linear time

In or around 1665, when Isaac Newton (1643–1727) derived the motion of objects falling under gravity, the first clear formulation for mathematical physics of a treatment of time began: linear time, conceived as a universal clock.

Absolute, true, and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration: relative, apparent, and common time, is some sensible and external (whether accurate or unequable) measure of duration by the means of motion, which is commonly used instead of true time; such as an hour, a day, a month, a year.

The water clock mechanism described by Galileo was engineered to provide laminar flow of the water during the experiments, thus providing a constant flow of water for the durations of the experiments, and embodying what Newton called duration.

In this section, the relationships listed below treat time as a parameter which serves as an index to the behavior of the physical system under consideration. Because Newton's fluents treat a linear flow of time (what he called mathematical time), time could be considered to be a linearly varying parameter, an abstraction of the march of the hours on the face of a clock. Calendars and ship's logs could then be mapped to the march of the hours, days, months, years and centuries.

Thermodynamics and the paradox of irreversibility

By 1798, Benjamin Thompson (1753–1814) had discovered that work could be transformed to heat without limit – a precursor of the conservation of energy, or the first law of thermodynamics.

In 1824 Sadi Carnot (1796–1832) scientifically analyzed the steam engine with his Carnot cycle, an abstract engine. Rudolf Clausius (1822–1888) noted a measure of disorder, or entropy, which affects the continually decreasing amount of free energy available to a Carnot engine; this observation became the second law of thermodynamics.
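Carnot's analysis bounds the efficiency of any heat engine by the temperatures of its two reservoirs; a minimal illustration (the reservoir temperatures are made-up round numbers):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work: 1 - T_cold/T_hot (kelvin)."""
    return 1 - t_cold_k / t_hot_k

print(carnot_efficiency(500.0, 300.0))  # 0.4
```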

Thus the continual march of a thermodynamic system, from lesser to greater entropy, at any given temperature, defines an arrow of time. In particular, Stephen Hawking identifies three arrows of time:

  • Psychological arrow of time – our perception of an inexorable flow.
  • Thermodynamic arrow of time – distinguished by the growth of entropy.
  • Cosmological arrow of time – distinguished by the expansion of the universe.

With time, entropy increases in an isolated thermodynamic system. In contrast, Erwin Schrödinger (1887–1961) pointed out that life depends on a "negative entropy flow". Ilya Prigogine (1917–2003) stated that other thermodynamic systems which, like life, are far from equilibrium, can also exhibit stable spatio-temporal structures reminiscent of life. Soon afterward, the Belousov–Zhabotinsky reactions were reported, which demonstrate oscillating colors in a chemical solution. These nonequilibrium thermodynamic branches reach a bifurcation point, which is unstable, and another thermodynamic branch becomes stable in its stead.

Electromagnetism and the speed of light

In 1864, James Clerk Maxwell (1831–1879) presented a combined theory of electricity and magnetism. He combined all the laws then known relating to those two phenomena into four equations. These equations are known as Maxwell's equations for electromagnetism; they allow for solutions in the form of electromagnetic waves, which propagate at a fixed speed, c, regardless of the velocity of the electric charge that generated them.
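The wave speed that falls out of Maxwell's equations is c = 1/√(μ₀ε₀); a quick numeric check with the standard vacuum constants:

```python
import math

mu0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m (classical exact value)
eps0 = 8.854187817e-12    # vacuum permittivity, F/m
c = 1 / math.sqrt(mu0 * eps0)
print(round(c))  # 299792458 -- the speed of light in m/s
```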

The fact that light is predicted to always travel at speed c would be incompatible with Galilean relativity if Maxwell's equations were assumed to hold in any inertial frame (reference frame with constant velocity), because the Galilean transformations predict the speed to decrease (or increase) in the reference frame of an observer traveling parallel (or antiparallel) to the light.

It was expected that there was one absolute reference frame, that of the luminiferous aether, in which Maxwell's equations held unmodified in the known form.

The Michelson–Morley experiment failed to detect any difference in the relative speed of light due to the motion of the Earth relative to the luminiferous aether, suggesting that Maxwell's equations did, in fact, hold in all frames. In the 1890s, Hendrik Lorentz (1853–1928) developed the Lorentz transformations, which left Maxwell's equations unchanged, allowing Michelson and Morley's negative result to be explained. Henri Poincaré (1854–1912) noted the importance of Lorentz's transformation and popularized it. In particular, the railroad car description can be found in Science and Hypothesis,[27] which was published before Einstein's articles of 1905.

The Lorentz transformation predicted space contraction and time dilation; until 1905, the former was interpreted as a physical contraction of objects moving with respect to the aether, due to the modification of the intermolecular forces (of electric nature), while the latter was thought to be just a mathematical stipulation.

Relativistic physics: spacetime

Albert Einstein's 1905 special relativity challenged the notion of absolute time, and could only formulate a definition of synchronization for clocks that mark a linear flow of time:

If at the point A of space there is a clock, an observer at A can determine the time values of events in the immediate proximity of A by finding the positions of the hands which are simultaneous with these events. If there is at the point B of space another clock in all respects resembling the one at A, it is possible for an observer at B to determine the time values of events in the immediate neighbourhood of B.

But it is not possible without further assumption to compare, in respect of time, an event at A with an event at B. We have so far defined only an "A time" and a "B time".

We have not defined a common "time" for A and B, for the latter cannot be defined at all unless we establish by definition that the "time" required by light to travel from A to B equals the "time" it requires to travel from B to A. Let a ray of light start at the "A time" tA from A towards B, let it at the "B time" tB be reflected at B in the direction of A, and arrive again at A at the "A time" t′A.

In accordance with definition the two clocks synchronize if

tB − tA = t′A − tB

We assume that this definition of synchronism is free from contradictions, and possible for any number of points; and that the following relations are universally valid:—

  1. If the clock at B synchronizes with the clock at A, the clock at A synchronizes with the clock at B.
  2. If the clock at A synchronizes with the clock at B and also with the clock at C, the clocks at B and C also synchronize with each other.

— Albert Einstein, "On the Electrodynamics of Moving Bodies"

Einstein showed that if the speed of light is not changing between reference frames, space and time must be such that the moving observer will measure the same speed of light as the stationary one, because velocity is defined by space and time:

v = dr/dt

where r is position and t is time.

Indeed, the Lorentz transformation (for two reference frames in relative motion, whose x axis is directed in the direction of the relative velocity)

x′ = γ(x − vt),  t′ = γ(t − vx/c²),  where γ = 1/√(1 − v²/c²),

can be said to "mix" space and time in a way similar to the way a Euclidean rotation around the z axis mixes x and y coordinates. Consequences of this include relativity of simultaneity.

Event B is simultaneous with A in the green reference frame, but it occurred before in the blue frame, and will occur later in the red frame.

More specifically, the Lorentz transformation is a hyperbolic rotation

x′ = x cosh φ − ct sinh φ,  ct′ = −x sinh φ + ct cosh φ  (with rapidity φ given by tanh φ = v/c),

which is a change of coordinates in the four-dimensional Minkowski space, a dimension of which is ct. (In Euclidean space an ordinary rotation

x′ = x cos θ − y sin θ,  y′ = x sin θ + y cos θ

is the corresponding change of coordinates.) The speed of light c can be seen as just a conversion factor needed because we measure the dimensions of spacetime in different units; since the metre is currently defined in terms of the second, it has the exact value of 299 792 458 m/s. We would need a similar factor in Euclidean space if, for example, we measured width in nautical miles and depth in feet. In physics, sometimes units of measurement in which c = 1 are used to simplify equations.
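The invariance of c can be checked numerically with the Lorentz transformation; the sketch below boosts a light pulse into a frame moving at 0.6c and recovers the same speed:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_boost(x, t, v):
    """x' = gamma*(x - v*t), t' = gamma*(t - v*x/c^2)."""
    gamma = 1 / math.sqrt(1 - (v / C) ** 2)
    return gamma * (x - v * t), gamma * (t - v * x / C**2)

# A light pulse reaches x = c*t in the rest frame; boost it by v = 0.6c:
xp, tp = lorentz_boost(C * 1.0, 1.0, 0.6 * C)
print(round(xp / tp))  # 299792458 -- still travelling at c
```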

Time in a "moving" reference frame is shown to run more slowly than in a "stationary" one by the following relation (which can be derived from the Lorentz transformation by putting ∆x′ = 0, ∆τ = ∆t′):

∆t = ∆τ / √(1 − v²/c²)

where:

  • ∆τ is the time between two events as measured in the moving reference frame in which they occur at the same place (e.g. two ticks on a moving clock); it is called the proper time between the two events;
  • ∆t is the time between these same two events, but as measured in the stationary reference frame;
  • v is the speed of the moving reference frame relative to the stationary one;
  • c is the speed of light.

Moving objects therefore are said to show a slower passage of time. This is known as time dilation.
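A standard illustration of this relation is the muon: with a rest-frame lifetime of about 2.2 μs, a fast-moving muon survives noticeably longer in the lab frame. The numbers below are rounded textbook values:

```python
import math

def dilated_time(proper_time_s, v_over_c):
    """Delta t = Delta tau / sqrt(1 - v^2/c^2)."""
    return proper_time_s / math.sqrt(1 - v_over_c ** 2)

muon_lifetime = 2.2e-6  # seconds, in the muon's rest frame
lab_lifetime = dilated_time(muon_lifetime, 0.994)
print(round(lab_lifetime * 1e6, 1))  # ~20 microseconds in the lab frame
```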

These transformations are only valid for two frames at constant relative velocity. Naively applying them to other situations gives rise to such paradoxes as the twin paradox.

That paradox can be resolved using, for instance, Einstein's general theory of relativity, which uses Riemannian geometry, the geometry of accelerated, noninertial reference frames. Employing the metric tensor which describes Minkowski space:

ds² = −(c dt)² + dx² + dy² + dz²

Einstein developed a geometric solution to Lorentz's transformation that preserves Maxwell's equations. His field equations give an exact relationship between the measurements of space and time in a given region of spacetime and the energy density of that region.

Einstein's equations predict that time should be altered by the presence of gravitational fields (see the Schwarzschild metric):

∆τ = ∆t √(1 − 2GM/(r c²))

where:

  • √(1 − 2GM/(r c²)) is the gravitational time dilation of an object at a distance of r.
  • ∆t is the change in coordinate time, or the interval of coordinate time.
  • G is the gravitational constant.
  • M is the mass generating the field.
  • ∆τ is the change in proper time, or the interval of proper time.

Or one could use the following simpler approximation:

∆τ ≈ ∆t (1 − GM/(r c²))

That is, the stronger the gravitational field (and, thus, the larger the acceleration), the more slowly time runs. The predictions of time dilation are confirmed by particle acceleration experiments and cosmic ray evidence, where moving particles decay more slowly than their less energetic counterparts. Gravitational time dilation gives rise to the phenomenon of gravitational redshift and Shapiro signal travel time delays near massive objects such as the sun. The Global Positioning System must also adjust signals to account for this effect.
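The GPS correction mentioned above is dominated by exactly this effect; a rough estimate from the Schwarzschild rate factor √(1 − 2GM/(rc²)), using rounded Earth parameters and ignoring the satellites' velocity effect:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # Earth mass, kg
C = 299_792_458.0  # speed of light, m/s

def clock_rate(r_m):
    """Rate factor sqrt(1 - 2GM/(r c^2)) at radius r from Earth's center."""
    return math.sqrt(1 - 2 * G * M / (r_m * C**2))

# Daily gain of a clock at GPS orbital radius (~26,600 km) over one at the
# surface (~6,371 km), from gravity alone, in microseconds:
daily_us = (clock_rate(26_600e3) - clock_rate(6_371e3)) * 86_400 * 1e6
print(round(daily_us, 1))  # roughly +46 microseconds per day
```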

According to Einstein's general theory of relativity, a freely moving particle traces a history in spacetime that maximises its proper time. This phenomenon is also referred to as the principle of maximal aging, and was described by Taylor and Wheeler as:

"Principle of Extremal Aging: The path a free object takes between two events in spacetime is the path for which the time lapse between these events, recorded on the object's wristwatch, is an extremum."

Einstein's theory was motivated by the assumption that every point in the universe can be treated as a 'center', and that correspondingly, physics must act the same in all reference frames. His simple and elegant theory shows that time is relative to an inertial frame. In an inertial frame, Newton's first law holds; it has its own local geometry, and therefore its own measurements of space and time; there is no 'universal clock'. An act of synchronization must be performed between two systems, at the least.

Time in quantum mechanics

There is a time parameter in the equations of quantum mechanics. The Schrödinger equation is

iħ ∂/∂t |ψ(t)⟩ = H |ψ(t)⟩

One solution can be

|ψ(t)⟩ = U(t) |ψ(0)⟩,  with U(t) = e^(−iHt/ħ),

where U(t) is called the time evolution operator, and H is the Hamiltonian.

But the Schrödinger picture shown above is equivalent to the Heisenberg picture, which enjoys a similarity to the Poisson brackets of classical mechanics. The Poisson brackets are superseded by a nonzero commutator, say [H, A] for observable A, and Hamiltonian H:

dA/dt = (i/ħ)[H, A] + ∂A/∂t

This equation denotes an uncertainty relation in quantum physics. For example, with time (the observable A), the energy E (from the Hamiltonian H) gives:

∆E ∆t ≥ ħ/2

where

  • ∆E is the uncertainty in energy
  • ∆t is the uncertainty in time
  • ħ is the reduced Planck constant

The more precisely one measures the duration of a sequence of events, the less precisely one can measure the energy associated with that sequence, and vice versa. This equation is different from the standard uncertainty principle, because time is not an operator in quantum mechanics.

Corresponding commutator relations also hold for momentum p and position q, which are conjugate variables of each other, along with a corresponding uncertainty principle in momentum and position, similar to the energy and time relation above.
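The energy–time relation has a concrete use: a state that lives for a time ∆t cannot have an energy sharper than about ħ/(2∆t), which sets the natural linewidth of spectral lines. A sketch with an assumed 1 ns lifetime:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
EV = 1.602176634e-19    # joules per electronvolt

def min_energy_spread(delta_t_s):
    """Lower bound from Delta E * Delta t >= hbar/2."""
    return HBAR / (2 * delta_t_s)

linewidth_ev = min_energy_spread(1e-9) / EV
print(f"{linewidth_ev:.1e} eV")  # ~3.3e-07 eV for a 1 ns lifetime
```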

Quantum mechanics explains the properties of the periodic table of the elements. Starting with Otto Stern's and Walter Gerlach's experiment with molecular beams in a magnetic field, Isidor Rabi (1898–1988) was able to modulate the magnetic resonance of the beam. In 1945 Rabi suggested that this technique be made the basis of a clock using the resonant frequency of an atomic beam. In 2021, Jun Ye of JILA in Boulder, Colorado observed gravitational time dilation as a difference in the tick rate of an optical lattice clock between the top and the bottom of a one-millimeter-tall cloud of strontium atoms.

Dynamical systems

One could say that time is a parameterization of a dynamical system that allows the geometry of the system to be manifested and operated on. It has been asserted that time is an implicit consequence of chaos (i.e. nonlinearity/irreversibility): the characteristic time, or rate of information entropy production, of a system. Mandelbrot introduces intrinsic time in his book Multifractals and 1/f noise.

Time crystals

Khemani, Moessner, and Sondhi define a time crystal as a "stable, conservative, macroscopic clock".

Signalling

Signalling is one application of the electromagnetic waves described above. In general, a signal is part of communication between parties and places. One example might be a yellow ribbon tied to a tree, or the ringing of a church bell. A signal can be part of a conversation, which involves a protocol. Another signal might be the position of the hour hand on a town clock or a railway station. An interested party might wish to view that clock, to learn the time. See: Time ball, an early form of Time signal.

Evolution of a world line of an accelerated massive particle. This world line is restricted to the timelike top and bottom sections of this spacetime figure; this world line cannot cross the top (future) or the bottom (past) light cone. The left and right sections (which are outside the light cones) are spacelike.

We as observers can still signal different parties and places as long as we live within their past light cone. But we cannot receive signals from those parties and places outside our past light cone.
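The light-cone constraint on signalling reduces to a simple inequality: an event can reach us only if it happened earlier and light has had time to cover the separation. A sketch (the function name is illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def can_receive_signal(dt_s, dx_m):
    """An event lies in our past light cone if it occurred earlier (dt > 0)
    and light could cover the spatial separation in that time."""
    return dt_s > 0 and C * dt_s >= dx_m

print(can_receive_signal(1.0, 200_000_000.0))  # True: 2e8 m < one light-second
print(can_receive_signal(0.5, 200_000_000.0))  # False: outside our past light cone
```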

Along with the formulation of the equations for the electromagnetic wave, the field of telecommunication could be founded.

In 19th century telegraphy, electrical circuits, some spanning continents and oceans, could transmit codes: simple dots, dashes and spaces. From this, a series of technical issues have emerged; see Category:Synchronization. But it is safe to say that our signalling systems can be only approximately synchronized, a plesiochronous condition, from which jitter must be eliminated.

That said, systems can be synchronized (at an engineering approximation), using technologies like GPS. The GPS satellites must account for the effects of gravitation and other relativistic factors in their circuitry. See: Self-clocking signal.

Technology for timekeeping standards

The primary time standard in the U.S. is currently NIST-F1, a laser-cooled Cs fountain, the latest in a series of time and frequency standards, from the ammonia-based atomic clock (1949) to the caesium-based NBS-1 (1952) to NIST-7 (1993). The respective clock uncertainty declined from 10,000 nanoseconds per day to 0.5 nanoseconds per day in 5 decades. In 2001 the clock uncertainty for NIST-F1 was 0.1 nanoseconds/day. Development of increasingly accurate frequency standards is underway.

In this time and frequency standard, a population of caesium atoms is laser-cooled to temperatures of one microkelvin. The atoms collect in a ball shaped by six lasers, two for each spatial dimension, vertical (up/down), horizontal (left/right), and back/forth. The vertical lasers push the caesium ball through a microwave cavity. As the ball is cooled, the caesium population cools to its ground state and emits light at its natural frequency, stated in the definition of second above. Eleven physical effects are accounted for in the emissions from the caesium population, which are then controlled for in the NIST-F1 clock. These results are reported to BIPM.

Additionally, a reference hydrogen maser is also reported to BIPM as a frequency standard for TAI (international atomic time).

The measurement of time is overseen by BIPM (Bureau International des Poids et Mesures), located in Sèvres, France, which ensures uniformity of measurements and their traceability to the International System of Units (SI) worldwide. BIPM operates under authority of the Metre Convention, a diplomatic treaty between fifty-one nations, the Member States of the Convention, through a series of Consultative Committees, whose members are the respective national metrology laboratories.

Time in cosmology

The equations of general relativity predict a non-static universe. However, Einstein accepted only a static universe, and modified the Einstein field equation to reflect this by adding the cosmological constant, which he later described as his "biggest blunder". But in 1927, Georges Lemaître (1894–1966) argued, on the basis of general relativity, that the universe originated in a primordial explosion. At the fifth Solvay conference, that year, Einstein brushed him off with "Vos calculs sont corrects, mais votre physique est abominable." (“Your math is correct, but your physics is abominable”). In 1929, Edwin Hubble (1889–1953) announced his discovery of the expanding universe. The current generally accepted cosmological model, the Lambda-CDM model, has a positive cosmological constant and thus not only an expanding universe but an accelerating expanding universe.
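Hubble's announcement rested on the linear relation now called Hubble's law, v = H₀d; a minimal illustration (H₀ = 70 km/s/Mpc is an assumed round value, and the law holds only for modest, non-relativistic distances):

```python
H0 = 70.0  # Hubble constant, km/s per megaparsec (assumed round value)

def recession_velocity_km_s(distance_mpc):
    """Hubble's law: v = H0 * d."""
    return H0 * distance_mpc

print(recession_velocity_km_s(100))  # 7000.0 km/s for a galaxy 100 Mpc away
```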

If the universe were expanding, then it must have been much smaller and therefore hotter and denser in the past. George Gamow (1904–1968) hypothesized that the abundance of the elements in the Periodic Table of the Elements, might be accounted for by nuclear reactions in a hot dense universe. He was disputed by Fred Hoyle (1915–2001), who invented the term 'Big Bang' to disparage it. Fermi and others noted that this process would have stopped after only the light elements were created, and thus did not account for the abundance of heavier elements.

WMAP fluctuations of the cosmic microwave background radiation

Gamow's prediction was a 5–10-kelvin black-body radiation temperature for the universe, after it cooled during the expansion. This was corroborated by Penzias and Wilson in 1965. Subsequent experiments arrived at a temperature of 2.7 kelvin, corresponding to an age of the universe of 13.8 billion years after the Big Bang.

This dramatic result has raised issues: what happened between the singularity of the Big Bang and the Planck time, which, after all, is the smallest observable time, and when might time have separated out from the spacetime foam? There are only hints based on broken symmetries (see Spontaneous symmetry breaking, Timeline of the Big Bang, and the articles in Category:Physical cosmology).

General relativity gave us our modern notion of the expanding universe that started in the Big Bang. Using relativity and quantum theory we have been able to roughly reconstruct the history of the universe. In our epoch, during which electromagnetic waves can propagate without being disturbed by conductors or charges, we can see the stars, at great distances from us, in the night sky. (Before this epoch, there was a time, before the universe cooled enough for electrons and nuclei to combine into atoms about 377,000 years after the Big Bang, during which starlight would not have been visible over large distances.)

Reprise

Ilya Prigogine's reprise is "Time precedes existence". In contrast to the views of Newton, of Einstein, and of quantum physics, which offer a symmetric view of time (as discussed above), Prigogine points out that statistical and thermodynamic physics can explain irreversible phenomena, as well as the arrow of time and the Big Bang.

Epigenetics of anxiety and stress–related disorders

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Epigenetics_of_anxiety_and_st...