
Tuesday, March 11, 2025

De Broglie–Bohm theory

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/De_Broglie%E2%80%93Bohm_theory

The de Broglie–Bohm theory is an interpretation of quantum mechanics which postulates that, in addition to the wavefunction, an actual configuration of particles exists, even when unobserved. The evolution over time of the configuration of all particles is defined by a guiding equation. The evolution of the wave function over time is given by the Schrödinger equation. The theory is named after Louis de Broglie (1892–1987) and David Bohm (1917–1992).

The theory is deterministic and explicitly nonlocal: the velocity of any one particle depends on the value of the guiding equation, which depends on the configuration of all the particles under consideration.

Measurements are a particular case of quantum processes described by the theory, for which it yields the same quantum predictions as other interpretations of quantum mechanics. The theory does not have a "measurement problem", because the particles have a definite configuration at all times. The Born rule in de Broglie–Bohm theory is not a postulate. Rather, in this theory, the link between the probability density and the wave function has the status of a theorem, a result of a separate postulate, the "quantum equilibrium hypothesis", which is additional to the basic principles governing the wave function. There are several equivalent mathematical formulations of the theory.

Overview

De Broglie–Bohm theory is based on the following postulates:

  • There is a configuration $q$ of the universe, described by coordinates $q^k$, which is an element of the configuration space $Q$. The configuration space is different for different versions of pilot-wave theory. For example, this may be the space of positions $\mathbf{Q}_k$ of $N$ particles, or, in case of field theory, the space of field configurations $\phi(x)$. The configuration evolves (for spin = 0) according to the guiding equation $$\frac{dq}{dt}(t) = \frac{j}{|\psi|^2}\big(q(t), t\big), \qquad j = \frac{\hbar}{2mi}\left(\psi^*\nabla\psi - \psi\nabla\psi^*\right) = \frac{1}{m}\operatorname{Re}\big(\psi^*\,\hat{p}\,\psi\big),$$ where $j$ is the probability current or probability flux, and $\hat{p} = -i\hbar\nabla$ is the momentum operator. Here, $\psi(q,t)$ is the standard complex-valued wavefunction from quantum theory, which evolves according to Schrödinger's equation $$i\hbar\frac{\partial\psi}{\partial t}(q,t) = -\frac{\hbar^2}{2m}\nabla^2\psi(q,t) + V(q)\,\psi(q,t).$$ This completes the specification of the theory for any quantum theory with Hamilton operator of type $H = -\frac{\hbar^2}{2m}\nabla^2 + V(q)$.
  • The configuration is distributed according to $|\psi(q,t)|^2$ at some moment of time $t$, and this consequently holds for all times. Such a state is named quantum equilibrium. With quantum equilibrium, this theory agrees with the results of standard quantum mechanics.
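The equivariance claim in the second postulate can be checked numerically. The sketch below assumes $\hbar = m = 1$ and a free Gaussian packet, for which the guiding equation integrates in closed form to $x(t) = x_0\, s(t)/s_0$ (a standard textbook result); all numerical values are illustrative choices, not from the text.

```python
import numpy as np

# Equivariance sketch (hbar = m = 1): for a free Gaussian packet
# psi(x,0) ~ exp(-x^2 / (4 s0^2)), |psi|^2 is a normal density with
# standard deviation s0, the spreading width is
#   s(t) = s0 * sqrt(1 + (t / (2 s0^2))^2),
# and the Bohmian trajectories are x(t) = x0 * s(t) / s0.

def width(t, s0=1.0):
    return s0 * np.sqrt(1.0 + (t / (2.0 * s0**2)) ** 2)

def trajectory(x0, t, s0=1.0):
    # Each particle rides the dilation of the packet.
    return x0 * width(t, s0) / s0

rng = np.random.default_rng(0)
s0 = 1.0
x0 = rng.normal(0.0, s0, size=100_000)  # quantum-equilibrium sample of |psi(., 0)|^2

t = 3.0
xt = trajectory(x0, t, s0)

# If the ensemble starts |psi|^2-distributed, it stays |psi|^2-distributed:
# the evolved sample should be Gaussian with standard deviation s(t).
print(xt.std(), width(t, s0))
```

The sample standard deviation matches $s(3) \approx 1.803$, illustrating that the guiding flow carries $|\psi(\cdot,0)|^2$ into $|\psi(\cdot,t)|^2$.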

Even though this latter relation is frequently presented as an axiom of the theory, Bohm presented it as derivable from statistical-mechanical arguments in the original papers of 1952. This argument was further supported by the work of Bohm in 1953 and was substantiated by Vigier and Bohm's paper of 1954, in which they introduced stochastic fluid fluctuations that drive a process of asymptotic relaxation from quantum non-equilibrium to quantum equilibrium ($\rho \to |\psi|^2$).

Double-slit experiment

The Bohmian trajectories for an electron going through the two-slit experiment. A similar pattern was also extrapolated from weak measurements of single photons.

The double-slit experiment is an illustration of wave–particle duality. In it, a beam of particles (such as electrons) travels through a barrier that has two slits. If a detector screen is on the side beyond the barrier, the pattern of detected particles shows interference fringes characteristic of waves arriving at the screen from two sources (the two slits); however, the interference pattern is made up of individual dots corresponding to particles that had arrived on the screen. The system seems to exhibit the behaviour of both waves (interference patterns) and particles (dots on the screen).

If this experiment is modified so that one slit is closed, no interference pattern is observed. Thus, the state of both slits affects the final results. It can also be arranged to have a minimally invasive detector at one of the slits to detect which slit the particle went through. When that is done, the interference pattern disappears.

In de Broglie–Bohm theory, the wavefunction is defined at both slits, but each particle has a well-defined trajectory that passes through exactly one of the slits. The final position of the particle on the detector screen and the slit through which the particle passes is determined by the initial position of the particle. Such initial position is not knowable or controllable by the experimenter, so there is an appearance of randomness in the pattern of detection. In Bohm's 1952 papers he used the wavefunction to construct a quantum potential that, when included in Newton's equations, gave the trajectories of the particles streaming through the two slits. In effect the wavefunction interferes with itself and guides the particles by the quantum potential in such a way that the particles avoid the regions in which the interference is destructive and are attracted to the regions in which the interference is constructive, resulting in the interference pattern on the detector screen.
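The paragraph above can be sketched numerically by reducing the problem to the transverse coordinate $x$ and integrating the guiding equation for a fan of initial positions. The wavefunction below is a superposition of two free Gaussian packets centred on the slits; the slit separation, packet width, and integration time are illustrative assumptions, not values from the text.

```python
import numpy as np

# Bohmian trajectories in a toy two-slit setup (hbar = m = 1), using only
# the transverse coordinate x. The slits sit at x = +a and x = -a.

a, s0 = 2.0, 0.5

def packet(x, t, x0):
    # Freely spreading Gaussian packet, complex width st = s0 (1 + i t / (2 s0^2)).
    st = s0 * (1.0 + 1j * t / (2.0 * s0**2))
    return (2.0 * np.pi * st**2) ** -0.25 * np.exp(-(x - x0) ** 2 / (4.0 * s0 * st))

def psi(x, t):
    return packet(x, t, +a) + packet(x, t, -a)

def velocity(x, t, h=1e-5):
    # Guiding equation: v = Im( (dpsi/dx) / psi ), derivative by central difference.
    dpsi = (psi(x + h, t) - psi(x - h, t)) / (2.0 * h)
    return np.imag(dpsi / psi(x, t))

# Integrate a fan of trajectories with a simple Euler scheme.
x = np.linspace(-2 * a, 2 * a, 41)
dt, steps = 0.001, 1500
for n in range(steps):
    x = x + dt * velocity(x, n * dt)

# Trajectories bunch toward the interference maxima; by symmetry none cross x = 0.
print(x)
```

The symmetry of the wavefunction forces the velocity to vanish at $x = 0$, so trajectories starting above the axis stay above it, reproducing the well-known no-crossing property of Bohmian two-slit trajectories.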

To explain the behavior when the particle is detected to go through one slit, one needs to appreciate the role of the conditional wavefunction and how it results in the collapse of the wavefunction; this is explained below. The basic idea is that the environment registering the detection effectively separates the two wave packets in configuration space.

Theory

Pilot wave

The de Broglie–Bohm theory describes a pilot wave in a configuration space and trajectories of particles as in classical mechanics but defined by non-Newtonian mechanics. At every moment of time there exists not only a wavefunction, but also a well-defined configuration of the whole universe (i.e., the system as defined by the boundary conditions used in solving the Schrödinger equation).

The de Broglie–Bohm theory works on particle positions and trajectories like classical mechanics, but the dynamics are different. In classical mechanics, the accelerations of the particles are imparted directly by forces, which exist in physical three-dimensional space. In de Broglie–Bohm theory, the quantum "field exerts a new kind of 'quantum-mechanical' force". Bohm hypothesized that each particle has a "complex and subtle inner structure" that provides the capacity to react to the information provided by the wavefunction by the quantum potential. Also, unlike in classical mechanics, physical properties (e.g., mass, charge) are spread out over the wavefunction in de Broglie–Bohm theory, not localized at the position of the particle.

The wavefunction itself, and not the particles, determines the dynamical evolution of the system: the particles do not act back onto the wave function. As Bohm and Hiley worded it, "the Schrödinger equation for the quantum field does not have sources, nor does it have any other way by which the field could be directly affected by the condition of the particles [...] the quantum theory can be understood completely in terms of the assumption that the quantum field has no sources or other forms of dependence on the particles". P. Holland considers this lack of reciprocal action of particles and wave function to be one "[a]mong the many nonclassical properties exhibited by this theory". Holland later called this a merely apparent lack of back reaction, due to the incompleteness of the description.

In what follows below, the setup for one particle moving in $\mathbb{R}^3$ is given, followed by the setup for $N$ particles moving in 3 dimensions. In the first instance, configuration space and real space are the same, while in the second, real space is still $\mathbb{R}^3$, but configuration space becomes $\mathbb{R}^{3N}$. While the particle positions themselves are in real space, the velocity field and wavefunction are on configuration space, which is how particles are entangled with each other in this theory.

Extensions to this theory include spin and more complicated configuration spaces.

We use variations of $\mathbf{Q}$ for particle positions, while $\psi$ represents the complex-valued wavefunction on configuration space.

Guiding equation

For a spinless single particle moving in $\mathbb{R}^3$, the particle's velocity is $$\frac{d\mathbf{Q}}{dt}(t) = \frac{\hbar}{m}\operatorname{Im}\left(\frac{\nabla\psi}{\psi}\right)\big(\mathbf{Q}(t), t\big).$$

For many particles, labeled $\mathbf{Q}_k$ for the $k$-th particle, their velocities are $$\frac{d\mathbf{Q}_k}{dt}(t) = \frac{\hbar}{m_k}\operatorname{Im}\left(\frac{\nabla_k\psi}{\psi}\right)\big(\mathbf{Q}_1(t), \ldots, \mathbf{Q}_N(t), t\big).$$

The main fact to notice is that this velocity field depends on the actual positions of all of the particles in the universe. As explained below, in most experimental situations, the influence of all of those particles can be encapsulated into an effective wavefunction for a subsystem of the universe.
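This dependence can be made concrete in a two-particle toy model: with an entangled superposition, the velocity of particle 1 at a fixed point changes with the actual position of particle 2. Everything below (the state, the parameters `k` and `d`, and $\hbar = m = 1$) is an illustrative assumption, not from the text.

```python
import numpy as np

# Nonlocality of the guiding equation: for the entangled toy state
#   Psi(x1, x2) = g(x1) e^{+ik x1} g(x2) + g(x1 - d) e^{-ik x1} g(x2 - d)
# particle 1's velocity at a fixed x1 depends on where particle 2 actually is.

k, d = 1.0, 6.0

def g(x):
    return np.exp(-x**2 / 2.0)

def Psi(x1, x2):
    return (g(x1) * np.exp(1j * k * x1) * g(x2)
            + g(x1 - d) * np.exp(-1j * k * x1) * g(x2 - d))

def v1(x1, x2, h=1e-6):
    # Guiding equation for particle 1: v1 = Im( (dPsi/dx1) / Psi ).
    dPsi = (Psi(x1 + h, x2) - Psi(x1 - h, x2)) / (2.0 * h)
    return np.imag(dPsi / Psi(x1, x2))

# Same x1, different positions of the distant particle 2:
print(v1(3.0, 0.0))   # ~ +1: particle 2 near 0 selects the +k branch
print(v1(3.0, 6.0))   # ~ -1: particle 2 near d selects the -k branch
```

Moving the distant particle flips which branch of the superposition guides particle 1, which is exactly the configuration-space dependence described above.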

Schrödinger's equation

The one-particle Schrödinger equation governs the time evolution of a complex-valued wavefunction on $\mathbb{R}^3$. The equation represents a quantized version of the total energy of a classical system evolving under a real-valued potential function $V$ on $\mathbb{R}^3$: $$i\hbar\frac{\partial\psi}{\partial t}(\mathbf{x}, t) = -\frac{\hbar^2}{2m}\nabla^2\psi(\mathbf{x}, t) + V(\mathbf{x})\,\psi(\mathbf{x}, t).$$

For many particles, the equation is the same except that $\psi$ and $V$ are now on configuration space, $\mathbb{R}^{3N}$: $$i\hbar\frac{\partial\psi}{\partial t} = -\sum_{k=1}^{N}\frac{\hbar^2}{2m_k}\nabla_k^2\psi + V\psi.$$

This is the same wavefunction as in conventional quantum mechanics.

Relation to the Born rule

In Bohm's original papers, he discusses how de Broglie–Bohm theory results in the usual measurement results of quantum mechanics. The main idea is that this is true if the positions of the particles satisfy the statistical distribution given by $|\psi|^2$. And that distribution is guaranteed to hold for all time by the guiding equation if the initial distribution of the particles satisfies $|\psi(\cdot, 0)|^2$.

For a given experiment, one can postulate this as being true and verify it experimentally. But, as argued by Dürr et al., one needs to argue that this distribution for subsystems is typical. The authors argue that $|\psi|^2$, by virtue of its equivariance under the dynamical evolution of the system, is the appropriate measure of typicality for initial conditions of the positions of the particles. The authors then prove that the vast majority of possible initial configurations will give rise to statistics obeying the Born rule (i.e., $\rho = |\psi|^2$) for measurement outcomes. In summary, in a universe governed by the de Broglie–Bohm dynamics, Born rule behavior is typical.

The situation is thus analogous to the situation in classical statistical physics. A low-entropy initial condition will, with overwhelmingly high probability, evolve into a higher-entropy state: behavior consistent with the second law of thermodynamics is typical. There are anomalous initial conditions that would give rise to violations of the second law; however, in the absence of some very detailed evidence supporting the realization of one of those conditions, it would be quite unreasonable to expect anything but the actually observed uniform increase of entropy. Similarly in the de Broglie–Bohm theory, there are anomalous initial conditions that would produce measurement statistics in violation of the Born rule (conflicting with the predictions of standard quantum theory), but the typicality theorem shows that, absent some specific reason to believe one of those special initial conditions was in fact realized, Born rule behavior is what one should expect.

It is in this qualified sense that the Born rule is, for the de Broglie–Bohm theory, a theorem rather than (as in ordinary quantum theory) an additional postulate.

It can also be shown that a distribution of particles which is not distributed according to the Born rule (that is, a distribution "out of quantum equilibrium") and evolving under the de Broglie–Bohm dynamics is overwhelmingly likely to evolve dynamically into a state distributed as $|\psi|^2$.

The conditional wavefunction of a subsystem

In the formulation of the de Broglie–Bohm theory, there is only a wavefunction for the entire universe (which always evolves by the Schrödinger equation). Here, the "universe" is simply the system limited by the same boundary conditions used to solve the Schrödinger equation. However, once the theory is formulated, it is convenient to introduce a notion of wavefunction also for subsystems of the universe. Let us write the wavefunction of the universe as $\Psi(t, q^{\mathrm{I}}, q^{\mathrm{II}})$, where $q^{\mathrm{I}}$ denotes the configuration variables associated to some subsystem (I) of the universe, and $q^{\mathrm{II}}$ denotes the remaining configuration variables. Denote respectively by $Q^{\mathrm{I}}(t)$ and $Q^{\mathrm{II}}(t)$ the actual configuration of subsystem (I) and of the rest of the universe. For simplicity, we consider here only the spinless case. The conditional wavefunction of subsystem (I) is defined by $$\psi^{\mathrm{I}}(t, q^{\mathrm{I}}) = \Psi\big(t, q^{\mathrm{I}}, Q^{\mathrm{II}}(t)\big).$$

It follows immediately from the fact that $Q(t) = \big(Q^{\mathrm{I}}(t), Q^{\mathrm{II}}(t)\big)$ satisfies the guiding equation that the configuration $Q^{\mathrm{I}}(t)$ also satisfies a guiding equation identical to the one presented in the formulation of the theory, with the universal wavefunction $\Psi$ replaced with the conditional wavefunction $\psi^{\mathrm{I}}$. Also, the fact that $Q(t)$ is random with probability density given by the square modulus of $\Psi(t, \cdot)$ implies that the conditional probability density of $Q^{\mathrm{I}}(t)$ given $Q^{\mathrm{II}}(t)$ is given by the square modulus of the (normalized) conditional wavefunction $\psi^{\mathrm{I}}(t, \cdot)$ (in the terminology of Dürr et al. this fact is called the fundamental conditional probability formula).

Unlike the universal wavefunction, the conditional wavefunction of a subsystem does not always evolve by the Schrödinger equation, but in many situations it does. For instance, if the universal wavefunction factors as $$\Psi(t, q^{\mathrm{I}}, q^{\mathrm{II}}) = \psi^{\mathrm{I}}(t, q^{\mathrm{I}})\,\psi^{\mathrm{II}}(t, q^{\mathrm{II}}),$$

then the conditional wavefunction of subsystem (I) is (up to an irrelevant scalar factor) equal to $\psi^{\mathrm{I}}$ (this is what standard quantum theory would regard as the wavefunction of subsystem (I)). If, in addition, the Hamiltonian does not contain an interaction term between subsystems (I) and (II), then $\psi^{\mathrm{I}}$ does satisfy a Schrödinger equation. More generally, assume that the universal wave function $\Psi$ can be written in the form $$\Psi(t, q^{\mathrm{I}}, q^{\mathrm{II}}) = \psi^{\mathrm{I}}(t, q^{\mathrm{I}})\,\psi^{\mathrm{II}}(t, q^{\mathrm{II}}) + \Phi(t, q^{\mathrm{I}}, q^{\mathrm{II}}),$$

where $\Phi$ solves the Schrödinger equation and $\Phi(t, q^{\mathrm{I}}, Q^{\mathrm{II}}(t)) = 0$ for all $t$ and $q^{\mathrm{I}}$. Then, again, the conditional wavefunction of subsystem (I) is (up to an irrelevant scalar factor) equal to $\psi^{\mathrm{I}}$, and if the Hamiltonian does not contain an interaction term between subsystems (I) and (II), then $\psi^{\mathrm{I}}$ satisfies a Schrödinger equation.
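The structure just described can be illustrated numerically. In this sketch the Gaussians, the environment configuration, and the extra term are all illustrative choices; the extra term is built to vanish at the actual environment configuration by construction rather than by solving a Schrödinger equation.

```python
import numpy as np

# Conditional wavefunction sketch. Take a universal wavefunction of the form
#   Psi(x, y) = psiI(x) psiII(y) + Phi(x, y),   with Phi(x, QII) = 0,
# where y is the environment coordinate and QII its actual configuration.
# Conditioning on y = QII then gives psi_cond(x) = Psi(x, QII), which is
# proportional to psiI(x).

def psiI(x):
    return np.exp(-x**2)

def psiII(y):
    return np.exp(-y**2)

def Phi(x, y):
    # Vanishes at y = 1.5 by construction (illustrative, not dynamical).
    return np.exp(-(x - 3.0)**2) * (y - 1.5) * np.exp(-(y - 5.0)**2)

QII = 1.5                                        # actual environment configuration
x = np.linspace(-4.0, 4.0, 201)

psi_cond = psiI(x) * psiII(QII) + Phi(x, QII)    # = Psi(x, QII)
ratio = psi_cond / psiI(x)
print(ratio.min(), ratio.max())  # equal: psi_cond is proportional to psiI
```

The ratio is the constant $\psi^{\mathrm{II}}(Q^{\mathrm{II}})$, which is exactly the "irrelevant scalar factor" mentioned in the text.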

The fact that the conditional wavefunction of a subsystem does not always evolve by the Schrödinger equation is related to the fact that the usual collapse rule of standard quantum theory emerges from the Bohmian formalism when one considers conditional wavefunctions of subsystems.

Extensions

Relativity

Pilot-wave theory is explicitly nonlocal, which is in ostensible conflict with special relativity. Various extensions of "Bohm-like" mechanics exist that attempt to resolve this problem. Bohm himself in 1953 presented an extension of the theory satisfying the Dirac equation for a single particle. However, this was not extensible to the many-particle case because it used an absolute time.

A renewed interest in constructing Lorentz-invariant extensions of Bohmian theory arose in the 1990s; see Bohm and Hiley: The Undivided Universe and references therein. Another approach is given by Dürr et al., who use Bohm–Dirac models and a Lorentz-invariant foliation of space-time.

Thus, Dürr et al. (1999) showed that it is possible to formally restore Lorentz invariance for the Bohm–Dirac theory by introducing additional structure. This approach still requires a foliation of space-time. While this is in conflict with the standard interpretation of relativity, the preferred foliation, if unobservable, does not lead to any empirical conflicts with relativity. In 2013, Dürr et al. suggested that the required foliation could be covariantly determined by the wavefunction.

The relation between nonlocality and preferred foliation can be better understood as follows. In de Broglie–Bohm theory, nonlocality manifests as the fact that the velocity and acceleration of one particle depend on the instantaneous positions of all other particles. On the other hand, in the theory of relativity the concept of instantaneousness does not have an invariant meaning. Thus, to define particle trajectories, one needs an additional rule that defines which space-time points should be considered instantaneous. The simplest way to achieve this is to introduce a preferred foliation of space-time by hand, such that each hypersurface of the foliation defines a hypersurface of equal time.

Initially, it had been considered impossible to set out a description of photon trajectories in the de Broglie–Bohm theory in view of the difficulties of describing bosons relativistically. In 1996, Partha Ghose presented a relativistic quantum-mechanical description of spin-0 and spin-1 bosons starting from the Duffin–Kemmer–Petiau equation, setting out Bohmian trajectories for massive bosons and for massless bosons (and therefore photons). In 2001, Jean-Pierre Vigier emphasized the importance of deriving a well-defined description of light in terms of particle trajectories in the framework of either the Bohmian mechanics or the Nelson stochastic mechanics. The same year, Ghose worked out Bohmian photon trajectories for specific cases. Subsequent weak-measurement experiments yielded trajectories that coincide with the predicted trajectories. The significance of these experimental findings is controversial.

Chris Dewdney and G. Horton have proposed a relativistically covariant, wave-functional formulation of Bohm's quantum field theory and have extended it to a form that allows the inclusion of gravity.

Nikolić has proposed a Lorentz-covariant formulation of the Bohmian interpretation of many-particle wavefunctions. He has developed a generalized relativistic-invariant probabilistic interpretation of quantum theory, in which $|\psi|^2$ is no longer a probability density in space but a probability density in space-time. He uses this generalized probabilistic interpretation to formulate a relativistic-covariant version of de Broglie–Bohm theory without introducing a preferred foliation of space-time. His work also covers the extension of the Bohmian interpretation to a quantization of fields and strings.

Roderick I. Sutherland at the University of Sydney has a Lagrangian formalism for the pilot wave and its beables. It draws on Yakir Aharonov's retrocausal weak measurements to explain many-particle entanglement in a special-relativistic way without the need for configuration space. The basic idea was already published by Olivier Costa de Beauregard in the 1950s and is also used by John Cramer in his transactional interpretation, except that the beables exist between the von Neumann strong projection-operator measurements. Sutherland's Lagrangian includes two-way action-reaction between pilot wave and beables. Therefore, it is a post-quantum non-statistical theory with final boundary conditions that violate the no-signal theorems of quantum theory. Just as special relativity is a limiting case of general relativity when the spacetime curvature vanishes, so, too, is statistical no-entanglement-signaling quantum theory with the Born rule a limiting case of the post-quantum action-reaction Lagrangian when the reaction is set to zero and the final boundary condition is integrated out.

Spin

To incorporate spin, the wavefunction becomes complex-vector-valued. The value space is called spin space; for a spin-1/2 particle, spin space can be taken to be $\mathbb{C}^2$. The guiding equation is modified by taking inner products in spin space to reduce the complex vectors to complex numbers: $$\frac{d\mathbf{Q}_k}{dt}(t) = \frac{\hbar}{m_k}\operatorname{Im}\left(\frac{(\psi, \nabla_k\psi)}{(\psi, \psi)}\right)\big(\mathbf{Q}_1(t), \ldots, \mathbf{Q}_N(t), t\big).$$ The Schrödinger equation is modified by adding a Pauli spin term: $$i\hbar\frac{\partial\psi}{\partial t} = \left(-\sum_{k=1}^{N}\frac{\hbar^2}{2m_k}D_k^2 + V - \sum_{k=1}^{N}\mu_k\,\frac{\hat{\mathbf{S}}_k}{s_k\hbar}\cdot\mathbf{B}(\mathbf{q}_k)\right)\psi,$$

where

  • $m_k$, $e_k$, $\mu_k$ — the mass, charge and magnetic moment of the $k$-th particle
  • $\hat{\mathbf{S}}_k$ — the appropriate spin operator acting in the $k$-th particle's spin space
  • $s_k$ — the spin quantum number of the $k$-th particle ($s_k = 1/2$ for an electron)
  • $\mathbf{A}$ — the vector potential in $\mathbb{R}^3$
  • $\mathbf{B} = \nabla \times \mathbf{A}$ — the magnetic field in $\mathbb{R}^3$
  • $D_k$ — the covariant derivative, involving the vector potential, ascribed to the coordinates of the $k$-th particle (in SI units)
  • $\psi$ — the wavefunction defined on the multidimensional configuration space; e.g. a system consisting of two spin-1/2 particles and one spin-1 particle has a wavefunction of the form $\psi\colon \mathbb{R}^9 \times \mathbb{R} \to \mathbb{C}^2 \otimes \mathbb{C}^2 \otimes \mathbb{C}^3$, where $\otimes$ is the tensor product, so this spin space is 12-dimensional
  • $(\cdot, \cdot)$ — the inner product in spin space $\mathbb{C}^d$: $(\phi, \chi) = \sum_{s} \phi_s^* \chi_s$
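A minimal numerical sketch of the spin-space inner product in the guiding equation, for a single spin-1/2 particle in one dimension with $\hbar = m = 1$; the spinor and its component momenta are illustrative choices, not from the text.

```python
import numpy as np

# Spinor guiding equation in 1D: v = Im( (psi, dpsi/dx) / (psi, psi) ),
# where (.,.) is the inner product in spin space C^2. Toy spinor whose
# two components carry different momenta k1, k2 (illustrative values).

k1, k2 = 2.0, -1.0
a, b = 0.8, 0.6

def psi(x):
    return np.array([a * np.exp(1j * k1 * x), b * np.exp(1j * k2 * x)])

def velocity(x, h=1e-6):
    dpsi = (psi(x + h) - psi(x - h)) / (2.0 * h)  # central difference
    s = psi(x)
    # np.vdot conjugates its first argument, implementing the spin-space
    # inner product (phi, chi) = sum_s phi_s^* chi_s.
    return np.imag(np.vdot(s, dpsi) / np.vdot(s, s))

# Expected: the |amplitude|^2-weighted average of the component momenta,
# (0.8^2 * 2 + 0.6^2 * (-1)) / 1 = 0.92, at any x.
print(velocity(0.3))
```

The inner products reduce the $\mathbb{C}^2$-valued wavefunction to a single real velocity, exactly as described above.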

Stochastic electrodynamics

Stochastic electrodynamics (SED) is an extension of the de Broglie–Bohm interpretation of quantum mechanics, with the electromagnetic zero-point field (ZPF) playing a central role as the guiding pilot wave. Modern approaches to SED, like those proposed by the group around the late Gerhard Grössing, among others, consider wave-like and particle-like quantum effects as well-coordinated emergent systems. These emergent systems are the result of speculated and calculated sub-quantum interactions with the zero-point field.

Quantum field theory

In Dürr et al. the authors describe an extension of de Broglie–Bohm theory for handling creation and annihilation operators, which they refer to as "Bell-type quantum field theories". The basic idea is that configuration space becomes the (disjoint) space of all possible configurations of any number of particles. For part of the time, the system evolves deterministically under the guiding equation with a fixed number of particles. But under a stochastic process, particles may be created and annihilated. The distribution of creation events is dictated by the wavefunction. The wavefunction itself is evolving at all times over the full multi-particle configuration space.

Hrvoje Nikolić introduces a purely deterministic de Broglie–Bohm theory of particle creation and destruction, according to which particle trajectories are continuous, but particle detectors behave as if particles have been created or destroyed even when a true creation or destruction of particles does not take place.

Curved space

To extend de Broglie–Bohm theory to curved space (Riemannian manifolds in mathematical parlance), one simply notes that all of the elements of these equations make sense, such as gradients and Laplacians. Thus, we use equations that have the same form as above. Topological and boundary conditions may apply in supplementing the evolution of Schrödinger's equation.

For a de Broglie–Bohm theory on curved space with spin, the spin space becomes a vector bundle over configuration space, and the potential in Schrödinger's equation becomes a local self-adjoint operator acting on that space. The field equations for the de Broglie–Bohm theory in the relativistic case with spin can also be given for curved space-times with torsion.

In a general spacetime with curvature and torsion, the guiding equation for the four-velocity of an elementary fermion particle is $$u^\mu = \frac{\bar{\psi}\,\gamma^a\psi\; e^\mu{}_a}{\bar{\psi}\psi},$$ where the wave function $\psi$ is a spinor, $\bar{\psi}$ is the corresponding adjoint, $\gamma^a$ are the Dirac matrices, and $e^\mu{}_a$ is a tetrad. If the wave function propagates according to the curved Dirac equation, then the particle moves according to the Mathisson–Papapetrou equations of motion, which are an extension of the geodesic equation. This relativistic wave-particle duality follows from the conservation laws for the spin tensor and energy-momentum tensor, and also from the covariant Heisenberg picture equation of motion.

Exploiting nonlocality

Diagram made by Antony Valentini in a lecture about the de Broglie–Bohm theory. Valentini argues that quantum theory is a special equilibrium case of a wider physics and that it may be possible to observe and exploit quantum non-equilibrium.

De Broglie and Bohm's causal interpretation of quantum mechanics was later extended by Bohm, Vigier, Hiley, Valentini and others to include stochastic properties. Bohm and other physicists, including Valentini, view the Born rule linking $\psi$ to the probability density function $\rho = |\psi|^2$ as representing not a basic law, but a result of a system having reached quantum equilibrium during the course of the time development under the Schrödinger equation. It can be shown that, once an equilibrium has been reached, the system remains in such equilibrium over the course of its further evolution: this follows from the continuity equation associated with the Schrödinger evolution of $\psi$. It is less straightforward to demonstrate whether and how such an equilibrium is reached in the first place.

Antony Valentini has extended de Broglie–Bohm theory to include signal nonlocality that would allow entanglement to be used as a stand-alone communication channel without a secondary classical "key" signal to "unlock" the message encoded in the entanglement. This violates orthodox quantum theory but has the virtue of making the parallel universes of the chaotic inflation theory observable in principle.

Unlike in de Broglie–Bohm theory, in Valentini's theory the wavefunction evolution also depends on the ontological variables. This introduces an instability, a feedback loop that pushes the hidden variables out of "sub-quantal heat death". The resulting theory becomes nonlinear and non-unitary. Valentini argues that the laws of quantum mechanics are emergent and form a "quantum equilibrium" that is analogous to thermal equilibrium in classical dynamics, such that other "quantum non-equilibrium" distributions may in principle be observed and exploited, for which the statistical predictions of quantum theory are violated. It is controversially argued that quantum theory is merely a special case of a much wider nonlinear physics, a physics in which non-local (superluminal) signalling is possible, and in which the uncertainty principle can be violated.

Results

Below are some highlights of the results that arise out of an analysis of de Broglie–Bohm theory. Experimental results agree with all of quantum mechanics' standard predictions insofar as it has them. But while standard quantum mechanics is limited to discussing the results of "measurements", de Broglie–Bohm theory governs the dynamics of a system without the intervention of outside observers (p. 117 in Bell).

The basis for agreement with standard quantum mechanics is that the particles are distributed according to $|\psi|^2$. This is a statement of observer ignorance: the initial positions are represented by a statistical distribution, so deterministic trajectories will still result in a statistical distribution.

Measuring spin and polarization

According to ordinary quantum theory, it is not possible to measure the spin or polarization of a particle directly; instead, the component in one direction is measured; the outcome from a single particle may be 1, meaning that the particle is aligned with the measuring apparatus, or −1, meaning that it is aligned the opposite way. An ensemble of particles prepared by a polarizer to be in state 1 will all be measured as polarized in state 1 by a subsequent apparatus. A polarized ensemble sent through a polarizer set at an angle $\theta$ to the first will result in some values of 1 and some of −1, with a probability that depends on the relative alignment. For a full explanation of this, see the Stern–Gerlach experiment.
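The alignment dependence mentioned above is the standard quantum prediction, which de Broglie–Bohm theory reproduces. A minimal check for spin-1/2, where the measurement-axis parametrization below is a textbook convention rather than anything specific to the text:

```python
import numpy as np

# Outcome statistics for a spin-1/2 ensemble prepared spin-up along z and
# measured along an axis tilted by theta in the x-z plane:
# P(+1) = |<+n|+z>|^2 = cos^2(theta / 2).

def p_plus(theta):
    up = np.array([1.0, 0.0])                      # prepared state |+z>
    # +1 eigenvector of n.sigma for n = (sin theta, 0, cos theta):
    plus_n = np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])
    return abs(np.dot(plus_n, up)) ** 2

print(p_plus(0.0), p_plus(np.pi))  # 1.0 when aligned, ~0.0 when anti-aligned
```

At $\theta = \pi/2$ the two outcomes are equally likely, matching the "depends on the relative alignment" statement.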

In de Broglie–Bohm theory, the results of a spin experiment cannot be analyzed without some knowledge of the experimental setup. It is possible to modify the setup so that the trajectory of the particle is unaffected, but that the particle with one setup registers as spin-up, while in the other setup it registers as spin-down. Thus, for the de Broglie–Bohm theory, the particle's spin is not an intrinsic property of the particle; instead spin is, so to speak, in the wavefunction of the particle in relation to the particular device being used to measure the spin. This is an illustration of what is sometimes referred to as contextuality and is related to naive realism about operators. Interpretationally, measurement results are a deterministic property of the system and its environment, which includes information about the experimental setup including the context of co-measured observables; in no sense does the system itself possess the property being measured, as would have been the case in classical physics.

Measurements, the quantum formalism, and observer independence

De Broglie–Bohm theory gives the same results as (non-relativistic) quantum mechanics. It treats the wavefunction as a fundamental object in the theory, as the wavefunction describes how the particles move. This means that no experiment can distinguish between the two theories. This section outlines the ideas as to how the standard quantum formalism arises out of de Broglie–Bohm theory.

Collapse of the wavefunction

De Broglie–Bohm theory is a theory that applies primarily to the whole universe. That is, there is a single wavefunction governing the motion of all of the particles in the universe according to the guiding equation. Theoretically, the motion of one particle depends on the positions of all of the other particles in the universe. In some situations, such as in experimental systems, we can represent the system itself in terms of a de Broglie–Bohm theory in which the wavefunction of the system is obtained by conditioning on the environment of the system. Thus, the system can be analyzed with Schrödinger's equation and the guiding equation, with an initial distribution for the particles in the system (see the section on the conditional wavefunction of a subsystem for details).

It requires a special setup for the conditional wavefunction of a system to obey a quantum evolution. When a system interacts with its environment, such as through a measurement, the conditional wavefunction of the system evolves in a different way. The evolution of the universal wavefunction can become such that the wavefunction of the system appears to be in a superposition of distinct states. But if the environment has recorded the results of the experiment, then using the actual Bohmian configuration of the environment to condition on, the conditional wavefunction collapses to just one alternative, the one corresponding with the measurement results.

Collapse of the universal wavefunction never occurs in de Broglie–Bohm theory. Its entire evolution is governed by Schrödinger's equation, and the particles' evolutions are governed by the guiding equation. Collapse only occurs in a phenomenological way for systems that seem to follow their own Schrödinger's equation. As this is an effective description of the system, it is a matter of choice as to what to define the experimental system to include, and this will affect when "collapse" occurs.

Operators as observables

In the standard quantum formalism, measuring observables is generally thought of as measuring operators on the Hilbert space. For example, measuring position is considered to be a measurement of the position operator. This relationship between physical measurements and Hilbert space operators is, for standard quantum mechanics, an additional axiom of the theory. The de Broglie–Bohm theory, by contrast, requires no such measurement axioms (and measurement as such is not a dynamically distinct or special sub-category of physical processes in the theory). In particular, the usual operators-as-observables formalism is, for de Broglie–Bohm theory, a theorem. A major point of the analysis is that many of the measurements of the observables do not correspond to properties of the particles; they are (as in the case of spin discussed above) measurements of the wavefunction.

In the history of de Broglie–Bohm theory, the proponents have often had to deal with claims that this theory is impossible. Such arguments are generally based on inappropriate analysis of operators as observables. If one believes that spin measurements are indeed measuring the spin of a particle that existed prior to the measurement, then one does reach contradictions. De Broglie–Bohm theory deals with this by noting that spin is not a feature of the particle, but rather that of the wavefunction. As such, it only has a definite outcome once the experimental apparatus is chosen. Once that is taken into account, the impossibility theorems become irrelevant. There are also objections to this theory based on what it says about particular situations usually involving eigenstates of an operator. For example, the ground state of hydrogen is a real wavefunction. According to the guiding equation, this means that the electron is at rest when in this state. Nevertheless, it is distributed according to $|\psi|^2$, and no contradiction with experimental results can be detected.
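The hydrogen-ground-state remark can be verified directly: for any real wavefunction the guiding velocity $\operatorname{Im}(\nabla\psi/\psi)$ vanishes identically. The sketch below uses atomic units ($\hbar = m = e = 1$) and the textbook ground-state form $\psi(r) = e^{-r}/\sqrt{\pi}$; the evaluation point is arbitrary.

```python
import numpy as np

# Guiding velocity v = Im( grad(psi) / psi ) for the hydrogen ground state,
# which is purely real in atomic units: psi(r) = exp(-r) / sqrt(pi).
# A real wavefunction gives zero velocity, i.e. a Bohmian electron at rest.

def psi(x, y, z):
    r = np.sqrt(x * x + y * y + z * z)
    return np.exp(-r) / np.sqrt(np.pi)

def velocity(p, h=1e-6):
    grads = []
    for i in range(3):
        dp = np.array(p, float); dm = np.array(p, float)
        dp[i] += h; dm[i] -= h
        grads.append((psi(*dp) - psi(*dm)) / (2.0 * h))  # central difference
    return np.imag(np.array(grads) / psi(*p))            # zero: psi is real

print(velocity([0.5, -0.3, 0.8]))  # -> [0. 0. 0.]
```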

The treatment of operators as observables leads many to believe that many operators are equivalent. De Broglie–Bohm theory, from this perspective, chooses the position observable as a favored observable rather than, say, the momentum observable. Again, the link to the position observable is a consequence of the dynamics. The motivation for de Broglie–Bohm theory is to describe a system of particles. This implies that the goal of the theory is to describe the positions of those particles at all times. Other observables do not have this compelling ontological status. Having definite positions explains having definite results, such as flashes on a detector screen. Other observables would not lead to that conclusion, but there need not be any problem in defining a mathematical theory for other observables; see Hyman et al. for an exploration of the fact that a probability density and probability current can be defined for any set of commuting operators.

Hidden variables

De Broglie–Bohm theory is often referred to as a "hidden-variable" theory. Bohm used this description in his original papers on the subject, writing: "From the point of view of the usual interpretation, these additional elements or parameters [permitting a detailed causal and continuous description of all processes] could be called 'hidden' variables." Bohm and Hiley later stated that they found Bohm's choice of the term "hidden variables" to be too restrictive. In particular, they argued that a particle is not actually hidden but rather "is what is most directly manifested in an observation [though] its properties cannot be observed with arbitrary precision (within the limits set by uncertainty principle)".[57] However, others nevertheless treat the term "hidden variable" as a suitable description.[58]

Generalized particle trajectories can be extrapolated from numerous weak measurements on an ensemble of identically prepared systems, and such trajectories coincide with the de Broglie–Bohm trajectories. In particular, an experiment with two entangled photons, in which a set of Bohmian trajectories for one of the photons was determined using weak measurements and postselection, can be understood in terms of a nonlocal connection between that photon's trajectory and the other photon's polarization.[59][60] However, not only the de Broglie–Bohm interpretation, but also many other interpretations of quantum mechanics that do not include such trajectories are consistent with such experimental evidence.

Different predictions

A specialized version of the double-slit experiment has been devised to test characteristics of the trajectory predictions. Experimental realization of this concept disagreed with the Bohm predictions where they differed from standard quantum mechanics. These conclusions have been the subject of debate.

Heisenberg's uncertainty principle

Heisenberg's uncertainty principle states that when two complementary measurements are made, there is a limit to the product of their accuracies. As an example, if one measures the position with an accuracy of Δx and the momentum with an accuracy of Δp, then Δx·Δp ≥ ħ/2.

In de Broglie–Bohm theory, there is always a matter of fact about the position and momentum of a particle. Each particle has a well-defined trajectory, as well as a wavefunction. Observers have limited knowledge as to what this trajectory is (and thus of the position and momentum). It is the lack of knowledge of the particle's trajectory that accounts for the uncertainty relation. What one can know about a particle at any given time is described by the wavefunction. Since the uncertainty relation can be derived from the wavefunction in other interpretations of quantum mechanics, it can be likewise derived (in the epistemic sense mentioned above) on the de Broglie–Bohm theory.

To put the statement differently, the particles' positions are only known statistically. As in classical mechanics, successive observations of the particles' positions refine the experimenter's knowledge of the particles' initial conditions. Thus, with succeeding observations, the initial conditions become more and more restricted. This formalism is consistent with the normal use of the Schrödinger equation.

For the derivation of the uncertainty relation, see Heisenberg uncertainty principle, noting that this article describes the principle from the viewpoint of the Copenhagen interpretation.
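As a numerical illustration of the bound (a sketch, not from the source: ħ is set to 1 and the state is an arbitrary minimum-uncertainty Gaussian wavepacket), one can check on a grid that the Gaussian saturates Δx·Δp = ħ/2:

```python
import numpy as np

# Minimum-uncertainty Gaussian wavepacket; hbar = 1 and sigma = 1 are
# illustrative choices, not values from the text.
hbar = 1.0
sigma = 1.0
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

psi = np.exp(-x**2 / (4.0 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize on the grid

rho = np.abs(psi)**2
mean_x = np.sum(x * rho) * dx
delta_x = np.sqrt(np.sum((x - mean_x)**2 * rho) * dx)

# <p> = 0 for a real wavefunction; <p^2> = hbar^2 * integral |psi'|^2 dx
dpsi = np.gradient(psi, dx)
delta_p = hbar * np.sqrt(np.sum(np.abs(dpsi)**2) * dx)

print(delta_x * delta_p)   # approximately hbar/2 = 0.5
```

For any other (non-Gaussian) wavepacket, the same computation yields a product strictly larger than ħ/2.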

Quantum entanglement, Einstein–Podolsky–Rosen paradox, Bell's theorem, and nonlocality

De Broglie–Bohm theory highlighted the issue of nonlocality: it inspired John Stewart Bell to prove his now-famous theorem, which in turn led to the Bell test experiments.

In the Einstein–Podolsky–Rosen paradox, the authors describe a thought experiment that one could perform on a pair of particles that have interacted, the results of which they interpreted as indicating that quantum mechanics is an incomplete theory.

Decades later John Bell proved Bell's theorem (see p. 14 in Bell), in which he showed that, if they are to agree with the empirical predictions of quantum mechanics, all such "hidden-variable" completions of quantum mechanics must either be nonlocal (as the Bohm interpretation is) or give up the assumption that experiments produce unique results (see counterfactual definiteness and many-worlds interpretation). In particular, Bell proved that any local theory with unique results must make empirical predictions satisfying a statistical constraint called "Bell's inequality".

Alain Aspect performed a series of Bell test experiments that test Bell's inequality using an EPR-type setup. Aspect's results show experimentally that Bell's inequality is in fact violated, meaning that the relevant quantum-mechanical predictions are correct. In these Bell test experiments, entangled pairs of particles are created; the particles are separated, traveling to remote measuring apparatus. The orientation of the measuring apparatus can be changed while the particles are in flight, demonstrating the apparent nonlocality of the effect.

The de Broglie–Bohm theory makes the same (empirically correct) predictions for the Bell test experiments as ordinary quantum mechanics. It is able to do this because it is manifestly nonlocal. It is often criticized or rejected based on this; Bell's attitude was: "It is a merit of the de Broglie–Bohm version to bring this [nonlocality] out so explicitly that it cannot be ignored."

The de Broglie–Bohm theory describes the physics in the Bell test experiments as follows: to understand the evolution of the particles, we need to set up a wave equation for both particles; the orientation of the apparatus affects the wavefunction. The particles in the experiment follow the guidance of the wavefunction. It is the wavefunction that carries the faster-than-light effect of changing the orientation of the apparatus. Maudlin provides an analysis of exactly what kind of nonlocality is present and how it is compatible with relativity. Bell has shown that the nonlocality does not allow superluminal communication. Maudlin has shown this in greater detail.
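The size of the quantum-mechanical violation of Bell's inequality can be made concrete with a short calculation. The sketch below (an illustration, not taken from the source) evaluates the CHSH combination using the standard singlet-state correlation E(a, b) = −cos(a − b) at the usual angles that maximize the violation:

```python
import numpy as np

# Standard quantum-mechanical prediction for spin-singlet correlations:
# E(a, b) = -cos(a - b), where a and b are the analyzer orientations.
def E(a, b):
    return -np.cos(a - b)

# The customary CHSH angle choices (an assumption of this illustration).
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ≈ 2.83, exceeding the local-hidden-variable bound of 2
```

Any local hidden-variable theory must satisfy |S| ≤ 2, which is the constraint the Bell test experiments found to be violated.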

Classical limit

Bohm's formulation of de Broglie–Bohm theory in a classical-looking version has the merits that the emergence of classical behavior seems to follow immediately for any situation in which the quantum potential is negligible, as noted by Bohm in 1952. Modern methods of decoherence are relevant to an analysis of this limit. See Allori et al. for steps towards a rigorous analysis.

Quantum trajectory method

Work by Robert E. Wyatt in the early 2000s attempted to use the Bohm "particles" as an adaptive mesh that follows the actual trajectory of a quantum state in time and space. In the "quantum trajectory" method, one samples the quantum wavefunction with a mesh of quadrature points. One then evolves the quadrature points in time according to the Bohm equations of motion. At each time step, one then re-synthesizes the wavefunction from the points, recomputes the quantum forces, and continues the calculation. (QuickTime movies of this for H + H2 reactive scattering can be found on the Wyatt group web-site at UT Austin.) This approach has been adapted, extended, and used by a number of researchers in the chemical physics community as a way to compute semi-classical and quasi-classical molecular dynamics. A 2007 issue of The Journal of Physical Chemistry A was dedicated to Prof. Wyatt and his work on "computational Bohmian dynamics".
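A minimal sketch of the trajectory idea (not Wyatt's actual method, which re-synthesizes the wavefunction from the moving points): for a freely spreading Gaussian wavepacket, whose exact wavefunction is known, one can evolve a handful of sample points along the Bohmian velocity field v = (ħ/m)·Im(∂ψ/∂x / ψ) and compare against the known analytic trajectories, which scale with the packet width. Here ħ = m = σ₀ = 1 are illustrative choices:

```python
import numpy as np

hbar = m = 1.0
sigma0 = 1.0
a = hbar / (2.0 * m * sigma0**2)

def psi(x, t):
    """Exact free-particle Gaussian wavepacket of initial width sigma0."""
    s = 1.0 + 1j * a * t
    return ((2.0 * np.pi * sigma0**2) ** -0.25 * s**-0.5
            * np.exp(-x**2 / (4.0 * sigma0**2 * s)))

def velocity(x, t, h=1e-5):
    """Bohmian velocity v = (hbar/m) Im(psi'/psi), psi' by central difference."""
    dpsi = (psi(x + h, t) - psi(x - h, t)) / (2.0 * h)
    return (hbar / m) * np.imag(dpsi / psi(x, t))

# Evolve a few sample points along the guiding equation (Euler steps).
x0 = np.array([-2.0, -1.0, 0.5, 1.5])
x, t, dt, T = x0.copy(), 0.0, 1e-3, 2.0
while t < T - 1e-9:
    x = x + dt * velocity(x, t)
    t += dt

# Known analytic result: x(t) = x(0) * sigma(t)/sigma(0)
#                             = x(0) * sqrt(1 + (a*t)**2).
print(np.max(np.abs(x - x0 * np.sqrt(1.0 + (a * T) ** 2))))   # small
```

The sample points spread apart exactly as the packet does, and no two trajectories cross, which is the single-valuedness property exploited (and threatened by nodes) in the schemes discussed below.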

Eric R. Bittner's group at the University of Houston has advanced a statistical variant of this approach that uses a Bayesian sampling technique to sample the quantum density and compute the quantum potential on a structureless mesh of points. This technique was recently used to estimate quantum effects in the heat capacity of small Ne_n clusters for n ≈ 100.

There remain difficulties using the Bohmian approach, mostly associated with the formation of singularities in the quantum potential due to nodes in the quantum wavefunction. In general, nodes forming due to interference effects lead to the case where ψ → 0, making the quantum potential singular. This results in an infinite force on the sample particles, forcing them to move away from the node and often to cross the paths of other sample points (which violates single-valuedness). Various schemes have been developed to overcome this; however, no general solution has yet emerged.

These methods, like Bohm's Hamilton–Jacobi formulation, do not apply to situations in which the full dynamics of spin need to be taken into account.

The properties of trajectories in the de Broglie–Bohm theory differ significantly from the Moyal quantum trajectories as well as the quantum trajectories from the unraveling of an open quantum system.

Similarities with the many-worlds interpretation

Kim Joris Boström has proposed a non-relativistic quantum mechanical theory that combines elements of de Broglie–Bohm mechanics and Everett's many-worlds. In particular, the unreal many-worlds interpretation of Hawking and Weinberg is similar to the Bohmian concept of unreal empty branch worlds:

The second issue with Bohmian mechanics may, at first sight, appear rather harmless, but which on a closer look develops considerable destructive power: the issue of empty branches. These are the components of the post-measurement state that do not guide any particles because they do not have the actual configuration q in their support. At first sight, the empty branches do not appear problematic but on the contrary very helpful as they enable the theory to explain unique outcomes of measurements. Also, they seem to explain why there is an effective "collapse of the wavefunction", as in ordinary quantum mechanics. On a closer view, though, one must admit that these empty branches do not actually disappear. As the wavefunction is taken to describe a really existing field, all their branches really exist and will evolve forever by the Schrödinger dynamics, no matter how many of them will become empty in the course of the evolution. Every branch of the global wavefunction potentially describes a complete world which is, according to Bohm's ontology, only a possible world that would be the actual world if only it were filled with particles, and which is in every respect identical to a corresponding world in Everett's theory. Only one branch at a time is occupied by particles, thereby representing the actual world, while all other branches, though really existing as part of a really existing wavefunction, are empty and thus contain some sort of "zombie worlds" with planets, oceans, trees, cities, cars and people who talk like us and behave like us, but who do not actually exist. Now, if the Everettian theory may be accused of ontological extravagance, then Bohmian mechanics could be accused of ontological wastefulness. On top of the ontology of empty branches comes the additional ontology of particle positions that are, on account of the quantum equilibrium hypothesis, forever unknown to the observer. 
Yet, the actual configuration is never needed for the calculation of the statistical predictions in experimental reality, for these can be obtained by mere wavefunction algebra. From this perspective, Bohmian mechanics may appear as a wasteful and redundant theory. I think it is considerations like these that are the biggest obstacle in the way of a general acceptance of Bohmian mechanics.

Many authors have expressed critical views of de Broglie–Bohm theory by comparing it to Everett's many-worlds approach. Many (but not all) proponents of de Broglie–Bohm theory (such as Bohm and Bell) interpret the universal wavefunction as physically real. According to some supporters of Everett's theory, if the (never collapsing) wavefunction is taken to be physically real, then it is natural to interpret the theory as having the same many worlds as Everett's theory. In the Everettian view the role of the Bohmian particle is to act as a "pointer", tagging, or selecting, just one branch of the universal wavefunction (the assumption that this branch indicates which wave packet determines the observed result of a given experiment is called the "result assumption"); the other branches are designated "empty" and implicitly assumed by Bohm to be devoid of conscious observers. H. Dieter Zeh comments on these "empty" branches:

It is usually overlooked that Bohm's theory contains the same "many worlds" of dynamically separate branches as the Everett interpretation (now regarded as "empty" wave components), since it is based on precisely the same ... global wave function ...

David Deutsch has expressed the same point more "acerbically":

Pilot-wave theories are parallel-universe theories in a state of chronic denial.

This conclusion has been challenged by Detlef Dürr and Justin Lazarovici:

The Bohmian, of course, cannot accept this argument. For her, it is decidedly the particle configuration in three-dimensional space and not the wave function on the abstract configuration space that constitutes a world (or rather, the world). Instead, she will accuse the Everettian of not having local beables (in Bell's sense) in her theory, that is, the ontological variables that refer to localized entities in three-dimensional space or four-dimensional spacetime. The many worlds of her theory thus merely appear as a grotesque consequence of this omission.

Occam's-razor criticism

Both Hugh Everett III and Bohm treated the wavefunction as a physically real field. Everett's many-worlds interpretation is an attempt to demonstrate that the wavefunction alone is sufficient to account for all our observations. When we see the particle detectors flash or hear the click of a Geiger counter, Everett's theory interprets this as our wavefunction responding to changes in the detector's wavefunction, which is responding in turn to the passage of another wavefunction (which we think of as a "particle", but is actually just another wave packet). No particle (in the Bohm sense of having a defined position and velocity) exists according to that theory. For this reason Everett sometimes referred to his own many-worlds approach as the "pure wave theory". Of Bohm's 1952 approach, Everett said:

Our main criticism of this view is on the grounds of simplicity – if one desires to hold the view that ψ is a real field, then the associated particle is superfluous, since, as we have endeavored to illustrate, the pure wave theory is itself satisfactory.

In the Everettian view, then, the Bohm particles are superfluous entities, similar to, and equally as unnecessary as, for example, the luminiferous ether, which was found to be unnecessary in special relativity. This argument is sometimes called the "redundancy argument", since the superfluous particles are redundant in the sense of Occam's razor.

According to Brown & Wallace, the de Broglie–Bohm particles play no role in the solution of the measurement problem. For these authors, the "result assumption" (see above) is inconsistent with the view that there is no measurement problem in the predictable outcome (i.e. single-outcome) case. They also say that a standard tacit assumption of de Broglie–Bohm theory (that an observer becomes aware of configurations of particles of ordinary objects by means of correlations between such configurations and the configuration of the particles in the observer's brain) is unreasonable. This conclusion has been challenged by Valentini, who argues that the entirety of such objections arises from a failure to interpret de Broglie–Bohm theory on its own terms.

According to Peter R. Holland, in a wider Hamiltonian framework, theories can be formulated in which particles do act back on the wave function.

Derivations

De Broglie–Bohm theory has been derived many times and in many ways. Below are six derivations, all of which are very different and lead to different ways of understanding and extending this theory.

  • The guiding equation can be derived in a similar fashion. We assume a plane wave: ψ(x, t) = Ae^{i(k·x − ωt)}. Notice that ∇ψ = ik ψ, so that ik = ∇ψ/ψ. Assuming p = ħk = mv for the particle's actual velocity, we have v = (ħ/m) Im(∇ψ/ψ). Thus, we have the guiding equation.
Notice that this derivation does not use Schrödinger's equation.
  • Preserving the density under the time evolution is another method of derivation. This is the method that Bell cites. It is this method that generalizes to many possible alternative theories. The starting point is the continuity equation ∂ρ/∂t + ∇·j = 0 for the density ρ = |ψ|². This equation describes a probability flow along a current. We take the velocity field associated with this current, v = j/ρ, as the velocity field whose integral curves yield the motion of the particle.
  • A method applicable for particles without spin is to do a polar decomposition of the wavefunction and transform Schrödinger's equation into two coupled equations: the continuity equation from above and the Hamilton–Jacobi equation. This is the method used by Bohm in 1952. The decomposition and equations are as follows:
Decomposition: ψ(x, t) = R(x, t) e^{iS(x, t)/ħ}. Note that R² corresponds to the probability density ρ = |ψ|².
Continuity equation: ∂ρ/∂t + ∇·(ρ ∇S/m) = 0.
Hamilton–Jacobi equation: ∂S/∂t + (∇S)²/(2m) + V − (ħ²/2m) ∇²R/R = 0.
The Hamilton–Jacobi equation is the equation derived from a Newtonian system with potential V − (ħ²/2m) ∇²R/R and velocity field ∇S/m. The potential V is the classical potential that appears in Schrödinger's equation, and the other term, Q = −(ħ²/2m) ∇²R/R, is the quantum potential, terminology introduced by Bohm.
This leads to viewing the quantum theory as particles moving under the classical force modified by a quantum force. However, unlike standard Newtonian mechanics, the initial velocity field is already specified by ∇S/m, which is a symptom of this being a first-order theory, not a second-order theory.
  • A fourth derivation was given by Dürr et al. In their derivation, they derive the velocity field by demanding the appropriate transformation properties given by the various symmetries that Schrödinger's equation satisfies, once the wavefunction is suitably transformed. The guiding equation is what emerges from that analysis.
  • A fifth derivation, given by Dürr et al., is appropriate for generalization to quantum field theory and the Dirac equation. The idea is that a velocity field can also be understood as a first-order differential operator acting on functions. Thus, if we know how it acts on functions, we know what it is. Then, given the Hamiltonian operator H, the equation to satisfy for all functions f (with associated multiplication operator f̂) is (v·∇f)(q) = Re[(ψ, (i/ħ)[H, f̂]ψ)/(ψ, ψ)](q), where (ψ, φ)(q) = ψ(q)* φ(q) is the local Hermitian inner product on the value space of the wavefunction.
This formulation allows for stochastic theories such as the creation and annihilation of particles.
  • A further derivation has been given by Peter R. Holland, on which he bases his quantum-physics textbook The Quantum Theory of Motion. It is based on three basic postulates and an additional fourth postulate that links the wavefunction to measurement probabilities:
    1. A physical system consists in a spatiotemporally propagating wave and a point particle guided by it.
    2. The wave is described mathematically by a solution to Schrödinger's wave equation.
    3. The particle motion is described by a solution to ẋ(t) = [∇S(x, t)/m]_{x = x(t)}, in dependence on the initial condition x(0) = x₀, with S the phase of ψ.
      The fourth postulate is subsidiary yet consistent with the first three:
    4. The probability to find the particle in the differential volume d³x at time t equals |ψ(x, t)|² d³x.
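The plane-wave and polar-decomposition derivations can be cross-checked numerically. The sketch below (an illustration with ħ = m = 1 and an arbitrary test state, not taken from the source) verifies that for ψ = R e^{iS/ħ} with S = ħkx, the guiding-equation velocity (ħ/m) Im(∇ψ/ψ) agrees with the Hamilton–Jacobi velocity field ∇S/m:

```python
import numpy as np

# Illustrative test state psi = R * exp(i*S/hbar) with R = exp(-x^2/4)
# and S = hbar*k*x; hbar, m, and k are arbitrary choices for this check.
hbar = m = 1.0
k = 1.7
x = np.linspace(-5.0, 5.0, 2001)
psi = np.exp(1j * k * x - x**2 / 4.0)

# Guiding equation: v = (hbar/m) * Im(psi'/psi)
dpsi = np.gradient(psi, x)
v_guiding = (hbar / m) * np.imag(dpsi / psi)

# Hamilton-Jacobi form: v = grad(S)/m = hbar*k/m, constant for this state
v_hj = np.full_like(x, hbar * k / m)

# Interior points agree up to finite-difference error.
print(np.max(np.abs(v_guiding[5:-5] - v_hj[5:-5])))   # small
```

The real factor R drops out of Im(∇ψ/ψ), which is why only the phase S contributes to the velocity, exactly as the Hamilton–Jacobi formulation asserts.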

History

The theory was historically developed in the 1920s by de Broglie, who, in 1927, was persuaded to abandon it in favour of the then-mainstream Copenhagen interpretation. David Bohm, dissatisfied with the prevailing orthodoxy, rediscovered de Broglie's pilot-wave theory in 1952. Bohm's suggestions were not then widely received, partly due to reasons unrelated to their content, such as Bohm's youthful communist affiliations. The de Broglie–Bohm theory was widely deemed unacceptable by mainstream theorists, mostly because of its explicit non-locality. On the theory, John Stewart Bell, author of the 1964 Bell's theorem, wrote in 1982:

Bohm showed explicitly how parameters could indeed be introduced, into nonrelativistic wave mechanics, with the help of which the indeterministic description could be transformed into a deterministic one. More importantly, in my opinion, the subjectivity of the orthodox version, the necessary reference to the "observer", could be eliminated. ...

But why then had Born not told me of this "pilot wave"? If only to point out what was wrong with it? Why did von Neumann not consider it? More extraordinarily, why did people go on producing "impossibility" proofs, after 1952, and as recently as 1978?... Why is the pilot wave picture ignored in text books? Should it not be taught, not as the only way, but as an antidote to the prevailing complacency? To show us that vagueness, subjectivity, and indeterminism, are not forced on us by experimental facts, but by deliberate theoretical choice?

Since the 1990s, there has been renewed interest in formulating extensions to de Broglie–Bohm theory, attempting to reconcile it with special relativity and quantum field theory, besides other features such as spin or curved spatial geometries.

De Broglie–Bohm theory has a history of different formulations and names. In this section, each stage is given a name and a main reference.

Pilot-wave theory

Louis de Broglie presented his pilot wave theory at the 1927 Solvay Conference, after close collaboration with Schrödinger, who developed his wave equation for de Broglie's theory. At the end of the presentation, Wolfgang Pauli pointed out that it was not compatible with a semi-classical technique Fermi had previously adopted in the case of inelastic scattering. Contrary to a popular legend, de Broglie actually gave the correct rebuttal that the particular technique could not be generalized for Pauli's purpose, although the audience might have been lost in the technical details, and de Broglie's mild manner left the impression that Pauli's objection was valid. He was eventually persuaded to abandon this theory nonetheless because he was "discouraged by criticisms which [it] roused". De Broglie's theory already applies to multiple spin-less particles, but lacks an adequate theory of measurement, as no one understood quantum decoherence at the time. An analysis of de Broglie's presentation is given in Bacciagaluppi et al. Also, in 1932 John von Neumann published a no-hidden-variables proof in his book Mathematical Foundations of Quantum Mechanics, which was widely believed to prove that all hidden-variable theories are impossible. This sealed the fate of de Broglie's theory for the next two decades.

In 1926, Erwin Madelung had developed a hydrodynamic version of Schrödinger's equation, which is sometimes incorrectly considered a basis for the density-current derivation of the de Broglie–Bohm theory. The Madelung equations, being the quantum analog of the Euler equations of fluid dynamics, differ philosophically from de Broglie–Bohm mechanics and are the basis of the stochastic interpretation of quantum mechanics.

Peter R. Holland has pointed out that, earlier in 1927, Einstein had actually submitted a preprint with a similar proposal but, not convinced, had withdrawn it before publication. According to Holland, failure to appreciate key points of the de Broglie–Bohm theory has led to confusion, the key point being "that the trajectories of a many-body quantum system are correlated not because the particles exert a direct force on one another (à la Coulomb) but because all are acted upon by an entity – mathematically described by the wavefunction or functions of it – that lies beyond them". This entity is the quantum potential.

After publishing his popular textbook Quantum Theory that adhered entirely to the Copenhagen orthodoxy, Bohm was persuaded by Einstein to take a critical look at von Neumann's no hidden variables proof. The result was 'A Suggested Interpretation of the Quantum Theory in Terms of "Hidden Variables" I and II' [Bohm 1952]. It was an independent origination of the pilot wave theory, and extended it to incorporate a consistent theory of measurement, and to address a criticism of Pauli that de Broglie did not properly respond to; it is taken to be deterministic (though Bohm hinted in the original papers that there should be disturbances to this, in the way Brownian motion disturbs Newtonian mechanics). This stage is known as the de Broglie–Bohm Theory in Bell's work [Bell 1987] and is the basis for 'The Quantum Theory of Motion' [Holland 1993].

This stage applies to multiple particles, and is deterministic.

The de Broglie–Bohm theory is an example of a hidden-variables theory. Bohm originally hoped that hidden variables could provide a local, causal, objective description that would resolve or eliminate many of the paradoxes of quantum mechanics, such as Schrödinger's cat, the measurement problem and the collapse of the wavefunction. However, Bell's theorem complicates this hope, as it demonstrates that there can be no local hidden-variable theory that is compatible with the predictions of quantum mechanics. The Bohmian interpretation is causal but not local.

Bohm's paper was largely ignored or panned by other physicists. Albert Einstein, who had suggested that Bohm search for a realist alternative to the prevailing Copenhagen approach, did not consider Bohm's interpretation to be a satisfactory answer to the quantum nonlocality question, calling it "too cheap", while Werner Heisenberg considered it a "superfluous 'ideological superstructure' ". Wolfgang Pauli, who had been unconvinced by de Broglie in 1927, conceded to Bohm as follows:

I just received your long letter of 20th November, and I also have studied more thoroughly the details of your paper. I do not see any longer the possibility of any logical contradiction as long as your results agree completely with those of the usual wave mechanics and as long as no means is given to measure the values of your hidden parameters both in the measuring apparatus and in the observe [sic] system. As far as the whole matter stands now, your 'extra wave-mechanical predictions' are still a check, which cannot be cashed.

He subsequently described Bohm's theory as "artificial metaphysics".

According to physicist Max Dresden, when Bohm's theory was presented at the Institute for Advanced Study in Princeton, many of the objections were ad hominem, focusing on Bohm's sympathy with communists as exemplified by his refusal to give testimony to the House Un-American Activities Committee.

In 1979, Chris Philippidis, Chris Dewdney and Basil Hiley were the first to perform numeric computations on the basis of the quantum potential to deduce ensembles of particle trajectories. Their work renewed the interest of physicists in the Bohm interpretation of quantum physics.

Eventually John Bell began to defend the theory. In "Speakable and Unspeakable in Quantum Mechanics" [Bell 1987], several of the papers refer to hidden-variables theories (which include Bohm's).

The trajectories of the Bohm model that would result for particular experimental arrangements were termed "surreal" by some. Still in 2016, mathematical physicist Sheldon Goldstein said of Bohm's theory: "There was a time when you couldn't even talk about it because it was heretical. It probably still is the kiss of death for a physics career to be actually working on Bohm, but maybe that's changing."

Bohmian mechanics

Bohmian mechanics is the same theory, but with an emphasis on the notion of current flow, which is determined on the basis of the quantum equilibrium hypothesis that the probability density follows the Born rule. The term "Bohmian mechanics" is also often used to include most of the further extensions past the spin-less version of Bohm. While de Broglie–Bohm theory has Lagrangians and Hamilton–Jacobi equations as a primary focus and backdrop, with the icon of the quantum potential, Bohmian mechanics considers the continuity equation as primary and has the guiding equation as its icon. They are mathematically equivalent in so far as the Hamilton–Jacobi formulation applies, i.e., to spin-less particles.

All of non-relativistic quantum mechanics can be fully accounted for in this theory. Recent studies have used this formalism to compute the evolution of many-body quantum systems, with a considerable increase in speed as compared to other quantum-based methods.

Causal interpretation and ontological interpretation

Bohm developed his original ideas, calling them the Causal Interpretation. Later he felt that causal sounded too much like deterministic and preferred to call his theory the Ontological Interpretation. The main reference is "The Undivided Universe" (Bohm, Hiley 1993).

This stage covers work by Bohm and in collaboration with Jean-Pierre Vigier and Basil Hiley. Bohm is clear that this theory is non-deterministic (the work with Hiley includes a stochastic theory). As such, this theory is not strictly speaking a formulation of de Broglie–Bohm theory, but it deserves mention here because the term "Bohm Interpretation" is ambiguous between this theory and de Broglie–Bohm theory.

In 1996 philosopher of science Arthur Fine gave an in-depth analysis of possible interpretations of Bohm's model of 1952.

William Simpson has suggested a hylomorphic interpretation of Bohmian mechanics, in which the cosmos is an Aristotelian substance composed of material particles and a substantial form. The wave function is assigned a dispositional role in choreographing the trajectories of the particles.

Hydrodynamic quantum analogs

Experiments on hydrodynamical analogs of quantum mechanics, beginning with the work of Couder and Fort (2006), have purported to show that macroscopic classical pilot waves can exhibit characteristics previously thought to be restricted to the quantum realm. Hydrodynamic pilot-wave analogs have been claimed to duplicate the double-slit experiment, tunneling, quantized orbits, and numerous other quantum phenomena, claims which have led to a resurgence of interest in pilot-wave theories. The analogs have been compared to the Faraday wave. These results have been disputed: the experiments fail to reproduce aspects of the double-slit experiments. High-precision measurements in the tunneling case point to a different origin of the unpredictable crossing: rather than initial position uncertainty or environmental noise, interactions at the barrier seem to be involved.

Another classical analog has been reported in surface gravity waves.

A comparison by Bush (2015) among the walking-droplet system, de Broglie's double-solution pilot-wave theory, and its extension to SED:

                     Hydrodynamic walkers   de Broglie           SED pilot wave
  Driving            bath vibration         internal clock       vacuum fluctuations
  Spectrum           monochromatic          monochromatic        broad
  Trigger            bouncing               zitterbewegung       zitterbewegung
  Trigger frequency
  Energetics         GPE                    wave                 EM
  Resonance          droplet-wave           harmony of phases    unspecified
  Dispersion
  Carrier
  Statistical

Surrealistic trajectories

In 1992, Englert, Scully, Süssmann, and Walther proposed experiments that would show particles taking paths that differ from the Bohm trajectories. They described the Bohm trajectories as "surrealistic"; their proposal was later referred to as ESSW, after the authors' last names. In 2016, Mahler et al. verified the ESSW predictions but proposed that the surrealistic effect is a consequence of the nonlocality inherent in Bohm's theory.

Monday, March 10, 2025

Self-replicating spacecraft

From Wikipedia, the free encyclopedia

The concept of self-replicating spacecraft, as envisioned by mathematician John von Neumann, has been described by futurists and has been discussed across a wide breadth of hard science fiction novels and stories. Self-replicating probes are sometimes referred to as von Neumann probes. Self-replicating spacecraft would in some ways either mimic or echo the features of living organisms or viruses.

Theory

Von Neumann argued that the most effective way of performing large-scale mining operations such as mining an entire moon or asteroid belt would be by self-replicating spacecraft, taking advantage of their exponential growth. In theory, a self-replicating spacecraft could be sent to a neighboring planetary system, where it would seek out raw materials (extracted from asteroids, moons, gas giants, etc.) to create replicas of itself. These replicas would then be sent out to other planetary systems. The original "parent" probe could then pursue its primary purpose within the star system. This mission varies widely depending on the variant of self-replicating starship proposed.
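Von Neumann's exponential-growth argument can be made concrete with a toy calculation. The sketch below is purely illustrative (the branching factor, and the assumption that every replica survives, are not from the article): it counts probes when each probe builds a fixed number of replicas per generation.

```python
# Toy model of exponential probe replication. The branching factor of 2
# replicas per probe per generation is an assumed, illustrative value.

def probes_after(generations: int, children: int = 2) -> int:
    """Total probes in existence after the given number of generations."""
    total = 1      # the original parent probe
    frontier = 1   # probes built in the most recent generation
    for _ in range(generations):
        frontier *= children   # each frontier probe builds `children` replicas
        total += frontier
    return total

# Even with only 2 replicas per probe, 30 generations yield more probes
# than the ~100 billion star systems in the Milky Way:
print(probes_after(30))  # 2147483647
```

The point of the toy model is only that replication, not launch capacity, dominates: the originating civilization needs to build a single working probe.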

Given this pattern, and its similarity to the reproduction patterns of bacteria, it has been pointed out that von Neumann machines might be considered a form of life. In his short story "Lungfish", David Brin touches on this idea, pointing out that self-replicating machines launched by different species might actually compete with one another (in a Darwinistic fashion) for raw material, or even have conflicting missions. Given enough variety of "species" they might even form a type of ecology, or – should they also have a form of artificial intelligence – a society. They may even mutate with thousands of "generations".

The first quantitative engineering analysis of such a spacecraft was published in 1980 by Robert Freitas, in which the non-replicating Project Daedalus design was modified to include all subsystems necessary for self-replication. The design's strategy was to use the probe to deliver a "seed" factory with a mass of about 443 tons to a distant site, have the seed factory produce many copies of itself there to increase its total manufacturing capacity over a 500-year period, and then use the resulting automated industrial complex to construct more probes with a single seed factory on board each.

It has been theorized that a self-replicating starship utilizing relatively conventional theoretical methods of interstellar travel (i.e., no exotic faster-than-light propulsion, and speeds limited to an "average cruising speed" of 0.1c) could spread throughout a galaxy the size of the Milky Way in as little as half a million years.
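The quoted timescale can be sanity-checked with a back-of-envelope estimate. All numbers below other than the 0.1c cruising speed are assumptions chosen for illustration (typical hop distance between systems, construction time per stop), not figures from the article:

```python
# Rough expansion-time estimate for a replicating probe wavefront.
# All constants except SPEED_FRACTION_C are illustrative assumptions.

LY_PER_HOP = 5.0           # assumed mean distance between neighbouring systems (ly)
SPEED_FRACTION_C = 0.1     # cruising speed as a fraction of light speed
BUILD_YEARS = 100.0        # assumed time to build replicas at each stop
GALAXY_RADIUS_LY = 50_000.0

hop_years = LY_PER_HOP / SPEED_FRACTION_C + BUILD_YEARS  # 50 + 100 = 150 years
hops = GALAXY_RADIUS_LY / LY_PER_HOP                     # 10,000 hops to cross
total_years = hops * hop_years
print(total_years)  # 1500000.0, i.e. ~1.5 million years
```

With these assumptions the construction stops dominate; as BUILD_YEARS shrinks, the crossing time approaches the pure travel time of 50,000 ly / 0.1c = 500,000 years, the "half a million years" figure quoted above.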

Debate on Fermi's paradox

In 1981, Frank Tipler put forth an argument that extraterrestrial intelligences do not exist, based on the fact that von Neumann probes have not been observed. Given even a moderate rate of replication and the history of the galaxy, such probes should already be common throughout space and thus, we should have already encountered them. Because we have not, this shows that extraterrestrial intelligences do not exist. This is thus a resolution to the Fermi paradox – that is, the question of why we have not already encountered extraterrestrial intelligence if it is common throughout the universe.

A response came from Carl Sagan and William Newman. Now known as Sagan's Response, it pointed out that in fact Tipler had underestimated the rate of replication, and that von Neumann probes should have already started to consume most of the mass in the galaxy. Any intelligent race would therefore, Sagan and Newman reasoned, not design von Neumann probes in the first place, and would try to destroy any von Neumann probes found as soon as they were detected. As Robert Freitas has pointed out, the assumed capacity of von Neumann probes described by both sides of the debate is unlikely in reality, and more modestly reproducing systems are unlikely to be observable in their effects on our solar system or the galaxy as a whole.

Another objection to the prevalence of von Neumann probes is that civilizations that could potentially create such devices may have a high probability of self-destruction before being capable of producing such machines. This could be through events such as biological or nuclear warfare, nanoterrorism, resource exhaustion, ecological catastrophe, or pandemics. This obstacle to the creation of von Neumann probes is one potential candidate for the concept of a Great Filter.

Simple workarounds exist to avoid the over-replication scenario. Probes could use radio transmitters, or other means of wireless communication, and be programmed not to replicate beyond a certain density (such as five probes per cubic parsec) or an arbitrary total (such as ten million within one century), analogous to the Hayflick limit in cell reproduction. One problem with this defence against uncontrolled replication is that a single malfunctioning probe beginning unrestricted reproduction would be enough for the entire approach to fail – essentially a technological cancer – unless each probe could also detect such malfunctions in its neighbours and implement a seek-and-destroy protocol. That in turn could lead to probe-on-probe space wars if faulty probes multiplied to high numbers before being found by sound ones, which might then be programmed to replicate to matching numbers so as to manage the infestation. Another workaround is based on the need to heat spacecraft during long interstellar travel: using plutonium as a thermal source would limit the ability to self-replicate, since the spacecraft would carry no programming to make more plutonium even if it found the required raw materials. Another is to program the spacecraft with a clear understanding of the dangers of uncontrolled replication.
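The density-cap idea above can be expressed as a simple guard condition. This is an illustrative sketch, not a protocol described in the source; the five-probes-per-cubic-parsec ceiling is the example figure mentioned in the text, and the function name and parameters are invented for the sketch.

```python
# Sketch of a replication guard using the density cap discussed above.
# The ceiling matches the article's example; everything else is assumed.

MAX_PROBES_PER_PC3 = 5.0   # example ceiling: five probes per cubic parsec

def may_replicate(local_probe_count: int, surveyed_volume_pc3: float) -> bool:
    """Allow replication only while reported local density is under the cap."""
    density = local_probe_count / surveyed_volume_pc3
    return density < MAX_PROBES_PER_PC3

print(may_replicate(40, 10.0))  # True  (4.0 probes/pc^3, under the cap)
print(may_replicate(60, 10.0))  # False (6.0 probes/pc^3, replication halts)
```

As the text notes, the scheme is only as strong as its weakest probe: a single unit that skips this check defeats the limit for the whole population.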

Applications for self-replicating spacecraft

The details of the mission of self-replicating starships can vary widely from proposal to proposal, and the only common trait is the self-replicating nature.

Von Neumann probes

A von Neumann probe is a spacecraft capable of replicating itself. It is a concatenation of two concepts: a Von Neumann universal constructor (self-replicating machine) and a probe (an instrument to explore or examine something). The concept is named after Hungarian American mathematician and physicist John von Neumann, who rigorously studied the concept of self-replicating machines that he called "Universal Assemblers" and which are often referred to as "von Neumann machines". Such constructs could be theorised to comprise five basic components (variations of this template could create other machines such as Bracewell probes):

  • Probe: which would contain the actual probing instruments & goal-directed AI to guide the construct.
  • Life-support systems: mechanisms to repair and maintain the construct.
  • Factory: mechanisms to harvest resources & replicate itself.
  • Memory banks: store programs for all its components & information gained by the probe.
  • Engine: motor to move the probe.

Andreas M. Hein and science fiction author Stephen Baxter proposed two types of von Neumann probes, termed "Philosopher" and "Founder": the former's purpose is exploration, the latter's is preparing future settlement.

A near-term concept of a self-replicating probe has been proposed by the Initiative for Interstellar Studies, achieving about 70% self-replication, based on current and near-term technologies.[8]

If a self-replicating probe finds evidence of primitive life (or a primitive, low-level culture) it might be programmed to lie dormant, silently observe, attempt to make contact (this variant is known as a Bracewell probe), or even interfere with or guide the evolution of life in some way.

Physicist Paul Davies of the University of Adelaide has "raised the possibility of a probe resting on our own Moon", having arrived at some point in Earth's ancient prehistory and remained to monitor Earth – a concept that, per Michio Kaku, Stanley Kubrick used as the basis of his film 2001: A Space Odyssey (though the director cut the relevant monolith scene from the movie). Kubrick's work was based on Arthur C. Clarke's story "The Sentinel", which the pair expanded into a novel that became the basis for the movie, and so Davies' lunar probe/observatory concept is also considered reminiscent of Clarke.

A variant idea on the interstellar von Neumann probe idea is that of the "Astrochicken", proposed by Freeman Dyson. While it has the common traits of self-replication, exploration, and communication with its "home base", Dyson conceived the Astrochicken to explore and operate within our own planetary system, and not explore interstellar space.

Anders Sandberg and Stuart Armstrong argued that launching the colonization of the entire reachable universe through self-replicating probes is well within the capabilities of a star-spanning civilization, and proposed a theoretical approach for achieving it in 32 years, by mining planet Mercury for resources and constructing a Dyson Swarm around the Sun.

Berserkers

A variant of the self-replicating starship is the Berserker. Unlike the benign probe concept, Berserkers are programmed to seek out and exterminate lifeforms and life-bearing exoplanets whenever they are encountered.

The name is derived from the Berserker series of novels by Fred Saberhagen, which describes a war between humanity and such machines. Saberhagen points out (through one of his characters) that the Berserker warships in his novels are not von Neumann machines themselves, but the larger complex of Berserker machines – including automated shipyards – does constitute a von Neumann machine. This again brings up the concept of an ecology of von Neumann machines, or even a von Neumann hive entity.

It is speculated in fiction that Berserkers could be created and launched by a xenophobic civilization (see Anvil of Stars, by Greg Bear, in the section In fiction below) or could theoretically "mutate" from a more benign probe. For instance, a von Neumann ship designed for terraforming processes – mining a planet's surface and adjusting its atmosphere to more human-friendly conditions – could be interpreted as attacking previously inhabited planets, killing their inhabitants in the process of changing the planetary environment, and then self-replicating to dispatch more ships to "attack" other planets.

Replicating seeder ships

Yet another variant on the idea of the self-replicating starship is that of the seeder ship. Such starships might store the genetic patterns of lifeforms from their home world, perhaps even of the species which created them. Upon finding a habitable exoplanet, or even one that might be terraformed, a seeder ship would try to replicate such lifeforms – either from stored embryos or from stored information, using molecular nanotechnology to build zygotes with varying genetic information from local raw materials.

Such ships might be terraforming vessels, preparing colony worlds for later colonization by other vessels, or – should they be programmed to recreate, raise, and educate individuals of the species that created them – self-replicating colonizers themselves. Seeder ships would be a suitable alternative to generation ships as a way to colonize worlds too distant to reach in one lifetime.

In fiction

Von Neumann probes

  • 2001: A Space Odyssey: The monoliths in Arthur C. Clarke's book and Stanley Kubrick's film 2001: A Space Odyssey were intended to be self-replicating probes, though the artifacts in "The Sentinel", Clarke's original short story upon which 2001 was based, were not. The film was to begin with a series of scientists explaining how probes like these would be the most efficient method of exploring outer space. Kubrick cut the opening segment from his film at the last minute, however, and these monoliths became almost mystical entities in both the film and Clarke's novel.
  • Cold As Ice: In the novel by Charles Sheffield, there is a segment where the author (a physicist) describes Von Neumann machines harvesting sulfur, nitrogen, phosphorus, helium-4, and various metals from the atmosphere of Jupiter.
  • Destiny's Road: Larry Niven frequently refers to Von Neumann probes in many of his works. In his 1998 book Destiny's Road, Von Neumann machines are scattered throughout the human colony world Destiny and its moon Quicksilver in order to build and maintain technology and to make up for the lack of the resident humans' technical knowledge; the Von Neumann machines primarily construct a stretchable fabric cloth capable of acting as a solar collector which serves as the humans' primary energy source. The Von Neumann machines also engage in ecological maintenance and other exploratory work.
  • The Devil's Blind Spot: See also Alexander Kluge, The Devil's Blind Spot (New Directions; 2004.)
  • Grey Goo: In the video game Grey Goo, the "Goo" faction is composed entirely of Von Neumann probes sent through various microscopic wormholes to map the Milky Way Galaxy. The faction's units are configurations of nanites used during their original mission of exploration, which have adapted to a combat role. The Goo starts as an antagonist to the Human and Beta factions, but their true objective is revealed during their portion of the single-player campaign. Related to, and inspired by, the Grey Goo doomsday scenario.
  • Spin: In the novel by Robert Charles Wilson, Earth is veiled by a temporal field. Humanity tries to understand and escape this field using Von Neumann probes. It is later revealed that the field itself was generated by Von Neumann probes from another civilization, and that a competition for resources had taken place between Earth's and the aliens' probes.
  • The Third Millennium: A History of the World AD 2000–3000: In the book by Brian Stableford and David Langford (published by Alfred A. Knopf, Inc., 1985) humanity sends cycle-limited Von Neumann probes out to the nearest stars to do open-ended exploration and to announce humanity's existence to whoever might encounter them.
  • Von Neumann's War: In Von Neumann's War by John Ringo and Travis S. Taylor (published by Baen Books in 2007) Von Neumann probes arrive in the solar system, moving in from the outer planets, and converting all metals into gigantic structures. Eventually, they arrive on Earth, wiping out much of the population before being beaten back when humanity reverse engineers some of the probes.
  • We Are Legion (We Are Bob) by Dennis E. Taylor: Bob Johansson, the former owner of a software company, dies in a car accident, only to wake up a hundred years later as a computer emulation of Bob. Given a Von Neumann probe by America's religious government, he is sent out to explore, exploit, expand, and experiment for the good of the human race.
  • ARMA 3: In the "First Contact" single-player campaign introduced in the Contact expansion, a series of extraterrestrial network structures are found in various locations on Earth, one being the fictional country of Livonia, the campaign's setting. In the credits of the campaign, a radio broadcast reveals that a popular theory surrounding the networks is that they are a type of Von Neumann probe that arrived on Earth during the time of a supercontinent.
  • Questionable Content: In Jeph Jacques' webcomic, Faye Whitaker refers to the "Floating Black Slab Emitting A Low Hum" as a possible Von Neumann probe in Episode 4645: Accessorized.
  • In the third act of the incremental game Universal Paperclips, after all of Earth's matter has been converted into paperclips, players are tasked with sending Von Neumann probes into the universe to find and consume all matter in service of making paperclips, eventually entering a war with another class of probes called "drifters" that are created as a result of random mutations.
  • In the game Satisfactory developed by Coffee Stain Studios, the player arrives on a distant alien planet and is tasked with constructing another spaceship. The player is guided by an artificial intelligence which provides the instructions for creating the spaceship (specifically, which resources are required). When complete, it then leaves to presumably repeat the process on another planet. This is not explicitly explained by the game but lore suggests you are simply a clone created by the previous iteration of the process, and it has been going on for a long, long time.

Berserkers

  • In the science fiction short story collection Berserker by Fred Saberhagen, a series of short stories include accounts of battles fought against extremely destructive Berserker machines. This and subsequent books set in the same fictional universe are the origin of the term "Berserker probe".
  • In the 2003 miniseries reboot of Battlestar Galactica (and the subsequent 2004 series) the Cylons are similar to Berserkers in their wish to destroy human life. They were created by humans in a group of fictional planets called the Twelve Colonies. The Cylons created special models that look like humans in order to destroy the twelve colonies and later, the fleeing fleet of surviving humans.
  • The Borg of Star Trek – a self-replicating bio-mechanical race that is dedicated to the task of achieving perfection through the assimilation of useful technology and lifeforms. Their ships are massive mechanical cubes (a close step from the Berserker's massive mechanical Spheres).
  • Science fiction author Larry Niven later borrowed this notion in his short story "A Teardrop Falls".
  • In the computer game Star Control II, the Slylandro Probe is an out-of-control self-replicating probe that attacks starships of other races. They were not originally intended to be a berserker probe; they sought out intelligent life for peaceful contact, but due to a programming error, they would immediately switch to "resource extraction" mode and attempt to dismantle the target ship for raw materials. While the plot claims that the probes reproduce "at a geometric rate", the game itself caps the frequency of encountering these probes. It is possible to deal with the menace in a side-quest, but this is not necessary to complete the game, as the probes only appear one at a time, and the player's ship will eventually be fast and powerful enough to outrun them or destroy them for resources – although the probes will eventually dominate the entire game universe.
  • In Iain Banks' novel Excession, hegemonising swarms are described as a form of Outside Context Problem. An example of an "Aggressive Hegemonising Swarm Object" is given as an uncontrolled self-replicating probe with the goal of turning all matter into copies of itself. After causing great damage, they are somehow transformed using unspecified techniques by the Zetetic Elench and become "Evangelical Hegemonising Swarm Objects". Such swarms (referred to as "smatter") reappear in the later novels Surface Detail (which features scenes of space combat against the swarms) and The Hydrogen Sonata.
  • The Inhibitors from Alastair Reynolds' Revelation Space series are self-replicating machines whose purpose is to inhibit the development of intelligent star-faring cultures. They are dormant for extreme periods of time until they detect the presence of a space-faring culture and proceed to exterminate it even to the point of sterilizing entire planets. They are very difficult to destroy as they seem to have faced every type of weapon ever devised and only need a short time to 'remember' the necessary counter-measures.
  • Also from Alastair Reynolds' books, the "Greenfly" terraforming machines are another form of berserker machine. For unknown reasons, probably an error in their programming, they destroy planets and turn them into trillions of domes filled with vegetation: their purpose is to produce a habitable environment for humans, but in doing so they inadvertently decimate the human race. By the year 10,000, they have wiped out most of the galaxy.
  • The Reapers in the video game series Mass Effect are also self-replicating probes bent on destroying any advanced civilization encountered in the galaxy. They lie dormant in the vast spaces between the galaxies and follow a cycle of extermination. It is seen in Mass Effect 2 that they assimilate any advanced species.
  • Mantrid Drones from the science fiction television series Lexx were an extremely aggressive type of self-replicating Berserker machine, eventually converting the majority of the matter in the universe into copies of themselves in the course of their quest to thoroughly exterminate humanity.
  • The Babylon 5 episode "Infection" showed a smaller-scale berserker in the form of the Icarran War Machine. After being created with the goal of defeating an unspecified enemy faction, the War Machines proceeded to exterminate all life on the planet Icarra VII because they had been programmed with standards for what constituted a 'Pure Icarran' based on religious teachings, which no actual Icarran could satisfy. Because the Icarrans were pre-starflight, the War Machines became dormant after completing their task rather than spreading. One unit was reactivated on board Babylon 5 after being smuggled past quarantine by an unscrupulous archaeologist, but after being confronted with how they had rendered Icarra VII a dead world, the simulated personality of the War Machine committed suicide.
  • The Babylon 5 episode "A Day in the Strife" features a probe that threatens the station with destruction unless a series of questions designed to test a civilization's level of advancement are answered correctly. The commander of the station correctly surmises that the probe is actually a berserker and that if the questions are answered the probe would identify them as a threat to its originating civilization and detonate.
  • Greg Bear's novel The Forge of God deals directly with the concept of "Berserker" von Neumann probes and their consequences. The idea is further explored in the novel's sequel, Anvil of Stars, which explores the reaction other civilizations have to the creation and release of Berserkers.
  • In Gregory Benford's Galactic Center Saga series, an antagonist berserker machine race is encountered by Earth, first as a probe in In the Ocean of Night, and then in an attack in Across the Sea of Suns. The berserker machines do not seek to completely eradicate a race if merely throwing it into a primitive low technological state will do as they did to the EMs encountered in Across the Sea of Suns. The alien machine Watchers would not be considered von Neumann machines themselves, but the collective machine race could.
  • On Stargate SG-1 the Replicators were a vicious race of insect-like robots that were originally created by an android named Reese to serve as toys. They grew beyond her control and began evolving, eventually spreading throughout at least two galaxies. In addition to ordinary autonomous evolution they were able to analyze and incorporate new technologies they encountered into themselves, ultimately making them one of the most advanced "races" known.
  • On Stargate Atlantis, a second race of replicators created by the Ancients were encountered in the Pegasus Galaxy. They were created as a means to defeat the Wraith. The Ancients attempted to destroy them after they began showing signs of sentience and requested that their drive to kill the wraith be removed. This failed, and an unspecified length of time after the Ancients retreated to the Milky Way Galaxy, the replicators nearly succeeded in destroying the Wraith. The Wraith were able to hack into the replicators and deactivate the extermination drive, at which point they retreated to their home world and were not heard from again until encountered by the Atlantis Expedition. After the Atlantis Expedition reactivated this dormant directive, the replicators embarked on a plan to kill the Wraith by removing their food source, i.e. all humans in the Pegasus Galaxy.
  • In Stargate Universe Season 2, a galaxy billions of light years distant from the Milky Way is infested with drone ships that are programmed to annihilate intelligent life and advanced technology. The drone ships attack other space ships (including Destiny) as well as humans on planetary surfaces, but don't bother destroying primitive technology such as buildings unless they are harboring intelligent life or advanced technology.
  • In the Justice League Unlimited episode "Dark Heart", an alien weapon based on this same idea lands on Earth.
  • In the Homeworld: Cataclysm video game, a bio-mechanical virus called the Beast has the ability to alter organic and mechanical material to suit its needs, and infected ships become self-replicating hubs for the virus.
  • In the SF MMO EVE Online, experiments to create drones more autonomous than the ones used by players' ships accidentally created 'rogue drones', which form hives in certain parts of space and are used extensively in missions as difficult opponents.
  • In the computer game Sword of the Stars, the player may randomly encounter "Von Neumann". A Von Neumann mothership appears along with smaller Von Neumann probes, which attack and consume the player's ships. The probes then return to the mothership, returning the consumed material. If probes are destroyed, the mothership will create new ones. If all the player's ships are destroyed, the Von Neumann probes will reduce the planets' resource levels before leaving. The probes appear as blue octahedrons with small spheres attached to the apical points; the mothership is a larger version of the probes. In the 2008 expansion A Murder of Crows, Kerberos Productions also introduces the VN Berserker, a combat-oriented ship, which attacks player planets and ships in retaliation for violence against VN Motherships. If the player destroys the Berserker, things will escalate and a System Destroyer will attack.
  • In the X computer game series, the Xenon are a malevolent race of artificially intelligent machines descended from terraforming ships sent out by humans to prepare worlds for eventual colonization, the result of a bugged software update. They are continual antagonists in the X-Universe.
  • In the comic Transmetropolitan a character mentions "Von Neumann rectal infestations" which are apparently caused by "Shit-ticks that build more shit-ticks that build more shit-ticks".
  • In the anime Vandread, harvester ships attack vessels from both male- and female-dominated factions and harvest hull, reactors, and computer components to make more of themselves. To this end, Harvester ships are built around mobile factories. Earth-born humans also view the inhabitants of the various colonies to be little more than spare parts.
  • In Earth 2160, the Morphidian Aliens rely on Mantain strain aliens for colonization. Most Mantain-derived aliens can absorb water, then reproduce like a colony of cells. In this manner, even one Mantain Lady (or Princess, or Queen) can create enough clones to cover the map. Once they have significant numbers, they "choose an evolutionary path" and swarm the enemy, taking over their resources.
  • In the European comic series Storm, numbers 20 & 21, a kind of berserk von Neumann probe is set on a collision course with the Pandarve system.
  • In PC role-playing game Space Rangers and its sequel Space Rangers 2: Dominators, a league of 5 nations battles three different types of Berserker robots. One that focuses on invading planets, another that battles normal space and third that lives in hyperspace.
  • In the Star Wolves video game series, Berserkers are a self-replicating machine menace that threatens the known universe for purposes of destruction and/or assimilation of humanity.
  • The Star Wars expanded universe features the World Devastators, large ships designed and built by the Galactic Empire that tear apart planets to use their materials to build other ships, or even to upgrade or replicate themselves.
  • The Tet in the 2013 film Oblivion is revealed to be a Berserker of sorts: a sentient machine that travels from planet to planet, exterminating the indigenous population using armies of robotic drones and cloned members of the target species. The Tet then proceeds to harvest the planet's water in order to extract hydrogen for nuclear fusion.
  • In Eclipse Phase, an ETI probe is believed to have infected the TITAN computer systems with the Exsurgent virus to cause them to go berserk and wage war on humanity. This would make ETI probes a form of berserker, albeit one that uses pre-existing computer systems as its key weapons.
  • In Herr aller Dinge by Andreas Eschbach, an ancient nanomachine complex is discovered buried in a glacier off the coast of Russia. When it comes in contact with the materials it needs to fulfill its mission, it creates a launch facility and launches a spacecraft. It is later revealed that the nanomachines were created by a prehistoric human race with the intention of destroying other interstellar civilizations (for an unknown reason). It is proposed that the reason there is no evidence of that race is the nanomachines themselves and their ability to manipulate matter at an atomic level. It is even suggested that viruses could be ancient nanomachines that have evolved over time.
  • In Dead Space, the Brother Moons could be considered berserkers.

Replicating seeder ships

  • Code of the Lifemaker by James P. Hogan describes the evolution of a society of humanoid-like robots inhabiting Saturn's moon Titan. The sentient machines are descended from an uncrewed factory ship that was meant to be self-replicating but suffered radiation damage and went off course, eventually landing on Titan around 1,000,000 BC.
  • Manifold: Space, Stephen Baxter's novel, starts with the discovery of alien self-replicating machines active within the Solar system.
  • In the Metroid Prime subseries of games, the massive Leviathans are probes routinely sent out from the planet Phaaze to infect other planets with Phazon radiation and eventually turn these planets into clones of Phaaze, where the self-replication process can continue.
  • In David Brin's short story collection, The River of Time (1986), the short story "Lungfish" prominently features von Neumann probes. Not only does he explore the concept of the probes themselves, but indirectly explores the ideas of competition between different designs of probes, evolution of von Neumann probes in the face of such competition, and the development of a type of ecology between von Neumann probes. One of the vessels mentioned is clearly a Seeder type.
  • In The Songs of Distant Earth by Arthur C. Clarke, humanity on a future Earth facing imminent destruction creates automated seedships that act as fire and forget lifeboats aimed at distant, habitable worlds. Upon landing, the ship begins to create new humans from stored genetic information, and an onboard computer system raises and trains the first few generations of new inhabitants. The massive ships are then broken down and used as building materials by their "children".
  • On the Stargate Atlantis episode "Remnants", the Atlantis team finds an ancient probe that they later learn was launched by a now-extinct, technologically advanced race in order to seed new worlds and re-propagate their silicon-based species. The probe communicated with inhabitants of Atlantis by means of hallucinations.
  • On the Stargate SG-1 episode "Scorched Earth", a species of newly relocated humanoids face extinction via an automated terraforming colony seeder ship controlled by an Artificial Intelligence.
  • On Stargate Universe, the human adventurers live on a ship called Destiny. Its mission was to connect a network of Stargates, placed by preceding seeder ships on planets capable of supporting life to allow instantaneous travel between them.
  • The trilogy of albums which conclude the comic book series Storm by Don Lawrence (starting with Chronicles of Pandarve 11: The Von Neumann machine) is based on self-replicating conscious machines containing the sum of all human knowledge employed to rebuild human society throughout the universe in case of disaster on Earth. The probe malfunctions and although new probes are built, they do not separate from the motherprobe, which eventually results in a cluster of malfunctioning probes so big that it can absorb entire moons.
  • In the Xeno series, a rogue seeder ship (technically a berserker) known as "Deus" created humanity.
