Friday, August 4, 2023

Quantum mechanics

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Quantum_mechanics
Wave functions of the electron in a hydrogen atom at different energy levels. Quantum mechanics cannot predict the exact location of a particle in space, only the probability of finding it at different locations. The brighter areas represent a higher probability of finding the electron.

Quantum mechanics is a fundamental theory in physics that provides a description of the physical properties of nature at the scale of atoms and subatomic particles. It is the foundation of all quantum physics including quantum chemistry, quantum field theory, quantum technology, and quantum information science.

Classical physics, the collection of theories that existed before the advent of quantum mechanics, describes many aspects of nature at an ordinary (macroscopic) scale, but is not sufficient for describing them at small (atomic and subatomic) scales. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large (macroscopic) scale.

Quantum mechanics differs from classical physics in that energy, momentum, angular momentum, and other quantities of a bound system are restricted to discrete values (quantization); objects have characteristics of both particles and waves (wave–particle duality); and there are limits to how accurately the value of a physical quantity can be predicted prior to its measurement, given a complete set of initial conditions (the uncertainty principle).

Quantum mechanics arose gradually from theories to explain observations that could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and the correspondence between energy and frequency in Albert Einstein's 1905 paper, which explained the photoelectric effect. These early attempts to understand microscopic phenomena, now known as the "old quantum theory", led to the full development of quantum mechanics in the mid-1920s by Niels Bohr, Erwin Schrödinger, Werner Heisenberg, Max Born, Paul Dirac and others. The modern theory is formulated in various specially developed mathematical formalisms. In one of them, a mathematical entity called the wave function provides information, in the form of probability amplitudes, about what measurements of a particle's energy, momentum, and other physical properties may yield.

Overview and fundamental concepts

Quantum mechanics allows the calculation of properties and behaviour of physical systems. It is typically applied to microscopic systems: molecules, atoms and sub-atomic particles. It has been demonstrated to hold for complex molecules with thousands of atoms, but its application to human beings raises philosophical problems, such as Wigner's friend, and its application to the universe as a whole remains speculative. Predictions of quantum mechanics have been verified experimentally to an extremely high degree of accuracy.

A fundamental feature of the theory is that it usually cannot predict with certainty what will happen, but only give probabilities. Mathematically, a probability is found by taking the square of the absolute value of a complex number, known as a probability amplitude. This is known as the Born rule, named after physicist Max Born. For example, a quantum particle like an electron can be described by a wave function, which associates to each point in space a probability amplitude. Applying the Born rule to these amplitudes gives a probability density function for the position that the electron will be found to have when an experiment is performed to measure it. This is the best the theory can do; it cannot say for certain where the electron will be found. The Schrödinger equation relates the collection of probability amplitudes that pertain to one moment of time to the collection of probability amplitudes that pertain to another.
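
As a concrete (if highly simplified) illustration of the Born rule, the short Python sketch below normalizes a hypothetical three-component state vector and turns its amplitudes into outcome probabilities; the numbers are arbitrary and chosen purely for illustration.

```python
# Minimal sketch of the Born rule for a discrete quantum state (illustrative only).
import numpy as np

# A hypothetical state of a three-level system, written as complex amplitudes.
psi = np.array([1 + 1j, 0.5, -2j])
psi = psi / np.linalg.norm(psi)            # normalize so probabilities sum to 1

probabilities = np.abs(psi) ** 2           # Born rule: p_i = |amplitude_i|^2
print(probabilities, probabilities.sum())  # e.g. [0.32 0.04 0.64], summing to 1
```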

One consequence of the mathematical rules of quantum mechanics is a tradeoff in predictability between different measurable quantities. The most famous form of this uncertainty principle says that no matter how a quantum particle is prepared or how carefully experiments upon it are arranged, it is impossible to have a precise prediction for a measurement of its position and also at the same time for a measurement of its momentum.

Another consequence of the mathematical rules of quantum mechanics is the phenomenon of quantum interference, which is often illustrated with the double-slit experiment. In the basic version of this experiment, a coherent light source, such as a laser beam, illuminates a plate pierced by two parallel slits, and the light passing through the slits is observed on a screen behind the plate. The wave nature of light causes the light waves passing through the two slits to interfere, producing bright and dark bands on the screen – a result that would not be expected if light consisted of classical particles. However, the light is always found to be absorbed at the screen at discrete points, as individual particles rather than waves; the interference pattern appears via the varying density of these particle hits on the screen. Furthermore, versions of the experiment that include detectors at the slits find that each detected photon passes through one slit (as would a classical particle), and not through both slits (as would a wave). However, such experiments demonstrate that particles do not form the interference pattern if one detects which slit they pass through. Other atomic-scale entities, such as electrons, are found to exhibit the same behavior when fired towards a double slit. This behavior is known as wave–particle duality.

Another counter-intuitive phenomenon predicted by quantum mechanics is quantum tunnelling: a particle that goes up against a potential barrier can cross it, even if its kinetic energy is smaller than the maximum of the potential. In classical mechanics this particle would be trapped. Quantum tunnelling has several important consequences, enabling radioactive decay, nuclear fusion in stars, and applications such as scanning tunnelling microscopy and the tunnel diode.
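
The tunnelling probability through an idealized rectangular barrier can be estimated from the standard textbook transmission formula; the sketch below evaluates it for an electron with illustrative, arbitrarily chosen barrier parameters. Classically the transmission would be exactly zero for an energy below the barrier top.

```python
# Rough numerical illustration of quantum tunnelling through a rectangular barrier
# (standard textbook formula; the parameter values below are arbitrary assumptions).
import numpy as np

hbar = 1.054571817e-34   # J*s
m = 9.1093837015e-31     # electron mass, kg
eV = 1.602176634e-19     # J

E = 1.0 * eV             # particle energy, below the barrier top
V0 = 2.0 * eV            # barrier height
L = 1e-9                 # barrier width, 1 nm

kappa = np.sqrt(2 * m * (V0 - E)) / hbar
# Transmission coefficient for E < V0; small but nonzero.
T = 1.0 / (1.0 + (V0**2 * np.sinh(kappa * L)**2) / (4 * E * (V0 - E)))
print(f"Transmission probability: {T:.3e}")
```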

When quantum systems interact, the result can be the creation of quantum entanglement: their properties become so intertwined that a description of the whole solely in terms of the individual parts is no longer possible. Erwin Schrödinger called entanglement "...the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought". Quantum entanglement enables the counter-intuitive properties of quantum pseudo-telepathy, and can be a valuable resource in communication protocols, such as quantum key distribution and superdense coding. Contrary to popular misconception, entanglement does not allow sending signals faster than light, as demonstrated by the no-communication theorem.

Another possibility opened by entanglement is testing for "hidden variables", hypothetical properties more fundamental than the quantities addressed in quantum theory itself, knowledge of which would allow more exact predictions than quantum theory can provide. A collection of results, most significantly Bell's theorem, have demonstrated that broad classes of such hidden-variable theories are in fact incompatible with quantum physics. According to Bell's theorem, if nature actually operates in accord with any theory of local hidden variables, then the results of a Bell test will be constrained in a particular, quantifiable way. Many Bell tests have been performed, using entangled particles, and they have shown results incompatible with the constraints imposed by local hidden variables.

It is not possible to present these concepts in more than a superficial way without introducing the actual mathematics involved; understanding quantum mechanics requires not only manipulating complex numbers, but also linear algebra, differential equations, group theory, and other more advanced subjects. Accordingly, this article will present a mathematical formulation of quantum mechanics and survey its application to some useful and oft-studied examples.

Mathematical formulation

In the mathematically rigorous formulation of quantum mechanics, the state of a quantum mechanical system is a vector ψ belonging to a (separable) complex Hilbert space H. This vector is postulated to be normalized under the Hilbert space inner product, that is, it obeys ⟨ψ, ψ⟩ = 1, and it is well-defined up to a complex number of modulus 1 (the global phase), that is, ψ and e^{iα}ψ represent the same physical system. In other words, the possible states are points in the projective space of a Hilbert space, usually called the complex projective space. The exact nature of this Hilbert space is dependent on the system – for example, for describing position and momentum the Hilbert space is the space L² of complex square-integrable functions, while the Hilbert space for the spin of a single proton is simply the space ℂ² of two-dimensional complex vectors with the usual inner product.

Physical quantities of interest – position, momentum, energy, spin – are represented by observables, which are Hermitian (more precisely, self-adjoint) linear operators acting on the Hilbert space. A quantum state can be an eigenvector of an observable, in which case it is called an eigenstate, and the associated eigenvalue corresponds to the value of the observable in that eigenstate. More generally, a quantum state will be a linear combination of the eigenstates, known as a quantum superposition. When an observable is measured, the result will be one of its eigenvalues with probability given by the Born rule: in the simplest case the eigenvalue λ is non-degenerate and the probability is given by |⟨φ_λ, ψ⟩|², where φ_λ is its associated (normalized) eigenvector. More generally, the eigenvalue is degenerate and the probability is given by ⟨ψ, P_λ ψ⟩, where P_λ is the projector onto its associated eigenspace. In the continuous case, these formulas give instead the probability density.
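
The following sketch illustrates this recipe numerically for a toy two-level system: it diagonalizes a Hermitian observable and applies the Born rule to obtain outcome probabilities. The particular operator and state are arbitrary choices made for illustration, not anything prescribed by the text above.

```python
# Sketch: measuring a Hermitian observable on a normalized state (toy 2-level example).
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)    # a Hermitian observable
psi = np.array([1, 0], dtype=complex)                  # normalized state

eigenvalues, eigenvectors = np.linalg.eigh(sigma_x)    # possible measurement results
# Born rule: probability of each outcome is |<eigenvector, psi>|^2
probs = np.abs(eigenvectors.conj().T @ psi) ** 2
for val, p in zip(eigenvalues, probs):
    print(f"outcome {val:+.0f} with probability {p:.2f}")   # both 0.50 here
```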

After the measurement, if result λ was obtained, the quantum state is postulated to collapse to φ_λ, in the non-degenerate case, or to P_λ ψ / √⟨ψ, P_λ ψ⟩, in the general case. The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr–Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied. Newer interpretations of quantum mechanics have been formulated that do away with the concept of "wave function collapse" (see, for example, the many-worlds interpretation). The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wave functions become entangled so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics.

The time evolution of a quantum state is described by the Schrödinger equation:

    iħ (d/dt) ψ(t) = H ψ(t)

Here H denotes the Hamiltonian, the observable corresponding to the total energy of the system, and ħ is the reduced Planck constant. The constant iħ is introduced so that the Hamiltonian is reduced to the classical Hamiltonian in cases where the quantum system can be approximated by a classical system; the ability to make such an approximation in certain limits is called the correspondence principle.

The solution of this differential equation is given by

    ψ(t) = e^{−iHt/ħ} ψ(0).

The operator U(t) = e^{−iHt/ħ} is known as the time-evolution operator, and has the crucial property that it is unitary. This time evolution is deterministic in the sense that – given an initial quantum state ψ(0) – it makes a definite prediction of what the quantum state ψ(t) will be at any later time.
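
A minimal numerical sketch of this deterministic evolution, assuming an arbitrary 2×2 Hermitian Hamiltonian and units with ħ = 1, is shown below; it also checks that the time-evolution operator is unitary and that the state's norm is preserved.

```python
# Sketch of unitary time evolution psi(t) = exp(-i H t / hbar) psi(0),
# using an arbitrary 2x2 Hamiltonian and hbar = 1 for simplicity.
import numpy as np
from scipy.linalg import expm

hbar = 1.0
H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)   # Hermitian by construction
psi0 = np.array([1, 0], dtype=complex)

t = 0.7
U = expm(-1j * H * t / hbar)                   # time-evolution operator
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U is unitary
psi_t = U @ psi0                               # state at time t
print(np.linalg.norm(psi_t))                   # 1.0: the norm is preserved
```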

Fig. 1: Probability densities corresponding to the wave functions of an electron in a hydrogen atom possessing definite energy levels (increasing from the top of the image to the bottom: n = 1, 2, 3, ...) and angular momenta (increasing across from left to right: s, p, d, ...). Denser areas correspond to higher probability density in a position measurement. Such wave functions are directly comparable to Chladni's figures of acoustic modes of vibration in classical physics and are modes of oscillation as well, possessing a sharp energy and thus, a definite frequency. The angular momentum and energy are quantized and take only discrete values like those shown. (As is the case for resonant frequencies in acoustics.)

Some wave functions produce probability distributions that are independent of time, such as eigenstates of the Hamiltonian. Many systems that are treated dynamically in classical mechanics are described by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular trajectory around the atomic nucleus, whereas in quantum mechanics, it is described by a static wave function surrounding the nucleus. For example, the electron wave function for an unexcited hydrogen atom is a spherically symmetric function known as an s orbital (Fig. 1).

Analytic solutions of the Schrödinger equation are known for very few relatively simple model Hamiltonians including the quantum harmonic oscillator, the particle in a box, the dihydrogen cation, and the hydrogen atom. Even the helium atom – which contains just two electrons – has defied all attempts at a fully analytic treatment.

However, there are techniques for finding approximate solutions. One method, called perturbation theory, uses the analytic result for a simple quantum mechanical model to create a result for a related but more complicated model by (for example) the addition of a weak potential energy. Another method is called "semi-classical equation of motion", which applies to systems for which quantum mechanics produces only small deviations from classical behavior. These deviations can then be computed based on the classical motion. This approach is particularly important in the field of quantum chaos.

Uncertainty principle

One consequence of the basic quantum formalism is the uncertainty principle. In its most familiar form, this states that no preparation of a quantum particle can imply simultaneously precise predictions both for a measurement of its position and for a measurement of its momentum. Both position and momentum are observables, meaning that they are represented by Hermitian operators. The position operator X̂ and momentum operator P̂ do not commute, but rather satisfy the canonical commutation relation:

    [X̂, P̂] = iħ

Given a quantum state, the Born rule lets us compute expectation values for both X̂ and P̂, and moreover for powers of them. Defining the uncertainty for an observable by a standard deviation, we have

    σ_X = √(⟨X̂²⟩ − ⟨X̂⟩²)

and likewise for the momentum:

    σ_P = √(⟨P̂²⟩ − ⟨P̂⟩²).

The uncertainty principle states that

    σ_X σ_P ≥ ħ/2.

Either standard deviation can in principle be made arbitrarily small, but not both simultaneously. This inequality generalizes to arbitrary pairs of self-adjoint operators A and B. The commutator of these two operators is

    [A, B] = AB − BA,

and this provides the lower bound on the product of standard deviations:

    σ_A σ_B ≥ (1/2) |⟨[A, B]⟩|.
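
The sketch below checks the position–momentum form of this bound numerically for a sampled Gaussian wave function (units with ħ = 1; the grid and width parameter are arbitrary assumptions). A Gaussian saturates the inequality, so the product of standard deviations comes out close to ħ/2.

```python
# Numerical check of the uncertainty relation sigma_x * sigma_p >= hbar/2
# for a sampled Gaussian wave function (hbar = 1; grid parameters are arbitrary).
import numpy as np

hbar = 1.0
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
a = 2.0                                         # width parameter of the Gaussian
psi = (np.pi * a) ** -0.25 * np.exp(-x**2 / (2 * a))

prob = np.abs(psi) ** 2
mean_x = np.sum(x * prob) * dx
sigma_x = np.sqrt(np.sum((x - mean_x) ** 2 * prob) * dx)

# The momentum operator acts as -i*hbar d/dx; use numpy's gradient for the derivative.
dpsi = np.gradient(psi, dx)
mean_p = np.real(np.sum(np.conj(psi) * (-1j * hbar) * dpsi) * dx)
d2psi = np.gradient(dpsi, dx)
mean_p2 = np.real(np.sum(np.conj(psi) * (-hbar**2) * d2psi) * dx)
sigma_p = np.sqrt(mean_p2 - mean_p**2)

print(sigma_x * sigma_p, hbar / 2)   # close to 0.5, saturating the bound
```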

Another consequence of the canonical commutation relation is that the position and momentum operators are Fourier transforms of each other, so that a description of an object according to its momentum is the Fourier transform of its description according to its position. The fact that dependence in momentum is the Fourier transform of the dependence in position means that the momentum operator is equivalent (up to a factor of −iħ) to taking the derivative according to the position, since in Fourier analysis differentiation corresponds to multiplication in the dual space. This is why in quantum equations in position space, the momentum p is replaced by −iħ ∂/∂x, and in particular in the non-relativistic Schrödinger equation in position space the momentum-squared term is replaced with a Laplacian times −ħ².

Composite systems and entanglement

When two different quantum systems are considered together, the Hilbert space of the combined system is the tensor product of the Hilbert spaces of the two components. For example, let A and B be two quantum systems, with Hilbert spaces H_A and H_B, respectively. The Hilbert space of the composite system is then

    H_AB = H_A ⊗ H_B.

If the state for the first system is the vector ψ_A and the state for the second system is ψ_B, then the state of the composite system is

    ψ_A ⊗ ψ_B.

Not all states in the joint Hilbert space can be written in this form, however, because the superposition principle implies that linear combinations of these "separable" or "product states" are also valid. For example, if ψ_A and φ_A are both possible states for system A, and likewise ψ_B and φ_B are both possible states for system B, then

    (1/√2)(ψ_A ⊗ ψ_B + φ_A ⊗ φ_B)

is a valid joint state that is not separable. States that are not separable are called entangled.

If the state for a composite system is entangled, it is impossible to describe either component system A or system B by a state vector. One can instead define reduced density matrices that describe the statistics that can be obtained by making measurements on either component system alone. This necessarily causes a loss of information, though: knowing the reduced density matrices of the individual systems is not enough to reconstruct the state of the composite system. Just as density matrices specify the state of a subsystem of a larger system, analogously, positive operator-valued measures (POVMs) describe the effect on a subsystem of a measurement performed on a larger system. POVMs are extensively used in quantum information theory.
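
As a small illustration (not part of the original article), the sketch below builds a two-qubit Bell state with a Kronecker (tensor) product, forms its density matrix, and takes the partial trace over system B to obtain the reduced density matrix of system A, which comes out maximally mixed, reflecting the loss of information described above.

```python
# Sketch: a two-qubit Bell state, its density matrix, and the reduced state of qubit A.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Composite states live in the tensor product space (dimension 2 x 2 = 4).
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)   # entangled, not separable

rho = np.outer(bell, bell.conj())                             # density matrix of the pair
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)   # partial trace over B
print(rho_A)   # 0.5 * identity: qubit A alone looks maximally mixed
```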

As described above, entanglement is a key feature of models of measurement processes in which an apparatus becomes entangled with the system being measured. Systems interacting with the environment in which they reside generally become entangled with that environment, a phenomenon known as quantum decoherence. This can explain why, in practice, quantum effects are difficult to observe in systems larger than microscopic.

Equivalence between formulations

There are many mathematically equivalent formulations of quantum mechanics. One of the oldest and most common is the "transformation theory" proposed by Paul Dirac, which unifies and generalizes the two earliest formulations of quantum mechanics – matrix mechanics (invented by Werner Heisenberg) and wave mechanics (invented by Erwin Schrödinger). An alternative formulation of quantum mechanics is Feynman's path integral formulation, in which a quantum-mechanical amplitude is considered as a sum over all possible classical and non-classical paths between the initial and final states. This is the quantum-mechanical counterpart of the action principle in classical mechanics.

Symmetries and conservation laws

The Hamiltonian H is known as the generator of time evolution, since it defines a unitary time-evolution operator U(t) = e^{−iHt/ħ} for each value of t. From this relation between U(t) and H, it follows that any observable A that commutes with H will be conserved: its expectation value will not change over time. This statement generalizes, as mathematically, any Hermitian operator A can generate a family of unitary operators parameterized by a variable t. Under the evolution generated by A, any observable B that commutes with A will be conserved. Moreover, if B is conserved by evolution under A, then A is conserved under the evolution generated by B. This implies a quantum version of the result proven by Emmy Noether in classical (Lagrangian) mechanics: for every differentiable symmetry of a Hamiltonian, there exists a corresponding conservation law.
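
A toy numerical check of this statement, assuming arbitrary diagonal (and therefore commuting) operators and units with ħ = 1, is sketched below: the expectation value of an observable that commutes with the Hamiltonian stays constant under time evolution.

```python
# Sketch: an observable that commutes with the Hamiltonian has a constant expectation value
# (arbitrary toy operators, hbar = 1).
import numpy as np
from scipy.linalg import expm

hbar = 1.0
H = np.diag([1.0, 2.0, 2.0]).astype(complex)     # Hamiltonian
A = np.diag([5.0, -1.0, -1.0]).astype(complex)   # commutes with H (both diagonal)
psi0 = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)

for t in (0.0, 1.0, 5.0):
    psi_t = expm(-1j * H * t / hbar) @ psi0
    expectation = np.real(psi_t.conj() @ A @ psi_t)
    print(f"t = {t}: <A> = {expectation:.6f}")   # same value at every time
```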

Examples

Free particle

Position space probability density of a Gaussian wave packet moving in one dimension in free space

The simplest example of a quantum system with a position degree of freedom is a free particle in a single spatial dimension. A free particle is one which is not subject to external influences, so that its Hamiltonian consists only of its kinetic energy:

    H = P̂²/(2m) = −(ħ²/(2m)) d²/dx².

The general solution of the Schrödinger equation is given by

    ψ(x, t) = (1/√(2π)) ∫ ψ̂(k, 0) e^{i(kx − ħk²t/2m)} dk,

which is a superposition of all possible plane waves e^{i(kx − ħk²t/2m)}, which are eigenstates of the momentum operator with momentum p = ħk. The coefficients of the superposition are ψ̂(k, 0), which is the Fourier transform of the initial quantum state ψ(x, 0).

It is not possible for the solution to be a single momentum eigenstate, or a single position eigenstate, as these are not normalizable quantum states. Instead, we can consider a Gaussian wave packet:

    ψ(x, 0) = (1/(πa))^{1/4} e^{−x²/(2a)},

which has Fourier transform, and therefore momentum distribution

    ψ̂(k, 0) = (a/π)^{1/4} e^{−ak²/2}.

We see that as we make a smaller the spread in position gets smaller, but the spread in momentum gets larger. Conversely, by making a larger we make the spread in momentum smaller, but the spread in position gets larger. This illustrates the uncertainty principle.

As we let the Gaussian wave packet evolve in time, we see that its center moves through space at a constant velocity (like a classical particle with no forces acting on it). However, the wave packet will also spread out as time progresses, which means that the position becomes more and more uncertain. The uncertainty in momentum, however, stays constant.
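
The sketch below evaluates the standard closed-form widths of a spreading free Gaussian packet (in units with ħ = m = 1 and an arbitrary width parameter a): the position spread grows with time while the momentum spread stays fixed.

```python
# Sketch of a free Gaussian wave packet spreading in time (hbar = m = a = 1).
# Uses the standard closed-form result sigma_x(t) = sqrt(a/2) * sqrt(1 + (hbar*t/(m*a))^2).
import numpy as np

hbar = m = a = 1.0
for t in (0.0, 1.0, 2.0, 5.0):
    sigma_x = np.sqrt(a / 2) * np.sqrt(1 + (hbar * t / (m * a)) ** 2)
    sigma_p = np.sqrt(hbar**2 / (2 * a))      # momentum spread does not change
    print(f"t = {t}: sigma_x = {sigma_x:.3f}, sigma_p = {sigma_p:.3f}")
```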

Particle in a box

1-dimensional potential energy box (or infinite potential well)

The particle in a one-dimensional potential energy box is the most mathematically simple example where restraints lead to the quantization of energy levels. The box is defined as having zero potential energy everywhere inside a certain region, and therefore infinite potential energy everywhere outside that region. For the one-dimensional case in the x direction, the time-independent Schrödinger equation may be written

    −(ħ²/(2m)) d²ψ/dx² = Eψ.

With the differential operator defined by

    P̂_x = −iħ (d/dx),

the previous equation is evocative of the classic kinetic energy analogue,

    (1/(2m)) P̂_x² ψ = Eψ,

with state ψ in this case having energy E coincident with the kinetic energy of the particle.

The general solutions of the Schrödinger equation for the particle in a box are

    ψ(x) = A e^{ikx} + B e^{−ikx},    E = ħ²k²/(2m),

or, from Euler's formula,

    ψ(x) = C sin(kx) + D cos(kx).

The infinite potential walls of the box determine the values of C, D, and k at x = 0 and x = L, where ψ must be zero. Thus, at x = 0,

    ψ(0) = 0 = C sin(0) + D cos(0) = D,

and D = 0. At x = L,

    ψ(L) = 0 = C sin(kL),

in which C cannot be zero as this would conflict with the postulate that ψ has norm 1. Therefore, since sin(kL) = 0, kL must be an integer multiple of π,

    k = nπ/L,    n = 1, 2, 3, …

This constraint on k implies a constraint on the energy levels, yielding

    E_n = ħ²π²n²/(2mL²) = n²h²/(8mL²).
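
For a feel for the scale of these levels, the sketch below evaluates E_n for an electron confined to a 1 nm box; the particle and box size are illustrative choices only.

```python
# Energy levels E_n = n^2 * pi^2 * hbar^2 / (2 m L^2) for an electron in a 1 nm box
# (illustrative numbers only).
import numpy as np

hbar = 1.054571817e-34   # J*s
m = 9.1093837015e-31     # electron mass, kg
L = 1e-9                 # box width, m
eV = 1.602176634e-19     # J

for n in (1, 2, 3):
    E_n = n**2 * np.pi**2 * hbar**2 / (2 * m * L**2)
    print(f"n = {n}: E = {E_n / eV:.3f} eV")   # roughly 0.376, 1.504, 3.384 eV
```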

A finite potential well is the generalization of the infinite potential well problem to potential wells having finite depth. The finite potential well problem is mathematically more complicated than the infinite particle-in-a-box problem as the wave function is not pinned to zero at the walls of the well. Instead, the wave function must satisfy more complicated mathematical boundary conditions as it is nonzero in regions outside the well. Another related problem is that of the rectangular potential barrier, which furnishes a model for the quantum tunneling effect that plays an important role in the performance of modern technologies such as flash memory and scanning tunneling microscopy.

Harmonic oscillator

Some trajectories of a harmonic oscillator (i.e. a ball attached to a spring) in classical mechanics (A-B) and quantum mechanics (C-H). In quantum mechanics, the position of the ball is represented by a wave (called the wave function), with the real part shown in blue and the imaginary part shown in red. Some of the trajectories (such as C, D, E, and F) are standing waves (or "stationary states"). Each standing-wave frequency is proportional to a possible energy level of the oscillator. This "energy quantization" does not occur in classical physics, where the oscillator can have any energy.

As in the classical case, the potential for the quantum harmonic oscillator is given by

    V(x) = (1/2) mω²x².

This problem can either be treated by directly solving the Schrödinger equation, which is not trivial, or by using the more elegant "ladder method" first proposed by Paul Dirac. The eigenstates are given by

    ψ_n(x) = (1/√(2ⁿ n!)) (mω/(πħ))^{1/4} e^{−mωx²/(2ħ)} H_n(√(mω/ħ) x),    n = 0, 1, 2, …,

where H_n are the Hermite polynomials

    H_n(x) = (−1)ⁿ e^{x²} (dⁿ/dxⁿ) e^{−x²},

and the corresponding energy levels are

    E_n = ħω(n + 1/2).

This is another example illustrating the discretization of energy for bound states.
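
The sketch below evaluates these energies and builds the first few eigenfunctions from NumPy's Hermite polynomials (in units with ħ = m = ω = 1), checking that each eigenfunction given by the formula above is normalized on a finite grid.

```python
# Sketch: quantum harmonic oscillator energies E_n = hbar*omega*(n + 1/2) and the
# n-th eigenfunction built from Hermite polynomials (hbar = m = omega = 1).
import math
import numpy as np
from numpy.polynomial.hermite import hermval

hbar = m = omega = 1.0

def psi_n(n, x):
    # n-th eigenfunction; hermval evaluates the physicists' Hermite polynomial H_n.
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    Hn = hermval(np.sqrt(m * omega / hbar) * x, coeffs)
    norm = (m * omega / (np.pi * hbar)) ** 0.25 / math.sqrt(2.0**n * math.factorial(n))
    return norm * np.exp(-m * omega * x**2 / (2 * hbar)) * Hn

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
for n in range(3):
    E_n = hbar * omega * (n + 0.5)
    norm_check = np.sum(psi_n(n, x) ** 2) * dx   # should be close to 1
    print(f"n = {n}: E = {E_n:.1f}, norm check = {norm_check:.4f}")
```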

Mach–Zehnder interferometer

Schematic of a Mach–Zehnder interferometer

The Mach–Zehnder interferometer (MZI) illustrates the concepts of superposition and interference with linear algebra in dimension 2, rather than differential equations. It can be seen as a simplified version of the double-slit experiment, but it is of interest in its own right, for example in the delayed choice quantum eraser, the Elitzur–Vaidman bomb tester, and in studies of quantum entanglement.

We can model a photon going through the interferometer by considering that at each point it can be in a superposition of only two paths: the "lower" path which starts from the left, goes straight through both beam splitters, and ends at the top, and the "upper" path which starts from the bottom, goes straight through both beam splitters, and ends at the right. The quantum state of the photon is therefore a vector ψ ∈ ℂ² that is a superposition of the "lower" path |l⟩ and the "upper" path |u⟩, that is, ψ = α|l⟩ + β|u⟩ for complex α, β. In order to respect the postulate that ⟨ψ, ψ⟩ = 1 we require that |α|² + |β|² = 1.

Both beam splitters are modelled as the unitary matrix

    B = (1/√2) [[1, i], [i, 1]],

which means that when a photon meets the beam splitter it will either stay on the same path with a probability amplitude of 1/√2, or be reflected to the other path with a probability amplitude of i/√2. The phase shifter on the upper arm is modelled as the unitary matrix

    P = [[1, 0], [0, e^{iΔΦ}]],

which means that if the photon is on the "upper" path it will gain a relative phase of ΔΦ, and it will stay unchanged if it is in the lower path.

A photon that enters the interferometer from the left will then be acted upon with a beam splitter B, a phase shifter P, and another beam splitter B, and so end up in the state

    BPB|l⟩ = i e^{iΔΦ/2} ( −sin(ΔΦ/2) |l⟩ + cos(ΔΦ/2) |u⟩ ),

and the probabilities that it will be detected at the right or at the top are given respectively by

    p(u) = cos²(ΔΦ/2),    p(l) = sin²(ΔΦ/2).

One can therefore use the Mach–Zehnder interferometer to estimate the phase shift by estimating these probabilities.
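
A short numerical sketch of this calculation is given below: it applies the beam-splitter and phase-shifter matrices to a photon entering from the left and reads off the detection probabilities via the Born rule. The basis ordering and the sample phases are arbitrary choices made for illustration.

```python
# Sketch of the Mach-Zehnder calculation: two beam splitters B and a phase shifter P
# acting on the "lower"/"upper" path basis; detection probabilities follow the Born rule.
import numpy as np

B = np.array([[1, 1j], [1j, 1]], dtype=complex) / np.sqrt(2)   # 50/50 beam splitter

def detection_probabilities(delta_phi):
    P = np.diag([1, np.exp(1j * delta_phi)])     # relative phase on the upper arm
    lower_in = np.array([1, 0], dtype=complex)   # photon entering from the left
    out = B @ P @ B @ lower_in
    return np.abs(out) ** 2                      # [p(lower) = p(top), p(upper) = p(right)]

for phi in (0.0, np.pi / 2, np.pi):
    p_top, p_right = detection_probabilities(phi)
    print(f"phase {phi:.2f}: p(top) = {p_top:.2f}, p(right) = {p_right:.2f}")
```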

It is interesting to consider what would happen if the photon were definitely in either the "lower" or "upper" paths between the beam splitters. This can be accomplished by blocking one of the paths, or equivalently by removing the first beam splitter (and feeding the photon from the left or the bottom, as desired). In both cases there will be no interference between the paths anymore, and the probabilities are given by p(u) = p(l) = 1/2, independently of the phase ΔΦ. From this we can conclude that the photon does not take one path or another after the first beam splitter, but rather that it is in a genuine quantum superposition of the two paths.

Applications

Quantum mechanics has had enormous success in explaining many of the features of our universe, with regards to small-scale and discrete quantities and interactions which cannot be explained by classical methods. Quantum mechanics is often the only theory that can reveal the individual behaviors of the subatomic particles that make up all forms of matter (electrons, protons, neutrons, photons, and others). Solid-state physics and materials science are dependent upon quantum mechanics.

In many aspects modern technology operates at a scale where quantum effects are significant. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the optical amplifier and the laser, the transistor and semiconductors such as the microprocessor, medical and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA.

Relation to other scientific theories

Classical mechanics

The rules of quantum mechanics assert that the state space of a system is a Hilbert space and that observables of the system are Hermitian operators acting on vectors in that space – although they do not tell us which Hilbert space or which operators. These can be chosen appropriately in order to obtain a quantitative description of a quantum system, a necessary step in making physical predictions. An important guide for making these choices is the correspondence principle, a heuristic which states that the predictions of quantum mechanics reduce to those of classical mechanics in the regime of large quantum numbers. One can also start from an established classical model of a particular system, and then try to guess the underlying quantum model that would give rise to the classical model in the correspondence limit. This approach is known as quantization.

When quantum mechanics was originally formulated, it was applied to models whose correspondence limit was non-relativistic classical mechanics. For instance, the well-known model of the quantum harmonic oscillator uses an explicitly non-relativistic expression for the kinetic energy of the oscillator, and is thus a quantum version of the classical harmonic oscillator.

Complications arise with chaotic systems, which do not have good quantum numbers, and quantum chaos studies the relationship between classical and quantum descriptions in these systems.

Quantum decoherence is a mechanism through which quantum systems lose coherence, and thus become incapable of displaying many typically quantum effects: quantum superpositions become simply probabilistic mixtures, and quantum entanglement becomes simply classical correlations. Quantum coherence is not typically evident at macroscopic scales, except maybe at temperatures approaching absolute zero at which quantum behavior may manifest macroscopically.

Many macroscopic properties of a classical system are a direct consequence of the quantum behavior of its parts. For example, the stability of bulk matter (consisting of atoms and molecules which would quickly collapse under electric forces alone), the rigidity of solids, and the mechanical, thermal, chemical, optical and magnetic properties of matter are all results of the interaction of electric charges under the rules of quantum mechanics.

Special relativity and electrodynamics

Early attempts to merge quantum mechanics with special relativity involved the replacement of the Schrödinger equation with a covariant equation such as the Klein–Gordon equation or the Dirac equation. While these theories were successful in explaining many experimental results, they had certain unsatisfactory qualities stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which applies quantization to a field (rather than a fixed set of particles). The first complete quantum field theory, quantum electrodynamics, provides a fully quantum description of the electromagnetic interaction. Quantum electrodynamics is, along with general relativity, one of the most accurate physical theories ever devised.

The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, one that has been used since the inception of quantum mechanics, is to treat charged particles as quantum mechanical objects being acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using a classical Coulomb potential. This "semi-classical" approach fails if quantum fluctuations in the electromagnetic field play an important role, such as in the emission of photons by charged particles.

Quantum field theories for the strong nuclear force and the weak nuclear force have also been developed. The quantum field theory of the strong nuclear force is called quantum chromodynamics, and describes the interactions of subnuclear particles such as quarks and gluons. The weak nuclear force and the electromagnetic force were unified, in their quantized forms, into a single quantum field theory (known as electroweak theory), by the physicists Abdus Salam, Sheldon Glashow and Steven Weinberg.

Relation to general relativity

Even though the predictions of both quantum theory and general relativity have been supported by rigorous and repeated empirical evidence, their abstract formalisms contradict each other and they have proven extremely difficult to incorporate into one consistent, cohesive model. Gravity is negligible in many areas of particle physics, so that unification between general relativity and quantum mechanics is not an urgent issue in those particular applications. However, the lack of a correct theory of quantum gravity is an important issue in physical cosmology and the search by physicists for an elegant "Theory of Everything" (TOE). Consequently, resolving the inconsistencies between both theories has been a major goal of 20th- and 21st-century physics. This TOE would not only combine the models of subatomic physics but also derive the four fundamental forces of nature from a single force or phenomenon.

One proposal for doing so is string theory, which posits that the point-like particles of particle physics are replaced by one-dimensional objects called strings. String theory describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries gravitational force.

Another popular theory is loop quantum gravity (LQG), which describes quantum properties of gravity and is thus a theory of quantum spacetime. LQG is an attempt to merge and adapt standard quantum mechanics and standard general relativity. This theory describes space as an extremely fine fabric "woven" of finite loops called spin networks. The evolution of a spin network over time is called a spin foam. The characteristic length scale of a spin foam is the Planck length, approximately 1.616×10^−35 m, and so lengths shorter than the Planck length are not physically meaningful in LQG.

Philosophical implications

Unsolved problem in physics:

Is there a preferred interpretation of quantum mechanics? How does the quantum description of reality, which includes elements such as the "superposition of states" and "wave function collapse", give rise to the reality we perceive?

Since its inception, the many counter-intuitive aspects and results of quantum mechanics have provoked strong philosophical debates and many interpretations. The arguments centre on the probabilistic nature of quantum mechanics, the difficulties with wavefunction collapse and the related measurement problem, and quantum nonlocality. Perhaps the only consensus that exists about these issues is that there is no consensus. Richard Feynman once said, "I think I can safely say that nobody understands quantum mechanics." According to Steven Weinberg, "There is now in my opinion no entirely satisfactory interpretation of quantum mechanics."

The views of Niels Bohr, Werner Heisenberg and other physicists are often grouped together as the "Copenhagen interpretation". According to these views, the probabilistic nature of quantum mechanics is not a temporary feature which will eventually be replaced by a deterministic theory, but is instead a final renunciation of the classical idea of "causality". Bohr in particular emphasized that any well-defined application of the quantum mechanical formalism must always make reference to the experimental arrangement, due to the complementary nature of evidence obtained under different experimental situations. Copenhagen-type interpretations remain popular in the 21st century.

Albert Einstein, himself one of the founders of quantum theory, was troubled by its apparent failure to respect some cherished metaphysical principles, such as determinism and locality. Einstein's long-running exchanges with Bohr about the meaning and status of quantum mechanics are now known as the Bohr–Einstein debates. Einstein believed that underlying quantum mechanics must be a theory that explicitly forbids action at a distance. He argued that quantum mechanics was incomplete, a theory that was valid but not fundamental, analogous to how thermodynamics is valid, but the fundamental theory behind it is statistical mechanics. In 1935, Einstein and his collaborators Boris Podolsky and Nathan Rosen published an argument that the principle of locality implies the incompleteness of quantum mechanics, a thought experiment later termed the Einstein–Podolsky–Rosen paradox. In 1964, John Bell showed that EPR's principle of locality, together with determinism, was actually incompatible with quantum mechanics: they implied constraints on the correlations produced by distant systems, now known as Bell inequalities, that can be violated by entangled particles. Since then several experiments have been performed to obtain these correlations, with the result that they do in fact violate Bell inequalities, and thus falsify the conjunction of locality with determinism.

Bohmian mechanics shows that it is possible to reformulate quantum mechanics to make it deterministic, at the price of making it explicitly nonlocal. It attributes not only a wave function to a physical system, but in addition a real position, that evolves deterministically under a nonlocal guiding equation. The evolution of a physical system is given at all times by the Schrödinger equation together with the guiding equation; there is never a collapse of the wave function. This solves the measurement problem.

Everett's many-worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a multiverse composed of mostly independent parallel universes. This is a consequence of removing the axiom of the collapse of the wave packet. All possible states of the measured system and the measuring apparatus, together with the observer, are present in a real physical quantum superposition. While the multiverse is deterministic, we perceive non-deterministic behavior governed by probabilities, because we do not observe the multiverse as a whole, but only one parallel universe at a time. Exactly how this is supposed to work has been the subject of much debate. Several attempts have been made to make sense of this and derive the Born rule, with no consensus on whether they have been successful.

Relational quantum mechanics appeared in the late 1990s as a modern derivative of Copenhagen-type ideas, and QBism was developed some years later.

History

Max Planck is considered the father of the quantum theory.

Quantum mechanics was developed in the early decades of the 20th century, driven by the need to explain phenomena that, in some cases, had been observed in earlier times. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803 English polymath Thomas Young described the famous double-slit experiment. This experiment played a major role in the general acceptance of the wave theory of light.

During the early 19th century, chemical research by John Dalton and Amedeo Avogadro lent weight to the atomic theory of matter, an idea that James Clerk Maxwell, Ludwig Boltzmann and others built upon to establish the kinetic theory of gases. The successes of kinetic theory gave further credence to the idea that matter is composed of atoms, yet the theory also had shortcomings that would only be resolved by the development of quantum mechanics. While the early conception of atoms from Greek philosophy had been that they were indivisible units – the word "atom" deriving from the Greek for "uncuttable" – the 19th century saw the formulation of hypotheses about subatomic structure. One important discovery in that regard was Michael Faraday's 1838 observation of a glow caused by an electrical discharge inside a glass tube containing gas at low pressure. Julius Plücker, Johann Wilhelm Hittorf and Eugen Goldstein carried on and improved upon Faraday's work, leading to the identification of cathode rays, which J. J. Thomson found to consist of subatomic particles that would be called electrons.

The black-body radiation problem was discovered by Gustav Kirchhoff in 1859. In 1900, Max Planck proposed the hypothesis that energy is radiated and absorbed in discrete "quanta" (or energy packets), yielding a calculation that precisely matched the observed patterns of black-body radiation. The word quantum derives from the Latin, meaning "how great" or "how much". According to Planck, quantities of energy could be thought of as divided into "elements" whose size (E) would be proportional to their frequency (ν):

    E = hν,

where h is Planck's constant. Planck cautiously insisted that this was only an aspect of the processes of absorption and emission of radiation and was not the physical reality of the radiation. In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a sizable discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material. Niels Bohr then developed Planck's ideas about radiation into a model of the hydrogen atom that successfully predicted the spectral lines of hydrogen. Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle (later called the photon), with a discrete amount of energy that depends on its frequency. In his paper "On the Quantum Theory of Radiation," Einstein expanded on the interaction between energy and matter to explain the absorption and emission of energy by atoms. Although overshadowed at the time by his general theory of relativity, this paper articulated the mechanism underlying the stimulated emission of radiation, which became the basis of the laser.

The 1927 Solvay Conference in Brussels was the fifth world physics conference.

This phase is known as the old quantum theory. Never complete or self-consistent, the old quantum theory was rather a set of heuristic corrections to classical mechanics. The theory is now understood as a semi-classical approximation to modern quantum mechanics. Notable results from this period include, in addition to the work of Planck, Einstein and Bohr mentioned above, Einstein and Peter Debye's work on the specific heat of solids, Bohr and Hendrika Johanna van Leeuwen's proof that classical physics cannot account for diamagnetism, and Arnold Sommerfeld's extension of the Bohr model to include special-relativistic effects.

In the mid-1920s quantum mechanics was developed to become the standard formulation for atomic physics. In 1923, the French physicist Louis de Broglie put forward his theory of matter waves by stating that particles can exhibit wave characteristics and vice versa. Building on de Broglie's approach, modern quantum mechanics was born in 1925, when the German physicists Werner Heisenberg, Max Born, and Pascual Jordan developed matrix mechanics and the Austrian physicist Erwin Schrödinger invented wave mechanics. Born introduced the probabilistic interpretation of Schrödinger's wave function in July 1926. Thus, the entire field of quantum physics emerged, leading to its wider acceptance at the Fifth Solvay Conference in 1927.

By 1930 quantum mechanics had been further unified and formalized by David Hilbert, Paul Dirac and John von Neumann with greater emphasis on measurement, the statistical nature of our knowledge of reality, and philosophical speculation about the 'observer'. It has since permeated many disciplines, including quantum chemistry, quantum electronics, quantum optics, and quantum information science. It also provides a useful framework for many features of the modern periodic table of elements, and describes the behaviors of atoms during chemical bonding and the flow of electrons in computer semiconductors, and therefore plays a crucial role in many modern technologies. While quantum mechanics was constructed to describe the world of the very small, it is also needed to explain some macroscopic phenomena such as superconductors and superfluids.

Centennial Challenges

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Centennial_Challenges

The Centennial Challenges are NASA space competition inducement prize contests for non-government-funded technological achievements by American teams.

Origin

NASA's Centennial Challenge Program (CCP) directly engages the public at large in the process of advanced technology development that is of value to NASA's missions and to the aerospace community. CCP offers challenges set up as competitions that award prize money to the individuals or teams to achieve the specified technology challenge. The prize contests are named "Centennial" in honor of the 100 years since the Wright brothers' first flight in 1903. The Wright Brothers' pioneering inventions embody the spirit of the challenges.

The Centennial Challenges are based on a long history of technology prize contests, including the Longitude prize (won by John Harrison), the Orteig Prize (won by Charles Lindbergh), the Ansari X PRIZE (won by Scaled Composites), and the DARPA Grand Challenge (won by Stanford University in 2005 and Carnegie Mellon University in 2007). A key advantage of prizes over traditional grants is that money is only paid when the goal is achieved. A 1999 National Academy of Engineering committee report recommended that "Congress encourage federal agencies to experiment more extensively with inducement prize contests in science and technology". A 2003 NASA Space Architect study, assisted by the X PRIZE Foundation, led to the establishment of the Centennial Challenges.

As a federal agency, NASA has one of the federal government's three largest procurement budgets. The Department of Energy (DOE) and the Defense Department (DOD) round out the trio. With the subsequent proposal in Congress of "H Prize" funding for breakthroughs in hydrogen fuel-related technology, the Department of Energy is poised to join NASA and the Defense Department (through DARPA) in this shift toward prize-based funding, which opens the field to technology experimenters who might otherwise be overlooked by traditional government contractors and federal procurement officials.

Current challenges

Sample return robot challenge

The West Virginia University Mountaineers pose with their robot, Cataglyphis, and officials at the 2014 NASA Centennial Challenges Sample Return Robot Challenge at Worcester Polytechnic Institute in Worcester, Mass., after completing Level 1 for a prize of $5,000. A year later, the team won the $100,000 Level-2 Prize. In 2016, Team Mountaineers won the final challenge with a $750,000 prize (NASA/Joel Kowsky)

The Sample Return Robot Challenge is to build an autonomous rough-terrain robot which can find and retrieve geologic samples. The intent is to advance autonomic robotics and remote manipulator technology. The prize is US$1.5 million. The Allied Organization selected to partner with NASA to conduct this challenge is Worcester Polytechnic Institute in Worcester, Massachusetts. Team registration began Summer 2011, and the first competition was held June 16, 2012.

Eleven teams registered for the event, and six showed up to the competition. All but one of those teams failed the weigh-in and/or inspection and were unable to compete. Team SpacePride competed in level one, but did not succeed.

The second running of the challenge took place June 6–8, 2013, at WPI. Ten teams competed for a Level 1 prize. Team Survey of Los Angeles was awarded $5,000 for successfully completing Level 1: their robot left the platform, retrieved a sample and returned to the platform within the 15-minute limit. No teams advanced to Level 2.

The third running of the challenge took place June 9–14, 2014, at WPI. 17 teams competed for Level 1 and Level 2 prizes. Team Mountaineers from West Virginia University (WVU), led by Dr. Yu Gu, successfully completed Level 1 challenge. No teams completed Level 2 challenge in 2014.

The fourth competition took place June 8–12, 2015, at WPI. 16 teams competed for Level 1 and Level 2 prizes. Team Mountaineers from West Virginia University successfully completed Level 2 challenge (with two collected samples or 3 points) and brought home a $100,000 prize. No other team completed Level 1 or Level 2 challenge in 2015.

The fifth year challenge was divided into two events. The Level 1 challenge happened between June 6–11, 2016. Five new teams completed Level 1. The final Level 2 challenge was performed on Sep. 4 & 5. Team Mountaineers from West Virginia University collected 5 samples with a total score of 11 points, and won the challenge with a $750,000 prize.

Efforts were coordinated by NASA and the WPI Robotics Center.

Mars Ascent Vehicle Prize

The MAV Prize is a challenge to demonstrate technologies that may be relevant to future NASA Science Mission Directorate Mars missions. The competition will mimic a MAV mission. When NASA eventually returns samples from Mars, there will be a requirement for a special rocket system — the MAV — to launch the samples from Mars’ surface into orbit for rendezvous with a spacecraft that will return them to Earth. The MAV Challenge requires highly reliable and autonomous sample insertion into the rocket, launch from the surface, and deployment of the sample container. Innovative technology from this competition may be considered in future planning for a Mars exploration mission. The first-place award is $25,000; second-place is $15,000; and third-place is $10,000. Competing teams will be eligible for prize money only after the successful completion of all the required tasks.

The inaugural competition was held in April 2015. North Carolina State University of Raleigh won $25,000 for first place; Tarleton State University of Stephenville, Texas, won second, winning $15,000. There was no third-place winner.

Cube Quest Challenge

The Cube Quest Challenge offers a prize purse of $5 million to teams that meet the challenge objectives of designing, building and delivering flight-qualified, small satellites capable of advanced operations near and beyond the moon. Cube Quest teams will have the opportunity to compete for a secondary payload spot on the first mission of NASA's Orion spacecraft, which will launch atop the agency's Space Launch System (SLS) rocket. The competition includes three stages: Ground Tournaments, Deep Space Derby, and Lunar Derby. The Ground Tournaments will be held every four to six months, leading to an opportunity to earn a spot on the first integrated flight of Orion and SLS. The Deep Space Derby will focus on finding innovative solutions to deep space communications using small spacecraft, and the Lunar Derby will focus primarily on propulsion for small spacecraft and near-Earth communications.

Completed challenges

Green Flight Challenge

Pipistrel Taurus G4, the 2011 Green Flight Challenge winning aircraft of Pipistrel USA.com team, taxiing at the event.

The Green Flight Challenge, sponsored by Google, was to build an aircraft that could fly 200 miles in under two hours using the energy equivalent of a gallon of gasoline per passenger. The US$1,650,000 prize was contested from September 25 to October 1, 2011, at the Charles M. Schulz Sonoma County Airport, Santa Rosa, California. The CAFE Foundation was the Allied Organization which partnered with NASA's Centennial Challenges Program to conduct the challenge. On October 1, 2011, CAFE held a competition open house for the public to see the aircraft and meet the competing teams. The Google Green Flight Challenge Exposition was held at NASA Ames Research Center in Sunnyvale, California on October 3, 2011. Free admission tickets were available at the Expo website. The Expo displayed the competition aircraft, presented the winners' checks, and featured additional displays of green energy technology.

Strong tether challenge

This competition presented the challenge of constructing super-strong tethers, a crucial component of a space elevator. The 2005 contest was to award US$50,000 to the team which constructed the strongest tether, with contests in future years requiring that each winner outperform that of the previous year by 50%. No competing tether surpassed the commercial off-the-shelf baseline and the prize was increased to US$200,000 in 2006.

In 2007 the prize money was raised to US$500,000 for this competition.

The 2011 Strong Tether Centennial Challenge was held at the Space Elevator Conference in Redmond, Washington on August 12, 2011. The Space Elevator Conference, sponsored by Microsoft, The Leeward Space Foundation and The International Space Elevator Consortium, has hosted the Tether competition for five years, and there has yet to be a winner.

Power beam challenge

Power Beam competitions were held in 2005, 2006, 2007 and 2009. They were directed at space elevator applications. Teams built mechanical devices (climbers) that could propel themselves up a vertical cable. The power supply for the device was not self-contained but remained on the ground. The technical challenge was to transmit the power to the climber and transform it into mechanical motion, efficiently and reliably.

This was a competition to build a wirelessly-powered ribbon-climbing robot. The contest involved having the robot lift a large payload within a limited timeframe. The first competition in 2005 would have awarded US$50,000, US$20,000, and US$10,000 to the three best-performing teams meeting the minimum benchmark of 1 m/s. However, no team met this standard, with only two teams climbing under beam power. This prize also increased to US$200,000 in 2006, but no team was able to accomplish the full set of requirements. See Elevator:2010 for more information on the Power Beam Challenge as well as other challenges related to space elevator technologies.

In 2007 the prize money was raised to US$500,000 for this competition.

In the 2009 competition, the competitors drove their laser-powered devices up a cable one kilometer high, suspended from a helicopter. LaserMotive LLC was awarded US$900,000 in the 2009 Power Beaming Challenge.

Moon regolith Oxygen (MoonROx) challenge

This head-to-head competition was for a system capable of extracting 2.5 kilograms of oxygen from 100 kilograms of artificial lunar regolith in 4 hours or less using at most 10 kW of power. This US$1 million prize expired in June 2009 without a winner.

The initial MoonROx challenge was announced in 2005 with the intent to award a US$250,000 prize to the first team to develop the capability to extract 5 kilograms of breathable oxygen from simulated lunar soil in an eight-hour period. The prize expired in June 2008.

For the initial announcement of the challenge, the competition was to be administered by the Florida Space Research Institute (FSRI) in collaboration with NASA. The next year the California Space Education and Workforce Institute (CSEWI) was selected to administer the challenge when FSRI was dissolved and Space Florida was created to take its place.

Since extracting oxygen from silicates is difficult, with the oxygen electrochemically bound into the silicates and released only at high temperature, it is likely that a solar furnace may be part of the solution.

Astronaut glove challenge

2009 Competition

In the 2007 competition, only the pressure-restraining layer of the glove was required; for the 2009 challenge, teams had to provide a complete glove, including the outer thermal-micrometeoroid-protection layer.

The first competition took place May 2 and May 3, 2007, at the New England Air Museum in Windsor Locks, Connecticut. NASA offered a total of US$200,000 for the team that could design and manufacture the best astronaut glove that exceeded minimum requirements. An additional US$50,000 was offered to the team that best demonstrated Mechanical Counter Pressure gloves. The US$200,000 prize was awarded to Peter K. Homer, an engineer from Southwest Harbor, Maine; the US$50,000 prize went unclaimed and rolled to the next competition.

The 2009 competition was held on November 18 and 19 at the Astronaut Hall of Fame in Titusville, Florida. In the 2009 competition Peter K. Homer of Maine won US$250,000 and Ted Southern of New York won US$100,000, both had competed previously. Another challenge is planned and the date is yet to be announced.


Vertical and lunar lander challenges

Armadillo Aerospace technicians on the launch pad performing a vehicle inspection.

Also announced at the XPrize Cup Expo and run by the XPrize Foundation, this prize is for a VTVL (vertical take-off, vertical landing) suborbital rocket that can achieve the altitudes and launch energies equivalent to what would be needed for a lunar lander. The Vertical Lander Challenge requires a minimum altitude of 50 meters, a horizontal traverse of 100 meters, a flight time of 90 seconds and a landing on a smooth surface, followed, after refueling, by a return flight to the original location. The more aggressive Lunar Lander Challenge increases the flight time to 180 seconds and requires landing on a rocky surface. The VLC has a first prize of US$350,000, while the LLC's first prize is larger. At the 2006 Wirefly X PRIZE Cup, Armadillo Aerospace was the only team able to compete; their vehicle "Pixel" completed one leg of the trip on its third try but crashed shortly after takeoff on the return flight, leaving all prizes unclaimed.

In 2008, Armadillo Aerospace successfully completed the easier level one requirements and won the VLC prize.

In 2009, the level two first prize was won by Masten Space Systems, while Armadillo Aerospace took the level two second prize.

Regolith excavation challenge

In this Challenge, teams designed and built robotic machines to excavate simulated lunar soil (regolith). The Challenge was managed by the California Space Authority and was competed in 2007, 2008 and 2009; in 2009 it was won by a team from Worcester Polytechnic Institute, which claimed the US$500,000 prize purse.


Night rover challenge

The Night Rover Challenge called for a solar-powered robot that could operate on stored energy for a significant portion of time. The intent was to spur development of extreme-environment battery technology for use in space missions. The prize was US$1.5 million, and NASA partnered with the nonprofit organization Clean Tech Open for this challenge.

As of October 2013, the Night Rover Challenge was closed as no competitors registered.

Unmanned aircraft systems airspace operations challenge

In October 2012 NASA announced a challenge with the goal of developing some of the key technologies that will make it possible to integrate unmanned aerial vehicles into the National Airspace System. The challenge's focus was on demonstrating a high level of operational robustness and the ability to "sense and avoid" other air traffic.

The challenge was to have been divided into two parts: Phase 1 was scheduled to be held in Spring 2014, and Phase 2 would have taken place one year after Phase 1 was successfully completed. The total prize money available in Phase 1 was US$500,000. Phase 2 was planned to have US$1 million in prize money.

In May 2013, NASA announced that it had selected Development Projects Inc. of Dayton, Ohio to manage the challenge.

In November 2014, NASA cancelled the Unmanned Aircraft Systems (UAS) Airspace Operations Challenge (AOC) due to unanticipated technical and operational issues as well as additional costs. NASA Centennial Challenges have historically been high-risk, leveraged activities conducted with minimal government funding. NASA reviewed the intended outcomes of the AOC and determined that the competition was no longer timely or cost-effective to execute as planned. NASA's cancellation of the AOC was not based in any way on the technical progress or performance of the registered teams.

CO2 conversion challenge

The CO2 conversion challenge is a competition to convert carbon dioxide into sugars to be used as feedstock for biomanufacturing in space and on Mars. The competition began in 2018 to incentivize the public to replicate, with a non-biological system, a conversion that plants perform routinely. Five teams were each awarded a US$50,000 milestone prize in 2019 for Phase 1 of the competition, which asked for the design of a system that could accomplish the chemical transformation; the winners included teams from the University of California, Princeton University, Rutgers University, Air Company, and Dioxide Materials. Phase 2 of the competition ended in 2021, and three teams split a US$750,000 prize purse.

High-level radioactive waste management

Spent nuclear fuel stored underwater and uncapped at the Hanford site in Washington, USA.

High-level radioactive waste management concerns how radioactive materials created during the production of nuclear power and nuclear weapons are dealt with. Radioactive waste contains a mixture of short-lived and long-lived nuclides, as well as non-radioactive nuclides. There were reportedly some 47,000 tonnes (100 million pounds) of high-level nuclear waste stored in the United States in 2002.

The most troublesome transuranic elements in spent fuel are neptunium-237 (half-life two million years) and plutonium-239 (half-life 24,000 years). Consequently, high-level radioactive waste requires sophisticated treatment and management to successfully isolate it from the biosphere. This usually necessitates treatment, followed by a long-term management strategy involving permanent storage, disposal or transformation of the waste into a non-toxic form. Radioactive decay follows the half-life rule: for a given number of atoms, the rate of decay is inversely proportional to the half-life. In other words, the radiation from a long-lived isotope like iodine-129 will be much less intense than that from a short-lived isotope like iodine-131.
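This follows from the decay constant λ = ln 2 / t½: atom for atom, the decay rate scales inversely with the half-life. The sketch below is a minimal illustration using commonly published half-life values (roughly 15.7 million years for iodine-129 and 8.02 days for iodine-131).

```python
import math

# Decay constant: lambda = ln(2) / half-life. For equal numbers of atoms,
# activity (decays per second) is proportional to the decay constant.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
SECONDS_PER_DAY = 24 * 3600

half_life_i129 = 15.7e6 * SECONDS_PER_YEAR  # iodine-129, ~15.7 million years
half_life_i131 = 8.02 * SECONDS_PER_DAY     # iodine-131, ~8.02 days

lam_i129 = math.log(2) / half_life_i129
lam_i131 = math.log(2) / half_life_i131

# Atom for atom, I-131 decays roughly 7e8 times faster than I-129.
print(f"I-131 / I-129 activity ratio: {lam_i131 / lam_i129:.2e}")
```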

Governments around the world are considering a range of waste management and disposal options, usually involving deep-geologic placement, although there has been limited progress toward implementing long-term waste management solutions. This is partly because the timeframes in question when dealing with radioactive waste range from 10,000 to millions of years, according to studies based on the effect of estimated radiation doses.

Thus, engineer and physicist Hannes Alfvén identified two fundamental prerequisites for effective management of high-level radioactive waste: (1) stable geological formations, and (2) stable human institutions over hundreds of thousands of years. As Alfvén suggests, no known human civilization has ever endured for so long, and no geologic formation of adequate size for a permanent radioactive waste repository has yet been discovered that has been stable for so long a period. Nevertheless, avoiding confronting the risks associated with managing radioactive wastes may create countervailing risks of greater magnitude. Radioactive waste management is an example of policy analysis that requires special attention to ethical concerns, examined in the light of uncertainty and futurity: consideration of 'the impacts of practices and technologies on future generations'.

There is a debate over what should constitute an acceptable scientific and engineering foundation for proceeding with radioactive waste disposal strategies. There are those who have argued, on the basis of complex geochemical simulation models, that relinquishing control over radioactive materials to geohydrologic processes at repository closure is an acceptable risk. They maintain that so-called "natural analogues" inhibit subterranean movement of radionuclides, making disposal of radioactive wastes in stable geologic formations unnecessary. However, existing models of these processes are empirically underdetermined: due to the subterranean nature of such processes in solid geologic formations, the accuracy of computer simulation models has not been verified by empirical observation, certainly not over periods of time equivalent to the lethal half-lives of high-level radioactive waste. On the other hand, some insist deep geologic repositories in stable geologic formations are necessary. National management plans of various countries display a variety of approaches to resolving this debate.

Researchers suggest that forecasts of health detriment for such long periods should be examined critically. Practical studies only consider up to 100 years as far as effective planning and cost evaluations are concerned. Long term behaviour of radioactive wastes remains a subject for ongoing research. Management strategies and implementation plans of several representative national governments are described below.

Geologic disposal

The International Panel on Fissile Materials has said:

It is widely accepted that spent nuclear fuel and high-level reprocessing and plutonium wastes require well-designed storage for periods ranging from tens of thousands to a million years, to minimize releases of the contained radioactivity into the environment. Safeguards are also required to ensure that neither plutonium nor highly enriched uranium is diverted to weapon use. There is general agreement that placing spent nuclear fuel in repositories hundreds of meters below the surface would be safer than indefinite storage of spent fuel on the surface.

The process of selecting appropriate permanent repositories for high-level waste and spent fuel is now under way in several countries, with the first expected to be commissioned some time after 2017. The basic concept is to locate a large, stable geologic formation and use mining technology to excavate a tunnel, or large-bore tunnel boring machines (similar to those used to drill the Channel Tunnel from England to France) to drill a shaft 500–1,000 metres (1,600–3,300 ft) below the surface, where rooms or vaults can be excavated for disposal of high-level radioactive waste. The goal is to permanently isolate nuclear waste from the human environment. However, many people remain uncomfortable with ceasing active stewardship of this disposal system at closure, suggesting perpetual management and monitoring would be more prudent.

Because some radioactive species have half-lives longer than one million years, even very low container leakage and radionuclide migration rates must be taken into account. Moreover, it may take many half-lives before some nuclear materials lose enough radioactivity to cease being lethal to living organisms. A 1983 review of the Swedish radioactive waste disposal program by the National Academy of Sciences found that country's estimate, that several hundred thousand years and perhaps up to one million years of waste isolation would be necessary, to be "fully justified."
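The timescales involved can be read off the standard radioactive decay law; the relations below are a generic illustration, not figures taken from the Swedish review.

```latex
% Fraction of a radioisotope remaining after time t, with half-life t_{1/2}:
\[
  \frac{N(t)}{N_0} = 2^{-t/t_{1/2}} = e^{-\lambda t},
  \qquad \lambda = \frac{\ln 2}{t_{1/2}}.
\]
% Time required to decay to a fraction f of the initial inventory:
\[
  t = t_{1/2}\,\log_2\!\frac{1}{f},
\]
% so reaching one part in a thousand (f = 10^{-3}) takes about 10 half-lives,
% which for plutonium-239 (t_{1/2} \approx 24{,}000 years) is roughly 240,000 years.
```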

The proposed land-based subductive waste disposal method would dispose of nuclear waste in a subduction zone accessed from land, and therefore is not prohibited by international agreement. This method has been described as a viable means of disposing of radioactive waste, and as a state-of-the-art nuclear waste disposal technology.

In nature, sixteen zones where natural nuclear fission reactions took place 1.7 billion years ago have been discovered at the Oklo mine in Gabon. The fission products in these natural formations were found to have moved less than 10 ft (3 m) over this period, though the lack of movement may be due more to retention in the uraninite structure than to insolubility and sorption from moving ground water; uraninite crystals are better preserved here than those in spent fuel rods because of a less complete nuclear reaction, so that reaction products would be less accessible to groundwater attack.

Horizontal drillhole disposal describes proposals to drill over one kilometer vertically, and two kilometers horizontally in the earth’s crust, for the purpose of disposing of high-level waste forms such as spent nuclear fuel, Caesium-137, or Strontium-90. After the emplacement and the retrievability period, drillholes would be backfilled and sealed. A series of tests of the technology were carried out in November 2018 and then again publicly in January 2019 by a U.S. based private company. The test demonstrated the emplacement of a test-canister in a horizontal drillhole and retrieval of the same canister. There was no actual high-level waste used in this test.

Materials for geological disposal

In order to store high-level radioactive waste in long-term geological repositories, specific waste forms need to be used which will allow the radioactivity to decay away while the materials retain their integrity for thousands of years. The materials being used can be broken down into a few classes: glass waste forms, ceramic waste forms, and nanostructured materials.

The glass forms include borosilicate glasses and phosphate glasses. Borosilicate nuclear waste glasses are used on an industrial scale to immobilize high-level radioactive waste in many countries which are producers of nuclear energy or have nuclear weaponry. The glass waste forms have the advantage of being able to accommodate a wide variety of waste-stream compositions; they are easy to scale up to industrial processing, and they are stable against thermal, radiative and chemical perturbations. These glasses function by binding radioactive elements to nonradioactive glass-forming elements. Phosphate glasses, while not used industrially, have much lower dissolution rates than borosilicate glasses, which makes them a more favorable option. However, no single phosphate material has the ability to accommodate all of the radioactive products, so phosphate storage requires more reprocessing to separate the waste into distinct fractions. Both glasses have to be processed at elevated temperatures, making them unusable for some of the more volatile radiotoxic elements.

The ceramic waste forms offer higher waste loadings than the glass options because ceramics have a crystalline structure. Mineral analogues of the ceramic waste forms also provide evidence for long-term durability. For this reason, and because they can be processed at lower temperatures, ceramics are often considered the next generation of high-level radioactive waste forms. Ceramic waste forms offer great potential, but much research remains to be done.

National management plans

Finland, the United States and Sweden are the most advanced in developing a deep repository for high-level radioactive waste disposal. Countries vary in their plans for disposing of used fuel directly or after reprocessing, with France and Japan having an extensive commitment to reprocessing. The country-specific status of high-level waste management plans is described below.

In many European countries (e.g., Britain, Finland, the Netherlands, Sweden and Switzerland) the risk or dose limit for a member of the public exposed to radiation from a future high-level nuclear waste facility is considerably more stringent than that suggested by the International Commission on Radiological Protection or proposed in the United States. European limits are often more stringent by a factor of 20 than the standard suggested in 1990 by the International Commission on Radiological Protection, and more stringent by a factor of ten than the standard proposed by the U.S. Environmental Protection Agency (EPA) for the Yucca Mountain nuclear waste repository for the first 10,000 years after closure. Moreover, the U.S. EPA's proposed standard for periods beyond 10,000 years is 250 times more permissive than the European limit.

The countries that have made the most progress towards a repository for high-level radioactive waste have typically started with public consultations and made voluntary siting a necessary condition. This consensus seeking approach is believed to have a greater chance of success than top-down modes of decision making, but the process is necessarily slow, and there is "inadequate experience around the world to know if it will succeed in all existing and aspiring nuclear nations".

Moreover, most communities do not want to host a nuclear waste repository as they are "concerned about their community becoming a de facto site for waste for thousands of years, the health and environmental consequences of an accident, and lower property values".

Asia

China

In China (People's Republic of China), ten reactors provide about 2% of electricity and five more are under construction. China made a commitment to reprocessing in the 1980s; a pilot plant is under construction at Lanzhou, where a temporary spent fuel storage facility has been constructed. Geological disposal has been studied since 1985, and a permanent deep geological repository was required by law in 2003. Sites in Gansu Province near the Gobi desert in northwestern China are under investigation, with a final site expected to be selected by 2020, and actual disposal by about 2050.

Taiwan

In Taiwan (Republic of China), a nuclear waste storage facility was built at the southern tip of Orchid Island in Taitung County, offshore of Taiwan Island. The facility was built in 1982 and is owned and operated by Taipower. It receives nuclear waste from Taipower's three current nuclear power plants. However, due to strong resistance from the local community on the island, the nuclear waste now has to be stored at the power plant facilities themselves.

India

India adopted a closed fuel cycle, which involves reprocessing and recycling of the spent fuel. The reprocessing results in 2–3% of the spent fuel going to waste while the rest is recycled. The waste, called high-level liquid waste, is converted to glass through vitrification. Vitrified waste is then stored for a period of 30–40 years for cooling.

Sixteen nuclear reactors produce about 3% of India's electricity, and seven more are under construction. Spent fuel is processed at facilities in Trombay near Mumbai, at Tarapur on the west coast north of Mumbai, and at Kalpakkam on the southeast coast of India. Plutonium will be used in a fast breeder reactor (under construction) to produce more fuel, and other waste will be vitrified at Tarapur and Trombay. Interim storage for 30 years is expected, with eventual disposal in a deep geological repository in crystalline rock near Kalpakkam.

Japan

In 2000, the Specified Radioactive Waste Final Disposal Act called for the creation of a new organization to manage high-level radioactive waste, and later that year the Nuclear Waste Management Organization of Japan (NUMO) was established under the jurisdiction of the Ministry of Economy, Trade and Industry. NUMO is responsible for selecting a permanent deep geological repository site, and for construction, operation and closure of the facility for waste emplacement by 2040. Site selection began in 2002 and application information was sent to 3,239 municipalities, but by 2006 no local government had volunteered to host the facility. A town in Kōchi Prefecture showed interest in 2007, but its mayor resigned due to local opposition. In December 2013 the government decided to identify suitable candidate areas before approaching municipalities.

The head of the Science Council of Japan's expert panel has said that Japan's seismic conditions make it difficult to predict ground conditions over the necessary 100,000 years, so it will be impossible to convince the public of the safety of deep geological disposal.

Europe

Belgium

Belgium has seven nuclear reactors that provide about 52% of its electricity. Belgian spent nuclear fuel was initially sent for reprocessing in France. In 1993, reprocessing was suspended following a resolution of the Belgian parliament; since then, spent fuel has been stored on the sites of the nuclear power plants. The deep disposal of high-level radioactive waste (HLW) has been studied in Belgium for more than 30 years. Boom Clay is studied as a reference host formation for HLW disposal. The Hades underground research laboratory (URL) is located at a depth of 223 m (732 ft) in the Boom Formation at the Mol site. The Belgian URL is operated by the Euridice Economic Interest Group, a joint organisation between SCK•CEN, the Belgian Nuclear Research Centre, which initiated the research on waste disposal in Belgium in the 1970s and 1980s, and ONDRAF/NIRAS, the Belgian agency for radioactive waste management. In Belgium, the regulatory body in charge of guidance and licensing approval is the Federal Agency for Nuclear Control, created in 2001.

Finland

In 1983, the government decided to select a site for a permanent repository by 2010. With four nuclear reactors providing 29% of its electricity, Finland in 1987 enacted a Nuclear Energy Act making the producers of radioactive waste responsible for its disposal, subject to the requirements of its Radiation and Nuclear Safety Authority and to an absolute veto given to the local government in which a proposed repository would be located. Producers of nuclear waste organized the company Posiva, with responsibility for site selection, construction and operation of a permanent repository. A 1994 amendment to the Act required final disposal of spent fuel in Finland, prohibiting the import or export of radioactive waste.

Environmental assessment of four sites occurred in 1997–98, Posiva chose the Olkiluoto site near two existing reactors, and the local government approved it in 2000. The Finnish Parliament approved a deep geologic repository there in igneous bedrock at a depth of about 500 metres (1,600 ft) in 2001. The repository concept is similar to the Swedish model, with containers to be clad in copper and buried below the water table beginning in 2020. An underground characterization facility, Onkalo spent nuclear fuel repository, was constructed at the site from 2004 to 2017.

France

With 58 nuclear reactors contributing about 75% of its electricity, the highest percentage of any country, France has been reprocessing its spent reactor fuel since the introduction of nuclear power there. Some reprocessed plutonium is used to make fuel, but more is being produced than is being recycled as reactor fuel. France also reprocesses spent fuel for other countries, but the nuclear waste is returned to the country of origin. Radioactive waste from reprocessing French spent fuel is expected to be disposed of in a geological repository, pursuant to legislation enacted in 1991 that established a 15-year period for conducting radioactive waste management research. Under this legislation, partition and transmutation of long-lived elements, immobilization and conditioning processes, and long-term near surface storage are being investigated by the Commissariat à l’Energie Atomique (CEA). Disposal in deep geological formations is being studied by the French agency for radioactive waste management (Agence nationale pour la Gestion des Déchets radioactifs), in underground research labs.

Three sites were identified for possible deep geologic disposal in clay: near the border of Meuse and Haute-Marne, near Gard, and at Vienne. In 1998 the government approved the Meuse/Haute-Marne site, now the Meuse/Haute-Marne Underground Research Laboratory, and dropped the others from further consideration. Legislation was proposed in 2006 to license a repository by 2020, with operations expected in 2035.

Germany

Anti-nuclear protest near nuclear waste disposal centre at Gorleben in northern Germany

Nuclear waste policy in Germany is in flux. German planning for a permanent geologic repository began in 1974 and focused on the Gorleben salt dome, near Gorleben about 100 kilometres (62 mi) northeast of Braunschweig. The site was announced in 1977 with plans for a reprocessing plant, spent fuel management, and permanent disposal facilities at a single site. Plans for the reprocessing plant were dropped in 1979. In 2000, the federal government and utilities agreed to suspend underground investigations for three to ten years, and the government committed to ending its use of nuclear power, closing one reactor in 2003.

Within days of the March 2011 Fukushima Daiichi nuclear disaster, Chancellor Angela Merkel "imposed a three-month moratorium on previously announced extensions for Germany's existing nuclear power plants, while shutting seven of the 17 reactors that had been operating since 1981". Protests continued and, on 29 May 2011, Merkel's government announced that it would close all of its nuclear power plants by 2022.

Meanwhile, electric utilities have been transporting spent fuel to interim storage facilities at Gorleben, Lubmin and Ahaus until temporary storage facilities can be built near reactor sites. Previously, spent fuel was sent to France or the United Kingdom for reprocessing, but this practice was ended in July 2005.

Netherlands

COVRA (Centrale Organisatie Voor Radioactief Afval) is the Dutch interim nuclear waste processing and storage company in Vlissingen, which stores the waste produced in the country's only remaining nuclear power plant after it is reprocessed by Areva NC in La Hague, Manche, Normandy, France. Until the Dutch government decides what to do with the waste, it will stay at COVRA, which currently has a license to operate for one hundred years. As of early 2017, there are no plans for a permanent disposal facility.

Russia

In Russia, the Ministry of Atomic Energy (Minatom) is responsible for 31 nuclear reactors which generate about 16% of its electricity. Minatom is also responsible for reprocessing and radioactive waste disposal, including over 25,000 tonnes (55 million pounds) of spent nuclear fuel in temporary storage in 2001.

Russia has a long history of reprocessing spent fuel for military purposes, and previously planned to reprocess imported spent fuel, possibly including some of the 33,000 tonnes (73 million pounds) of spent fuel accumulated at sites in other countries that received fuel from the U.S., which the U.S. originally pledged to take back, such as Brazil, the Czech Republic, India, Japan, Mexico, Slovenia, South Korea, Switzerland, Taiwan, and the European Union.

An Environmental Protection Act in 1991 prohibited importing radioactive material for long-term storage or burial in Russia, but controversial legislation to allow imports for permanent storage was passed by the Russian Parliament and signed by President Putin in 2001. In the long term, the Russian plan is for deep geologic disposal. Most attention has been paid to locations where waste has accumulated in temporary storage at Mayak, near Chelyabinsk in the Ural Mountains, and in granite at Krasnoyarsk in Siberia.

Spain

Spain has five active nuclear plants with seven reactors which produced 21% of the country's electricity in 2013. Furthermore, there is legacy high-level waste from another two older, closed plants. Between 2004 and 2011, a bipartisan initiative of the Spanish Government promoted the construction of an interim centralized storage facility (ATC, Almacén Temporal Centralizado), similar to the Dutch COVRA concept. In late 2011 and early 2012 the final green light was given, preliminary studies were being completed and land was purchased near Villar de Cañas (Cuenca) after a competitive tender process. The facility would be initially licensed for 60 years.

However, shortly before groundbreaking was slated to begin in 2015, the project was stopped because of a mix of geological, technical, political and ecological problems. By late 2015, the regional government considered it "obsolete" and effectively "paralyzed." As of early 2017, the project had not been shelved but remained frozen, and no further action was expected anytime soon. Meanwhile, the spent nuclear fuel and other high-level waste are being kept in the plants' pools, as well as in on-site dry cask storage (almacenes temporales individualizados) at Garoña and Trillo.

As of early 2017, there are no plans for a permanent high-level disposal facility either. Low- and medium-level waste is stored in the El Cabril facility (Province of Córdoba).

Sweden

In Sweden, as of 2007, there were ten operating nuclear reactors producing about 45% of the country's electricity. Two other reactors in Barsebäck were shut down in 1999 and 2005. When these reactors were built, it was expected that their nuclear fuel would be reprocessed in a foreign country and that the reprocessing waste would not be returned to Sweden. Construction of a domestic reprocessing plant was later contemplated, but such a plant has not been built.

Passage of the Stipulation Act of 1977 transferred responsibility for nuclear waste management from the government to the nuclear industry, requiring reactor operators to present an acceptable plan for waste management with "absolute safety" in order to obtain an operating license. In early 1980, after the Three Mile Island meltdown in the United States, a referendum was held on the future use of nuclear power in Sweden. In late 1980, after the three-question referendum produced mixed results, the Swedish Parliament decided to phase out existing reactors by 2010. On 5 February 2009, the Government of Sweden announced an agreement allowing for the replacement of existing reactors, effectively ending the phase-out policy. In 2010, the Swedish government opened up for the construction of new nuclear reactors. The new units may only be built at the existing nuclear power sites, Oskarshamn, Ringhals or Forsmark, and only to replace one of the existing reactors, which will have to be shut down before the new one can start up.

The Swedish Nuclear Fuel and Waste Management Company (Svensk Kärnbränslehantering AB, known as SKB) was created in 1980 and is responsible for the final disposal of nuclear waste there. This includes operation of a monitored retrievable storage facility, the Central Interim Storage Facility for Spent Nuclear Fuel at Oskarshamn, about 240 kilometres (150 mi) south of Stockholm on the Baltic coast; transportation of spent fuel; and construction of a permanent repository. Swedish utilities store spent fuel at the reactor site for one year before transporting it to the facility at Oskarshamn, where it will be stored in excavated caverns filled with water for about 30 years before removal to a permanent repository.

Conceptual design of a permanent repository was determined by 1983, calling for placement of copper-clad iron canisters in granite bedrock about 500 metres (1,600 ft) underground, below the water table, in what is known as the KBS-3 method. Space around the canisters will be filled with bentonite clay. After examining six possible locations for a permanent repository, three were nominated for further investigation, at Östhammar, Oskarshamn and Tierp. On 3 June 2009, Swedish Nuclear Fuel and Waste Co. chose a location for a deep-level waste site at Östhammar, near the Forsmark nuclear power plant. The application to build the repository was handed in by SKB in 2011 and was approved by the Swedish Government on 27 January 2022.

Switzerland

Switzerland has five nuclear reactors, which provided about 43% of its electricity around 2007 (34% in 2015). Some Swiss spent nuclear fuel has been sent for reprocessing in France and the United Kingdom; most fuel is being stored without reprocessing. An industry-owned organization, ZWILAG, built and operates a central interim storage facility for spent nuclear fuel and high-level radioactive waste, for conditioning low-level radioactive waste and for incinerating wastes. Other interim storage facilities predating ZWILAG continue to operate in Switzerland.

The Swiss program is considering options for the siting of a deep repository for high-level radioactive waste disposal, and for low- and intermediate-level wastes. Construction of a repository is not foreseen until well into this century. Research on sedimentary rock (especially Opalinus Clay) is carried out at the Swiss Mont Terri rock laboratory; the Grimsel Test Site, an older facility in crystalline rock, is also still active.

United Kingdom

Great Britain has 19 operating reactors, producing about 20% of its electricity. It processes much of its spent fuel at Sellafield on the northwest coast across from Ireland, where nuclear waste is vitrified and sealed in stainless steel canisters for dry storage above ground for at least 50 years before eventual deep geologic disposal. Sellafield has a history of environmental and safety problems, including a fire in a nuclear plant in Windscale, and a significant incident in 2005 at the main reprocessing plant (THORP).

In 1982 the Nuclear Industry Radioactive Waste Management Executive (NIREX) was established with responsibility for disposing of long-lived nuclear waste, and in 2006 a Committee on Radioactive Waste Management (CoRWM) of the Department for Environment, Food and Rural Affairs recommended geologic disposal 200–1,000 metres (660–3,280 ft) underground. NIREX developed a generic repository concept based on the Swedish model but has not yet selected a site. A Nuclear Decommissioning Authority is responsible for packaging waste from reprocessing and will eventually relieve British Nuclear Fuels Ltd of responsibility for power reactors and the Sellafield reprocessing plant.

North America

Canada

The 18 operating nuclear power plants in Canada generated about 16% of its electricity in 2006. A national Nuclear Fuel Waste Act was enacted by the Canadian Parliament in 2002, requiring nuclear energy corporations to create a waste management organization to propose to the Government of Canada approaches for management of nuclear waste, and implementation of an approach subsequently selected by the government. The Act defined management as "long term management by means of storage or disposal, including handling, treatment, conditioning or transport for the purpose of storage or disposal."

The resulting Nuclear Waste Management Organization (NWMO) conducted an extensive three-year study and consultation with Canadians. In 2005, they recommended Adaptive Phased Management, an approach that emphasized both technical and management methods. The technical method included centralized isolation and containment of spent nuclear fuel in a deep geologic repository in a suitable rock formation, such as the granite of the Canadian Shield or Ordovician sedimentary rocks. Also recommended was a phased decision-making process supported by a program of continuous learning, research and development.

In 2007, the Canadian government accepted this recommendation, and NWMO was tasked with implementing the recommendation. No specific timeframe was defined for the process. In 2009, the NWMO was designing the process for site selection; siting was expected to take 10 years or more.

United States

The locations across the U.S. where nuclear waste is stored

The Nuclear Waste Policy Act of 1982 established a timetable and procedure for constructing a permanent, underground repository for high-level radioactive waste by the mid-1990s, and provided for some temporary storage of waste, including spent fuel from 104 civilian nuclear reactors that produce about 19.4% of electricity there. The United States in April 2008 had about 56,000 tonnes (120 million pounds) of spent fuel and 20,000 canisters of solid defense-related waste, and this is expected to increase to 119,000 tonnes (260 million pounds) by 2035. The U.S. opted for a final repository at Yucca Mountain in Nevada, but this project was widely opposed, with some of the main concerns being long-distance transportation of waste from across the United States to this site, the possibility of accidents, and the uncertainty of success in isolating nuclear waste from the human environment in perpetuity. Yucca Mountain, with capacity for 70,000 tonnes (150 million pounds) of radioactive waste, was expected to open in 2017. However, the Obama Administration rejected use of the site in its 2009 federal budget proposal, which eliminated all funding except that needed to answer inquiries from the Nuclear Regulatory Commission, "while the Administration devises a new strategy toward nuclear waste disposal." On March 5, 2009, Energy Secretary Steven Chu told a Senate hearing that "the Yucca Mountain site no longer was viewed as an option for storing reactor waste." Since 1999, military-generated nuclear waste has been entombed at the Waste Isolation Pilot Plant in New Mexico.

Since the fraction of a radioisotope's atoms decaying per unit of time is inversely proportional to its half-life, the relative radioactivity of a quantity of buried human radioactive waste would diminish over time compared with that of natural radioisotopes, such as the decay chains of the 120 million megatonnes (260 quadrillion pounds) of thorium and 40 million megatonnes (88 quadrillion pounds) of uranium present at relatively trace concentrations of parts per million each within the crust's 30,000 quadrillion tonnes (66,000,000 quadrillion pounds) of mass. For instance, over a timeframe of thousands of years, after the most active short-half-life radioisotopes had decayed, burying U.S. nuclear waste would increase the radioactivity in the top 610 metres (2,000 ft) of rock and soil in the United States (10 million square kilometres, 3.9 million square miles) by about 1 part in 10 million over the cumulative amount of natural radioisotopes in such a volume, although the vicinity of the site would have a far higher concentration of artificial radioisotopes underground than this average.
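The inverse relation between half-life and activity per unit mass can be made concrete with a short calculation; the sketch below uses commonly published half-life and molar-mass values for one long-lived natural isotope and one short-lived fission product, and is an illustration rather than an analysis of the inventory described above.

```python
import math

AVOGADRO = 6.022e23                    # atoms per mole
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def specific_activity_bq_per_g(half_life_years, molar_mass_g):
    """Decays per second per gram: (ln 2 / t_half) * (N_A / M)."""
    lam = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    return lam * AVOGADRO / molar_mass_g

# Long-lived natural isotope vs. short-lived fission product.
u238 = specific_activity_bq_per_g(4.468e9, 238)   # ~1.2e4 Bq/g
cs137 = specific_activity_bq_per_g(30.17, 137)    # ~3.2e12 Bq/g

print(f"U-238:  {u238:.2e} Bq/g")
print(f"Cs-137: {cs137:.2e} Bq/g")
```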

In a Presidential Memorandum dated January 29, 2010, President Obama established the Blue Ribbon Commission on America’s Nuclear Future (the commission). The commission, composed of fifteen members, conducted an extensive two-year study of nuclear waste disposal, what is referred to as the "back end" of the nuclear energy process. The commission established three subcommittees: Reactor and Fuel Cycle Technology, Transportation and Storage, and Disposal. On January 26, 2012, the Commission submitted its final report to Energy Secretary Steven Chu. In the Disposal Subcommittee’s final report, the Commission does not issue recommendations for a specific site but rather presents a comprehensive recommendation for disposal strategies. During their research, the Commission visited Finland, France, Japan, Russia, Sweden, and the UK. In their final report, the Commission put forth seven recommendations for developing a comprehensive strategy to pursue:

Recommendation #1
The United States should undertake an integrated nuclear waste management program that leads to the timely development of one or more permanent deep geological facilities for the safe disposal of spent fuel and high-level nuclear waste.
Recommendation #2
A new, single-purpose organization is needed to develop and implement a focused, integrated program for the transportation, storage, and disposal of nuclear waste in the United States.
Recommendation #3
Assured access to the balance in the Nuclear Waste Fund (NWF) and to the revenues generated by annual nuclear waste fee payments from utility ratepayers is absolutely essential and must be provided to the new nuclear waste management organization.
Recommendation #4
A new approach is needed to site and develop nuclear waste facilities in the United States in the future. We believe that these processes are most likely to succeed if they are:
  • Adaptive—in the sense that the process itself is flexible and produces decisions that are responsive to new information and new technical, social, or political developments.
  • Staged—in the sense that key decisions are revisited and modified as necessary along the way rather than being pre-determined in advance.
  • Consent-based—in the sense that affected communities have an opportunity to decide whether to accept facility siting decisions and retain significant local control.
  • Transparent—in the sense that all stakeholders have an opportunity to understand key decisions and engage in the process in a meaningful way.
  • Standards- and science-based—in the sense that the public can have confidence that all facilities meet rigorous, objective, and consistently-applied standards of safety and environmental protection.
  • Governed by partnership arrangements or legally-enforceable agreements with host states, tribes and local communities.
Recommendation #5
The current division of regulatory responsibilities for long-term repository performance between the NRC and the EPA is appropriate and should continue. The two agencies should develop new, site-independent safety standards in a formally coordinated joint process that actively engages and solicits input from all the relevant constituencies.
Recommendation #6
The roles, responsibilities, and authorities of local, state, and tribal governments (with respect to facility siting and other aspects of nuclear waste disposal) must be an element of the negotiation between the federal government and the other affected units of government in establishing a disposal facility. In addition to legally-binding agreements, as discussed in Recommendation #4, all affected levels of government (local, state, tribal, etc.) must have, at a minimum, a meaningful consultative role in all other important decisions. Additionally, states and tribes should retain—or where appropriate, be delegated—direct authority over aspects of regulation, permitting, and operations where oversight below the federal level can be exercised effectively and in a way that is helpful in protecting the interests and gaining the confidence of affected communities and citizens.[96]
Recommendation #7
The Nuclear Waste Technical Review Board (NWTRB) should be retained as a valuable source of independent technical advice and review.

The Biden administration has recommended categorizing waste by its level of radioactivity rather than by its source, which would enable new management plans.

International repository

Although Australia does not have any nuclear power reactors, Pangea Resources considered siting an international repository in the outback of South Australia or Western Australia in 1998, but this stimulated legislative opposition in both states and the Australian national Senate during the following year. Thereafter, Pangea ceased operations in Australia but reemerged as Pangea International Association, and in 2002 evolved into the Association for Regional and International Underground Storage with support from Belgium, Bulgaria, Hungary, Japan and Switzerland. A general concept for an international repository has been advanced by one of the principals in all three ventures. Russia has expressed interest in serving as a repository for other countries, but does not envision sponsorship or control by an international body or group of other countries. South Africa, Argentina and western China have also been mentioned as possible locations.

In the EU, COVRA is negotiating a Europe-wide waste disposal system with shared disposal sites that could be used by several EU countries. This EU-wide storage possibility is being researched under the SAPIERR-2 program.

Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...