
Thursday, September 6, 2018

Complementarity (physics)

From Wikipedia, the free encyclopedia

In physics, complementarity is both a theoretical and an experimental result of quantum mechanics, also referred to as the principle of complementarity. It holds that objects have certain pairs of complementary properties which cannot all be observed or measured simultaneously.

The complementarity principle was formulated by Niels Bohr, a leading founder of quantum mechanics. Examples of complementary properties that Bohr considered include position and momentum, energy and duration of a measurement, and the wave and particle aspects of a single phenomenon.

Concept

For example, the particle and wave aspects of physical objects are such complementary phenomena. Both concepts are borrowed from classical mechanics, where an object cannot be both a particle and a wave at the same time. Therefore, it is impossible to measure the full wave and particle properties at a particular moment. Moreover, Bohr implied that it is not possible to regard objects governed by quantum mechanics as having intrinsic properties independent of determination with a measuring device, a viewpoint supported by the Kochen–Specker theorem. The type of measurement determines which property is shown. However, single- and double-slit experiments, among others, show that some effects of wave and particle can be measured in one measurement.

Nature

An aspect of complementarity is that it not only applies to measurability or knowability of some property of a physical entity, but more importantly it applies to the limitations of that physical entity's very manifestation of the property in the physical world. All properties of physical entities exist only in pairs, which Bohr described as complementary or conjugate pairs. Physical reality is determined and defined by manifestations of properties which are limited by trade-offs between these complementary pairs. For example, an electron can manifest a greater and greater accuracy of its position only in trade for a complementary loss in accuracy of manifesting its momentum. This means that there is a limitation on the precision with which an electron can possess (i.e., manifest) position, since an infinitely precise position would dictate that its manifested momentum would be infinitely imprecise, or undefined (i.e., non-manifest or not possessed), which is not possible. The ultimate limitations in precision of property manifestations are quantified by the Heisenberg uncertainty principle and Planck units. Complementarity and uncertainty therefore dictate that all properties and actions in the physical world manifest themselves non-deterministically to some degree.
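
The scale of this trade-off is set by the uncertainty relation, which bounds the product of the standard deviations of position and momentum:

    Δx Δp ≥ ħ/2

As an illustrative worked example (the numbers are chosen for convenience, not taken from any particular experiment): an electron confined to Δx = 10⁻¹⁰ m, about one atomic diameter, must carry a momentum spread of at least Δp = ħ/(2Δx) ≈ (1.055 × 10⁻³⁴ J·s)/(2 × 10⁻¹⁰ m) ≈ 5 × 10⁻²⁵ kg·m/s; squeezing Δx further necessarily widens Δp.
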
Physicists F.A.M. Frescura and Basil Hiley have summarized the reasons for the introduction of the principle of complementarity in physics as follows:
“In the traditional view, it is assumed that there exists a reality in space-time and that this reality is a given thing, all of whose aspects can be viewed or articulated at any given moment. Bohr was the first to point out that quantum mechanics called this traditional outlook into question. To him the ‘indivisibility of the quantum of action’, which was his way of describing the uncertainty principle, implied that not all aspects of a system can be viewed simultaneously. By using one particular piece of apparatus only certain features could be made manifest at the expense of others, while with a different piece of apparatus another complementary aspect could be made manifest in such a way that the original set became non-manifest, that is, the original attributes were no longer well defined. For Bohr, this was an indication that the principle of complementarity, a principle that he had previously known to appear extensively in other intellectual disciplines but which did not appear in classical physics, should be adopted as a universal principle.”
The emergence of complementarity in a system occurs when one considers the circumstances under which one attempts to measure its properties; as Bohr noted, the principle of complementarity "implies the impossibility of any sharp separation between the behaviour of atomic objects and the interaction with the measuring instruments that serve to define the conditions under which the phenomena appear." It is important to distinguish, as did Bohr in his original statements, the principle of complementarity from a statement of the uncertainty principle. For a technical discussion of contemporary issues surrounding complementarity in physics see, e.g., Bandyopadhyay (2000), from which parts of this discussion were drawn.

Additional considerations

In his original lecture on the topic, Bohr pointed out that just as the finitude of the speed of light implies the impossibility of a sharp separation between space and time (relativity), the finitude of the quantum of action implies the impossibility of a sharp separation between the behavior of a system and its interaction with the measuring instruments and leads to the well known difficulties with the concept of 'state' in quantum theory; the notion of complementarity is intended to symbolize this new situation in epistemology created by quantum theory. Some people consider it a philosophical adjunct to quantum mechanics, while others consider it to be a discovery that is as important as the formal aspects of quantum theory. Examples of the latter include Leon Rosenfeld, who claimed that "[C]omplementarity is not a philosophical superstructure invented by Bohr to be placed as a decoration on top of the quantal formalism, it is the bedrock of the quantal description.", and John Wheeler, who opined that "Bohr's principle of complementarity is the most revolutionary scientific concept of this century and the heart of his fifty-year search for the full significance of the quantum idea."

Experiments

The quintessential example of wave–particle complementarity in the laboratory is the double slit. The crux of the complementary behavior is the question: "What information exists – embedded in the constituents of the universe – that can reveal the history of the signal particles as they pass through the double slit?" If information exists (even if it is not measured by a conscious observer) that reveals "which slit" each particle traversed, then each particle will exhibit no wave interference with the other slit. This is the particle-like behavior. But if no information exists about which slit – so that no conscious observer, no matter how well equipped, will ever be able to determine which slit each particle traverses – then the signal particles will interfere with themselves as if they traveled through both slits at the same time, as a wave. This is the wave-like behavior. These behaviors are complementary, according to the Englert–Greenberger duality relation, because when one behavior is observed the other is absent. Both behaviors can be observed at the same time, but each only as a lesser manifestation of its full behavior (as determined by the duality relation). This superposition of complementary behaviors exists whenever there is partial "which slit" information. While there is some contention about the duality relation, and thus complementarity itself, the contrary position is not accepted by mainstream physics. Double-slit experiments with single photons show both aspects in a single run: each photon is detected at a point on the screen, yet the wave aspect becomes clearly visible once enough detection points have accumulated.
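
The Englert–Greenberger relation makes this trade-off quantitative. If D denotes the distinguishability of the two paths (how much "which slit" information exists) and V the visibility of the interference fringes, then

    D² + V² ≤ 1

so D = 1 forces V = 0 (pure particle behavior), V = 1 requires D = 0 (pure wave behavior), and intermediate values describe the partial behaviors discussed above.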

Various neutron interferometry experiments demonstrate the subtlety of the notions of duality and complementarity. By passing through the interferometer, the neutron appears to act as a wave. Yet upon passage, the neutron is subject to gravitation. As the neutron interferometer is rotated through Earth's gravitational field a phase change between the two arms of the interferometer can be observed, accompanied by a change in the constructive and destructive interference of the neutron waves on exit from the interferometer. Some interpretations claim that understanding the interference effect requires one to concede that a single neutron takes both paths through the interferometer at the same time; a single neutron would "be in two places at once", as it were. Since the two paths through a neutron interferometer can be as far as 5 cm to 15 cm apart, the effect is hardly microscopic. This is similar to traditional double-slit and mirror interferometer experiments where the slits (or mirrors) can be arbitrarily far apart. So, in interference and diffraction experiments, neutrons behave the same way as photons (or electrons) of corresponding wavelength.
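
To get a feel for the size of this gravitational effect, here is a minimal Python sketch, assuming the standard Colella–Overhauser–Werner expression Δφ = 2π m² g A λ sin(α) / h², where A is the area enclosed by the two paths; all numbers below are illustrative, not the historical parameters:

    import math

    h = 6.626e-34          # Planck's constant, J·s
    m = 1.675e-27          # neutron mass, kg
    g = 9.81               # gravitational acceleration, m/s^2
    A = 1e-3               # area enclosed by the two paths, m^2 (illustrative)
    lam = 2e-10            # neutron de Broglie wavelength, m (illustrative)
    alpha = math.pi / 2    # tilt of the interferometer plane from horizontal

    # Gravitationally induced phase difference between the two arms
    dphi = 2 * math.pi * m**2 * g * A * lam * math.sin(alpha) / h**2
    print(f"phase shift: {dphi:.0f} rad")   # tens of radians for these values

A phase of tens of radians sweeps through many interference fringes as the apparatus is rotated, which is why the effect is readily observable despite involving single neutrons.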

History

Niels Bohr apparently conceived of the principle of complementarity during a skiing vacation in Norway in February and March 1927, during which he received a letter from Werner Heisenberg regarding the latter's newly discovered (and not yet published) uncertainty principle. Upon returning from his vacation, by which time Heisenberg had already submitted his paper on the uncertainty principle for publication, Bohr convinced Heisenberg that the uncertainty principle was a manifestation of the deeper concept of complementarity. Heisenberg duly appended a note to this effect to his paper, before its publication, stating:
Bohr has brought to my attention [that] the uncertainty in our observation does not arise exclusively from the occurrence of discontinuities, but is tied directly to the demand that we ascribe equal validity to the quite different experiments which show up in the [particulate] theory on one hand, and in the wave theory on the other hand.
Bohr publicly introduced the principle of complementarity in a lecture he delivered on 16 September 1927 at the International Physics Congress held in Como, Italy, attended by most of the leading physicists of the era, with the notable exceptions of Einstein, Schrödinger, and Dirac. However, these three were in attendance one month later when Bohr again presented the principle at the Fifth Solvay Congress in Brussels, Belgium. The lecture was published in the proceedings of both of these conferences, and was republished the following year in Naturwissenschaften (in German) and in Nature (in English).

An article written by Bohr in 1949 titled "Discussions with Einstein on Epistemological Problems in Atomic Physics" is considered by many to be a definitive description of the notion of complementarity.

Wave–particle duality

From Wikipedia, the free encyclopedia

Wave–particle duality is the concept in quantum mechanics that every particle or quantic entity may be partly described in terms not only of particles, but also of waves. It expresses the inability of the classical concepts "particle" or "wave" to fully describe the behavior of quantum-scale objects. As Albert Einstein wrote:
It seems as though we must use sometimes the one theory and sometimes the other, while at times we may use either. We are faced with a new kind of difficulty. We have two contradictory pictures of reality; separately neither of them fully explains the phenomena of light, but together they do.
Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr and many others, current scientific theory holds that all particles exhibit a wave nature and vice versa. This phenomenon has been verified not only for elementary particles, but also for compound particles like atoms and even molecules. For macroscopic particles, because of their extremely short wavelengths, wave properties usually cannot be detected.

Although the use of the wave-particle duality has worked well in physics, the meaning or interpretation has not been satisfactorily resolved; see Interpretations of quantum mechanics.
Bohr regarded the "duality paradox" as a fundamental or metaphysical fact of nature. A given kind of quantum object will exhibit sometimes wave, sometimes particle, character, in respectively different physical settings. He saw such duality as one aspect of the concept of complementarity. Bohr regarded renunciation of the cause-effect relation, or complementarity, of the space-time picture, as essential to the quantum mechanical account.

Werner Heisenberg considered the question further. He saw the duality as present for all quantic entities, but not quite in the usual quantum mechanical account considered by Bohr. He saw it in what is called second quantization, which generates an entirely new concept of fields which exist in ordinary space-time, causality still being visualizable. Classical field values (e.g. the electric and magnetic field strengths of Maxwell) are replaced by an entirely new kind of field value, as considered in quantum field theory. Turning the reasoning around, ordinary quantum mechanics can be deduced as a specialized consequence of quantum field theory.

Brief history of wave and particle viewpoints

Democritus—the original atomist—argued that all things in the universe, including light, are composed of indivisible sub-components (light being some form of solar atom). At the beginning of the 11th century, the Arabic scientist Alhazen wrote the first comprehensive treatise on optics, describing refraction, reflection, and the operation of a pinhole lens via rays of light traveling from the point of emission to the eye. He asserted that these rays were composed of particles of light. In 1630, René Descartes popularized and accredited the opposing wave description in his treatise on light, showing that the behavior of light could be re-created by modeling wave-like disturbances in a universal medium ("plenum"). Beginning in 1670 and progressing over three decades, Isaac Newton developed and championed his corpuscular hypothesis, arguing that the perfectly straight lines of reflection demonstrated light's particle nature; only particles could travel in such straight lines. He explained refraction by positing that particles of light accelerated laterally upon entering a denser medium. Around the same time, Newton's contemporaries Robert Hooke and Christiaan Huygens—and later Augustin-Jean Fresnel—mathematically refined the wave viewpoint, showing that if light traveled at different speeds in different media (such as water and air), refraction could be easily explained as the medium-dependent propagation of light waves. The resulting Huygens–Fresnel principle was extremely successful at reproducing light's behavior and was subsequently supported by Thomas Young's 1801 discovery of double-slit interference. The wave view did not immediately displace the ray and particle view, but began to dominate scientific thinking about light in the mid-19th century, since it could explain polarization phenomena that the alternatives could not.

Thomas Young's sketch of two-slit diffraction of waves, 1803

James Clerk Maxwell discovered that he could apply his previously discovered equations of electromagnetism, with a slight modification, to describe self-propagating waves of oscillating electric and magnetic fields. It quickly became apparent that visible light, ultraviolet light, and infrared light (phenomena thought previously to be unrelated) were all electromagnetic waves of differing frequency. The wave theory had prevailed—or at least it seemed to.

While the 19th century had seen the success of the wave theory at describing light, it had also witnessed the rise of the atomic theory at describing matter. Antoine Lavoisier deduced the law of conservation of mass and categorized many new chemical elements and compounds; and Joseph Louis Proust advanced chemistry towards the atom by showing that elements combined in definite proportions. This led John Dalton to propose that elements were made of indivisible subcomponents; Amedeo Avogadro discovered diatomic gases and completed the basic atomic theory, allowing the correct molecular formulae of most known compounds—as well as the correct weights of atoms—to be deduced and categorized in a consistent manner. Dmitri Mendeleev saw an order in recurring chemical properties, and created a table presenting the elements in unprecedented order and symmetry.

Animation: wave–particle duality in a double-slit experiment, with and without an observer. A quantum particle is represented by a wave packet; the pattern on the screen arises from interference of the particle with itself.

Turn of the 20th century and the paradigm shift

Particles of electricity

At the close of the 19th century, the reductionism of atomic theory began to advance into the atom itself; determining, through physics, the nature of the atom and the operation of chemical reactions. Electricity, first thought to be a fluid, was now understood to consist of particles called electrons. This was first demonstrated by J. J. Thomson in 1897 when, using a cathode ray tube, he found that an electrical charge would travel across a vacuum (which would possess infinite resistance in classical theory). Since the vacuum offered no medium for an electric fluid to travel, this discovery could only be explained via a particle carrying a negative charge and moving through the vacuum. This electron flew in the face of classical electrodynamics, which had successfully treated electricity as a fluid for many years (leading to the invention of batteries, electric motors, dynamos, and arc lamps). More importantly, the intimate relation between electric charge and electromagnetism had been well documented following the discoveries of Michael Faraday and James Clerk Maxwell. Since electromagnetism was known to be a wave generated by a changing electric or magnetic field (a continuous, wave-like entity itself), an atomic/particle description of electricity and charge was a non sequitur. Furthermore, classical electrodynamics was not the only classical theory rendered incomplete.

Radiation quantization

In 1901, Max Planck published an analysis that succeeded in reproducing the observed spectrum of light emitted by a glowing object. To accomplish this, Planck had to make an ad hoc mathematical assumption of quantized energy of the oscillators (atoms of the black body) that emit radiation. Einstein later proposed that electromagnetic radiation itself is quantized, not the energy of radiating atoms.

Black-body radiation, the emission of electromagnetic energy due to an object's heat, could not be explained from classical arguments alone. The equipartition theorem of classical mechanics, the basis of all classical thermodynamic theories, stated that an object's energy is partitioned equally among the object's vibrational modes. But applying the same reasoning to the electromagnetic emission of such a thermal object was not so successful. That thermal objects emit light had been long known. Since light was known to be waves of electromagnetism, physicists hoped to describe this emission via classical laws. This became known as the black body problem. Since the equipartition theorem worked so well in describing the vibrational modes of the thermal object itself, it was natural to assume that it would perform equally well in describing the radiative emission of such objects. But a problem quickly arose: if each mode received an equal partition of energy, the short wavelength modes would consume all the energy. This became clear when plotting the Rayleigh–Jeans law which, while correctly predicting the intensity of long wavelength emissions, predicted infinite total energy as the intensity diverges to infinity for short wavelengths. This became known as the ultraviolet catastrophe.

In 1900, Max Planck hypothesized that the frequency of light emitted by the black body depended on the frequency of the oscillator that emitted it, and the energy of these oscillators increased linearly with frequency (according to his constant h, where E = hν). This was not an unsound proposal considering that macroscopic oscillators operate similarly: when studying five simple harmonic oscillators of equal amplitude but different frequency, the oscillator with the highest frequency possesses the highest energy (though this relationship is not linear like Planck's). By demanding that high-frequency light must be emitted by an oscillator of equal frequency, and further requiring that this oscillator occupy higher energy than one of a lesser frequency, Planck avoided any catastrophe; giving an equal partition to high-frequency oscillators produced successively fewer oscillators and less emitted light. And as in the Maxwell–Boltzmann distribution, the low-frequency, low-energy oscillators were suppressed by the onslaught of thermal jiggling from higher energy oscillators, which necessarily increased their energy and frequency.
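
The contrast with the classical result can be made explicit. The Rayleigh–Jeans spectral radiance grows without bound at high frequency, while Planck's law is cut off by an exponential factor:

    B_RJ(ν, T) = 2ν²kT / c²                      (diverges as ν → ∞)
    B_Planck(ν, T) = (2hν³/c²) · 1/(e^(hν/kT) − 1)

For hν ≪ kT the two expressions agree, but for hν ≫ kT the exponential suppresses the Planck form, which is what removes the ultraviolet catastrophe.
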

The most revolutionary aspect of Planck's treatment of the black body is that it inherently relies on an integer number of oscillators in thermal equilibrium with the electromagnetic field. These oscillators give their entire energy to the electromagnetic field, creating a quantum of light, as often as they are excited by the electromagnetic field, absorbing a quantum of light and beginning to oscillate at the corresponding frequency. Planck had intentionally created an atomic theory of the black body, but had unintentionally generated an atomic theory of light, where the black body never generates quanta of light at a given frequency with an energy less than hν. However, once realizing that he had quantized the electromagnetic field, he denounced particles of light as a limitation of his approximation, not a property of reality.

Photoelectric effect illuminated

While Planck had solved the ultraviolet catastrophe by using atoms and a quantized electromagnetic field, most contemporary physicists agreed that Planck's "light quanta" represented only flaws in his model. A more complete derivation of black-body radiation would, they expected, yield a fully continuous and 'wave-like' electromagnetic field with no quantization. However, in 1905 Albert Einstein took Planck's black-body model to produce his solution to another outstanding problem of the day: the photoelectric effect, wherein electrons are emitted from atoms when they absorb energy from light. Since the electron's existence had been theorized eight years previously, the phenomenon had been studied with the electron model in mind in physics laboratories worldwide.

In 1902 Philipp Lenard discovered that the energy of these ejected electrons did not depend on the intensity of the incoming light, but instead on its frequency. So if one shines a little low-frequency light upon a metal, a few low-energy electrons are ejected. If one now shines a very intense beam of low-frequency light upon the same metal, a whole slew of electrons are ejected; however, they possess the same low energy; there are merely more of them. The more light there is, the more electrons are ejected. To obtain high-energy electrons, one must instead illuminate the metal with high-frequency light. Like black-body radiation, this was at odds with a theory invoking continuous transfer of energy between radiation and matter. However, it can still be explained using a fully classical description of light, as long as matter is quantum mechanical in nature.

If one used Planck's energy quanta, and demanded that electromagnetic radiation at a given frequency could only transfer energy to matter in integer multiples of an energy quantum hν, then the photoelectric effect could be explained very simply. Low-frequency light only ejects low-energy electrons because each electron is excited by the absorption of a single photon. Increasing the intensity of the low-frequency light (increasing the number of photons) only increases the number of excited electrons, not their energy, because the energy of each photon remains low. Only by increasing the frequency of the light, and thus increasing the energy of the photons, can one eject electrons with higher energy. Thus, using Planck's constant h to determine the energy of the photons based upon their frequency, the energy of ejected electrons should also increase linearly with frequency; the gradient of the line being Planck's constant. These results were not confirmed until 1915, when Robert Andrews Millikan, who had previously determined the charge of the electron, produced experimental results in perfect accord with Einstein's predictions. While the energy of ejected electrons reflected Planck's constant, the existence of photons was not explicitly proven until the discovery of the photon antibunching effect, of which a modern experiment can be performed in undergraduate-level labs.[13] This phenomenon could only be explained via photons, and not through any semi-classical theory (which could alternatively explain the photoelectric effect). When Einstein received his Nobel Prize in 1921, it was not for his more difficult and mathematically laborious special and general relativity, but for the simple, yet totally revolutionary, suggestion of quantized light. Einstein's "light quanta" would not be called photons until 1925, but even in 1905 they represented the quintessential example of wave-particle duality. Electromagnetic radiation propagates following linear wave equations, but can only be emitted or absorbed as discrete elements, thus acting as a wave and a particle simultaneously.

Einstein's explanation of the photoelectric effect

The photoelectric effect. Incoming photons on the left strike a metal plate (bottom), and eject electrons, depicted as flying off to the right.

In 1905, Albert Einstein provided an explanation of the photoelectric effect, a hitherto troubling experiment that the wave theory of light seemed incapable of explaining. He did so by postulating the existence of photons, quanta of light energy with particulate qualities.

In the photoelectric effect, it was observed that shining a light on certain metals would lead to an electric current in a circuit. Presumably, the light was knocking electrons out of the metal, causing current to flow. However, using the case of potassium as an example, it was also observed that while a dim blue light was enough to cause a current, even the strongest, brightest red light available with the technology of the time caused no current at all. According to the classical theory of light and matter, the strength or amplitude of a light wave was in proportion to its brightness: a bright light should have been easily strong enough to create a large current. Yet, oddly, this was not so.
Einstein explained this enigma by postulating that the electrons can receive energy from the electromagnetic field only in discrete portions (quanta that were called photons): an amount of energy E that was related to the frequency f of the light by

    E = hf

where h is Planck's constant (6.626 × 10⁻³⁴ J·s). Only photons of a high enough frequency (above a certain threshold value) could knock an electron free. For example, photons of blue light had sufficient energy to free an electron from the metal, but photons of red light did not. One photon of light above the threshold frequency could release only one electron; the higher the frequency of a photon, the higher the kinetic energy of the emitted electron, but no amount of light (using technology available at the time) below the threshold frequency could release an electron. To "violate" this law would require extremely high-intensity lasers, which had not yet been invented. Intensity-dependent phenomena have now been studied in detail with such lasers.
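
A minimal Python sketch of this threshold behavior, using an approximate work function for potassium; the numbers are illustrative, and the point is that frequency, not intensity, sets the electron energy:

    h = 6.626e-34      # Planck's constant, J·s
    e = 1.602e-19      # joules per electronvolt

    def max_kinetic_energy_eV(frequency_hz, work_function_eV):
        """Einstein's relation: K_max = h*f - phi, zero below threshold."""
        k = h * frequency_hz / e - work_function_eV
        return max(k, 0.0)

    phi_potassium = 2.3                                 # eV, approximate
    blue = 6.7e14                                       # Hz (~450 nm)
    red = 4.3e14                                        # Hz (~700 nm)
    print(max_kinetic_energy_eV(blue, phi_potassium))   # ~0.47 eV: electrons ejected
    print(max_kinetic_energy_eV(red, phi_potassium))    # 0.0 eV: below threshold

However bright the red light is made, each of its photons still falls below the threshold, matching the observation described above.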

Einstein was awarded the Nobel Prize in Physics in 1921 for his discovery of the law of the photoelectric effect.

De Broglie's wavelength

Propagation of de Broglie waves in 1d—real part of the complex amplitude is blue, imaginary part is green. The probability (shown as the colour opacity) of finding the particle at a given point x is spread out like a waveform; there is no definite position of the particle. As the amplitude increases above zero the curvature decreases, so the amplitude decreases again, and vice versa—the result is an alternating amplitude: a wave. Top: Plane wave. Bottom: Wave packet.

In 1924, Louis-Victor de Broglie formulated the de Broglie hypothesis, claiming that all matter, not just light, has a wave-like nature; he related wavelength (denoted as λ) and momentum (denoted as p):

    λ = h/p

This is a generalization of Einstein's equation above, since the momentum of a photon is given by p = E/c and the wavelength (in a vacuum) by λ = c/f, where c is the speed of light in vacuum.
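
A short Python sketch of the relation for massive particles (non-relativistic, with illustrative numbers), which also shows why macroscopic objects have undetectably small wavelengths:

    h = 6.626e-34          # Planck's constant, J·s

    def de_broglie_wavelength(mass_kg, speed_m_s):
        """lambda = h / p for a non-relativistic massive particle."""
        return h / (mass_kg * speed_m_s)

    m_electron = 9.109e-31                         # kg
    print(de_broglie_wavelength(m_electron, 1e6))  # ~7.3e-10 m: atomic scale
    m_ball = 0.145                                 # kg, a baseball
    print(de_broglie_wavelength(m_ball, 40.0))     # ~1.1e-34 m: unobservably small
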

De Broglie's formula was confirmed three years later for electrons (which differ from photons in having a rest mass) with the observation of electron diffraction in two independent experiments. At the University of Aberdeen, George Paget Thomson passed a beam of electrons through a thin metal film and observed the predicted interference patterns. At Bell Labs, Clinton Joseph Davisson and Lester Halbert Germer guided their beam through a crystalline grid.
 
De Broglie was awarded the Nobel Prize for Physics in 1929 for his hypothesis. Thomson and Davisson shared the Nobel Prize for Physics in 1937 for their experimental work.

Heisenberg's uncertainty principle

In his work on formulating quantum mechanics, Werner Heisenberg postulated his uncertainty principle, which states:
    Δx Δp ≥ ħ/2

where

Δ here indicates standard deviation, a measure of spread or uncertainty;
x and p are a particle's position and linear momentum respectively;
ħ is the reduced Planck constant (Planck's constant divided by 2π).
Heisenberg originally explained this as a consequence of the process of measuring: Measuring position accurately would disturb momentum and vice versa, offering an example (the "gamma-ray microscope") that depended crucially on the de Broglie hypothesis. The thought is now, however, that this only partly explains the phenomenon, but that the uncertainty also exists in the particle itself, even before the measurement is made.

In fact, the modern explanation of the uncertainty principle, extending the Copenhagen interpretation first put forward by Bohr and Heisenberg, depends even more centrally on the wave nature of a particle: Just as it is nonsensical to discuss the precise location of a wave on a string, particles do not have perfectly precise positions; likewise, just as it is nonsensical to discuss the wavelength of a "pulse" wave traveling down a string, particles do not have perfectly precise momenta (which corresponds to the inverse of wavelength). Moreover, when position is relatively well defined, the wave is pulse-like and has a very ill-defined wavelength (and thus momentum). And conversely, when momentum (and thus wavelength) is relatively well defined, the wave looks long and sinusoidal, and therefore it has a very ill-defined position.
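
Because position-space and momentum-space wavefunctions are Fourier transforms of each other, this trade-off can be checked numerically. Below is a sketch, assuming a Gaussian wave packet (which saturates the bound) and working in units where ħ = 1:

    import numpy as np

    hbar = 1.0                               # work in units where hbar = 1
    N, L = 4096, 200.0                       # grid points and box length
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    dx = x[1] - x[0]

    sigma = 2.0                              # position spread of the packet
    psi = np.exp(-x**2 / (4 * sigma**2))     # Gaussian wave packet
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize in position space

    # Momentum-space amplitude via FFT; p = hbar * k with k = 2*pi*f
    phi = np.fft.fftshift(np.fft.fft(psi))
    p = hbar * 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
    dp = p[1] - p[0]
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)   # normalize in momentum space

    def spread(grid, density, step):
        """Standard deviation of a normalized probability density."""
        mean = np.sum(grid * density) * step
        return np.sqrt(np.sum((grid - mean)**2 * density) * step)

    dx_sd = spread(x, np.abs(psi)**2, dx)
    dp_sd = spread(p, np.abs(phi)**2, dp)
    print(dx_sd * dp_sd)                     # ~0.5 = hbar/2: minimum uncertainty

Narrowing the packet (smaller sigma) shrinks the position spread but widens the momentum spread by the same factor, leaving the product at the bound.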

de Broglie–Bohm theory

Couder experiments,[17] "materializing" the pilot wave model.

De Broglie himself had proposed a pilot wave construct to explain the observed wave-particle duality. In this view, each particle has a well-defined position and momentum, but is guided by a wave function derived from Schrödinger's equation. The pilot wave theory was initially rejected because it generated non-local effects when applied to systems involving more than one particle. Non-locality, however, soon became established as an integral feature of quantum theory (see EPR paradox), and David Bohm extended de Broglie's model to explicitly include it.

In the resulting representation, also called the de Broglie–Bohm theory or Bohmian mechanics, the wave-particle duality vanishes: the theory explains the wave behaviour as a scattering with wave appearance, because the particle's motion is subject to a guiding equation or quantum potential. "This idea seems to me so natural and simple, to resolve the wave-particle dilemma in such a clear and ordinary way, that it is a great mystery to me that it was so generally ignored," wrote J. S. Bell.

The best illustration of the pilot-wave model was given by Couder's 2010 "walking droplets" experiments, demonstrating the pilot-wave behaviour in a macroscopic mechanical analog.

Wave behavior of large objects

Since the demonstrations of wave-like properties in photons and electrons, similar experiments have been conducted with neutrons and protons. Among the most famous experiments are those of Estermann and Otto Stern in 1929. Authors of similar recent experiments with atoms and molecules, described below, claim that these larger particles also act like waves.

A dramatic series of experiments emphasizing the action of gravity in relation to wave–particle duality was conducted in the 1970s using the neutron interferometer. Neutrons, one of the components of the atomic nucleus, provide much of the mass of a nucleus and thus of ordinary matter. In the neutron interferometer, they act as quantum-mechanical waves directly subject to the force of gravity. While the results were not surprising since gravity was known to act on everything, including light (see tests of general relativity and the Pound–Rebka falling photon experiment), the self-interference of the quantum mechanical wave of a massive fermion in a gravitational field had never been experimentally confirmed before.

In 1999, the diffraction of C60 fullerenes by researchers from the University of Vienna was reported. Fullerenes are comparatively large and massive objects, having an atomic mass of about 720 u. The de Broglie wavelength of the incident beam was about 2.5 pm, whereas the diameter of the molecule is about 1 nm, about 400 times larger. In 2012, these far-field diffraction experiments could be extended to phthalocyanine molecules and their heavier derivatives, which are composed of 58 and 114 atoms respectively. In these experiments the build-up of such interference patterns could be recorded in real time and with single molecule sensitivity.

In 2003, the Vienna group also demonstrated the wave nature of tetraphenylporphyrin—a flat biodye with an extension of about 2 nm and a mass of 614 u. For this demonstration they employed a near-field Talbot–Lau interferometer. In the same interferometer they also found interference fringes for C60F48, a fluorinated buckyball with a mass of about 1600 u, composed of 108 atoms. Large molecules are already so complex that they give experimental access to some aspects of the quantum-classical interface, i.e., to certain decoherence mechanisms. In 2011, the interference of molecules as heavy as 6910 u could be demonstrated in a Kapitza–Dirac–Talbot–Lau interferometer. In 2013, the interference of molecules beyond 10,000 u was demonstrated.

Whether objects heavier than the Planck mass (about the weight of a large bacterium) have a de Broglie wavelength is theoretically unclear and experimentally unreachable; above the Planck mass a particle's Compton wavelength would be smaller than the Planck length and its own Schwarzschild radius, a scale at which current theories of physics may break down or need to be replaced by more general ones.
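
A small Python sketch of the crossover described here: the reduced Compton wavelength ħ/(mc) shrinks with mass while the Schwarzschild radius 2Gm/c² grows, and the two meet near the Planck mass:

    hbar = 1.055e-34   # reduced Planck constant, J·s
    c = 2.998e8        # speed of light, m/s
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

    def compton(m):        # reduced Compton wavelength, m
        return hbar / (m * c)

    def schwarzschild(m):  # Schwarzschild radius, m
        return 2 * G * m / c**2

    m_planck = (hbar * c / G) ** 0.5
    print(m_planck)                  # ~2.2e-8 kg
    print(compton(m_planck))         # ~1.6e-35 m, of order the Planck length
    print(schwarzschild(m_planck))   # same order: the two scales meet here

Beyond this mass, the Schwarzschild radius exceeds the Compton wavelength, which is why current theories offer no clear meaning for a de Broglie wave there.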

Couder, Fort, et al. have shown that macroscopic oil droplets on a vibrating surface can be used as a model of wave–particle duality: a localized droplet creates periodic waves around itself, and its interaction with them leads to quantum-like phenomena, including interference in a double-slit experiment, unpredictable tunneling (depending in a complicated way on the practically hidden state of the field), orbit quantization (the particle has to 'find a resonance' with the field perturbations it creates: after one orbit, its internal phase has to return to the initial state), and the Zeeman effect.

Treatment in modern quantum mechanics

Wave–particle duality is deeply embedded into the foundations of quantum mechanics. In the formalism of the theory, all the information about a particle is encoded in its wave function, a complex-valued function roughly analogous to the amplitude of a wave at each point in space. This function evolves according to a differential equation (generically called the Schrödinger equation). For particles with mass this equation has solutions that follow the form of the wave equation. Propagation of such waves leads to wave-like phenomena such as interference and diffraction. Particles without mass, like photons, have no solutions of the Schrödinger equation; their wave behavior is described by other wave equations, such as Maxwell's equations for light.
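
For reference, for a single particle of mass m in a potential V(x), the equation takes the form

    iħ ∂ψ(x, t)/∂t = −(ħ²/2m) ∂²ψ(x, t)/∂x² + V(x) ψ(x, t)

whose free-particle solutions are superpositions of plane waves e^(i(kx − ωt)), the origin of the interference and diffraction phenomena just mentioned.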

The particle-like behavior is most evident due to phenomena associated with measurement in quantum mechanics. Upon measuring the location of the particle, the particle will be forced into a more localized state as given by the uncertainty principle. When viewed through this formalism, the wave function will randomly "collapse", or rather "decohere", to a sharply peaked function at some location upon measurement. For particles with mass, the likelihood of detecting the particle at any particular location is equal to the squared amplitude of the wave function there. The measurement will return a well-defined position (subject to uncertainty), a property traditionally associated with particles. It is important to note that a measurement is only a particular type of interaction where some data is recorded and the measured quantity is forced into a particular eigenstate. The act of measurement is therefore not fundamentally different from any other interaction.

Following the development of quantum field theory the ambiguity disappeared. The field permits solutions that follow the wave equation, which are referred to as the wave functions. The term particle is used to label the irreducible representations of the Lorentz group that are permitted by the field. An interaction as in a Feynman diagram is accepted as a calculationally convenient approximation where the outgoing legs are known to be simplifications of the propagation and the internal lines represent some order in an expansion of the field interaction. Since the field is non-local and quantized, the phenomena that previously were thought of as paradoxes are explained. Within the limits of the wave-particle duality, quantum field theory gives the same results.

Visualization

There are two ways to visualize the wave-particle behaviour: by the "standard model", described below; and by the de Broglie–Bohm model, where no duality is perceived.

Below is an illustration of wave–particle duality as it relates to De Broglie's hypothesis and Heisenberg's uncertainty principle (above), in terms of the position and momentum space wavefunctions for one spinless particle with mass in one dimension. These wavefunctions are Fourier transforms of each other.

The more localized the position-space wavefunction, the more likely the particle is to be found with the position coordinates in that region, and correspondingly the momentum-space wavefunction is less localized so the possible momentum components the particle could have are more widespread.
Conversely the more localized the momentum-space wavefunction, the more likely the particle is to be found with those values of momentum components in that region, and correspondingly the less localized the position-space wavefunction, so the position coordinates the particle could occupy are more widespread.

Position x and momentum p wavefunctions corresponding to quantum particles. The colour opacity of the particles corresponds to the probability density of finding the particle with position x or momentum component p. Top: if wavelength λ is unknown, so are momentum p, wave-vector k and energy E (de Broglie relations); the particle is more localized in position space than in momentum space. Bottom: if λ is known, so are p, k, and E; the particle is more localized in momentum space than in position space.

Alternative views

Wave–particle duality is an ongoing conundrum in modern physics. Most physicists accept wave-particle duality as the best explanation for a broad range of observed phenomena; however, it is not without controversy. Alternative views are also presented here. These views are not generally accepted by mainstream physics, but serve as a basis for valuable discussion within the community.

Both-particle-and-wave view

The pilot wave model, originally developed by Louis de Broglie and further developed by David Bohm into the hidden variable theory, proposes that there is no duality; rather, a system exhibits both particle properties and wave properties simultaneously, and particles are guided, in a deterministic fashion, by the pilot wave (or its "quantum potential"), which directs them to areas of constructive interference in preference to areas of destructive interference. This idea is held by a significant minority within the physics community.

At least one physicist considers the "wave-duality" as not being an incomprehensible mystery. L.E. Ballentine, Quantum Mechanics, A Modern Development, p. 4, explains:
When first discovered, particle diffraction was a source of great puzzlement. Are "particles" really "waves?" In the early experiments, the diffraction patterns were detected holistically by means of a photographic plate, which could not detect individual particles. As a result, the notion grew that particle and wave properties were mutually incompatible, or complementary, in the sense that different measurement apparatuses would be required to observe them. That idea, however, was only an unfortunate generalization from a technological limitation. Today it is possible to detect the arrival of individual electrons, and to see the diffraction pattern emerge as a statistical pattern made up of many small spots (Tonomura et al., 1989). Evidently, quantum particles are indeed particles, but whose behaviour is very different from what classical physics would have us expect.
The Afshar experiment (2007) may suggest that it is possible to simultaneously observe both wave and particle properties of photons. This claim is, however, disputed by other scientists.

Wave-only view

Carver Mead, an American scientist and professor at Caltech, proposes that the duality can be replaced by a "wave-only" view. In his book Collective Electrodynamics: Quantum Foundations of Electromagnetism (2000), Mead purports to analyze the behavior of electrons and photons purely in terms of electron wave functions, and attributes the apparent particle-like behavior to quantization effects and eigenstates. According to reviewer David Haddon:
Mead has cut the Gordian knot of quantum complementarity. He claims that atoms, with their neutrons, protons, and electrons, are not particles at all but pure waves of matter. Mead cites as the gross evidence of the exclusively wave nature of both light and matter the discovery between 1933 and 1996 of ten examples of pure wave phenomena, including the ubiquitous laser of CD players, the self-propagating electrical currents of superconductors, and the Bose–Einstein condensate of atoms.
Albert Einstein, who, in his search for a Unified Field Theory, did not accept wave-particle duality, wrote:
This double nature of radiation (and of material corpuscles) ... has been interpreted by quantum-mechanics in an ingenious and amazingly successful fashion. This interpretation ... appears to me as only a temporary way out...
The many-worlds interpretation (MWI) is sometimes presented as a waves-only theory, including by its originator, Hugh Everett, who referred to MWI as "the wave interpretation".

The three wave hypothesis of R. Horodecki relates the particle to the wave. The hypothesis implies that a massive particle is an intrinsically spatially, as well as temporally, extended wave phenomenon governed by a nonlinear law.

Particle-only view

Still in the days of the old quantum theory, a pre-quantum-mechanical version of wave–particle duality was pioneered by William Duane, and developed by others including Alfred Landé. Duane explained diffraction of x-rays by a crystal in terms solely of their particle aspect. The deflection of the trajectory of each diffracted photon was explained as due to quantized momentum transfer from the spatially regular structure of the diffracting crystal.

Neither-wave-nor-particle view

It has been argued that there are never exact particles or waves, but only some compromise or intermediate between them. For this reason, in 1928 Arthur Eddington coined the name "wavicle" to describe the objects, although it is not regularly used today. One consideration is that zero-dimensional mathematical points cannot be observed. Another is that the formal representation of such points, the Dirac delta function, is unphysical, because it cannot be normalized. Parallel arguments apply to pure wave states. Roger Penrose states:
"Such 'position states' are idealized wavefunctions in the opposite sense from the momentum states. Whereas the momentum states are infinitely spread out, the position states are infinitely concentrated. Neither is normalizable [...]."

Relational approach to wave–particle duality

Relational quantum mechanics has been developed as a point of view that regards the event of particle detection as having established a relationship between the quantized field and the detector. The inherent ambiguity associated with applying Heisenberg’s uncertainty principle is consequently avoided; hence there is no wave-particle duality.

Applications

Although it is difficult to draw a line separating wave–particle duality from the rest of quantum mechanics, it is nevertheless possible to list some applications of this basic idea.
  • Wave–particle duality is exploited in electron microscopy, where the small wavelengths associated with the electron can be used to view objects much smaller than what is visible using visible light (a rough wavelength estimate is sketched after this list).
  • Similarly, neutron diffraction uses neutrons with a wavelength of about 0.1 nm, the typical spacing of atoms in a solid, to determine the structure of solids.
  • Photos are now able to show this dual nature, which may lead to new ways of examining and recording this behaviour.
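
A Python sketch of the wavelength estimate behind the electron-microscopy application, using the relativistically corrected de Broglie wavelength for electrons accelerated through a potential V; the voltages are illustrative:

    h = 6.626e-34        # Planck's constant, J·s
    m = 9.109e-31        # electron mass, kg
    e = 1.602e-19        # elementary charge, C
    c = 2.998e8          # speed of light, m/s

    def electron_wavelength(volts):
        """de Broglie wavelength with the relativistic momentum correction."""
        p = (2 * m * e * volts * (1 + e * volts / (2 * m * c**2))) ** 0.5
        return h / p

    print(electron_wavelength(100))      # ~1.2e-10 m: comparable to atomic spacing
    print(electron_wavelength(100e3))    # ~3.7e-12 m: far below visible light (~5e-7 m)
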

What a piece of work is a man

From Wikipedia, the free encyclopedia
 
"What a piece of work is man!" is a phrase within a soliloquy by Prince Hamlet in William Shakespeare's play of the same name. Hamlet is reflecting, at first admiringly, and then despairingly, on the human condition.

The speech

The soliloquy, spoken in the play by Prince Hamlet to Rosencrantz and Guildenstern in Act II, Scene 2, follows in its entirety. Rather than appearing in blank verse, the typical mode of composition of Shakespeare's plays, the speech appears in straight prose:
I will tell you why; so shall my anticipation prevent your discovery, and your secrecy to the King and queene: moult no feather. I have of late, (but wherefore I know not) lost all my mirth, forgone all custom of exercises; and indeed, it goes so heavily with my disposition; that this goodly frame the earth, seems to me a sterile promontory; this most excellent canopy the air, look you, this brave o'er hanging firmament, this majestical roof, fretted with golden fire: why, it appeareth no other thing to me, than a foul and pestilent congregation of vapours. What a piece of work is man, How noble in reason, how infinite in faculty, In form and moving how express and admirable, In action how like an Angel, In apprehension how like a god, The beauty of the world, The paragon of animals. And yet to me, what is this quintessence of dust? Man delights not me; no, nor Woman neither; though by your smiling you seem to say so.

Meaning

Hamlet is saying that although humans may appear to think and act "nobly", they are essentially "dust". Hamlet is expressing his melancholy to his old friends over the difference between the best that men aspire to be and how they actually behave; it is this great divide that depresses him.

Differences between texts

The speech was fully omitted from Nicholas Ling's 1603 First Quarto, which instead reads simply:
Yes faith, this great world you see contents me not,
No nor the spangled heauens, nor earth, nor sea,
No nor Man that is so glorious a creature,
Contents not me, no nor woman too, though you laugh.
This version has been argued to have been a bad quarto, a tourbook copy, or an initial draft. By the 1604 Second Quarto, the speech is essentially present but punctuated differently:
What a piece of work is a man, how noble in reason,
how infinite in faculties, in form and moving,
how express and admirable in action, how like an angel in apprehension,
how like a god!
Then, by the 1623 First Folio, it appeared as:
What a piece of worke is a man! how Noble in
Reason? how infinite in faculty? in forme and mouing
how expresse and admirable? in Action, how like an Angel?
in apprehension, how like a God? ...
J. Dover Wilson, in his notes in the New Shakespeare edition, observed that the Folio text "involves two grave difficulties", namely that according to Elizabethan thought angels could apprehend but not act, making "in action how like an angel" nonsensical, and that "express" (which as an adjective means "direct and purposive") makes sense applied to "action", but goes very awkwardly with "form and moving".
These difficulties are remedied if we read it thus:
What a piece of worke is a man! how Noble in
Reason? how infinite in faculty, in forme, and mouing
how expresse and admirable in Action, how like an Angel
in apprehension, how like a God?

Sources

A source well known to Shakespeare is Psalm 8, especially verse 5: "You have made [humans] a little lower than the heavenly beings and crowned them with glory and honor."

Scholars have pointed out this section's similarities to lines written by Montaigne:
Qui luy a persuadé que ce branle admirable de la voute celeste, la lumiere eternelle de ces flambeaux roulans si fierement sur sa teste, les mouvemens espouventables de ceste mer infinie, soyent establis et se continuent tant de siecles, pour sa commodité et pour son service ? Est-il possible de rien imaginer si ridicule, que ceste miserable et chetive creature, qui n’est pas seulement maistresse de soy, exposée aux offences de toutes choses, se die maistresse et emperiere de l’univers?
Who have persuaded [man] that this admirable moving of heavens vaults, that the eternal light of these lampes so fiercely rowling over his head, that the horror-moving and continuall motion of this infinite vaste ocean were established, and continue so many ages for his commoditie and service? Is it possible to imagine so ridiculous as this miserable and wretched creature, which is not so much as master of himselfe, exposed and subject to offences of all things, and yet dareth call himself Master and Emperor.
However, rather than being a direct influence on Shakespeare, Montaigne may have merely been reacting to the same general atmosphere of the time, making the source of these lines one of context rather than direct influence.

References in later works of fiction and music

Film

  • At the conclusion of the Lindsay Anderson film Britannia Hospital (1982), the computer which is the outcome of Professor Millar's Genesis project recites "What a piece of Work is a Man" up to "how like a God", at which point it repeats the line over and over.
  • In the film Down and Out in Beverly Hills (1986), Jerry Baskin, played by Nick Nolte, recites this speech on the pier.
  • In Bruce Robinson's British film Withnail & I (1987), the credits roll after lead character Withnail recites the monologue to an audience of wolves in London Zoo.
  • In Gettysburg (1993), Union Colonel Joshua Lawrence Chamberlain recites from the speech while discussing slavery. To which Sergeant Kilrain responds "Well, if he's an angel, all right then... But he damn well must be a killer angel."
  • In the film Grosse Pointe Blank (1997), Mr. Newberry says to Martin: "What a piece of work is man! How noble... oh, fuck it, let's have a drink and forget the whole damn thing."
  • In the film Madagascar: Escape 2 Africa (2008), the penguin Private tries to enter a code into the ship's navigation system by randomly jumping on the keyboard; the text entered on screen reads "WhATApiece OFworkisPenGuin", possibly a reference to the infinite monkey theorem.
  • In the stop motion animation film Coraline (2009), the other Ms. Spink and Forcible recite it while performing their trapeze acrobatics.
  • In the vampire film Only Lovers Left Alive (2013), directed by Jim Jarmusch, parts of the monologue are quoted. Notably, Adam (Tom Hiddleston) utters "quintessence of dust" at the death bed of the vampire Marlowe. The plot includes the suggestion that the latter was the original author of the Shakespeare oeuvre, as some eccentric critics have argued.

Stage productions

  • In the 1967 rock musical Hair, numerous lyrics are derived from Hamlet, most notably a song titled "What a Piece of Work is Man", which uses much of the speech verbatim.
  • In the Reduced Shakespeare Company's production The Complete Works of William Shakespeare (abridged), the more famous soliloquy, "To be, or not to be," is omitted from the Hamlet portion of the production, not for time constraints, or because the speech is so well known, but because the group states that they dislike the speech for momentum and motivation reasons. The "What a piece of work is a man" speech is delivered in its stead.

Television

  • In the Babylon 5 episode "The Paragon of Animals", one of the characters, Byron, recites Hamlet's "how noble is man..." speech to Lyta Alexander.
  • In the third season finale of Person of Interest, titled "Deus Ex Machina", part of the monologue is paraphrased by the character John Greer, instead referencing the artificial intelligence system known as The Machine: "What a piece of work is your Machine, Harold. "In action, how like an angel. In apprehension, how like a god.""
  • The ninth episode of the seventh season of Sons of Anarchy is titled "What A Piece Of Work Is Man", a reference to the series' Shakespearean influences.
  • In the Star Trek: The Next Generation episode “Hide and Q”, Q mocks humanity to Captain Jean-Luc Picard by means of Shakespeare quotes. Picard retorts by paraphrasing Hamlet's monologue, noting that "what he might say with irony, I say with conviction."
  • In season 12 episode 13 of ER, reference is made by Dr. Victor Clemente to Shakespeare as being how he knows the meaning of the word quintessence. Later he paraphrases the "What a piece of work is man!" monologue while at the bedside of his girlfriend who has just suffered multiple gunshot wounds from her husband.

Equality (mathematics)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Equality_...