
Thursday, March 14, 2019

Bohr–Einstein debates

From Wikipedia, the free encyclopedia

Niels Bohr with Albert Einstein at Paul Ehrenfest's home in Leiden (December 1925)

The Bohr–Einstein debates were a series of public disputes about quantum mechanics between Albert Einstein and Niels Bohr. Their debates are remembered because of their importance to the philosophy of science. An account of the debates was written by Bohr in an article titled "Discussions with Einstein on Epistemological Problems in Atomic Physics". Despite their differences of opinion regarding quantum mechanics, Bohr and Einstein had a mutual admiration that was to last the rest of their lives.

The debates represent one of the highest points of scientific research in the first half of the twentieth century because they called attention to an element of quantum theory, quantum non-locality, which is central to our modern understanding of the physical world. The consensus view of professional physicists has been that Bohr proved victorious in his defense of quantum theory, and definitively established the fundamental probabilistic character of quantum measurement.

Pre-revolutionary debates

Einstein was the first physicist to say that Planck's discovery of the quantum (h) would require a rewriting of the laws of physics. To support his point, in 1905 he proposed that light sometimes acts as a particle which he called a light quantum. Bohr was one of the most vocal opponents of the photon idea and did not openly embrace it until 1925. The photon appealed to Einstein because he saw it as a physical reality (although a confusing one) behind the numbers. Bohr disliked it because it made the choice of mathematical solution arbitrary. He did not like a scientist having to choose between equations.

1913 brought the Bohr model of the hydrogen atom, which made use of the quantum to explain the atomic spectrum. Einstein was at first skeptical, but quickly changed his mind and admitted his shift in mindset.

The quantum revolution

The quantum revolution of the mid-1920s occurred under the direction of both Einstein and Bohr, and their post-revolutionary debates were about making sense of the change. The shocks for Einstein began in 1925 when Werner Heisenberg introduced matrix equations that removed the Newtonian elements of space and time from any underlying reality. The next shock came in 1926 when Max Born proposed that the mechanics was to be understood as giving probabilities, without any causal explanation.
Einstein rejected this interpretation. In a 1926 letter to Max Born, Einstein wrote: "I, at any rate, am convinced that He [God] does not throw dice."

At the Fifth Solvay Conference held in October 1927 Heisenberg and Born concluded that the revolution was over and nothing further was needed. It was at that last stage that Einstein's skepticism turned to dismay. He believed that much had been accomplished, but the reasons for the mechanics still needed to be understood.

Einstein's refusal to accept the revolution as complete reflected his desire to see developed a model for the underlying causes from which these apparent random statistical methods resulted. He did not reject the idea that positions in space-time could never be completely known but did not want to allow the uncertainty principle to necessitate a seemingly random, non-deterministic mechanism by which the laws of physics operated. Einstein himself was a statistical thinker but disagreed that no more needed to be discovered and clarified. Bohr, meanwhile, was dismayed by none of the elements that troubled Einstein. He made his own peace with the contradictions by proposing a principle of complementarity that emphasized the role of the observer over the observed.

Post-revolution: First stage

As mentioned above, Einstein's position underwent significant modifications over the course of the years. In the first stage, Einstein refused to accept quantum indeterminism and sought to demonstrate that the principle of indeterminacy could be violated, suggesting ingenious thought experiments which should permit the accurate determination of incompatible variables, such as position and velocity, or to explicitly reveal simultaneously the wave and the particle aspects of the same process.

Einstein's argument

The first serious attack by Einstein on the "orthodox" conception took place during the Fifth Solvay International Conference on Electrons and Photons in 1927. Einstein pointed out how it was possible to take advantage of the (universally accepted) laws of conservation of energy and of impulse (momentum) in order to obtain information on the state of a particle in a process of interference which, according to the principle of indeterminacy or that of complementarity, should not be accessible. 

Figure A. A monochromatic beam (one for which all the particles have the same impulse) encounters a first screen, diffracts, and the diffracted wave encounters a second screen with two slits, resulting in the formation of an interference figure on the background F. As always, it is assumed that only one particle at a time is able to pass the entire mechanism. From the measure of the recoil of the screen S1, according to Einstein, one can deduce from which slit the particle has passed without destroying the wave aspects of the process.
 
Figure B. Einstein's slit.
 
In order to follow his argumentation and to evaluate Bohr's response, it is convenient to refer to the experimental apparatus illustrated in figure A. A beam of light perpendicular to the X axis propagates in the direction z and encounters a screen S1 with a narrow (relative to the wavelength of the ray) slit. After having passed through the slit, the wave function diffracts with an angular opening that causes it to encounter a second screen S2 with two slits. The successive propagation of the wave results in the formation of the interference figure on the final screen F.

At the passage through the two slits of the second screen S2, the wave aspects of the process become essential. In fact, it is precisely the interference between the two terms of the quantum superposition corresponding to states in which the particle is localized in one of the two slits which implies that the particle is "guided" preferably into the zones of constructive interference and cannot end up in a point in the zones of destructive interference (in which the wave function is nullified). It is also important to note that any experiment designed to evidence the "corpuscular" aspects of the process at the passage of the screen S2 (which, in this case, reduces to the determination of which slit the particle has passed through) inevitably destroys the wave aspects, implies the disappearance of the interference figure and the emergence of two concentrated spots of diffraction which confirm our knowledge of the trajectory followed by the particle.

At this point Einstein brings the first screen into play as well and argues as follows. Since the incident particles have velocities (practically) perpendicular to the screen S1, and since it is only the interaction with this screen that can cause a deflection from the original direction of propagation, the law of conservation of impulse, which implies that the sum of the impulses of two interacting systems is conserved, requires that if the incident particle is deviated toward the top, the screen recoils toward the bottom, and vice versa. In realistic conditions the mass of the screen is so large that it will remain stationary, but, in principle, it is possible to measure even an infinitesimal recoil. If we imagine measuring the impulse of the screen in the direction X after every single particle has passed, we can know, from the fact that the screen is found to have recoiled toward the top (bottom), whether the particle in question has been deviated toward the bottom or the top, and therefore through which slit in S2 it has passed. But since the determination of the direction of the recoil of the screen after the particle has passed cannot influence the subsequent development of the process, we will still have an interference figure on the screen F. The interference takes place precisely because the state of the system is the superposition of two states whose wave functions are non-zero only near one of the two slits. On the other hand, if every particle passes through only the slit b or the slit c, then the set of systems is the statistical mixture of the two states, which means that interference is not possible. If Einstein is correct, then there is a violation of the principle of indeterminacy.
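
The role of the interference cross term in this argument can be made concrete with a small numerical sketch (not part of the original article; the geometry and numbers are assumed for illustration only): with equal amplitudes from the slits b and c, a coherent superposition produces fringes on the screen F, while a statistical mixture of "went through b" and "went through c" does not.

```python
import numpy as np

# Illustrative sketch (assumed parameters): compare the screen-F intensity from
# a coherent superposition of the two slit waves with that from a statistical
# mixture, for an idealized two-slit geometry.
lam = 500e-9          # wavelength (m), assumed
d = 50e-6             # slit separation (m), assumed
L = 1.0               # distance from S2 to screen F (m), assumed

x = np.linspace(-0.02, 0.02, 2001)       # positions on screen F (m)
phi = 2 * np.pi * d * x / (lam * L)      # approximate phase difference of the two paths

# Coherent superposition: amplitudes add, then the modulus is squared.
I_superposition = np.abs(1 + np.exp(1j * phi))**2 / 4

# Statistical mixture ("which slit" known): intensities add, no cross term.
I_mixture = (np.abs(1)**2 + np.abs(np.exp(1j * phi))**2) / 4

print("superposition: min %.2f, max %.2f" % (I_superposition.min(), I_superposition.max()))
print("mixture:       min %.2f, max %.2f" % (I_mixture.min(), I_mixture.max()))
# The superposition oscillates between 0 and 1 (fringes); the mixture is flat at 0.5.
```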

Bohr's response

Bohr's response was to illustrate Einstein's idea more clearly using the diagram in Figure C. (Figure C shows a screen S1 that is bolted down; to realize Einstein's proposal, one must instead imagine a screen that can slide up or down along a rod rather than being fixed by a bolt.) Bohr observes that extremely precise knowledge of any (potential) vertical motion of the screen is an essential presupposition in Einstein's argument. In fact, if its velocity in the direction X before the passage of the particle is not known with a precision substantially greater than that induced by the recoil (that is, if it were already moving vertically with an unknown and greater velocity than that which it derives as a consequence of the contact with the particle), then the determination of its motion after the passage of the particle would not give the information we seek. However, Bohr continues, an extremely precise determination of the velocity of the screen, when one applies the principle of indeterminacy, implies an inevitable imprecision of its position in the direction X. Before the process even begins, the screen would therefore occupy an indeterminate position, at least to a certain extent (defined by the formalism). Now consider, for example, the point d in figure A, where the interference is destructive. Any displacement of the first screen would make the lengths of the two paths, a–b–d and a–c–d, different from those indicated in the figure. If the difference between the two paths varies by half a wavelength, at point d there will be constructive rather than destructive interference. The ideal experiment must average over all the possible positions of the screen S1, and, for every position, there corresponds, for a certain fixed point on F, a different type of interference, from the perfectly destructive to the perfectly constructive. The effect of this averaging is that the pattern of interference on the screen F will be uniformly grey. Once more, our attempt to evidence the corpuscular aspects in S2 has destroyed the possibility of interference in F, which depends crucially on the wave aspects.
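
Bohr's washing-out argument can likewise be illustrated numerically (again a sketch with assumed values, not part of the original article): averaging the fringe pattern over an unknown phase shift introduced by the uncertain position of S1 leaves an almost uniform intensity on F.

```python
import numpy as np

# Illustrative sketch (assumed parameters): averaging the two-slit pattern over
# an uncertain vertical position of the first screen S1. A displacement of S1
# shifts the relative phase of the paths a-b-d and a-c-d; averaging over shifts
# comparable to pi washes out the fringes.
rng = np.random.default_rng(0)
phi = np.linspace(-3 * np.pi, 3 * np.pi, 1201)   # phase difference across screen F

def pattern(extra_phase):
    """Two-slit intensity when S1's position adds a constant extra phase."""
    return (1 + np.cos(phi + extra_phase)) / 2

sharp = pattern(0.0)                              # S1 perfectly localized: visible fringes
shifts = rng.uniform(-np.pi, np.pi, size=5000)    # S1 position uncertain by ~half a wavelength of path difference
blurred = np.mean([pattern(s) for s in shifts], axis=0)

print("fringe visibility, fixed screen:    %.2f" % (sharp.max() - sharp.min()))
print("fringe visibility, averaged screen: %.2f" % (blurred.max() - blurred.min()))
# The averaged pattern is nearly uniform ("uniformly grey"), as Bohr argued.
```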

Figure C. In order to realize Einstein's proposal, it is necessary to replace the first screen in Figure A (S1) with a diaphragm that can move vertically, such as the one proposed by Bohr.
 
It should be noted that, as Bohr recognized, for the understanding of this phenomenon "it is decisive that, contrary to genuine instruments of measurement, these bodies along with the particles would constitute, in the case under examination, the system to which the quantum-mechanical formalism must apply. With respect to the precision of the conditions under which one can correctly apply the formalism, it is essential to include the entire experimental apparatus. In fact, the introduction of any new apparatus, such as a mirror, in the path of a particle could introduce new effects of interference which influence essentially the predictions about the results which will be registered at the end." Further along, Bohr attempts to resolve this ambiguity concerning which parts of the system should be considered macroscopic and which not:
In particular, it must be very clear that...the unambiguous use of spatiotemporal concepts in the description of atomic phenomena must be limited to the registration of observations which refer to images on a photographic plate or to analogous practically irreversible effects of amplification such as the formation of a drop of water around an ion in a cloud chamber.
Bohr's argument about the impossibility of using the apparatus proposed by Einstein to violate the principle of indeterminacy depends crucially on the fact that a macroscopic system (the screen S1) obeys quantum laws. On the other hand, Bohr consistently held that, in order to illustrate the microscopic aspects of reality, it is necessary to set off a process of amplification, which involves macroscopic apparatuses, whose fundamental characteristic is that of obeying classical laws and which can be described in classical terms. This ambiguity would later come back in the form of what is still called today the measurement problem.

The principle of indeterminacy applied to time and energy

Figure D. A wave extended longitudinally passes through a slit which remains open only for a brief interval of time. Beyond the slit, there is a spatially limited wave in the direction of propagation.
 
In many textbook examples and popular discussions of quantum mechanics, the principle of indeterminacy is explained by reference to the pair of variables position and velocity (or momentum). It is important to note that the wave nature of physical processes implies that there must exist another relation of indeterminacy: that between time and energy. In order to comprehend this relation, it is convenient to refer to the experiment illustrated in Figure D, which results in the propagation of a wave which is limited in spatial extension. Assume that, as illustrated in the figure, a ray which is extremely extended longitudinally is propagated toward a screen with a slit furnished with a shutter which remains open only for a very brief interval of time Δt. Beyond the slit, there will be a wave of limited spatial extension which continues to propagate toward the right.

A perfectly monochromatic wave (such as a musical note which cannot be divided into harmonics) has infinite spatial extent. In order to have a wave which is limited in spatial extension (which is technically called a wave packet), several waves of different frequencies must be superimposed and distributed continuously within a certain interval of frequencies around an average value, such as ν0. It then happens that at a certain instant, there exists a spatial region (which moves over time) in which the contributions of the various fields of the superposition add up constructively. Nonetheless, according to a precise mathematical theorem, as we move far away from this region, the phases of the various fields, at any specified point, are distributed randomly and destructive interference is produced. The region in which the wave has non-zero amplitude is therefore spatially limited. It is easy to demonstrate that, if the wave has a spatial extension equal to Δx (which means, in our example, that the shutter has remained open for a time Δt = Δx/v, where v is the velocity of the wave), then the wave contains (or is a superposition of) various monochromatic waves whose frequencies cover an interval Δν which satisfies the relation:
Δν ≥ 1/Δt
Remembering that in the universal relation of Planck, frequency and energy are proportional:
E = hν
it follows immediately from the preceding inequality that the particle associated with the wave should possess an energy which is not perfectly defined (since different frequencies are involved in the superposition) and consequently there is indeterminacy in energy:
ΔE = h Δν
From this it follows immediately that:
ΔE Δt ≥ h
which is the relation of indeterminacy between time and energy.
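
A short numerical sketch (with purely illustrative, assumed numbers) shows the same reciprocity: superposing monochromatic waves whose frequencies span an interval Δν yields a packet whose duration Δt is of order 1/Δν.

```python
import numpy as np

# Illustrative sketch (assumed values): superpose monochromatic waves whose
# frequencies span an interval dnu around nu0 and check that the resulting
# wave packet has a temporal width of roughly 1/dnu, i.e. dnu * dt ~ 1.
nu0 = 100.0          # average frequency (Hz), assumed
dnu = 5.0            # spread of frequencies (Hz), assumed
freqs = np.linspace(nu0 - dnu / 2, nu0 + dnu / 2, 400)

t = np.linspace(-2.0, 2.0, 20001)                  # time axis (s)
field = np.sum(np.cos(2 * np.pi * np.outer(freqs, t)), axis=0) / len(freqs)

envelope = np.abs(field)
half_max = envelope.max() / 2
width = t[envelope > half_max].ptp()               # rough duration of the packet

print("dnu * dt ≈ %.2f" % (dnu * width))           # of order unity
```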

Einstein's second criticism

Einstein's thought experiment of 1930 as designed by Bohr. Einstein's box was supposed to prove the violation of the indeterminacy relation between time and energy.
 
At the sixth Congress of Solvay in 1930, the indeterminacy relation just discussed was Einstein's target of criticism. His idea contemplates the existence of an experimental apparatus which was subsequently designed by Bohr in such a way as to emphasize the essential elements and the key points which he would use in his response. 

Einstein considers a box (called Einstein's box; see figure) containing electromagnetic radiation and a clock which controls the opening of a shutter which covers a hole made in one of the walls of the box. The shutter uncovers the hole for a time Δt which can be chosen arbitrarily. During the opening, we are to suppose that a photon, from among those inside the box, escapes through the hole. In this way a wave of limited spatial extension has been created, following the explanation given above. In order to challenge the indeterminacy relation between time and energy, it is necessary to find a way to determine with adequate precision the energy that the photon has brought with it. At this point, Einstein turns to his celebrated relation between mass and energy of special relativity: E = mc². From this it follows that knowledge of the mass of an object provides a precise indication about its energy. The argument is therefore very simple: if one weighs the box before and after the opening of the shutter and if a certain amount of energy has escaped from the box, the box will be lighter. The variation in mass multiplied by c² will provide precise knowledge of the energy emitted. Moreover, the clock will indicate the precise time at which the event of the particle's emission took place. Since, in principle, the mass of the box can be determined to an arbitrary degree of accuracy, the energy emitted can be determined with a precision as accurate as one desires. Therefore, the product ΔE Δt can be rendered less than what is implied by the principle of indeterminacy.

George Gamow's make-believe experimental apparatus for validating the thought experiment at the Niels Bohr Institute in Copenhagen.
 
The idea is particularly acute and the argument seemed unassailable. It's important to consider the impact of all of these exchanges on the people involved at the time. Leon Rosenfeld, a scientist who had participated in the Congress, described the event several years later:
It was a real shock for Bohr...who, at first, could not think of a solution. For the entire evening he was extremely agitated, and he continued passing from one scientist to another, seeking to persuade them that it could not be the case, that it would have been the end of physics if Einstein were right; but he couldn't come up with any way to resolve the paradox. I will never forget the image of the two antagonists as they left the club: Einstein, with his tall and commanding figure, who walked tranquilly, with a mildly ironic smile, and Bohr who trotted along beside him, full of excitement...The morning after saw the triumph of Bohr.

Bohr's Triumph

The "Triumph of Bohr" consisted in his demonstrating, once again, that Einstein's subtle argument was not conclusive, but even more so in the way that he arrived at this conclusion by appealing precisely to one of the great ideas of Einstein: the principle of equivalence between gravitational mass and inertial mass, together with the time dilation of special relativity, and a consequence of these—the Gravitational redshift. Bohr showed that, in order for Einstein's experiment to function, the box would have to be suspended on a spring in the middle of a gravitational field. In order to obtain a measurement of the weight of the box, a pointer would have to be attached to the box which corresponded with the index on a scale. After the release of a photon, a mass could be added to the box to restore it to its original position and this would allow us to determine the energy that was lost when the photon left. The box is immersed in a gravitational field of strength , and the gravitational redshift affects the speed of the clock, yielding uncertainty in the time required for the pointer to return to its original position. Bohr gave the following calculation establishing the uncertainty relation

Let the uncertainty in the mass be denoted by Δm, and let the error in the position of the pointer be Δq. Adding the load to the box imparts a momentum p that we can measure with an accuracy Δp, where Δp ≈ h/Δq. Clearly Δp < g t Δm, where t is the time taken by the whole weighing procedure, and therefore Δq > h/(g t Δm). By the redshift formula (which follows from the principle of equivalence and the time dilation), the uncertainty in the time is Δt = g t Δq/c², so that Δt > h/(c² Δm) = h/ΔE, and therefore Δt ΔE > h. We have therefore proven the claimed relation ΔE Δt ≥ h.
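
The chain of relations above can be checked symbolically; the following sketch (an illustration, not part of Bohr's text) substitutes the limiting values and confirms that the product Δt ΔE cannot be pushed below h.

```python
import sympy as sp

# Symbolic check (illustrative) of Bohr's chain of inequalities. All symbols
# are positive: h (Planck constant), g (gravitational field strength),
# t (duration of the weighing), dm (mass uncertainty), c (speed of light).
h, g, t, dm, c = sp.symbols('h g t dm c', positive=True)

dq = h / (g * t * dm)            # smallest pointer error compatible with dp ~ h/dq and dp < g*t*dm
dt = g * t * dq / c**2           # gravitational-redshift uncertainty in the clock reading
dE = dm * c**2                   # energy uncertainty from the mass uncertainty

print(sp.simplify(dt * dE))      # prints h: the limiting product is h, so dt*dE cannot fall below it
```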

Post-revolution: Second stage

The second phase of Einstein's "debate" with Bohr and the orthodox interpretation is characterized by an acceptance of the fact that it is, as a practical matter, impossible to simultaneously determine the values of certain incompatible quantities, but a rejection of the claim that this impossibility implies that these quantities do not actually have precise values. Einstein rejects the probabilistic interpretation of Born and insists that quantum probabilities are epistemic and not ontological in nature. As a consequence, the theory must be incomplete in some way. He recognizes the great value of the theory, but suggests that it "does not tell the whole story", and, while providing an appropriate description at a certain level, it gives no information on the more fundamental underlying level:
I have the greatest consideration for the goals which are pursued by the physicists of the latest generation which go under the name of quantum mechanics, and I believe that this theory represents a profound level of truth, but I also believe that the restriction to laws of a statistical nature will turn out to be transitory....Without doubt quantum mechanics has grasped an important fragment of the truth and will be a paragon for all future fundamental theories, for the fact that it must be deducible as a limiting case from such foundations, just as electrostatics is deducible from Maxwell's equations of the electromagnetic field or as thermodynamics is deducible from statistical mechanics.
These thoughts of Einstein would set off a line of research into hidden variable theories, such as the Bohm interpretation, in an attempt to complete the edifice of quantum theory. If quantum mechanics can be made complete in Einstein's sense, it cannot be done locally; this fact was demonstrated by John Stewart Bell with the formulation of Bell's inequality in 1964.

Post-revolution: Third stage

The argument of EPR

Title sections of historical papers on EPR.

In 1935 Einstein, Boris Podolsky and Nathan Rosen developed an argument, published in the journal Physical Review with the title Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?, based on an entangled state of two systems. Before coming to this argument, it is necessary to formulate another hypothesis that comes out of Einstein's work in relativity: the principle of locality. The elements of physical reality which are objectively possessed cannot be influenced instantaneously at a distance.
 
The argument of EPR was picked up in 1957 by David Bohm and Yakir Aharonov in a paper published in Physical Review with the title Discussion of Experimental Proof for the Paradox of Einstein, Rosen, and Podolsky. The authors reformulated the argument in terms of an entangled state of two particles, which can be summarized as follows:

1) Consider a system of two photons which at time t are located, respectively, in the spatially distant regions A and B and which are also in the entangled state of polarization described below:
|Ψ⟩ = (1/√2)(|1,V⟩ |2,V⟩ + |1,H⟩ |2,H⟩),
where |i,V⟩ and |i,H⟩ denote states in which photon i is vertically or horizontally polarized.
2) At time t the photon in region A is tested for vertical polarization. Suppose that the result of the measurement is that the photon passes through the filter. According to the reduction of the wave packet, the result is that, at time t + dt, the system becomes
|1,V⟩ |2,V⟩
3) At this point, the observer in A who carried out the first measurement on photon 1, without doing anything else that could disturb the system or the other photon ("assumption (R)", below), can predict with certainty that photon 2 will pass a test of vertical polarization. It follows that photon 2 possesses an element of physical reality: that of having a vertical polarization. 

4) According to the assumption of locality, it cannot have been the action carried out in A which created this element of reality for photon 2. Therefore, we must conclude that the photon possessed the property of being able to pass the vertical polarization test before and independently of the measurement of photon 1.

5) At time t, the observer in A could have decided to carry out a test of polarization at 45°, obtaining a certain result, for example, that the photon passes the test. In that case, he could have concluded that photon 2 turned out to be polarized at 45°. Alternatively, if the photon did not pass the test, he could have concluded that photon 2 turned out to be polarized at 135°. Combining one of these alternatives with the conclusion reached in 4, it seems that photon 2, before the measurement took place, possessed both the property of being able to pass with certainty a test of vertical polarization and the property of being able to pass with certainty a test of polarization at either 45° or 135°. These properties are incompatible according to the formalism. 

6) Since natural and obvious requirements have forced the conclusion that photon 2 simultaneously possesses incompatible properties, this means that, even if it is not possible to determine these properties simultaneously and with arbitrary precision, they are nevertheless possessed objectively by the system. But quantum mechanics denies this possibility and it is therefore an incomplete theory.
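
The perfect correlations that drive steps 3 and 5 can be reproduced with a short calculation on the entangled state written above (an illustrative sketch; the polarizer angles are chosen for the example):

```python
import numpy as np

# Illustrative sketch: measurement statistics for the entangled polarization
# state (|VV> + |HH>)/sqrt(2) used in the Bohm-Aharonov version of the EPR
# argument. Outcome probabilities follow the Born rule.
V = np.array([1.0, 0.0])              # vertical polarization
H = np.array([0.0, 1.0])              # horizontal polarization

def pol(theta):
    """Linear polarization state at angle theta (degrees) from vertical."""
    th = np.radians(theta)
    return np.cos(th) * V + np.sin(th) * H

# Two-photon entangled state as a 4-component vector (photon 1 tensor photon 2).
psi = (np.kron(V, V) + np.kron(H, H)) / np.sqrt(2)

def p_both_pass(theta1, theta2):
    """Probability that photon 1 passes a polarizer at theta1 and photon 2 at theta2."""
    proj = np.kron(pol(theta1), pol(theta2))
    return np.abs(proj @ psi) ** 2

for t1, t2 in [(0, 0), (0, 90), (45, 45), (45, 135)]:
    print(f"P(pass {t1} deg, pass {t2} deg) = {p_both_pass(t1, t2):.2f}")
# P(0,0) = 0.5 and P(0,90) = 0: if photon 1 passes the vertical test, photon 2
# passes it with certainty. The same perfect correlation holds for 45/135 degrees.
```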

Bohr's response

Bohr's response to this argument was published, five months after the original publication of EPR, in the same journal Physical Review and with exactly the same title as the original. The crucial point of Bohr's answer is distilled in a passage which he later had republished in Paul Arthur Schilpp's book Albert Einstein: Philosopher-Scientist, published in honor of Einstein's seventieth birthday. Bohr attacks assumption (R) of EPR by stating:
The statement of the criterion in question is ambiguous with regard to the expression "without disturbing the system in any way". Naturally, in this case no mechanical disturbance of the system under examination can take place in the crucial stage of the process of measurement. But even in this stage there arises the essential problem of an influence on the precise conditions which define the possible types of prediction which regard the subsequent behaviour of the system...their arguments do not justify their conclusion that the quantum description turns out to be essentially incomplete...This description can be characterized as a rational use of the possibilities of an unambiguous interpretation of the process of measurement compatible with the finite and uncontrollable interaction between the object and the instrument of measurement in the context of quantum theory.

Post-revolution: Fourth stage

In his last writing on the topic, Einstein further refined his position, making it completely clear that what really disturbed him about the quantum theory was the problem of the total renunciation of all minimal standards of realism, even at the microscopic level, that the acceptance of the completeness of the theory implied. Although the majority of experts in the field agree that Einstein was wrong, the current understanding is still not complete. There is no scientific consensus that determinism would have been refuted.

Photoelectric effect

From Wikipedia, the free encyclopedia

The photoelectric effect is the emission of electrons or other free carriers when light falls on a material. Electrons emitted in this manner can be called photoelectrons. This phenomenon is commonly studied in electronic physics, as well as in fields of chemistry, such as quantum chemistry or electrochemistry.

According to classical electromagnetic theory, this effect can be attributed to the transfer of energy from the light to an electron. From this perspective, an alteration in the intensity of light would induce changes in the kinetic energy of the electrons emitted from the metal. Furthermore, according to this theory, a sufficiently dim light would be expected to show a time lag between the initial shining of the light and the subsequent emission of an electron. However, the experimental results did not correlate with either of the two predictions made by classical theory.

Instead, electrons are dislodged only by the impingement of photons when those photons reach or exceed a threshold frequency (energy). Below that threshold, no electrons are emitted from the material regardless of the light intensity or the length of time of exposure to the light. (Rarely, an electron will escape by absorbing two or more quanta. However, this is extremely rare because by the time it absorbs enough quanta to escape, the electron will probably have emitted the rest of the quanta.) To make sense of the fact that light can eject electrons even if its intensity is low, Albert Einstein proposed that a beam of light is not a wave propagating through space, but rather a collection of discrete wave packets (photons), each with energy hν. This shed light on Max Planck's previous discovery of the Planck relation (E = hν) linking energy (E) and frequency (ν) as arising from quantization of energy. The factor h is known as the Planck constant.

In 1887, Heinrich Hertz discovered that electrodes illuminated with ultraviolet light create electric sparks more easily. In 1900, while studying black-body radiation, the German physicist Max Planck suggested that the energy carried by electromagnetic waves could only be released in "packets" of energy. In 1905, Albert Einstein published a paper advancing the hypothesis that light energy is carried in discrete quantized packets to explain experimental data from the photoelectric effect. This model contributed to the development of quantum mechanics. In 1914, Millikan's experiment supported Einstein's model of the photoelectric effect. Einstein was awarded the Nobel Prize in 1921 for "his discovery of the law of the photoelectric effect", and Robert Millikan was awarded the Nobel Prize in 1923 for "his work on the elementary charge of electricity and on the photoelectric effect".

The photoelectric effect involves photons with energies ranging from near zero (in the case of negative electron affinity) to over 1 MeV for core electrons in elements with a high atomic number. Emission of conduction electrons from typical metals usually requires a few electron-volts, corresponding to short-wavelength visible or ultraviolet light. Study of the photoelectric effect led to important steps in understanding the quantum nature of light and electrons and influenced the formation of the concept of wave–particle duality. Other phenomena where light affects the movement of electric charges include the photoconductive effect (also known as photoconductivity or photoresistivity), the photovoltaic effect, and the photoelectrochemical effect.

Photoemission can occur from any material, but it is most easily observable from metals or other conductors because the process produces a charge imbalance, and if this charge imbalance is not neutralized by current flow (enabled by conductivity), the potential barrier to emission increases until the emission current ceases. It is also usual to have the emitting surface in a vacuum, since gases impede the flow of photoelectrons and make them difficult to observe. Additionally, the energy barrier to photoemission is usually increased by thin oxide layers on metal surfaces if the metal has been exposed to oxygen, so most practical experiments and devices based on the photoelectric effect use clean metal surfaces in a vacuum.

When the photoelectron is emitted into a solid rather than into a vacuum, the term internal photoemission is often used, and emission into a vacuum distinguished as external photoemission.

Schematic of experimental apparatus to demonstrate the photoelectric effect. The filter passes light of certain wavelengths from the lamp at left. The light strikes the curved electrode, and electrons are emitted. The adjustable voltage can be increased until the current stops flowing. This "stopping voltage" is a function only of the electrode material and the frequency of the incident light, and is not affected by the intensity of the light.

Emission mechanism

The photons of a light beam have a characteristic energy proportional to the frequency of the light. In the photoemission process, if an electron within some material absorbs the energy of one photon and acquires more energy than the work function (the electron binding energy) of the material, it is ejected. If the photon energy is too low, the electron is unable to escape the material. Since an increase in the intensity of low-frequency light will only increase the number of low-energy photons sent over a given interval of time, this change in intensity will not create any single photon with enough energy to dislodge an electron. Thus, the energy of the emitted electrons does not depend on the intensity of the incoming light, but only on the energy (equivalent frequency) of the individual photons. It is an interaction between the incident photon and the outermost electrons. 

Electrons can absorb energy from photons when irradiated, but they usually follow an "all or nothing" principle. All of the energy from one photon must be absorbed and used to liberate one electron from atomic binding, or else the energy is re-emitted. If the photon energy is absorbed, some of the energy liberates the electron from the atom, and the rest contributes to the electron's kinetic energy as a free particle.

Experimental observations of photoelectric emission

The theory of the photoelectric effect must explain the experimental observations of the emission of electrons from an illuminated metal surface. 

For a given metal surface, there exists a certain minimum frequency of incident radiation below which no photoelectrons are emitted. This frequency is called the threshold frequency. Increasing the frequency of the incident beam, keeping the number of incident photons fixed (this would result in a proportionate increase in energy), increases the maximum kinetic energy of the photoelectrons emitted. Thus the stopping voltage increases (see the experimental setup in the figure). The number of electrons also changes because the probability that each photon results in an emitted electron is a function of photon energy. If the intensity of the incident radiation of a given frequency is increased, there is no effect on the kinetic energy of each photoelectron.

Above the threshold frequency, the maximum kinetic energy of the emitted photoelectron depends on the frequency of the incident light, but is independent of the intensity of the incident light so long as the latter is not too high.

For a given metal and frequency of incident radiation, the rate at which photoelectrons are ejected is directly proportional to the intensity of the incident light. An increase in the intensity of the incident beam (keeping the frequency fixed) increases the magnitude of the photoelectric current, although the stopping voltage remains the same.

The time lag between the incidence of radiation and the emission of a photoelectron is very small, less than 10⁻⁹ second.

The direction of distribution of emitted electrons peaks in the direction of polarization (the direction of the electric field) of the incident light, if it is linearly polarized.

Mathematical description

In 1905, Einstein proposed an explanation of the photoelectric effect using a concept first put forward by Max Planck that light waves consist of tiny bundles or packets of energy known as photons or quanta. 

Diagram of the maximum kinetic energy as a function of the frequency of light on zinc

The maximum kinetic energy of an ejected electron is given by
Kmax = hf − W,
where h is the Planck constant and f is the frequency of the incident photon. The term W is the work function (sometimes denoted W0 or φ), which gives the minimum energy required to remove an electron from the surface of the metal. The work function satisfies
W = hf0,
where f0 is the threshold frequency for the metal. The maximum kinetic energy of an ejected electron is then
Kmax = hf − hf0 = h(f − f0).
Kinetic energy is positive, so we must have f > f0 for the photoelectric effect to occur.
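
A short worked example (with assumed, purely illustrative numbers: a work function of 4.3 eV, roughly that of zinc, and 250 nm ultraviolet light) shows how the equation is used:

```python
# Worked example (assumed illustrative numbers, not from the article): maximum
# kinetic energy of photoelectrons from a metal with an assumed work function
# of 4.3 eV illuminated by 250 nm ultraviolet light.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
eV = 1.602e-19       # joules per electronvolt

wavelength = 250e-9          # incident light, m (assumed)
work_function = 4.3 * eV     # W, J (assumed value for the metal)

f = c / wavelength                   # frequency of the incident photon
K_max = h * f - work_function        # Einstein's photoelectric equation

f0 = work_function / h               # threshold frequency, from W = h*f0
print(f"photon energy  : {h * f / eV:.2f} eV")
print(f"K_max          : {K_max / eV:.2f} eV")
print(f"threshold freq : {f0:.2e} Hz (light below this ejects no electrons)")
```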

Stopping potential

The relation between current and applied voltage illustrates the nature of the photoelectric effect. For discussion, a light source illuminates a plate P, and another plate electrode Q collects any emitted electrons. We vary the potential between P and Q and measure the current flowing in the external circuit between the two plates. 

If the frequency and the intensity of the incident radiation are fixed, the photoelectric current increases gradually with an increase in the positive potential on the collector electrode until all the photoelectrons emitted are collected. The photoelectric current attains a saturation value and does not increase further for any increase in the positive potential. The saturation current increases with the increase of the light intensity. It also increases with greater frequencies due to a greater probability of electron emission when collisions happen with higher energy photons.

If we apply a negative potential to the collector plate Q with respect to the plate P and gradually increase it, the photoelectric current decreases, becoming zero at a certain negative potential. The negative potential on the collector at which the photoelectric current becomes zero is called the stopping potential or cut off potential.
  • For a given frequency of incident radiation, the stopping potential is independent of its intensity.
  • For a given frequency of incident radiation, the stopping potential is determined by the maximum kinetic energy of the photoelectrons that are emitted. If qe is the charge on the electron and V0 is the stopping potential, then the work done by the retarding potential in stopping the electron is qeV0, so we have qeV0 = Kmax.
Recalling that
Kmax = h(f − f0),
we see that the stopping voltage V0 = h(f − f0)/qe varies linearly with the frequency of the light, but depends on the type of material. For any particular material, there is a threshold frequency that must be exceeded, independent of light intensity, to observe any electron emission.
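
The linear dependence of the stopping potential on frequency is what Millikan exploited to measure h. The following sketch (synthetic data and an assumed work function, for illustration only) fits V0 against frequency and recovers h/qe from the slope:

```python
import numpy as np

# Illustrative sketch (synthetic data, assumed work function): the stopping
# potential V0 = (h*f - W)/q_e is linear in frequency, so the slope of a
# V0-versus-frequency fit gives h/q_e, independent of the material.
h_true = 6.626e-34      # J*s, used only to generate the synthetic data
q_e = 1.602e-19         # C
W = 2.3 * q_e           # assumed work function (~2.3 eV, alkali-metal-like)

f = np.linspace(6e14, 12e14, 7)              # frequencies above threshold (Hz)
V0 = (h_true * f - W) / q_e                  # ideal stopping potentials (V)

slope, intercept = np.polyfit(f, V0, 1)      # linear fit: V0 = slope*f + intercept
print(f"h estimated from slope : {slope * q_e:.3e} J*s")
print(f"threshold frequency    : {-intercept / slope:.2e} Hz")
```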

Three-step model

In the X-ray regime, the photoelectric effect in crystalline material is often decomposed into three steps:
  1. Inner photoelectric effect (see photodiode below). The hole left behind can give rise to the Auger effect, which is visible even when the electron does not leave the material. In molecular solids phonons are excited in this step and may be visible as lines in the final electron energy. The inner photoeffect has to be dipole allowed. The transition rules for atoms translate via the tight-binding model onto the crystal. They are similar in geometry to plasma oscillations in that they have to be transverse.
  2. Ballistic transport of half of the electrons to the surface. Some electrons are scattered.
  3. Electrons escape from the material at the surface.
In the three-step model, an electron can take multiple paths through these three steps. All paths can interfere in the sense of the path integral formulation. For surface states and molecules the three-step model still makes some sense, as even most atoms have multiple electrons which can scatter the one electron leaving.

History

When a surface is exposed to electromagnetic radiation above a certain threshold frequency (typically visible light for alkali metals, near ultraviolet for other metals, and extreme ultraviolet for non-metals), the radiation is absorbed and electrons are emitted. Light, and especially ultra-violet light, discharges negatively electrified bodies with the production of rays of the same nature as cathode rays. Under certain circumstances it can directly ionize gases. The first of these phenomena was discovered by Hertz and Hallwachs in 1887. The second was announced first by Philipp Lenard in 1900.
The ultra-violet light to produce these effects may be obtained from an arc lamp, or by burning magnesium, or by sparking with an induction coil between zinc or cadmium terminals, the light from which is very rich in ultra-violet rays. Sunlight is not rich in ultra-violet rays, as these have been absorbed by the atmosphere, and it does not produce nearly so large an effect as the arc-light. Many substances besides metals discharge negative electricity under the action of ultraviolet light: lists of these substances will be found in papers by G. C. Schmidt and O. Knoblauch.

19th century

In 1839, Alexandre Edmond Becquerel discovered the photovoltaic effect while studying the effect of light on electrolytic cells. Though not equivalent to the photoelectric effect, his work on photovoltaics was instrumental in showing a strong relationship between light and electronic properties of materials. In 1873, Willoughby Smith discovered photoconductivity in selenium while testing the metal for its high resistance properties in conjunction with his work involving submarine telegraph cables.
Johann Elster (1854–1920) and Hans Geitel (1855–1923), students in Heidelberg, developed the first practical photoelectric cells that could be used to measure the intensity of light. Elster and Geitel had investigated with great success the effects produced by light on electrified bodies.
In 1887, Heinrich Hertz observed the photoelectric effect and the production and reception of electromagnetic waves. He published these observations in the journal Annalen der Physik. His receiver consisted of a coil with a spark gap, where a spark would be seen upon detection of electromagnetic waves. He placed the apparatus in a darkened box to see the spark better. However, he noticed that the maximum spark length was reduced when in the box. A glass panel placed between the source of electromagnetic waves and the receiver absorbed ultraviolet radiation that assisted the electrons in jumping across the gap. When removed, the spark length would increase. He observed no decrease in spark length when he replaced the glass with quartz, as quartz does not absorb UV radiation. Hertz concluded his months of investigation and reported the results obtained. He did not further pursue the investigation of this effect.
The discovery by Hertz in 1887 that the incidence of ultra-violet light on a spark gap facilitated the passage of the spark, led immediately to a series of investigations by Hallwachs, Hoor, Righi, and Stoletow on the effect of light, and especially of ultra-violet light, on charged bodies. It was proved by these investigations that a newly cleaned surface of zinc, if charged with negative electricity, rapidly loses this charge however small it may be when ultra-violet light falls upon the surface; while if the surface is uncharged to begin with, it acquires a positive charge when exposed to the light, the negative electrification going out into the gas by which the metal is surrounded; this positive electrification can be much increased by directing a strong airblast against the surface. If however the zinc surface is positively electrified it suffers no loss of charge when exposed to the light: this result has been questioned, but a very careful examination of the phenomenon by Elster and Geitel has shown that the loss observed under certain circumstances is due to the discharge by the light reflected from the zinc surface of negative electrification on neighbouring conductors induced by the positive charge, the negative electricity under the influence of the electric field moving up to the positively electrified surface.
With regard to the Hertz effect, the researchers from the start found a great complexity in the phenomenon of photoelectric fatigue, that is, the progressive diminution of the effect observed upon fresh metallic surfaces. According to an important research by Wilhelm Hallwachs, ozone played an important part in the phenomenon. However, other factors also enter, such as oxidation, humidity, the mode of polish of the surface, etc. It was at the time not even certain that the fatigue is absent in a vacuum.
In the period from February 1888 until 1891, a detailed analysis of the photoeffect was performed by Aleksandr Stoletov, with results published in six works: four of them in Comptes Rendus, one review in Physikalische Revue (translated from Russian), and the last work in Journal de Physique. First, in these works Stoletov invented a new experimental setup which was more suitable for a quantitative analysis of the photoeffect. Using this setup, he discovered the direct proportionality between the intensity of light and the induced photoelectric current (the first law of photoeffect or Stoletov's law). One of his other findings resulted from measurements of the dependence of the intensity of the photoelectric current on the gas pressure, where he found the existence of an optimal gas pressure Pm corresponding to a maximum photocurrent; this property was used in the creation of solar cells.
In 1899, J. J. Thomson investigated ultraviolet light in Crookes tubes. Thomson deduced that the ejected particles were the same as those previously found in the cathode ray, later called electrons, which he called "corpuscles". In the research, Thomson enclosed a metal plate (a cathode) in a vacuum tube, and exposed it to high-frequency radiation. It was thought that the oscillating electromagnetic fields caused the atoms' field to resonate and, after reaching a certain amplitude, caused a subatomic "corpuscle" to be emitted, and current to be detected. The amount of this current varied with the intensity and color of the radiation. Larger radiation intensity or frequency would produce more current.

20th century

The discovery of the ionization of gases by ultra-violet light was made by Philipp Lenard in 1900. As the effect was produced across several centimeters of air and yielded a greater number of positive ions than negative, it was natural to interpret the phenomenon, as did J. J. Thomson, as a Hertz effect upon the solid or liquid particles present in the gas.
In 1902, Lenard observed that the energy of individual emitted electrons increased with the frequency (which is related to the color) of the light.
This appeared to be at odds with Maxwell's wave theory of light, which predicted that the electron energy would be proportional to the intensity of the radiation.
Lenard observed the variation in electron energy with light frequency using a powerful electric arc lamp which enabled him to investigate large changes in intensity, and that had sufficient power to enable him to investigate the variation of potential with light frequency. His experiment directly measured potentials, not electron kinetic energy: he found the electron energy by relating it to the maximum stopping potential (voltage) in a phototube. He found that the calculated maximum electron kinetic energy is determined by the frequency of the light. For example, an increase in frequency results in an increase in the maximum kinetic energy calculated for an electron upon liberation – ultraviolet radiation would require a higher applied stopping potential to stop current in a phototube than blue light. However, Lenard's results were qualitative rather than quantitative because of the difficulty in performing the experiments: the experiments needed to be done on freshly cut metal so that the pure metal was observed, but it oxidized in a matter of minutes even in the partial vacuums he used. The current emitted by the surface was determined by the light's intensity, or brightness: doubling the intensity of the light doubled the number of electrons emitted from the surface.
The researches of Langevin and those of Eugene Bloch have shown that the greater part of the Lenard effect is certainly due to this 'Hertz effect'. The Lenard effect upon the gas itself nevertheless does exist. Rediscovered by J. J. Thomson and then more decisively by Frederic Palmer, Jr., it was studied and showed very different characteristics than those at first attributed to it by Lenard.
In 1905, Albert Einstein solved this apparent paradox by describing light as composed of discrete quanta, now called photons, rather than continuous waves. Based upon Max Planck's theory of black-body radiation, Einstein theorized that the energy in each quantum of light was equal to the frequency multiplied by a constant, later called Planck's constant. A photon above a threshold frequency has the required energy to eject a single electron, creating the observed effect. This discovery led to the quantum revolution in physics and earned Einstein the Nobel Prize in Physics in 1921. By wave-particle duality the effect can be analyzed purely in terms of waves though not as conveniently.
Albert Einstein's mathematical description of how the photoelectric effect was caused by absorption of quanta of light was in one of his 1905 papers, named "On a Heuristic Viewpoint Concerning the Production and Transformation of Light". This paper proposed the simple description of "light quanta", or photons, and showed how they explained such phenomena as the photoelectric effect. His simple explanation in terms of absorption of discrete quanta of light explained the features of the phenomenon and the characteristic frequency.
The idea of light quanta began with Max Planck's published law of black-body radiation ("On the Law of Distribution of Energy in the Normal Spectrum") by assuming that Hertzian oscillators could only exist at energies E proportional to the frequency f of the oscillator by E = hf, where h is Planck's constant. By assuming that light actually consisted of discrete energy packets, Einstein wrote an equation for the photoelectric effect that agreed with experimental results. It explained why the energy of photoelectrons was dependent only on the frequency of the incident light and not on its intensity: a low-intensity, high-frequency source could supply a few high-energy photons, whereas a high-intensity, low-frequency source would supply no photons of sufficient individual energy to dislodge any electrons. This was an enormous theoretical leap, but the concept was strongly resisted at first because it contradicted the wave theory of light that followed naturally from James Clerk Maxwell's equations for electromagnetic behavior, and more generally, the assumption of infinite divisibility of energy in physical systems. Even after experiments showed that Einstein's equations for the photoelectric effect were accurate, resistance to the idea of photons continued since it appeared to contradict Maxwell's equations, which were well understood and verified.
Einstein's work predicted that the energy of individual ejected electrons increases linearly with the frequency of the light. Perhaps surprisingly, the precise relationship had not at that time been tested. By 1905 it was known that the energy of photoelectrons increases with increasing frequency of incident light and is independent of the intensity of the light. However, the manner of the increase was not experimentally determined until 1914 when Robert Andrews Millikan showed that Einstein's prediction was correct.
The photoelectric effect helped to propel the then-emerging concept of wave–particle duality in the nature of light. Light simultaneously possesses the characteristics of both waves and particles, each being manifested according to the circumstances. The effect was impossible to understand in terms of the classical wave description of light, as the energy of the emitted electrons did not depend on the intensity of the incident radiation. Classical theory predicted that the electrons would 'gather up' energy over a period of time, and then be emitted.

Uses and effects

Photomultipliers

These are extremely light-sensitive vacuum tubes with a photocathode coated onto part (an end or side) of the inside of the envelope. The photocathode contains combinations of materials such as cesium, rubidium, and antimony specially selected to provide a low work function, so when illuminated even by very low levels of light, the photocathode readily releases electrons. By means of a series of electrodes (dynodes) at ever-higher potentials, these electrons are accelerated and substantially increased in number through secondary emission to provide a readily detectable output current. Photomultipliers are still commonly used wherever low levels of light must be detected.
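
As a rough illustration of the dynode chain (the numbers below are assumptions, not manufacturer data), the overall gain grows geometrically with the number of stages:

```python
# Illustrative sketch (assumed numbers): the overall gain of a photomultiplier
# comes from repeated secondary emission, roughly gain = delta ** n_dynodes,
# where delta is the average number of secondary electrons produced per
# incident electron at each dynode. Both values below are assumptions.
delta = 4          # assumed secondary-emission yield per dynode
n_dynodes = 12     # assumed number of dynode stages

gain = delta ** n_dynodes
print(f"approximate electron gain: {gain:.2e}")   # ~1.7e7 electrons per photoelectron
```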

Image sensors

Video camera tubes in the early days of television used the photoelectric effect; for example, Philo Farnsworth's "Image dissector" used a screen charged by the photoelectric effect to transform an optical image into a scanned electronic signal.

Gold-leaf electroscope

The gold leaf electroscope

Gold-leaf electroscopes are designed to detect static electricity. Charge placed on the metal cap spreads to the stem and the gold leaf of the electroscope. Because they then have the same charge, the stem and leaf repel each other. This will cause the leaf to bend away from the stem.
An electroscope is an important tool in illustrating the photoelectric effect. For example, if the electroscope is negatively charged throughout, there is an excess of electrons and the leaf is separated from the stem. If high-frequency light shines on the cap, the electroscope discharges, and the leaf will fall limp. This is because the frequency of the light shining on the cap is above the cap's threshold frequency. The photons in the light have enough energy to liberate electrons from the cap, reducing its negative charge. This will discharge a negatively charged electroscope and further charge a positive electroscope. However, if the electromagnetic radiation hitting the metal cap does not have a high enough frequency (its frequency is below the threshold value for the cap), then the leaf will never discharge, no matter how long one shines the low-frequency light at the cap.

Photoelectron spectroscopy

Since the energy of the photoelectrons emitted is exactly the energy of the incident photon minus the material's work function or binding energy, the work function of a sample can be determined by bombarding it with a monochromatic X-ray source or UV source, and measuring the kinetic energy distribution of the electrons emitted.
Photoelectron spectroscopy is usually done in a high-vacuum environment, since the electrons would be scattered by gas molecules if they were present. However, some companies are now selling products that allow photoemission in air. The light source can be a laser, a discharge tube, or a synchrotron radiation source.
The concentric hemispherical analyzer is a typical electron energy analyzer and uses an electric field to change the directions of incident electrons, depending on their kinetic energies. For every element and core (atomic orbital) there will be a different binding energy. The many electrons created from each of these combinations will show up as spikes in the analyzer output, and these can be used to determine the elemental composition of the sample.
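
A minimal worked example of the relation described above (the measured kinetic energy is an assumed value; the He-I photon energy of about 21.2 eV is a commonly used ultraviolet source):

```python
# Illustrative sketch (assumed numbers): per the relation above, the kinetic
# energy of a photoelectron is the photon energy minus the binding energy
# (or, for the least-bound electrons, minus the work function), so measuring
# the kinetic-energy distribution lets one read off those energies.
E_photon = 21.2          # He-I ultraviolet line often used in UPS, eV
E_kinetic_max = 16.7     # assumed measured maximum kinetic energy, eV

work_function = E_photon - E_kinetic_max
print(f"inferred work function: {work_function:.1f} eV")
```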

Spacecraft

The photoelectric effect will cause spacecraft exposed to sunlight to develop a positive charge. This can be a major problem, as other parts of the spacecraft are in shadow, which will result in the spacecraft developing a negative charge from nearby plasmas. The imbalance can discharge through delicate electrical components. The static charge created by the photoelectric effect is self-limiting, because a more highly charged object does not give up its electrons as easily as a less highly charged object does.

Moon dust

Light from the sun hitting lunar dust causes it to become charged with the photoelectric effect. The charged dust then repels itself and lifts off the surface of the Moon by electrostatic levitation. This manifests itself almost like an "atmosphere of dust", visible as a thin haze and blurring of distant features, and visible as a dim glow after the sun has set. This was first photographed by the Surveyor program probes in the 1960s. It is thought that the smallest particles are repelled kilometers from the surface and that the particles move in "fountains" as they charge and discharge.

Night vision devices

Photons hitting a thin film of alkali metal or semiconductor material such as gallium arsenide in an image intensifier tube cause the ejection of photoelectrons due to the photoelectric effect. These are accelerated by an electrostatic field where they strike a phosphor-coated screen, converting the electrons back into photons. Intensification of the signal is achieved either through acceleration of the electrons or by increasing the number of electrons through secondary emissions, such as with a micro-channel plate. Sometimes a combination of both methods is used. Additional kinetic energy is required to move an electron out of the conduction band and into the vacuum level. This is known as the electron affinity of the photocathode and is another barrier to photoemission other than the forbidden band, explained by the band gap model. Some materials such as gallium arsenide have an effective electron affinity that is below the level of the conduction band. In these materials, electrons that move to the conduction band all have sufficient energy to be emitted from the material, and as such, the film that absorbs photons can be quite thick. These materials are known as negative electron affinity materials.

Cross section

The photoelectric effect is one interaction mechanism between photons and atoms. It is one of 12 theoretically possible interactions.
At photon energies comparable to the electron rest energy of 511 keV, Compton scattering, another process, may take place. Above twice this (1.022 MeV), pair production may take place. Compton scattering and pair production are examples of two other competing mechanisms.
Indeed, even if the photoelectric effect is the favoured reaction for a particular single-photon bound-electron interaction, the result is also subject to statistical processes and is not guaranteed, albeit the photon has certainly disappeared and a bound electron has been excited (usually K or L shell electrons at gamma ray energies). The probability of the photoelectric effect occurring is measured by the cross-section of interaction, σ. This has been found to be a function of the atomic number of the target atom and photon energy. A crude approximation, for photon energies above the highest atomic binding energy, is given by:
σ ≈ constant × Zⁿ / E³
Here Z is the atomic number, E is the photon energy, and n is a number which varies between 4 and 5. (At lower photon energies a characteristic structure with edges appears: K edge, L edges, M edges, etc.) The obvious interpretation is that the photoelectric effect rapidly decreases in significance, in the gamma-ray region of the spectrum, with increasing photon energy, and that the photoelectric effect increases steeply with atomic number. The corollary is that high-Z materials make good gamma-ray shields, which is the principal reason that lead (Z = 82) is a preferred and ubiquitous gamma radiation shield.
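
As a rough illustration of this scaling (assuming n = 4.5, midway in the stated range), one can compare lead and aluminium at the same photon energy:

```python
# Illustrative sketch of the crude scaling sigma ~ Z**n / E**3 quoted above
# (assumed n = 4.5, midway in the stated 4-5 range). Only the ratio between
# two materials at the same photon energy is meaningful here.
def photoelectric_sigma(Z, E_MeV, n=4.5):
    """Relative photoelectric cross-section (arbitrary units)."""
    return Z**n / E_MeV**3

ratio = photoelectric_sigma(82, 1.0) / photoelectric_sigma(13, 1.0)
print(f"lead / aluminium cross-section ratio at the same energy: {ratio:.0f}")
# The strong Z dependence is why high-Z materials such as lead shield gamma rays well.
```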

Cooperative

From Wikipedia, the free encyclopedia ...