
Tuesday, March 4, 2025

Planck units

From Wikipedia, the free encyclopedia

In particle physics and physical cosmology, Planck units are a system of units of measurement defined exclusively in terms of four universal physical constants: c, G, ħ, and kB (described further below). Expressing one of these physical constants in terms of Planck units yields a numerical value of 1. They are a system of natural units, defined using fundamental properties of nature (specifically, properties of free space) rather than properties of a chosen prototype object. Originally proposed in 1899 by German physicist Max Planck, they are relevant in research on unified theories such as quantum gravity.

The term Planck scale refers to quantities of space, time, energy and other units that are similar in magnitude to corresponding Planck units. This region may be characterized by particle energies of around 10¹⁹ GeV or 10⁹ J, time intervals of around 5×10⁻⁴⁴ s and lengths of around 10⁻³⁵ m (approximately the energy-equivalent of the Planck mass, the Planck time and the Planck length, respectively). At the Planck scale, the predictions of the Standard Model, quantum field theory and general relativity are not expected to apply, and quantum effects of gravity are expected to dominate. One example is represented by the conditions in the first 10⁻⁴³ seconds of our universe after the Big Bang, approximately 13.8 billion years ago.

The four universal constants that, by definition, have a numeric value of 1 when expressed in these units are:

  • the speed of light in vacuum, c,
  • the gravitational constant, G,
  • the reduced Planck constant, ħ, and
  • the Boltzmann constant, kB.

Variants of the basic idea of Planck units exist, such as alternate choices of normalization that give other numeric values to one or more of the four constants above.

Introduction

Any system of measurement may be assigned a mutually independent set of base quantities and associated base units, from which all other quantities and units may be derived. In the International System of Units, for example, the SI base quantities include length with the associated unit of the metre. In the system of Planck units, a similar set of base quantities and associated units may be selected, in terms of which other quantities and coherent units may be expressed. The Planck unit of length has become known as the Planck length, and the Planck unit of time is known as the Planck time, but this nomenclature has not been established as extending to all quantities.

All Planck units are derived from the dimensional universal physical constants that define the system, and in a convention in which these units are omitted (i.e. treated as having the dimensionless value 1), these constants are then eliminated from equations of physics in which they appear. For example, Newton's law of universal gravitation,

F = G m1 m2 / r²

can be expressed as:

F/FP = (m1/mP)(m2/mP) / (r/lP)²

Both equations are dimensionally consistent and equally valid in any system of quantities, but the second equation, with G absent, relates only dimensionless quantities, since any ratio of two like-dimensioned quantities is itself dimensionless. If, by a shorthand convention, it is understood that each physical quantity is the corresponding ratio with a coherent Planck unit (or "expressed in Planck units"), the ratios above may be expressed simply with the symbols of the physical quantities, without being scaled explicitly by their corresponding unit:

F = m1 m2 / r²

This last equation (without G) is valid only if F, m1, m2, and r are understood as the dimensionless ratio quantities corresponding to the standard quantities (e.g. F standing for F/FP), not as a direct equality of quantities. This may seem to be "setting the constants c, G, etc., to 1" if the correspondence of the quantities is thought of as equality. For this reason, Planck or other natural units should be employed with care. Referring to "G = c = 1", Paul S. Wesson wrote that, "Mathematically it is an acceptable trick which saves labour. Physically it represents a loss of information and can lead to confusion."
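
As a concrete illustration of this bookkeeping, here is a minimal Python sketch (not part of the original article; constant values are CODATA SI values and the variable names are my own) that computes the relevant Planck units and checks that the dimensionless form of Newton's law reproduces the SI result:

    # Minimal sketch: expressing Newton's law in Planck units.
    G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.99792458e8     # speed of light, m/s
    hbar = 1.054571817e-34  # reduced Planck constant, J s

    # Coherent Planck units of length, mass and force
    l_P = (hbar * G / c**3) ** 0.5   # ~1.616e-35 m
    m_P = (hbar * c / G) ** 0.5      # ~2.176e-8 kg
    F_P = c**4 / G                   # ~1.21e44 N

    # Two 1000 kg bodies 1 m apart: ordinary form vs dimensionless (Planck-unit) form
    m1 = m2 = 1000.0; r = 1.0
    F_si    = G * m1 * m2 / r**2                 # newtons
    F_ratio = (m1/m_P) * (m2/m_P) / (r/l_P)**2   # pure number
    print(F_si, F_ratio * F_P)                   # the two agree: same force, two descriptions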

History and definition

Max Planck in 1933

The concept of natural units was introduced in 1874, when George Johnstone Stoney, noting that electric charge is quantized, derived units of length, time, and mass, now named Stoney units in his honor. Stoney chose his units so that G, c, and the electron charge e would be numerically equal to 1. In 1899, one year before the advent of quantum theory, Max Planck introduced what became later known as the Planck constant. At the end of the paper, he proposed the base units that were later named in his honor. The Planck units are based on the quantum of action, now usually known as the Planck constant, which appeared in the Wien approximation for black-body radiation. Planck underlined the universality of the new unit system, writing:

... die Möglichkeit gegeben ist, Einheiten für Länge, Masse, Zeit und Temperatur aufzustellen, welche, unabhängig von speciellen Körpern oder Substanzen, ihre Bedeutung für alle Zeiten und für alle, auch ausserirdische und aussermenschliche Culturen nothwendig behalten und welche daher als »natürliche Maasseinheiten« bezeichnet werden können.

... it is possible to set up units for length, mass, time and temperature, which are independent of special bodies or substances, necessarily retaining their meaning for all times and for all civilizations, including extraterrestrial and non-human ones, which can be called "natural units of measure".

Planck considered only the units based on the universal constants G, c, h, and kB to arrive at natural units for length, time, mass, and temperature. His definitions differ from the modern ones by a factor of √(2π), because the modern definitions use ħ rather than h.

Table 1: Modern values for Planck's original choice of quantities
Name                 Dimension          Expression             Value (SI units)
Planck length        length (L)         lP = √(ħG/c³)          1.616255(18)×10⁻³⁵ m
Planck mass          mass (M)           mP = √(ħc/G)           2.176434(24)×10⁻⁸ kg
Planck time          time (T)           tP = √(ħG/c⁵)          5.391247(60)×10⁻⁴⁴ s
Planck temperature   temperature (Θ)    TP = √(ħc⁵/(G·kB²))    1.416784(16)×10³² K
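
The table values can be reproduced from the defining constants; the following short Python check (my own sketch, using CODATA values for the constants) evaluates the four expressions:

    # Check of the Table 1 expressions against the quoted SI values.
    import math

    c    = 2.99792458e8       # m/s
    G    = 6.67430e-11        # m^3 kg^-1 s^-2
    hbar = 1.054571817e-34    # J s
    kB   = 1.380649e-23       # J/K

    l_P = math.sqrt(hbar * G / c**3)        # Planck length      ~1.616e-35 m
    m_P = math.sqrt(hbar * c / G)           # Planck mass        ~2.176e-8 kg
    t_P = math.sqrt(hbar * G / c**5)        # Planck time        ~5.391e-44 s
    T_P = math.sqrt(hbar * c**5 / G) / kB   # Planck temperature ~1.417e32 K
    print(l_P, m_P, t_P, T_P)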

Unlike the case with the International System of Units, there is no official entity that establishes a definition of a Planck unit system. Some authors define the base Planck units to be those of mass, length and time, regarding an additional unit for temperature to be redundant. Other tabulations add, in addition to a unit for temperature, a unit for electric charge, so that either the Coulomb constant ke = 1/(4πε0) or the vacuum permittivity ε0 is normalized to 1. Thus, depending on the author's choice, this charge unit is given by qP = √(4πε0ħc) for ke = 1, or by qP = √(ε0ħc) for ε0 = 1. Some of these tabulations also replace mass with energy. In SI units, the values of c, h, e and kB are exact, and the values of ε0 and G have relative uncertainties of 1.6×10⁻¹⁰ and 2.2×10⁻⁵ respectively. Hence, the uncertainties in the SI values of the Planck units derive almost entirely from uncertainty in the SI value of G.

Compared to Stoney units, Planck base units are all larger by a factor of 1/√α, where α is the fine-structure constant.

Derived units

In any system of measurement, units for many physical quantities can be derived from base units. Table 2 offers a sample of derived Planck units, some of which are seldom used. As with the base units, their use is mostly confined to theoretical physics because most of them are too large or too small for empirical or practical use and there are large uncertainties in their values.

Table 2: Coherent derived units of Planck units
Derived unit of           Expression                Approximate SI equivalent
area (L²)                 lP² = ħG/c³               2.6121×10⁻⁷⁰ m²
volume (L³)               lP³ = (ħG/c³)^(3/2)       4.2217×10⁻¹⁰⁵ m³
momentum (LMT⁻¹)          mP⋅c = √(ħc³/G)           6.5249 kg⋅m/s
energy (L²MT⁻²)           EP = mP⋅c² = √(ħc⁵/G)     1.9561×10⁹ J
force (LMT⁻²)             FP = EP/lP = c⁴/G         1.2103×10⁴⁴ N
density (L⁻³M)            mP/lP³ = c⁵/(ħG²)         5.1550×10⁹⁶ kg/m³
acceleration (LT⁻²)       c/tP = √(c⁷/(ħG))         5.5608×10⁵¹ m/s²
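
The entries above follow directly from the base units, just as SI derived units follow from SI base units; this short Python sketch (mine, with CODATA constant values) reproduces them:

    # Derived Planck units computed from the base Planck units.
    import math
    c, G, hbar = 2.99792458e8, 6.67430e-11, 1.054571817e-34

    l_P = math.sqrt(hbar * G / c**3)
    m_P = math.sqrt(hbar * c / G)
    t_P = math.sqrt(hbar * G / c**5)

    derived = {
        "area (m^2)":           l_P**2,          # ~2.61e-70
        "momentum (kg m/s)":    m_P * c,         # ~6.52
        "energy (J)":           m_P * c**2,      # ~1.96e9
        "force (N)":            m_P * c / t_P,   # = c^4/G, ~1.21e44
        "density (kg/m^3)":     m_P / l_P**3,    # ~5.16e96
        "acceleration (m/s^2)": c / t_P,         # ~5.56e51
    }
    for name, value in derived.items():
        print(f"{name}: {value:.4e}")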

Some Planck units, such as of time and length, are many orders of magnitude too large or too small to be of practical use, so that Planck units as a system are typically only relevant to theoretical physics. In some cases, a Planck unit may suggest a limit to a range of a physical quantity where present-day theories of physics apply. For example, our understanding of the Big Bang does not extend to the Planck epoch, i.e., when the universe was less than one Planck time old. Describing the universe during the Planck epoch requires a theory of quantum gravity that would incorporate quantum effects into general relativity. Such a theory does not yet exist.

Several quantities are not "extreme" in magnitude, such as the Planck mass, which is about 22 micrograms: very large in comparison with subatomic particles, and within the mass range of living organisms. Similarly, the related units of energy and of momentum are in the range of some everyday phenomena.

Significance

Planck units have little anthropocentric arbitrariness, but do still involve some arbitrary choices in terms of the defining constants. Unlike the metre and second, which exist as base units in the SI system for historical reasons, the Planck length and Planck time are conceptually linked at a fundamental physical level. Consequently, natural units help physicists to reframe questions. Frank Wilczek puts it succinctly:

We see that the question [posed] is not, "Why is gravity so feeble?" but rather, "Why is the proton's mass so small?" For in natural (Planck) units, the strength of gravity simply is what it is, a primary quantity, while the proton's mass is the tiny number 1/13 quintillion.

While it is true that the electrostatic repulsive force between two protons (alone in free space) greatly exceeds the gravitational attractive force between the same two protons, this is not about the relative strengths of the two fundamental forces. From the point of view of Planck units, this is comparing apples with oranges, because mass and electric charge are incommensurable quantities. Rather, the disparity of magnitude of force is a manifestation of the fact that the charge on the proton is approximately the unit charge, but the mass of the proton is far less than the unit mass, in a system that treats both forces as having the same form.

When Planck proposed his units, the goal was only that of establishing a universal ("natural") way of measuring objects, without giving any special meaning to quantities that measured one single unit. In 1918, Arthur Eddington suggested that the Planck length could have a special significance for understanding gravitation, but this suggestion was not influential. During the 1950s, multiple authors including Lev Landau and Oskar Klein argued that quantities on the order of the Planck scale indicated the limits of the validity of quantum field theory. John Archibald Wheeler proposed in 1955 that quantum fluctuations of spacetime become significant at the Planck scale, though at the time he was unaware of Planck's unit system. In 1959, C. A. Mead showed that distances of the order of one Planck length, and similarly times of the order of the Planck time, did carry special implications related to Heisenberg's uncertainty principle:

An analysis of the effect of gravitation on hypothetical experiments indicates that it is impossible to measure the position of a particle with error less than Δx ≳ √G = 1.6 × 10⁻³³ cm, where G is the gravitational constant in natural units. A similar limitation applies to the precise synchronization of clocks.

Planck scale

The Planck mass can be defined as the mass at which a particle's Schwarzschild radius equals its Compton wavelength, making it the threshold where gravity and quantum effects become equally important.
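
A one-line way to see this (my own sketch, ignoring numerical factors of order unity) is to set the Schwarzschild radius equal to the reduced Compton wavelength and solve for the mass; in LaTeX notation:

    % Schwarzschild radius ~ reduced Compton wavelength, solved for m
    \frac{2Gm}{c^{2}} \;\sim\; \frac{\hbar}{mc}
    \quad\Longrightarrow\quad
    m \;\sim\; \sqrt{\frac{\hbar c}{G}} \;=\; m_{\mathrm{P}}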

In particle physics and physical cosmology, the Planck scale is an energy scale around 1.22×10²⁸ eV (the Planck energy, corresponding to the energy equivalent of the Planck mass, 2.17645×10⁻⁸ kg) at which quantum effects of gravity become significant. At this scale, present descriptions and theories of sub-atomic particle interactions in terms of quantum field theory break down and become inadequate, due to the impact of the apparent non-renormalizability of gravity within current theories.

Relationship to gravity

At the Planck length scale, the strength of gravity is expected to become comparable with the other forces, and it has been theorized that all the fundamental forces are unified at that scale, but the exact mechanism of this unification remains unknown. The Planck scale is therefore the point at which the effects of quantum gravity can no longer be ignored in other fundamental interactions, where current calculations and approaches begin to break down, and a means to take account of its impact is necessary. On these grounds, it has been speculated that it may be an approximate lower limit at which a black hole could be formed by collapse.

While physicists have a fairly good understanding of the other fundamental interactions at the quantum level, gravity is problematic, and cannot be integrated with quantum mechanics at very high energies using the usual framework of quantum field theory. At lower energies it is usually ignored, while for energies approaching or exceeding the Planck scale, a new theory of quantum gravity is necessary. Approaches to this problem include string theory and M-theory, loop quantum gravity, noncommutative geometry, and causal set theory.

In cosmology

In Big Bang cosmology, the Planck epoch or Planck era is the earliest stage of the Big Bang, before the time passed was equal to the Planck time, tP, or approximately 10⁻⁴³ seconds. There is no currently available physical theory to describe such short times, and it is not clear in what sense the concept of time is meaningful for values smaller than the Planck time. It is generally assumed that quantum effects of gravity dominate physical interactions at this time scale. At this scale, the unified force of the Standard Model is assumed to be unified with gravitation. Immeasurably hot and dense, the state of the Planck epoch was succeeded by the grand unification epoch, where gravitation is separated from the unified force of the Standard Model, in turn followed by the inflationary epoch, which ended after about 10⁻³² seconds (or about 10¹¹ tP).

Table 3 lists properties of the observable universe today expressed in Planck units.

Table 3: Today's universe in Planck units
Property of present-day observable universe   Approximate number of Planck units   Equivalents
Age                     8.08 × 10⁶⁰ tP          4.35 × 10¹⁷ s or 1.38 × 10¹⁰ years
Diameter                5.4 × 10⁶¹ lP           8.7 × 10²⁶ m or 9.2 × 10¹⁰ light-years
Mass                    approx. 10⁶⁰ mP         3 × 10⁵² kg or 1.5 × 10²² solar masses (only counting stars); about 10⁸⁰ protons (sometimes known as the Eddington number)
Density                 1.8 × 10⁻¹²³ mP⋅lP⁻³    9.9 × 10⁻²⁷ kg⋅m⁻³
Temperature             1.9 × 10⁻³² TP          2.725 K (temperature of the cosmic microwave background radiation)
Cosmological constant   ≈ 10⁻¹²² lP⁻²           ≈ 10⁻⁵² m⁻²
Hubble constant         ≈ 10⁻⁶¹ tP⁻¹            ≈ 10⁻¹⁸ s⁻¹ ≈ 10² (km/s)/Mpc

After the measurement of the cosmological constant (Λ) in 1998, estimated at 10⁻¹²² in Planck units, it was noted that this is suggestively close to the reciprocal of the age of the universe (T) squared. Barrow and Shaw proposed a modified theory in which Λ is a field evolving in such a way that its value remains Λ ~ T⁻² throughout the history of the universe.
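
A back-of-the-envelope check (mine, using the rounded values quoted in this article) that the two numbers really are of the same order:

    # Cosmological constant in Planck units vs. 1/(age of universe)^2.
    lam = 1.1e-52       # cosmological constant, m^-2 (approximate observed value)
    l_P = 1.616e-35     # Planck length, m
    t_P = 5.391e-44     # Planck time, s
    age = 4.35e17       # age of the universe, s

    lam_planck = lam * l_P**2      # ~3e-122, i.e. "about 10^-122"
    inv_age_sq = (t_P / age)**2    # ~1.5e-122
    print(lam_planck, inv_age_sq)  # same order of magnitude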

Analysis of the units

Planck length

The Planck length, denoted lP, is a unit of length defined as:

lP = √(ħG/c³)

It is equal to 1.616255(18)×10⁻³⁵ m (the two digits enclosed by parentheses are the estimated standard error associated with the reported numerical value) or about 10⁻²⁰ times the diameter of a proton. It can be motivated in various ways, such as considering a particle whose reduced Compton wavelength is comparable to its Schwarzschild radius, though whether those concepts are in fact simultaneously applicable is open to debate. (The same heuristic argument simultaneously motivates the Planck mass.)

The Planck length is a distance scale of interest in speculations about quantum gravity. The Bekenstein–Hawking entropy of a black hole is one-fourth the area of its event horizon in units of Planck length squared. Since the 1950s, it has been conjectured that quantum fluctuations of the spacetime metric might make the familiar notion of distance inapplicable below the Planck length. This is sometimes expressed by saying that "spacetime becomes a foam at the Planck scale". It is possible that the Planck length is the shortest physically measurable distance, since any attempt to investigate the possible existence of shorter distances, by performing higher-energy collisions, would result in black hole production. Higher-energy collisions, rather than splitting matter into finer pieces, would simply produce bigger black holes.
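
To illustrate the entropy statement, here is a short Python sketch (my own; the choice of a solar-mass black hole is simply an example) evaluating a horizon area in Planck areas:

    # Bekenstein-Hawking entropy as one quarter of the horizon area in Planck areas.
    import math
    G, c, hbar = 6.67430e-11, 2.99792458e8, 1.054571817e-34
    M_sun = 1.989e30                 # kg

    l_P2 = hbar * G / c**3           # Planck area, m^2
    r_s  = 2 * G * M_sun / c**2      # Schwarzschild radius, ~2.95 km
    A    = 4 * math.pi * r_s**2      # horizon area, m^2
    S    = A / (4 * l_P2)            # entropy in units of k_B, ~1e77
    print(f"{S:.2e}")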

The strings of string theory are modeled to be on the order of the Planck length. In theories with large extra dimensions, the Planck length calculated from the observed value of G can be smaller than the true, fundamental Planck length.

Planck time

The Planck time, denoted tP, is defined as:

tP = lP/c = √(ħG/c⁵)

This is the time required for light to travel a distance of 1 Planck length in vacuum, which is a time interval of approximately 5.39×10⁻⁴⁴ s. No current physical theory can describe timescales shorter than the Planck time, such as the earliest events after the Big Bang. Some conjectures state that the structure of time need not remain smooth on intervals comparable to the Planck time.

Planck energy

The Planck energy EP is approximately equal to the energy released in the combustion of the fuel in an automobile fuel tank (57.2 L at 34.2 MJ/L of chemical energy). The ultra-high-energy cosmic ray observed in 1991 had a measured energy of about 50 J, equivalent to about 2.5×10⁻⁸ EP.
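
Both comparisons are easy to verify; a quick arithmetic check (mine) in Python:

    # Planck energy vs. a fuel tank and the 1991 ultra-high-energy cosmic ray.
    import math
    G, c, hbar = 6.67430e-11, 2.99792458e8, 1.054571817e-34

    E_P   = math.sqrt(hbar * c**5 / G)   # Planck energy, ~1.956e9 J
    fuel  = 57.2 * 34.2e6                # 57.2 L at 34.2 MJ/L -> ~1.956e9 J
    uhecr = 50 / E_P                     # 1991 cosmic ray, ~2.5e-8 E_P
    print(E_P, fuel, uhecr)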

Proposals for theories of doubly special relativity posit that, in addition to the speed of light, an energy scale is also invariant for all inertial observers. Typically, this energy scale is chosen to be the Planck energy.

Planck unit of force

The Planck unit of force may be thought of as the derived unit of force in the Planck system if the Planck units of time, length, and mass are considered to be base units:

FP = mP⋅lP/tP² = c⁴/G

It is the gravitational attractive force between two bodies of 1 Planck mass each that are held 1 Planck length apart. One convention for the Planck charge is to choose it so that the electrostatic repulsion between two objects with Planck charge and mass, held 1 Planck length apart, balances the Newtonian attraction between them.
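
That statement is easy to verify algebraically; a short check (mine), in LaTeX notation:

    % Gravitational force between two Planck masses one Planck length apart
    F \;=\; \frac{G\,m_{\mathrm{P}}^{2}}{\ell_{\mathrm{P}}^{2}}
      \;=\; \frac{G\,(\hbar c/G)}{\hbar G/c^{3}}
      \;=\; \frac{c^{4}}{G}
      \;=\; F_{\mathrm{P}}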

Some authors have argued that the Planck force is on the order of the maximum force that can occur between two bodies. However, the validity of these conjectures has been disputed.

Planck temperature

The Planck temperature TP is 1.416784(16)×1032 K. At this temperature, the wavelength of light emitted by thermal radiation reaches the Planck length. There are no known physical models able to describe temperatures greater than TP; a quantum theory of gravity would be required to model the extreme energies attained. Hypothetically, a system in thermal equilibrium at the Planck temperature might contain Planck-scale black holes, constantly being formed from thermal radiation and decaying via Hawking evaporation. Adding energy to such a system might decrease its temperature by creating larger black holes, whose Hawking temperature is lower.
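
A rough way to see the claim about thermal radiation (my own back-of-the-envelope check, using Wien's displacement law and rounded constants):

    # At T_P the peak thermal wavelength is of the order of the Planck length.
    b   = 2.898e-3        # Wien displacement constant, m*K
    T_P = 1.416784e32     # Planck temperature, K
    l_P = 1.616255e-35    # Planck length, m
    print(b / T_P, l_P)   # ~2.0e-35 m vs ~1.6e-35 m: same order of magnitude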

Nondimensionalized equations

Physical quantities that have different dimensions (such as time and length) cannot be equated even if they are numerically equal (e.g., 1 second is not the same as 1 metre). In theoretical physics, however, this scruple may be set aside, by a process called nondimensionalization. The effective result is that many fundamental equations of physics, which often include some of the constants used to define Planck units, become equations where these constants are replaced by a 1.

Examples include the energy–momentum relation E² = (pc)² + (mc²)², which becomes E² = p² + m², and the Dirac equation (iħγ^μ∂_μ − mc)ψ = 0, which becomes (iγ^μ∂_μ − m)ψ = 0.

Alternative choices of normalization

As already stated above, Planck units are derived by "normalizing" the numerical values of certain fundamental constants to 1. These normalizations are neither the only ones possible nor necessarily the best. Moreover, the choice of what factors to normalize, among the factors appearing in the fundamental equations of physics, is not evident, and the values of the Planck units are sensitive to this choice.

The factor 4π is ubiquitous in theoretical physics because, in three-dimensional space, the surface area of a sphere of radius r is 4πr². This, along with the concept of flux, is the basis for the inverse-square law, Gauss's law, and the divergence operator applied to flux density. For example, gravitational and electrostatic fields produced by point objects have spherical symmetry, and so the electric flux through a sphere of radius r around a point charge will be distributed uniformly over that sphere. From this, it follows that a factor of 4πr² will appear in the denominator of Coulomb's law in rationalized form. (Both the numerical factor and the power of the dependence on r would change if space were higher-dimensional; the correct expressions can be deduced from the geometry of higher-dimensional spheres.) Likewise for Newton's law of universal gravitation: a factor of 4π naturally appears in Poisson's equation when relating the gravitational potential to the distribution of matter.

Hence a substantial body of physical theory developed since Planck's 1899 paper suggests normalizing not G but 4πG (or 8πG) to 1. Doing so would introduce a factor of 1/4π (or 1/8π) into the nondimensionalized form of the law of universal gravitation, consistent with the modern rationalized formulation of Coulomb's law in terms of the vacuum permittivity. In fact, alternative normalizations frequently preserve the factor of 1/4π in the nondimensionalized form of Coulomb's law as well, so that the nondimensionalized Maxwell's equations for electromagnetism and gravitoelectromagnetism both take the same form as those for electromagnetism in SI, which do not have any factors of 4π. When this convention is applied to the electromagnetic constant ε0, the unit system is called "rationalized". When applied additionally to gravitation and Planck units, the resulting units are called rationalized Planck units and are used in high-energy physics.

The rationalized Planck units are defined so that c = 4πG = ħ = ε0 = kB = 1.
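
As an illustration of how this alternative normalization rescales the units, here is a small Python sketch (mine; constant values are CODATA SI values) comparing the rationalized choices with the standard ones:

    # "Rationalized" variant: 4*pi*G (not G) and epsilon_0 (not the Coulomb
    # constant) are set to 1, which rescales the base units as shown here.
    import math
    c, G, hbar = 2.99792458e8, 6.67430e-11, 1.054571817e-34
    eps0 = 8.8541878128e-12    # vacuum permittivity, F/m

    l_P_rat = math.sqrt(hbar * (4 * math.pi * G) / c**3)   # sqrt(4*pi) times l_P
    m_P_rat = math.sqrt(hbar * c / (4 * math.pi * G))      # 1/sqrt(4*pi) times m_P
    q_rat   = math.sqrt(eps0 * hbar * c)                   # charge unit for eps0 = 1
    q_std   = math.sqrt(4 * math.pi * eps0 * hbar * c)     # charge unit for k_e = 1
    print(l_P_rat, m_P_rat, q_rat, q_std)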

There are several possible alternative normalizations.

Gravitational constant

In 1899, Newton's law of universal gravitation was still seen as exact, rather than as a convenient approximation holding for "small" velocities and masses (the approximate nature of Newton's law was shown following the development of general relativity in 1915). Hence Planck normalized to 1 the gravitational constant G in Newton's law. In theories emerging after 1899, G nearly always appears in formulae multiplied by 4π or a small integer multiple thereof. Hence, a choice to be made when designing a system of natural units is which, if any, instances of 4π appearing in the equations of physics are to be eliminated via the normalization.

Virtual particle

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Virtual_particle

A virtual particle is a theoretical transient particle that exhibits some of the characteristics of an ordinary particle, while having its existence limited by the uncertainty principle, which allows the virtual particles to spontaneously emerge from vacuum at short time and space ranges. The concept of virtual particles arises in the perturbation theory of quantum field theory (QFT) where interactions between ordinary particles are described in terms of exchanges of virtual particles. A process involving virtual particles can be described by a schematic representation known as a Feynman diagram, in which virtual particles are represented by internal lines.

Virtual particles do not necessarily carry the same mass as the corresponding ordinary particle, although they always conserve energy and momentum. The closer its characteristics come to those of ordinary particles, the longer the virtual particle exists. They are important in the physics of many processes, including particle scattering and Casimir forces. In quantum field theory, forces—such as the electromagnetic repulsion or attraction between two charges—can be thought of as resulting from the exchange of virtual photons between the charges. Virtual photons are the exchange particles for the electromagnetic interaction.

The term is somewhat loose and vaguely defined, in that it refers to the view that the world is made up of "real particles". "Real particles" are better understood to be excitations of the underlying quantum fields. Virtual particles are also excitations of the underlying fields, but are "temporary" in the sense that they appear in calculations of interactions, but never as asymptotic states or indices to the scattering matrix. The accuracy and use of virtual particles in calculations is firmly established, but as they cannot be detected in experiments, deciding how to precisely describe them is a topic of debate. Although widely used, they are by no means a necessary feature of QFT, but rather are mathematical conveniences — as demonstrated by lattice field theory, which avoids using the concept altogether.

Properties

The concept of virtual particles arises in the perturbation theory of quantum field theory, an approximation scheme in which interactions (in essence, forces) between actual particles are calculated in terms of exchanges of virtual particles. Such calculations are often performed using schematic representations known as Feynman diagrams, in which virtual particles appear as internal lines. By expressing the interaction in terms of the exchange of a virtual particle with four-momentum q, where q is given by the difference between the four-momenta of the particles entering and leaving the interaction vertex, both momentum and energy are conserved at the interaction vertices of the Feynman diagram.

A virtual particle does not precisely obey the energy–momentum relation m²c⁴ = E² − p²c². Its kinetic energy may not have the usual relationship to velocity. It can be negative. This is expressed by the phrase off mass shell. The probability amplitude for a virtual particle to exist tends to be canceled out by destructive interference over longer distances and times. As a consequence, a real photon is massless and thus has only two polarization states, whereas a virtual one, being effectively massive, has three polarization states.

Quantum tunnelling may be considered a manifestation of virtual particle exchanges. The range of forces carried by virtual particles is limited by the uncertainty principle, which regards energy and time as conjugate variables; thus, virtual particles of larger mass have more limited range.
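
To make the last statement concrete, here is a small Python illustration (my own; the masses are standard measured values) of the rule of thumb that the range of a force mediated by a massive virtual particle is roughly the reduced Compton wavelength ħ/(mc) of the exchanged particle:

    # Range of a force ~ reduced Compton wavelength of the exchanged particle.
    hbar_c   = 197.327         # MeV*fm, a convenient combination of constants
    m_pion   = 139.57          # charged pion mass, MeV/c^2
    m_wboson = 80377.0         # W boson mass, MeV/c^2

    print(hbar_c / m_pion)     # ~1.4 fm: range of the residual strong force
    print(hbar_c / m_wboson)   # ~0.0025 fm: the much shorter weak-force range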

In the usual mathematical notation of the equations of physics, there is no mark distinguishing virtual from actual particles. The amplitudes of processes with a virtual particle interfere with the amplitudes of processes without it, whereas for an actual particle the cases of existence and non-existence cease to be coherent with each other and do not interfere any more. In the quantum field theory view, actual particles are viewed as being detectable excitations of underlying quantum fields. Virtual particles are also viewed as excitations of the underlying fields, but appear only as forces, not as detectable particles. They are "temporary" in the sense that they appear in some calculations, but are not detected as single particles. Thus, in mathematical terms, they never appear as indices to the scattering matrix, which is to say, they never appear as the observable inputs and outputs of the physical process being modelled.

There are two principal ways in which the notion of virtual particles appears in modern physics. They appear as intermediate terms in Feynman diagrams; that is, as terms in a perturbative calculation. They also appear as an infinite set of states to be summed or integrated over in the calculation of a semi-non-perturbative effect. In the latter case, it is sometimes said that virtual particles contribute to a mechanism that mediates the effect, or that the effect occurs through the virtual particles.

Manifestations

There are many observable physical phenomena that arise in interactions involving virtual particles. For bosonic particles that exhibit rest mass when they are free and actual, virtual interactions are characterized by the relatively short range of the force interaction produced by particle exchange. Confinement can lead to a short range, too. Examples of such short-range interactions are the strong and weak forces, and their associated field bosons.

For the gravitational and electromagnetic forces, the zero rest mass of the associated boson particle permits long-range forces to be mediated by virtual particles. However, in the case of photons, power and information transfer by virtual particles is a relatively short-range phenomenon (existing only within a few wavelengths of the field disturbance, which carries information or transferred power), as for example seen in the characteristically short range of inductive and capacitive effects in the near-field zone of coils and antennas.

Some field interactions which may be seen in terms of virtual particles are:

  • The Coulomb force (static electric force) between electric charges. It is caused by the exchange of virtual photons. In symmetric 3-dimensional space this exchange results in the inverse square law for the electric force. Since the photon has no mass, the Coulomb potential has an infinite range.
  • The magnetic field between magnetic dipoles. It is caused by the exchange of virtual photons. In symmetric 3-dimensional space, this exchange results in the inverse cube law for magnetic force. Since the photon has no mass, the magnetic potential has an infinite range. Even though the range is infinite, the time lapse allowed for a virtual photon existence is not infinite.
  • Electromagnetic induction. This phenomenon transfers energy to and from a magnetic coil via a changing (electro)magnetic field.
  • The strong nuclear force between quarks is the result of interaction of virtual gluons. The residual of this force outside of quark triplets (neutron and proton) holds neutrons and protons together in nuclei, and is due to virtual mesons such as the pi meson and rho meson.
  • The weak nuclear force is the result of exchange by virtual W and Z bosons.
  • The spontaneous emission of a photon during the decay of an excited atom or excited nucleus; such a decay is prohibited by ordinary quantum mechanics and requires the quantization of the electromagnetic field for its explanation.
  • The Casimir effect, where the ground state of the quantized electromagnetic field causes attraction between a pair of electrically neutral metal plates.
  • The van der Waals force, which is partly due to the Casimir effect between two atoms.
  • Vacuum polarization, which involves pair production or the decay of the vacuum, which is the spontaneous production of particle-antiparticle pairs (such as electron-positron).
  • Lamb shift of positions of atomic levels.
  • The impedance of free space, which defines the ratio between the electric field strength |E| and the magnetic field strength |H|: Z0 = |E| / |H|.
  • Much of the so-called near field of radio antennas, where the magnetic and electric effects of the changing current in the antenna wire and the charge effects of the wire's capacitive charge may be (and usually are) important contributors to the total EM field close to the source. Both of these are dipole effects that decay with increasing distance from the antenna much more quickly than does the influence of "conventional" electromagnetic waves that are "far" from the source. These far-field waves, for which E is (in the limit of long distance) equal to cB, are composed of actual photons. Actual and virtual photons are mixed near an antenna, with the virtual photons responsible only for the "extra" magnetic-inductive and transient electric-dipole effects, which cause any imbalance between E and cB. As distance from the antenna grows, the near-field effects (as dipole fields) die out more quickly, and only the "radiative" effects due to actual photons remain important. Although virtual effects extend to infinity, they drop off in field strength as 1/r² rather than as 1/r, the fall-off of the field of EM waves composed of actual photons.

Most of these have analogous effects in solid-state physics; indeed, one can often gain a better intuitive understanding by examining these cases. In semiconductors, the roles of electrons, positrons and photons in field theory are replaced by electrons in the conduction band, holes in the valence band, and phonons or vibrations of the crystal lattice. A virtual particle is in a virtual state where the probability amplitude is not conserved. Examples of macroscopic virtual phonons, photons, and electrons in the case of the tunneling process were presented by Günter Nimtz and Alfons A. Stahlhofen.

Feynman diagrams

One particle exchange scattering diagram

The calculation of scattering amplitudes in theoretical particle physics requires the use of some rather large and complicated integrals over a large number of variables. These integrals do, however, have a regular structure, and may be represented as Feynman diagrams. The appeal of Feynman diagrams is strong, as they allow for a simple visual presentation of what would otherwise be a rather arcane and abstract formula. In particular, part of the appeal is that the outgoing legs of a Feynman diagram can be associated with actual, on-shell particles. Thus, it is natural to associate the other lines in the diagram with particles as well, called the "virtual particles". In mathematical terms, they correspond to the propagators appearing in the diagram.

In the adjacent image, the solid lines correspond to actual particles (of momentum p1 and so on), while the dotted line corresponds to a virtual particle carrying momentum k. For example, if the solid lines were to correspond to electrons interacting by means of the electromagnetic interaction, the dotted line would correspond to the exchange of a virtual photon. In the case of interacting nucleons, the dotted line would be a virtual pion. In the case of quarks interacting by means of the strong force, the dotted line would be a virtual gluon, and so on.

One-loop diagram with fermion propagator

Virtual particles may be mesons or vector bosons, as in the example above; they may also be fermions. However, in order to preserve quantum numbers, most simple diagrams involving fermion exchange are prohibited. The image to the right shows an allowed diagram, a one-loop diagram. The solid lines correspond to a fermion propagator, the wavy lines to bosons.

Vacuums

In formal terms, a particle is considered to be an eigenstate of the particle number operator a†a, where a is the particle annihilation operator and a† the particle creation operator (sometimes collectively called ladder operators). In many cases, the particle number operator does not commute with the Hamiltonian for the system. This implies the number of particles in an area of space is not a well-defined quantity but, like other quantum observables, is represented by a probability distribution. Since these particles are not certain to exist, they are called virtual particles or vacuum fluctuations of vacuum energy. In a certain sense, they can be understood to be a manifestation of the time-energy uncertainty principle in a vacuum.

An important example of the "presence" of virtual particles in a vacuum is the Casimir effect. Here, the explanation of the effect requires that the total energy of all of the virtual particles in a vacuum can be added together. Thus, although the virtual particles themselves are not directly observable in the laboratory, they do leave an observable effect: Their zero-point energy results in forces acting on suitably arranged metal plates or dielectrics. On the other hand, the Casimir effect can be interpreted as the relativistic van der Waals force.

Pair production

Virtual particles are often popularly described as coming in pairs, a particle and antiparticle which can be of any kind. These pairs exist for an extremely short time, and then mutually annihilate, or in some cases, the pair may be boosted apart using external energy so that they avoid annihilation and become actual particles, as described below.

This may occur in one of two ways. In an accelerating frame of reference, the virtual particles may appear to be actual to the accelerating observer; this is known as the Unruh effect. In short, the vacuum of a stationary frame appears, to the accelerated observer, to be a warm gas of actual particles in thermodynamic equilibrium.

Another example is pair production in very strong electric fields, sometimes called vacuum decay. If, for example, a pair of atomic nuclei are merged to very briefly form a nucleus with a charge greater than about 140 (that is, larger than about the inverse of the fine-structure constant, which is a dimensionless quantity), the strength of the electric field will be such that it will be energetically favorable to create positron–electron pairs out of the vacuum or Dirac sea, with the electron attracted to the nucleus to annihilate the positive charge. This pair-creation amplitude was first calculated by Julian Schwinger in 1951.

Compared to actual particles

As a consequence of quantum mechanical uncertainty, any object or process that exists for a limited time or in a limited volume cannot have a precisely defined energy or momentum. For this reason, virtual particles – which exist only temporarily as they are exchanged between ordinary particles – do not typically obey the mass-shell relation; the longer a virtual particle exists, the more the energy and momentum approach the mass-shell relation.

The lifetime of real particles is typically vastly longer than the lifetime of the virtual particles. Electromagnetic radiation consists of real photons which may travel light years between the emitter and absorber, but (Coulombic) electrostatic attraction and repulsion is a relatively short-range force that is a consequence of the exchange of virtual photons.

Philosophy of science

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Philosophy_of_science

Philosophy of ...