
Sunday, May 28, 2023

Observational cosmology

From Wikipedia, the free encyclopedia

Observational cosmology is the study of the structure, the evolution and the origin of the universe through observation, using instruments such as telescopes and cosmic ray detectors.

Early observations

The science of physical cosmology as it is practiced today had its subject material defined in the years following the Shapley-Curtis debate when it was determined that the universe had a larger scale than the Milky Way galaxy. This was precipitated by observations that established the size and the dynamics of the cosmos that could be explained by Albert Einstein's General Theory of Relativity. In its infancy, cosmology was a speculative science based on a very limited number of observations and characterized by a dispute between steady state theorists and promoters of Big Bang cosmology. It was not until the 1990s and beyond that the astronomical observations would be able to eliminate competing theories and drive the science to the "Golden Age of Cosmology" which was heralded by David Schramm at a National Academy of Sciences colloquium in 1992.

Hubble's law and the cosmic distance ladder

Distance measurements in astronomy have historically been and continue to be confounded by considerable measurement uncertainty. In particular, while stellar parallax can be used to measure the distance to nearby stars, the observational limits imposed by the difficulty in measuring the minuscule parallaxes associated with objects beyond our galaxy meant that astronomers had to look for alternative ways to measure cosmic distances. To this end, a standard candle measurement for Cepheid variables was discovered by Henrietta Swan Leavitt in 1908, which would provide Edwin Hubble with the rung on the cosmic distance ladder he would need to determine the distance to spiral nebulae. Hubble used the 100-inch Hooker Telescope at Mount Wilson Observatory to identify individual stars in those galaxies, and determine the distance to the galaxies by isolating individual Cepheids. This firmly established the spiral nebulae as objects well outside the Milky Way galaxy. Determining the distance to "island universes", as they were dubbed in the popular media, established the scale of the universe and settled the Shapley-Curtis debate once and for all.

In 1927, by combining various measurements, including Hubble's distance measurements and Vesto Slipher's determinations of redshifts for these objects, Georges Lemaître was the first to estimate a constant of proportionality between galaxies' distances and what was termed their "recessional velocities", finding a value of about 600 km/s/Mpc. He showed that this was theoretically expected in a universe model based on general relativity. Two years later, Hubble showed that the relation between the distances and velocities was a positive correlation and had a slope of about 500 km/s/Mpc. This correlation would come to be known as Hubble's law and would serve as the observational foundation for the expanding universe theories on which cosmology is still based. The publication of the observations by Slipher, Wirtz, Hubble and their colleagues and the acceptance by the theorists of their theoretical implications in light of Einstein's general theory of relativity is considered the beginning of the modern science of cosmology.
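
In modern notation, Hubble's law is the simple linear relation

\[ v = H_0 \, d , \]

where v is a galaxy's recessional velocity, d its distance, and H_0 the Hubble constant. The early slopes of roughly 500–600 km/s/Mpc sit far above today's accepted value near 70 km/s/Mpc, largely because the Cepheid distance calibration of the era was systematically off.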

Nuclide abundances

Determination of the cosmic abundance of elements has a history dating back to early spectroscopic measurements of light from astronomical objects and the identification of emission and absorption lines which corresponded to particular electronic transitions in chemical elements identified on Earth. For example, the element helium was first identified through its spectroscopic signature in the Sun before it was isolated as a gas on Earth.

Relative abundances were computed by comparing these spectroscopic observations with measurements of the elemental composition of meteorites.

Detection of the cosmic microwave background

The CMB as seen by WMAP

A cosmic microwave background was predicted in 1948 by George Gamow and Ralph Alpher, and by Alpher and Robert Herman, as a consequence of the hot Big Bang model. Moreover, Alpher and Herman were able to estimate the temperature, but their results were not widely discussed in the community. Their prediction was rediscovered by Robert Dicke and Yakov Zel'dovich in the early 1960s, with the first published recognition of the CMB radiation as a detectable phenomenon appearing in a brief paper by Soviet astrophysicists A. G. Doroshkevich and Igor Novikov in the spring of 1964. In 1964, David Todd Wilkinson and Peter Roll, Dicke's colleagues at Princeton University, began constructing a Dicke radiometer to measure the cosmic microwave background. In 1965, Arno Penzias and Robert Woodrow Wilson at the Crawford Hill location of Bell Telephone Laboratories in nearby Holmdel Township, New Jersey had built a Dicke radiometer that they intended to use for radio astronomy and satellite communication experiments. Their instrument had an excess 3.5 K antenna temperature which they could not account for. After receiving a telephone call from Crawford Hill, Dicke famously quipped: "Boys, we've been scooped." A meeting between the Princeton and Crawford Hill groups determined that the antenna temperature was indeed due to the microwave background. Penzias and Wilson received the 1978 Nobel Prize in Physics for their discovery.

Modern observations

Today, observational cosmology continues to test the predictions of theoretical cosmology and has led to the refinement of cosmological models. For example, the observational evidence for dark matter has heavily influenced theoretical modeling of structure and galaxy formation. Observational evidence for dark energy was obtained in the late 1990s during attempts to calibrate the Hubble diagram with accurate supernova standard candles. These observations have been incorporated into a six-parameter framework known as the Lambda-CDM model, which explains the evolution of the universe in terms of its constituent material. This model has subsequently been verified by detailed observations of the cosmic microwave background, especially through the WMAP experiment.

Included here are the modern observational efforts that have directly influenced cosmology.

Redshift surveys

With the advent of automated telescopes and improvements in spectroscopes, a number of collaborations have formed to map the universe in redshift space. By combining redshift with angular position data, a redshift survey maps the 3D distribution of matter within a field of the sky, as sketched below. These observations are used to measure properties of the large-scale structure of the universe. The Great Wall, a vast supercluster of galaxies over 500 million light-years wide, provides a dramatic example of a large-scale structure that redshift surveys can detect.
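
As a rough illustration of how such a map is built, the sketch below converts survey rows of (RA, Dec, z) into 3D positions using the low-redshift Hubble-law approximation d ≈ cz/H0. The galaxy coordinates and the H0 value are made-up placeholders; a real survey pipeline would integrate the full cosmological distance-redshift relation and correct for peculiar velocities.

```python
# Minimal sketch: turning redshift-survey rows (RA, Dec, z) into 3D positions.
import numpy as np

C_KM_S = 299_792.458  # speed of light, km/s
H0 = 70.0             # assumed Hubble constant, km/s/Mpc (placeholder value)

def to_cartesian(ra_deg, dec_deg, z):
    """Sky position + redshift -> Cartesian comoving coordinates in Mpc."""
    d = C_KM_S * z / H0                      # low-z Hubble-law distance
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.array([d * np.cos(dec) * np.cos(ra),
                     d * np.cos(dec) * np.sin(ra),
                     d * np.sin(dec)])

# Two hypothetical survey galaxies; their 3D separation is the kind of
# quantity from which large-scale structure statistics are built.
g1 = to_cartesian(150.0, 2.0, 0.023)
g2 = to_cartesian(152.5, 1.5, 0.024)
print(f"pair separation: {np.linalg.norm(g1 - g2):.1f} Mpc")
```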

3D visualization of the dark matter distribution from the Hyper Suprime-Cam redshift survey on Subaru Telescope in 2018

The first redshift survey was the CfA Redshift Survey, started in 1977 with the initial data collection completed in 1982. More recently, the 2dF Galaxy Redshift Survey determined the large-scale structure of one section of the Universe, measuring z-values for over 220,000 galaxies; data collection was completed in 2002, and the final data set was released 30 June 2003. (In addition to mapping large-scale patterns of galaxies, 2dF established an upper limit on neutrino mass.) Another notable investigation, the Sloan Digital Sky Survey (SDSS), is ongoing as of 2011 and aims to obtain measurements on around 100 million objects. SDSS has recorded galaxy redshifts as high as z = 0.4, and has been involved in the detection of quasars beyond z = 6. The DEEP2 Redshift Survey uses the Keck telescopes with the new "DEIMOS" spectrograph; a follow-up to the pilot program DEEP1, DEEP2 is designed to measure faint galaxies with redshifts of 0.7 and above, and is therefore planned to complement SDSS and 2dF.

Cosmic microwave background experiments

Comparison of CMB results from COBE, WMAP and Planck (March 21, 2013)

Subsequent to the discovery of the CMB, hundreds of cosmic microwave background experiments have been conducted to measure and characterize the signatures of the radiation. The most famous experiment is probably the NASA Cosmic Background Explorer (COBE) satellite, which operated from 1989 to 1996 and which detected and quantified the large-scale anisotropies at the limit of its detection capabilities. Inspired by the initial COBE results of an extremely isotropic and homogeneous background, a series of ground- and balloon-based experiments quantified CMB anisotropies on smaller angular scales over the next decade. The primary goal of these experiments was to measure the angular scale of the first acoustic peak, for which COBE did not have sufficient resolution. These measurements were able to rule out cosmic strings as the leading theory of cosmic structure formation, and suggested cosmic inflation was the right theory.

During the 1990s, the first peak was measured with increasing sensitivity and by 2000 the BOOMERanG experiment reported that the highest power fluctuations occur at scales of approximately one degree. Together with other cosmological data, these results implied that the geometry of the universe is flat. A number of ground-based interferometers provided measurements of the fluctuations with higher accuracy over the next three years, including the Very Small Array, Degree Angular Scale Interferometer (DASI), and the Cosmic Background Imager (CBI). DASI made the first detection of the polarization of the CMB and the CBI provided the first E-mode polarization spectrum with compelling evidence that it is out of phase with the T-mode spectrum.

In June 2001, NASA launched a second CMB space mission, WMAP, to make much more precise measurements of the large scale anisotropies over the full sky. WMAP used symmetric, rapid-multi-modulated scanning, rapid switching radiometers to minimize non-sky signal noise. The first results from this mission, disclosed in 2003, were detailed measurements of the angular power spectrum at a scale of less than one degree, tightly constraining various cosmological parameters. The results are broadly consistent with those expected from cosmic inflation as well as various other competing theories, and are available in detail at NASA's data bank for Cosmic Microwave Background (CMB) (see links below). Although WMAP provided very accurate measurements of the large scale angular fluctuations in the CMB (structures about as broad in the sky as the moon), it did not have the angular resolution to measure the smaller scale fluctuations which had been observed by former ground-based interferometers.

A third space mission, the ESA (European Space Agency) Planck Surveyor, was launched in May 2009 and performed an even more detailed investigation until it was shut down in October 2013. Planck employed both HEMT radiometers and bolometer technology and measured the CMB at smaller scales than WMAP. Its detectors were trialled in the Antarctic Viper telescope as the ACBAR (Arcminute Cosmology Bolometer Array Receiver) experiment, which has produced the most precise measurements at small angular scales to date, and in the Archeops balloon telescope.

On 21 March 2013, the European-led research team behind the Planck cosmology probe released the mission's all-sky map of the cosmic microwave background. The map suggests the universe is slightly older than researchers expected. According to the map, subtle fluctuations in temperature were imprinted on the deep sky when the cosmos was about 370,000 years old. The imprint reflects ripples that arose as early in the existence of the universe as the first nonillionth of a second. Apparently, these ripples gave rise to the present vast cosmic web of galaxy clusters and dark matter. Based on the 2013 data, the universe contains 4.9% ordinary matter, 26.8% dark matter and 68.3% dark energy. On 5 February 2015, new data was released by the Planck mission, according to which the age of the universe is 13.799±0.021 billion years and the Hubble constant is 67.74±0.46 (km/s)/Mpc.

Additional ground-based instruments such as the South Pole Telescope in Antarctica and the proposed Clover Project, Atacama Cosmology Telescope and the QUIET telescope in Chile will provide additional data not available from satellite observations, possibly including the B-mode polarization.

Telescope observations

Radio

The brightest sources of low-frequency radio emission (between 10 MHz and 100 GHz) are radio galaxies, which can be observed out to extremely high redshifts. These are subsets of the active galaxies that have extended features known as lobes and jets, which extend away from the galactic nucleus to distances on the order of megaparsecs. Because radio galaxies are so bright, astronomers have used them to probe extreme distances and early times in the evolution of the universe.

Infrared

Far-infrared observations, including submillimeter astronomy, have revealed a number of sources at cosmological distances. With the exception of a few atmospheric windows, most infrared light is blocked by the atmosphere, so observations generally take place from balloon or space-based instruments. Current observational experiments in the infrared include NICMOS, the Cosmic Origins Spectrograph, the Spitzer Space Telescope, the Keck Interferometer, the Stratospheric Observatory For Infrared Astronomy, and the Herschel Space Observatory. The next large space telescope planned by NASA, the James Webb Space Telescope, will also explore in the infrared.

An additional infrared survey, the Two-Micron All Sky Survey, has also been very useful in revealing the distribution of galaxies, similar to other optical surveys described below.

Optical (visible light)

Optical light remains the primary means by which astronomy is done, and in the context of cosmology this means observing distant galaxies and galaxy clusters in order to learn about the large-scale structure of the Universe as well as galaxy evolution. Redshift surveys have been a common means by which this has been accomplished, some of the most famous being the 2dF Galaxy Redshift Survey, the Sloan Digital Sky Survey, and the upcoming Large Synoptic Survey Telescope. These optical observations generally use either photometry or spectroscopy to measure the redshift of a galaxy and then, via Hubble's law, determine its distance, modulo redshift distortions due to peculiar velocities. Additionally, the positions of the galaxies as seen on the sky in celestial coordinates can be used to gain information about the other two spatial dimensions.

Very deep observations (which is to say sensitive to dim sources) are also useful tools in cosmology. The Hubble Deep Field, Hubble Ultra Deep Field, Hubble Extreme Deep Field, and Hubble Deep Field South are all examples of this.

Ultraviolet

See Ultraviolet astronomy.

X-rays

See X-ray astronomy.

Gamma-rays

See Gamma-ray astronomy.

Cosmic ray observations

See Cosmic-ray observatory.

Future observations

Cosmic neutrinos

It is a prediction of the Big Bang model that the universe is filled with a neutrino background radiation, analogous to the cosmic microwave background radiation. The microwave background is a relic from when the universe was about 380,000 years old, but the neutrino background is a relic from when the universe was about two seconds old.

If this neutrino radiation could be observed, it would be a window into very early stages of the universe. Unfortunately, these neutrinos would now be very cold, and so they are effectively impossible to observe directly.
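
The expected coldness follows from standard Big Bang thermodynamics: electron-positron annihilation heats the photons after the neutrinos have already decoupled, so the relic neutrino temperature today should be

\[ T_\nu = \left(\frac{4}{11}\right)^{1/3} T_\gamma \approx 0.714 \times 2.725\ \mathrm{K} \approx 1.95\ \mathrm{K}, \]

far colder than the photon background and well below the reach of current detectors.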

Gravitational waves

Gravitational-wave observatory

From Wikipedia, the free encyclopedia
A schematic diagram of a laser interferometer.

A gravitational-wave detector (used in a gravitational-wave observatory) is any device designed to measure tiny distortions of spacetime called gravitational waves. Since the 1960s, various kinds of gravitational-wave detectors have been built and constantly improved. The present-day generation of laser interferometers has reached the necessary sensitivity to detect gravitational waves from astronomical sources, thus forming the primary tool of gravitational-wave astronomy.

The first direct detection of gravitational waves was made in 2015 by the Advanced LIGO observatories, a feat which was awarded the 2017 Nobel Prize in Physics.

Challenge

The direct detection of gravitational waves is complicated by the extraordinarily small effect the waves produce on a detector. The amplitude of a spherical wave falls off as the inverse of the distance from the source. Thus, even waves from extreme systems such as merging binary black holes die out to a very small amplitude by the time they reach the Earth. Astrophysicists predicted that some gravitational waves passing the Earth might produce differential motion on the order of 10⁻¹⁸ m in a LIGO-size instrument.
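
That displacement is just the dimensionless strain h applied to the arm length. A minimal worked example, assuming a typical strain of h ~ 10⁻²¹ and LIGO's 4 km arms:

\[ \Delta L \approx \tfrac{1}{2} h L \approx \tfrac{1}{2} \times 10^{-21} \times 4\ \mathrm{km} \approx 2\times10^{-18}\ \mathrm{m}, \]

roughly one-thousandth of the diameter of a proton.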

Resonant mass antennas

A simple device to detect the expected wave motion is called a resonant mass antenna – a large, solid body of metal isolated from outside vibrations. This type of instrument was the first type of gravitational-wave detector. Strains in space due to an incident gravitational wave excite the body's resonant frequency and could thus be amplified to detectable levels. Conceivably, a nearby supernova might be strong enough to be seen without resonant amplification. However, up to 2018, no gravitational wave observation that would have been widely accepted by the research community has been made on any type of resonant mass antenna, despite certain claims of observation by researchers operating the antennas.

There are three types of resonant mass antenna that have been built: room-temperature bar antennas, cryogenically cooled bar antennas and cryogenically cooled spherical antennas.

The earliest type was the room-temperature bar-shaped antenna called a Weber bar; these were dominant in the 1960s and 1970s and many were built around the world. It was claimed by Weber and some others in the late 1960s and early 1970s that these devices detected gravitational waves; however, other experimenters failed to detect gravitational waves using them, and a consensus developed that Weber bars would not be a practical means to detect gravitational waves.

The second generation of resonant mass antennas, developed in the 1980s and 1990s, were the cryogenic bar antennas, which are also sometimes called Weber bars. In the 1990s there were five major cryogenic bar antennas: AURIGA (Padua, Italy), NAUTILUS (Rome, Italy), EXPLORER (CERN, Switzerland), ALLEGRO (Louisiana, US), and NIOBE (Perth, Australia). In 1997, these five antennas, run by four research groups, formed the International Gravitational Event Collaboration (IGEC). While there were several cases of unexplained deviations from the background signal, there were no confirmed instances of the observation of gravitational waves with these detectors.

In the 1980s, there was also a cryogenic bar antenna called ALTAIR, which, along with a room-temperature bar antenna called GEOGRAV, was built in Italy as a prototype for later bar antennas. Operators of the GEOGRAV detector claimed to have observed gravitational waves coming from the supernova SN1987A (along with another room-temperature bar antenna), but these claims were not accepted by the wider community.

These modern cryogenic forms of the Weber bar operated with superconducting quantum interference devices to detect vibration (ALLEGRO, for example). Some of them continued in operation after the interferometric antennas started to reach astrophysical sensitivity, such as AURIGA, an ultracryogenic resonant cylindrical bar gravitational wave detector based at INFN in Italy. The AURIGA and LIGO teams collaborated in joint observations.

In the 2000s, the third generation of resonant mass antennas, the spherical cryogenic antennas, emerged. Four spherical antennas were proposed around the year 2000; two of them were built as downsized versions, and the others were cancelled. The proposed antennas were GRAIL (Netherlands, downsized to MiniGRAIL), TIGA (US, small prototypes made), SFERA (Italy), and Graviton (Brazil, downsized to Mario Schenberg).

The two downsized antennas, MiniGRAIL and the Mario Schenberg, are similar in design and are operated as a collaborative effort. MiniGRAIL is based at Leiden University, and consists of an exactingly machined 1,150 kg (2,540 lb) sphere cryogenically cooled to 20 mK (−273.13 °C; −459.63 °F). The spherical configuration allows for equal sensitivity in all directions, and is somewhat experimentally simpler than larger linear devices requiring high vacuum. Events are detected by measuring deformation of the detector sphere. MiniGRAIL is highly sensitive in the 2–4 kHz range, suitable for detecting gravitational waves from rotating neutron star instabilities or small black hole mergers.

The current consensus is that cryogenic resonant mass detectors are not sensitive enough to detect anything but extremely powerful (and thus very rare) gravitational waves. As of 2020, no detection of gravitational waves by cryogenic resonant antennas has occurred.

Laser interferometers

Simplified operation of a gravitational wave observatory
Figure 1: A beamsplitter (green line) splits coherent light (from the white box) into two beams which reflect off the mirrors (cyan oblongs); only one outgoing and reflected beam in each arm is shown, and separated for clarity. The reflected beams recombine and an interference pattern is detected (purple circle).
Figure 2: A gravitational wave passing over the left arm (yellow) changes its length and thus the interference pattern.

A more sensitive detector uses laser interferometry to measure gravitational-wave induced motion between separated 'free' masses. This allows the masses to be separated by large distances (increasing the signal size); a further advantage is that it is sensitive to a wide range of frequencies (not just those near a resonance as is the case for Weber bars). Ground-based interferometers are now operational. Currently, the most sensitive is LIGO – the Laser Interferometer Gravitational Wave Observatory. LIGO has two detectors: one in Livingston, Louisiana; the other at the Hanford site in Richland, Washington. Each consists of two light storage arms which are 4 km in length. These are at 90 degree angles to each other, with the light passing through 1 m (3 ft 3 in) diameter vacuum tubes running the entire 4 kilometres (2.5 mi). A passing gravitational wave will slightly stretch one arm as it shortens the other. This is precisely the motion to which a Michelson interferometer is most sensitive.

Even with such long arms, the strongest gravitational waves will only change the distance between the ends of the arms by at most roughly 10⁻¹⁸ meters. LIGO should be able to detect gravitational waves as small as h ≈ 5×10⁻²². Upgrades to LIGO and other detectors such as Virgo, GEO600, and TAMA 300 should increase the sensitivity further, and the next generation of instruments (Advanced LIGO Plus and Advanced Virgo Plus) will be more sensitive still. Another highly sensitive interferometer (KAGRA) began operations in 2020. A key point is that a ten-fold increase in sensitivity (radius of "reach") increases the volume of space accessible to the instrument by a factor of one thousand, as worked out below. This increases the rate at which detectable signals should be seen from one per tens of years of observation, to tens per year.
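
The volume scaling is simple geometry: detectable sources fill a sphere whose radius is the instrument's reach R, so

\[ V = \tfrac{4}{3}\pi R^3 \quad\Rightarrow\quad \frac{V(10R)}{V(R)} = 10^3 = 1000. \]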

Interferometric detectors are limited at high frequencies by shot noise, which occurs because the lasers produce photons randomly. One analogy is to rainfall: the rate of rainfall, like the laser intensity, is measurable, but the raindrops, like photons, fall at random times, causing fluctuations around the average value. This leads to noise at the output of the detector, much like radio static. In addition, for sufficiently high laser power, the random momentum transferred to the test masses by the laser photons shakes the mirrors, masking signals at low frequencies. Thermal noise (e.g., Brownian motion) is another limit to sensitivity. In addition to these "stationary" (constant) noise sources, all ground-based detectors are also limited at low frequencies by seismic noise and other forms of environmental vibration, and other "non-stationary" noise sources; creaks in mechanical structures, lightning or other large electrical disturbances, etc. may also create noise masking an event or may even imitate an event. All these must be taken into account and excluded by analysis before a detection may be considered a true gravitational-wave event.
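
A minimal numerical sketch of the rainfall analogy (hypothetical photon counts, using only NumPy): photon arrivals in a fixed time window are Poisson-distributed, so the relative fluctuation, which is the shot noise, falls as 1/√N as the laser power, and hence the mean photon number N, grows.

```python
# Poisson photon counting: relative shot noise shrinks as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
for mean_photons in (1e2, 1e4, 1e6):
    counts = rng.poisson(mean_photons, size=100_000)  # simulated photon counts
    rel_noise = counts.std() / counts.mean()          # measured fluctuation
    print(f"N={mean_photons:>9.0f}  relative noise={rel_noise:.4f}  "
          f"1/sqrt(N)={1/np.sqrt(mean_photons):.4f}")
```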

Space-based interferometers, such as LISA and DECIGO, are also being developed. LISA's design calls for three test masses forming an equilateral triangle, with lasers from each spacecraft to each other spacecraft forming two independent interferometers. LISA is planned to occupy a solar orbit trailing the Earth, with each arm of the triangle being five million kilometers long. This puts the detector in an excellent vacuum far from Earth-based sources of noise, though it will still be susceptible to shot noise, as well as artifacts caused by cosmic rays and solar wind.

Einstein@Home

In some sense, the easiest signals to detect should be constant sources. Supernovae and neutron star or black hole mergers should have larger amplitudes and be more interesting, but the waves generated will be more complicated. The waves given off by a spinning, bumpy neutron star would be "monochromatic" – like a pure tone in acoustics. It would not change very much in amplitude or frequency.

The Einstein@Home project is a distributed computing project similar to SETI@home intended to detect this type of simple gravitational wave. By taking data from LIGO and GEO, and sending it out in little pieces to thousands of volunteers for parallel analysis on their home computers, Einstein@Home can sift through the data far more quickly than would be possible otherwise.

Pulsar timing arrays

A different approach to detecting gravitational waves is used by pulsar timing arrays, such as the European Pulsar Timing Array, the North American Nanohertz Observatory for Gravitational Waves, and the Parkes Pulsar Timing Array. These projects propose to detect gravitational waves by looking at the effect these waves have on the incoming signals from an array of 20–50 well-known millisecond pulsars. As a gravitational wave passing through the Earth contracts space in one direction and expands space in another, the times of arrival of pulsar signals from those directions are shifted correspondingly. By studying a fixed set of pulsars across the sky, these arrays should be able to detect gravitational waves in the nanohertz range. Such signals are expected to be emitted by pairs of merging supermassive black holes.
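
As a rough order-of-magnitude sketch (not a statement of any particular array's quoted sensitivity), a wave of strain h and frequency f shifts pulse arrival times by roughly

\[ \delta t \sim \frac{h}{2\pi f} \approx \frac{10^{-15}}{2\pi \times 10^{-8}\ \mathrm{Hz}} \approx 16\ \mathrm{ns}, \]

which is why the technique relies on millisecond pulsars, whose pulses can be timed to a precision of order 100 ns over many years.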

Detection in the cosmic microwave background

The cosmic microwave background, radiation left over from when the Universe cooled sufficiently for the first atoms to form, can contain the imprint of gravitational waves from the very early Universe. The microwave radiation is polarized. The pattern of polarization can be split into two classes called E-modes and B-modes. This is in analogy to electrostatics where the electric field (E-field) has a vanishing curl and the magnetic field (B-field) has a vanishing divergence. The E-modes can be created by a variety of processes, but the B-modes can only be produced by gravitational lensing, gravitational waves, or scattering from dust.
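
The analogy can be written compactly: just as a well-behaved vector field splits into a curl-free part and a divergence-free part, the polarization pattern splits into a gradient-like E-mode and a curl-like B-mode,

\[ \nabla \times \mathbf{E} = 0, \qquad \nabla \cdot \mathbf{B} = 0, \]

with the CMB E- and B-modes defined by the corresponding properties of the polarization field on the sky.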

On 17 March 2014, astronomers at the Harvard-Smithsonian Center for Astrophysics announced the apparent detection of the imprint of gravitational waves in the cosmic microwave background, which, if confirmed, would provide strong evidence for inflation and the Big Bang. However, on 19 June 2014, lowered confidence in confirming the findings was reported, and on 19 September 2014 confidence was lowered even further. Finally, on 30 January 2015, the European Space Agency announced that the signal could be entirely attributed to dust in the Milky Way.

Novel detector designs

There are currently two detectors focusing on detections at the higher end of the gravitational-wave spectrum (10⁻⁷ to 10⁵ Hz): one at University of Birmingham, England, and the other at INFN Genoa, Italy. A third is under development at Chongqing University, China. The Birmingham detector measures changes in the polarization state of a microwave beam circulating in a closed loop about one meter across. Two have been fabricated, and they are expected to be sensitive to periodic spacetime strains given as an amplitude spectral density. The INFN Genoa detector is a resonant antenna consisting of two coupled spherical superconducting harmonic oscillators a few centimeters in diameter. The oscillators are designed to have (when uncoupled) almost equal resonant frequencies. The system is expected to be sensitive to small periodic spacetime strains, with further improvements in sensitivity anticipated. The Chongqing University detector is planned to detect relic high-frequency gravitational waves with predicted typical parameters of ν ~ 10¹⁰ Hz (10 GHz) and h ~ 10⁻³⁰ to 10⁻³¹.

The Levitated Sensor Detector is a proposed detector for gravitational waves with frequencies between 10 kHz and 300 kHz, potentially coming from primordial black holes. It will use optically levitated dielectric particles in an optical cavity.

A torsion-bar antenna (TOBA) is a proposed design composed of two long, thin bars, suspended as torsion pendula in a cross-like fashion, in which the differential angle is sensitive to tidal gravitational-wave forces.

Detectors based on matter waves (atom interferometers) have also been proposed and are being developed, with proposals dating back to the early 2000s. Atom interferometry is proposed to extend the detection bandwidth into the infrasound band (10 mHz – 10 Hz), where current ground-based detectors are limited by low-frequency gravity noise. A demonstrator project called the Matter wave laser based Interferometer Gravitation Antenna (MIGA) started construction in 2018 in the underground environment of LSBB (Rustrel, France).

List of gravitational wave detectors

Noise curves for a selection of detectors as a function of frequency. The characteristic strains of potential astrophysical sources are also shown. To be detectable, the characteristic strain of a signal must be above the noise curve.

Resonant mass detectors

Interferometers

Interferometric gravitational-wave detectors are often grouped into generations based on the technology used. The interferometric detectors deployed in the 1990s and 2000s were proving grounds for many of the foundational technologies necessary for initial detection, and are commonly referred to as the first generation. The second generation of detectors, operating in the 2010s mostly at the same facilities, such as LIGO and Virgo, improved on these designs with sophisticated techniques such as cryogenic mirrors and the injection of squeezed vacuum. This led to the first unambiguous detection of a gravitational wave by Advanced LIGO in 2015. The third generation of detectors is currently in the planning phase, and seeks to improve over the second generation by achieving greater detection sensitivity and a larger range of accessible frequencies. All these experiments involve many technologies under continuous development over multiple decades, so the categorization by generation is necessarily only rough.

Virtual particle

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Virtual_particle

A virtual particle is a theoretical transient particle that exhibits some of the characteristics of an ordinary particle, while having its existence limited by the uncertainty principle. The concept of virtual particles arises in the perturbation theory of quantum field theory where interactions between ordinary particles are described in terms of exchanges of virtual particles. A process involving virtual particles can be described by a schematic representation known as a Feynman diagram, in which virtual particles are represented by internal lines.

Virtual particles do not necessarily carry the same mass as the corresponding real particle, although they always conserve energy and momentum. The closer its characteristics come to those of ordinary particles, the longer the virtual particle exists. They are important in the physics of many processes, including particle scattering and Casimir forces. In quantum field theory, forces—such as the electromagnetic repulsion or attraction between two charges—can be thought of as due to the exchange of virtual photons between the charges. Virtual photons are the exchange particle for the electromagnetic interaction.

The term is somewhat loose and vaguely defined, in that it refers to the view that the world is made up of "real particles". "Real particles" are better understood to be excitations of the underlying quantum fields. Virtual particles are also excitations of the underlying fields, but are "temporary" in the sense that they appear in calculations of interactions, but never as asymptotic states or indices to the scattering matrix. The accuracy and use of virtual particles in calculations is firmly established, but as they cannot be detected in experiments, deciding how to precisely describe them is a topic of debate. Although widely used, they are by no means a necessary feature of QFT, but rather are mathematical conveniences, as demonstrated by lattice field theory, which avoids using the concept altogether.

Properties

The concept of virtual particles arises in the perturbation theory of quantum field theory, an approximation scheme in which interactions (in essence, forces) between actual particles are calculated in terms of exchanges of virtual particles. Such calculations are often performed using schematic representations known as Feynman diagrams, in which virtual particles appear as internal lines. By expressing the interaction in terms of the exchange of a virtual particle with four-momentum q, where q is given by the difference between the four-momenta of the particles entering and leaving the interaction vertex, both momentum and energy are conserved at the interaction vertices of the Feynman diagram.[4]: 119 

A virtual particle does not precisely obey the energy–momentum relation m²c⁴ = E² − p²c². Its kinetic energy may not have the usual relationship to velocity. It can be negative. This is expressed by the phrase off mass shell. The probability amplitude for a virtual particle to exist tends to be canceled out by destructive interference over longer distances and times. As a consequence, a real photon is massless and thus has only two polarization states, whereas a virtual one, being effectively massive, has three polarization states.
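
Equivalently, in four-momentum language: a real particle of mass m satisfies the on-shell condition, while a virtual particle's four-momentum q may miss it by any amount,

\[ E^2 = |\mathbf{p}|^2 c^2 + m^2 c^4 \quad \text{(on shell)}, \qquad q^2 c^2 = E^2 - |\mathbf{p}|^2 c^2 \neq m^2 c^4 \quad \text{(off shell)}, \]

with the uncertainty principle tying the size of the mismatch to how long the virtual particle can persist.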

Quantum tunnelling may be considered a manifestation of virtual particle exchanges. The range of forces carried by virtual particles is limited by the uncertainty principle, which regards energy and time as conjugate variables; thus, virtual particles of larger mass have more limited range.
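
This range limit can be made quantitative. Borrowing energy ΔE ≈ mc² is allowed for a time Δt ~ ħ/ΔE, during which the exchanged particle travels at most cΔt, giving a characteristic range of the order of the Compton wavelength:

\[ R \sim \frac{\hbar}{mc} = \frac{\hbar c}{mc^2} \approx \frac{197\ \mathrm{MeV\,fm}}{8\times10^{4}\ \mathrm{MeV}} \approx 2.5\times10^{-3}\ \mathrm{fm} \approx 2.5\times10^{-18}\ \mathrm{m} \]

for the W boson (mc² ≈ 80 GeV), which is why the weak force is so short-ranged.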

Written in the usual mathematical notation, the equations of physics make no distinction between virtual and actual particles. The amplitudes of processes with a virtual particle interfere with the amplitudes of processes without it, whereas for an actual particle the cases of existence and non-existence cease to be coherent with each other and do not interfere any more. In the quantum field theory view, actual particles are viewed as being detectable excitations of underlying quantum fields. Virtual particles are also viewed as excitations of the underlying fields, but appear only as forces, not as detectable particles. They are "temporary" in the sense that they appear in some calculations, but are not detected as single particles. Thus, in mathematical terms, they never appear as indices to the scattering matrix, which is to say, they never appear as the observable inputs and outputs of the physical process being modelled.

There are two principal ways in which the notion of virtual particles appears in modern physics. They appear as intermediate terms in Feynman diagrams; that is, as terms in a perturbative calculation. They also appear as an infinite set of states to be summed or integrated over in the calculation of a semi-non-perturbative effect. In the latter case, it is sometimes said that virtual particles contribute to a mechanism that mediates the effect, or that the effect occurs through the virtual particles.

Manifestations

There are many observable physical phenomena that arise in interactions involving virtual particles. For bosonic particles that exhibit rest mass when they are free and actual, virtual interactions are characterized by the relatively short range of the force interaction produced by particle exchange. Confinement can lead to a short range, too. Examples of such short-range interactions are the strong and weak forces, and their associated field bosons.

For the gravitational and electromagnetic forces, the zero rest-mass of the associated boson particle permits long-range forces to be mediated by virtual particles. However, in the case of photons, power and information transfer by virtual particles is a relatively short-range phenomenon (existing only within a few wavelengths of the field disturbance, which carries information or transferred power), as for example seen in the characteristically short range of inductive and capacitive effects in the near-field zone of coils and antennas.

Some field interactions which may be seen in terms of virtual particles are:

  • The Coulomb force (static electric force) between electric charges. It is caused by the exchange of virtual photons. In symmetric 3-dimensional space this exchange results in the inverse square law for electric force. Since the photon has no mass, the Coulomb potential has an infinite range (see the potential comparison after this list).
  • The magnetic field between magnetic dipoles. It is caused by the exchange of virtual photons. In symmetric 3-dimensional space, this exchange results in the inverse cube law for magnetic force. Since the photon has no mass, the magnetic potential has an infinite range.
  • Electromagnetic induction. This phenomenon transfers energy to and from a magnetic coil via a changing (electro)magnetic field.
  • The strong nuclear force between quarks is the result of interaction of virtual gluons. The residual of this force outside of quark triplets (neutron and proton) holds neutrons and protons together in nuclei, and is due to virtual mesons such as the pi meson and rho meson.
  • The weak nuclear force is the result of exchange by virtual W and Z bosons.
  • The spontaneous emission of a photon during the decay of an excited atom or excited nucleus; such a decay is prohibited by ordinary quantum mechanics and requires the quantization of the electromagnetic field for its explanation.
  • The Casimir effect, where the ground state of the quantized electromagnetic field causes attraction between a pair of electrically neutral metal plates.
  • The van der Waals force, which is partly due to the Casimir effect between two atoms.
  • Vacuum polarization, which involves pair production or the decay of the vacuum, which is the spontaneous production of particle-antiparticle pairs (such as electron-positron).
  • Lamb shift of positions of atomic levels.
  • The impedance of free space, which defines the ratio between the electric field strength |E| and the magnetic field strength |H|: Z₀ = |E|/|H|.
  • Much of the so-called near-field of radio antennas, where the magnetic and electric effects of the changing current in the antenna wire and the charge effects of the wire's capacitive charge may be (and usually are) important contributors to the total EM field close to the source, but both of which effects are dipole effects that decay with increasing distance from the antenna much more quickly than does the influence of "conventional" electromagnetic waves that are "far" from the source. These far-field waves, for which E is (in the limit of long distance) equal to cB, are composed of actual photons. Actual and virtual photons are mixed near an antenna, with the virtual photons responsible only for the "extra" magnetic-inductive and transient electric-dipole effects, which cause any imbalance between E and cB. As distance from the antenna grows, the near-field effects (as dipole fields) die out more quickly, and only the "radiative" effects that are due to actual photons remain as important effects. Although virtual effects extend to infinity, they drop off in field strength as 1/r² rather than the field of EM waves composed of actual photons, which drops off as 1/r.
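
The massless-versus-massive contrast running through this list shows up directly in the static potentials. Exchanging a massless photon yields the infinite-range Coulomb potential, while exchanging a quantum of mass m yields the exponentially cut-off Yukawa potential:

\[ V_{\mathrm{Coulomb}}(r) = \frac{q_1 q_2}{4\pi\varepsilon_0 r}, \qquad V_{\mathrm{Yukawa}}(r) \propto \frac{e^{-mcr/\hbar}}{r}; \]

taking m → 0 in the Yukawa form recovers the 1/r Coulomb behavior and hence the infinite range noted above.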

Most of these have analogous effects in solid-state physics; indeed, one can often gain a better intuitive understanding by examining these cases. In semiconductors, the roles of electrons, positrons and photons in field theory are replaced by electrons in the conduction band, holes in the valence band, and phonons or vibrations of the crystal lattice. A virtual particle is in a virtual state where the probability amplitude is not conserved. Examples of macroscopic virtual phonons, photons, and electrons in the case of the tunneling process were presented by Günter Nimtz and Alfons A. Stahlhofen.

Feynman diagrams

One particle exchange scattering diagram

The calculation of scattering amplitudes in theoretical particle physics requires the use of some rather large and complicated integrals over a large number of variables. These integrals do, however, have a regular structure, and may be represented as Feynman diagrams. The appeal of Feynman diagrams is strong, as they allow for a simple visual presentation of what would otherwise be a rather arcane and abstract formula. In particular, part of the appeal is that the outgoing legs of a Feynman diagram can be associated with actual, on-shell particles. Thus, it is natural to associate the other lines in the diagram with particles as well, called the "virtual particles". In mathematical terms, they correspond to the propagators appearing in the diagram.

In the adjacent image, the solid lines correspond to actual particles (of momentum p1 and so on), while the dotted line corresponds to a virtual particle carrying momentum k. For example, if the solid lines were to correspond to electrons interacting by means of the electromagnetic interaction, the dotted line would correspond to the exchange of a virtual photon. In the case of interacting nucleons, the dotted line would be a virtual pion. In the case of quarks interacting by means of the strong force, the dotted line would be a virtual gluon, and so on.

One-loop diagram with fermion propagator

Virtual particles may be mesons or vector bosons, as in the example above; they may also be fermions. However, in order to preserve quantum numbers, most simple diagrams involving fermion exchange are prohibited. The image to the right shows an allowed diagram, a one-loop diagram. The solid lines correspond to a fermion propagator, the wavy lines to bosons.

Vacuums

In formal terms, a particle is considered to be an eigenstate of the particle number operator a†a, where a is the particle annihilation operator and a† the particle creation operator (sometimes collectively called ladder operators). In many cases, the particle number operator does not commute with the Hamiltonian for the system. This implies the number of particles in an area of space is not a well-defined quantity but, like other quantum observables, is represented by a probability distribution. Since these particles are not certain to exist, they are called virtual particles or vacuum fluctuations of vacuum energy. In a certain sense, they can be understood to be a manifestation of the time-energy uncertainty principle in a vacuum.
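
For reference, the ladder-operator algebra behind this statement is the standard one:

\[ [a, a^\dagger] = 1, \qquad N = a^\dagger a, \qquad N\,|n\rangle = n\,|n\rangle, \]

so a state has a definite particle number only if it is an eigenstate of N; when N fails to commute with the Hamiltonian, energy eigenstates are superpositions of different particle numbers.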

An important example of the "presence" of virtual particles in a vacuum is the Casimir effect. Here, the explanation of the effect requires that the total energy of all of the virtual particles in a vacuum can be added together. Thus, although the virtual particles themselves are not directly observable in the laboratory, they do leave an observable effect: Their zero-point energy results in forces acting on suitably arranged metal plates or dielectrics. On the other hand, the Casimir effect can be interpreted as the relativistic van der Waals force.

Pair production

Virtual particles are often popularly described as coming in pairs, a particle and antiparticle which can be of any kind. These pairs exist for an extremely short time, and then mutually annihilate, or in some cases, the pair may be boosted apart using external energy so that they avoid annihilation and become actual particles, as described below.

This may occur in one of two ways. In an accelerating frame of reference, the virtual particles may appear to be actual to the accelerating observer; this is known as the Unruh effect. In short, the vacuum of a stationary frame appears, to the accelerated observer, to be a warm gas of actual particles in thermodynamic equilibrium.

Another example is pair production in very strong electric fields, sometimes called vacuum decay. If, for example, a pair of atomic nuclei are merged to very briefly form a nucleus with a charge greater than about 140 (that is, larger than about the inverse of the fine-structure constant, which is a dimensionless quantity), the strength of the electric field will be such that it will be energetically favorable to create positron–electron pairs out of the vacuum or Dirac sea, with the electron attracted to the nucleus to annihilate the positive charge. This pair-creation amplitude was first calculated by Julian Schwinger in 1951.

Compared to actual particles

As a consequence of quantum mechanical uncertainty, any object or process that exists for a limited time or in a limited volume cannot have a precisely defined energy or momentum. For this reason, virtual particles – which exist only temporarily as they are exchanged between ordinary particles – do not typically obey the mass-shell relation; the longer a virtual particle exists, the more the energy and momentum approach the mass-shell relation.
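
The trade-off in the last sentence is the time-energy uncertainty relation made concrete: a virtual particle whose energy misses the mass shell by ΔE can persist only for

\[ \Delta t \sim \frac{\hbar}{\Delta E}, \]

so small off-shell excursions survive long (and look nearly real), while large ones are snuffed out almost immediately.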

The lifetime of real particles is typically vastly longer than the lifetime of the virtual particles. Electromagnetic radiation consists of real photons which may travel light years between the emitter and absorber, but (Coulombic) electrostatic attraction and repulsion is a relatively short-range force that is a consequence of the exchange of virtual photons.

Entropy (information theory)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Entropy_(information_theory)

In info...