Tuesday, May 23, 2023

Antimatter rocket

From Wikipedia, the free encyclopedia
 
A proposed antimatter rocket

An antimatter rocket is a proposed class of rockets that use antimatter as their power source. There are several designs that attempt to accomplish this goal. The advantage to this class of rocket is that a large fraction of the rest mass of a matter/antimatter mixture may be converted to energy, allowing antimatter rockets to have a far higher energy density and specific impulse than any other proposed class of rocket.

Methods

Antimatter rockets can be divided into three types of application: those that directly use the products of antimatter annihilation for propulsion, those that heat a working fluid or an intermediate material which is then used for propulsion, and those that heat a working fluid or an intermediate material to generate electricity for some form of electric spacecraft propulsion system. The propulsion concepts that employ these mechanisms generally fall into four categories: solid core, gaseous core, plasma core, and beamed core configurations. The alternatives to direct antimatter annihilation propulsion offer the possibility of feasible vehicles with, in some cases, vastly smaller amounts of antimatter but require a lot more matter propellant. Then there are hybrid solutions using antimatter to catalyze fission/fusion reactions for propulsion.

Pure antimatter rocket: direct use of reaction products

Antiproton annihilation reactions produce charged and uncharged pions, in addition to neutrinos and gamma rays. The charged pions can be channelled by a magnetic nozzle, producing thrust. This type of antimatter rocket is a pion rocket or beamed-core configuration. It is not perfectly efficient; energy is lost as the rest mass of the charged (22.3%) and uncharged pions (14.38%), as the kinetic energy of the uncharged pions (which cannot be deflected for thrust), and as neutrinos and gamma rays (see antimatter as fuel).
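The rest-mass percentages quoted above can be checked with a short calculation, assuming an average annihilation yields three charged and two neutral pions (the multiplicities that reproduce the quoted figures):

```python
# Rest-mass energy budget of proton-antiproton annihilation into pions.
# Assumes an average of 3 charged and 2 neutral pions per annihilation,
# which reproduces the percentages quoted in the text above.
M_PROTON = 938.272      # MeV/c^2
M_PI_CHARGED = 139.570  # MeV/c^2
M_PI_NEUTRAL = 134.977  # MeV/c^2

total = 2 * M_PROTON    # rest energy of the annihilating p + pbar pair

charged_fraction = 3 * M_PI_CHARGED / total
neutral_fraction = 2 * M_PI_NEUTRAL / total

print(f"charged-pion rest mass: {charged_fraction:.1%}")  # ~22.3%
print(f"neutral-pion rest mass: {neutral_fraction:.1%}")  # ~14.4%
```

The remainder of the 1876.5 MeV is carried off as pion kinetic energy, which is why only the charged-pion share is available for magnetic-nozzle thrust.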

Positron annihilation has also been proposed for rocketry. Annihilation of positrons produces only gamma rays. Early proposals for this type of rocket, such as those developed by Eugen Sänger, assumed the use of some material that could reflect gamma rays, used as a light sail or parabolic shield to derive thrust from the annihilation reaction, but no known form of matter (consisting of atoms or ions) interacts with gamma rays in a manner that would enable specular reflection. The momentum of gamma rays can, however, be partially transferred to matter by Compton scattering.
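The partial momentum transfer by Compton scattering mentioned above can be illustrated for the 511 keV photons of positron annihilation, using the standard Compton formula:

```python
import math

# Compton scattering of a 511 keV annihilation photon off a free electron:
#   E' = E / (1 + (E / m_e c^2) * (1 - cos(theta)))
# The photon is never reflected outright; at best (180-degree back-scatter)
# it keeps about a third of its energy, the rest going to the electron.
E_MEC2 = 511.0  # keV, electron rest energy (equal to the annihilation photon energy)

def scattered_energy(E_keV, theta_rad):
    """Energy of the scattered photon after Compton scattering at angle theta."""
    return E_keV / (1 + (E_keV / E_MEC2) * (1 - math.cos(theta_rad)))

backscatter = scattered_energy(511.0, math.pi)
print(f"back-scattered photon: {backscatter:.1f} keV")        # 170.3 keV
print(f"energy given to electron: {511 - backscatter:.1f} keV")
```

This is why a gamma-ray "mirror" is impossible: even in the most favorable geometry, most of the photon's energy is absorbed rather than redirected.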

One method to reach relativistic velocities uses a matter-antimatter GeV gamma ray laser photon rocket made possible by a relativistic proton-antiproton pinch discharge, where the recoil from the laser beam is transmitted by the Mössbauer effect to the spacecraft.

A new annihilation process has allegedly been developed by researchers from Gothenburg University. Several annihilation reactors have been constructed in recent years in which hydrogen or deuterium is converted into relativistic particles by laser-induced annihilation. The technology has been demonstrated by research groups led by Prof. Leif Holmlid and Sindre Zeiner-Gundersen at research facilities in Sweden and Norway, and a third relativistic particle reactor is currently being built at the University of Iceland. The particles emitted by these hydrogen annihilation processes are claimed to reach 0.94c and could be used in space propulsion. Note, however, that the veracity of Leif Holmlid's research is disputed.

Thermal antimatter rocket: heating of a propellant

This type of antimatter rocket is termed a thermal antimatter rocket as the energy or heat from the annihilation is harnessed to create an exhaust from non-exotic material or propellant.

The solid core concept uses antiprotons to heat a solid, high-atomic-number (Z), refractory metal core. Propellant is pumped into the hot core and expanded through a nozzle to generate thrust. The performance of this concept is roughly equivalent to that of the nuclear thermal rocket (~10³ s) due to temperature limitations of the solid. However, the antimatter energy conversion and heating efficiencies are typically high due to the short mean path between collisions with core atoms (efficiency ~85%). Several methods for the liquid-propellant thermal antimatter engine using the gamma rays produced by antiproton or positron annihilation have been proposed. These methods resemble those proposed for nuclear thermal rockets. One proposed method is to use positron annihilation gamma rays to heat a solid engine core. Hydrogen gas is ducted through this core, heated, and expelled from a rocket nozzle. A second proposed engine type uses positron annihilation within a solid lead pellet or within compressed xenon gas to produce a cloud of hot gas, which heats a surrounding layer of gaseous hydrogen. Direct heating of the hydrogen by gamma rays was considered impractical, due to the difficulty of compressing enough of it within an engine of reasonable size to absorb the gamma rays. A third proposed engine type uses annihilation gamma rays to heat an ablative sail, with the ablated material providing thrust. As with nuclear thermal rockets, the specific impulse achievable by these methods is limited by materials considerations, typically being in the range of 1000–2000 seconds.
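The ~10³ s figure follows from the thermodynamics of a thermal rocket. A minimal sketch, treating the hydrogen exhaust as an ideal gas fully expanded from an assumed core temperature (the temperature and heat-capacity values below are illustrative, not design data):

```python
import math

# Rough specific-impulse estimate for a hydrogen-propellant solid-core engine.
# Ideal complete expansion converts all thermal enthalpy to kinetic energy:
#   v_e = sqrt(2 * c_p * T)
# Values are illustrative assumptions near refractory-metal temperature limits.
CP_H2 = 14_300.0   # J/(kg K), approximate c_p of hydrogen gas
T_CORE = 2_800.0   # K, assumed solid-core temperature limit
G0 = 9.80665       # m/s^2, standard gravity

v_exhaust = math.sqrt(2 * CP_H2 * T_CORE)
isp = v_exhaust / G0
print(f"exhaust velocity ~ {v_exhaust:.0f} m/s, Isp ~ {isp:.0f} s")
```

The result lands near 900 s, which is why solid-core antimatter engines, like nuclear thermal rockets, are quoted at roughly 10³ seconds regardless of the energy source: the limit is the core material, not the available energy.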

The gaseous core system replaces the melting-point-limited solid with a high-temperature gas (e.g. tungsten gas/plasma), thus permitting higher operational temperatures and performance (~2×10³ s). However, the longer mean free path for thermalization and absorption results in much lower energy conversion efficiency (~35%).

The plasma core allows the gas to ionize and operate at even higher effective temperatures. Heat loss is suppressed by magnetic confinement in the reaction chamber and nozzle. Although performance is extremely high (~10⁴–10⁵ s), the long mean free path results in very low energy utilization (~10%).

Antimatter power generation

The idea of using antimatter to power an electric space drive has also been proposed. These designs are typically similar to those suggested for nuclear electric rockets. Antimatter annihilations are used to directly or indirectly heat a working fluid, as in a nuclear thermal rocket, but the fluid is used to generate electricity, which in turn powers some form of electric space propulsion system. The resulting system shares many of the characteristics of other charged-particle/electric propulsion proposals: typically high specific impulse and low thrust.

Catalyzed fission/fusion or spiked fusion

This is a hybrid approach in which antiprotons are used to catalyze a fission/fusion reaction or to "spike" the propulsion of a fusion rocket or any similar applications.

The antiproton-driven inertial confinement fusion (ICF) rocket concept uses pellets for the D-T reaction. The pellet consists of a hemisphere of fissionable material such as U-235, with a hole through which a pulse of antiprotons and positrons is injected, surrounded by a hemisphere of fusion fuel, for example deuterium-tritium or lithium deuteride. Antiproton annihilation occurs at the surface of the fissionable hemisphere, which ionizes the fuel. These ions heat the core of the pellet to fusion temperatures.

The antiproton-driven Magnetically Insulated Inertial Confinement Fusion Propulsion (MICF) concept relies on a self-generated magnetic field that insulates the plasma from the metallic shell containing it during the burn. The lifetime of the plasma was estimated to be two orders of magnitude greater than in implosion-driven inertial fusion, which corresponds to a longer burn time and hence greater gain.

The antimatter-driven P-B11 concept uses antiprotons to ignite the p-¹¹B reactions in an MICF scheme. Excessive radiation losses are a major obstacle to ignition and require tuning the particle density and plasma temperature to increase the gain. It was concluded that it is entirely feasible that this system could achieve a specific impulse of ~10⁵ s.

A different approach was envisioned for AIMStar in which small fusion fuel droplets would be injected into a cloud of antiprotons confined in a very small volume within a reaction Penning trap. Annihilation takes place on the surface of the antiproton cloud, peeling back 0.5% of the cloud. The power density released is roughly comparable to a 1 kJ, 1 ns laser depositing its energy over a 200 μm ICF target.

The ICAN-II project employs the antiproton-catalyzed microfission (ACMF) concept, which uses pellets with a 9:1 molar ratio of D-T to U-235 for nuclear pulse propulsion.

Difficulties with antimatter rockets

The chief practical difficulties with antimatter rockets are the problems of creating antimatter and storing it. Creating antimatter requires input of vast amounts of energy, at least equivalent to the rest energy of the created particle/antiparticle pairs, and typically (for antiproton production) tens of thousands to millions of times more. Most storage schemes proposed for interstellar craft require the production of frozen pellets of antihydrogen. This requires cooling of antiprotons, binding to positrons, and capture of the resulting antihydrogen atoms - tasks which have, as of 2010, been performed only for small numbers of individual atoms. Storage of antimatter is typically envisioned as trapping electrically charged frozen antihydrogen pellets in Penning or Paul traps. There is no theoretical barrier to performing these tasks at the scale required to fuel an antimatter rocket. However, they are expected to be extremely (and perhaps prohibitively) expensive, since current production abilities can create only small numbers of atoms - a scale approximately 10²³ times smaller than needed for a 10-gram trip to Mars.
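The scale of the energy problem for that 10-gram figure can be sketched directly from mass-energy equivalence; pair production creates matter and antimatter together, so at least twice the antimatter's rest energy is needed even before the enormous production inefficiencies quoted above:

```python
# Minimum energy to create 10 grams of antimatter (the "trip to Mars" figure).
# Pair production yields a particle and an antiparticle together, so the
# floor is twice the antimatter rest energy; real antiproton production is
# tens of thousands to millions of times less efficient than this floor.
C = 299_792_458.0      # speed of light, m/s
m_antimatter = 0.010   # kg

E_min = 2 * m_antimatter * C**2
print(f"minimum energy: {E_min:.2e} J")                # ~1.8e15 J
print(f"  ~ {E_min / 3.6e12:.0f} GWh of electricity")  # ~499 GWh
```

Roughly half a terawatt-hour as an absolute floor, before any production inefficiency, for a single small mission.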

Generally, the energy from antiproton annihilation is deposited over such a large region that it cannot efficiently drive nuclear capsules. Antiproton-induced fission and self-generated magnetic fields may greatly enhance energy localization and efficient use of annihilation energy.

A secondary problem is the extraction of useful energy or momentum from the products of antimatter annihilation, which are primarily in the form of extremely energetic ionizing radiation. The mechanisms proposed to date have for the most part provided plausible means of harnessing energy from these annihilation products. The classic rocket equation - with its "wet" mass (m₀, with propellant) to "dry" mass (m₁, with payload) mass ratio (m₀/m₁), its velocity change (Δv), and its specific impulse (I_sp) - no longer holds, due to the mass losses occurring in antimatter annihilation.

Another general problem with high-powered propulsion is excess heat or waste heat; antimatter-matter annihilation also produces extreme radiation. A proton-antiproton annihilation propulsion system transforms 39% of the propellant mass into an intense high-energy flux of gamma radiation. The gamma rays and the high-energy charged pions will cause heating and radiation damage if they are not shielded against. Unlike neutrons, they will not cause the exposed material to become radioactive by transmutation of the nuclei. The components needing shielding are the crew, the electronics, the cryogenic tankage, and the magnetic coils for magnetically assisted rockets. Two types of shielding are needed: radiation protection and thermal protection (distinct from a re-entry heat shield or thermal insulation).

Finally, relativistic considerations have to be taken into account. Because the by-products of annihilation move at relativistic velocities, their momentum and energy must be treated relativistically; for example, the total mass-energy content of the neutral pion is converted into gammas, not just its rest mass. It is necessary to use a relativistic rocket equation that takes into account the relativistic effects of both the vehicle and the propellant exhaust (charged pions) moving near the speed of light. These two modifications to the rocket equation result in a mass ratio (m₀/m₁) for a given velocity change (Δv) and specific impulse (I_sp) that is much higher for a relativistic antimatter rocket than for either a classical or relativistic "conventional" rocket.

Modified relativistic rocket equation

The loss of mass specific to antimatter annihilation requires a modification of the relativistic rocket equation, whose standard form is

    Δv = c · tanh[ (I_sp / c) · ln(m₀ / m₁) ]        (I)

where c is the speed of light, and I_sp is the specific impulse (i.e. I_sp / c = 0.69).

The derivative form of the equation is

    m_n · dΔv/dm_n = −a · I_sp · [ 1 − (Δv / c)² ]        (II)

where m_n is the non-relativistic (rest) mass of the rocket ship, and a is the fraction of the original (on-board) propellant mass (non-relativistic) remaining after annihilation (i.e., a = 0.22 for the charged pions).

Eq. II is difficult to integrate analytically. If it is assumed that a and I_sp are constant over the burn, such that they can be moved outside the integral, then the resulting equation is

    ∫ dΔv / [ 1 − (Δv / c)² ] = −a · I_sp · ∫ dm_n / m_n        (III)

Eq. III can be integrated and the integral evaluated for m₀ and m₁, and for initial and final velocities (Δv₀ = 0 and Δv₁ = Δv). The resulting relativistic rocket equation with loss of propellant is

    Δv = c · tanh[ (a · I_sp / c) · ln(m₀ / m₁) ]        (IV)
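A minimal numerical sketch of Eq. IV, assuming (as one common way of writing the result) that the annihilation loss enters as an effective specific impulse a·I_sp inside the standard relativistic rocket equation, with the quoted values I_sp = 0.69c and a = 0.22:

```python
import math

# Relativistic rocket equation with propellant loss, sketched under the
# assumption that losses act as an effective specific impulse a*Isp in
#   delta-v = c * tanh((Isp/c) * ln(m0/m1)).
# Values below are those quoted in the text, not design data.
C = 1.0      # work in units of c
ISP = 0.69   # specific impulse in units of c
A = 0.22     # fraction of propellant mass surviving as charged pions

def mass_ratio(dv, isp):
    """m0/m1 required for a given delta-v (inverting the tanh form)."""
    return math.exp(math.atanh(dv / C) * C / isp)

dv = 0.5  # target delta-v of 0.5c
ideal = mass_ratio(dv, ISP)        # no annihilation loss
lossy = mass_ratio(dv, A * ISP)    # loss folded into the effective Isp
print(f"m0/m1 without loss: {ideal:.2f}")  # ~2.2
print(f"m0/m1 with loss:    {lossy:.1f}")  # ~37
```

The loss factor a compounds exponentially through the logarithm, which is why the mass ratio for a real pion rocket is so much worse than the ideal-I_sp figure suggests.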

Other general issues

The cosmic background hard radiation will ionize the rocket's hull over time and poses a health threat. Gas-plasma interactions may also cause spacecraft charging; the major concern is differential charging of various parts of a spacecraft, leading to high electric fields and arcing between spacecraft components. This can be resolved with well-placed plasma contactors, although there is as yet no solution for the periods when plasma contactors are turned off to allow maintenance work on the hull. Long-term space flight at interstellar velocities also erodes the rocket's hull through collisions with particles, gas, dust, and micrometeorites. At 0.2c over a 6 light-year distance, erosion is estimated to be on the order of 30 kg/m², equivalent to about 1 cm of aluminum shielding.

Nuclear pulse propulsion

From Wikipedia, the free encyclopedia
An artist's conception of the Project Orion "basic" spacecraft, powered by nuclear pulse propulsion.

Nuclear pulse propulsion or external pulsed plasma propulsion is a hypothetical method of spacecraft propulsion that uses nuclear explosions for thrust. It originated as Project Orion, with support from ARPA, after a suggestion by Stanislaw Ulam in 1947. Newer designs using inertial confinement fusion, including Project Daedalus and Project Longshot, have been the baseline for most later work.

History

Los Alamos

Calculations for a potential use of this technology were made at the laboratory from the late 1940s to the mid-1950s.

Project Orion

A nuclear pulse propulsion unit. The explosive charge ablatively vaporizes the propellant, propelling it away from the charge, and simultaneously creating a plasma out of the propellant. The propellant then goes on to impact the pusher plate at the bottom of the Orion spacecraft, imparting a pulse of 'pushing' energy.

Project Orion was the first serious attempt to design a nuclear pulse rocket. A design was formed at General Atomics during the late 1950s and early 1960s, with the idea of detonating small directional nuclear explosives, utilizing a variant of the Teller–Ulam two-stage bomb design, against a large steel pusher plate attached to the spacecraft with shock absorbers. Efficient directional explosives maximized the momentum transfer, leading to specific impulses in the range of 6,000 seconds, or about thirteen times that of the Space Shuttle main engine. With refinements a theoretical maximum of 100,000 seconds (1 MN·s/kg) might be possible. Thrusts were in the millions of tons, allowing spacecraft larger than 8×10⁶ tons to be built with 1958 materials.
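The quoted performance figures are mutually consistent, as a quick unit conversion shows (specific impulse in seconds converts to effective exhaust velocity via v_e = I_sp · g₀; the 452 s SSME vacuum figure is the standard published value):

```python
# Cross-checking the quoted Orion performance figures.
G0 = 9.80665  # m/s^2, standard gravity

orion_isp = 6_000.0  # s, directional-explosive Orion estimate
ssme_isp = 452.0     # s, Space Shuttle main engine, vacuum
print(f"Orion / SSME: {orion_isp / ssme_isp:.1f}x")  # ~13x, as quoted

theoretical_isp = 100_000.0   # s, refined theoretical maximum
v_e = theoretical_isp * G0    # effective exhaust velocity, N*s/kg
print(f"exhaust velocity: {v_e / 1e6:.2f} MN*s/kg")  # ~0.98, i.e. ~1 MN*s/kg
```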

The reference design was to be constructed of steel using submarine-style construction with a crew of more than 200 and a vehicle takeoff weight of several thousand tons. This single-stage reference design would reach Mars and return in four weeks from the Earth's surface (compared to 12 months for NASA's current chemically powered reference mission). The same craft could visit Saturn's moons in a seven-month mission (compared to chemically powered missions of about nine years). Notable engineering problems that occurred were related to crew shielding and pusher-plate lifetime.

Although the system appeared to be workable, the project was shut down in 1965, primarily because the Partial Test Ban Treaty made it illegal; in fact, before the treaty, the US and Soviet Union had already detonated at least nine nuclear bombs between them, including thermonuclear bombs, in space, i.e., at altitudes of over 100 km (see high-altitude nuclear explosions). Ethical issues complicated the launch of such a vehicle within the Earth's magnetosphere: calculations using the (disputed) linear no-threshold model of radiation damage showed that the fallout from each takeoff would cause the death of approximately 1 to 10 individuals. In a threshold model, such extremely low levels of thinly distributed radiation would have no associated ill effects, while under hormesis models, such tiny doses would be negligibly beneficial. The use of less efficient clean nuclear bombs for achieving orbit, and then more efficient, higher-yield dirtier bombs for travel, would significantly reduce the amount of fallout caused by an Earth-based launch.

One useful mission would be to deflect an asteroid or comet on collision course with the Earth, depicted dramatically in the 1998 film Deep Impact. The high performance would permit even a late launch to succeed, and the vehicle could effectively transfer a large amount of kinetic energy to the asteroid by simple impact. The prospect of an imminent asteroid impact would obviate concerns over the few predicted deaths from fallout. An automated mission would remove the challenge of designing a shock absorber that would protect the crew.

Orion is one of very few interstellar space drives that could theoretically be constructed with available technology, as discussed in a 1968 paper, "Interstellar Transport" by Freeman Dyson.

Project Daedalus

Project Daedalus was a study conducted between 1973 and 1978 by the British Interplanetary Society (BIS) to design an interstellar uncrewed spacecraft that could reach a nearby star within about 50 years. A dozen scientists and engineers led by Alan Bond worked on the project. At the time fusion research appeared to be making great strides, and in particular, inertial confinement fusion (ICF) appeared to be adaptable as a rocket engine.

ICF uses small pellets of fusion fuel, typically lithium deuteride (⁶Li²H) with a small deuterium/tritium trigger at the center. The pellets are thrown into a reaction chamber where they are hit on all sides by lasers or another form of beamed energy. The heat generated by the beams explosively compresses the pellet to the point where fusion takes place. The result is a hot plasma and a very small "explosion", far smaller than the minimum-size bomb that would be required to release the same energy by fission.

For Daedalus, this process was to be run within a large electromagnet that formed the rocket engine. After the reaction, ignited by electron beams, the magnet funnelled the hot gas to the rear for thrust. Some of the energy was diverted to run the ship's systems and engine. In order to make the system safe and energy efficient, Daedalus was to be powered by a helium-3 fuel collected from Jupiter.

Medusa

Conceptual diagram of a Medusa propulsion spacecraft, showing: (A) the payload capsule, (B) the winch mechanism, (C) the optional main tether cable, (D) riser tethers, and (E) the parachute mechanism.
 
Operating sequence of the Medusa propulsion system: (1) at the moment of explosive-pulse unit firing; (2) as the explosive pulse reaches the parachute canopy; (3) the pulse pushes the canopy, accelerating it away from the explosion, while the spacecraft plays out the main tether with the winch, generating electricity as it extends and accelerating the spacecraft; (4) the winch finally pulls the spacecraft forward to the canopy, and excess electricity is used for other purposes.

The Medusa design has more in common with solar sails than with conventional rockets. It was envisioned by Johndale Solem in the 1990s and published in the Journal of the British Interplanetary Society (JBIS).

A Medusa spacecraft would deploy a large sail ahead of it, attached by independent cables, and then launch nuclear explosives forward to detonate between itself and its sail. The sail would be accelerated by the plasma and photonic impulse, paying out the tethers as when a hooked fish runs line off a fisherman's reel, and generating electricity at the "reel". The spacecraft would then use some of the generated electricity to winch itself up towards the sail, accelerating smoothly and continuously as it goes.

In the original design, multiple tethers connected to multiple motor generators. The advantage over the single tether is to increase the distance between the explosion and the tethers, thus reducing damage to the tethers.

For heavy payloads, performance could be improved by taking advantage of lunar materials, for example, wrapping the explosive with lunar rock or water, stored previously at a stable Lagrange point.

Medusa performs better than the classical Orion design because its sail intercepts more of the explosive impulse, its shock-absorber stroke is much longer, and its major structures are in tension and hence can be quite lightweight. Medusa-type ships would be capable of a specific impulse between 50,000 and 100,000 seconds (500 to 1000 kN·s/kg).

Medusa became widely known to the public in the BBC documentary film To Mars By A-Bomb: The Secret History of Project Orion. A short film shows an artist's conception of how the Medusa spacecraft works "by throwing bombs into a sail that's ahead of it".

Project Longshot

Project Longshot was a NASA-sponsored research project carried out in conjunction with the US Naval Academy in the late 1980s. Longshot was in some ways a development of the basic Daedalus concept, in that it used magnetically funneled ICF. The key difference was that they felt that the reaction could not power both the rocket and the other systems, and instead included a 300 kW conventional nuclear reactor for running the ship. The added weight of the reactor reduced performance somewhat, but even using LiD fuel it would be able to reach neighboring star Alpha Centauri in 100 years (approx. velocity of 13,411 km/s, at a distance of 4.5 light years, equivalent to 4.5% of light speed).
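The quoted Longshot figures are internally consistent, as a quick check shows (using standard values for the light-year and the year):

```python
# Consistency check on the quoted Longshot figures:
# 4.5 light-years in about 100 years at 13,411 km/s.
C = 299_792.458    # speed of light, km/s
LY_KM = 9.4607e12  # kilometres per light-year
YEAR_S = 3.1557e7  # seconds per Julian year

v = 13_411.0       # km/s, quoted cruise velocity
frac_c = v / C
years = 4.5 * LY_KM / v / YEAR_S
print(f"{frac_c:.1%} of light speed")                # ~4.5%, as quoted
print(f"travel time for 4.5 ly: {years:.0f} years")  # ~101 years
```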

Antimatter-catalyzed nuclear reaction

In the mid-1990s, research at Pennsylvania State University led to the concept of using antimatter to catalyze nuclear reactions. Antiprotons would react inside the nucleus of uranium, releasing energy that breaks the nucleus apart as in conventional nuclear reactions. Even a small number of such reactions can start the chain reaction that would otherwise require a much larger volume of fuel to sustain. Whereas the "normal" critical mass for plutonium is about 11.8 kilograms (for a sphere at standard density), with antimatter catalyzed reactions this could be well under one gram.

Several rocket designs using this reaction were proposed, some which would use all-fission reactions for interplanetary missions, and others using fission-fusion (effectively a very small version of Orion's bombs) for interstellar missions.

Magneto-inertial fusion

MSNW magneto-inertial fusion driven rocket
Concept graphic of a fusion-driven rocket powered spacecraft arriving at Mars
Designer: MSNW LLC
Application: Interplanetary
Status: Theoretical
Performance
  • Specific impulse: 1,606 s to 5,722 s (depending on fusion gain)
  • Burn time: 1 day to 90 days (10 days optimal with gain of 40)
Notes
  • Fuel: Deuterium-tritium cryogenic pellet
  • Propellant: Lithium or aluminum
  • Power requirements: 100 kW to 1,000 kW

NASA funded MSNW LLC and the University of Washington in 2011 to study and develop a fusion rocket through the NASA Innovative Advanced Concepts NIAC Program.

The rocket uses a form of magneto-inertial fusion to produce a direct thrust fusion rocket. Magnetic fields cause large metal rings to collapse around the deuterium-tritium plasma, triggering fusion. The energy heats and ionizes the shell of metal formed by the crushed rings. The hot, ionized metal is shot out of a magnetic rocket nozzle at high speed (up to 30 km/s). Repeating this process roughly every minute would propel the spacecraft. The fusion reaction is not self-sustaining and requires electrical energy to ignite each pulse. With electrical requirements estimated at between 100 kW and 1,000 kW (300 kW average), designs incorporate solar panels to produce the required energy.

Foil liner compression creates fusion at the proper energy scale. The proof-of-concept experiment in Redmond, Washington, was to use aluminum liners for compression, while the ultimate design was to use lithium liners.

Performance characteristics are dependent on the fusion energy gain factor achieved by the reactor. Gains were expected to be between 20 and 200, with an estimated average of 40. Higher gains produce higher exhaust velocity, higher specific impulse and lower electrical power requirements. The table below summarizes different performance characteristics for a theoretical 90-day Mars transfer at gains of 20, 40 and 200.

FDR parameters for a 90-day Mars transfer burn

Parameter                        Gain of 20   Gain of 40   Gain of 200
Liner mass (kg)                  0.365        0.365        0.365
Specific impulse (s)             1,606        2,435        5,722
Mass fraction                    0.33         0.47         0.68
Specific mass (kg/kW)            0.8          0.53         0.23
Propellant mass (kg)             110,000      59,000       20,000
Initial mass (kg)                184,000      130,000      90,000
Electrical power required (kW)   1,019        546          188
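The table's columns can be sanity-checked against the classical rocket equation (adequate at these speeds); a sketch for the gain-of-40 column, using only values taken from the table:

```python
import math

# Rough delta-v implied by the gain-of-40 column of the table above,
# via the classical Tsiolkovsky rocket equation:
#   delta-v = Isp * g0 * ln(m_initial / m_final)
G0 = 9.80665           # m/s^2
isp = 2_435.0          # s, specific impulse at gain of 40
m_initial = 130_000.0  # kg, initial mass
m_propellant = 59_000.0  # kg, propellant mass

m_final = m_initial - m_propellant
dv = isp * G0 * math.log(m_initial / m_final)
print(f"delta-v ~ {dv / 1000:.1f} km/s")  # ~14 km/s
```

Around 14 km/s is a plausible budget for a fast Mars transfer, consistent with the 90-day mission the table describes.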

By April 2013, MSNW had demonstrated subcomponents of the systems: heating deuterium plasma up to fusion temperatures and concentrating the magnetic fields needed to create fusion. They planned to put the two technologies together for a test before the end of 2013.

Pulsed fission-fusion propulsion

Pulsed Fission-Fusion (PuFF) propulsion relies on principles similar to magneto-inertial fusion. It aims to solve the problem of the extreme stress that an Orion-like motor induces on its containment by ejecting the plasma obtained from small fuel pellets that undergo autocatalytic fission and fusion reactions initiated by a Z-pinch. It is a theoretical propulsion system researched through the NIAC Program by the University of Alabama in Huntsville. In essence it is a fusion rocket that uses a Z-pinch configuration, coupled with a fission reaction to boost the fusion process.

A PuFF fuel pellet, around 1 cm in diameter, consists of two components: a deuterium-tritium (D-T) cylinder of plasma, called the target, which undergoes fusion, and a surrounding U-235 sheath, which undergoes fission, enveloped by a lithium liner. Liquid lithium, serving as a moderator, fills the space between the D-T cylinder and the uranium sheath. When current is run through the liquid lithium, a Lorentz force is generated that compresses the D-T plasma by a factor of 10 in what is known as a Z-pinch. The compressed plasma ignites and undergoes fusion reactions. However, the fusion energy gain (Q) of these reactions is far below breakeven (Q < 1), meaning that the reaction consumes more energy than it produces.

In a PuFF design, the fast neutrons released by the initial fusion reaction induce fission in the U-235 sheath. The resultant heat causes the sheath to expand, increasing its implosion velocity onto the D-T core and compressing it further, releasing more fast neutrons. Those again amplify the fission rate in the sheath, rendering the process autocatalytic. It is hoped that this results in a complete burn up of both the fission and fusion fuels, making PuFF more efficient than other nuclear pulse concepts. Much like in a magneto-inertial fusion rocket, the performance of the engine is dependent on the degree to which the fusion gain of the D-T target is increased.

One "pulse" consists of the injection of a fuel pellet into the combustion chamber, its consumption through a series of fission-fusion reactions, and finally the ejection of the released plasma through a magnetic nozzle, generating thrust. A single pulse is expected to take only a fraction of a second to complete.

Gamma ray

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Gamma_ray
 
Illustration of an emission of a gamma ray (γ) from an atomic nucleus
 
Gamma rays are emitted during nuclear fission in nuclear explosions.
 
NASA guide to electromagnetic spectrum showing overlap of frequency between X-rays and gamma rays

A gamma ray, also known as gamma radiation (symbol γ), is a penetrating form of electromagnetic radiation arising from the radioactive decay of atomic nuclei. It consists of the shortest-wavelength electromagnetic waves, typically shorter than those of X-rays. With frequencies above 30 exahertz (3×10¹⁹ Hz), it imparts the highest photon energy. Paul Villard, a French chemist and physicist, discovered gamma radiation in 1900 while studying radiation emitted by radium. In 1903, Ernest Rutherford named this radiation gamma rays based on their relatively strong penetration of matter; in 1900 he had already named two less penetrating types of decay radiation (discovered by Henri Becquerel) alpha rays and beta rays in ascending order of penetrating power.
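The 30 EHz frequency bound corresponds to a photon energy given by the Planck relation E = hν:

```python
# Photon energy at the quoted 30 EHz lower bound for gamma rays, E = h * nu.
H = 6.62607015e-34          # J*s, Planck constant (exact, SI definition)
EV_J = 1.602176634e-19      # J per electronvolt (exact, SI definition)

nu = 3e19                   # Hz, i.e. 30 exahertz
E_keV = H * nu / EV_J / 1e3
print(f"E = {E_keV:.0f} keV")  # ~124 keV
```

This lands just above the ~100 keV boundary that astrophysics uses to separate X-rays from gamma rays, so the frequency-based and energy-based definitions roughly agree.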

Gamma rays from radioactive decay are in the energy range from a few kiloelectronvolts (keV) to approximately 8 megaelectronvolts (MeV), corresponding to the typical energy levels in nuclei with reasonably long lifetimes. The energy spectrum of gamma rays can be used to identify the decaying radionuclides using gamma spectroscopy. Very-high-energy gamma rays in the 100–1000 teraelectronvolt (TeV) range have been observed from sources such as the Cygnus X-3 microquasar.

Natural sources of gamma rays originating on Earth are mostly a result of radioactive decay and secondary radiation from atmospheric interactions with cosmic ray particles. However, there are other rare natural sources, such as terrestrial gamma-ray flashes, which produce gamma rays from electron action upon the nucleus. Notable artificial sources of gamma rays include fission, such as that which occurs in nuclear reactors, and high energy physics experiments, such as neutral pion decay and nuclear fusion.

Gamma rays and X-rays are both electromagnetic radiation, and since they overlap in the electromagnetic spectrum, the terminology varies between scientific disciplines. In some fields of physics, they are distinguished by their origin: Gamma rays are created by nuclear decay while X-rays originate outside the nucleus. In astrophysics, gamma rays are conventionally defined as having photon energies above 100 keV and are the subject of gamma ray astronomy, while radiation below 100 keV is classified as X-rays and is the subject of X-ray astronomy.

Gamma rays are ionizing radiation and are thus hazardous to life. Due to their high penetration power, they can damage bone marrow and internal organs. Unlike alpha and beta rays, they easily pass through the body and thus pose a formidable radiation protection challenge, requiring shielding made from dense materials such as lead or concrete. On Earth, the magnetosphere protects life from most types of lethal cosmic radiation other than gamma rays.

History of discovery

The first gamma ray source to be discovered was the radioactive decay process called gamma decay. In this type of decay, an excited nucleus emits a gamma ray almost immediately upon formation. Paul Villard, a French chemist and physicist, discovered gamma radiation in 1900, while studying radiation emitted from radium. Villard knew that his described radiation was more powerful than previously described types of rays from radium, which included beta rays, first noted as "radioactivity" by Henri Becquerel in 1896, and alpha rays, discovered as a less penetrating form of radiation by Rutherford, in 1899. However, Villard did not consider naming them as a different fundamental type. Later, in 1903, Villard's radiation was recognized as being of a type fundamentally different from previously named rays by Ernest Rutherford, who named Villard's rays "gamma rays" by analogy with the beta and alpha rays that Rutherford had differentiated in 1899. The "rays" emitted by radioactive elements were named in order of their power to penetrate various materials, using the first three letters of the Greek alphabet: alpha rays as the least penetrating, followed by beta rays, followed by gamma rays as the most penetrating. Rutherford also noted that gamma rays were not deflected (or at least, not easily deflected) by a magnetic field, another property making them unlike alpha and beta rays.

Gamma rays were first thought to be particles with mass, like alpha and beta rays. Rutherford initially believed that they might be extremely fast beta particles, but their failure to be deflected by a magnetic field indicated that they had no charge. In 1914, gamma rays were observed to be reflected from crystal surfaces, proving that they were electromagnetic radiation. Rutherford and his co-worker Edward Andrade measured the wavelengths of gamma rays from radium, and found they were similar to X-rays, but with shorter wavelengths and thus, higher frequency. This was eventually recognized as giving them more energy per photon, as soon as the latter term became generally accepted. A gamma decay was then understood to usually emit a gamma photon.

Sources

Natural sources of gamma rays on Earth include gamma decay from naturally occurring radioisotopes such as potassium-40, and also secondary radiation from various atmospheric interactions with cosmic ray particles. Rare terrestrial natural sources that produce gamma rays of non-nuclear origin include lightning strikes and terrestrial gamma-ray flashes, which produce high-energy emissions from natural high-voltage discharges. Gamma rays are produced by a number of astronomical processes in which very high-energy electrons are produced. Such electrons produce secondary gamma rays by the mechanisms of bremsstrahlung, inverse Compton scattering and synchrotron radiation. A large fraction of such astronomical gamma rays are screened by Earth's atmosphere. Notable artificial sources of gamma rays include fission, such as occurs in nuclear reactors, as well as high energy physics experiments, such as neutral pion decay and nuclear fusion.

A sample of gamma ray-emitting material that is used for irradiating or imaging is known as a gamma source. It is also called a radioactive source, isotope source, or radiation source, though these more general terms also apply to alpha and beta-emitting devices. Gamma sources are usually sealed to prevent radioactive contamination, and transported in heavy shielding.

Radioactive decay (gamma decay)

Gamma rays are produced during gamma decay, which normally occurs after other forms of decay occur, such as alpha or beta decay. A radioactive nucleus can decay by the emission of an α or β particle. The daughter nucleus that results is usually left in an excited state. It can then decay to a lower energy state by emitting a gamma ray photon, in a process called gamma decay.

The emission of a gamma ray from an excited nucleus typically requires only 10⁻¹² seconds. Gamma decay may also follow nuclear reactions such as neutron capture, nuclear fission, or nuclear fusion. Gamma decay is also a mode of relaxation of many excited states of atomic nuclei following other types of radioactive decay, such as beta decay, so long as these states possess the necessary component of nuclear spin. When high-energy gamma rays, electrons, or protons bombard materials, the excited atoms emit characteristic "secondary" gamma rays, which are products of the creation of excited nuclear states in the bombarded atoms. Such transitions, a form of nuclear gamma fluorescence, form a topic in nuclear physics called gamma spectroscopy. Formation of fluorescent gamma rays is a rapid subtype of radioactive gamma decay.

In certain cases, the excited nuclear state that follows the emission of a beta particle or other type of excitation may be more stable than average, and is termed a metastable excited state if its decay takes (at least) 100 to 1000 times longer than the average 10⁻¹² seconds. Such relatively long-lived excited nuclei are termed nuclear isomers, and their decays are termed isomeric transitions. Such nuclei have half-lives that are more easily measurable, and rare nuclear isomers are able to stay in their excited state for minutes, hours, days, or occasionally far longer, before emitting a gamma ray. The process of isomeric transition is therefore similar to any gamma emission, but differs in that it involves the intermediate metastable excited state(s) of the nuclei. Metastable states are often characterized by high nuclear spin, requiring a change in spin of several units or more with gamma decay, instead of a single unit transition that occurs in only 10⁻¹² seconds. The rate of gamma decay is also slowed when the energy of excitation of the nucleus is small.

An emitted gamma ray from any type of excited state may transfer its energy directly to any electrons, but most probably to one of the K shell electrons of the atom, causing it to be ejected from that atom, in a process generally termed the photoelectric effect (external gamma rays and ultraviolet rays may also cause this effect). The photoelectric effect should not be confused with the internal conversion process, in which a gamma ray photon is not produced as an intermediate particle (rather, a "virtual gamma ray" may be thought to mediate the process).

Decay schemes

Radioactive decay scheme of ⁶⁰Co

Gamma emission spectrum of cobalt-60

One example of gamma ray production due to radionuclide decay is the decay scheme for cobalt-60, as illustrated in the accompanying diagram. First, ⁶⁰Co decays to excited ⁶⁰Ni by beta decay, emitting an electron of 0.31 MeV. Then the excited ⁶⁰Ni decays to the ground state (see nuclear shell model) by emitting gamma rays in succession of 1.17 MeV followed by 1.33 MeV. This path is followed 99.88% of the time:

⁶⁰₂₇Co → ⁶⁰₂₈Ni* + e⁻ + ν̄ₑ + γ (1.17 MeV)
⁶⁰₂₈Ni* → ⁶⁰₂₈Ni + γ (1.33 MeV)
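The energies quoted along this decay path can be totaled as a rough check of the energy released per decay. Note that the 0.31 MeV beta figure is an endpoint energy, part of which is shared with the antineutrino, so this is an illustrative sum rather than a precise Q-value:

```python
# Sum of the particle energies along the dominant Co-60 decay path
# quoted above (beta endpoint plus the two successive gamma rays).
beta_endpoint_mev = 0.31
gamma_mev = [1.17, 1.33]

total_mev = beta_endpoint_mev + sum(gamma_mev)
print(f"total: {total_mev:.2f} MeV")  # total: 2.81 MeV
```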

Another example is the alpha decay of ²⁴¹Am to form ²³⁷Np, which is followed by gamma emission. In some cases, the gamma emission spectrum of the daughter nucleus is quite simple (e.g. ⁶⁰Co/⁶⁰Ni), while in other cases, such as with ²⁴¹Am/²³⁷Np and ¹⁹²Ir/¹⁹²Pt, the gamma emission spectrum is complex, revealing that a series of nuclear energy levels exists.

Particle physics

Gamma rays are produced in many processes of particle physics. Typically, gamma rays are the products of neutral systems which decay through electromagnetic interactions (rather than a weak or strong interaction). For example, in an electron–positron annihilation, the usual products are two gamma ray photons. If the annihilating electron and positron are at rest, each of the resulting gamma rays has an energy of ~511 keV and frequency of ~1.24×10²⁰ Hz. Similarly, a neutral pion most often decays into two photons. Many other hadrons and massive bosons also decay electromagnetically. High energy physics experiments, such as the Large Hadron Collider, accordingly employ substantial radiation shielding. Because subatomic particles mostly have far shorter wavelengths than atomic nuclei, particle physics gamma rays are generally several orders of magnitude more energetic than nuclear decay gamma rays. Since gamma rays are at the top of the electromagnetic spectrum in terms of energy, all extremely high-energy photons are gamma rays; for example, a photon having the Planck energy would be a gamma ray.
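The 511 keV energy and ~1.24×10²⁰ Hz frequency quoted above follow directly from the electron rest energy E = m_e c² and the Planck relation E = hf. A short sketch using CODATA constants:

```python
# Energy and frequency of each annihilation photon, for an electron and
# positron annihilating at rest: E = m_e * c^2, f = E / h.
m_e = 9.1093837e-31     # electron mass, kg
c = 2.99792458e8        # speed of light, m/s
h = 6.62607015e-34      # Planck constant, J*s
eV = 1.602176634e-19    # joules per electronvolt

E = m_e * c**2          # rest energy of the electron, in joules
print(f"energy: {E / eV / 1e3:.0f} keV")   # -> 511 keV
print(f"frequency: {E / h:.2e} Hz")        # -> 1.24e+20 Hz
```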

Other sources

A few gamma rays in astronomy are known to arise from gamma decay (see discussion of SN1987A), but most do not.

Photons from astrophysical sources that carry energy in the gamma radiation range are often explicitly called gamma-radiation. In addition to nuclear emissions, they are often produced by sub-atomic particle and particle-photon interactions. Those include electron-positron annihilation, neutral pion decay, bremsstrahlung, inverse Compton scattering, and synchrotron radiation.

Laboratory sources

In October 2017, scientists from various European universities proposed a means for sources of GeV photons using lasers as exciters through a controlled interplay between the cascade and anomalous radiative trapping.

Terrestrial thunderstorms

Thunderstorms can produce a brief pulse of gamma radiation called a terrestrial gamma-ray flash. These gamma rays are thought to be produced by high intensity static electric fields accelerating electrons, which then produce gamma rays by bremsstrahlung as they collide with and are slowed by atoms in the atmosphere. Gamma rays up to 100 MeV can be emitted by terrestrial thunderstorms, and were discovered by space-borne observatories. This raises the possibility of health risks to passengers and crew on aircraft flying in or near thunderclouds.

Solar flares

The most effusive solar flares emit across the entire EM spectrum, including γ-rays. The first confident observation occurred in 1972.

Cosmic rays

Extraterrestrial, high energy gamma rays include the gamma ray background produced when cosmic rays (either high speed electrons or protons) collide with ordinary matter, producing pair-production gamma rays at 511 keV. Alternatively, bremsstrahlung photons are produced at energies of tens of MeV or more when cosmic ray electrons interact with nuclei of sufficiently high atomic number (see the gamma ray image of the Moon near the end of this article for an illustration).

Image of entire sky in 100 MeV or greater gamma rays as seen by the EGRET instrument aboard the CGRO spacecraft. Bright spots within the galactic plane are pulsars while those above and below the plane are thought to be quasars.

Pulsars and magnetars

The gamma ray sky (see illustration at right) is dominated by the more common and longer-term production of gamma rays that emanate from pulsars within the Milky Way. Sources in the rest of the sky are mostly quasars. Pulsars are thought to be neutron stars with magnetic fields that produce focused beams of radiation, and are far less energetic, more common, and much nearer sources (typically seen only in our own galaxy) than are quasars or the rarer gamma-ray burst sources. Pulsars have relatively long-lived magnetic fields that produce focused beams of charged particles at relativistic speeds, which emit gamma rays (bremsstrahlung) when they strike gas or dust in the nearby medium and are decelerated. This mechanism is similar to the production of high-energy photons in megavoltage radiation therapy machines (see bremsstrahlung). Another possible mechanism of gamma ray production is inverse Compton scattering, in which charged particles (usually electrons) impart energy to low-energy photons, boosting them to higher energies. Neutron stars with a very high magnetic field (magnetars), thought to produce astronomical soft gamma repeaters, are another relatively long-lived star-powered source of gamma radiation.

Quasars and active galaxies

More powerful gamma rays from very distant quasars and closer active galaxies are thought to have a gamma ray production source similar to a particle accelerator. High energy electrons produced by the quasar, and subjected to inverse Compton scattering, synchrotron radiation, or bremsstrahlung, are the likely source of the gamma rays from those objects. It is thought that a supermassive black hole at the center of such galaxies provides the power source that intermittently destroys stars and focuses the resulting charged particles into beams that emerge from their rotational poles. When those beams interact with gas, dust, and lower energy photons they produce X-rays and gamma rays. These sources are known to fluctuate with durations of a few weeks, suggesting their relatively small size (less than a few light-weeks across). Such sources of gamma and X-rays are the most commonly visible high intensity sources outside the Milky Way galaxy. They shine not in bursts (see illustration), but relatively continuously when viewed with gamma ray telescopes. The power of a typical quasar is about 10⁴⁰ watts, a small fraction of which is gamma radiation. Much of the rest is emitted as electromagnetic waves of all frequencies, including radio waves.

A hypernova. Artist's illustration showing the life of a massive star as nuclear fusion converts lighter elements into heavier ones. When fusion no longer generates enough pressure to counteract gravity, the star rapidly collapses to form a black hole. Theoretically, energy may be released during the collapse along the axis of rotation to form a long duration gamma-ray burst.

Gamma-ray bursts

The most intense sources of gamma rays are also the most intense sources of any type of electromagnetic radiation presently known. They are the "long duration burst" sources of gamma rays in astronomy ("long" in this context meaning a few tens of seconds), and they are rare compared with the sources discussed above. By contrast, "short" gamma-ray bursts of two seconds or less, which are not associated with supernovae, are thought to produce gamma rays during the collision of pairs of neutron stars, or of a neutron star and a black hole.

The so-called long-duration gamma-ray bursts produce a total energy output of about 10⁴⁴ joules (as much energy as the Sun will produce in its entire lifetime) but in a period of only 20 to 40 seconds. Gamma rays are approximately 50% of the total energy output. The leading hypotheses for the mechanism of production of these highest-known intensity beams of radiation are inverse Compton scattering and synchrotron radiation from high-energy charged particles. These processes occur as relativistic charged particles leave the region of the event horizon of a newly formed black hole created during a supernova explosion. The beam of particles moving at relativistic speeds is focused for a few tens of seconds by the magnetic field of the exploding hypernova. The fusion explosion of the hypernova drives the energetics of the process. If the narrowly directed beam happens to be pointed toward the Earth, it shines at gamma ray frequencies with such intensity that it can be detected even at distances of up to 10 billion light years, close to the edge of the visible universe.

Properties

Penetration of matter

Alpha radiation consists of helium nuclei and is readily stopped by a sheet of paper. Beta radiation, consisting of electrons or positrons, is stopped by an aluminium plate, but gamma radiation requires shielding by dense material such as lead or concrete.

Due to their penetrating nature, gamma rays require large amounts of shielding mass to reduce them to levels which are not harmful to living cells, in contrast to alpha particles, which can be stopped by paper or skin, and beta particles, which can be shielded by thin aluminium. Gamma rays are best absorbed by materials with high atomic numbers (Z) and high density, which contribute to the total stopping power. Because of this, a lead (high Z) shield is 20–30% better as a gamma shield than an equal mass of another low-Z shielding material, such as aluminium, concrete, water, or soil; lead's major advantage is not in lower weight, but rather its compactness due to its higher density. Protective clothing, goggles and respirators can protect from internal contact with or ingestion of alpha or beta emitting particles, but provide no protection from gamma radiation from external sources.

The higher the energy of the gamma rays, the thicker the shielding required from a given material. Materials for shielding gamma rays are typically characterized by the thickness required to reduce the intensity of the gamma rays by one half (the half-value layer or HVL). For example, gamma rays that require 1 cm (0.4 inch) of lead to reduce their intensity by 50% will also have their intensity reduced in half by 4.1 cm of granite rock, 6 cm (2.5 inches) of concrete, or 9 cm (3.5 inches) of packed soil. However, the mass of this much concrete or soil is only 20–30% greater than that of lead with the same absorption capability. Depleted uranium is used for shielding in portable gamma ray sources, but here the savings in weight over lead are larger, as a portable source is very small relative to the required shielding, so the shielding resembles a sphere to some extent. The volume of a sphere depends on the cube of the radius, so a source with its radius cut in half will have its volume (and weight) reduced by a factor of eight, which will more than compensate for uranium's greater density (as well as reducing bulk). In a nuclear power plant, shielding can be provided by steel and concrete in the pressure and particle containment vessel, while water provides radiation shielding of fuel rods during storage or transport into the reactor core. The loss of water, or removal of a "hot" fuel assembly into the air, would result in much higher radiation levels than when it is kept under water.
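Since each half-value layer halves the intensity, the transmitted fraction through a shield is 0.5^(t/HVL), and the thickness needed for a given reduction follows by inverting that relation. A sketch using the illustrative HVL figures quoted above, which apply only at one particular gamma energy:

```python
import math

# Transmitted fraction through a shield: each half-value layer (HVL)
# halves the intensity, so I/I0 = 0.5 ** (thickness / HVL).
def transmitted_fraction(thickness_cm, hvl_cm):
    return 0.5 ** (thickness_cm / hvl_cm)

# Inverse relation: thickness needed to reach a target transmitted fraction.
def thickness_for_fraction(target, hvl_cm):
    return hvl_cm * math.log2(1.0 / target)

# HVLs from the example above (valid for one gamma energy only):
hvl_cm = {"lead": 1.0, "granite": 4.1, "concrete": 6.0, "packed soil": 9.0}

print(transmitted_fraction(2.0, hvl_cm["lead"]))          # 2 HVLs -> 0.25
print(thickness_for_fraction(0.001, hvl_cm["concrete"]))  # ~59.8 cm for 1000x reduction
```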

Matter interaction

The total absorption coefficient of aluminium (atomic number 13) for gamma rays, plotted versus gamma energy, and the contributions by the three effects. As is usual, the photoelectric effect is largest at low energies, Compton scattering dominates at intermediate energies, and pair production dominates at high energies.
The total absorption coefficient of lead (atomic number 82) for gamma rays, plotted versus gamma energy, and the contributions by the three effects. Here, the photoelectric effect dominates at low energy. Above 5 MeV, pair production starts to dominate.

When a gamma ray passes through matter, the probability for absorption is proportional to the thickness of the layer, the density of the material, and the absorption cross section of the material. The total absorption shows an exponential decrease of intensity with distance from the incident surface:

I(x) = I₀ e^(−μx)

where x is the thickness of the material from the incident surface, μ = nσ is the absorption coefficient, measured in cm⁻¹, n the number of atoms per cm³ of the material (atomic density) and σ the absorption cross section in cm².
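A numerical sketch of the relation μ = nσ and the exponential attenuation law, using hypothetical values for n and σ chosen only for illustration:

```python
import math

# mu = n * sigma, then I(x) = I0 * exp(-mu * x).
n = 3.3e22       # atoms per cm^3 (hypothetical material)
sigma = 2.0e-23  # absorption cross section in cm^2 (hypothetical)
mu = n * sigma   # absorption coefficient, cm^-1

def intensity(I0, x_cm):
    return I0 * math.exp(-mu * x_cm)

print(f"mu = {mu:.2f} cm^-1")                  # mu = 0.66 cm^-1
print(f"I(1 cm) = {intensity(1.0, 1.0):.3f}")  # I(1 cm) = 0.517
```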

As it passes through matter, gamma radiation ionizes via three processes:

  • The photoelectric effect: This describes the case in which a gamma photon interacts with and transfers its energy to an atomic electron, causing the ejection of that electron from the atom. The kinetic energy of the resulting photoelectron is equal to the energy of the incident gamma photon minus the energy that originally bound the electron to the atom (binding energy). The photoelectric effect is the dominant energy transfer mechanism for X-ray and gamma ray photons with energies below 50 keV (thousand electronvolts), but it is much less important at higher energies.
  • Compton scattering: This is an interaction in which an incident gamma photon loses enough energy to an atomic electron to cause its ejection, with the remainder of the original photon's energy emitted as a new, lower energy gamma photon whose emission direction is different from that of the incident gamma photon, hence the term "scattering". The probability of Compton scattering decreases with increasing photon energy. It is thought to be the principal absorption mechanism for gamma rays in the intermediate energy range 100 keV to 10 MeV. It is relatively independent of the atomic number of the absorbing material, which is why very dense materials like lead are only modestly better shields, on a per weight basis, than are less dense materials.
  • Pair production: This becomes possible with gamma energies exceeding 1.02 MeV, and becomes important as an absorption mechanism at energies over 5 MeV (see illustration at right, for lead). By interaction with the electric field of a nucleus, the energy of the incident photon is converted into the mass of an electron-positron pair. Any gamma energy in excess of the equivalent rest mass of the two particles (totaling at least 1.02 MeV) appears as the kinetic energy of the pair and in the recoil of the emitting nucleus. At the end of the positron's range, it combines with a free electron, and the two annihilate, and the entire mass of these two is then converted into two gamma photons of at least 0.51 MeV energy each (or higher according to the kinetic energy of the annihilated particles).
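The approximate energy bands above can be summarized as a rough classifier. The boundaries below are illustrative only, since in reality the crossover points depend on the atomic number of the absorber (compare the aluminium and lead plots):

```python
# Rough dominant-interaction classifier for gamma rays in matter, using
# the approximate energy bands described above. Real crossover energies
# depend on the absorbing material's atomic number.
PAIR_THRESHOLD_MEV = 1.022  # 2 x electron rest energy, the hard minimum

def dominant_process(energy_mev):
    if energy_mev < 0.05:        # below ~50 keV
        return "photoelectric effect"
    if energy_mev < 5.0:         # intermediate energies
        return "Compton scattering"
    return "pair production"     # important above ~5 MeV

for e_mev in (0.03, 1.17, 10.0):
    print(e_mev, "->", dominant_process(e_mev))
```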

The secondary electrons (and/or positrons) produced in any of these three processes frequently have enough energy to produce much ionization themselves.

Additionally, gamma rays, particularly high energy ones, can interact with atomic nuclei resulting in ejection of particles in photodisintegration, or in some cases, even nuclear fission (photofission).

Light interaction

High-energy (from 80 GeV to ~10 TeV) gamma rays arriving from far-distant quasars are used to estimate the extragalactic background light in the universe: The highest-energy rays interact more readily with the background light photons and thus the density of the background light may be estimated by analyzing the incoming gamma ray spectra.

Gamma spectroscopy

Gamma spectroscopy is the study of the energetic transitions in atomic nuclei, which are generally associated with the absorption or emission of gamma rays. As in optical spectroscopy (see Franck–Condon effect) the absorption of gamma rays by a nucleus is especially likely (i.e., peaks in a "resonance") when the energy of the gamma ray is the same as that of an energy transition in the nucleus. In the case of gamma rays, such a resonance is seen in the technique of Mössbauer spectroscopy. In the Mössbauer effect the narrow resonance absorption for nuclear gamma absorption can be successfully attained by physically immobilizing atomic nuclei in a crystal. The immobilization of nuclei at both ends of a gamma resonance interaction is required so that no gamma energy is lost to the kinetic energy of recoiling nuclei at either the emitting or absorbing end of a gamma transition. Such loss of energy causes gamma ray resonance absorption to fail. However, when emitted gamma rays carry essentially all of the energy of the atomic nuclear de-excitation that produces them, this energy is also sufficient to excite the same energy state in a second immobilized nucleus of the same type.

Applications

Gamma-ray image of a truck with two stowaways taken with a VACIS (vehicle and container imaging system)

Gamma rays provide information about some of the most energetic phenomena in the universe; however, they are largely absorbed by the Earth's atmosphere. Instruments aboard high-altitude balloons and satellite missions, such as the Fermi Gamma-ray Space Telescope, provide our only view of the universe in gamma rays.

Gamma-induced molecular changes can also be used to alter the properties of semi-precious stones; irradiation is often used to change white topaz into blue topaz.

Non-contact industrial sensors commonly use sources of gamma radiation in refining, mining, chemicals, food, soaps and detergents, and pulp and paper industries, for the measurement of levels, density, and thicknesses. Gamma-ray sensors are also used for measuring the fluid levels in water and oil industries. Typically, these use Co-60 or Cs-137 isotopes as the radiation source.

In the US, gamma ray detectors are beginning to be used as part of the Container Security Initiative (CSI). These machines are advertised to be able to scan 30 containers per hour.

Gamma radiation is often used to kill living organisms, in a process called irradiation. Applications of this include the sterilization of medical equipment (as an alternative to autoclaves or chemical means), the removal of decay-causing bacteria from many foods and the prevention of the sprouting of fruit and vegetables to maintain freshness and flavor.

Despite their cancer-causing properties, gamma rays are also used to treat some types of cancer, since the rays also kill cancer cells. In the procedure called gamma-knife surgery, multiple concentrated beams of gamma rays are directed to the growth in order to kill the cancerous cells. The beams are aimed from different angles to concentrate the radiation on the growth while minimizing damage to surrounding tissues.

Gamma rays are also used for diagnostic purposes in nuclear medicine in imaging techniques. A number of different gamma-emitting radioisotopes are used. For example, in a PET scan a radiolabeled sugar called fluorodeoxyglucose emits positrons that are annihilated by electrons, producing pairs of gamma rays that highlight cancer as the cancer often has a higher metabolic rate than the surrounding tissues. The most common gamma emitter used in medical applications is the nuclear isomer technetium-99m which emits gamma rays in the same energy range as diagnostic X-rays. When this radionuclide tracer is administered to a patient, a gamma camera can be used to form an image of the radioisotope's distribution by detecting the gamma radiation emitted (see also SPECT). Depending on which molecule has been labeled with the tracer, such techniques can be employed to diagnose a wide range of conditions (for example, the spread of cancer to the bones via bone scan).

Health effects

Gamma rays cause damage at a cellular level and are penetrating, causing diffuse damage throughout the body. However, they are less ionising than alpha or beta particles, which are less penetrating.

Low levels of gamma rays cause a stochastic health risk, which for radiation dose assessment is defined as the probability of cancer induction and genetic damage. High doses produce deterministic effects, which is the severity of acute tissue damage that is certain to happen. These effects are compared to the physical quantity absorbed dose measured by the unit gray (Gy).

Body response

When gamma radiation breaks DNA molecules, a cell may be able to repair the damaged genetic material, within limits. However, a study by Rothkamm and Löbrich has shown that this repair process works well after high-dose exposure but is much slower in the case of a low-dose exposure.

Risk assessment

The natural outdoor exposure in the United Kingdom ranges from 0.1 to 0.5 µSv/h with significant increase around known nuclear and contaminated sites. Natural exposure to gamma rays is about 1 to 2 mSv per year, and the average total amount of radiation received in one year per inhabitant in the USA is 3.6 mSv. There is a small increase in the dose, due to naturally occurring gamma radiation, around small particles of high atomic number materials in the human body caused by the photoelectric effect.

By comparison, the radiation dose from chest radiography (about 0.06 mSv) is a fraction of the annual naturally occurring background radiation dose.[21] A chest CT delivers 5 to 8 mSv. A whole-body PET/CT scan can deliver 14 to 32 mSv depending on the protocol. The dose from fluoroscopy of the stomach is much higher, approximately 50 mSv (14 times the annual background).
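These figures can be compared against the ~3.6 mSv average annual dose quoted earlier. The chest CT and PET/CT values below use midpoints of the quoted ranges, an assumption made only for illustration:

```python
# Medical doses from the text, expressed as multiples of the ~3.6 mSv
# average annual background dose per inhabitant in the USA.
annual_background_msv = 3.6
doses_msv = {
    "chest radiograph": 0.06,
    "chest CT": 6.5,              # midpoint of the 5-8 mSv range
    "whole-body PET/CT": 23.0,    # midpoint of the 14-32 mSv range
    "stomach fluoroscopy": 50.0,  # text quotes ~14x annual background
}
for name, dose in doses_msv.items():
    print(f"{name}: {dose / annual_background_msv:.2f}x annual background")
```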

An acute full-body equivalent single exposure dose of 1 Sv (1000 mSv), or 1 Gy, will cause mild symptoms of acute radiation sickness, such as nausea and vomiting. A dose of 2.0–3.5 Sv (2.0–3.5 Gy) causes more severe symptoms (nausea, diarrhea, hair loss, hemorrhaging, and inability to fight infections) and will cause death in a sizable number of cases, about 10% to 35% without medical treatment. A dose of 5 Sv (5 Gy) is considered approximately the LD50 (lethal dose for 50% of the exposed population) for an acute exposure to radiation even with standard medical treatment. A dose higher than 5 Sv (5 Gy) brings an increasing chance of death above 50%. Above 7.5–10 Sv (7.5–10 Gy) to the entire body, even extraordinary treatment, such as bone-marrow transplants, will not prevent death (see radiation poisoning). (Doses much larger than this may, however, be delivered to selected parts of the body in the course of radiation therapy.)

For low-dose exposure, for example among nuclear workers, who receive an average yearly radiation dose of 19 mSv, the risk of dying from cancer (excluding leukemia) increases by 2 percent. For a dose of 100 mSv, the risk increase is 10 percent. By comparison, risk of dying from cancer was increased by 32 percent for the survivors of the atomic bombing of Hiroshima and Nagasaki.

Units of measurement and exposure

The following table shows radiation quantities in SI and non-SI units:

Ionizing radiation related quantities

Quantity             Unit                     Symbol  Derivation                Year  SI equivalent
Activity (A)         becquerel                Bq      s⁻¹                       1974  SI unit
                     curie                    Ci      3.7 × 10¹⁰ s⁻¹            1953  3.7 × 10¹⁰ Bq
                     rutherford               Rd      10⁶ s⁻¹                   1946  1,000,000 Bq
Exposure (X)         coulomb per kilogram     C/kg    C·kg⁻¹ of air             1974  SI unit
                     röntgen                  R       esu / 0.001293 g of air   1928  2.58 × 10⁻⁴ C/kg
Absorbed dose (D)    gray                     Gy      J·kg⁻¹                    1974  SI unit
                     erg per gram             erg/g   erg·g⁻¹                   1950  1.0 × 10⁻⁴ Gy
                     rad                      rad     100 erg·g⁻¹               1953  0.010 Gy
Equivalent dose (H)  sievert                  Sv      J·kg⁻¹ × WR               1977  SI unit
                     röntgen equivalent man   rem     100 erg·g⁻¹ × WR          1971  0.010 Sv
Effective dose (E)   sievert                  Sv      J·kg⁻¹ × WR × WT          1977  SI unit
                     röntgen equivalent man   rem     100 erg·g⁻¹ × WR × WT     1971  0.010 Sv

The measure of the ionizing effect of gamma and X-rays in dry air is called the exposure, for which a legacy unit, the röntgen was used from 1928. This has been replaced by kerma, now mainly used for instrument calibration purposes but not for received dose effect. The effect of gamma and other ionizing radiation on living tissue is more closely related to the amount of energy deposited in tissue rather than the ionisation of air, and replacement radiometric units and quantities for radiation protection have been defined and developed from 1953 onwards. These are:

  • The gray (Gy), is the SI unit of absorbed dose, which is the amount of radiation energy deposited in the irradiated material. For gamma radiation this is numerically equivalent to equivalent dose measured by the sievert, which indicates the stochastic biological effect of low levels of radiation on human tissue. The radiation weighting conversion factor from absorbed dose to equivalent dose is 1 for gamma, whereas alpha particles have a factor of 20, reflecting their greater ionising effect on tissue.
  • The rad is the deprecated CGS unit for absorbed dose and the rem is the deprecated CGS unit of equivalent dose, used mainly in the USA.
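The conversion from absorbed dose (Gy) to equivalent dose (Sv) described above is a multiplication by the radiation weighting factor. A minimal sketch using the factors stated in the text:

```python
# Equivalent dose H (Sv) = absorbed dose D (Gy) x radiation weighting
# factor W_R: 1 for gamma rays, 20 for alpha particles (per the text).
W_R = {"gamma": 1.0, "alpha": 20.0}

def equivalent_dose_sv(absorbed_dose_gy, radiation_type):
    return absorbed_dose_gy * W_R[radiation_type]

print(equivalent_dose_sv(0.010, "gamma"))  # 0.01 Sv
print(equivalent_dose_sv(0.010, "alpha"))  # 0.2 Sv
```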

Distinction from X-rays

In practice, gamma ray energies overlap with the range of X-rays, especially in the higher-frequency region referred to as "hard" X-rays. This depiction follows the older convention of distinguishing by wavelength.

The conventional distinction between X-rays and gamma rays has changed over time. Originally, the electromagnetic radiation emitted by X-ray tubes almost invariably had a longer wavelength than the radiation (gamma rays) emitted by radioactive nuclei. Older literature distinguished between X- and gamma radiation on the basis of wavelength, with radiation shorter than some arbitrary wavelength, such as 10⁻¹¹ m, defined as gamma rays. Since the energy of photons is proportional to their frequency and inversely proportional to wavelength, this past distinction between X-rays and gamma rays can also be thought of in terms of energy, with gamma rays considered to be higher-energy electromagnetic radiation than X-rays.
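The historical wavelength cutoff can be translated into photon energy via E = hc/λ; with λ = 10⁻¹¹ m this comes out near 124 keV. An illustrative back-of-envelope calculation:

```python
# Photon energy at the historical ~1e-11 m wavelength boundary,
# using E = h * c / wavelength.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

wavelength_m = 1e-11
energy_keV = h * c / wavelength_m / eV / 1e3
print(f"{energy_keV:.0f} keV")  # 124 keV
```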

However, since current artificial sources are now able to duplicate any electromagnetic radiation that originates in the nucleus, as well as far higher energies, the wavelengths characteristic of radioactive gamma ray sources vs. other types now completely overlap. Thus, gamma rays are now usually distinguished by their origin: X-rays are emitted by definition by electrons outside the nucleus, while gamma rays are emitted by the nucleus. Exceptions to this convention occur in astronomy, where gamma decay is seen in the afterglow of certain supernovas, but radiation from high energy processes known to involve other radiation sources than radioactive decay is still classed as gamma radiation.

The Moon as seen by the Compton Gamma Ray Observatory, in gamma rays of greater than 20 MeV. These are produced by cosmic ray bombardment of its surface. The Sun, which has no similar surface of high atomic number to act as a target for cosmic rays, cannot usually be seen at all at these energies, which are too high to emerge from primary nuclear reactions such as solar nuclear fusion (though the Sun occasionally produces gamma rays by cyclotron-type mechanisms during solar flares).

Gamma rays typically have higher energy than X-rays, but the ranges overlap. For example, modern high-energy X-rays produced by linear accelerators for megavoltage treatment in cancer often have higher energy (4 to 25 MeV) than most classical gamma rays produced by nuclear gamma decay. One of the most common gamma-ray-emitting isotopes used in diagnostic nuclear medicine, technetium-99m, produces gamma radiation of the same energy (140 keV) as that produced by diagnostic X-ray machines, but of significantly lower energy than therapeutic photons from linear particle accelerators. In the medical community today, the convention that only radiation produced by nuclear decay is referred to as "gamma" radiation is still respected.

Due to this broad overlap in energy ranges, in physics the two types of electromagnetic radiation are now often defined by their origin: X-rays are emitted by electrons (either in orbitals outside the nucleus, or while being accelerated to produce bremsstrahlung-type radiation), while gamma rays are emitted by the nucleus or by means of other particle decays or annihilation events. There is no lower limit to the energy of photons produced by nuclear reactions, and thus ultraviolet or lower-energy photons produced by these processes would also be defined as "gamma rays". The only naming convention still universally respected is the rule that electromagnetic radiation known to be of atomic nuclear origin is always referred to as "gamma rays", and never as X-rays. In physics and astronomy, however, the converse convention (that all gamma rays are considered to be of nuclear origin) is frequently violated.
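The origin-based convention can be sketched as a toy classifier. The energies used below (140 keV for technetium-99m, a 6 MeV megavoltage beam) follow figures quoted in this article; the `Photon` type and function names are hypothetical illustrations, not a standard API:

```python
# Origin-based naming: photons from nuclear decay are "gamma rays"
# regardless of energy; photons from electrons are "X-rays" regardless
# of energy. Energy alone cannot distinguish the two.
from dataclasses import dataclass

@dataclass
class Photon:
    energy_mev: float
    origin: str  # "nucleus" or "electron"

def classify(p: Photon) -> str:
    """Name the radiation by its origin, per the modern physics convention."""
    return "gamma ray" if p.origin == "nucleus" else "X-ray"

tc99m = Photon(0.140, "nucleus")   # Tc-99m decay photon, nuclear medicine
linac = Photon(6.0, "electron")    # megavoltage therapy beam from a linac

# Here the "X-ray" carries ~40x the energy of the "gamma ray",
# yet the names follow origin, not energy.
```

This mirrors the medical convention described above: the technetium-99m photon is a gamma ray even though a therapy linac's X-rays are far more energetic.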

In astronomy, higher-energy gamma and X-rays are defined by energy, since the processes that produce them may be uncertain and photon energy, not origin, determines the astronomical detectors required. High-energy photons occur in nature that are known to be produced by processes other than nuclear decay, yet are still referred to as gamma radiation. An example is the "gamma rays" from lightning discharges at 10 to 20 MeV, which are known to be produced by the bremsstrahlung mechanism.

Another example is gamma-ray bursts, now known to be produced from processes too powerful to involve simple collections of atoms undergoing radioactive decay. This is part and parcel of the general realization that many gamma rays produced in astronomical processes result not from radioactive decay or particle annihilation, but rather from non-radioactive processes similar to those that produce X-rays. Although the gamma rays of astronomy often come from non-radioactive events, a few gamma rays in astronomy are specifically known to originate from gamma decay of nuclei (as demonstrated by their spectra and emission half-life). A classic example is that of supernova SN 1987A, which emits an "afterglow" of gamma-ray photons from the decay of newly made radioactive nickel-56 and cobalt-56. Most gamma rays in astronomy, however, arise by other mechanisms.
