
Saturday, February 16, 2019

Radiation-induced cancer

From Wikipedia, the free encyclopedia

Up to 10% of invasive cancers are related to radiation exposure, including both ionizing radiation and non-ionizing radiation. Additionally, the vast majority of non-invasive cancers are non-melanoma skin cancers caused by non-ionizing ultraviolet radiation. Ultraviolet's position on the electromagnetic spectrum is on the boundary between ionizing and non-ionizing radiation. Non-ionizing radio frequency radiation from mobile phones, electric power transmission, and other similar sources has been described as a possible carcinogen by the World Health Organization's International Agency for Research on Cancer, but the link remains unproven.

Exposure to ionizing radiation is known to increase the future incidence of cancer, particularly leukemia. The mechanism by which this occurs is well understood, but quantitative models predicting the level of risk remain controversial. The most widely accepted model posits that the incidence of cancers due to ionizing radiation increases linearly with effective radiation dose at a rate of 5.5% per sievert. If the linear model is correct, then natural background radiation is the most hazardous source of radiation to general public health, followed by medical imaging as a close second.

Causes

According to the prevalent model, any radiation exposure can increase the risk of cancer. Typical contributors to such risk include natural background radiation, medical procedures, occupational exposures, nuclear accidents, and many others. Some major contributors are discussed below.

Radon

Radon is responsible for the majority of the mean public exposure to ionizing radiation worldwide. It is often the single largest contributor to an individual's background radiation dose, and is the most variable from location to location. Radon gas from natural sources can accumulate in buildings, especially in confined areas such as attics and basements. It can also be found in some spring waters and hot springs.

Epidemiological evidence shows a clear link between lung cancer and high concentrations of radon, with 21,000 radon-induced U.S. lung cancer deaths per year—second only to cigarette smoking—according to the United States Environmental Protection Agency. Thus in geographic areas where radon is present in heightened concentrations, radon is considered a significant indoor air contaminant. 

Residential exposure to radon gas carries cancer risks similar to those of passive smoking. Radiation is a more potent source of cancer when it is combined with other cancer-causing agents, such as radon gas exposure plus smoking tobacco.

Medical

In industrialized countries, medical imaging contributes almost as much radiation dose to the public as natural background radiation. The collective dose to Americans from medical imaging grew by a factor of six from 1990 to 2006, mostly due to the growing use of 3D scans that impart much more dose per procedure than traditional radiographs. CT scans alone, which account for half the medical imaging dose to the public, are estimated to be responsible for 0.4% of current cancers in the United States, and this may increase to as high as 1.5–2% with 2007 rates of CT usage; however, this estimate is disputed. Other nuclear medicine techniques involve the injection of radioactive pharmaceuticals directly into the bloodstream, and radiotherapy treatments deliberately deliver lethal doses (on a cellular level) to tumors and surrounding tissues.

It has been estimated that CT scans performed in the US in 2007 alone will result in 29,000 new cancer cases in future years. This estimate is criticized by the American College of Radiology (ACR), which maintains that the life expectancy of CT-scanned patients is not that of the general population and that the model used to calculate cancer incidence is based on total-body radiation exposure and is therefore flawed.

Occupational

In accordance with ICRP recommendations, most regulators permit nuclear energy workers to receive up to 20 times more radiation dose than is permitted for the general public. Higher doses are usually permitted when responding to an emergency. The majority of workers are routinely kept well within regulatory limits, while a few essential technicians will routinely approach their maximum each year. Accidental overexposures beyond regulatory limits happen globally several times a year. Astronauts on long missions are at higher risk of cancer; see cancer and spaceflight.

Some occupations are exposed to radiation without being classed as nuclear energy workers. Airline crews receive occupational exposures from cosmic radiation because of reduced atmospheric shielding at altitude. Mine workers receive occupational exposures to radon, especially in uranium mines. Anyone working in a granite building, such as the US Capitol, is likely to receive a dose from natural uranium in the granite.

Accidental

Chernobyl radiation map from 1996
 
Nuclear accidents can have dramatic consequences to their surroundings, but their global impact on cancer is less than that of natural and medical exposures.

The most severe nuclear accident is probably the Chernobyl disaster. In addition to conventional fatalities and acute radiation syndrome fatalities, nine children died of thyroid cancer, and it is estimated that there may be up to 4,000 excess cancer deaths among the approximately 600,000 most highly exposed people. Of the 100 million curies (4 exabecquerels) of radioactive material released by Chernobyl, the short-lived radioactive isotopes such as 131I were initially the most dangerous. Due to their short half-lives of 5 and 8 days, they have now decayed, leaving the longer-lived 137Cs (with a half-life of 30.07 years) and 90Sr (with a half-life of 28.78 years) as the main dangers.

In March 2011, an earthquake and tsunami caused damage that led to explosions and partial meltdowns at the Fukushima I Nuclear Power Plant in Japan. Significant release of radioactive material took place following hydrogen explosions at three reactors, as technicians tried to pump in seawater to keep the uranium fuel rods cool and bled radioactive gas from the reactors in order to make room for the seawater. Concerns about the large-scale release of radioactivity resulted in a 20 km exclusion zone being set up around the power plant and people within the 20–30 km zone being advised to stay indoors. On March 24, 2011, Japanese officials announced that "radioactive iodine-131 exceeding safety limits for infants had been detected at 18 water-purification plants in Tokyo and five other prefectures".

Other serious radiation accidents include the Kyshtym disaster (estimated 49 to 55 cancer deaths), and the Windscale fire (an estimated 33 cancer deaths).

Mechanism

Cancer is a stochastic effect of radiation, meaning that the probability of occurrence increases with effective radiation dose, but the severity of the cancer is independent of dose. The speed at which cancer advances, the prognosis, the degree of pain, and every other feature of the disease are not functions of the radiation dose to which the person is exposed. This contrasts with the deterministic effects of acute radiation syndrome, which increase in severity with dose above a threshold. Cancer starts with a single cell whose operation is disrupted. Normal cell operation is controlled by the chemical structure of DNA molecules, also called chromosomes.

When radiation deposits enough energy in organic tissue to cause ionization, this tends to break molecular bonds, and thus alter the molecular structure of the irradiated molecules. Less energetic radiation, such as visible light, only causes excitation, not ionization, which is usually dissipated as heat with relatively little chemical damage. Ultraviolet light is usually categorized as non-ionizing, but it is actually in an intermediate range that produces some ionization and chemical damage. Hence the carcinogenic mechanism of ultraviolet radiation is similar to that of ionizing radiation. 

Unlike chemical or physical triggers for cancer, penetrating radiation hits molecules within cells randomly. Molecules broken by radiation can become highly reactive free radicals that cause further chemical damage. Some of this direct and indirect damage will eventually impact chromosomes and epigenetic factors that control the expression of genes. Cellular mechanisms will repair some of this damage, but some repairs will be incorrect and some chromosome abnormalities will turn out to be irreversible. 

DNA double-strand breaks (DSBs) are generally accepted to be the most biologically significant lesion by which ionizing radiation causes cancer. In vitro experiments show that ionizing radiation causes DSBs at a rate of 35 DSBs per cell per gray, and removes a portion of the epigenetic markers of the DNA, which regulate gene expression. Most of the induced DSBs are repaired within 24 hours after exposure; however, 25% of the repaired strands are repaired incorrectly, and about 20% of fibroblast cells exposed to 200 mGy died within 4 days after exposure. A portion of the population possesses a flawed DNA repair mechanism, and thus suffers a greater insult from exposure to radiation.
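The figures above lend themselves to a quick back-of-the-envelope estimate. The following Python sketch uses only the 35 DSBs per cell per gray and the roughly 25% misrepair rate quoted in this section; the function and its structure are illustrative, not taken from any cited study.

# Back-of-the-envelope sketch of the DSB figures quoted above (35 DSBs per
# cell per gray, ~25% of repairs incorrect). Numbers come from the text;
# the function itself is only illustrative.

DSB_PER_CELL_PER_GY = 35      # induced double-strand breaks per cell per gray
MISREPAIR_FRACTION = 0.25     # fraction of repaired breaks repaired incorrectly

def expected_misrepaired_dsbs(dose_gy: float) -> float:
    """Expected number of incorrectly repaired DSBs per cell for a given dose."""
    induced = DSB_PER_CELL_PER_GY * dose_gy
    return induced * MISREPAIR_FRACTION

# Example: the 200 mGy exposure mentioned above
print(expected_misrepaired_dsbs(0.2))   # ~1.75 misrepaired DSBs per cell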

Major damage normally results in the cell dying or being unable to reproduce. This effect is responsible for acute radiation syndrome, but these heavily damaged cells cannot become cancerous. Lighter damage may leave a stable, partly functional cell that may be capable of proliferating and eventually developing into cancer, especially if tumor suppressor genes are damaged. The latest research suggests that mutagenic events do not occur immediately after irradiation. Instead, surviving cells appear to have acquired a genomic instability which causes an increased rate of mutations in future generations. The cell will then progress through multiple stages of neoplastic transformation that may culminate in a tumor after years of incubation. The neoplastic transformation can be divided into three major independent stages: morphological changes to the cell, acquisition of cellular immortality (losing normal, life-limiting cell regulatory processes), and adaptations that favor formation of a tumor.

In some cases, a small radiation dose reduces the impact of a subsequent, larger radiation dose. This has been termed an 'adaptive response' and is related to hypothetical mechanisms of hormesis.

A latent period of decades may elapse between radiation exposure and the detection of cancer. Those cancers that may develop as a result of radiation exposure are indistinguishable from those that occur naturally or as a result of exposure to other carcinogens. Furthermore, National Cancer Institute literature indicates that chemical and physical hazards and lifestyle factors, such as smoking, alcohol consumption, and diet, significantly contribute to many of these same diseases. Evidence from uranium miners suggests that smoking may have a multiplicative, rather than additive, interaction with radiation. Evaluations of radiation's contribution to cancer incidence can only be done through large epidemiological studies with thorough data about all other confounding risk factors.

Skin cancer

Prolonged exposure to ultraviolet radiation from the sun can lead to melanoma and other skin malignancies. Clear evidence establishes ultraviolet radiation, especially the non-ionizing medium wave UVB, as the cause of most non-melanoma skin cancers, which are the most common forms of cancer in the world.

Skin cancer may occur following ionizing radiation exposure after a latent period averaging 20 to 40 years. Chronic radiation keratosis is a precancerous keratotic skin lesion that may arise on the skin many years after exposure to ionizing radiation. Various malignancies may develop, most frequently basal-cell carcinoma followed by squamous-cell carcinoma. Elevated risk is confined to the site of radiation exposure. Several studies have also suggested the possibility of a causal relationship between melanoma and ionizing radiation exposure. The degree of carcinogenic risk arising from low levels of exposure is more contentious, but the available evidence points to an increased risk that is approximately proportional to the dose received. Radiologists and radiographers are among the earliest occupational groups exposed to radiation; it was the observation of the earliest radiologists that led to the recognition of radiation-induced skin cancer, the first solid cancer linked to radiation, in 1902. While the incidence of skin cancer secondary to medical ionizing radiation was higher in the past, there is also some evidence that risks of certain cancers, notably skin cancer, may be increased among more recent medical radiation workers, and this may be related to specific or changing radiologic practices. Available evidence indicates that the excess risk of skin cancer lasts for 45 years or more following irradiation.

Epidemiology

Cancer is a stochastic effect of radiation, meaning that it only has a probability of occurrence, as opposed to deterministic effects, which always happen over a certain dose threshold. The consensus of the nuclear industry, nuclear regulators, and governments is that the incidence of cancers due to ionizing radiation can be modeled as increasing linearly with effective radiation dose at a rate of 5.5% per sievert. Individual studies, alternate models, and earlier versions of the industry consensus have produced other risk estimates scattered around this consensus model. There is general agreement that the risk is much higher for infants and fetuses than adults, higher for the middle-aged than for seniors, and higher for women than for men, though there is no quantitative consensus about this. This model is widely accepted for external radiation, but its application to internal contamination is disputed. For example, the model fails to account for the low rates of cancer in early workers at Los Alamos National Laboratory who were exposed to plutonium dust, or for the high rates of thyroid cancer in children following the Chernobyl accident, both of which were internal exposure events. The European Committee on Radiation Risk calls the ICRP model "fatally flawed" when it comes to internal exposure.

Radiation can cause cancer in most parts of the body, in all animals, and at any age, although radiation-induced solid tumors usually take 10–15 years, and can take up to 40 years, to become clinically manifest, and radiation-induced leukemias typically require 2–9 years to appear. Some people, such as those with nevoid basal cell carcinoma syndrome or retinoblastoma, are more susceptible than average to developing cancer from radiation exposure. Children and adolescents are twice as likely to develop radiation-induced leukemia as adults; radiation exposure before birth has ten times the effect.

Radiation exposure can cause cancer in any living tissue, but high-dose whole-body external exposure is most closely associated with leukemia, reflecting the high radiosensitivity of bone marrow. Internal exposures tend to cause cancer in the organs where the radioactive material concentrates: radon predominantly causes lung cancer, iodine-131 predominantly causes thyroid cancer, and bone-seeking isotopes such as strontium-90 are most likely to cause leukemia.

Data sources

Increased Risk of Solid Cancer with Dose for A-bomb survivors
 
The associations between ionizing radiation exposure and the development of cancer are based primarily on the "LSS cohort" of Japanese atomic bomb survivors, the largest human population ever exposed to high levels of ionizing radiation. However, this cohort was also exposed to high heat, both from the initial nuclear flash of infrared light and from the firestorm and general fires that developed in both cities, so the survivors also experienced hyperthermia to various degrees. Hyperthermia, or heat exposure following irradiation, is well known in the field of radiation therapy to markedly increase the severity of free-radical insults to cells following irradiation. At present, however, no attempt has been made to account for this confounding factor; it is not included or corrected for in the dose-response curves for this group.

Additional data has been collected from recipients of selected medical procedures and the 1986 Chernobyl disaster. There is a clear link (see the UNSCEAR 2000 Report, Volume 2: Effects) between the Chernobyl accident and the unusually large number, approximately 1,800, of thyroid cancers reported in contaminated areas, mostly in children.

For low levels of radiation, the biological effects are so small they may not be detected in epidemiological studies. Although radiation may cause cancer at high doses and high dose rates, public health data regarding lower levels of exposure, below about 10 mSv (1,000 mrem), are harder to interpret. To assess the health impacts of lower radiation doses, researchers rely on models of the process by which radiation causes cancer; several models that predict differing levels of risk have emerged. 

Studies of occupational workers exposed to chronic low levels of radiation, above normal background, have provided mixed evidence regarding cancer and transgenerational effects. Cancer results, although uncertain, are consistent with estimates of risk based on atomic bomb survivors and suggest that these workers do face a small increase in the probability of developing leukemia and other cancers. One of the most recent and extensive studies of workers was published by Cardis et al. in 2005. There is evidence that low-level, brief radiation exposures are not harmful.

Modelling

Alternative assumptions for the extrapolation of the cancer risk vs. radiation dose to low-dose levels, given a known risk at a high dose: supra-linearity (A), linear (B), linear-quadratic (C) and hormesis (D).
 
The linear dose-response model suggests that any increase in dose, no matter how small, results in an incremental increase in risk. The linear no-threshold (LNT) hypothesis is accepted by the International Commission on Radiological Protection (ICRP) and regulators around the world. According to this model, about 1% of the global population develops cancer as a result of natural background radiation at some point in their lifetime. For comparison, 13% of deaths in 2008 were attributed to cancer, so background radiation could plausibly be a small contributor.

Many parties have criticized the ICRP's adoption of the linear no-threshold model for exaggerating the effects of low radiation doses. The most frequently cited alternatives are the “linear quadratic” model and the “hormesis” model. The linear quadratic model is widely viewed in radiotherapy as the best model of cellular survival, and it is the best fit to leukemia data from the LSS cohort.

Linear no-threshold: F(D) = α·D
Linear quadratic: F(D) = α·D + β·D²
Hormesis: F(D) = α·[D − β]

In all three cases, the values of alpha and beta must be determined by regression from human exposure data. Laboratory experiments on animals and tissue samples are of limited value. Most of the high-quality human data available is from individuals exposed to high doses, above 0.1 Sv, so any use of the models at low doses is an extrapolation that might be under-conservative or over-conservative. There is not enough human data available to settle decisively which of these models might be most accurate at low doses. The consensus has been to assume linear no-threshold because it is the simplest and most conservative of the three.
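As a rough illustration of how the three shapes diverge at low doses, here is a minimal Python sketch. The slope of 0.055 per sievert matches the 5.5%/Sv consensus figure quoted earlier; the quadratic and hormesis coefficients are purely hypothetical placeholders since, as noted above, real values would have to be fitted to human exposure data.

# Illustrative sketch of the three dose-response shapes listed above.
# alpha = 0.055 per sievert is the consensus LNT slope from the text;
# the beta values are placeholders, not fitted coefficients.

def linear_no_threshold(dose_sv, alpha=0.055):
    return alpha * dose_sv                       # risk rises linearly from zero

def linear_quadratic(dose_sv, alpha=0.055, beta=0.01):
    return alpha * dose_sv + beta * dose_sv**2   # extra curvature at high dose

def hormesis(dose_sv, alpha=0.055, beta=0.1):
    return alpha * (dose_sv - beta)              # negative (protective) below the beta threshold

for dose in (0.05, 0.1, 0.5, 1.0):
    print(dose, linear_no_threshold(dose), linear_quadratic(dose), hormesis(dose))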

Radiation hormesis is the conjecture that a low level of ionizing radiation (i.e., near the level of Earth's natural background radiation) helps "immunize" cells against DNA damage from other causes (such as free radicals or larger doses of ionizing radiation), and decreases the risk of cancer. The theory proposes that such low levels activate the body's DNA repair mechanisms, causing higher levels of cellular DNA-repair proteins to be present in the body, improving the body's ability to repair DNA damage. This assertion is very difficult to prove in humans (using, for example, statistical cancer studies) because the effects of very low ionizing radiation levels are too small to be statistically measured amid the "noise" of normal cancer rates.

The idea of radiation hormesis is considered unproven by regulatory bodies. If the hormesis model turns out to be accurate, it is conceivable that current regulations based on the LNT model will prevent or limit the hormetic effect, and thus have a negative impact on health.

Other non-linear effects have been observed, particularly for internal doses. For example, iodine-131 is notable in that high doses of the isotope are sometimes less dangerous than low doses, since they tend to kill thyroid tissues that would otherwise become cancerous as a result of the radiation. Most studies of very-high-dose I-131 for treatment of Graves disease have failed to find any increase in thyroid cancer, even though there is linear increase in thyroid cancer risk with I-131 absorption at moderate doses.

Public safety

Low-dose exposures, such as living near a nuclear power plant or a coal-fired power plant, which has higher emissions than nuclear plants, are generally believed to have no or very little effect on cancer development, barring accidents. Greater concerns include radon in buildings and overuse of medical imaging. 

The International Commission on Radiological Protection (ICRP) recommends limiting artificial irradiation of the public to an average of 1 mSv (0.001 Sv) of effective dose per year, not including medical and occupational exposures. For comparison, radiation levels inside the US Capitol building are 0.85 mSv/yr, close to the regulatory limit, because of the uranium content of the granite structure. According to the ICRP model, someone who spent 20 years inside the Capitol building would have an extra one in a thousand chance of getting cancer, over and above any other existing risk (20 yr × 0.85 mSv/yr × 0.001 Sv/mSv × 5.5%/Sv ≈ 0.1%). That "existing risk" is much higher; an average American would have a one in ten chance of getting cancer during this same 20-year period, even without any exposure to artificial radiation.
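The parenthetical arithmetic above can be reproduced in a few lines. This sketch simply applies the ICRP linear coefficient to the Capitol dose rate quoted in the text; it is illustrative only.

# Reproducing the worked example above: excess lifetime cancer risk under the
# ICRP linear model for 20 years spent in the US Capitol building.

RISK_PER_SV = 0.055            # 5.5% per sievert (ICRP linear model)
dose_rate_msv_per_year = 0.85  # dose rate inside the Capitol, from the text
years = 20

dose_sv = dose_rate_msv_per_year * years / 1000.0   # convert mSv to Sv
excess_risk = dose_sv * RISK_PER_SV
print(f"{excess_risk:.2%}")    # ~0.09%, i.e. roughly one in a thousand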

Internal contamination due to ingestion, inhalation, injection, or absorption is a particular concern because the radioactive material may stay in the body for an extended period of time, "committing" the subject to accumulating dose long after the initial exposure has ceased, albeit at low dose rates. Over a hundred people, including Eben Byers and the radium girls, have received committed doses in excess of 10 Gy and went on to die of cancer or natural causes, whereas the same amount of acute external dose would invariably cause an earlier death by acute radiation syndrome.

Internal exposure of the public is controlled by regulatory limits on the radioactive content of food and water. These limits are typically expressed in becquerel/kilogram, with different limits set for each contaminant.

History

Although radiation was discovered in the late 19th century, the dangers of radioactivity and of radiation were not immediately recognized. Acute effects of radiation were first observed in the use of X-rays when Wilhelm Röntgen intentionally subjected his fingers to X-rays in 1895. He published his observations concerning the burns that developed, though he attributed them to ozone rather than to X-rays. His injuries later healed.

The genetic effects of radiation, including the effects on cancer risk, were recognized much later. In 1927 Hermann Joseph Muller published research showing genetic effects, and in 1946 he was awarded the Nobel Prize for his findings. Radiation was soon linked to bone cancer in the radium dial painters, but this was not confirmed until large-scale animal studies after World War II. The risk was then quantified through long-term studies of atomic bomb survivors.

Before the biological effects of radiation were known, many physicians and corporations had begun marketing radioactive substances as patent medicine and radioactive quackery. Examples were radium enema treatments and radium-containing waters to be drunk as tonics. Marie Curie spoke out against this sort of treatment, warning that the effects of radiation on the human body were not well understood. Curie later died of aplastic anemia, not cancer. Eben Byers, a famous American socialite, died of multiple cancers in 1932 after consuming large quantities of radium over several years; his death drew public attention to the dangers of radiation. By the 1930s, after a number of cases of bone necrosis and death in enthusiasts, radium-containing medical products had nearly vanished from the market.

In the United States, the experience of the so-called Radium Girls, where thousands of radium-dial painters contracted oral cancers, popularized awareness of the occupational hazards associated with radiation. Robley D. Evans, at MIT, developed the first standard for permissible body burden of radium, a key step in the establishment of nuclear medicine as a field of study. With the development of nuclear reactors and nuclear weapons in the 1940s, heightened scientific attention was given to the study of all manner of radiation effects.

Radiophobia (updated)

From Wikipedia, the free encyclopedia

Radiation need not be feared, but it must command your respect.
Health physics poster exhorting respect for—rather than fear of—radiation. (ORNL, 1947)

Radiophobia is an obsessive fear of ionizing radiation, in particular, fear of X-rays. While in some cases radiation may be harmful (i.e. radiation-induced cancer, and acute radiation syndrome), the effects of poor information, understanding, or a traumatic experience may cause unnecessary or even irrational fear. The term is also used in a non-medical sense to describe the opposition to the use of nuclear technology (i.e. nuclear power) arising from concerns disproportionately greater than actual risks would merit.

Early use

The term was used in a paper entitled "Radio-phobia and radio-mania" presented by Dr Albert Soiland of Los Angeles in 1903. In the 1920s, the term was used to describe people who were afraid of radio broadcasting and receiving technology. In 1931, radiophobia was referred to in The Salt Lake Tribune as a "fear of loudspeakers", an affliction that Joan Crawford was reported as suffering. The term "radiophobia" was also printed in Australian newspapers in the 1930s and 1940s with a similar meaning. The 1949 poem by Margaret Mercia Baker entitled "Radiophobia" laments the intrusion of advertising into radio broadcasts. The term remained in use with its original association with radios and radio broadcasting during the 1940s and 1950s.

During the 1950s and 1960s, the Science Service associated the term with fear of gamma radiation and the medical use of x-rays. A Science Service article published in several American newspapers proposed that "radiophobia" could be attributed to the publication of information regarding the "genetic hazards" of exposure to ionizing radiation by the National Academy of Sciences in 1956.

In a newspaper column published in 1970, Dr Harold Pettit MD wrote:
A healthy respect for the hazards of radiation is desirable. When atomic testing began in the early fifties, these hazards were grossly exaggerated, producing a new psychological disorder which has been called "radiophobia" or "nuclear neurosis."

Castle Bravo and its influence on public perception

On March 1, 1954, the Castle Bravo test of a first-of-its-kind experimental thermonuclear "Shrimp" device overshot its predicted yield of 4–6 megatons and instead produced 15 megatons. This resulted in an unanticipated amount of "Bikini snow", visible particles of nuclear fallout, which caught the Japanese fishing boat Daigo Fukuryū Maru (Lucky Dragon) in its plume, even though the boat was fishing outside the fallout area predicted for a ~5 megaton shot and cordoned off for the test. Approximately two weeks after the test and fallout exposure, the 23-member fishing crew began to fall ill with acute radiation sickness, largely brought on by beta burns caused by direct contact between the fallout and their skin through their practice of scooping the "Bikini snow" into bags with their bare hands. One member of the crew, Kuboyama Aikichi, the boat's chief radioman, died seven months later, on September 23, 1954. It was later estimated that about a hundred fishing boats were contaminated to some degree by fallout from the test. Inhabitants of the Marshall Islands were also exposed to fallout, and a number of islands had to be evacuated.

This incident, owing to the era of secrecy around nuclear weapons, created widespread fear of uncontrolled and unpredictable nuclear weapons, and of radioactively contaminated fish affecting the Japanese food supply. With the publication of Joseph Rotblat's findings that the contamination caused by the fallout from the Castle Bravo test was nearly a thousand times greater than that stated officially, outcry in Japan reached such a level that the incident was dubbed by some as "a second Hiroshima". To prevent the subsequent strong anti-nuclear movement from turning into an anti-American movement, the Japanese and U.S. governments agreed on compensation of 2 million dollars for the contaminated fishery, with the surviving 22 crewmen receiving about ¥2 million each ($5,556 in 1954, $51,800 in 2019).

The surviving crew members, and their families, would later experience prejudice and discrimination, as local people thought that radiation was contagious.

Radiophobia in popular culture

The Castle Bravo test and the new fears of radioactive fallout inspired a new direction in art and cinema. The Godzilla films, beginning with Ishirō Honda's landmark 1954 film Gojira, are strong metaphors for post-war radiophobia. The opening scene of Gojira echoes the story of the Daigo Fukuryū Maru, from the initial distant flash of light to survivors being found with radiation burns. Although he found the special effects unconvincing, Roger Ebert stated that the film was "an important one" and "properly decoded, was the Fahrenheit 9/11 of its time."

A year after the Castle Bravo test, Akira Kurosawa examined one person's unreasoning terror of radiation and nuclear war in his 1955 film I Live in Fear. At the end of the film, the foundry worker who lives in fear has been declared incompetent by his family, but the possible partial validity of his fears has transferred over to his doctor. 

Nevil Shute's 1957 novel On the Beach depicts a future just six years later, based on the premise that a nuclear war has released so much radioactive fallout that all life in the Northern Hemisphere has been killed. The novel is set in Australia, which, along with the rest of the Southern Hemisphere, awaits a similar and inevitable fate. Helen Caldicott describes reading the novel in adolescence as 'a formative event' in her becoming part of the anti-nuclear movement.

Radiophobia and Chernobyl

In the former Soviet Union many patients with negligible radioactive exposure after the Chernobyl disaster displayed extreme anxiety about low level radiation exposure, and therefore developed many psychosomatic problems, with an increase in fatalistic alcoholism also being observed. As Japanese health and radiation specialist Shunichi Yamashita noted:
We know from Chernobyl that the psychological consequences are enormous. Life expectancy of the evacuees dropped from 65 to 58 years -- not [predominately] because of cancer, but because of depression, alcoholism and suicide. Relocation is not easy, the stress is very big. We must not only track those problems, but also treat them. Otherwise people will feel they are just guinea pigs in our research.
The term "radiation phobia syndrome" was introduced in 1987. by L. A. Ilyin and O. A. Pavlovsky in their report "Radiological consequences of the Chernobyl accident in the Soviet Union and measures taken to mitigate their impact".

The author of Chernobyl Poems Lyubov Sirota wrote in her poem "Radiophobia":
Is this only—a fear of radiation?
Perhaps rather—a fear of wars?
Perhaps—the dread of betrayal,

Cowardice, stupidity, lawlessness?
The term has been criticized by Adolph Kharash, Science Director at the Moscow State University because, he writes,
It treats the normal impulse to self-protection, natural to everything living, your moral suffering, your anguish and your concern about the fate of your children, relatives and friends, and your own physical suffering and sickness as a result of delirium, of pathological perversion.
However, the psychological phobia of radiation in sufferers may not correspond to an actual life-threatening exposure to the individual or their children. Radiophobia refers only to a display of anxiety disproportionate to the actual quantity of radiation one is exposed to, in many cases at radiation exposure values equal to, or not much higher than, those which individuals are naturally exposed to every day from background radiation. Anxiety in response to an actual life-threatening level of radiation exposure is not considered to be radiophobia or misplaced anxiety, but a normal, appropriate response.

Marvin Goldman is an American doctor who told newspapers that radiophobia had taken a larger toll than the fallout itself had.

Chernobyl abortions

Following the accident, journalists mistrusted many medical professionals (such as the spokesman from the UK National Radiological Protection Board), and in turn encouraged the public to mistrust them.

Throughout the European continent, in nations where abortion is legal, many requests for induced abortions of otherwise normal pregnancies were made out of fear of radiation from Chernobyl, including an excess number of abortions of healthy fetuses in Denmark in the months following the accident.
As the increase in radiation in Denmark was so low that almost no increased risk of birth defects was expected, the public debate and anxiety among pregnant women and their husbands "caused" more fetal deaths in Denmark than the accident itself. This underlines the importance of public debate, the role of the mass media, and the way in which national health authorities participate in that debate.
In Greece, following the accident, panic and false rumors led many obstetricians initially to think it prudent to interrupt otherwise wanted pregnancies, or left them unable to resist requests from worried pregnant mothers fearful of radiation. Within a few weeks, misconceptions within the medical profession were largely cleared up, although worries persisted in the general population. Although it was determined that the effective dose to Greeks would not exceed 1 mSv (0.1 rem), a dose much lower than one that could induce embryonic abnormalities or other non-stochastic effects, an observed excess of about 2,500 otherwise wanted pregnancies were terminated, probably out of fear in the mother of some perceived radiation risk.

A "slightly" above the expected number of requested induced abortions occurred in Italy, were upon request, "a week of reflection" and then a 2 to 3 week "health system" delay usually occur before the procedure.

Radiophobia and health effects

My former colleague, William Clark, has likened the public’s frenzy over small environmental insults to the fear of witches in the later Middle Ages. Some million certified “witches” were executed because they could not prove that they had not caused harm to someone or something. In the same way, since one cannot prove that tiny amounts of radiation did not cause a particular leukemia—for that matter one cannot prove that they caused it either—those who wish to succumb to low-level phobia succumb. As a result nuclear energy […is] under siege. Not until the low–level controversy is resolved can we expect nuclear energy to be fully accepted. Alvin M. Weinberg
Geraldine Thomas in Adelaide, South Australia (2016)
 
The term "radiophobia" is also sometimes used in the arguments against proponents of the conservative LNT concept (Linear no-threshold response model for ionizing radiation) of radiation security proposed by the U.S. National Council on Radiation Protection and Measurements (NCRP) in 1949. The "no-threshold" position effectively assumes, from data extrapolated from the atomic bombings on Hiroshima and Nagasaki, that even negligible doses of radiation increase ones risk of cancer linearly as the exposure increases from a value of 0 up to high dose rates. This is a controversial model as the LNT model therefore suggests that radiation exposure from naturally occurring background radiation, the radiation exposure from flying at high altitudes in airplanes, as well as lying next to loved ones for extended periods and the eating of bananas, which are also weakly naturally radioactive (both mostly due to Potassium-40, a naturally occurring radioactive material required for human life) all increase one's chance of cancer.

Moreover, the lack of strong evidence supporting the LNT model, a model created by extrapolation from atomic bomb exposure rather than from hard experimental evidence at low doses, has made the model controversial, as no irrefutable link between low-dose radiation and negative health effects has been found in either human or other mammal exposure experiments.

On the contrary, many very-low-dose radiation exposure experiments find positive (hormetic) health effects at low doses of radiation, so the conservative LNT model, when applied to low-dose exposure, remains controversial within the scientific community.

After the Fukushima disaster, the German news magazine Der Spiegel reported that Japanese residents were suffering from radiophobia. British medical scientist Geraldine Thomas has also attributed suffering of the Japanese to radiophobia in interviews and formal presentations. Four years after the event, The New York Times reported that "about 1,600 people died from the stress of the evacuation". The forced evacuation of 154,000 people "was not justified by the relatively moderate radiation levels", but it was ordered because "the government basically panicked".

Radiophobia and industrial and healthcare use

Radiation, most commonly in the form of X-rays, is used frequently in society to produce positive outcomes. The primary uses of radiation in healthcare are radiography, for radiographic examinations and procedures, and radiotherapy, in the treatment of cancerous conditions. Radiophobia can be a fear which patients experience before and after either of these procedures; it is therefore the responsibility of the healthcare professional at the time, often a radiographer or radiation therapist, to reassure patients about the stochastic and deterministic effects of radiation on human physiology. Advising patients and other irradiated persons of the various radiation protection measures that are enforced, including the use of lead-rubber aprons, dosimetry and automatic exposure control (AEC), is a common method of informing and reassuring radiophobia sufferers.

Similarly, in industrial radiography there is the possibility that people near industrial radiographic equipment will experience radiophobia.

Environmental radioactivity

From Wikipedia, the free encyclopedia

Environmental radioactivity is produced by radioactive materials in the human environment. While some radioisotopes, such as strontium-90 (90Sr) and technetium-99 (99Tc), are only found on Earth as a result of human activity, and some, like potassium-40 (40K), are only present due to natural processes, a few isotopes, e.g. tritium (3H), result from both natural processes and human activities. The concentration and location of some natural isotopes, particularly uranium-238 (238U), can be affected by human activity.

Background level in soils

Radioactivity is present everywhere, and has been since the formation of the Earth. According to the IAEA, soil typically contains the following four natural radioisotopes: 40K, 226Ra, 238U, and 232Th. In one kilogram of soil, the potassium-40 contributes an average of 370 Bq of activity, with a typical range of 100–700 Bq; the others each contribute some 25 Bq, with typical ranges of 10–50 Bq (7–50 Bq for 232Th). Some soils may vary greatly from these norms.

Sea and river silt

A recent report on the Sava River in Serbia suggests that many of the river silts contain about 100 Bq kg−1 of natural radioisotopes (226Ra, 232Th, and 238U). According to the United Nations, the normal concentration of uranium in soil ranges between 300 μg kg−1 and 11.7 mg kg−1. Some plants, called hyperaccumulators, are able to absorb and concentrate metals within their tissues; iodine, for example, was first isolated from seaweed in France, which suggests that seaweed is an iodine hyperaccumulator.

Synthetic radioisotopes can also be detected in silt. Busby quotes a report on the plutonium activity in Welsh intertidal sediments by Garland et al. (1989), which suggests that the closer a site is to Sellafield, the higher the concentration of plutonium in the silt. Some relationship between distance and activity can be seen in their data when fitted to an exponential curve, but the scatter of the points is large (R² = 0.3683).
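For readers curious what such a fit looks like in practice, here is a minimal sketch using NumPy and SciPy. The distance and activity values are invented for illustration only (Garland et al.'s measurements are not reproduced in this article); the point is the exponential model and the R² statistic quoted above.

# Sketch of the kind of exponential fit described above (activity vs. distance
# from Sellafield). The data points below are hypothetical, not real measurements.

import numpy as np
from scipy.optimize import curve_fit

def exp_decay(distance_km, a0, k):
    return a0 * np.exp(-k * distance_km)

distance_km = np.array([5, 10, 20, 40, 80, 120])        # hypothetical sites
activity = np.array([50.0, 30.0, 22.0, 8.0, 6.0, 1.5])  # hypothetical Bq/kg

(a0, k), _ = curve_fit(exp_decay, distance_km, activity, p0=(50, 0.02))

# Coefficient of determination, analogous to the R^2 = 0.3683 quoted above
residuals = activity - exp_decay(distance_km, a0, k)
r_squared = 1 - np.sum(residuals**2) / np.sum((activity - activity.mean())**2)
print(a0, k, r_squared)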

Man-made

Per capita thyroid doses in the continental United States resulting from all exposure routes from all atmospheric nuclear tests conducted at the Nevada Test Site from 1951-1962.
 
The additional radioactivity in the biosphere caused by human activity, covering both releases of man-made radioactivity and of naturally occurring radioactive materials (NORM), can be divided into several classes.
  1. Normal licensed releases which occur during the regular operation of a plant or process handling man-made radioactive materials.
    • For instance, the release of 99Tc from a hospital's nuclear medicine department, which occurs when a person given a Tc imaging agent excretes the agent.
  2. Releases of man-made radioactive materials which occur during an industrial or research accident.
  3. Releases which occur as a result of military activity.
    • For example, a nuclear weapons test.
  4. Releases which occur as a result of a crime.
    • For example, the Goiânia accident where thieves, unaware of its radioactive content, stole some medical equipment and as a result a number of people were exposed to radiation.
  5. Releases of naturally occurring radioactive materials (NORM) as a result of mining etc.
    • For example, the release of the trace quantities of uranium and thorium in coal, when it is burned in power stations.

Farming and the transfer to humans of deposited radioactivity

The fact that a radioisotope lands on the surface of the soil does not mean it will enter the human food chain. After release into the environment, radioactive materials can reach humans by a range of different routes, and the chemistry of the element usually dictates the most likely route.

Airborne radioactive material can have an effect on humans via a range of routes.

Cows

Jiří Hála claims in his textbook "Radioactivity, Ionizing Radiation and Nuclear Energy" that cattle only pass a minority of the strontium, cesium, plutonium and americium they ingest to the humans who consume milk and meat. Using milk as an example, if the cow has a daily intake of 1000 Bq of the preceding isotopes then the milk will have the following activities.
  • 90Sr, 2 Bq dm−3
  • 137Cs, 5 Bq dm−3
  • 239Pu, 0.001 Bq dm−3
  • 241Am, 0.001 Bq dm−3
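Expressed as transfer fractions, the figures above imply that only a tiny share of the daily intake reaches each litre of milk. A minimal sketch, using only the numbers quoted from Hála's textbook:

# Sketch using the figures above: for a daily intake of 1000 Bq, the fraction
# of each isotope appearing per litre (dm^3) of milk.

daily_intake_bq = 1000
milk_activity_bq_per_litre = {   # values quoted from Hála's textbook
    "Sr-90": 2,
    "Cs-137": 5,
    "Pu-239": 0.001,
    "Am-241": 0.001,
}

for isotope, activity in milk_activity_bq_per_litre.items():
    fraction = activity / daily_intake_bq
    print(f"{isotope}: {fraction:.6f} of the daily intake per litre of milk")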

Soil

Jiří Hála's textbook states that soils vary greatly in their ability to bind radioisotopes; clay particles and humic acids can alter the distribution of the isotopes between the soil water and the soil. The distribution coefficient Kd is the ratio of the soil's radioactivity (Bq g−1) to that of the soil water (Bq ml−1). If the radioactivity is tightly bound by the minerals in the soil, then less radioactivity can be absorbed by crops and grass growing in the soil.
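A minimal sketch of the Kd calculation as defined above; the sample activities are hypothetical, chosen only to show that a large Kd means the activity stays bound to the soil rather than the soil water.

# Minimal illustration of the distribution coefficient Kd described above:
# the ratio of activity bound to soil (Bq per g) to activity in soil water
# (Bq per ml). The sample numbers are hypothetical.

def distribution_coefficient(soil_bq_per_g: float, water_bq_per_ml: float) -> float:
    return soil_bq_per_g / water_bq_per_ml

# A clay-rich soil binding caesium tightly gives a large Kd,
# meaning little activity is available for uptake by plant roots.
print(distribution_coefficient(soil_bq_per_g=500.0, water_bq_per_ml=0.5))  # Kd = 1000 ml/g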

The Trinity test

Levels of radioactivity in the Trinity glass from two different samples as measured by gamma spectroscopy on lumps of the glass
 
One dramatic source of man-made radioactivity is a nuclear weapons test. The glassy trinitite formed by the first atomic bomb contains radioisotopes formed by neutron activation and nuclear fission, along with some natural radioisotopes. A recent paper reports the levels of long-lived radioisotopes in the trinitite. The trinitite was formed from feldspar and quartz which were melted by the heat. Two samples of trinitite were used: the first (left-hand-side bars in the graph) was taken from between 40 and 65 meters from ground zero, while the other sample was taken from further away from the ground zero point.

The 152Eu (half-life 13.54 years) and 154Eu (half-life 8.59 years) were mainly formed by the neutron activation of the europium in the soil; it is clear that the level of radioactivity for these isotopes is highest where the neutron dose to the soil was larger. Some of the 60Co (half-life 5.27 years) was generated by activation of the cobalt in the soil, but some was also generated by the activation of the cobalt in the steel of the 100-foot tower. This 60Co from the tower would have been scattered over the site, reducing the difference in the soil levels.

The 133Ba (half-life 10.5 years) and 241Am (half-life 432.6 years) are due to the neutron activation of barium and plutonium inside the bomb. The barium was present in the form of the nitrate in the chemical explosives used, while the plutonium was the fissile fuel.

The 137Cs level is higher in the sample that was further away from the ground zero point – this is thought to be because the precursors to the 137Cs (137I and 137Xe) and, to a lesser degree, the caesium itself are volatile. The natural radioisotopes in the glass are about the same in both locations. 

Fallout around the Trinity site. The radioactive cloud moved towards the northeast, with high röntgen levels within about 100 miles (160 km).

Activation products

The action of neutrons on stable isotopes can form radioisotopes; for instance, the neutron bombardment (neutron activation) of nitrogen-14 forms carbon-14. This radioisotope can be released from the nuclear fuel cycle and is responsible for the majority of the dose delivered to the population by the activities of the nuclear power industry.

Nuclear bomb tests have increased the specific activity of carbon, whereas the use of fossil fuels has decreased it. See the article on radiocarbon dating for further details.

Fission products

Discharges from nuclear plants within the nuclear fuel cycle introduce fission products to the environment. The releases from nuclear reprocessing plants tend to be medium- to long-lived radioisotopes, because the nuclear fuel is allowed to cool for several years before being dissolved in nitric acid. The releases from nuclear reactor accidents and bomb detonations contain a greater amount of the short-lived radioisotopes (when the amounts are expressed in activity, Bq).

Short lived

The external gamma dose for a person in the open near the Chernobyl site.
 
The contributions made by the different isotopes to the dose (in air) caused in the contaminated area in the time shortly after the accident. This image was drawn using data from the OECD report, the Korean table of the isotopes and the second edition of 'The radiochemical manual'.
 
An example of a short-lived fission product is iodine-131; it can also be formed as an activation product by the neutron activation of tellurium.

In both bomb fallout and a release from a power reactor accident, the short-lived isotopes cause the dose rate on day one to be much higher than that which will be experienced at the same site many days later. This holds true even if no attempts at decontamination are made. The accompanying graphs show the total gamma dose rate and the share of the dose due to each main isotope released by the Chernobyl accident.
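The shape of those curves follows from simple radioactive decay. The sketch below sums a few isotopes with very different half-lives; the initial dose-rate shares are invented for illustration (the real Chernobyl mixture contained many more nuclides), but the half-lives are the physical values, and the rapid early drop is the behaviour described above.

# Sketch of why the day-one dose rate is so much higher: summing the decay of
# a few isotopes with very different half-lives. Initial dose-rate shares are
# hypothetical; the half-lives are the physical values.

isotopes = {
    # name: (initial dose-rate contribution in arbitrary units, half-life in days)
    "I-131":  (10.0, 8.02),
    "Te-132": (8.0, 3.2),
    "Cs-137": (1.0, 30.07 * 365.25),
}

def total_dose_rate(t_days: float) -> float:
    return sum(a0 * 0.5 ** (t_days / t_half) for a0, t_half in isotopes.values())

for t in (0, 1, 10, 30, 365):
    print(f"day {t}: {total_dose_rate(t):.2f}")   # falls steeply in the first weeks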

Medium lived

An example of a medium-lived isotope is 137Cs, which has a half-life of 30 years. Caesium is released in bomb fallout and from the nuclear fuel cycle. A paper has been written on the radioactivity in oysters from the Irish Sea; these were found by gamma spectroscopy to contain 141Ce, 144Ce, 103Ru, 106Ru, 137Cs, 95Zr and 95Nb. In addition, a zinc activation product (65Zn) was found; this is thought to be due to the corrosion of magnox fuel cladding in cooling ponds. It is likely that the modern releases of all these isotopes from Windscale are smaller.

An important part of the Chernobyl release was the caesium-137; this isotope is responsible for much of the long-term (at least one year after the fire) external exposure which has occurred at the site. The caesium isotopes in the fallout have had an effect on farming.

A large amount of cesium was released during the Goiânia accident where a radioactive source (made for medical use) was stolen and then smashed open during an attempt to convert it into scrap metal. The accident could have been stopped at several stages; first, the last legal owners of the source failed to make arrangements for the source to be stored in a safe and secure place; and second, the scrap metal workers who took it did not recognize the markings which indicated that it was a radioactive object. 

Soudek et al. reported in 2006 details of the uptake of 90Sr and 137Cs into sunflowers grown under hydroponic conditions. The cesium was found in the leaf veins, in the stem and in the apical leaves. It was found that 12% of the cesium entered the plant, and 20% of the strontium. This paper also reports details of the effect of potassium, ammonium and calcium ions on the uptake of the radioisotopes. 

Caesium binds tightly to clay minerals such as illite and montmorillonite; hence it remains in the upper layers of soil, where it can be accessed by plants with shallow roots (such as grass). Hence grass and mushrooms can carry a considerable amount of 137Cs, which can be transferred to humans through the food chain. One of the best countermeasures in dairy farming against 137Cs is to mix up the soil by deep ploughing. This puts the 137Cs out of reach of the shallow roots of the grass, so the level of radioactivity in the grass will be lowered. Also, after a nuclear war or serious accident, the removal of the top few centimetres of soil and its burial in a shallow trench will reduce the long-term gamma dose to humans from 137Cs, as the gamma photons will be attenuated by their passage through the soil. The more remote the trench is from humans, and the deeper it is, the better the degree of protection afforded to the human population.

In livestock farming, an important countermeasure against 137Cs is to feed animals a little prussian blue. This iron potassium cyanide compound acts as an ion-exchanger. The cyanide is so tightly bonded to the iron that it is safe for a human to eat several grams of prussian blue per day. The prussian blue reduces the biological half-life (not to be confused with the nuclear half-life) of the caesium. The physical or nuclear half-life of 137Cs is about 30 years, which is a constant and cannot be changed; however, the biological half-life changes according to the nature and habits of the organism for which it is expressed. Caesium in humans normally has a biological half-life of between one and four months. An added advantage of the prussian blue is that the caesium stripped from the animal in the droppings is in a form which is not available to plants, so it prevents the caesium from being recycled. The form of prussian blue required for the treatment of humans or animals is a special grade. Attempts to use the pigment grade used in paints have not been successful.
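The distinction between physical and biological half-life can be made concrete with the standard combination rule 1/T_eff = 1/T_physical + 1/T_biological. The sketch below uses the 30-year physical half-life quoted above and an assumed biological half-life of 70 days (within the one-to-four-month range given for humans); it is illustrative only.

# Effective half-life from the combination rule 1/T_eff = 1/T_phys + 1/T_bio.
# Physical half-life from the text; the biological value is an assumed example.

def effective_half_life(t_physical_days: float, t_biological_days: float) -> float:
    return 1.0 / (1.0 / t_physical_days + 1.0 / t_biological_days)

t_physical = 30.07 * 365.25   # Cs-137 physical half-life, ~30 years, in days
t_biological = 70             # assumed: within the 1-4 month range quoted for humans

print(effective_half_life(t_physical, t_biological))  # ~69.6 days: biological removal dominates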

Long lived

Examples of long-lived isotopes include iodine-129 and technetium-99, which have nuclear half-lives of 15 million and 200,000 years, respectively.

Plutonium and the other actinides

In popular culture, plutonium is portrayed as the ultimate threat to life and limb, but this is wrong: while ingesting plutonium is not likely to be good for one's health, other radioisotopes such as radium are more toxic to humans. Regardless, the introduction of transuranium elements such as plutonium into the environment should be avoided wherever possible. Currently, the activities of the nuclear reprocessing industry have been subject to great debate, as one of the fears of those opposed to the industry is that large amounts of plutonium will be either mismanaged or released into the environment.

In the past, one of the largest releases of plutonium into the environment has been nuclear bomb testing.
  • Those tests in the air scattered some plutonium over the entire globe; this great dilution of the plutonium has resulted in the threat to each exposed person being very small as each person is only exposed to a very small amount.
  • The underground tests tend to form molten rock, which rapidly cools and seals the actinides into the rock, so rendering them unable to move; again the threat to humans is small unless the site of the test is dug up.
  • The safety trials, where bombs were subjected to simulated accidents, pose the greatest threat to people; some areas of land used for such experiments (conducted in the open air) have not been fully released for general use, despite in one case extensive decontamination.

Natural

Activation products from cosmic rays

Cosmogenic isotopes (or cosmogenic nuclides) are rare isotopes created when a high-energy cosmic ray interacts with the nucleus of an in situ atom. These isotopes are produced within earth materials such as rocks or soil, in Earth's atmosphere, and in extraterrestrial items such as meteorites. By measuring cosmogenic isotopes, scientists are able to gain insight into a range of geological and astronomical processes. There are both radioactive and stable cosmogenic isotopes. Some of these radioisotopes are tritium, carbon-14 and phosphorus-32.

Production modes

Here is a list of radioisotopes formed by the action of cosmic rays on the atmosphere; the list also contains the production mode of each isotope. These data were obtained from the SCOPE 50 report (see table 1.9 of chapter 1).

Isotopes formed by the action of cosmic rays on the air (isotope: mode of formation):
  • ³H (tritium): 14N(n, ³H)12C
  • 7Be: spallation (N and O)
  • 10Be: spallation (N and O)
  • 11C: spallation (N and O)
  • 14C: 14N(n, p)14C
  • 18F: 18O(p, n)18F and spallation (Ar)
  • 22Na: spallation (Ar)
  • 24Na: spallation (Ar)
  • 28Mg: spallation (Ar)
  • 31Si: spallation (Ar)
  • 32Si: spallation (Ar)
  • 32P: spallation (Ar)
  • 34mCl: spallation (Ar)
  • 35S: spallation (Ar)
  • 36Cl: 35Cl(n, γ)36Cl
  • 37Ar: 37Cl(p, n)37Ar
  • 38Cl: spallation (Ar)
  • 39Ar: 38Ar(n, γ)39Ar
  • 39Cl: 40Ar(n, np)39Cl and spallation (Ar)
  • 41Ar: 40Ar(n, γ)41Ar
  • 81Kr: 80Kr(n, γ)81Kr

Transfer to ground

The level of beryllium-7 in the air is related to the sun spot cycle, as radiation from the sun forms this radioisotope in the atmosphere. The rate at which it is transferred from the air to the ground is controlled in part by the weather. 

The rate of delivery of Be-7 from the air to the ground in Japan (source M. Yamamoto et al., Journal of Environmental Radioactivity, 2006, 86, 110-131)

Applications in geology listed by isotope

Commonly measured long-lived cosmogenic isotopes (element, mass number, half-life, typical application):
  • helium-3: stable; exposure dating of olivine-bearing rocks
  • beryllium-10: 1.36 million years; exposure dating of quartz-bearing rocks and sediment, dating of ice cores, measurement of erosion rates
  • carbon-14: 5,730 years; dating of organic matter and water
  • neon-21: stable; dating of very stable, long-exposed surfaces, including meteorites
  • aluminum-26: 720,000 years; exposure dating of rocks and sediment
  • chlorine-36: 308,000 years; exposure dating of rocks, groundwater tracer
  • calcium-41: 103,000 years; exposure dating of carbonate rocks
  • iodine-129: 15.7 million years; groundwater tracer

Applications of dating

Because cosmogenic isotopes have long half-lives (anywhere from thousands to millions of years), scientists find them useful for geologic dating. Cosmogenic isotopes are produced at or near the surface of the Earth, and thus are commonly applied to problems of measuring ages and rates of geomorphic and sedimentary events and processes.

Specific applications of cosmogenic isotopes include:

Methods of measurement for the long-lived isotopes

To measure cosmogenic isotopes produced within solid earth materials, such as rock, samples are generally first put through a process of mechanical separation. The sample is crushed and desirable material, such as a particular mineral (quartz in the case of Be-10), is separated from non-desirable material by using a density separation in a heavy liquid medium such as lithium sodium tungstate (LST). The sample is then dissolved, a common isotope carrier added (Be-9 carrier in the case of Be-10), and the aqueous solution is purified down to an oxide or other pure solid.

Finally, the ratio of the rare cosmogenic isotope to the common isotope is measured using accelerator mass spectrometry. The original concentration of cosmogenic isotope in the sample is then calculated using the measured isotopic ratio, the mass of the sample, and the mass of carrier added to the sample.
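As a rough illustration of that final step, the sketch below converts a measured 10Be/9Be ratio, the mass of Be-9 carrier added, and the quartz sample mass into a concentration of Be-10 atoms per gram. All numerical inputs are hypothetical examples, not values from any particular measurement.

# Sketch of the concentration calculation described above for Be-10 measured
# by AMS with a Be-9 carrier. Inputs are hypothetical examples.

AVOGADRO = 6.02214076e23
BE9_MOLAR_MASS = 9.012  # g/mol

def be10_concentration(ratio_10_9: float, carrier_mass_g: float, sample_mass_g: float) -> float:
    """Atoms of Be-10 per gram of quartz, from the measured 10Be/9Be ratio."""
    carrier_atoms_be9 = carrier_mass_g / BE9_MOLAR_MASS * AVOGADRO
    atoms_be10 = ratio_10_9 * carrier_atoms_be9
    return atoms_be10 / sample_mass_g

# Example: a 2e-13 ratio, 0.25 mg of Be-9 carrier, 20 g of purified quartz
print(be10_concentration(2e-13, 0.00025, 20.0))   # ~1.7e5 atoms per gram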

Radium and radon from the decay of long-lived actinides

Lead-210 deposition rate as a function of time as observed in Japan

Radium and radon are in the environment because they are decay products of uranium and thorium.

The radon (222Rn) released into the air decays to 210Pb and other radioisotopes, and the levels of 210Pb can be measured. The rate of deposition of this radioisotope is dependent on the weather. Below is a graph of the deposition rate observed in Japan.

Uranium-lead dating

Uranium-lead dating is usually performed on the mineral zircon (ZrSiO4), though other materials can be used. Zircon incorporates uranium atoms into its crystalline structure as substitutes for zirconium, but strongly rejects lead. It has a high blocking temperature, is resistant to mechanical weathering and is chemically inert. Zircon also forms multiple crystal layers during metamorphic events, which each may record an isotopic age of the event. These can be dated by a SHRIMP ion microprobe.

One of the advantages of this method is that any sample provides two clocks, one based on uranium-235's decay to lead-207 with a half-life of about 703 million years, and one based on uranium-238's decay to lead-206 with a half-life of about 4.5 billion years, providing a built-in crosscheck that allows accurate determination of the age of the sample even if some of the lead has been lost.
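A minimal sketch of the two-clock cross-check described above, using the standard decay-age relation t = ln(1 + D/P)/λ, where λ = ln 2 divided by the half-life. The daughter-to-parent ratios are hypothetical; they are chosen so that the two systems give concordant ages of roughly one billion years.

# Two independent ages from the same zircon, using t = ln(1 + D/P) / lambda.
# The measured ratios below are hypothetical; the half-lives are from the text.

import math

def age_years(daughter_parent_ratio: float, half_life_years: float) -> float:
    """Decay age from a measured radiogenic daughter/parent ratio."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_parent_ratio) / decay_constant

age_235 = age_years(daughter_parent_ratio=1.68, half_life_years=703e6)     # 207Pb/235U clock
age_238 = age_years(daughter_parent_ratio=0.168, half_life_years=4.468e9)  # 206Pb/238U clock

print(round(age_235 / 1e6), round(age_238 / 1e6))  # both ~1000 Myr: the two clocks agree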
