The Maunder Minimum shown in a 400-year history of sunspot numbers
The Maunder Minimum, also known as the "prolonged sunspot minimum", is the name used for the period around 1645 to 1715 during which sunspots became exceedingly rare, as was then noted by solar observers.
The term was introduced after John A. Eddy published a landmark 1976 paper in Science. Astronomers before Eddy had also named the period after the solar astronomers Annie Russell Maunder (1868–1947) and her husband, Edward Walter Maunder (1851–1928), who studied how sunspot latitudes changed with time. The period which the Maunders examined included the second half of the 17th century.
Two papers were published in Edward Maunder's name in 1890 and 1894, and he cited earlier papers written by Gustav Spörer.
Because Annie Maunder had not received a university degree, restrictions at the time meant that her contribution was not publicly recognized.
Spörer noted that, during a 28-year period (1672–1699) within the
Maunder Minimum, observations revealed fewer than 50 sunspots. This
contrasts with the typical 40,000–50,000 sunspots seen in modern times over a similar 25-year sampling period.
The Maunder Minimum occurred within a much longer period of lower-than-average European temperatures, which is likely to have been primarily caused by volcanic activity.
Sunspot observations
The Maunder Minimum occurred between 1645 and 1715, when very few sunspots were observed. That was not because of a lack of observations: during the 17th century, Giovanni Domenico Cassini carried out a systematic program of solar observations at the Observatoire de Paris, thanks to the astronomers Jean Picard and Philippe de La Hire, and Johannes Hevelius also performed observations on his own. The totals of sunspots recorded in the decennial years (omitting Wolf numbers), for example, were:
Year    Sunspots
1610    9
1620    6
1630    9
1640    0
1650    3
1660    Some sunspots (>20) reported by Jan Heweliusz in Machina Coelestis
During the Maunder Minimum enough sunspots were sighted so that 11-year cycles could be extrapolated from the count.
The maxima occurred in 1676–1677, 1684, 1695, 1705 and 1718.
Sunspot activity was then concentrated in the southern hemisphere
of the Sun, except for the last cycle when the sunspots appeared in the
northern hemisphere, too.
According to Spörer's law, at the start of a cycle, spots appear at ever lower latitudes until they average at about latitude 15° at solar maximum.
The average then continues to drift lower to about 7° and after
that, while spots of the old cycle fade, new cycle spots start appearing
again at high latitudes.
The visibility of these spots is also affected by the velocity of the Sun's surface rotation at various latitudes. Visibility is somewhat affected by observations being made from the ecliptic, which is inclined 7° from the plane of the Sun's equator (latitude 0°).
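Spörer's law as summarized above can be sketched numerically. The figures below use the 15° and 7° averages quoted in the text, plus an assumed starting latitude of about 30°, so this is an illustrative model rather than a fitted one:

```python
# Illustrative sketch of Sporer's law as described in the text.
# The 15 deg (solar maximum) and 7 deg (cycle end) averages come from the
# text; the ~30 deg starting latitude is an assumption for illustration.
def average_spot_latitude(phase: float) -> float:
    """Average sunspot latitude (degrees) at a given phase of the cycle.

    phase: 0.0 = cycle start, 0.5 = solar maximum, 1.0 = cycle end.
    """
    start_lat, max_lat, end_lat = 30.0, 15.0, 7.0
    if phase <= 0.5:
        # spots appear at ever lower latitudes until solar maximum
        return start_lat + (max_lat - start_lat) * (phase / 0.5)
    # the average then continues to drift toward the equator
    return max_lat + (end_lat - max_lat) * ((phase - 0.5) / 0.5)

print(average_spot_latitude(0.5))  # 15.0 at solar maximum
print(average_spot_latitude(1.0))  # 7.0 as the old cycle fades
```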
Little Ice Age
Comparison of group sunspot numbers (top), Central England Temperature (CET) observations (middle), and reconstructions and modeling of Northern Hemisphere Temperatures (NHT) (bottom). The CET in red are summer averages (for June, July and August) and in blue winter averages (for December of the previous year, January and February). NHT in grey is the distribution from a basket of paleoclimate reconstructions (darker grey showing higher probability values) and in red are from model simulations that account for solar and volcanic variations. By way of comparison, on the same scales the anomaly for modern data (after 31 December 1999) for summer CET is +0.65 °C, for winter CET is +1.34 °C, and for NHT is +1.08 °C. Central England Temperature data are as published by the UK Met Office; the NHT data are described in Box TS.5, Figure 1 of the IPCC AR5 report of Working Group 1.
The Maunder Minimum roughly coincided with the middle part of the Little Ice Age,
during which Europe and North America experienced colder than average
temperatures. Whether there is a causal relationship, however, is still
under evaluation. The current best hypothesis for the cause of the Little Ice Age is that it was the result of volcanic action. The onset of the Little Ice Age also occurred well before the beginning of the Maunder Minimum, and northern-hemisphere temperatures during the Maunder Minimum were not significantly different from the previous 80 years, suggesting a decline in solar activity was not the main causal driver of the Little Ice Age.
The correlation between low sunspot activity and cold winters in England has recently been analyzed using the longest existing surface temperature record, the Central England Temperature record. The authors emphasize that this is a regional and seasonal effect relating to European winters, and not a global effect. A potential explanation of
this has been offered by observations by NASA's Solar Radiation and Climate Experiment, which suggest that solar UV output is more variable over the course of the solar cycle than scientists had previously thought. In 2011, an article was published in the Nature Geoscience
journal that uses a climate model with stratospheric layers and the
SORCE data to tie low solar activity to jet stream behavior and mild
winters in some places (southern Europe and Canada/Greenland) and colder
winters in others (northern Europe and the United States). In Europe, examples of very cold winters are 1683–84, 1694–95, and the winter of 1708–09.
The term "Little Ice Age" applied to the Maunder Minimum is
something of a misnomer, as it implies a period of unremitting cold (and
on a global scale), which was not the case. For example, the coldest
winter in the Central England Temperature
record is 1683–1684, but summers during the Maunder Minimum were not
significantly different from those seen in subsequent years. The drop in global average temperatures seen in paleoclimate reconstructions at the start of the Little Ice Age occurred between about 1560 and 1600, whereas the Maunder Minimum began almost 50 years later.
Other observations
Solar activity events recorded in radiocarbon.
Graph showing proxies of solar activity, including changes in sunspot number and cosmogenic isotope production.
Past solar activity may be recorded by various proxies, including carbon-14 and beryllium-10.
These indicate lower solar activity during the Maunder Minimum. The
scale of changes resulting in the production of carbon-14 in one cycle
is small (about one percent of medium abundance) and can be taken into
account when radiocarbon dating is used to determine the age of archaeological artifacts. The interpretation of the beryllium-10 and carbon-14 cosmogenic isotope abundance records stored in terrestrial reservoirs such as ice sheets and tree rings has been greatly aided by reconstructions of solar and heliospheric magnetic fields based on historic data on geomagnetic storm activity, which bridge the time gap between the end of the usable cosmogenic isotope data and the start of modern spacecraft data.
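As a rough illustration of why the roughly one percent solar-cycle effect matters for dating: a conventional radiocarbon age follows from the ratio of measured to assumed-initial carbon-14 activity via the standard Libby mean life of 8033 years, and a 1% shift in the assumed initial abundance moves the inferred age by roughly 80 years. The sample ratio of 0.5 below is arbitrary:

```python
import math

# Hedged sketch: conventional C-14 age from a measured/initial activity
# ratio, using the standard Libby mean life for conventional ages.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(ratio: float) -> float:
    """Conventional radiocarbon age (years) for an activity ratio."""
    return -LIBBY_MEAN_LIFE * math.log(ratio)

# A 1% lower initial abundance makes the same sample appear younger by
# about 80 years (sample ratio 0.5 is arbitrary for illustration):
shift = radiocarbon_age(0.5) - radiocarbon_age(0.5 / 0.99)
print(round(shift))  # 81, i.e. roughly an 80-year age shift
```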
Other historical sunspot minima have been detected either
directly or by the analysis of the cosmogenic isotopes; these include
the Spörer Minimum (1450–1540) and, less markedly, the Dalton Minimum (1790–1820). In a 2012 study, sunspot minima were detected by analysis of carbon-14 in lake sediments.
In total, there seem to have been 18 periods of sunspot minima in the
last 8,000 years, and studies indicate that the Sun currently spends up
to a quarter of its time in these minima.
A paper based on an analysis of a Flamsteed drawing suggests that the Sun's surface rotation slowed in the deep Maunder Minimum (1684).
During the Maunder Minimum, aurorae continued to be observed seemingly normally, with a regular decadal-scale cycle.
This is somewhat surprising because the later, and less deep, Dalton
sunspot minimum is clearly seen in auroral occurrence frequency, at
least at lower geomagnetic latitudes.
Because geomagnetic latitude is an important factor in auroral
occurrence, (lower-latitude aurorae requiring higher levels of
solar-terrestrial activity) it becomes important to allow for population
migration and other factors that may have influenced the number of
reliable auroral observers at a given magnetic latitude for the earlier
dates. Decadal-scale cycles during the Maunder minimum can also be seen in the abundances of the beryllium-10 cosmogenic isotope (which unlike carbon-14 can be studied with annual resolution)
but these appear to be in antiphase with any remnant sunspot activity.
An explanation in terms of solar cycles in loss of solar magnetic flux
was proposed in 2012.
The fundamental papers on the Maunder Minimum have been published in Case studies on the Spörer, Maunder and Dalton Minima.
The year 1816 is known as the Year Without a Summer (also the Poverty Year and Eighteen Hundred and Froze To Death) because of severe climate abnormalities that caused average global temperatures to decrease by 0.4–0.7 °C (0.72–1.26 °F). This resulted in major food shortages across the Northern Hemisphere.
The Year Without a Summer was an agricultural disaster. Historian John D. Post has called this "the last great subsistence crisis in the Western world". The climatic aberrations of 1816 had greatest effect on most of New England, Atlantic Canada, and parts of western Europe.
North America
In
the spring and summer of 1816, a persistent "dry fog" was observed in
parts of the eastern United States. The fog reddened and dimmed the
sunlight, such that sunspots were visible to the naked eye. Neither wind nor rainfall dispersed the "fog". It has been characterized as a "stratospheric sulfate aerosol veil".
The weather was not in itself a hardship for those accustomed to
long winters. The real problem lay in the weather's effect on crops and
thus on the supply of food and firewood. At higher elevations, where
farming was problematic in good years, the cooler climate did not quite
support agriculture. In May 1816, frost killed off most crops in the higher elevations of Massachusetts, New Hampshire, and Vermont, as well as upstate New York. On June 6, snow fell in Albany, New York, and Dennysville, Maine. In Cape May, New Jersey, frost was reported five nights in a row in late June, causing extensive crop damage.
Many commented on the phenomenon. Sarah Snell Bryant, of Cummington, Massachusetts, wrote in her diary, "Weather backward."
At the Church Family of Shakers near New Lebanon, New York,
Nicholas Bennet wrote in May 1816, "all was froze" and the hills were
"barren like winter". Temperatures went below freezing almost every day
in May. The ground froze on June 9. On June 12, the Shakers had to
replant crops destroyed by the cold. On July 7, it was so cold that everything had stopped growing. The Berkshire Hills had frost again on August 23, as did much of the upper Northeast.
A Massachusetts historian summed up the disaster:
Severe frosts occurred every month; June 7th and 8th snow
fell, and it was so cold that crops were cut down, even freezing the
roots ... In the early Autumn when corn was in the milk it was so
thoroughly frozen that it never ripened and was scarcely worth
harvesting. Breadstuffs were scarce and prices high and the poorer class
of people were often in straits for want of food. It must be remembered
that the granaries of the great west had not then been opened to us by
railroad communication, and people were obliged to rely upon their own
resources or upon others in their immediate locality.
In July and August, lake and river ice was observed as far south as northwestern Pennsylvania. Frost was reported as far south as Virginia on August 20 and 21.
Rapid, dramatic temperature swings were common, with temperatures
sometimes reverting from normal or above-normal summer temperatures as
high as 95 °F (35 °C) to near-freezing within hours. Thomas Jefferson, retired from the presidency and farming at Monticello,
sustained crop failures that sent him further into debt. On September
13, a Virginia newspaper reported that corn crops would be one half to
two-thirds short and lamented that "the cold as well as the drought has
nipt the buds of hope". A Norfolk, Virginia newspaper reported:
It is now the middle of July, and we have not yet had
what could properly be called summer. Easterly winds have prevailed for
nearly three months past ... the sun during that time has generally been
obscured and the sky overcast with clouds; the air has been damp and
uncomfortable, and frequently so chilling as to render the fireside a
desirable retreat.
Regional farmers did succeed in bringing some crops to maturity, but corn and other grain prices rose dramatically. The price of oats, for example, rose from 12¢ per bushel
($3.40/m³) in 1815 (equal to $1.64 today) to 92¢ per bushel ($26/m³) in
1816 ($13.58 today). Crop failures were aggravated by an inadequate
transportation network: with few roads or navigable inland waterways and
no railroads, it was expensive to import food.
Europe
Low temperatures and heavy rains resulted in failed harvests in Britain and Ireland. Families in Wales traveled long distances begging for food. Famine was prevalent in north and southwest Ireland, following the failure of wheat, oat,
and potato harvests. In Germany, the crisis was severe; food prices
rose sharply. With the cause of the problems unknown, people
demonstrated in front of grain markets and bakeries, and later riots,
arson, and looting took place in many European cities. It was the worst famine of 19th-century mainland Europe.
The effects were widespread and lasted beyond the winter. In western Switzerland, the summers of 1816 and 1817 were so cold that an ice dam formed below a tongue of the Giétro Glacier high in the Val de Bagnes. Despite engineer Ignaz Venetz's efforts to drain the growing lake, the ice dam collapsed catastrophically in June 1818, killing 40 people.
Asia
In China, the cold weather killed trees, rice crops, and even water buffalo, especially in the north. Floods destroyed many remaining crops. The monsoon season was disrupted, resulting in overwhelming floods in the Yangtze Valley. In India, the delayed summer monsoon caused late torrential rains that aggravated the spread of cholera from a region near the Ganges in Bengal to as far as Moscow.
In Japan, where caution was still being exercised after the cold-weather-related Great Tenmei famine of 1782–1788, the cold damaged crops, but no crop failures were reported and there were no adverse effects on the population.
Sulfate concentration in ice cores from Greenland. An unknown eruption occurred before 1810. The peak after 1815 was caused by Mount Tambora.
The aberrations are now generally thought to have occurred because of the April 5–15, 1815, Mount Tambora volcanic eruption on the island of Sumbawa, Indonesia. The eruption had a Volcanic Explosivity Index (VEI) ranking of 7, a colossal event that ejected at least 100 km³ (24 cu mi) of material. It was the world's largest eruption since the Hatepe eruption in AD 180.
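The VEI figure quoted above follows from the scale's roughly logarithmic definition: each index step corresponds to a tenfold increase in bulk ejecta volume (VEI 4 at about 0.1 km³, VEI 7 at about 100 km³). A simplified sketch of that mapping (the full VEI definition also weighs plume height and other factors):

```python
import math

# Simplified VEI estimate from bulk ejecta volume alone: each VEI step is a
# tenfold increase in volume (VEI 4 ~ 0.1 km^3, ..., VEI 7 ~ 100 km^3).
# The real index also considers plume height and eruption style.
def vei_from_ejecta(volume_km3: float) -> int:
    """Approximate VEI rank from ejecta volume in cubic kilometres."""
    return min(8, max(0, math.floor(math.log10(volume_km3)) + 5))

print(vei_from_ejecta(100.0))  # 7 -- Tambora-scale, >= 100 km^3 ejected
```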
Several other large volcanic eruptions (with a VEI of at least 4) also occurred around this time.
These eruptions had built up a substantial amount of atmospheric
dust. As is common after a massive volcanic eruption, temperatures fell
worldwide because less sunlight passed through the stratosphere.
According to a 2012 analysis by Berkeley Earth Surface Temperature,
the 1815 Tambora eruption caused a temporary drop in the Earth's
average land temperature of about 1 °C. Smaller temperature drops were
recorded from the 1812–1814 eruptions.
The Earth had already been in a centuries-long period of global cooling that started in the 14th century. Known today as the Little Ice Age, it had caused considerable agricultural distress in Europe, and its cooling was exacerbated by the eruption of Tambora, which occurred near the period's end.
This period also occurred during the Dalton Minimum (a period of relatively low solar activity), specifically Solar Cycle 6,
which ran from December 1810 to May 1823. May 1816 in particular had
the lowest sunspot number (0.1) to date since record keeping on solar
activity began. The reduction in solar irradiance during this period was compounded by atmospheric opacity from volcanic dust.
Effects
As a result of the series of volcanic eruptions, crops in the
aforementioned areas had been poor for several years; the final blow
came in 1815 with the eruption of Tambora. Europe, still recuperating
from the Napoleonic Wars,
suffered from food shortages. Food riots broke out in the United
Kingdom and France, and grain warehouses were looted. The violence was
worst in landlocked Switzerland, where famine
caused the government to declare a national emergency. Huge storms and
abnormal rainfall with flooding of Europe's major rivers (including the Rhine) are attributed to the event, as is the August frost. A major typhus
epidemic occurred in Ireland between 1816 and 1819, precipitated by the
famine caused by the Year Without a Summer. An estimated 100,000 Irish
perished during this period. A BBC documentary, using figures compiled
in Switzerland, estimated that the fatality rates in 1816 were twice
that of average years, giving an approximate European fatality total of
200,000 deaths.
New England also experienced major consequences from the eruption
of Tambora. The corn crop in New England failed. Corn was reported to
have ripened so poorly that no more than a quarter of it was usable for
food. The crop failures in New England, Canada, and parts of Europe also
caused the price of wheat, grains, meat, vegetables, butter, milk, and
flour to rise sharply.
The eruption of Tambora caused Hungary
to experience brown snow. Italy's northern and north-central region
experienced something similar, with red snow falling throughout the
year. The cause of this is believed to have been volcanic ash in the
atmosphere.
In China, unusually low temperatures in summer and fall devastated rice production in Yunnan, resulting in widespread famine. Fort Shuangcheng, now in Heilongjiang, reported fields disrupted by frost and conscripts deserting as a result. Summer snowfall or otherwise mixed precipitation was reported in various locations in Jiangxi and Anhui, located at around 30°N. In Taiwan, which has a tropical climate, snow was reported in Hsinchu and Miaoli, and frost was reported in Changhua.
Cultural effects
High levels of tephra in the atmosphere led to unusually spectacular sunsets during this period, a feature celebrated in the paintings of J. M. W. Turner. This may have given rise to the yellow tinge predominant in his paintings such as Chichester Canal circa 1828. Similar phenomena were observed after the 1883 eruption of Krakatoa, and on the West Coast of the United States following the 1991 eruption of Mount Pinatubo in the Philippines.
The lack of oats to feed horses may have inspired the German inventor Karl Drais to research new ways of horseless transportation, which led to the invention of the draisine or velocipede. This was the ancestor of the modern bicycle and a step toward mechanized personal transport.
The crop failures of the "Year without a Summer" may have helped shape the settling of the "American Heartland",
as many thousands of people (particularly farm families who were wiped
out by the event) left New England for western New York and the Northwest Territory in search of a more hospitable climate, richer soil, and better growing conditions. Indiana became a state in December 1816 and Illinois two years later. British historian Lawrence Goldman has suggested that this migration into the Burned-over district of New York was responsible for the centering of the anti-slavery movement in that region.
According to historian L. D. Stillwell, Vermont alone experienced a decrease in population of between 10,000 and 15,000, erasing seven previous years of population growth. Among those who left Vermont were the family of Joseph Smith, who moved from Norwich, Vermont (though he was born in Sharon, Vermont) to Palmyra, New York. This move precipitated the series of events that culminated in the publication of the Book of Mormon and the founding of the Church of Jesus Christ of Latter-day Saints.
In June 1816, "incessant rainfall" during that "wet, ungenial summer" forced Mary Shelley, Percy Bysshe Shelley, Lord Byron and John William Polidori, and their friends to stay indoors at Villa Diodati overlooking Lake Geneva for much of their Swiss holiday. They decided to have a contest to see who could write the scariest story, leading Shelley to write Frankenstein, or The Modern Prometheus and Lord Byron to write "A Fragment", which Polidori later used as inspiration for The Vampyre – a precursor to Dracula. In addition, Lord Byron was inspired to write the poem "Darkness", by a single day when "the fowls all went to roost at noon and candles had to be lit as at midnight".
Justus von Liebig, a chemist who had experienced the famine as a child in Darmstadt, later studied plant nutrition and introduced mineral fertilizers.
The Heaven Lake eruption of Paektu Mountain between modern-day North Korea and the People's Republic of China, in 969 (± 20 years), is thought to have had a role in the downfall of Balhae.
An eruption of Kuwae, a Pacific volcano, has been implicated in events surrounding the Fall of Constantinople in 1453.
An eruption of Huaynaputina, in Peru, caused 1601 to be the coldest year in the Northern Hemisphere for six centuries (see Russian famine of 1601–1603); 1601 consisted of a bitterly cold winter, a cold frosty late (possibly nonexistent) spring, and a cool wet summer.
An eruption of Laki, in Iceland,
was responsible for up to hundreds of thousands of fatalities
throughout the Northern Hemisphere (over 25,000 in England alone), and
one of the coldest winters ever recorded in North America, 1783–84;
long-term consequences included poverty and famine that may have
contributed to the French Revolution in 1789.
The 1883 eruption of Krakatoa caused average Northern Hemisphere summer temperatures to fall by as much as 1.2 °C (2.2 °F).
The eruption of Mount Pinatubo in 1991 led to odd weather patterns and temporary cooling in the United States, particularly in the Midwest
and parts of the Northeast. An unusually mild winter was followed by an
unusually cool, wet summer and a cold, early autumn in 1992
(cooler-than-normal July, August, September, and October in 1992). More
rain than normal fell across the West Coast of the United States,
particularly California, during the 1991–92 and 1992–93 rainy seasons.
The American Midwest experienced more rain and major flooding during the
spring and summer of 1993.
In popular culture
American Murder Song, a musical project by Terrance Zdunich and Saar Hendelman, uses the "Year Without a Summer" as a backdrop for a collection of murder ballads.
A song about the event entitled "1816, the Year Without a Summer" is the opening track on Rasputina's 2007 album Oh Perilous World.
The 2013 novel Without a Summer by Mary Robinette Kowal is set during the volcanic winter event, though the eruption itself is mentioned only in passing.
During the interval 1818–1858, several curious decreases in the number of sunspot observing days per year are noted in the observing record of Samuel Heinrich Schwabe, the discoverer of the sunspot cycle, and in the reconstructed record of Rudolf Wolf, the founder of the now familiar relative sunspot number. These decreases appear to be nonrandom in nature and often extended for 1–3 yr (or more). Comparison of these decreases with equivalent annual mean temperature (both annual means and 4-yr moving averages), as recorded at Armagh Observatory (Northern Ireland), indicates that the temperature during the years of decreased number of observing days trended downward near the start of each decrease and upward (suggesting some sort of recovery) just before the end of each decrease. The drop in equivalent annual mean temperature associated with each decrease, as determined from the moving averages, measured about 0.1–0.7 °C. The decreases in number of observing days are found to be closely related to the occurrences of large, cataclysmic volcanic eruptions in the tropics or northern hemisphere. In particular, the interval of increasing number of observing days at the beginning of the record (i.e., 1818–1819) may be related to the improving atmospheric conditions in Europe following the 1815 eruption of Tambora (Indonesia; 8° S), which has previously been linked to "the year without a summer" (in 1816) and which is the strongest eruption in recent history, while the decreases associated with the years 1824, 1837, and 1847 may be linked, respectively, to the large, cataclysmic volcanic eruptions of Galunggung (Indonesia; 7° S) in 1822, Cosigüina (Nicaragua) in 1835, and, perhaps, Hekla (Iceland; 64° N) in 1845. Surprisingly, the number of observing days per year, as recorded specifically by Schwabe (from Dessau, Germany), is found to be linearly correlated against the yearly mean temperature at Armagh Observatory (r = 0.5 at the 2 percent level of significance); thus, years of fewer sunspot observing days in the historical record seem to indicate years of probable cooler clime, while years of many sunspot observing days seem to indicate years of probable warmer clime (and vice versa). Presuming this relationship to be real, one infers that the observed decrease in the number of observing days near 1830 (i.e., during "the lost record years" of 1825 to 1833) provides a strong indication that temperatures at Armagh (and, perhaps, most of Europe as well) were correspondingly cooler. If true, then the inferred cooling may have resulted from the eruption of Kliuchevskoi (Russia; 56° N) in 1829.
Publication:
Technical Report, NASA/TP-1998-208592; M-889; NAS 1.60:208592
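The linear correlation the report describes (between sunspot observing days per year and Armagh mean temperature) can be sketched as a Pearson coefficient over two annual series. The series below are made-up illustration values, not the report's data:

```python
import math

# Hedged sketch of the report's correlation test: Pearson's r between two
# annual series. The data here are hypothetical illustration values only.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

observing_days = [250, 180, 300, 220, 270]   # hypothetical days/year
mean_temp_c    = [9.1, 8.6, 9.4, 8.9, 9.2]   # hypothetical annual means, deg C
print(round(pearson_r(observing_days, mean_temp_c), 2))
```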
Smoke plumes from a few of the Kuwaiti Oil Fires on April 7, 1991.
The Kuwaiti oil fires were caused by Iraqi military forces setting fire to a reported 605 to 732 oil wells, along with an unspecified number of oil-filled low-lying areas such as oil lakes and fire trenches, as part of a scorched earth policy while retreating from Kuwait in 1991 in the face of advancing Coalition forces in the Persian Gulf War. The fires were started in January and February 1991, and the first well fires were extinguished in early April 1991, with the last well capped on November 6, 1991.
Motives
Oil
well fires, south of Kuwait City. (Photo taken from inside a UH-60
Blackhawk; the door frame is the black bar on the right of the photo)
Kuwaiti oil well fire, south of Kuwait City, March, 1991
Kuwait had been producing oil above treaty limits established by OPEC. By the eve of the Iraqi invasion, Kuwait had set production quotas to almost 1.9 million barrels per day (300,000 m³/d),
which coincided with a sharp drop in the price of oil. By the summer of
1990, Kuwaiti overproduction had become a serious point of contention
with Iraq.
Some analysts have speculated that one of Saddam Hussein's main motivations for invading Kuwait was to punish the ruling al-Sabah family for not stopping its policy of overproduction, and that this also lay behind the destruction of the wells.
It is also hypothesized that Iraq decided to destroy the oil fields to achieve a military advantage, believing that the intense smoke plumes created by the burning oil wells would serve as smoke screens that would inhibit Coalition offensive air strikes, foil Allied precision-guided weapons and spy satellites, and screen Iraq's military movements. Furthermore, it is thought
that Iraq’s military leaders may have regarded the heat, smoke, and
debris from hundreds of burning oil wells as presenting a formidable area denial
obstacle to Coalition forces. The onset of the oil well destruction
supports this military dimension to the sabotage of the wells; for
example, during the early stage of the Coalition air campaign, the
number of oil wells afire was relatively small but the number increased
dramatically in late February with the arrival of the ground war.
The Iraqi military combat engineers also released oil into low-lying areas for defensive purposes against infantry and mechanized units
along Kuwait’s southern border, by constructing several "fire trenches"
roughly 1 kilometer long, 3 meters wide, and 3 meters deep to impede
the advance of Coalition ground forces.
The military use of the land based fires should also be seen in context with the coinciding, deliberate, sea based Gulf War oil spill, the apparent strategic goal of which was to foil a potential amphibious landing by U.S. Marines.
Extent
The
Kuwaiti oil fires were not just limited to burning oil wells, one of
which is seen here in the background, but burning "oil lakes", seen in
the foreground, also contributed to the smoke plumes, particularly the sootiest/blackest of them (1991).
As an international coalition under United States command assembled
in anticipation of an invasion of Iraqi-occupied Kuwait, the Iraqi
regime decided to destroy as much of Kuwait's oil reserves and
infrastructure as possible before withdrawing from that country. As
early as December 1990, Iraqi forces placed explosive charges on Kuwaiti
oil wells. The wells were systematically sabotaged beginning on January
16, 1991, when the allies commenced air strikes against Iraqi targets.
On February 8, satellite images detected the first smoke from burning
oil wells. The number of oil fires peaked between February 22 and 24,
when the allied ground offensive began.
According to the U.S. Environmental Protection Agency's report to Congress, "the retreating Iraqi army set fire to or damaged over 700 oil wells, storage tanks, refineries, and facilities in Kuwait."
Estimates placed the number of oil well fires at between 605 and 732. A further thirty-four wells had been destroyed by heavy Coalition bombing in January.
The Kuwait Petroleum Company's estimate as of September 1991 was that
there had been 610 fires, out of a total of 749 facilities damaged or on
fire along with an unspecified number of oil filled low-lying areas,
such as "oil lakes" and "fire trenches". These fires constituted approximately 50% of the total number of oil well fires in the history of the petroleum industry, and temporarily damaged or destroyed approximately 85% of the wells in every major Kuwaiti oil field.
Concerted efforts to bring the fires and other damage under
control began in April 1991. During the uncontrolled burning phase from
February to April, various sources estimated that the ignited wellheads burnt through between four and six million barrels of crude oil, and between seventy and one hundred million cubic meters of natural gas per day. Seven months later, 441 facilities had been brought under control, while 308 remained uncontrolled.
The last well was capped on November 6, 1991. The total amount of oil burned is generally estimated at about one billion barrels, out of Kuwait's entire supply of roughly 104 billion barrels: almost one in every 100 barrels was destroyed. Daily global oil consumption in 2015 was about 91.4 million barrels; at that rate, the oil lost to combustion would have lasted about 11 days.
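The figures in this paragraph can be checked with a little arithmetic (values as quoted in the text, all approximate):

```python
# Rough check of the figures quoted above (approximate values from the text).
oil_burned_barrels = 1_000_000_000         # ~1 billion barrels burned in total
kuwait_supply_barrels = 104_000_000_000    # quoted supply of ~104 billion barrels
daily_global_use_2015 = 91_400_000         # ~91.4 million barrels/day in 2015

share_of_supply = oil_burned_barrels / kuwait_supply_barrels
days_of_global_use = oil_burned_barrels / daily_global_use_2015

print(f"{share_of_supply:.1%} of supply")   # 1.0% -- almost 1 in every 100 barrels
print(f"{days_of_global_use:.0f} days")     # 11 days at 2015 usage rates
```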
Military effects
USAF aircraft fly over burning Kuwaiti oil wells (1991)
The oil fires caused a dramatic decrease in air quality, causing respiratory problems for many soldiers on the ground without gas masks (1991).
United States Marines approach burning oilfields during ground war of the Gulf War (1991).
On March 21, 1991, a Royal Saudi Air Force C-130H crashed in heavy smoke from the Kuwaiti oil fires on approach to Ras Mishab Airport, Saudi Arabia. Ninety-two Senegalese soldiers and six Saudi crew members were killed, in the deadliest accident among Coalition forces.
Iraqi anti-armor forces also made effective use of the smoke screening in the Battle of Phase Line Bullet, where it helped them achieve the element of surprise against advancing Bradley IFVs and increased the general fog of war.
The fires burned out of control because of the dangers of sending in firefighting crews during the war. Land mines had been placed in areas around the oil wells and military demining was necessary before the fires could be put out. Around 5 million barrels (790,000 m3) of oil were lost each day. Eventually, privately contracted crews extinguished the fires, at a total cost of US$1.5 billion to Kuwait. By that time, however, the fires had burned for approximately ten months, causing widespread pollution.
The fires have been linked with what was later deemed Gulf War Syndrome, a chronic disorder afflicting military veterans and civilian workers whose symptoms include fatigue, muscle pain, and cognitive problems. However, studies have indicated that the firefighters who capped the wells did not report the symptoms that the soldiers experienced. The causes of Gulf War Syndrome have yet to be determined.
From the perspective of ground forces, apart from the occasional "oil rain" experienced by troops very close to spewing wells, the most commonly experienced effect of the oil field fires was the ensuing smoke plumes, which rose into the atmosphere and then precipitated out of the air via dry deposition and rain. The pillar-like plumes frequently broadened and joined with other plumes at higher altitudes, producing a cloudy grey overcast effect: only about 10% of the fires, those that originated from "oil lakes", produced pure black soot-filled plumes; 25% emitted white to grey plumes; and the remainder emitted plumes with colors between grey and black. For example, one Gulf War veteran stated:
It was like a cloudy day all day
long, in fact, we didn’t realize it was smoke at first. The smoke was
about 500 feet above us, so we couldn’t see the sky. However, we could
see horizontally for long distances with no problem. We knew it was
smoke when the mucus from our nostrils started to look black.
A paper published in 2000 analyzed the degree of troop exposure to particulate matter, which included soot, though it focused more on silica sand, which can produce silicosis. The paper drew on troop medical records and concluded: "A literature review indicated negligible to nonexistent health risk from other inhaled particulate material (other than silica) during the Gulf War".
Extinguishing efforts
The burning wells needed to be extinguished because, without active efforts, Kuwait would lose billions of dollars in oil revenue. It was predicted that, left alone, the fires would burn for two to five years before losing pressure and going out on their own: optimists estimated two years, pessimists five, and the majority about three.
The companies responsible for extinguishing the fires initially were Bechtel, Red Adair Company (now sold off to Global Industries of Louisiana), Boots and Coots, and Wild Well Control. Safety Boss
was the fourth company to arrive but ended up extinguishing and capping more wells than any other company: 180 of the 600. Other companies
including Cudd Well/Pressure Control, Neal Adams Firefighters, and
Kuwait Wild Well Killers were also contracted.
According to Larry H. Flak, a petroleum engineer for Boots and
Coots International Well Control, 90% of all the 1991 fires in Kuwait
were put out with nothing but sea water, sprayed from powerful hoses at
the base of the fire. The extinguishing water was supplied to the arid desert region by repurposing the oil pipelines that, prior to the arson attack, had pumped oil from the wells to the Persian Gulf. The pipelines had been mildly damaged, but once repaired their flow was reversed to pump Persian Gulf seawater to the burning oil wells.
The extinguishing rate was approximately one well every 7–10 days at the start of efforts, but with experience gained and the removal of the minefields surrounding the burning wells, the rate increased to two or more per day.
For stubborn oil well fires, the use of a gas turbine
to blast a large volume of water at high velocity at the fire proved
popular with firefighters in Kuwait and was brought to the region by
Hungarians using MiG-21 engines mounted originally on a T-34 (later replaced with a T-55) tank, nicknamed "Big Wind". It extinguished nine fires in 43 days.
In fighting a fire at a vertically spewing wellhead, high explosives such as dynamite were used to create a blast wave that pushed the burning fuel and local atmospheric oxygen away from the well, on the same principle as blowing out a candle. With the flame removed, the fuel could continue to spill out without igniting.
Generally, the explosives were placed within 55-gallon drums, surrounded by fire-retardant chemicals; the drums were then wrapped with insulating material, and a horizontal crane was used to bring each drum as close to the burning area as possible.
The firefighting teams dubbed their work "Operation Desert Hell", after Operation Desert Storm.
Fire documentaries
The fires were the subject of a 1992 IMAX documentary film, Fires of Kuwait, which was nominated for an Academy Award. The film includes footage of the Hungarian team using their jet turbine extinguisher.
Bechtel Corporation produced a short documentary titled Kuwait: Bringing Back the Sun that summarizes and focuses on the firefighting efforts, which were dubbed the Al-Awda (Arabic for "The Return") project.
Environmental impact
Oil fire smoke
Immediately following Iraq’s invasion of Kuwait, predictions were
made of an environmental disaster stemming from Iraqi threats to blow up
captured Kuwaiti oil wells. Speculation ranging from a nuclear winter-type scenario to heavy acid rain and even short-term global warming was presented at the World Climate Conference in Geneva that November.
On January 10, 1991, a paper in the journal Nature presented Paul Crutzen's calculations that setting the Kuwaiti oil wells alight would produce a "nuclear winter", with a cloud of smoke covering half of the Northern Hemisphere after 100 days and temperatures beneath the cloud reduced by 5–10 °C. This was followed by articles in the Wilmington Morning Star and the Baltimore Sun in mid-to-late January 1991, in which Carl Sagan, the popular TV scientist of the time and a co-author of the first nuclear winter papers along with Richard P. Turco, John W. Birks, Alan Robock and Paul Crutzen, stated that they expected catastrophic nuclear winter-like effects, with continental-scale "sub-freezing" temperatures, if the Iraqis went through with their threats of igniting 300 to 500 pressurized oil wells and the wells burned for a few months.
Later, when Operation Desert Storm had begun, S. Fred Singer and Carl Sagan discussed the possible environmental impacts of the Kuwaiti petroleum fires on the ABC News program Nightline. Sagan again argued that some of the effects of the smoke could be similar to those of a nuclear winter, with smoke lofting into the stratosphere, a region of the atmosphere beginning around 43,000 feet (13,000 m) above sea level at Kuwait, resulting in global effects; he believed the net effects would be very similar to those of the 1815 explosion of the Indonesian volcano Tambora, which resulted in 1816 becoming known as the Year Without a Summer.
He reported on initial modeling estimates that forecast impacts
extending to south Asia, and perhaps to the northern hemisphere as well.
Singer, on the other hand, said that calculations showed the smoke would rise to an altitude of about 3,000 feet (910 m) and then be rained out after about three to five days, so its lifetime would be limited. Both height estimates turned out to be wrong, though Singer's narrative was closer to what transpired: the comparatively minimal atmospheric effects remained limited to the Persian Gulf region, with smoke plumes generally lofting to about 10,000 feet (3,000 m) and occasionally as high as 20,000 feet (6,100 m).
Along with Singer's televised critique, Richard D. Small criticized the initial Nature paper in a reply on March 7, 1991, arguing along similar lines.
Sagan later conceded in his book The Demon-Haunted World that his prediction did not turn out to be correct: "it was
pitch black at noon and temperatures dropped 4–6 °C over the Persian
Gulf, but not much smoke reached stratospheric altitudes and Asia was
spared."
At the peak of the fires, the smoke absorbed 75 to 80% of the
sun’s radiation. The particles rose to a maximum of 20,000 feet
(6,100 m) but, acting as cloud condensation nuclei, were scavenged from the atmosphere relatively quickly.
Sagan and his colleagues had expected that "self-lofting" of the sooty smoke would occur as it absorbed the sun's heat radiation, with little to no scavenging: the black soot particles would be heated by the sun and lofted higher and higher into the air, injecting the soot into the stratosphere, where it would take years for the sun-blocking aerosol to fall out of the air, bringing catastrophic ground-level cooling and agricultural impacts to Asia and possibly the Northern Hemisphere as a whole.
In retrospect, it is now known that smoke from the Kuwait oil fires affected the weather pattern only in the Persian Gulf and surrounding region during the periods the fires were burning in 1991. Lower-atmospheric winds blew the smoke along the eastern half of the Arabian Peninsula, and cities such as Dhahran and Riyadh, and countries such as Bahrain, experienced days with smoke-filled skies and carbon soot rainout/fallout. Thus the immediate consequence of the arson sabotage was a dramatic regional decrease in air quality, causing respiratory problems for many Kuwaitis and people in neighboring countries.
According to a 1992 study by Peter Hobbs and Lawrence Radke, daily emissions of sulfur dioxide (which can generate acid rain) from the Kuwaiti oil fires were 57% of those from electric utilities in the United States, emissions of carbon dioxide were 2% of global emissions, and emissions of soot reached 3,400 metric tons per day.
A paper published in 2000 in the DTIC archive states: "Calculations based on smoke from Kuwaiti oil fires in May and June 1991 indicate that combustion efficiency was about 96% in producing carbon dioxide", while, with respect to the incomplete-combustion fraction, "smoke particulate matter accounted for 2% of the fuel burned, of which 0.4% was soot", with the remaining 2% being oil that did not undergo any initial combustion.
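The percentages quoted above can be laid out as a simple mass balance; the 5-million-barrel daily burn rate used below is an assumption taken from the mid-range of the estimates earlier in the article:

```python
# Mass balance for fuel leaving a burning well, using the fractions
# quoted above (shares of the total fuel mass).
fractions = {
    "burned to carbon dioxide": 0.96,
    "smoke particulate matter (incl. 0.4% soot)": 0.02,
    "unburned spilled oil": 0.02,
}

# The quoted figures account for all of the fuel:
assert abs(sum(fractions.values()) - 1.0) < 1e-9

# Applied to an assumed burn rate of 5 million barrels/day:
barrels_per_day = 5.0e6
for label, frac in fractions.items():
    print(f"{label}: {frac * barrels_per_day:,.0f} barrels/day")
```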
Smoke documentary
Peter V. Hobbs also narrated a short documentary titled Kuwait Oil Fires that followed the University of Washington's Cloud and Aerosol Research Group as they flew through, around and above the smoke clouds, taking samples, measurements, and video from their Convair C-131 (N327UW) aerial laboratory.
Oil spills
A 2008 picture of the mummified remains of a bird, encrusted within the top hard layer of a dry oil lake in the Kuwaiti desert.
Although scenarios that predicted long-lasting environmental impacts
on a global atmospheric level due to the burning oil sources did not
transpire, long-lasting ground level oil spill impacts were detrimental to the environment regionally.
Forty-six oil wells are estimated to have gushed; before efforts to cap them began, they were releasing approximately 300,000–400,000 barrels of oil per day. The last gusher was capped in the latter days of October 1991.
The Kuwaiti Oil Minister estimated that between twenty-five and fifty million barrels of unburned oil from damaged facilities pooled to create approximately 300 oil lakes, which contaminated around 40 million tons of sand and earth. The mixture of desert sand, spilled unignited oil, and soot generated by the burning wells formed layers of hard "tarcrete", which covered nearly five percent of Kuwait's land mass.
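For a sense of scale, the "nearly five percent" figure can be turned into an area; note that Kuwait's land area of roughly 17,818 km² is an assumption supplied here, not a figure from the text:

```python
# Hypothetical back-of-the-envelope: area covered by "tarcrete".
kuwait_area_km2 = 17_818     # assumed land area of Kuwait (km^2), not from the text
tarcrete_fraction = 0.05     # "nearly five percent" per the text

tarcrete_km2 = kuwait_area_km2 * tarcrete_fraction
print(f"~{tarcrete_km2:.0f} km^2 covered by tarcrete")  # roughly 890 km^2
```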
Vegetation in most of the contaminated areas adjoining the oil
lakes began recovering by 1995, but the dry climate has also partially
solidified some of the lakes. Over time the oil has continued to sink
into the sand, with potential consequences for Kuwait's small
groundwater resources.
The land-based Kuwaiti oil spill surpassed the Lakeview Gusher, which spilled nine million barrels in 1910, as the largest oil spill in recorded history.
Six to eight million barrels of oil were directly spilled into the Persian Gulf, which became known as the Gulf War oil spill.
Comparable incidents
During the US-led invasion of Iraq in 2003, approximately 40 oil wells were set on fire within Iraqi territory in the Persian Gulf region, ostensibly once again to hinder the invasion.
The Kuwait Wild Well Killers, who successfully extinguished 41 of the Kuwait oil well fires in 1991, used their experience to tackle blazes in the Iraqi Rumaila oilfields in 2003.
Firefighters fight to secure a burning oil well in the Iraqi Rumaila oilfields in 2003.
Landsat 7 CGI image of Baghdad, April 2, 2003. Fires set in an attempt to hinder attacking air forces.
In popular culture
The oil fires are shown in a flyover and in some ground shots in the 1992 nonverbal film Baraka, shot on 70mm Todd-AO film.
The 2004 film The Manchurian Candidate included a scene set in Kuwait in February 1991, with burning oil fields visible in the background.
In the 2005 film Jarhead, the oil fires burn continuously in the background of the 1991 Gulf War, and their effects, an unceasing rain of unburned oil and smoke-filled skies, feature prominently in the story.
In the 1999 film Three Kings, oil fires are featured in multiple scenes.
In the 1990s TV series The X-Files, the "Black Oil" is an alien disease-causing agent, evoking the conspiracy theory that Gulf War syndrome was caused by the Kuwaiti oil. The 2001 episode "Vienen" includes an oil-rig fire that could potentially disperse the Black Oil contagion.
In the 2002 video game Eternal Darkness, the fires are featured in the final level of the game as a key plot point.