
Thursday, August 31, 2023

Nuclear weapons of the United States


United States

Nuclear program start date: 21 October 1939
First nuclear weapon test: 16 July 1945
First thermonuclear weapon test: 1 November 1952
Last nuclear test: 23 September 1992
Largest yield test: 15 Mt (63 PJ) (1 March 1954)
Total tests: 1,054 detonations
Peak stockpile: 31,255 warheads (1967)
Current stockpile: 3,708 (2023)
Maximum missile range: ICBM: 15,000 km (9,321 mi); SLBM: 12,000 km (7,456 mi)
NPT party: Yes (1968)

The United States was the first country to manufacture nuclear weapons and is the only country to have used them in combat, with the bombings of Hiroshima and Nagasaki in World War II. Before and during the Cold War, it conducted 1,054 nuclear tests, and tested many long-range nuclear weapons delivery systems.

Between 1940 and 1996, the U.S. federal government spent at least US$10.9 trillion in present-day terms on nuclear weapons, including platform development (aircraft, rockets and facilities), command and control, maintenance, waste management, and administrative costs. It is estimated that the United States has produced more than 70,000 nuclear warheads since 1945, more than all other nuclear weapon states combined. Until November 1962, the vast majority of U.S. nuclear tests were above ground. After the acceptance of the Partial Nuclear Test Ban Treaty, all testing was moved underground in order to prevent the dispersion of nuclear fallout.

By 1998, at least US$759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing. By March 2021 over US$2.5 billion in compensation had been paid to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program.

In 2019, the U.S. and Russia possessed a comparable number of nuclear warheads; together, these two nations possess more than 90% of the world's nuclear weapons stockpile. As of 2020, the United States had a stockpile of 3,750 active and inactive nuclear warheads plus approximately 2,000 warheads retired and awaiting dismantlement. Of the stockpiled warheads, the U.S. stated in its March 2019 New START declaration that 1,365 were deployed on 656 ICBMs, SLBMs, and strategic bombers.

Development history

Manhattan Project

The Trinity test of the Manhattan Project was the first detonation of a nuclear weapon.

The United States first began developing nuclear weapons during World War II under the order of President Franklin Roosevelt in 1939, motivated by the fear that the United States was engaged in a race with Nazi Germany to develop such a weapon. After a slow start under the direction of the National Bureau of Standards, at the urging of British scientists and American administrators the program was placed under the Office of Scientific Research and Development, and in 1942 it was officially transferred to the auspices of the United States Army and became known as the Manhattan Project, an American, British and Canadian joint venture. Under the direction of General Leslie Groves, over thirty different sites were constructed for the research, production, and testing of components related to bomb-making. These included the Los Alamos National Laboratory at Los Alamos, New Mexico, under the direction of physicist Robert Oppenheimer, the Hanford plutonium production facility in Washington, and the Y-12 National Security Complex in Tennessee.

By investing heavily in breeding plutonium in early nuclear reactors and in the electromagnetic and gaseous diffusion enrichment processes for the production of uranium-235, the United States was able to develop three usable weapons by mid-1945. The Trinity test was a plutonium implosion-design weapon tested on 16 July 1945, with a yield of around 20 kilotons.

Faced with a planned invasion of the Japanese home islands scheduled to begin on 1 November 1945 and with Japan not surrendering, President Harry S. Truman ordered the atomic raids on Japan. On 6 August 1945, the U.S. detonated a uranium-gun design bomb, Little Boy, over the Japanese city of Hiroshima with an energy of about 15 kilotons of TNT, killing approximately 70,000 people, among them 20,000 Japanese combatants and 20,000 Korean slave laborers, and destroying nearly 50,000 buildings (including the 2nd General Army and Fifth Division headquarters). Three days later, on 9 August, the U.S. attacked Nagasaki using a plutonium implosion-design bomb, Fat Man, with the explosion equivalent to about 20 kilotons of TNT, destroying 60% of the city and killing approximately 35,000 people, among them 23,200–28,200 Japanese munitions workers, 2,000 Korean slave laborers, and 150 Japanese combatants.

On 1 January 1947, the Atomic Energy Act of 1946 (known as the McMahon Act) took effect, and the Manhattan Project was officially turned over to the United States Atomic Energy Commission (AEC).

On 15 August 1947, the Manhattan Project was abolished.

During the Cold War

Protest in Bonn against the deployment of Pershing II missiles in West Germany, 1981

The American atomic stockpile was small and grew slowly in the immediate aftermath of World War II, and the size of that stockpile was a closely guarded secret. However, there were forces that pushed the United States towards greatly increasing its size. Some of these were international in origin and focused on the increasing tensions of the Cold War, including the loss of China, the Soviet Union becoming an atomic power, and the onset of the Korean War. Others were domestic: both the Truman administration and the Eisenhower administration wanted to rein in military spending and avoid budget deficits and inflation, and the perception was that nuclear weapons gave more "bang for the buck" and thus were the most cost-efficient way to respond to the security threat the Soviet Union represented.

As a result, beginning in 1950 the AEC embarked on a massive expansion of its production facilities, an effort that would eventually be one of the largest U.S. government construction projects ever to take place outside of wartime. And this production would soon include the far more powerful hydrogen bomb, which the United States had decided to move forward with after an intense debate during 1949–50, as well as much smaller tactical atomic weapons for battlefield use.

By 1990, the United States had produced more than 70,000 nuclear warheads, in over 65 different varieties, ranging in yield from around 0.01 kilotons (such as the man-portable Davy Crockett shell) to the 25-megaton B41 bomb. Between 1940 and 1996, the U.S. spent at least $10.9 trillion in present-day terms on nuclear weapons development. Over half was spent on building delivery mechanisms for the weapons. $681 billion in present-day terms was spent on nuclear waste management and environmental remediation.

Richland, Washington, was the first city established to support plutonium production at the nearby Hanford nuclear site; the plutonium it produced was used in Cold War atomic bombs for the American nuclear arsenal.

Throughout the Cold War, the U.S. and USSR threatened each other with all-out nuclear attack in case of war, regardless of whether it was a conventional or a nuclear clash. U.S. nuclear doctrine called for mutually assured destruction (MAD), which entailed a massive nuclear attack against strategic targets and major population centers of the Soviet Union and its allies. The term "mutual assured destruction" was coined in 1962 by American strategist Donald Brennan. MAD was implemented by deploying nuclear weapons simultaneously on three different types of weapons platforms.

Post–Cold War

After the end of the Cold War in 1989 and the dissolution of the Soviet Union in 1991, the U.S. nuclear program was heavily curtailed: the United States halted its program of nuclear testing, ceased production of new nuclear weapons, and reduced its stockpile by half by the mid-1990s under President Bill Clinton. Many former nuclear facilities were closed, and their sites became targets of extensive environmental remediation. Efforts were redirected from weapons production to stockpile stewardship, attempting to predict the behavior of aging weapons without full-scale nuclear testing. Increased funding was directed to nuclear non-proliferation programs, such as helping the states of the former Soviet Union eliminate their former nuclear sites and assisting Russia in its efforts to inventory and secure its inherited nuclear stockpile. By February 2006, over $1.2 billion had been paid under the Radiation Exposure Compensation Act of 1990 to U.S. citizens exposed to nuclear hazards as a result of the U.S. nuclear weapons program, and by 1998 at least $759 million had been paid to the Marshall Islanders in compensation for their exposure to U.S. nuclear testing. Over $15 million was paid to the Japanese government following the exposure of its citizens and food supply to nuclear fallout from the 1954 "Bravo" test. In 1998, the country spent an estimated $35.1 billion on its nuclear weapons and weapons-related programs.

Map legend (world nuclear arsenals): large stockpile with global range (dark blue)

In the 2013 book Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters (Oxford), Kate Brown explores the health of affected citizens in the United States and the "slow-motion disasters" that still threaten the environments where the plants are located. According to Brown, the plants at Hanford, over a period of four decades, released millions of curies of radioactive isotopes into the surrounding environment. Brown says that most of this radioactive contamination at Hanford was part of normal operations, but unforeseen accidents did occur, and plant management kept them secret as the pollution continued unabated. Even today, as pollution threats to health and the environment persist, the government withholds knowledge about the associated risks from the public.

During the presidency of George W. Bush, and especially after the 11 September 2001 terrorist attacks, rumors circulated in major news sources that the U.S. was considering designing new nuclear weapons ("bunker-busting nukes") and resuming nuclear testing for reasons of stockpile stewardship. Republicans argued that small nuclear weapons appear more likely to be used than large ones, and thus pose a more credible threat with a greater deterrent effect against hostile behavior. Democrats countered that allowing the weapons could trigger an arms race. In 2003, the Senate Armed Services Committee voted to repeal the 1993 Spratt-Furse ban on the development of small nuclear weapons. This change was part of the 2004 fiscal year defense authorization. The Bush administration wanted the repeal so that it could develop weapons to address the threat from North Korea. "Low-yield weapons" (those with one-third the force of the bomb dropped on Hiroshima in 1945) were permitted to be developed. The Bush administration was unsuccessful in its goal of developing a guided low-yield nuclear weapon; however, in 2010 President Barack Obama began funding development of what would become the B61-12, a guided ("smart") low-yield nuclear bomb derived from the unguided B61.

Statements by the U.S. government in 2004 indicated that they planned to decrease the arsenal to around 5,500 total warheads by 2012. Much of that reduction was already accomplished by January 2008.

According to the Pentagon's June 2019 Doctrine for Joint Nuclear Operations, "Integration of nuclear weapons employment with conventional and special operations forces is essential to the success of any mission or operation."

Nuclear weapons testing

The U.S. conducted hundreds of nuclear tests at the Nevada Test Site.
Members of Nevada Desert Experience hold a prayer vigil during the Easter period of 1982 at the entrance to the Nevada Test Site.
Shot "Baker" of Operation Crossroads (1946) was the first underwater nuclear explosion.

Between 16 July 1945 and 23 September 1992, the United States maintained a program of vigorous nuclear testing, with the exception of a moratorium between November 1958 and September 1961. By official count, a total of 1,054 nuclear tests and two nuclear attacks were conducted, with over 100 of them taking place at sites in the Pacific Ocean, over 900 at the Nevada Test Site, and ten at miscellaneous sites in the United States (Alaska, Colorado, Mississippi, and New Mexico). Until November 1962, the vast majority of U.S. tests were atmospheric (that is, above ground); after the acceptance of the Partial Test Ban Treaty, all testing was moved underground in order to prevent the dispersion of nuclear fallout.

The U.S. program of atmospheric nuclear testing exposed a number of people to the hazards of fallout. Estimating the exact number of people exposed, and the exact consequences of that exposure, has been medically very difficult, with the exception of the high exposures of Marshall Islanders and Japanese fishers in the case of the Castle Bravo incident in 1954. A number of groups of U.S. citizens (especially farmers and inhabitants of cities downwind of the Nevada Test Site, and U.S. military workers at various tests) have sued for compensation and recognition of their exposure, many successfully. The passage of the Radiation Exposure Compensation Act of 1990 allowed for the systematic filing of compensation claims by those exposed through testing as well as those employed at nuclear weapons facilities. By June 2009, over $1.4 billion in total had been given in compensation, with over $660 million going to "downwinders".

A few notable U.S. nuclear tests include:

  • The Trinity test on 16 July 1945 was the world's first test of a nuclear weapon (yield of around 20 kt).
  • The Operation Crossroads series in July 1946 was the first postwar test series and one of the largest military operations in U.S. history.
  • The Operation Greenhouse shots of May 1951 included the first boosted fission weapon test ("Item") and a scientific test that proved the feasibility of thermonuclear weapons ("George").
  • The Ivy Mike shot of 1 November 1952 was the first full test of a Teller-Ulam design "staged" hydrogen bomb, with a yield of 10 megatons. It was not a deployable weapon, however; with its full cryogenic equipment it weighed some 82 tons.
  • The Castle Bravo shot of 1 March 1954 was the first test of a deployable (solid fuel) thermonuclear weapon, and also (accidentally) the largest weapon ever tested by the United States (15 megatons). It was also the single largest U.S. radiological accident in connection with nuclear testing. The unanticipated yield, and a change in the weather, resulted in nuclear fallout spreading eastward onto the inhabited Rongelap and Rongerik atolls, which were soon evacuated. Many of the Marshall Islanders have since suffered from birth defects and have received some compensation from the federal government. A Japanese fishing boat, the Daigo FukuryĆ« Maru, also came into contact with the fallout, which caused many of the crew to fall ill; one eventually died.
  • Shot Argus I of Operation Argus, on 27 August 1958, was the first detonation of a nuclear weapon in outer space; a 1.7-kiloton warhead was detonated at an altitude of 200 kilometres (120 mi) during a series of high-altitude nuclear explosions.
  • Shot Frigate Bird of Operation Dominic I on 6 May 1962 was the only U.S. test of an operational submarine-launched ballistic missile (SLBM) with a live nuclear warhead (yield of 600 kilotons), at Christmas Island. In general, missile systems were tested without live warheads, and warheads were tested separately, for safety reasons. In the early 1960s, however, technical questions mounted about how the systems would behave under combat conditions (when they were "mated", in military parlance), and this test was meant to dispel these concerns. However, the warhead had to be somewhat modified before its use, and the missile was an SLBM (not an ICBM), so by itself the test did not resolve all concerns.
  • Shot Sedan of Operation Storax on 6 July 1962 (yield of 104 kilotons) was an attempt to show the feasibility of using nuclear weapons for "civilian" and "peaceful" purposes as part of Operation Plowshare. In this instance, a crater 1,280 feet (390 m) in diameter and 320 feet (98 m) deep was created at the Nevada Test Site.

A summary table of each of the American operational series may be found at United States' nuclear test series.

Delivery systems

Early weapons models, such as the "Fat Man" bomb, were extremely large and difficult to use.
From left are the Peacekeeper, the Minuteman III and the Minuteman I

The original Little Boy and Fat Man weapons, developed by the United States during the Manhattan Project, were relatively large (Fat Man had a diameter of 5 feet (1.5 m)) and heavy (around 5 tons each), and they required specially modified bomber planes for their bombing missions against Japan. Each modified bomber could carry only one such weapon, and only within a limited range. After these initial weapons were developed, a considerable amount of money and research went toward standardizing nuclear warheads, so that they no longer required highly specialized experts to assemble them before use (as was the case with the idiosyncratic wartime devices), and toward miniaturizing the warheads for use in more varied delivery systems.

With the aid of expertise acquired through Operation Paperclip at the tail end of the European theater of World War II, the United States was able to embark on an ambitious program in rocketry. One of the first products of this was the development of rockets capable of carrying nuclear warheads. The MGR-1 Honest John was the first such weapon, developed in 1953 as a surface-to-surface missile with a 15-mile (24 km) maximum range. Because of their limited range, such weapons' potential use was heavily constrained (they could not, for example, threaten Moscow with an immediate strike).

The MGR-1 Honest John was the first nuclear-armed rocket developed by the U.S.
The B-36 Peacemaker in flight

Development of long-range bombers, such as the B-29 Superfortress during World War II, was continued during the Cold War period. In 1946, the Convair B-36 Peacemaker became the first purpose-built nuclear bomber; it served with the USAF until 1959. The Boeing B-52 Stratofortress was able by the mid-1950s to carry a wide arsenal of nuclear bombs, each with different capabilities and potential use situations. Starting in 1946, the U.S. based its initial deterrence force on the Strategic Air Command, which, by the late 1950s, maintained a number of nuclear-armed bombers in the sky at all times, prepared to receive orders to attack the USSR whenever needed. This system was, however, tremendously expensive, both in terms of natural and human resources, and raised the possibility of an accidental nuclear war.

During the 1950s and 1960s, elaborate computerized early warning systems such as the Defense Support Program were developed to detect incoming Soviet attacks and to coordinate response strategies. During this same period, intercontinental ballistic missile (ICBM) systems were developed that could deliver a nuclear payload across vast distances, allowing the U.S. to base nuclear forces capable of hitting the Soviet Union in the American Midwest. Shorter-range weapons, including small tactical weapons, were fielded in Europe as well, including nuclear artillery and the man-portable Special Atomic Demolition Munition. The development of submarine-launched ballistic missile systems allowed hidden nuclear submarines to covertly launch missiles at distant targets as well, making it virtually impossible for the Soviet Union to successfully launch a first-strike attack against the United States without receiving a deadly response.

Improvements in warhead miniaturization in the 1970s and 1980s allowed for the development of MIRVed missiles, which could carry multiple warheads, each of which could be separately targeted. The question of whether these missiles should be based on trains moving constantly along tracks (to avoid being easily targeted by opposing Soviet missiles) or in heavily fortified silos (to possibly withstand a Soviet attack) was a major political controversy in the 1980s; eventually the silo deployment method was chosen. MIRVed systems enabled the U.S. to render Soviet missile defenses economically unfeasible, as each offensive missile would require between three and ten defensive missiles to counter.

Additional developments in weapons delivery included cruise missile systems, which allowed a plane to fire a long-distance, low-flying nuclear-armed missile towards a target from a relatively comfortable distance.

Comparing the size of U.S. nuclear weapons over time.

The current delivery systems of the U.S. put virtually any part of the Earth's surface within reach of its nuclear arsenal. Though its land-based missile systems have a maximum range of 10,000 kilometres (6,200 mi) (less than worldwide), its submarine-based forces, with a missile range of 12,000 kilometres (7,500 mi) from a coastline, extend that reach far inland. Additionally, in-flight refueling of long-range bombers and the use of aircraft carriers extend the possible range virtually indefinitely.

Command and control

Command and control procedures in case of nuclear war were given by the Single Integrated Operational Plan (SIOP) until 2003, when this was superseded by Operations Plan 8044.

Since World War II, the President of the United States has had sole authority to launch U.S. nuclear weapons, whether as a first strike or in nuclear retaliation. This arrangement was seen as necessary during the Cold War to present a credible nuclear deterrent; if an attack were detected, the United States would have only minutes to launch a counterstrike before its nuclear capability was severely damaged or national leaders were killed. If the President is killed, command authority follows the presidential line of succession. Changes to this policy have been proposed, but currently the only way to countermand such an order before the strike is launched would be for the Vice President and a majority of the Cabinet to relieve the President under Section 4 of the Twenty-fifth Amendment to the United States Constitution.

Regardless of whether the United States is actually under attack by a nuclear-capable adversary, the President alone has the authority to order nuclear strikes. The President and the Secretary of Defense form the National Command Authority, but the Secretary of Defense has no authority to refuse or disobey such an order. The President's decision must be transmitted to the National Military Command Center, which will then issue the coded orders to nuclear-capable forces.

The President can give a nuclear launch order using their nuclear briefcase (nicknamed the nuclear football), or can use command centers such as the White House Situation Room. The command would be carried out by a Nuclear and Missile Operations Officer (a member of a missile combat crew, also called a "missileer") at a missile launch control center. A two-man rule applies to the launch of missiles, meaning that two officers must turn keys simultaneously (far enough apart that this cannot be done by one person).

When President Reagan was shot in 1981, there was confusion about where the "nuclear football" was, and who was in charge.

In 1975, a launch crew member, Harold Hering, was dismissed from the Air Force for asking how he could know whether the order to launch his missiles came from a sane president. It has been claimed that the system is not foolproof.

Starting with President Eisenhower, authority to launch a full-scale nuclear attack was delegated to theater commanders and other specific commanders if they believed it was warranted by circumstances and they were out of communication with the president, or if the president had been incapacitated. For example, during the Cuban Missile Crisis, on 24 October 1962, General Thomas Power, commander of the Strategic Air Command (SAC), took the country to DEFCON 2, the very precipice of full-scale nuclear war, putting SAC bombers in the air with nuclear weapons ready to strike. Moreover, some of these commanders subdelegated to lower commanders the authority to launch nuclear weapons under similar circumstances. In fact, nuclear weapons were not placed under locks (i.e., permissive action links) until decades later, so pilots or individual submarine commanders had the power to launch nuclear weapons entirely on their own, without higher authority.

Accidents

The Castle Bravo fallout plume spread dangerous levels of radioactive material over an area over 100 miles (160 km) long, including inhabited islands, in the largest single U.S. nuclear accident.

The United States nuclear program since its inception has experienced accidents of varying forms, ranging from single-casualty research experiments (such as that of Louis Slotin during the Manhattan Project), to the nuclear fallout dispersion of the Castle Bravo shot in 1954, to accidents such as crashes of aircraft carrying nuclear weapons, the dropping of nuclear weapons from aircraft, losses of nuclear submarines, and explosions of nuclear-armed missiles (broken arrows). How close any of these accidents came to being major nuclear disasters is a matter of technical and scholarly debate and interpretation.

Weapons accidentally dropped by the United States include incidents off the coast of British Columbia (1950) (see 1950 British Columbia B-36 crash); near Atlantic City, New Jersey (1957); Savannah, Georgia (1958) (see Tybee Bomb); Goldsboro, North Carolina (1961) (see 1961 Goldsboro B-52 crash); off the coast of Okinawa (1965); in the sea near Palomares, Spain (1966) (see 1966 Palomares B-52 crash); and near Thule Air Base, Greenland (1968) (see 1968 Thule Air Base B-52 crash). In some of these cases (such as the 1966 Palomares case), the explosive system of the fission weapon discharged but did not trigger a nuclear chain reaction (safety features prevent this from easily happening); it did, however, disperse hazardous nuclear materials across wide areas, necessitating expensive cleanup efforts. Several U.S. nuclear weapons, partial weapons, or weapons components are thought to be lost and unrecovered, primarily in aircraft accidents. The 1980 Damascus Titan missile explosion in Damascus, Arkansas, threw a warhead from its silo but did not release any radiation.

The nuclear testing program resulted in a number of cases of fallout dispersion onto populated areas. The most significant of these was the Castle Bravo test, which spread radioactive ash over an area of over 100 square miles (260 km2), including a number of populated islands. The populations of the islands were evacuated but not before suffering radiation burns. They would later suffer long-term effects, such as birth defects and increased cancer risk. There are ongoing concerns around deterioration of the nuclear waste site on Runit Island and a potential radioactive spill. There were also instances during the nuclear testing program in which soldiers were exposed to overly high levels of radiation, which grew into a major scandal in the 1970s and 1980s, as many soldiers later suffered from what were claimed to be diseases caused by their exposures.

Many of the former nuclear facilities produced significant environmental damage during their years of activity, and since the 1990s they have been Superfund sites of cleanup and environmental remediation. Hanford is currently the most contaminated nuclear site in the United States and is the focus of the nation's largest environmental cleanup. Radioactive materials are known to be leaking from Hanford into the environment. The Radiation Exposure Compensation Act of 1990 allows U.S. citizens exposed to radiation or other health risks through the U.S. nuclear program to file for compensation and damages.

Deliberate attacks on weapons facilities

In 1972, three hijackers took control of a domestic passenger flight along the east coast of the U.S. and threatened to crash the plane into a U.S. nuclear weapons plant in Oak Ridge, Tennessee. The plane got as close as 8,000 feet above the site before the hijackers' demands were met.

Various acts of civil disobedience since 1980 by the peace group Plowshares have shown how nuclear weapons facilities can be penetrated, and the group's actions represent extraordinary breaches of security at nuclear weapons plants in the United States. The National Nuclear Security Administration has acknowledged the seriousness of the 2012 Plowshares action. Non-proliferation policy experts have questioned "the use of private contractors to provide security at facilities that manufacture and store the government's most dangerous military material". Nuclear weapons materials on the black market are a global concern, and there is concern about the possible detonation of a small, crude nuclear weapon by a militant group in a major city, with significant loss of life and property.

Stuxnet is a computer worm discovered in June 2010 that is believed to have been created by the United States and Israel to attack Iran's nuclear fuel enrichment facilities.

Development agencies

The United States Atomic Energy Commission (1946–1974) managed the U.S. nuclear program after the Manhattan Project.

The initial U.S. nuclear program was run by the National Bureau of Standards starting in 1939 under the edict of President Franklin Delano Roosevelt. Its primary purpose was to delegate research and dispense funds. In 1940 the National Defense Research Committee (NDRC) was established, coordinating the work of the Committee on Uranium among its other wartime efforts. In June 1941, the Office of Scientific Research and Development (OSRD) was established with the NDRC as one of its subordinate agencies, and it enlarged and renamed the Uranium Committee as the Section on Uranium. In 1941, NDRC research was placed under the direct control of Vannevar Bush as the OSRD S-1 Section, which attempted to increase the pace of weapons research. In June 1942, the U.S. Army Corps of Engineers took over the project to develop atomic weapons, while the OSRD retained responsibility for scientific research.

This was the beginning of the Manhattan Project, run as the Manhattan Engineer District (MED), an agency under military control that was in charge of developing the first atomic weapons. After World War II, the MED maintained control over the U.S. arsenal and production facilities and coordinated the Operation Crossroads tests. In 1946, after a long and contentious debate, the Atomic Energy Act of 1946 was passed, creating the Atomic Energy Commission (AEC) as a civilian agency in charge of the production of nuclear weapons and research facilities, funded through Congress, with oversight provided by the Joint Committee on Atomic Energy. The AEC was given vast powers of control over secrecy, research, and money, and could seize lands with suspected uranium deposits. Along with its duties towards the production and regulation of nuclear weapons, it was also in charge of stimulating development and regulating civilian nuclear power. The full transfer of activities was finalized in January 1947.

In 1975, following the "energy crisis" of the early 1970s and public and congressional discontent with the AEC (in part because it could not credibly serve as both a producer and a regulator), the commission was split into its component parts: the Energy Research and Development Administration (ERDA), which assumed most of the AEC's former production, coordination, and research roles, and the Nuclear Regulatory Commission, which assumed its civilian regulation activities.

ERDA was short-lived, however, and in 1977 the U.S. nuclear weapons activities were reorganized under the Department of Energy, which maintains such responsibilities through the semi-autonomous National Nuclear Security Administration. Some functions were taken over or shared by the Department of Homeland Security in 2002. The already-built weapons themselves are in the control of the Strategic Command, which is part of the Department of Defense.

In general, these agencies served to coordinate research and build sites. They generally operated their sites through contractors, however, both private and public (for example, Union Carbide, a private company, ran Oak Ridge National Laboratory for many decades; the University of California, a public educational institution, has run the Los Alamos and Lawrence Livermore laboratories since their inception, and will jointly manage Los Alamos with the private company Bechtel under its next contract). Funding came both directly through these agencies and from additional outside agencies, such as the Department of Defense. Each branch of the military also maintained its own nuclear-related research agencies (generally related to delivery systems).

Weapons production complex

This table is not comprehensive, as numerous facilities throughout the United States have contributed to its nuclear weapons program. It includes the major sites related to the U.S. weapons program (past and present), their basic site functions, and their current status of activity. Not listed are the many bases and facilities at which nuclear weapons have been deployed. In addition to deploying weapons on its own soil, during the Cold War the United States also stationed nuclear weapons in 27 foreign countries and territories, including Okinawa (which was US-controlled until 1971), Japan (during the occupation immediately following World War II), Greenland, Germany, Taiwan, and French Morocco, then independent Morocco.

Site name | Location | Function | Status
Los Alamos National Laboratory | Los Alamos, New Mexico | Research, design, pit production | Active
Lawrence Livermore National Laboratory | Livermore, California | Research and design | Active
Sandia National Laboratories | Livermore, California; Albuquerque, New Mexico | Research and design | Active
Hanford Site | Richland, Washington | Material production (plutonium) | Not active, in remediation
Oak Ridge National Laboratory | Oak Ridge, Tennessee | Material production (uranium-235, fusion fuel), research | Active to some extent
Y-12 National Security Complex | Oak Ridge, Tennessee | Component fabrication, stockpile stewardship, uranium storage | Active
Nevada Test Site | Near Las Vegas, Nevada | Nuclear testing and nuclear waste disposal | Active; no tests since 1992, now engaged in waste disposal
Yucca Mountain | Nevada Test Site | Waste disposal (primarily power reactor) | Pending
Waste Isolation Pilot Plant | East of Carlsbad, New Mexico | Radioactive waste from nuclear weapons production | Active
Pacific Proving Grounds | Marshall Islands | Nuclear testing | Not active, last test in 1962
Rocky Flats Plant | Near Denver, Colorado | Components fabrication | Not active, in remediation
Pantex | Amarillo, Texas | Weapons assembly, disassembly, pit storage | Active, esp. disassembly
Fernald Site | Near Cincinnati, Ohio | Material fabrication (uranium-238) | Not active, in remediation
Paducah Plant | Paducah, Kentucky | Material production (uranium-235) | Active (commercial use)
Portsmouth Plant | Near Portsmouth, Ohio | Material fabrication (uranium-235) | Active (centrifuge), but not for weapons production
Kansas City Plant | Kansas City, Missouri | Component production | Active
Mound Plant | Miamisburg, Ohio | Research, component production, tritium purification | Not active, in remediation
Pinellas Plant | Largo, Florida | Manufacture of electrical components | Active, but not for weapons production
Savannah River Site | Near Aiken, South Carolina | Material production (plutonium, tritium) | Active (limited operation), in remediation
Map of major nuclear sites in the contiguous U.S. Grayed-out sites are not currently active.

Proliferation

A sign pointing to an old fallout shelter in New York City.
The Atoms for Peace program distributed nuclear technology, materials, and know-how to many less technologically advanced countries.

Early in the development of its nuclear weapons, the United States relied in part on information-sharing with both the United Kingdom and Canada, as codified in the Quebec Agreement of 1943. These three parties agreed not to share nuclear weapons information with other countries without the consent of the others, an early attempt at nonproliferation. After the development of the first nuclear weapons during World War II, though, there was much debate within the political circles and public sphere of the United States about whether or not the country should attempt to maintain a monopoly on nuclear technology, or whether it should undertake a program of information sharing with other nations (especially its former ally and likely competitor, the Soviet Union), or submit control of its weapons to some sort of international organization (such as the United Nations) that would use them to attempt to maintain world peace. Though fear of a nuclear arms race spurred many politicians and scientists to advocate some degree of international control or sharing of nuclear weapons and information, many politicians and members of the military believed that it was better in the short term to maintain high standards of nuclear secrecy and to forestall a Soviet bomb as long as possible (and they did not believe the USSR would actually submit to international controls in good faith).

Since this path was chosen, the United States was, in its early days, essentially an advocate for the prevention of nuclear proliferation, though primarily for the reason of self-preservation. A few years after the USSR detonated its first weapon in 1949, though, the U.S. under President Dwight D. Eisenhower sought to encourage a program of sharing nuclear information related to civilian nuclear power and nuclear physics in general. The Atoms for Peace program, begun in 1953, was also in part political: the U.S. was better poised to commit various scarce resources, such as enriched uranium, towards this peaceful effort, and to request a similar contribution from the Soviet Union, which had far fewer resources along these lines; thus the program had a strategic justification as well, as was later revealed by internal memos. This overall goal of promoting civilian use of nuclear energy in other countries, while also preventing weapons dissemination, has been labeled by many critics as contradictory and as having led to lax standards for a number of decades, which allowed a number of other nations, such as China and India, to profit from dual-use technology (purchased from nations other than the U.S.).

The Cooperative Threat Reduction program of the Defense Threat Reduction Agency was established after the breakup of the Soviet Union in 1991 to aid former Soviet bloc countries in the inventory and destruction of their sites for developing nuclear, chemical, and biological weapons, and their methods of delivering them (ICBM silos, long-range bombers, etc.). Over $4.4 billion has been spent on this endeavor to prevent purposeful or accidental proliferation of weapons from the former Soviet arsenal.

After India and Pakistan tested nuclear weapons in 1998, President Bill Clinton imposed economic sanctions on the countries. In 1999, however, the sanctions against India were lifted; those against Pakistan were kept in place as a result of the military government that had taken over. Shortly after the September 11 attacks in 2001, President George W. Bush lifted the sanctions against Pakistan as well, in order to get the Pakistani government's help as a conduit for US and NATO forces for operations in Afghanistan.

The U.S. government has been vocal against the proliferation of such weapons to Iran and North Korea. The 2003 invasion of Iraq by the U.S. was undertaken, in part, on indications that weapons of mass destruction were being stockpiled (later, stockpiles of previously undeclared nerve agent and mustard gas shells were located in Iraq), and the Bush administration said that its policies on proliferation were responsible for the Libyan government's agreement to abandon its nuclear ambitions. However, according to the Senate's report on pre-war intelligence on Iraq, issued a year after the war began, no stockpiles of weapons of mass destruction or active weapons programs were found in Iraq.

Nuclear disarmament in international law

The United States is one of the five nuclear weapons states with a declared nuclear arsenal under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT), of which it was an original drafter and signatory on 1 July 1968 (ratified 5 March 1970). All signatories of the NPT agreed to refrain from aiding in nuclear weapons proliferation to other states.

Further, under Article VI of the NPT, all signatories, including the US, agreed to negotiate in good faith to stop the nuclear arms race and to negotiate the complete elimination of nuclear weapons: "Each of the Parties to the Treaty undertakes to pursue negotiations in good faith on effective measures relating to cessation of the nuclear arms race at an early date and to nuclear disarmament, and on a treaty on general and complete disarmament." The International Court of Justice (ICJ), the preeminent judicial tribunal of international law, in its advisory opinion on the Legality of the Threat or Use of Nuclear Weapons, issued 8 July 1996, unanimously interpreted the text of Article VI as implying that:

There exists an obligation to pursue in good faith and bring to a conclusion negotiations leading to nuclear disarmament in all its aspects under strict and effective international control.

The International Atomic Energy Agency (IAEA) in 2005 proposed a comprehensive ban on the production of fissile material that would greatly limit the production of weapons of mass destruction. One hundred forty-seven countries voted for the proposal, but the United States voted against it. The US government has also resisted the Treaty on the Prohibition of Nuclear Weapons, a binding agreement for negotiations for the total elimination of nuclear weapons, supported by more than 120 nations.

International relations and nuclear weapons

Soviet General Secretary Gorbachev and U.S. President Reagan signing the INF Treaty in 1987

In 1958, the United States Air Force considered a plan to drop nuclear bombs on China during a confrontation over Taiwan, but it was overruled, as previously secret documents showed after being declassified under the Freedom of Information Act in April 2008. The plan included an initial proposal to drop 10–15 kiloton bombs on airfields in Amoy (now called Xiamen) in the event of a Chinese blockade of Taiwan's offshore islands.

Occupational illness

The Energy Employees Occupational Illness Compensation Program (EEOICP) began on 31 July 2001. The program provides compensation and health benefits to Department of Energy nuclear weapons workers (employees, former employees, contractors and subcontractors), as well as compensation to certain survivors if the worker is already deceased. By 14 August 2010, the program had identified 45,799 civilians whose health had been harmed (including 18,942 who developed cancer) by exposure to radiation and toxic substances while producing nuclear weapons for the United States.

Current status

U.S. nuclear warhead stockpile, 1945–2002.
A graph showing the number of nuclear weapons stockpiled by the United States and the Soviet Union during the nuclear arms race.
U.S. ground-based nuclear weapons (all LGM-30 Minuteman missiles) are deployed in three areas, spanning five states. These locations were chosen to be far away from the coasts, to maximize warning of an incoming attack from submarines; far away from populated areas, since the silos would likely be targeted in a nuclear war; and relatively close to the Soviet Union via the polar route.

The United States is recognized as one of the five nuclear-weapon states under the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). As of 2017, the US had an estimated 4,018 nuclear weapons in either deployment or storage. This figure compares to a peak of 31,225 total warheads in 1967 and 22,217 in 1989, and does not include "several thousand" warheads that have been retired and scheduled for dismantlement. The Pantex Plant near Amarillo, Texas, is the only location in the United States where weapons from the aging nuclear arsenal can be refurbished or dismantled.

In 2009 and 2010, the Obama administration announced policies that would reverse the Bush-era policy on the use of nuclear weapons and its moves to develop new ones. First, in a prominent 2009 speech, U.S. President Barack Obama outlined the goal of "a world without nuclear weapons". Toward that goal, Obama and Russian President Dmitry Medvedev signed the New START treaty on 8 April 2010, reducing the number of deployed nuclear weapons from 2,200 to 1,550. That same week, Obama also revised U.S. policy on the use of nuclear weapons in a Nuclear Posture Review required of all presidents, declaring for the first time that the U.S. would not use nuclear weapons against non-nuclear, NPT-compliant states. The policy also renounced the development of any new nuclear weapons. However, the same April 2010 Nuclear Posture Review stated a need to develop new "low-yield" nuclear weapons, which resulted in the development of the B61 Mod 12. Despite Obama's goal of a nuclear-free world and his reversal of former President Bush's nuclear policies, his presidency cut fewer warheads from the stockpile than any previous post-Cold War presidency.

Following a renewal of tension after the Russo-Ukrainian War started in 2014, the Obama administration announced plans to continue renovating US nuclear weapons facilities and platforms, with a budgeted spend of about a trillion dollars over 30 years. Under these new plans, the US government would fund research and development of new nuclear cruise missiles. The Trump and Biden administrations continued with these plans.

As of 2021, American nuclear forces on land consist of 400 Minuteman III ICBMs spread among 450 operational launchers, staffed by Air Force Global Strike Command. Forces at sea consist of 14 nuclear-capable Ohio-class Trident submarines, nine in the Pacific and five in the Atlantic. Nuclear capabilities in the air are provided by 60 nuclear-capable heavy bombers: 20 B-2 bombers and 40 B-52s.

The Air Force has modernized its Minuteman III missiles to last through 2030, and the Ground Based Strategic Deterrent (GBSD) is set to begin replacing them in 2029. The Navy has undertaken efforts to extend the operational lives of its missiles and warheads past 2020; it is also producing new Columbia-class submarines to replace the Ohio fleet beginning in 2031. The Air Force is also retiring the nuclear cruise missiles of its B-52s, leaving only half of them nuclear-capable. It intends to procure a new long-range bomber, the B-21, and a new long-range standoff (LRSO) cruise missile in the 2020s.

Nuclear disarmament movement

April 2011 OREPA rally at the Y-12 nuclear weapons plant entrance

In the early 1980s, the revival of the nuclear arms race triggered large protests about nuclear weapons. On 12 June 1982, one million people demonstrated in New York City's Central Park against nuclear weapons and for an end to the Cold War arms race. It was the largest anti-nuclear protest and the largest political demonstration in American history. International Day of Nuclear Disarmament protests were held on 20 June 1983 at 50 sites across the United States. There were many Nevada Desert Experience protests and peace camps at the Nevada Test Site during the 1980s and 1990s.

There have also been protests by anti-nuclear groups at the Y-12 Nuclear Weapons Plant, the Idaho National Laboratory, the proposed Yucca Mountain nuclear waste repository, the Hanford Site, the Nevada Test Site, and Lawrence Livermore National Laboratory, as well as against the transportation of nuclear waste from the Los Alamos National Laboratory.

On 1 May 2005, 40,000 anti-nuclear/anti-war protesters marched past the United Nations in New York, 60 years after the atomic bombings of Hiroshima and Nagasaki. This was the largest anti-nuclear rally in the U.S. for several decades. In May 2010, some 25,000 people, including members of peace organizations and 1945 atomic bomb survivors, marched from downtown New York to the United Nations headquarters, calling for the elimination of nuclear weapons.

Some scientists and engineers have opposed nuclear weapons, including Paul M. Doty, Hermann Joseph Muller, Linus Pauling, Eugene Rabinowitch, M.V. Ramana and Frank N. von Hippel. In recent years, many elder statesmen have also advocated nuclear disarmament: Sam Nunn, William Perry, Henry Kissinger, and George Shultz have called upon governments to embrace the vision of a world free of nuclear weapons, and in various op-ed columns have proposed an ambitious program of urgent steps to that end. The four have created the Nuclear Security Project to advance this agenda. Organizations such as Global Zero, an international non-partisan group of 300 world leaders dedicated to achieving nuclear disarmament, have also been established.

United States strategic nuclear weapons arsenal

New START Treaty Aggregate Numbers of Strategic Offensive Arms, 14 June 2023

Category of Data | United States of America
Deployed ICBMs, Deployed SLBMs, and Deployed Heavy Bombers | 665
Warheads on Deployed ICBMs, on Deployed SLBMs, and Nuclear Warheads Counted for Deployed Heavy Bombers | 1,389
Deployed and Non-deployed Launchers of ICBMs, Deployed and Non-deployed Launchers of SLBMs, and Deployed and Non-deployed Heavy Bombers | 800
Total | 2,854
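
As presented here, the "Total" figure appears to be the simple arithmetic sum of the three reported categories above, which count different kinds of items (deployed delivery vehicles, warheads, and launchers):

665 + 1,389 + 800 = 2,854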

Notes:

  • Each heavy bomber is counted as one warhead (per the New START Treaty counting rules).
  • The nuclear weapon delivery capability has been removed from B-1 heavy bombers.

Deplatforming

A bust of MIT president Francis Amasa Walker separated from its pedestal at the MIT Museum.

Deplatforming, also known as no-platforming, has been defined as an "attempt to boycott a group or individual through removing the platforms (such as speaking venues or websites) used to share information or ideas", or "the action or practice of preventing someone holding views regarded as unacceptable or offensive from contributing to a forum or debate, especially by blocking them on a particular website."

History

Deplatforming of invited speakers

In the United States, the banning of speakers on university campuses dates back to the 1940s and was carried out under the universities' own policies. The University of California had a policy known as the Speaker Ban, codified in university regulations under President Robert Gordon Sproul, that mostly, but not exclusively, targeted communists. One rule stated that "the University assumed the right to prevent exploitation of its prestige by unqualified persons or by those who would use it as a platform for propaganda." This rule was used in 1951 to block Max Shachtman, a socialist, from speaking at the University of California at Berkeley. In 1947, former U.S. Vice President Henry A. Wallace was banned from speaking at UCLA because of his views on U.S. Cold War policy, and in 1961, Malcolm X was prohibited from speaking at Berkeley as a religious leader.

Controversial speakers invited to appear on college campuses have faced deplatforming attempts to disinvite them or to otherwise prevent them from speaking. The British National Union of Students established its No Platform policy as early as 1973. In the mid-1980s, visits by South African ambassador Glenn Babb to Canadian college campuses faced opposition from students opposed to apartheid.

In the United States, recent examples include the March 2017 disruption by protestors of a public speech at Middlebury College by political scientist Charles Murray. In February 2018, students at the University of Central Oklahoma rescinded a speaking invitation to creationist Ken Ham, after pressure from an LGBT student group. In March 2018, a "small group of protesters" at Lewis & Clark Law School attempted to stop a speech by visiting lecturer Christina Hoff Sommers. In the 2019 film No Safe Spaces, Adam Carolla and Dennis Prager documented their own disinvitation along with others.

As of February 2020, the Foundation for Individual Rights in Education, a speech advocacy group, documented 469 disinvitation or disruption attempts at American campuses since 2000, including both "unsuccessful disinvitation attempts" and "successful disinvitations"; the group defines the latter category as including three subcategories: formal disinvitation by the sponsor of the speaking engagement; the speaker's withdrawal "in the face of disinvitation demands"; and "heckler's vetoes" (situations when "students or faculty persistently disrupt or entirely prevent the speakers' ability to speak").

Deplatforming in social media

Beginning in 2015, Reddit banned several communities on the site ("subreddits") for violating the site's anti-harassment policy. A 2017 study published in the journal Proceedings of the ACM on Human-Computer Interaction, examining "the causal effects of the ban on both participating users and affected communities," found that "the ban served a number of useful purposes for Reddit" and that "Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage. Communities that inherited the displaced activity of these users did not suffer from an increase in hate speech." In June 2020 and January 2021, Reddit also issued bans to two prominent online pro-Trump communities over violations of the website's content and harassment policies.

On May 2, 2019, Facebook and the Facebook-owned platform Instagram announced a ban of "dangerous individuals and organizations" including Nation of Islam leader Louis Farrakhan, Milo Yiannopoulos, Alex Jones and his organization InfoWars, Paul Joseph Watson, Laura Loomer, and Paul Nehlen. In the wake of the 2021 storming of the US Capitol, Twitter banned then-president Donald Trump, as well as 70,000 other accounts linked to the event and the far-right movement QAnon.

Some studies have found that deplatforming extremists reduced their audience, although other research has found that some content creators become more toxic following deplatforming and migration to alt-tech platforms.

Twitter

On November 18, 2022, Elon Musk, as the newly installed CEO of Twitter, reopened the previously banned Twitter accounts of high-profile users including Kathy Griffin, Jordan Peterson, and the Babylon Bee as part of the new Twitter policy. As Musk put it, "New Twitter policy is freedom of speech, but not freedom of reach".

Alex Jones

On August 6, 2018, Facebook, Apple, YouTube and Spotify removed all content by Jones and InfoWars for policy violations. YouTube removed channels associated with InfoWars, including The Alex Jones Channel. On Facebook, four pages associated with InfoWars and Alex Jones were removed over repeated policy violations. Apple removed all podcasts associated with Jones from iTunes. On August 13, 2018, Vimeo removed all of Jones's videos because of "prohibitions on discriminatory and hateful content". Facebook cited instances of dehumanizing immigrants, Muslims and transgender people, as well as glorification of violence, as examples of hate speech. After InfoWars was banned from Facebook, Jones used another of his websites, NewsWars, to circumvent the ban.

Jones's accounts were also removed from Pinterest, Mailchimp and LinkedIn. As of early August 2018, Jones retained active accounts on Instagram, Google+ and Twitter.

In September, Jones was permanently banned from Twitter and Periscope after berating CNN reporter Oliver Darcy. On September 7, 2018, the InfoWars app was removed from the Apple App Store for "objectionable content". He was banned from using PayPal for business transactions, having violated the company's policies by expressing "hate or discriminatory intolerance against certain communities and religions." After Elon Musk's purchase of Twitter, several previously banned accounts were reinstated, including those of Donald Trump, Andrew Tate and Ye, prompting questions about whether Alex Jones would be unbanned as well. However, Musk said Jones would not be reinstated, criticizing him as a person who "would use the deaths of children for gain, politics or fame".

InfoWars remained available on Roku devices in January 2019, a year after the channel's removal from multiple streaming services. Roku indicated that it does not "curate or censor based on viewpoint" and that it had policies against content that is "unlawful, incited illegal activities, or violates third-party rights," but that InfoWars was not in violation of these policies. Following a social media backlash, Roku removed InfoWars and stated, "After the InfoWars channel became available, we heard from concerned parties and have determined that the channel should be removed from our platform."

In March 2019, YouTube terminated the Resistance News channel due to its reuploading of live streams from InfoWars. On May 1, 2019, Jones was barred from using both Facebook and Instagram. Jones briefly moved to Dlive, but was suspended in April 2019 for violating community guidelines.

In March 2020, the InfoWars app was removed from the Google Play store due to claims of Jones disseminating COVID-19 misinformation. A Google spokesperson stated that "combating misinformation on the Play Store is a top priority for the team" and apps that violate Play policy by "distributing misleading or harmful information" are removed from the store.

Donald Trump

On January 6, 2021, during a joint session of the United States Congress, the counting of the votes of the Electoral College was interrupted by a breach of the United States Capitol chambers. The rioters were supporters of President Donald Trump who hoped to delay and overturn the President's loss in the 2020 election. The event resulted in five deaths and at least 400 people being charged with crimes; the certification of the electoral votes was only completed in the early morning hours of January 7, 2021. In the wake of several tweets by President Trump on January 7, 2021, Facebook, Instagram, YouTube, Reddit, and Twitter all deplatformed Trump to some extent. Twitter suspended his personal account, which the company said could be used to promote further violence. Trump subsequently tweeted similar messages from the official US Government account @POTUS, and on January 8 Twitter announced that his ban from the platform would be permanent.

According to Jason Miller, speaking on a Fox News broadcast, Trump planned to return to social media by May or June 2021 through a new platform of his own.

The same week Musk announced Twitter's new freedom-of-speech policy, he posted a poll asking whether Trump should be allowed back on the platform. The poll ended with 51.8% in favor of restoring Trump's account, and Twitter reinstated @realDonaldTrump on November 19, 2022, although by that point Trump was posting primarily on his own platform, Truth Social.

Andrew Tate

In 2017, Andrew Tate was banned from Twitter for tweeting that women should "bare some responsibility" in response to the #MeToo movement. In August 2022, Tate was banned from four more major social media platforms: Instagram, Facebook, TikTok, and YouTube. These platforms indicated that Tate's misogynistic comments violated their hate speech policies.

Tate's Twitter account has since been reinstated under the platform's new freedom-of-speech policy.

Demonetization

Social media platforms such as YouTube and Instagram allow their content producers or influencers to earn money from the content (videos, images, etc.) they post, typically through some form of payment tied to a set number of views, "likes", or clicks. When content is deemed inappropriate for compensation but is still left on the platform, this is called "demonetization": the content producer receives no compensation for the content they created, even though it remains available for the general public to view or listen to. In September 2016, Vox reported that demonetization, as it pertained to YouTube specifically, involved the following key points:

  • "Since 2012, YouTube has been automatically 'demonetizing' some videos because its software thought the content was unfriendly for advertisers."
  • "Many YouTube video makers didn’t realize this until last week, when YouTube began actively telling them about it."
  • "This has freaked YouTubers out, even though YouTube has been behaving rationally by trying to connect advertisers to advertiser-friendly content. It’s not censorship, since YouTube video makers can still post (just about) anything they want."
  • "YouTube’s software will screw things up, which means videos that should have ads don’t, which means YouTube video makers have been missing out on ad revenue."

Other examples

Financial deplatforming

Efforts to counter terrorist organizations range from the use of civil laws to financial deplatforming, according to Jessica Davis.

Harassment and threats to employment

Deplatforming tactics have also included attempts to silence controversial speakers through various forms of personal harassment, such as doxing, the making of false emergency reports for purposes of swatting, and complaints or petitions to third parties. In some cases, protesters have attempted to have speakers blacklisted from projects or fired from their jobs.

In 2019, for example, students at the University of the Arts in Philadelphia circulated an online petition demanding that Camille Paglia "should be removed from UArts faculty and replaced by a queer person of color." According to The Atlantic's Conor Friedersdorf, "It is rare for student activists to argue that a tenured faculty member at their own institution should be denied a platform." Paglia, a tenured professor for over 30 years who identifies as transgender, had long been unapologetically outspoken on controversial "matters of sex, gender identity, and sexual assault".

In print media

In December 2017, after learning that a French artist it had previously reviewed was a neo-Nazi, the San Francisco punk magazine Maximum Rocknroll apologized and announced that it had "a strict no-platform policy towards any bands and artists with a Nazi ideology".

Legislative responses

United Kingdom

In May 2021, the UK government under Boris Johnson announced a Higher Education (Freedom of Speech) Bill that would allow speakers at universities to seek compensation for no-platforming, impose fines on universities and student unions that promote the practice, and establish a new ombudsman charged with monitoring cases of no-platforming and academic dismissals. In addition, the government published an Online Safety Bill that would prohibit social media networks from discriminating against particular political views or removing "democratically important" content, such as comments opposing or supporting political parties and policies.

United States

Some critics of deplatforming have proposed that governments should treat social media as a public utility to ensure that constitutional rights of the users are protected, citing their belief that an Internet presence using social media websites is imperative in order to adequately take part in the 21st century as an individual. Republican politicians have sought to weaken the protections established by Section 230 of the Communications Decency Act—which provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users—under allegations that the moderation policies of major social networks are not politically neutral.

Reactions

Justifications

According to its defenders, deplatforming has been used as a tactic to prevent the spread of hate speech and disinformation. Social media has evolved into a significant source of news reporting for its users, and support for content moderation and banning of inflammatory posters has been defended as an editorial responsibility required by news outlets.

Supporters of deplatforming have justified the action on the grounds that it produces the desired effect of reducing what they characterize as hate speech. Angelo Carusone, president of the progressive organization Media Matters for America and who had run deplatforming campaigns against conservative talk hosts Rush Limbaugh in 2012 and Glenn Beck in 2010, pointed to Twitter's 2016 ban of Milo Yiannopoulos, stating that "the result was that he lost a lot.... He lost his ability to be influential or at least to project a veneer of influence."

In the United States, deplatforming is sometimes criticized as a deprivation of rights under the First Amendment. Proponents counter that deplatforming is a legal way of dealing with controversial users online or in other digital spaces, so long as the government is not involved in causing the deplatforming. According to Audie Cornish, host of the NPR show Consider This, modern deplatforming is not a government issue. She states that "the government can't silence your ability to say almost anything you want on a public street corner. But a private company can silence your ability to say whatever you want on a platform they created."

Critical responses

In the words of technology journalist Declan McCullagh, "Silicon Valley's efforts to pull the plug on dissenting opinions" began around 2018 with Twitter, Facebook, and YouTube denying service to selected users of their platforms; he said they devised "excuses to suspend ideologically disfavored accounts". In 2019, McCullagh predicted that paying customers would become targets for deplatforming as well, citing protests and open letters by employees of Amazon, Microsoft, Salesforce, and Google who opposed policies of U.S. Immigration and Customs Enforcement (ICE), and who reportedly sought to influence their employers to deplatform the agency and its contractors.

Law professor Glenn Reynolds dubbed 2018 the "Year of Deplatforming" in an August 2018 article in The Wall Street Journal. Reynolds criticized the decision of "internet giants" to "slam the gates on a number of people and ideas they don't like", naming Alex Jones and Gavin McInnes. Reynolds cited further restrictions on "even mainstream conservative figures" such as Dennis Prager, as well as Facebook's blocking of a campaign advertisement by a Republican candidate "ostensibly because her video mentioned the Cambodian genocide, which her family survived."

In a 2019 The Atlantic article, Conor Friedersdorf described what he called "standard practice" among student activists. He wrote: "Activists begin with social-media callouts; they urge authority figures to impose outcomes that they favor, without regard for overall student opinion; they try to marshal antidiscrimination law to limit freedom of expression." Friedersdorf pointed to evidence of a chilling effect on free speech and academic freedom. Of the faculty members he had contacted for interviews, he said a large majority "on both sides of the controversy insisted that their comments be kept off the record or anonymous. They feared openly participating in a debate about a major event at their institution—even after their university president put out an uncompromising statement in support of free speech."

DNA computing

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/DNA_computing
The biocompatible computing device: Deoxyribonucleic acid (DNA)

DNA computing is an emerging branch of unconventional computing which uses DNA, biochemistry, and molecular biology hardware instead of traditional electronic computing. Research and development in this area concerns the theory, experiments, and applications of DNA computing. Although the field originally started with the demonstration of a computing application by Len Adleman in 1994, it has since expanded into several other avenues, such as the development of storage technologies, nanoscale imaging modalities, and synthetic controllers and reaction networks.

History

Leonard Adleman of the University of Southern California initially developed this field in 1994. Adleman demonstrated a proof-of-concept use of DNA as a form of computation which solved a seven-node instance of the Hamiltonian path problem. Since the initial Adleman experiments, advances have been made and various Turing machines have been proven to be constructible.

Since then the field has expanded into several avenues. In 1995, the idea of DNA-based memory was proposed by Eric Baum, who conjectured that a vast amount of data could be stored in a tiny amount of DNA due to its ultra-high density. This expanded the horizon of DNA computing into the realm of memory technology, although in vitro demonstrations came almost a decade later.

The field of DNA computing can be categorized as a sub-field of the broader field of DNA nanoscience, started by Ned Seeman about a decade before Len Adleman's demonstration. Seeman's original idea in the 1980s was to build arbitrary structures using bottom-up DNA self-assembly for applications in crystallography. However, it morphed into the field of structural DNA self-assembly, which as of 2020 is extremely sophisticated: self-assembled structures ranging from a few nanometers up to several tens of micrometers in size had been demonstrated by 2018.

In 1994, Prof. Seeman's group demonstrated early DNA lattice structures using a small set of DNA components. While the demonstration by Adleman showed the possibility of DNA-based computers, the DNA design did not scale: as the number of nodes in a graph grows, the number of DNA components required in Adleman's implementation grows exponentially. Therefore, computer scientists and biochemists started exploring tile-based assembly, where the goal is to use a small set of DNA strands as tiles to perform arbitrary computations as the assembly grows. Other avenues that were theoretically explored in the late 1990s include DNA-based security and cryptography, the computational capacity of DNA systems, DNA memories and disks, and DNA-based robotics.

In 2003, John Reif's group first demonstrated the idea of a DNA-based walker that traversed along a track similar to a line follower robot. They used molecular biology as a source of energy for the walker. Since this first demonstration, a wide variety of DNA-based walkers have been demonstrated.

Applications, examples, and recent developments

In 1994 Leonard Adleman presented the first prototype of a DNA computer. The TT-100 was a test tube filled with 100 microliters of a DNA solution. He managed to solve an instance of the directed Hamiltonian path problem. In Adleman's experiment, the Hamiltonian path problem was implemented notationally as a "travelling salesman problem". For this purpose, different DNA fragments were created, each of them representing a city that had to be visited. Each of these fragments is capable of linking with the other fragments created. These DNA fragments were produced and mixed in a test tube. Within seconds, the small fragments formed bigger ones, representing the different travel routes. Through a chemical reaction, the DNA fragments representing the longer routes were eliminated. The remains are the solution to the problem, but overall the experiment lasted a week. However, current technical limitations make it difficult to evaluate the results, so the experiment is not suitable for practical applications; it is nevertheless a proof of concept.
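
As a rough software analogue, the following Python sketch mimics Adleman's generate-and-filter strategy on a small, purely hypothetical graph; the city codes, edge encoding, and graph itself are illustrative assumptions rather than a reconstruction of his seven-node experiment.

    import random
    from itertools import permutations

    # Toy graph and encoding (illustrative assumptions, not Adleman's instance):
    # each city gets a random 20-base "code"; an edge strand is the second half
    # of the source code followed by the first half of the destination code, so
    # edge strands can only chain together through matching city codes.
    BASES = "ACGT"
    random.seed(0)

    cities = ["A", "B", "C", "D"]
    edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "B"), ("C", "D"), ("B", "D")]

    code = {c: "".join(random.choice(BASES) for _ in range(20)) for c in cities}
    edge_strand = {(u, v): code[u][10:] + code[v][:10] for (u, v) in edges}

    def assemble(path):
        """Concatenate edge strands along a candidate route of cities."""
        return "".join(edge_strand[(u, v)] for u, v in zip(path, path[1:]))

    # Enumerate candidate routes, then filter -- mirroring Adleman's
    # keep-correct-endpoints / keep-correct-length / keep-all-cities steps.
    start, end = "A", "D"
    for perm in permutations(cities):
        path = list(perm)
        if path[0] != start or path[-1] != end:
            continue                                 # wrong endpoints
        if any((u, v) not in edge_strand for u, v in zip(path, path[1:])):
            continue                                 # route the edges cannot form
        molecule = assemble(path)
        if len(molecule) == 20 * (len(cities) - 1):  # correct length: visits every city once
            print(" -> ".join(path), molecule)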

Combinatorial problems

The first results for these problems were obtained by Leonard Adleman.

Tic-tac-toe game

In 2002, J. Macdonald, D. Stefanović and M. Stojanović created a DNA computer able to play tic-tac-toe against a human player. The calculator consists of nine bins corresponding to the nine squares of the game. Each bin contains a substrate and various combinations of DNA enzymes. The substrate itself is composed of a DNA strand onto which a fluorescent chemical group is grafted at one end and a repressor group at the other. Fluorescence is activated only if the molecules of the substrate are cut in half. The DNA enzymes simulate logical functions. For example, a given DNA enzyme will only become active if two specific types of DNA strand are introduced, reproducing the logic function AND.

By default, the computer is considered to have played first in the central square. The human player starts with eight different types of DNA strands corresponding to the eight remaining squares that may be played. To play square number i, the human player pours into all bins the strands corresponding to input #i. These strands bind to certain DNA enzymes present in the bins, resulting, in one of the bins, in the deformation of a DNA enzyme that binds to the substrate and cuts it. The corresponding bin becomes fluorescent, indicating which square is being played by the DNA computer. The DNA enzymes are divided among the bins in such a way as to ensure that the best the human player can achieve is a draw, as in real tic-tac-toe.
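
In software terms, each bin behaves like a small bank of AND gates evaluated over the set of moves the human has poured in so far. The toy sketch below models only that logic; the gate assignments are hypothetical and do not reproduce the actual distribution used in the tic-tac-toe machine.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class AndGate:
        required_inputs: frozenset     # input strand types that must all be present

        def fires(self, inputs_present):
            return self.required_inputs <= inputs_present

    # Hypothetical bin: it fluoresces (i.e. the DNA computer "plays" this square)
    # if the human has played squares 1 and 9, or squares 3 and 7.
    bin_gates = [AndGate(frozenset({1, 9})), AndGate(frozenset({3, 7}))]

    def bin_fluoresces(gates, inputs_present):
        # The substrate is cleaved (fluorescence) if any gate in the bin activates.
        return any(g.fires(inputs_present) for g in gates)

    print(bin_fluoresces(bin_gates, {1, 9}))   # True: this bin lights up
    print(bin_fluoresces(bin_gates, {1, 3}))   # False: no gate has both its inputs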

Neural network based computing

Kevin Cherry and Lulu Qian at Caltech developed a DNA-based artificial neural network that can recognize handwritten digits encoded as 100-bit patterns. They achieved this by training the network on a conventional computer in advance; the resulting weights are represented by varying concentrations of "weight" molecules, which are later added to the test tube that holds the input DNA strands.
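
Conceptually, the readout is a weighted sum followed by a winner-take-all decision. The sketch below restates that idea in ordinary software with tiny, made-up 9-bit patterns; the encoding and "weights" are illustrative and not those of the published 100-bit system.

    import numpy as np

    # Tiny 9-bit "memories"; the real system used 100-bit patterns.
    patterns = {
        "L": np.array([1, 0, 0, 1, 0, 0, 1, 1, 1]),
        "T": np.array([1, 1, 1, 0, 1, 0, 0, 1, 0]),
    }
    # "Concentrations" acting as weights -- here simply the memorized patterns.
    weights = {name: p.astype(float) for name, p in patterns.items()}

    def classify(x):
        scores = {name: float(w @ x) for name, w in weights.items()}  # weighted sums
        return max(scores, key=scores.get)                            # winner-take-all

    noisy_L = patterns["L"].copy()
    noisy_L[1] = 1                   # corrupt one bit of the "L" pattern
    print(classify(noisy_L))         # still classified as "L"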

Improved speed with Localized (cache-like) Computing

One of the challenges of DNA computing is its speed. While DNA as a substrate is biologically compatible, i.e. it can be used at places where silicon technology cannot, its computation speed is still very slow. For example, the square-root circuit used as a benchmark in the field took over 100 hours to complete. While newer approaches with external enzyme sources report faster and more compact circuits, Chatterjee et al. demonstrated a way to speed up computation through localized DNA circuits, a concept being further explored by other groups. The idea, originally proposed in computer architecture, has been adopted in this field as well: in computer architecture it is well known as the principle of locality, whereby instructions executed in sequence run faster when they are already loaded in the cache, because there is no need to swap them in and out of slower main memory. Similarly, in localized DNA computing, the DNA strands responsible for computation are fixed on a breadboard-like substrate, ensuring physical proximity of the computing gates. Such localized DNA computing techniques have been shown to potentially reduce computation time by orders of magnitude.

Renewable (or reversible) DNA computing

Subsequent research on DNA computing has produced reversible DNA computing, bringing the technology one step closer to the silicon-based computing used in (for example) PCs. In particular, John Reif and his group at Duke University have proposed two different techniques to reuse the computing DNA complexes. The first design uses dsDNA gates, while the second design uses DNA hairpin complexes. While both the designs face some issues (such as reaction leaks), this appears to represent a significant breakthrough in the field of DNA computing. Some other groups have also attempted to address the gate reusability problem.

Using strand displacement reactions (SDRs), reversible proposals are presented in the paper "Synthesis Strategy of Reversible Circuits on DNA Computers" for implementing reversible gates and circuits on DNA computers by combining DNA computing and reversible computing techniques. The paper also proposes a universal reversible gate library (URGL) for synthesizing n-bit reversible circuits on DNA computers, with the average length and cost of the constructed circuits improving on previous methods.

Methods

There are multiple methods for building a computing device based on DNA, each with its own advantages and disadvantages. Most of these build the basic logic gates (AND, OR, NOT) associated with digital logic from a DNA basis. Some of the different bases include DNAzymes, deoxyoligonucleotides, enzymes, and toehold exchange.

Strand displacement mechanisms

The most fundamental operation in DNA computing and molecular programming is the strand displacement mechanism. Currently, there are two ways to perform strand displacement:

  • Toehold mediated strand displacement (TMSD)
  • Polymerase-based strand displacement (PSD)

Toehold exchange

Besides simple strand displacement schemes, DNA computers have also been constructed using the concept of toehold exchange. In this system, an input DNA strand binds to a sticky end, or toehold, on another DNA molecule, which allows it to displace another strand segment from the molecule. This allows the creation of modular logic components such as AND, OR, and NOT gates and signal amplifiers, which can be linked into arbitrarily large computers. This class of DNA computers does not require enzymes or any chemical capability of the DNA.
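
At the domain level, a two-input gate of this kind can be viewed as a cascade of two displacement events: the first input exposes a toehold that the second input then uses to release the output strand. The Python sketch below captures only that bookkeeping; sequences, kinetics, and the exact gate structures reported in the literature are abstracted away.

    class ToeholdAndGate:
        """Releases its output strand only after input1 and then input2 bind."""

        def __init__(self, output):
            self.exposed_toehold = "t1"   # only toehold t1 is initially accessible
            self.output = output
            self.released = False

        def present(self, strand):
            if self.released:
                return None
            if strand == "input1" and self.exposed_toehold == "t1":
                self.exposed_toehold = "t2"   # first displacement exposes toehold t2
            elif strand == "input2" and self.exposed_toehold == "t2":
                self.released = True          # second displacement frees the output
                return self.output
            return None

    gate = ToeholdAndGate(output="signal_C")
    for strand in ["input2", "input1", "input2"]:   # input2 alone does nothing at first
        out = gate.present(strand)
        if out:
            print("released:", out)                 # printed once both inputs have arrived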

Chemical reaction networks (CRNs)

The full stack for DNA computing looks very similar to a traditional computer architecture. At the highest level, a program in a C-like general-purpose language is expressed as a set of chemical reaction networks (CRNs). This intermediate representation is translated to a domain-level DNA design and then implemented using a set of DNA strands. In 2010, Erik Winfree's group showed that DNA can be used as a substrate to implement arbitrary chemical reactions. This opened the way to the design and synthesis of biochemical controllers, since the expressive power of CRNs is equivalent to that of a Turing machine. Such controllers can potentially be used in vivo for applications such as preventing hormonal imbalance.
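
At the CRN level, a "program" is simply a set of reactions with rate constants, which can be prototyped with an ordinary mass-action simulation before any DNA is designed. The toy reactions below are illustrative only and are not drawn from any particular published implementation.

    # Toy reactions (illustrative only):
    #   A + B -> C        (rate k1)   in the limit, C approaches min(A, B)
    #   C     -> C + F    (rate k2)   C catalytically produces a reporter F
    k1, k2 = 1e-3, 1e-2
    conc = {"A": 50.0, "B": 30.0, "C": 0.0, "F": 0.0}   # arbitrary concentration units
    dt, steps = 0.1, 20000

    for _ in range(steps):                  # simple Euler integration of mass-action kinetics
        r1 = k1 * conc["A"] * conc["B"]
        r2 = k2 * conc["C"]
        conc["A"] -= r1 * dt
        conc["B"] -= r1 * dt
        conc["C"] += r1 * dt
        conc["F"] += r2 * dt

    print({k: round(v, 2) for k, v in conc.items()})
    # C ends up close to min(50, 30) = 30: the CRN has "computed" a minimum.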

DNAzymes

Catalytic DNA molecules (deoxyribozymes or DNAzymes) catalyze a reaction when they interact with the appropriate input, such as a matching oligonucleotide. These DNAzymes are used to build logic gates analogous to digital logic in silicon; however, DNAzymes are limited to 1-, 2-, and 3-input gates, with no current implementation for evaluating statements in series.

The DNAzyme logic gate changes its structure when it binds to a matching oligonucleotide and the fluorogenic substrate it is bonded to is cleaved free. While other materials can be used, most models use a fluorescence-based substrate because it is very easy to detect, even at the single molecule limit. The amount of fluorescence can then be measured to tell whether or not a reaction took place. The DNAzyme that changes is then "used", and cannot initiate any more reactions. Because of this, these reactions take place in a device such as a continuous stirred-tank reactor, where old product is removed and new molecules added.

Two commonly used DNAzymes are named E6 and 8-17. These are popular because they allow cleaving of a substrate in any arbitrary location. Stojanovic and MacDonald have used the E6 DNAzymes to build the MAYA I and MAYA II machines, respectively; Stojanovic has also demonstrated logic gates using the 8-17 DNAzyme. While these DNAzymes have been demonstrated to be useful for constructing logic gates, they are limited by the need for a metal cofactor to function, such as Zn2+ or Mn2+, and thus are not useful in vivo.

A design called a stem loop, consisting of a single strand of DNA with a loop at one end, is a dynamic structure that opens and closes when a piece of DNA bonds to the loop part. This effect has been exploited to create several logic gates. These logic gates have been used to create the computers MAYA I and MAYA II, which can play tic-tac-toe to some extent.

Enzymes

Enzyme-based DNA computers are usually of the form of a simple Turing machine; there is analogous hardware, in the form of an enzyme, and software, in the form of DNA.

Benenson, Shapiro and colleagues have demonstrated a DNA computer using the FokI enzyme and expanded on their work by going on to show automata that diagnose and react to prostate cancer: underexpression of the genes PPAP2B and GSTP1 and overexpression of PIM1 and HPN. Their automata evaluated the expression of each gene, one gene at a time, and on a positive diagnosis released a single-stranded DNA molecule (ssDNA) that is an antisense for MDM2. MDM2 is a repressor of protein 53, which itself is a tumor suppressor. On a negative diagnosis, the automaton released a suppressor of the positive-diagnosis drug instead of doing nothing. A limitation of this implementation is that two separate automata are required, one to administer each drug. The entire process of evaluation until drug release took around an hour to complete. This method also requires transition molecules as well as the FokI enzyme to be present; the requirement for the FokI enzyme limits application in vivo, at least for use in "cells of higher organisms". Note, however, that the 'software' molecules can be reused in this case.
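
The diagnostic logic described above can be restated schematically as a rule evaluated one gene at a time. In the sketch below, the gene names come from the text, while the thresholds, expression values, and "release" actions are purely illustrative.

    DIAGNOSTIC_RULE = [
        ("PPAP2B", "under"),   # must be under-expressed
        ("GSTP1",  "under"),
        ("PIM1",   "over"),    # must be over-expressed
        ("HPN",    "over"),
    ]

    def diagnose(expression, low=0.5, high=2.0):
        """Check one gene at a time, like the automaton stepping through its input."""
        for gene, direction in DIAGNOSTIC_RULE:
            level = expression[gene]
            ok = level < low if direction == "under" else level > high
            if not ok:
                return "release suppressor of the drug"   # negative-diagnosis branch
        return "release anti-MDM2 ssDNA"                  # positive diagnosis

    sample = {"PPAP2B": 0.2, "GSTP1": 0.3, "PIM1": 3.1, "HPN": 2.5}
    print(diagnose(sample))   # positive diagnosis on this toy sample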

Algorithmic self-assembly

DNA arrays that display a representation of the Sierpinski gasket on their surfaces (image from Rothemund et al., 2004).

DNA nanotechnology has been applied to the related field of DNA computing. DNA tiles can be designed to contain multiple sticky ends with sequences chosen so that they act as Wang tiles. A DX array has been demonstrated whose assembly encodes an XOR operation; this allows the DNA array to implement a cellular automaton which generates a fractal called the Sierpinski gasket. This shows that computation can be incorporated into the assembly of DNA arrays, increasing its scope beyond simple periodic arrays.
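
The assembly rule itself, in which each new tile is the XOR of the two tiles above it, can be reproduced in a few lines of software, making the connection to the Sierpinski gasket easy to see. This is a software analogue of the tile program, not the wet-lab protocol.

    # Each new cell is the XOR of the two cells above it (the same rule the
    # XOR tiles implement); printing the rows draws a Sierpinski gasket.
    width, rows = 33, 16
    row = [0] * width
    row[width // 2] = 1                      # single seed "tile" in the first row

    for _ in range(rows):
        print("".join("#" if c else "." for c in row))
        row = [row[i - 1] ^ row[(i + 1) % width] for i in range(width)]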

Capabilities

DNA computing is a form of parallel computing in that it takes advantage of the many different molecules of DNA to try many different possibilities at once. For certain specialized problems, DNA computers are faster and smaller than any other computer built so far. Furthermore, particular mathematical computations have been demonstrated to work on a DNA computer.

DNA computing does not provide any new capabilities from the standpoint of computability theory, the study of which problems are computationally solvable using different models of computation. For example, if the space required for the solution of a problem grows exponentially with the size of the problem (EXPSPACE problems) on von Neumann machines, it still grows exponentially with the size of the problem on DNA machines. For very large EXPSPACE problems, the amount of DNA required is too large to be practical.
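
A back-of-the-envelope calculation makes this concrete. Assuming, purely for illustration, one 100-nucleotide strand per candidate solution and an average nucleotide molar mass of roughly 330 g/mol, the required mass of DNA grows as follows.

    AVOGADRO = 6.022e23
    GRAMS_PER_NT_MOL = 330.0     # rough average molar mass of one nucleotide (g/mol)
    STRAND_LENGTH = 100          # assumed nucleotides per candidate strand

    def dna_mass_grams(n_bits):
        """Mass of DNA if each of the 2**n_bits candidate solutions is one strand."""
        return 2 ** n_bits * STRAND_LENGTH * GRAMS_PER_NT_MOL / AVOGADRO

    for n in (20, 40, 60, 80):
        print(f"{n:>2} bits -> {dna_mass_grams(n):.3g} g of DNA")
    # By 80 bits the brute-force library already weighs tens of kilograms.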

Alternative technologies

A partnership between IBM and Caltech was established in 2009 aiming at the production of "DNA chips". A Caltech group is working on the manufacture of these nucleic-acid-based integrated circuits. One of these chips can compute whole square roots. A compiler has been written in Perl.

Pros and cons

The slow processing speed of a DNA computer (response times are measured in minutes, hours, or days, rather than milliseconds) is compensated for by its potential to perform a massive number of parallel computations. This allows the system to take a similar amount of time for a complex calculation as for a simple one, because millions or billions of molecules interact with each other simultaneously. However, it is much harder to analyze the answers given by a DNA computer than by a digital one.

Computer-aided software engineering

From Wikipedia, the free encyclopedia ...