The Shock Doctrine: The Rise of Disaster Capitalism is a 2007 book by Canadian author and social activist Naomi Klein. In the book, Klein argues that neoliberal economic policies promoted by Milton Friedman and the Chicago School of Economics
have risen to global prominence because of a deliberate strategy she
calls "disaster capitalism". In this strategy, political actors exploit
the chaos of natural disasters, wars, and other crises to push through
unpopular policies such as deregulation and privatization. This economic "shock therapy"
favors corporate interests while disadvantaging and disenfranchising
citizens when they are too distracted and overwhelmed to respond or
resist effectively. The book challenges the narrative that free-market capitalist
policies have been welcomed by the inhabitants of regions where they
have been implemented, and it argues that several man-made events,
including the Iraq War, were intentionally undertaken with the goal of pushing through these unpopular policies in their wake.
Some reviewers claimed the book oversimplifies political
phenomena, while others lauded it as a compelling and important work.
The book served as the main source of a 2009 documentary feature film
with the same title directed by Michael Winterbottom.
Synopsis
The book is divided into seven parts with a total of 21 chapters.
Part 1 begins with a chapter on psychiatric shock therapy and the covert experiments conducted by the psychiatrist Ewen Cameron in collusion with the Central Intelligence Agency. The second chapter introduces Milton Friedman and his Chicago school of economics, whom Klein describes as leading a laissez-faire capitalist movement committed to creating free markets that are even less regulated than those that existed before the Great Depression.
Part 2 discusses the use of the "shock doctrine" to transform South American economies in the 1970s, focusing on the 1973 coup in Chile led by General Augusto Pinochet and influenced by the Chicago Boys, a group of Chilean economists who had studied under Friedman at the Chicago School and were funded by the CIA. Klein connects torture with economic shock therapy.
Part 3 covers attempts to apply the shock doctrine without the need for extreme violence against sections of the population. Klein says that Margaret Thatcher applied mild shock "therapy" facilitated by the Falklands War, while free-market reform in Bolivia was possible due to a combination of pre-existing economic crises and the charisma of Jeffrey Sachs.
Part 4 reports on how the shock doctrine was applied in Poland, Russia, South Africa, and the tiger economies of Asia during the 1997 financial crisis.
Part 5 introduces the "Disaster Capitalism Complex", a web of networks and influence employed by private companies that allows them to profit from disasters. She compares this new Disaster Capitalism Complex to the military-industrial complex and explains that both blur the line between private and public, through tactics like the revolving door.
Part 6 discusses the use of "shock and awe" in the 2003 invasion of Iraq and the subsequent occupation of Iraq, which Klein describes as the most comprehensive and full-scale implementation of the shock doctrine ever attempted. The mass privatization of Iraqi state-owned enterprises, with thousands of workers laid off, is argued to have contributed to the insurgency, since many of the newly unemployed became embittered toward the US and joined insurgent groups.
Part 7 is about winners and losers of economic shock therapy – how small groups will often do very well by moving into luxurious gated communities while large sections of the population are left with decaying public infrastructure, declining incomes and increased unemployment. Klein describes economic policy after Hurricane Katrina, after the 2004 Indian Ocean tsunami in Sri Lanka, and the apartheid-style policy of the Israeli government toward Palestinians.
The Conclusion details the backlash against the "shock doctrine" and economic institutions which, in Klein's view, encourage it – like the World Bank and IMF.
South America and Lebanon post-2006 are shown in a positive light,
where politicians are already rolling back free-market policies, with
some mention of the increased campaigning by community-minded activists
in South Africa and China.
Reactions
Favourable
Paul B. Farrell of Dow Jones Business News argued that The Shock Doctrine "may be the most important book on economics in the 21st century." In The Guardian, John Gray
hailed it as one of the "very few books that really help us understand
the present", describing the work as "both timely and devastating". William S. Kowinski of the San Francisco Chronicle praised Klein's prose and wrote that the author "may well have revealed the master narrative of our time." In The Irish Times, Tom Clonan reported that she "systematically and calmly demonstrates to the reader" the way in which neoconservative figures were intimately linked to seismic events that "resulted in the loss of millions of lives."
In the Los Angeles Times,
Richard Rayner opined, "Not everybody's going to agree with her, but
this is reporting and history-writing in the tradition of Izzy Stone and Upton Sinclair. Klein upends assumptions and demands that we think – her book is thrilling, troubling and very dark." Stephen Amidon of the New York Observer affirmed the applicability of Klein's thesis to the Iraq War
and argued, "Seen through the lens of Naomi Klein's analysis, [it]
makes horrifying sense, right down to Mr. Rumsfeld's decision to allow
the looting of the nation's cultural identity." Shashi Tharoor noted the work's "meticulous endnotes" and stated, referring to globalization, that Klein "has established herself as its principal naysayer." Katy Guest of The Independent praised the book as "a compelling account of the way big business and politics use global disasters for their own ends." Juan Santos called the book "as gripping as
the best murder mystery, as well researched as the best investigative
journalism – on a par with the work of a Seymour Hersh."
The Nobel laureate and former Chief Economist of the World Bank, Joseph Stiglitz, wrote a review of The Shock Doctrine for The New York Times,
calling the parallel between economic shock therapy and the
psychological experiments conducted by Ewen Cameron "overdramatic and
unconvincing" and claiming that "Klein is not an academic and cannot be
judged as one. There are many places in her book where she
oversimplifies." He also said, "the case against these policies is even
stronger than the one Klein makes" and that the book contains "a rich
description of the political machinations required to force unsavory
economic policies on resisting countries."
Shashi Tharoor in The Washington Post wrote that The Shock Doctrine takes Klein's criticism of capitalism an important step further. He
also said Klein "is too ready to see conspiracies where others might
discern little more than the all-too-human pattern of chaos and
confusion, good intentions and greed."
Sociologists such as Ulrich Beck envisioned the risk society as a new cultural value that treated risk as a commodity to be exchanged in globalized economies. As Klein observed, this suggested that disasters and the capitalist economy were inevitably entwined. Some voices have praised Klein's contributions to the study of the "spectacle of disasters".
Unfavourable
In the London Review of Books, Stephen Holmes criticizes The Shock Doctrine as naïve, and opines that it conflates "'free market orthodoxy' with predatory corporate behaviour." John Willman of the Financial Times
describes it as "a deeply flawed work that blends together disparate
phenomena to create a beguiling – but ultimately dishonest – argument." Tom Redburn in The New York Times states that "what she is most blind to is the necessary role of entrepreneurial capitalism in overcoming the inherent tendency of any established social system to lapse into stagnation."
Jonathan Chait wrote in The New Republic that Klein "pays shockingly (but, given her premises, unsurprisingly) little attention to right-wing ideas. She recognizes that neoconservatism sits at the heart of the Iraq war project, but she does not seem to know what neoconservatism is; and she makes no effort to find out." Robert Cole from The Times said, "Klein derides the 'disaster capitalism complex' and the profits and privatisations that go with it but she does not supply a cogently argued critique of free market principles, and without this The Shock Doctrine descends into a muddle of stories that are often worrying, sometimes interesting, and occasionally bizarre."
Economist Tyler Cowen, who called Klein's arguments "ridiculous" and the book a "true economics disaster", wrote in The New York Sun that the book contains "a series of fabricated claims, such as the suggestion that Margaret Thatcher created the Falkland Islands crisis to crush the unions and foist unfettered capitalism upon an unwilling British public." Johan Norberg of the libertarian Cato Institute
criticizes the book, saying that "Klein's analysis is hopelessly flawed
at virtually every level." Norberg finds fault with specifics of the
analysis, such as with the Chinese government crackdown on the Tiananmen Square protests of 1989.
He argues that, rather than crushing opposition to pro-market reforms
(as Klein would have it), the crackdown itself caused liberalization to
stall for years.
Klein responded on her website to both Norberg and Chait, stating that
both had misrepresented her positions. Klein wrote that Norberg had
erected a straw man by claiming that her book is about one man, Friedman, but that it is in fact about a "multifaceted ideological trend".
Norberg again responded that Klein "actually defends only one of her
central claims that I criticized. Instead, she gives the impression that
I have just tried to find small mistakes here and there in her book."
He went on to say that the numbers Klein supplied in her reply reveal
the statistics in her central argument to be "rubbish".
Later comments
In a piece related to the COVID-19 pandemic, Klein wrote in 2020 that a "Pandemic Shock Doctrine" was beginning to emerge and called it the "Screen New Deal".
Disinformation
Disinformation is misleading content deliberately spread to deceive people or to secure economic or political gain, and which may cause public harm. It is an orchestrated adversarial activity in which actors employ strategic deceptions and media manipulation tactics to advance political, military, or commercial goals. Disinformation is implemented through attacks that "weaponize multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value judgements—to exploit and amplify culture wars and other identity-driven controversies."
In contrast, misinformation refers to inaccuracies that stem from inadvertent error. Misinformation can become disinformation when known misinformation is purposefully and intentionally disseminated. "Fake news" has sometimes been categorized as a type of disinformation, but scholars have advised against using the two terms interchangeably, and against using "fake news" at all in academic writing, since politicians have weaponized it to describe any unfavorable news coverage or information.
Etymology
The English word disinformation comes from the application of the Latin prefix dis- to information, giving the meaning "reversal or removal of information". The rarely used word had appeared with this usage in print at least as far back as 1887.
Some consider it a loan translation of the Russian дезинформация, transliterated as dezinformatsiya, apparently derived from the title of a KGB black propaganda department. Soviet planners in the 1950s defined disinformation as "dissemination (in
the press, on the radio, etc.) of false reports intended to mislead public opinion."
Disinformation first made an appearance in dictionaries in 1985, specifically, Webster's New College Dictionary and the American Heritage Dictionary. In 1986, the term disinformation was not defined in Webster's New World Thesaurus or New Encyclopædia Britannica.
After the Soviet term became widely known in the 1980s, native speakers
of English broadened the term as "any government communication (either
overt or covert) containing intentionally false and misleading material,
often combined selectively with true information, which seeks to
mislead and manipulate either elites or a mass audience."
By 1990, use of the term disinformation had fully established itself in the English language within the lexicon of politics. By 2001, the term disinformation had come to be known as simply a more civil phrase for saying someone was lying. Stanley B. Cunningham wrote in his 2002 book The Idea of Propaganda that disinformation had become pervasively used as a synonym for propaganda.
Operationalization
The Shorenstein Center
at Harvard University defines disinformation research as an academic
field that studies "the spread and impacts of misinformation,
disinformation, and media manipulation," including "how it spreads
through online and offline channels, and why people are susceptible to
believing bad information, and successful strategies for mitigating its
impact" According to a 2023 research article published in New Media & Society, disinformation circulates on social media through deception campaigns implemented in multiple ways including: astroturfing, conspiracy theories, clickbait, culture wars, echo chambers, hoaxes, fake news, propaganda, pseudoscience, and rumors.
Activities that operationalize disinformation campaigns online
In order to distinguish between similar terms, including misinformation and malinformation, scholars collectively agree on the following definitions: (1) disinformation is the strategic dissemination of false information with the intention to cause public harm; (2) misinformation represents the unintentional spread of false information; and (3) malinformation is factual information disseminated with the intention to cause harm. Together, these terms are abbreviated 'DMMI'.
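Because the taxonomy turns on just two properties of a message, falsity and intent to harm, it can be read as a small decision table. The following is a minimal sketch under that assumption; the Message type and classify_dmmi function are hypothetical illustrations, not drawn from the literature:

```python
# Toy model of the DMMI taxonomy: classification depends on whether the
# content is false and whether it is spread with intent to cause harm.
from dataclasses import dataclass

@dataclass
class Message:
    is_false: bool       # is the content factually false?
    intends_harm: bool   # is it disseminated with intent to cause harm?

def classify_dmmi(msg: Message) -> str:
    """Map a message onto the disinformation/misinformation/malinformation scheme."""
    if msg.is_false and msg.intends_harm:
        return "disinformation"  # strategic falsehood meant to cause public harm
    if msg.is_false:
        return "misinformation"  # false, but spread without harmful intent
    if msg.intends_harm:
        return "malinformation"  # factual content weaponized to cause harm
    return "information"         # true content, no harmful intent

# Example: a fabricated story planted to smear a public figure.
print(classify_dmmi(Message(is_false=True, intends_harm=True)))  # -> disinformation
```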
In 2019, Camille François devised the "ABC" framework of understanding different modalities of online disinformation:
Manipulative Actors, who "engage knowingly and with clear
intent in viral deception campaigns" that are "covert, designed to
obfuscate the identity and intent of the actor orchestrating them."
Examples include personas such as Guccifer 2.0, Internet trolls, state media, and military operatives.
Deceptive Behavior, which "encompasses the variety of techniques viral deception actors may use to enhance and exaggerate the reach, virality and impact of their campaigns." Examples include troll farms, Internet bots, astroturfing, and "paid engagement".
Harmful Content, the framework's third element: the deceptive or damaging material that such campaigns distribute.
In 2020, the Brookings Institution proposed amending this framework to include Distribution, defined by the "technical protocols that enable, constrain, and shape user behavior in a virtual space". Similarly, the Carnegie Endowment for International Peace proposed adding Degree ("distribution of the content ... and the audiences it reaches") and Effect ("how much of a threat a given case poses").
Comparisons with propaganda
Whether and to what degree disinformation and propaganda overlap is subject to debate. Some (like the U.S. Department of State) define propaganda as the use of non-rational arguments to either advance or undermine a political ideal and use disinformation as an alternative name for undermining propaganda, while others consider them to be separate concepts altogether.
One popular distinction holds that disinformation also describes
politically motivated messaging designed explicitly to engender public
cynicism, uncertainty, apathy, distrust, and paranoia, all of which
disincentivize citizen engagement and mobilization for social or
political change.
Practice
Disinformation is the label often given to foreign information manipulation and interference (FIMI).
Studies on disinformation are often concerned with the content of
activity whereas the broader concept of FIMI is more concerned with the
"behaviour of an actor" that is described through the military doctrine concept of tactics, techniques, and procedures (TTPs).
Disinformation is primarily carried out by government intelligence agencies, but has also been used by non-governmental organizations and businesses. Front groups are a form of disinformation, as they mislead the public about their true objectives and who their controllers are. Most recently, disinformation has been deliberately spread through social media in the form of "fake news": disinformation masked as legitimate news articles and meant to mislead readers or viewers. Disinformation may include the distribution of forged documents, manuscripts, and photographs, or the spreading of dangerous rumours and fabricated intelligence. Use of these tactics can lead to blowback, however, causing unintended consequences such as defamation lawsuits or damage to the dis-informer's reputation.
Worldwide
Soviet disinformation
Use of disinformation as a Soviet tactical weapon started in 1923, when it became a tactic used in the Soviet political warfare called active measures.
Russian disinformation
Russian disinformation campaigns have occurred in many countries. For example, disinformation campaigns led by Yevgeny Prigozhin have been reported in several African countries. Russia, however, denies that it uses disinformation to influence public opinion.
Russian campaigns often aim to disrupt domestic politics within Europe and the United States in an attempt to weaken the West, reflecting Russia's long-standing commitment to fight "Western imperialism" and shift the balance of world power toward itself and its allies. According to the Voice of America, Russia seeks to promote American isolationism, border security concerns, and racial tensions within the United States through its disinformation campaigns.
Chinese disinformation
Spamouflage, Dragonbridge, Spamouflage Dragon, Storm 1376, or Taizi Flood is an online propaganda and disinformation operation that has used a network of social media accounts since 2017 to post in favor of the Chinese government and to harass dissidents and journalists overseas. Beginning in the early 2020s, Spamouflage accounts also began making posts about American and Taiwanese politics. The Chinese government, particularly the Ministry of Public Security, is widely believed to be behind the network. Spamouflage has increasingly used generative artificial intelligence for influence operations. The campaign has largely failed to attract views from real users, although it has drawn some organic engagement using new tactics.
American disinformation
The United States Intelligence Community appropriated use of the term disinformation in the 1950s from the Russian dezinformatsiya and began to use similar strategies during the Cold War and in conflict with other nations. The New York Times reported in 2000 that during the CIA's effort to substitute Mohammad Reza Pahlavi for then-Prime Minister of Iran Mohammad Mossadegh, the CIA placed fictitious stories in the local newspaper. Reuters documented how, subsequent to the 1979 Soviet invasion of Afghanistan during the Soviet–Afghan War, the CIA put false articles in newspapers of Islamic-majority countries, inaccurately stating that Soviet embassies had "invasion day celebrations". Reuters noted that a former U.S. intelligence officer said operatives would attempt to gain the confidence of reporters and use them as secret agents to affect a nation's politics by way of their local media.
In October 1986, the term gained increased currency in the U.S. when it was revealed that two months previously, the Reagan Administration had engaged in a disinformation campaign against then-leader of Libya, Muammar Gaddafi. White House representative Larry Speakes said reports of a planned attack on Libya as first broken by The Wall Street Journal on August 25, 1986, were "authoritative", and other newspapers including The Washington Post then wrote articles saying this was factual. U.S. State Department representative Bernard Kalb
resigned from his position in protest over the disinformation campaign,
and said: "Faith in the word of America is the pulse beat of our
democracy."
The executive branch of the Reagan administration kept watch on disinformation campaigns through three yearly publications by the Department of State: Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns (1986); Report on Active Measures and Propaganda, 1986–87 (1987); and Report on Active Measures and Propaganda, 1987–88 (1989).
According to a report by Reuters, the United States ran a propaganda campaign to spread disinformation about the Sinovac Chinese COVID-19
vaccine, including using fake social media accounts to spread the
disinformation that the Sinovac vaccine contained pork-derived
ingredients and was therefore haram under Islamic law. Reuters said the ChinaAngVirus disinformation campaign
was designed to "counter what it perceived as China's growing influence
in the Philippines" and was prompted by the "[fear] that China's COVID diplomacy and propaganda could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing". The campaign was also described as "payback for Beijing's efforts to blame Washington for the pandemic". The campaign primarily targeted people in the Philippines and used a social media hashtag for "China is the virus" in Tagalog. The campaign ran from 2020 to mid-2021. The primary contractor for the U.S. military on the project was General Dynamics IT, which received $493 million for its role.
Response
Responses from cultural leaders
Pope Francis condemned disinformation in a 2016 interview, after being made the subject of a fake news website during the 2016 U.S. election cycle which falsely claimed that he supported Donald Trump. He stated that the worst thing the news media could do was spread disinformation and said the act was a sin, comparing those who spread disinformation to individuals who engage in coprophilia.
Ethics in warfare
In a contribution to the 2014 book Military Ethics and Emerging Technologies, writers David Danks and Joseph H. Danks discuss the ethical implications in using disinformation as a tactic during information warfare. They note there has been a significant degree of philosophical debate over the issue as related to the ethics of war and use of the technique. The writers describe a position whereby the use of disinformation is occasionally allowed, but not in all situations. Typically the ethical test to consider is whether the disinformation was performed out of a motivation of good faith and acceptable according to the rules of war. By this test, the tactic during World War II of putting fake inflatable tanks in visible locations on the Pacific Islands
in order to falsely present the impression that there were larger
military forces present would be considered ethically permissible.
Conversely, disguising a munitions plant as a healthcare facility in
order to avoid attack would be outside the bounds of acceptable use of
disinformation during war.
Research
Research related to disinformation studies is increasing as an applied area of inquiry. Advocates have called for formally classifying disinformation as a cybersecurity threat because of its rise on social networking sites.
Of the many social media websites, Facebook and Twitter showed the most activity in terms of active disinformation campaigns. Reported techniques included the use of bots to amplify hate speech, the illegal harvesting of data, and paid trolls who harass and threaten journalists.
Whereas disinformation research has focused primarily on how actors orchestrate deceptions on social media, chiefly via fake news, new research investigates how people take what started as deceptions and circulate them as their personal views.
As a result, research shows that disinformation can be conceptualized
as a program that encourages engagement in oppositional fantasies (i.e.,
culture wars), through which disinformation circulates as rhetorical ammunition for never-ending arguments. As disinformation entangles with culture wars, identity-driven controversies constitute a vehicle through which disinformation disseminates on social media.
This means that disinformation thrives, not despite raucous grudges but
because of them. The reason is that controversies provide fertile
ground for never-ending debates that solidify points of view.
Scholars have pointed out that disinformation is not only a
foreign threat as domestic purveyors of disinformation are also
leveraging traditional media outlets such as newspapers, radio stations,
and television news media to disseminate false information. Current research suggests right-wing online political activists in the United States may be more likely to use disinformation as a strategy and tactic.
Governments have responded with a wide range of policies to address concerns about the potential threats that disinformation poses to democracy. However, there is little agreement in elite policy discourse or in the academic literature as to what it means for disinformation to threaten democracy, or how different policies might help to counter its negative implications.
Consequences of exposure to disinformation online
There is a broad consensus amongst scholars that there is a high degree of disinformation, misinformation, and propaganda online; however, it is unclear what effect such disinformation has on political attitudes in the public and, therefore, on political outcomes. This conventional wisdom has come mostly from investigative journalists, with a particular rise during the 2016 U.S. election: some of the earliest work came from Craig Silverman at Buzzfeed News. Cass Sunstein supported this in #Republic, arguing that the internet would become rife with echo chambers and informational cascades of misinformation, leading to a highly polarized and ill-informed society.
Research after the 2016 election found: (1) for 14 percent of Americans social media was their "most important" source of election news; (2) known false news stories "favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times"; (3) the average American adult saw fake news stories, "with just over half of those who recalled seeing them believing them"; and (4) people are more likely to "believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks."
Correspondingly, whilst there is wide agreement that the digital spread
and uptake of disinformation during the 2016 election was massive and
very likely facilitated by foreign agents, there is an ongoing debate on
whether all this had any actual effect on the election. For example, a double-blind randomized control experiment by researchers from the London School of Economics (LSE) found that exposure to online fake news about either Trump or Clinton had no significant effect on intentions to vote for those candidates. Researchers who examined the
influence of Russian disinformation on Twitter during the 2016 US
presidential campaign found that exposure to disinformation was (1)
concentrated among a tiny group of users, (2) primarily among
Republicans, and (3) eclipsed by exposure to legitimate political news
media and politicians. Finally, they find "no evidence of a meaningful
relationship between exposure to the Russian foreign influence campaign
and changes in attitudes, polarization, or voting behavior."
As such, despite its mass dissemination during the 2016 presidential election, online fake news or disinformation probably did not cost Hillary Clinton the votes needed to secure the presidency.
Research on this topic remains inconclusive; for example, misinformation appears not to significantly change the political knowledge of those exposed to it.
There seems to be a higher level of diversity of news sources that
users are exposed to on Facebook and Twitter than conventional wisdom
would dictate, as well as a higher frequency of cross-spectrum
discussion. Other evidence has found that disinformation campaigns rarely succeed in altering the foreign policies of the targeted states.
Research is also challenging because disinformation is meant to
be difficult to detect and some social media companies have discouraged
outside research efforts.
For example, researchers found disinformation made "existing detection
algorithms from traditional news media ineffective or not
applicable...[because disinformation] is intentionally written to
mislead readers...[and] users' social engagements with fake news produce
data that is big, incomplete, unstructured, and noisy." Facebook, the largest social media company, has been criticized by analytical journalists and scholars for preventing outside research of disinformation.
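To make concrete the kind of "existing detection algorithm" the passage says falls short, here is a minimal sketch of a conventional surface-level text classifier; the training examples are invented and the approach is illustrative only, not any cited researcher's actual system:

```python
# A toy bag-of-words fake-news detector: the style of classifier built for
# traditional news text, which the passage notes is easily misled because
# disinformation is intentionally written to mimic legitimate news.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (1 = disinformation, 0 = legitimate news).
texts = [
    "SHOCKING: secret lab admits vaccine contains mind-control chips",
    "Officials confirm the bridge will close for repairs next week",
    "They don't want you to know the election was decided in advance",
    "The central bank left interest rates unchanged on Thursday",
]
labels = [1, 0, 1, 0]

detector = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
detector.fit(texts, labels)

# Deliberately deceptive copy imitates a sober news register, so the
# lexical features the model relies on carry little signal.
print(detector.predict(["Report: officials quietly confirm chips in vaccine"]))
```

The design point this illustrates: because disinformation deliberately mimics the surface style of legitimate reporting, lexical features of this kind provide weak evidence, which is why researchers turn to users' social engagement data, data that is, as quoted above, big, incomplete, unstructured, and noisy.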
Alternative perspectives and critiques
Researchers
have criticized the framing of disinformation as being limited to
technology platforms, removed from its wider political context and
inaccurately implying that the media landscape was otherwise
well-functioning.
"The field possesses a simplistic understanding of the effects of media
technologies; overemphasizes platforms and underemphasizes politics;
focuses too much on the United States and Anglocentric analysis; has a
shallow understanding of political culture and culture in general; lacks
analysis of race, class, gender, and sexuality as well as status,
inequality, social structure, and power; has a thin understanding of
journalistic processes; and, has progressed more through the exigencies
of grant funding than the development of theory and empirical findings."
Alternative perspectives have been proposed:
Moving beyond fact-checking and media literacy to study a pervasive phenomenon as something that involves more than news consumption.
Moving beyond technical solutions, including AI-enhanced fact-checking, to understand the systemic basis of disinformation.
Developing a theory that goes beyond Americentrism, to develop a global perspective, understand cultural imperialism and Third World dependency on Western news, and understand disinformation in the Global South.
Developing market-oriented disinformation research that examines the financial incentives and business models that nudge content creators and digital platforms to circulate disinformation online.
Developing understandings of gender-based disinformation (GBD), defined as "the dissemination of false or misleading information attacking women (especially political leaders, journalists and public figures), basing the attack on their identity as women."
The research literature on how disinformation spreads is growing. Studies show that disinformation spread on social media can be classified into two broad stages: seeding and echoing. "Seeding" is when malicious actors strategically insert deceptions, like fake news, into a social media ecosystem, and "echoing" is when the audience disseminates disinformation argumentatively as their own opinions, often by incorporating disinformation into a confrontational fantasy.
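Read as a process, the two stages can be illustrated with a toy diffusion sketch; the account counts, probabilities, and mixing assumptions below are hypothetical, not parameters from the cited studies:

```python
# Toy two-stage model: "seeding" by a few malicious accounts, then
# "echoing" by ordinary users who restate the deception as their own view.
import random

random.seed(42)

NUM_USERS = 1000          # hypothetical audience size
SEED_ACCOUNTS = 5         # malicious actors who plant the deception
ECHO_PROBABILITY = 0.08   # chance an exposed user echoes it as opinion
CONTACTS_PER_POST = 20    # users reached by each post or re-post

exposed: set[int] = set()
echoers: set[int] = set()

# Stage 1 (seeding): malicious actors insert the deception into the feed.
frontier = list(range(SEED_ACCOUNTS))

# Stage 2 (echoing): exposed users recirculate it argumentatively,
# extending the campaign's reach beyond the original seed accounts.
while frontier:
    next_frontier = []
    for _ in frontier:
        for _ in range(CONTACTS_PER_POST):
            user = random.randrange(SEED_ACCOUNTS, NUM_USERS)
            if user not in exposed:
                exposed.add(user)
                if random.random() < ECHO_PROBABILITY:
                    echoers.add(user)
                    next_frontier.append(user)
    frontier = next_frontier

print(f"exposed: {len(exposed)} users, echoed as own opinion: {len(echoers)}")
```

Even in this crude sketch, most of the eventual exposure comes from the echoing stage rather than from the seed accounts themselves, which mirrors the claim that audiences, not only the original actors, drive circulation.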
Internet manipulation
Internet manipulation is the co-optation of online digital technologies, including algorithms, social bots, and automated scripts, for commercial, social, military, or political purposes. Internet and social media manipulation are the prime vehicles for spreading disinformation due to the importance of digital platforms for media consumption and everyday communication. When employed for political purposes, internet manipulation may be used to steer public opinion, polarise citizens, circulate conspiracy theories, and silence political dissidents. Internet manipulation can also be done for profit, for instance, to harm corporate or political adversaries and improve brand reputation. Internet manipulation is sometimes also used to describe the selective enforcement of Internet censorship or selective violations of net neutrality.
Internet manipulation for propaganda purposes with the help of data analysis and internet bots in social media is called computational propaganda.
Studies show four main methods of seeding disinformation online: