Tuesday, February 20, 2024

Popular culture

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Popular_culture

Popular culture (also called mass culture or pop culture) is generally recognized by members of a society as a set of practices, beliefs, artistic output (also known as popular art or mass art) and objects that are dominant or prevalent in a society at a given point in time. Popular culture also encompasses the activities and feelings produced as a result of interaction with these dominant objects. The primary driving forces behind popular culture, especially when speaking of Western popular cultures, are the media, mass appeal, marketing and capitalism; and it is produced by what philosopher Theodor Adorno refers to as the "culture industry".

Heavily influenced in modern times by mass media, this collection of ideas permeates the everyday lives of people in a given society, and so popular culture has a way of influencing an individual's attitudes towards certain topics. There are, however, various ways to define pop culture, and different people across different contexts define it in a variety of conflicting ways. It is generally viewed in contrast to other forms of culture such as folk culture, working-class culture, or high culture, and also from different academic perspectives such as psychoanalysis, structuralism, postmodernism, and more. The common pop-culture categories are entertainment (such as film, music, television and video games), sports, news (as in people/places in the news), politics, fashion, technology, and slang.

History

In the past, folk culture functioned analogously to the popular culture of the masses and of the nations.

The phrase "popular culture" was coined in the 19th century or earlier. Traditionally, popular culture was associated with poor education and with the lower classes, as opposed to the "official culture" and higher education of the upper classes. With the rise of the Industrial Revolution in the eighteenth and nineteenth centuries, Britain experienced social changes that resulted in increased literacy rates, and with the rise of capitalism and industrialization, people began to spend more money on entertainment, such as (commercialised) pubs and sports. Reading also gained traction. Labeling penny dreadfuls the Victorian equivalent of video games, The Guardian in 2016 described penny fiction as "Britain's first taste of mass-produced popular culture for the young". A growing consumer culture and an increased capacity for travel via the newly invented railway (the first public railway, Stockton and Darlington Railway, opened in north-east England in 1825) created both a market for cheap popular literature and the ability for its distribution on a large scale. The first penny serials were published in the 1830s to meet the growing demand.

The stress on the distinction from "official culture" became more pronounced towards the end of the 19th century, a usage that became established by the interbellum period.

From the end of World War II, following major cultural and social changes brought by mass media innovations, the meaning of "popular culture" began to overlap with the connotations of "mass culture", "media culture", "image culture", "consumer culture", and "culture for mass consumption".

The abbreviated form "pop" for "popular", as in "pop music", dates from the late 1950s. Although the terms "pop" and "popular" are in some cases used interchangeably, and their meanings partially overlap, the term "pop" is narrower: "pop" is specific to something containing qualities of mass appeal, while "popular" refers to whatever has gained popularity, regardless of its style.

Definition

According to author John Storey, there are various definitions of popular culture. The quantitative definition of culture has the problem that too much "high culture" (e.g., television dramatizations of Jane Austen) is also "popular". "Pop culture" is also defined as the culture that is "left over" when we have decided what high culture is. However, many works straddle the boundaries, e.g., the works of William Shakespeare, Charles Dickens, Leo Tolstoy, and George Orwell.

A third definition equates pop culture with "mass culture" and ideas. This is seen as a commercial culture, mass-produced for mass consumption by mass media. From a Western European perspective, this may be compared to American culture. Alternatively, "pop culture" can be defined as an "authentic" culture of the people, but this can be problematic as there are many ways of defining the "people." Storey argued that there is a political dimension to popular culture; neo-Gramscian hegemony theory "... sees popular culture as a site of struggle between the 'resistance' of subordinate groups in society and the forces of 'incorporation' operating in the interests of dominant groups in society." A postmodernist approach to popular culture would "no longer recognize the distinction between high and popular culture."

Storey claims that popular culture emerged from the urbanization of the Industrial Revolution. Studies of Shakespeare (by Weimann, Barber, or Bristol, for example) locate much of the characteristic vitality of his drama in its participation in Renaissance popular culture, while contemporary practitioners like Dario Fo and John McGrath use popular culture in its Gramscian sense that includes ancient folk traditions (the commedia dell'arte for example).

Popular culture is constantly evolving and occurs uniquely in place and time. It forms currents and eddies and represents a complex of mutually interdependent perspectives and values that influence society and its institutions in various ways. For example, certain currents of pop culture may originate from (or diverge into) a subculture, representing perspectives with which the mainstream popular culture has only limited familiarity. Items of popular culture most typically appeal to a broad spectrum of the public. Important contemporary contributions to understanding what popular culture means have been made by the German researcher Ronald Daus, who studies the impact of extra-European cultures in North America, Asia, and especially in Latin America.

Levels

Within the realm of popular culture there exists an organizational culture. From its beginning, popular culture has revolved around classes in society and the push-back between them. Within popular culture, two levels have emerged: high and low. High culture can be described as art and works considered of superior value historically, aesthetically, and socially. Low culture has historically been regarded by some as the culture of the lower classes.

Folklore

Adaptations based on traditional folklore provide a source of popular culture. This early layer of cultural mainstream still persists today, in a form separate from mass-produced popular culture, propagating by word of mouth rather than via mass media, e.g. in the form of jokes or urban legends. With the widespread use of the Internet from the 1990s, the distinction between mass media and word-of-mouth has become blurred.

Although the folkloric element of popular culture engages heavily with the commercial element, communities amongst the public have their own tastes, and they may not always embrace every cultural or subcultural item sold. Moreover, certain beliefs and opinions about the products of commercial culture may spread by word of mouth and become modified in the process, in the same manner that folklore evolves.

Criticism

Popular culture in the West has been critiqued as a system of commercialism that privileges products selected and mass-marketed by the upper-class capitalist elite. Such criticisms come most notably from Marxist theorists such as Herbert Marcuse, Theodor Adorno, Max Horkheimer, bell hooks, Antonio Gramsci, Guy Debord, Fredric Jameson, and Terry Eagleton, as well as from certain postmodern philosophers such as Jean-François Lyotard, who has written about the commercialisation of information under capitalism, and Jean Baudrillard.

The culture industry

The most influential critiques of popular culture came from Marxist theorists of the Frankfurt School during the twentieth century. Theodor Adorno and Max Horkheimer analyzed the dangers of the culture industry in their influential work Dialectic of Enlightenment, drawing upon the works of Kant, Marx, Nietzsche and others. Capitalist popular culture, as Adorno argued, was not an authentic culture of the people but a system of homogeneous and standardized products manufactured in the service of capitalist domination by the elite. The consumer demand for Hollywood films, pop tunes, and consumable books is influenced by capitalist industries like Hollywood and the elite who decide which commodities are to be promoted in the media, including television and print journalism. Adorno wrote, "The industry bows to the vote it has itself rigged". It is the elite who commodify products in accordance with their narrow ideological values and criteria, and Adorno argues that the audience becomes accustomed to these formulaic conventions, making intellectual contemplation impossible. Adorno's work has had a considerable influence on cultural studies, philosophy, and the New Left.

Writing in the New Yorker in 2014, music critic Alex Ross argued that Adorno's work has a renewed importance in the digital age: "The pop hegemony is all but complete, its superstars dominating the media and wielding the economic might of tycoons ... Culture appears more monolithic than ever, with a few gigantic corporations—Google, Apple, Facebook, Amazon—presiding over unprecedented monopolies". There is much scholarship on how Western entertainment industries strengthen transnational capitalism and reinforce Western cultural dominance; rather than being a local culture, commercial entertainment is artificially reinforced by transnational media corporations.

Jack Zipes, a professor of German and literature, critiqued the mass commercialization and corporate hegemony behind the Harry Potter franchise. He argued that the commodities of the culture industry are "popular" because they are homogenous and obey standard conventions; the media then influences the tastes of children. In his analysis of Harry Potter's global brand, Zipes wrote, "It must conform to the standards of exception set by the mass media and promoted by the culture industry in general. To be a phenomenon means that a person or commodity must conform to the hegemonic groups that determine what makes up a phenomenon."

Imperialism

According to John M. MacKenzie, many products of popular culture have been designed to promote imperialist ideologies and to glorify the British upper classes rather than present a democratic view of the world. Although there are many films which do not contain such propaganda, there have been many films that promote racism and militarist imperialism.

Feminist critique

bell hooks, an influential feminist, argues that commercial commodities and celebrities cannot be symbols of progressiveness when they collaborate with imperialist capitalism and promote ideals of beauty; hooks uses Beyoncé as an example of a commodity reinforced by capitalist corporations complicit in imperialism and patriarchy.

Propaganda

Edward S. Herman and Noam Chomsky critiqued the mass media in their 1988 work Manufacturing Consent: The Political Economy of the Mass Media. They argue that the mass media is controlled by a powerful hegemonic elite who, motivated by their own interests, determine and manipulate what information is present in the mainstream. The mass media is therefore a system of propaganda.

In sum, a propaganda approach to media coverage suggests a systematic and highly political dichotomization in news coverage based on serviceability to important domestic power interests. This should be observable in dichotomized choices of story and in the volume and quality of coverage... such dichotomization in the mass media is massive and systematic: not only are choices for publicity and suppression comprehensible in terms of system advantage, but the modes of handling favored and inconvenient materials (placement, tone, context, fullness of treatment) differ in ways that serve political ends.

Consumerism

According to the postmodern sociologist Jean Baudrillard, the individual is trained into the duty of seeking the relentless maximization of pleasure lest he or she become asocial. Therefore, "enjoyment" and "fun" become indistinguishable from the need to consume. Whereas the Frankfurt School believed consumers were passive, Baudrillard argued that consumers were trained to consume products in the form of active labor in order to achieve upward social mobility. Thus, consumers under capitalism are trained to purchase products such as pop albums and consumable fiction in order to signal their devotion to social trends, fashions, and subcultures. Although the consumption may arise from an active choice, the choice is still the consequence of a social conditioning that the individual is unconscious of. Baudrillard says, "One is permanently governed by a code whose rules and meaning—constraints—like those of language—are, for the most part, beyond the grasp of individuals".

Jean Baudrillard argued that the vague conception "Public Opinion" is a subjective and inaccurate illusion, for it attributes a sovereignty to consumers that they do not really have. In Baudrillard's understanding, the products of capitalist popular culture can only give the illusion of rebellion, since they are still produced by a system controlled by the powerful. Baudrillard stated in an interview, critiquing the content and production of The Matrix:

The Matrix paints the picture of a monopolistic superpower, like we see today, and then collaborates in its refraction. Basically, its dissemination on a world scale is complicit with the film itself. On this point it is worth recalling Marshall McLuhan: the medium is the message. The message of The Matrix is its own diffusion by an uncontrollable and proliferating contamination.

Sources

Print culture

With the invention of the printing press in the fifteenth century, mass-produced, cheap books, pamphlets and periodicals became widely available to the public. With this, the transmission of common knowledge and ideas became possible.

Radio culture

In the 1890s, Nikola Tesla and Guglielmo Marconi created the radiotelegraph, from which modern radio was born. Radio fostered a more "listened-to" culture, with individuals feeling they had a more direct connection to broadcasts. Radio culture was also vital because it was central to advertising and introduced the commercial.

Films

Films and cinema are highly influential on popular culture, as film is the art form to which people seem to respond most. Since moving pictures were first captured by Eadweard Muybridge in 1877, films have evolved into works that can be distributed in different digital formats, spreading to different cultures.

The impact of films and cinema is most evident when analyzing what films aim to portray. Films are used to seek acceptance and understanding of many subjects because of the influence they carry; an early example is Casablanca (1942), which introduced war subjects to the public after the United States entered World War II and was meant to increase pro-war sentiment for the Allies. Films are known to be a massive influence on popular culture, yet not all films create movements that contribute to it: the content must resonate with most of the public so that the material connects with the majority. Popular culture consists of beliefs embodied in trends, which can change a person's ideologies and create social transformation. These trends change ever more rapidly in the modern age, driven by a continual outpouring of media, and of films in particular. A trend may not last, and its effect varies among individuals, who can be grouped broadly by age and education. The creation of culture by films is seen in fandoms, religions, ideologies, and movements. Film culture is especially evident on social media, which provides an instant source of feedback and discussion about films. A recurring pattern in modern culture, during the trend-setting phase, is the creation of movements on social media platforms to defend a subject featured in a film.

Popular culture, or mass culture, is reached easily through films, which are shared and viewed worldwide.

Television programs

A television program is a segment of audiovisual content intended for broadcast (other than a commercial, trailer, or other content not serving as an attraction for viewership).

Television programs may be fictional (as in comedies and dramas), or non-fictional (as in documentary, light entertainment, news and reality television). They may be topical (as in the case of a local newscast and some made-for-television movies), or historical (as in the case of many documentaries and fictional series). They can be primarily instructional or educational, or entertaining as is the case in situation comedy and game shows.

Music

Popular music is music with wide appeal that is typically distributed to large audiences through the music industry. These forms and styles can be enjoyed and performed by people with little or no musical training. It stands in contrast to both art music and traditional or "folk" music. Art music was historically disseminated through the performances of written music, although since the beginning of the recording industry, it is also disseminated through recordings. Traditional music forms such as early blues songs or hymns were passed along orally, or to smaller, local audiences.

Sports

Sports include all forms of competitive physical activity or games which, through casual or organised participation, aim to use, maintain or improve physical ability and skills while providing enjoyment to participants, and in some cases, entertainment for spectators.

Corporate branding

Corporate branding refers to the practice of promoting the brand name of a corporate entity, as opposed to specific products or services.

Personal branding

Personal branding includes the use of social media to promote brands and topics in order to build a good reputation among professionals in a given field, to produce an iconic relationship between a professional, a brand, and its audience that extends networks beyond the conventional lines established by the mainstream, and to enhance personal visibility. For celebrities, online identities are extremely important for creating a brand that lines up sponsorships, jobs, and opportunities. Influencers, micro-celebrities, and users constantly need to find new ways to be unique or to stay current with trends in order to maintain followers, views, and likes. For example, Ellen DeGeneres created her own personal brand through her talk show The Ellen DeGeneres Show; as the brand developed, she extended her fan base through branches such as Ellen clothing, socks, pet beds, and more.

Social media

Social media comprises interactive computer-mediated technologies that facilitate the creation or sharing of information, ideas, career interests, and other forms of expression via virtual communities and networks. Platforms such as Instagram, Facebook, Twitter, YouTube, TikTok, and Snapchat are the applications used most heavily on a daily basis by younger generations, and social media has become part of the daily routine of individuals in contemporary society. It is a vital part of our culture, as it continues to shape the forms of communication used to connect with our communities, families, and friend groups. Terms or slang are often used online that are not used in face-to-face conversation, adding to the persona users create through the screens of technology; for example, some individuals respond to situations with a hashtag or emojis.

Social media influencers have become trendsetters through their direct engagement with large audiences, upending conventional marketing and advertising techniques. Consumer purchase choices have been impacted by fashion partnerships, sponsored material and outfit ideas offered by influencers. Social media has also made fashion more accessible by fostering uniqueness, expanding the depiction of trends, and facilitating the rise of niche influencers. The influencer-driven fashion industry, nevertheless, has also come under fire for encouraging excessive consumerism, inflated beauty ideals, and labour exploitation.

Clothing

The fashion industry has witnessed tremendous and rapid changes over the years, culminating in designs unimaginable in past decades. This dynamic trend has compelled renowned clothing lines such as Christian Dior, Louis Vuitton, and Balenciaga to intensify research and creative imagination in order to develop appealing, outstanding designs. Fashion has shifted from classical baggy and oversized pieces to trendy, slim-fit clothes for both men and women. Further, the past few decades have seen the reintroduction of old designs, revitalized and improvised to fit current market needs. Additionally, celebrities and influencers are at the forefront of setting fashion trends across various platforms. The future of fashion is significantly inspired by past trends, for instance, the oversized boyfriend blazer.

Influences

Pop culture has had a lasting influence on the products released in its time. Many works of art, books, films, and other media have been inspired by pop culture. These include:

Pop art

Pop art is an art movement that first emerged in the 1950s as a reaction and a counter to traditional and high-class art by including common and well-known images and references. Artists known during this movement include Eduardo Paolozzi, Richard Hamilton, Larry Rivers, Robert Rauschenberg, and Andy Warhol.

Pop music

Pop music is a wide-ranging genre of music whose characteristics include styles and tones with broad, mass appeal to all kinds of consumers. Oftentimes, this music contains influences from pre-existing works. Singers and musicians of this genre include Michael Jackson, Madonna, Justin Bieber, Elvis Presley, the Beatles, and Beyoncé.

Pop culture fiction

Pop culture fiction is a genre in books, comics, films, shows, and many other storytelling media that depicts stories purposely filled with Easter eggs and references to pop culture. The genre often overlaps with satire and parody, though the best-known examples are considered more serious works of literature. Writers of this genre include Ernest Cline, Bret Easton Ellis, Bryan Lee O'Malley, and Louis Bulaong.

Pop culture studies

Pop culture studies comprises research, theses, and other academic works that analyze trends in pop and mass culture, pop icons, or the effects and influences of pop culture on society and history. Ray B. Browne was one of the first academics to teach courses on the study of pop culture.

Nuclear weapons in popular culture

A nuclear fireball lights up the night in a United States nuclear test.

Since their public debut in August 1945, nuclear weapons and their potential effects have been a recurring motif in popular culture, to the extent that the decades of the Cold War are often referred to as the "atomic age".

Images of nuclear weapons

The now-familiar peace symbol was originally a specifically anti-nuclear weapons icon.

The atomic bombings of Hiroshima and Nagasaki ushered in the "atomic age", and the bleak pictures of the bombed-out cities released shortly after the end of World War II became symbols of the power and destruction of the new weapons (it is worth noting that the first pictures released were only from distances, and did not contain any human bodies—such pictures would only be released in later years).

The first pictures released of a nuclear explosion—the blast from the Trinity test—focused on the fireball itself; later pictures would focus primarily on the mushroom cloud that followed. After the United States began a regular program of nuclear testing in the late 1940s, continuing through the 1950s (and matched by the Soviet Union), the mushroom cloud came to serve as a symbol of the weapons themselves.

Pictures of nuclear weapons themselves (the actual casings) were not made public until 1960, and even those were only mock-ups of the "Fat Man" and "Little Boy" weapons dropped on Japan—not the more powerful weapons developed more recently. Diagrams of the general principles of operation of thermonuclear weapons have been available in very general terms since at least 1969 in at least two encyclopedia articles, and open literature research into inertial confinement fusion has been at least richly suggestive of how the "secondary" and interstage components of thermonuclear weapons work.

In general, however, the design of nuclear weapons has been the most closely guarded secret until long after the secrets had been independently developed—or stolen—by all the major powers and a number of lesser ones. It is generally possible to trace US knowledge of foreign progress in nuclear weapons technology by reading the US Department of Energy document "Restricted Data Declassification Decisions—1946 to the Present" (although some nuclear weapons design data have been reclassified since concern about proliferation of nuclear weapons to "nth countries" increased in the late 1970s).

However, two controversial publications breached this silence in ways that made many in the US and allied nuclear weapons community very anxious.

Former nuclear weapons designer Theodore Taylor described how terrorists could, without using any classified information at all, design a working fission nuclear weapon to journalist John McPhee, who published this information in the best-selling book The Curve of Binding Energy in 1974.

In 1979 the US Department of Energy sued to suppress the publication of an article by Howard Morland in The Progressive magazine detailing design information on thermonuclear and fission nuclear weapons he was able to glean in conversations with officials at several DoE contractor plants active in manufacture of nuclear weapons components. Ray Kidder, a nuclear weapon designer testifying for Morland, identified several open literature sources for the information Morland repeated in his article, while aviation historian Chuck Hansen produced a similar document for US Senator Charles Percy. Morland and The Progressive won the case, and Morland published a book on his journalistic research for the article, the trial, and a technical appendix in which he "corrected" what he felt were false assumptions in his original article about the design of thermonuclear weapons in his book, The Secret That Exploded. The concepts in Morland's book are widely acknowledged in other popular-audience descriptions of the inner workings of thermonuclear weapons.

During the 1950s, many countries developed large civil-defense programs designed to aid the populace in the event of nuclear warfare. These generally included drills for evacuation to fallout shelters, popularized through media such as the US film Duck and Cover. These drills, with their images of eerily empty streets and of hiding from a nuclear bomb under a schoolroom desk, would later become symbols of the seemingly inescapable and common fate created by such weapons. Some Americans built back-yard fallout shelters, which would provide little protection from a direct hit but would keep out wind-blown fallout for a few days or weeks. (Switzerland, which never acquired nuclear weapons although it had the technological sophistication to do so long before Pakistan or North Korea, has built nuclear blast shelters that would protect most of its population from a nuclear war.)

After the development of hydrogen bombs in the 1950s, and especially after the massive and widely publicized Castle Bravo test accident by the United States in 1954, which spread nuclear fallout over a large area and resulted in the death of at least one Japanese fisherman, the idea of a "limited" or "survivable" nuclear war was increasingly replaced by a perception that nuclear war meant the potentially instant end of all civilization: indeed, the explicit strategy of the nuclear powers was called Mutual Assured Destruction. Nuclear weapons became synonymous with apocalypse, and as a symbol this resonated through the culture of nations with freedom of the press. Several popular novels—such as Alas, Babylon and On the Beach—portrayed the aftermath of nuclear war. Several science-fiction novels, such as A Canticle for Leibowitz, explored the long-term consequences. Stanley Kubrick's film Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb satirically portrayed the events and the thinking that could begin a nuclear war.

Nuclear weapons are also one of the main targets of peace organizations. The CND (Campaign for Nuclear Disarmament) was one of the main organisations campaigning against the "Bomb". Its symbol, a combination of the semaphore symbols for "N" (nuclear) and "D" (disarmament), entered modern popular culture as an icon of peace.

A limited number of Indian films depicting nuclear weapons and technology have been made, and these mostly show nuclear weapons in a negative light, especially in the hands of non-state actors. Atom Bomb (1947) by Homi Wadia, one of the first Indian films involving nuclear technology, is about a man with enhanced physical strength due to the effects of a nuclear weapons test. Indian films involving non-state actors and nuclear weapons include Agent Vinod (1977) by Deepak Bahry and a 2012 film of the same name by Sriram Raghavan, Vikram (1986) by Rajasekhar, Mr. India (1987) by Shekhar Kapur, Tirangaa (1993) by Mehul Kumar, The Hero: Love Story of a Spy (2003) by Anil Sharma, Fanaa (2006) by Kunal Kohli, and Tiger Zinda Hai (2017) by Ali Abbas Zafar. Other Indian films covering nuclear weapons include Hava Aney Dey (2004) by Partho Sen-Gupta, about a future nuclear war between India and Pakistan, and Parmanu: The Story of Pokhran (2018) by Abhishek Sharma, the first nuclear historical film in India, about the Pokhran-II Indian nuclear weapons tests. Sacred Games, an Indian Netflix series based on the novel of the same name, involves the acquisition of a nuclear bomb by an apocalyptic cult that plans to detonate it in Mumbai.

History of biological warfare

From Wikipedia, the free encyclopedia

Before the 20th century, the use of biological agents took three major forms:

  • Deliberate contamination of food and water with poisonous or contagious material
  • Use of microbes, biological toxins, animals, or plants (living or dead) in a weapon system
  • Use of biologically inoculated fabrics and persons

In the 20th century, sophisticated bacteriological and virological techniques allowed the production of significant stockpiles of weaponized bio-agents.

Antiquity

The earliest documented incident of the intention to use biological weapons is possibly recorded in Hittite texts of 1500–1200 BC, in which victims of tularemia were driven into enemy lands, causing an epidemic. Although the Assyrians knew of ergot, a parasitic fungus of rye which produces ergotism when ingested, there is no evidence that they poisoned enemy wells with the fungus, as has been claimed.

According to Homer's epic poems about the legendary Trojan War, the Iliad and the Odyssey, spears and arrows were tipped with poison. During the First Sacred War in Greece, in about 590 BC, Athens and the Amphictionic League poisoned the water supply of the besieged town of Kirrha (near Delphi) with the toxic plant hellebore. According to Herodotus, during the 4th century BC Scythian archers dipped their arrow tips into decomposing cadavers of humans and snakes or in blood mixed with manure, supposedly making them contaminated with dangerous bacterial agents like Clostridium perfringens and Clostridium tetani, and snake venom.

In a naval battle against King Eumenes of Pergamon in 184 BC, Hannibal of Carthage had clay pots filled with venomous snakes and instructed his sailors to throw them onto the decks of enemy ships. The Roman commander Manius Aquillius poisoned the wells of besieged enemy cities in about 130 BC. In about AD 198, the Parthian city of Hatra (near Mosul, Iraq) repulsed the Roman army led by Septimius Severus by hurling clay pots filled with live scorpions at them. Like the Scythian archers, Roman soldiers also dipped their swords into excrement and cadavers; victims were commonly infected with tetanus as a result.

The use of bees as guided biological weapons was described in Byzantine written sources, such as Tactica of Emperor Leo VI the Wise in the chapter On Naval Warfare.

There are numerous other instances of the use of plant toxins, venoms, and other poisonous substances to create biological weapons in antiquity.

Post-classical ages

The Mongol Empire established commercial and political connections between the Eastern and Western areas of the world through the most mobile army ever seen. Its armies, moving rapidly between the steppes of East Asia (where bubonic plague was, and remains, endemic among small rodents), maintained an unbroken chain of infection until they reached, and infected, peoples and rodents who had never encountered the disease. The ensuing Black Death may have killed up to 25 million people in China and roughly a third of the population of Europe over the following decades, changing the course of Asian and European history.

Biologicals were extensively used in many parts of Africa from the sixteenth century AD, most often in the form of poisoned arrows or powder spread on the war front, as well as the poisoning of horses and of the water supply of enemy forces. In Borgu, there were specific mixtures to kill, hypnotize, or embolden the enemy, and to act as an antidote against the enemy's own poison. The creation of biologicals was reserved for a specific and professional class of medicine-men. In South Sudan, the people of the Koalit Hills kept their country free of Arab invasions by using tsetse flies as a weapon of war. Several accounts give an idea of the efficacy of these biologicals. For example, Mockley-Ferryman in 1892 commented on the Dahomean invasion of Borgu, stating that "their (Borgawa) poisoned arrows enabled them to hold their own with the forces of Dahomey notwithstanding the latter's muskets." The same scenario befell Portuguese raiders in Senegambia, who were defeated by Mali's Gambian forces, and John Hawkins in Sierra Leone, where he lost a number of his men to poisoned arrows.

During the Middle Ages, victims of the bubonic plague were used for biological attacks, often by flinging fomites such as infected corpses and excrement over castle walls using catapults. Bodies would be tied along with cannonballs and shot towards the city area. In 1346, during the siege of Caffa (now Feodossia, Crimea), the attacking Tartar forces (subjugated by the Mongol Empire under Genghis Khan more than a century earlier) used the bodies of Mongol warriors of the Golden Horde who had died of plague as weapons. It has been speculated that this operation may have been responsible for the advent of the Black Death in Europe. At the time, the attackers believed that the stench alone was enough to kill, though it was the disease that was deadly. (In recent years, however, some scholarship has cast doubt on whether trebuchets were used to hurl corpses, citing factors such as the size of trebuchets and how close to the walls they would have had to be built given Caffa's hilly landscape.)

At the siege of Thun-l'Évêque in 1340, during the Hundred Years' War, the attackers catapulted decomposing animals into the besieged area.

In 1422, during the siege of Karlstein Castle in Bohemia, Hussite attackers used catapults to throw dead (but not plague-infected) bodies and 2000 carriage-loads of dung over the walls.

English longbowmen usually did not draw their arrows from a quiver; rather, they stuck their arrows into the ground in front of them. This allowed them to nock arrows faster, and the dirt and soil were likely to stick to the arrowheads, making the resulting wounds much more likely to become infected.

17th and 18th century

Europe

The last known incident of using plague corpses for biological warfare may have occurred in 1710, when Russian forces attacked Swedish troops by flinging plague-infected corpses over the city walls of Reval (Tallinn), although this is disputed. Later, during the 1785 siege of La Calle, Tunisian forces flung diseased clothing into the city.

North America

During Pontiac's Rebellion, in June 1763 a group of Native Americans laid siege to British-held Fort Pitt. During a parley in the middle of the siege on June 24, Captain Simeon Ecuyer gave representatives of the besieging Delawares, including Turtleheart, two blankets and a handkerchief enclosed in small metal boxes that had been exposed to smallpox, in an attempt to spread the disease to the besieging Native warriors and so end the siege. William Trent, the trader turned militia commander who had come up with the plan, sent an invoice to the British colonial authorities in North America indicating that the purpose of giving the blankets was "to Convey the Smallpox to the Indians." The invoice was approved by General Thomas Gage, then serving as Commander-in-Chief, North America. A reported outbreak that had begun the previous spring left as many as one hundred Native Americans dead in the Ohio Country from 1763 to 1764. It is not clear whether the smallpox was a result of the Fort Pitt incident or whether the virus was already present among the Delaware people, as outbreaks occurred on their own every dozen or so years; moreover, the delegates were met again later and seemingly had not contracted smallpox. Trade and combat also provided ample opportunity for transmission of the disease.

A month later, Colonel Henry Bouquet, who was leading a relief attempt towards Fort Pitt, wrote to his superior Sir Jeffery Amherst to discuss the possibility of using smallpox-infested blankets to spread smallpox amongst Natives. Amherst wrote to Bouquet that: "Could it not be contrived to send the small pox among the disaffected tribes of Indians? We must on this occasion use every stratagem in our power to reduce them." Bouquet replied in a letter, writing that "I will try to inocculate [sic] the Indians by means of Blankets that may fall in their hands, taking care however not to get the disease myself. As it is pity to oppose good men against them, I wish we could make use of the Spaniard's Method, and hunt them with English Dogs. Supported by Rangers, and some Light Horse, who would I think effectively extirpate or remove that Vermine." After receiving Bouquet's response, Amherst wrote back to him, stating that "You will Do well to try to Innoculate [sic] the Indians by means of Blankets, as well as to try Every other method that can serve to Extirpate this Execrable Race. I should be very glad your Scheme for Hunting them Down by Dogs could take Effect, but England is at too great a Distance to think of that at present."

New South Wales

Many Aboriginal Australians have claimed that smallpox outbreaks in Australia were a deliberate result of European colonisation, though this possibility has only been raised by historians from the 1980s onwards, when Noel Butlin suggested "there are some possibilities that... disease could have been used deliberately as an exterminating agent."

In 1997, scholar David Day claimed there "remains considerable circumstantial evidence to suggest that officers other than Phillip, or perhaps convicts or soldiers... deliberately spread smallpox among aborigines", and in 2000, John Lambert argued that "strong circumstantial evidence suggests the smallpox epidemic which ravaged Aborigines in 1789, may have resulted from deliberate infection."

Judy Campbell argued in 2002 that it is highly improbable that the First Fleet was the source of the epidemic, as "smallpox had not occurred in any members of the First Fleet"; the only possible source of infection from the Fleet was exposure to variolous matter imported for the purposes of inoculation against smallpox. Campbell argued that, while there has been considerable speculation about a hypothetical exposure to the First Fleet's variolous matter, there was no evidence that Aboriginal people were ever actually exposed to it. She pointed to regular contact between fishing fleets from the Indonesian archipelago, where smallpox was always present, and Aboriginal people in Australia's north as a more likely source for the introduction of smallpox. She notes that while these fishermen are generally referred to as 'Macassans', after the port of Macassar on the island of Sulawesi from which most of the fishermen originated, "some travelled from islands as distant as New Guinea". She noted that there is little disagreement that the smallpox epidemic of the 1860s was contracted from Macassan fishermen and spread through the Aboriginal population by Aborigines fleeing outbreaks, as well as via their traditional social, kinship and trading networks. She argued that the 1789–90 epidemic followed the same pattern.

These claims are controversial, as it has been argued that any smallpox virus brought to New South Wales probably would have been sterilised by the heat and humidity encountered during the First Fleet's voyage from England, rendering it useless for biological warfare. However, in 2007, Christopher Warren demonstrated that smallpox carried aboard the First Fleet may still have been viable upon landing in Australia. Since then, some scholars have argued that smallpox in Australia was deliberately spread by the inhabitants of the British penal colony at Port Jackson in 1789.

In 2013, Warren reviewed the issue and argued that smallpox did not spread across Australia before 1824 and showed that there was no smallpox at Macassar that could have caused the outbreak at Sydney. Warren, however, did not address the issue of persons who joined the Macassan fleet from other islands and from parts of Sulawesi other than the port of Macassar. Warren concluded that the British were "the most likely candidates to have released smallpox" near Sydney Cove in 1789. Warren proposed that the British had no choice as they were confronted with dire circumstances when, among other factors, they ran out of ammunition for their muskets; he also used Aboriginal oral tradition and archaeological records from indigenous gravesites to analyse the cause and effect of the spread of smallpox in 1789.

Prior to the publication of Warren's article (2013), a professor of physiology, John Carmody, argued that the epidemic was an outbreak of chickenpox which took a drastic toll on an Aboriginal population without immunological resistance. With regard to how smallpox might have reached the Sydney region, Carmody said: "There is absolutely no evidence to support any of the theories and some of them are fanciful and far-fetched." Warren argued against the chickenpox theory in endnote 3 of Smallpox at Sydney Cove – Who, When, Why?. However, in a 2014 joint paper on historic Aboriginal demography, Carmody and the Australian National University's Boyd Hunter argued that the recorded behavior of the epidemic ruled out smallpox and indicated chickenpox.

20th century

By the turn of the 20th century, advances in microbiology had made thinking about "germ warfare" part of the zeitgeist. Jack London, in his short story "Yah! Yah! Yah!" (1909), described a punitive European expedition to a South Pacific island deliberately exposing the Polynesian population to measles, of which many of them died. London wrote another science fiction tale the following year, "The Unparalleled Invasion" (1910), in which the Western nations wipe out all of China with a biological attack.

First World War

During the First World War (1914–1918), the German Empire made some early attempts at anti-agriculture biological warfare. Those attempts were made by a special sabotage group headed by Rudolf Nadolny. Using diplomatic pouches and couriers, the German General Staff supplied small teams of saboteurs in the Russian Grand Duchy of Finland and in the then-neutral countries of Romania, the United States, and Argentina. In Finland, saboteurs mounted on reindeer placed ampoules of anthrax in stables of Russian horses in 1916. Anthrax was also supplied to the German military attaché in Bucharest, as was glanders, which was employed against livestock destined for Allied service. German intelligence officer and US citizen Anton Casimir Dilger established a secret lab in the basement of his sister's home in Chevy Chase, Maryland, that produced glanders, which was used to infect livestock in ports and inland collection points including, at least, Newport News, Norfolk, Baltimore, and New York City, and probably St. Louis and Covington, Kentucky. In Argentina, German agents employed glanders in the port of Buenos Aires and also tried to ruin wheat harvests with a destructive fungus. Germany itself became a victim of similar attacks: horses bound for Germany were infected with Burkholderia by French operatives in Switzerland.

The Geneva Protocol of 1925 prohibited the use of chemical weapons and biological weapons among signatory states in international armed conflicts, but said nothing about experimentation, production, storage, or transfer; later treaties did cover these aspects. Twentieth-century advances in microbiology enabled the first pure-culture biological agents to be developed by World War II.

Interwar period and WWII

In the interwar period, little biological-warfare research was initially done in either Britain or the United States. In the United Kingdom the preoccupation was mainly with withstanding the anticipated conventional bombing attacks that would be unleashed in the event of war with Germany. As tensions increased, Sir Frederick Banting began lobbying the British government to establish a program for the research and development of biological weapons, to effectively deter the Germans from launching a biological attack. Banting proposed a number of innovative schemes for the dissemination of pathogens, including aerial-spray attacks and germs distributed through the mail system.

With the onset of hostilities, the Ministry of Supply finally established a biological weapons programme at Porton Down, headed by the microbiologist Paul Fildes. The research was championed by Winston Churchill, and soon tularemia, anthrax, brucellosis, and botulinum toxin had been effectively weaponized. In particular, Gruinard Island in Scotland was contaminated with anthrax during a series of extensive tests and remained so for the next 48 years. Although Britain never offensively used the biological weapons it developed, its program was the first to successfully weaponize a variety of deadly pathogens and bring them into industrial production. Other nations, notably France and Japan, had begun their own biological-weapons programs.

When the United States entered the war, mounting British pressure for the creation of a similar research program and for an Allied pooling of resources led to the creation of a large industrial complex at Fort Detrick, Maryland, in 1942, under the direction of George W. Merck. The biological and chemical weapons developed during that period were tested at the Dugway Proving Ground in Utah. Soon there were facilities for the mass production of anthrax spores, brucellosis, and botulinum toxin, although the war was over before these weapons could be of much operational use.

However, the most notorious program of the period was run by the secret Imperial Japanese Army Unit 731 during the war, based at Pingfan in Manchuria and commanded by Lieutenant General Shirō Ishii. This unit did research on BW, conducted often-fatal human experiments on prisoners, and produced biological weapons for combat use. Although the Japanese effort lacked the technological sophistication of the American or British programs, it far outstripped them in its widespread application and indiscriminate brutality. Biological weapons were used against both Chinese soldiers and civilians in several military campaigns. Three veterans of Unit 731 testified in a 1989 interview with the Asahi Shimbun that they contaminated the Horustein river with typhoid near Soviet troops during the Battle of Khalkhin Gol. In 1940, the Imperial Japanese Army Air Force bombed Ningbo with ceramic bombs full of fleas carrying bubonic plague. A film showing this operation was seen by the imperial princes Tsuneyoshi Takeda and Takahito Mikasa during a screening arranged by its mastermind, Shirō Ishii. During the Khabarovsk war crimes trials, the accused, such as Major General Kiyoshi Kawashima, testified that as early as 1941 some 40 members of Unit 731 air-dropped plague-contaminated fleas on Changde. These operations caused epidemic plague outbreaks.

Many of these operations were ineffective due to inefficient delivery systems, using disease-bearing insects rather than dispersing the agent as a bioaerosol cloud.

Ban Shigeo, a technician at the Japanese Army's 9th Technical Research Institute, left an account of the activities at the Institute which was published in "The Truth About the Army Noborito Institute". Ban included an account of his trip to Nanking in 1941 to participate in the testing of poisons on Chinese prisoners. His testimony tied the Noborito Institute to the infamous Unit 731, which participated in biomedical research.

During the final months of World War II, Japan planned to use plague as a biological weapon against U.S. civilians in San Diego, California, during Operation Cherry Blossoms at Night. The hope was that it would kill tens of thousands of U.S. civilians and thereby dissuade America from attacking Japan. The plan was set to launch on the night of September 22, 1945, but never came to fruition due to Japan's surrender on August 15, 1945.

When the war ended, the US Army quietly enlisted certain members of Noborito in its efforts against the communist camp in the early years of the Cold War. The head of Unit 731, Shiro Ishii, was granted immunity from war crimes prosecution in exchange for providing information to the United States on the Unit's activities. Allegations were made that a "chemical section" of a US clandestine unit hidden within Yokosuka naval base was operational during the Korean War, and then worked on unspecified projects inside the United States from 1955 to 1959, before returning to Japan to enter the private sector.

Some of the Unit 731 personnel were imprisoned by the Soviets, and may have been a potential source of information on Japanese weaponization.

Postwar period

Considerable research into BW was undertaken throughout the Cold War era by the US, UK and USSR, and probably other major nations as well, although it is generally believed that such weapons were never used.

In Britain, the 1950s saw the weaponization of plague, brucellosis, tularemia and, later, equine encephalomyelitis and vaccinia viruses. Sea trials were carried out, including Operation Cauldron off Stornoway in 1952. The programme was cancelled in 1956, when the British government unilaterally renounced the use of biological and chemical weapons.

The United States initiated its weaponization efforts with disease vectors in 1953, focused on plague (fleas), EEE (mosquitoes), and yellow fever (mosquitoes; OJ-AP). However, US medical scientists in occupied Japan had undertaken extensive research on insect vectors, with the assistance of former Unit 731 staff, as early as 1946.

The United States Army Chemical Corps then initiated a crash program to weaponize anthrax (N) in the E61 1/2-lb hour-glass bomblet. Though the program was successful in meeting its development goals, the lack of validation on the infectivity of anthrax stalled standardization. The United States Air Force was also unsatisfied with the operational qualities of the M114/US bursting bomblet and labeled it an interim item until the Chemical Corps could deliver a superior weapon.

Around 1950 the Chemical Corps also initiated a program to weaponize tularemia (UL). Shortly after the E61/N failed to make standardization, tularemia was standardized in the 3.4" M143 bursting spherical bomblet. This was intended for delivery by the MGM-29 Sergeant missile warhead and could produce 50% infection over a 7-square-mile (18 km2) area. Although tularemia is treatable by antibiotics, treatment does not shorten the course of the disease. US conscientious objectors were used as consenting test subjects for tularemia in a program known as Operation Whitecoat. There were also many unpublicized tests carried out in public places with bio-agent simulants during the Cold War.

E120 biological bomblet, developed before the U.S. signed the Biological and Toxic Weapons Convention.

In addition to the use of bursting bomblets for creating biological aerosols, the Chemical Corps started investigating aerosol-generating bomblets in the 1950s. The E99 was the first workable design, but was too complex to be manufactured. By the late 1950s the 4.5" E120 spraying spherical bomblet was developed; a B-47 bomber with a SUU-24/A dispenser could infect 50% or more of the population of a 16-square-mile (41 km2) area with tularemia with the E120. The E120 was later superseded by dry-type agents.

Dry-type biologicals resemble talcum powder, and can be disseminated as aerosols using gas expulsion devices instead of a burster or complex sprayer. The Chemical Corps developed Flettner rotor bomblets and later triangular bomblets for wider coverage due to improved glide angles over Magnus-lift spherical bomblets. Weapons of this type were in advanced development by the time the program ended.

From January 1962, Rocky Mountain Arsenal "grew, purified and biodemilitarized" the plant pathogen wheat stem rust (Agent TX), Puccinia graminis var. tritici, for the Air Force biological anti-crop program. TX-treated grain was grown at the Arsenal from 1962 to 1968 in Sections 23–26. Unprocessed TX was also transported from Beale AFB for purification, storage, and disposal. Trichothecene mycotoxins, which can be extracted from wheat stem rust and rice blast, can kill or incapacitate depending on the concentration used. The "red mold disease" of wheat and barley in Japan is prevalent in the region facing the Pacific Ocean. Toxic trichothecenes, including nivalenol, deoxynivalenol, and monoacetylnivalenol (fusarenon-X) from Fusarium nivale, can be isolated from moldy grains. In the suburbs of Tokyo, an illness similar to "red mold disease" was described in an outbreak of a foodborne disease resulting from the consumption of Fusarium-infected rice. Ingestion of moldy grains contaminated with trichothecenes has been associated with mycotoxicosis.

Although there is no evidence that biological weapons were used by the United States, China and North Korea accused the US of large-scale field testing of BW against them during the Korean War (1950–1953). At the time of the Korean War, the United States had weaponized only one agent, brucellosis ("Agent US"), which is caused by Brucella suis. The original weaponized form used the M114 bursting bomblet in M33 cluster bombs. While the specific form of the biological bomb was classified until some years after the Korean War, nothing in the various exhibits of biological weapons that North Korea alleged were dropped on the country resembled an M114 bomblet. There were ceramic containers that bore some similarity to the Japanese weapons developed by Unit 731 and used against the Chinese in World War II.

Cuba also accused the United States of spreading human and animal disease on its island nation.

During the 1947–1949 Palestine war, International Red Cross reports raised suspicion that the Israeli Haganah militia had released Salmonella typhi bacteria into the water supply of the city of Acre, causing an outbreak of typhoid among the inhabitants. Egyptian troops later claimed to have captured disguised Haganah soldiers near wells in Gaza, whom they executed for allegedly attempting another attack. Israel denies these allegations.

Biological and Toxin Weapons Convention

In mid-1969, the UK and the Warsaw Pact, separately, introduced proposals to the UN to ban biological weapons, which would lead to the signing of the Biological and Toxin Weapons Convention in 1972. United States President Richard Nixon signed an executive order in November 1969, which stopped production of biological weapons in the United States and allowed only scientific research of lethal biological agents and defensive measures such as immunization and biosafety. The biological munition stockpiles were destroyed, and approximately 2,200 researchers became redundant.

Special munitions for the United States Special Forces and the CIA, and the "Big Five" weapons for the military, were destroyed in accordance with Nixon's executive order to end the offensive program. The CIA maintained its collection of biologicals well into 1975, when it became the subject of the Senate Church Committee.

The Biological and Toxin Weapons Convention was signed by the US, UK, USSR and other nations in 1972 as a ban on the "development, production and stockpiling of microbes or their poisonous products except in amounts necessary for protective and peaceful research". The convention bound its signatories to a far more stringent set of regulations than had been envisioned by the 1925 Geneva Protocol. By 1996, 137 countries had signed the treaty; however, it is believed that since the signing of the Convention the number of countries capable of producing such weapons has increased.

The Soviet Union continued research and production of offensive biological weapons in a program called Biopreparat, despite having signed the convention. The United States had no solid proof of this program until Vladimir Pasechnik defected in 1989 and Kanatjan Alibekov, the first deputy director of Biopreparat, defected in 1992. Pathogens developed by the organization were used in open-air trials. It is known that Vozrozhdeniye Island, in the Aral Sea, was used as a testing site. In 1971, such testing led to the accidental aerosol release of smallpox over the Aral Sea and a subsequent smallpox epidemic.

During the closing stages of the Rhodesian Bush War, the Rhodesian government resorted to the use of chemical and biological warfare agents. Watercourses at several sites inside the Mozambique border were deliberately contaminated with cholera. These biological attacks had little overall impact on the fighting capability of ZANLA but resulted in at least 809 recorded deaths of insurgents, and caused considerable distress to the local population. The Rhodesians also experimented with several other pathogens and toxins for use in their counterinsurgency.

After the 1991 Persian Gulf War, Iraq admitted to the United Nations inspection team that it had produced 19,000 liters of concentrated botulinum toxin, of which approximately 10,000 liters were loaded into military weapons; the 19,000 liters have never been fully accounted for. This is approximately three times the amount needed to kill the entire current human population by inhalation, although in practice it would be impossible to distribute it so efficiently, and, unless protected from oxygen, the toxin deteriorates in storage.

According to the U.S. Congress Office of Technology Assessment eight countries were generally reported as having undeclared offensive biological warfare programs in 1995: China, Iran, Iraq, Israel, Libya, North Korea, Syria and Taiwan. Five countries had admitted to having had offensive weapon or development programs in the past: United States, Russia, France, the United Kingdom, and Canada. Offensive BW programs in Iraq were dismantled by Coalition Forces and the UN after the first Gulf War (1990–91), although an Iraqi military BW program was covertly maintained in defiance of international agreements until it was apparently abandoned during 1995 and 1996.

21st century

On September 18, 2001, and for a few days thereafter, several letters were received by members of the U.S. Congress and American media outlets which contained intentionally prepared anthrax spores; the attack sickened at least 22 people of whom five died. The identity of the bioterrorist remains unknown, although in 2008 authorities stated that Bruce Ivins was likely the perpetrator. (See 2001 anthrax attacks.)

Suspicions of an ongoing Iraqi biological warfare program were not substantiated in the wake of the March 2003 invasion of that country. Later that year, however, Muammar Gaddafi was persuaded to terminate Libya's biological warfare program. In 2008, according to a U.S. Congressional Research Service report, China, Cuba, Egypt, Iran, Israel, North Korea, Russia, Syria and Taiwan were considered, with varying degrees of certainty, to have some biological-warfare capability. According to the same 2008 report, "Developments in biotechnology, including genetic engineering, may produce a wide variety of live agents and toxins that are difficult to detect and counter; and new chemical warfare agents and mixtures of chemical weapons and biowarfare agents are being developed . . . Countries are using the natural overlap between weapons and civilian applications of chemical and biological materials to conceal chemical weapon and bioweapon production." By 2011, 165 countries had officially joined the BWC and pledged to disavow biological weapons.

Disease surveillance

From Wikipedia, the free encyclopedia
 
Disease surveillance is an epidemiological practice by which the spread of disease is monitored in order to establish patterns of progression. The main role of disease surveillance is to predict, observe, and minimize the harm caused by outbreak, epidemic, and pandemic situations, as well as increase knowledge about which factors contribute to such circumstances. A key part of modern disease surveillance is the practice of disease case reporting.

In modern times, the reporting of disease outbreaks has been transformed from manual record keeping to instant worldwide internet communication.

Previously, case numbers were gathered from hospitals (which would be expected to see most of the occurrences), collated, and eventually made public. With the advent of modern communication technology, this has changed dramatically. Organizations like the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) can now report cases and deaths from significant diseases within days, sometimes within hours, of their occurrence. Further, there is considerable public pressure to make this information available quickly and accurately.

Mandatory reporting

Formal reporting of notifiable infectious diseases is a requirement placed upon health care providers by many regional and national governments, and upon national governments by the World Health Organization, in order to monitor the spread of infectious agents. Since 1969, WHO has required that all cases of the following diseases be reported to the organization: cholera, plague, yellow fever, smallpox, relapsing fever and typhus. In 2005, the list was extended to include polio and SARS. Regional and national governments typically monitor a larger set of communicable diseases (around 80 in the U.S.) that can potentially threaten the general population. Tuberculosis, HIV, botulism, hantavirus, anthrax, and rabies are examples of such diseases. Incidence counts of diseases are often used as health indicators describing the overall health of a population.

World Health Organization

The World Health Organization (WHO) is the lead agency for coordinating the global response to major diseases. The WHO maintains websites for a number of diseases and has active teams in many countries where these diseases occur.

During the SARS outbreak in early 2003, for example, the Beijing staff of the WHO produced updates every few days for the duration of the outbreak. Since January 2004, the WHO has produced similar updates for H5N1. These results are widely reported and closely watched.

WHO's Epidemic and Pandemic Alert and Response (EPR) programme exists to detect, rapidly verify, and respond appropriately to epidemic-prone and emerging disease threats.

Political challenges

As the lead organization in global public health, the WHO occupies a delicate role in global politics. It must maintain good relationships with each of the many countries in which it is active. As a result, it may only report results within a particular country with the agreement of the country's government. Because some governments regard the release of any information on disease outbreaks as a state secret, this can place the WHO in a difficult position.

The WHO-coordinated International Outbreak Alert and Response is designed to ensure that "outbreaks of potential international importance are rapidly verified and information is quickly shared within the Network" – but not necessarily with the public – and to integrate and coordinate "activities to support national efforts" rather than challenge national authority, in order to "respect the independence and objectivity of all partners". The commitment that "All Network responses will proceed with full respect for ethical standards, human rights, national and local laws, cultural sensitivities and tradition" assures each nation that its security, financial, and other interests will be given full weight.

Technical challenges

Testing for a disease can be expensive, and distinguishing between two diseases can be prohibitively difficult in many countries. One standard means of determining whether a person has had a particular disease is to test for the presence of antibodies specific to that disease. In the case of H5N1, for example, there is a low-pathogenic H5N1 strain in wild birds in North America that a human could conceivably have antibodies against. It would be extremely difficult to distinguish between antibodies produced by this strain and antibodies produced by Asian-lineage HPAI A(H5N1). Similar difficulties are common and make it hard to determine how widely a disease may have spread.

There is currently little available data on the spread of H5N1 in wild birds in Africa and Asia. Without such data, predicting how the disease might spread in the future is difficult. Information that scientists and decision makers need to make useful medical products and informed health care decisions, but currently lack, includes:

  • Surveillance of wild bird populations
  • Cell cultures of particular strains of diseases

H5N1

Surveillance of H5N1 in humans, poultry, wild birds, cats and other animals remains very weak in many parts of Asia and Africa. Much remains unknown about the exact extent of its spread.

H5N1 in China is less than fully reported. Blogs have described many discrepancies between official Chinese government announcements concerning H5N1 and what people in China see with their own eyes. Many reports of total H5N1 cases have excluded China due to widespread disbelief in China's official numbers. (See Disease surveillance in China.)

"Only half the world's human bird flu cases are being reported to the World Health Organization within two weeks of being detected, a response time that must be improved to avert a pandemic, a senior WHO official said Saturday. Shigeru Omi, WHO's regional director for the Western Pacific, said it is estimated that countries would have only two to three weeks to stamp out, or at least slow, a pandemic flu strain after it began spreading in humans."

David Nabarro, chief avian flu coordinator for the United Nations, says avian flu has too many unanswered questions.

CIDRAP reported on 25 August 2006 on a new US government website that allows the public to view current information about testing of wild birds for H5N1 avian influenza, which is part of a national wild-bird surveillance plan that "includes five strategies for early detection of highly pathogenic avian influenza. Sample numbers from three of these will be available on HEDDS: live wild birds, subsistence hunter-killed birds, and investigations of sick and dead wild birds. The other two strategies involve domestic bird testing and environmental sampling of water and wild-bird droppings. [...] A map on the new USGS site shows that 9,327 birds from Alaska have been tested so far this year, with only a few from most other states. Last year, officials tested just 721 birds from Alaska and none from most other states, another map shows. The goal of the surveillance program for 2006 is to collect 75,000 to 100,000 samples from wild birds and 50,000 environmental samples, officials have said".

Operator (computer programming)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Operator_(computer_programmin...