Wartime collaboration is cooperation with the enemy against one's country of citizenship in wartime, and in the words of historian Gerhard Hirschfeld, "is as old as war and the occupation of foreign territory".
The term collaborator dates to the 19th century and was used in France during the Napoleonic Wars. The meaning shifted during World War II to designate traitorous collaboration with the enemy. The related term collaborationism is used by historians in a restricted sense, referring to the subset of wartime collaborators in Vichy France who actively promoted German victory.
Etymology
The term collaborate dates from 1871, and is a back-formation from collaborator (1802), from the French collaborateur as used during the Napoleonic Wars against smugglers trading with England and assisting in the escape of monarchists, and is itself derived from the Latin collaboratus, past participle of collaborare "work with", from com- "with" + laborare "to work". The meaning of "traitorous cooperation with the enemy" dates from 1940, originally in reference to the Vichy Government of France which cooperated with the Germans, 1940–44. It was first used in the modern sense on 24 October 1940 in a meeting between Marshal Philippe Pétain and Adolf Hitler in Montoire-sur-le-Loir a few months after the Fall of France. Pétain believed that Germany had won the war, and informed the French people that he accepted "collaboration" with Germany.
Collaboration in wartime can take many forms, including political,
economic, social, cultural, or military collaboration. The activities
undertaken can be treasonous to varying degrees; in a World War II context, collaboration generally means actively working with the enemy.
Stanley Hoffmann subdivided collaboration into involuntary (reluctant recognition of necessity) and voluntary (an attempt to exploit necessity). According to him, collaboration can be either servile or ideological.
Servile collaboration is service to an enemy based on the necessity of personal survival or comfort, whereas ideological collaboration is advocacy of cooperation with an enemy power.
In contrast, Bertram Gordon used the terms "collaborator" and
"collaborationist" for non-ideological and ideological collaborations,
respectively. James Mace Ward has asserted that, while collaboration is often equated with treason, there was "legitimate collaboration" between civilian internees (mostly Americans) in the Philippines and their Japanese captors for mutual benefit and to enhance the possibilities of the internees to survive. Collaboration with the Axis Powers in Europe and Asia existed in varying degrees in all the occupied countries.
Collaboration with the enemy in wartime goes back to prehistory and has always been present. Since World War II, however, historians have used the term mostly in reference to the German occupation of France. Unlike the governments of other defeated countries, which fled into exile, the French government signed an armistice, remained in France, cooperated with the German Reich economically and politically, and used the new situation to effect a transfer of power to a cooperative French State under Marshal Philippe Pétain.
In the context of World War II Europe, and especially in Vichy France, historians draw a distinction between collaboration and collaborator on the one hand, and the related terms collaborationism and collaborationist on the other.
Stanley Hoffmann in 1974 and other historians have used the term collaborationnistes to refer to fascists and Nazi sympathisers who, for anti-communist or other ideological reasons, wished a reinforced collaboration with Hitler's Germany.
Collaborationism refers to those, primarily from the fascist right in Vichy France, who embraced the goal of a German victory as their own, whereas collaboration refers to those among the French who for whatever reason collaborated with the Germans.
In some colonial or occupation conflicts, soldiers of native origin were seen as collaborationists. This could be the case of mamluks and janissaries
in the Ottoman Empire. In some cases, the meaning was not disrespectful
at the beginning, but changed with later use when borrowed: the Ottoman
term for the sipahi soldiers became sepoy in British India, which in turn was adapted as cipayo in Spanish or zipaio in Basque with a more overtly pejorative meaning of "mercenary".
In
France after liberation by the Allies, many women had their heads
shaved as punishment for having had relationships with Germans.
In France, a distinction emerged between the collaborateur (collaborator) and the collaborationniste (collaborationist). The term collaborationist is mainly used to describe individuals enrolled in pseudo-Nazi parties, often based in Paris, who believed in fascist ideology or were anti-communists. Collaborators, on the other hand, engaged in collaboration for pragmatic reasons, such as carrying out the orders of the occupiers to maintain public order (police) or normal government functions (civil servants); commerce (including sex workers and other women who had relationships with Germans and were called "horizontal collaborators"); or to fulfill personal ambitions and greed. Collaborators were not necessarily believers in fascism or supporters of Nazi Germany.
With the defeat of the Axis, collaborators were often punished by public humiliation,
imprisonment, and execution. In France, 10,500 collaborators are
estimated to have been executed, some after legal proceedings, others
extrajudicially.
Recent research by the British historian Simon Kitson has shown that French authorities did not wait until the Liberation to begin pursuing collaborationists. The Vichy government,
itself heavily engaged in collaboration, arrested around 2,000
individuals on charges of passing information to the Germans. Their reason for doing so was to centralise collaboration, to ensure that the state maintained a monopoly in Franco-German relations, and to defend French sovereignty so that it could negotiate from a position of strength. It was among the many compromises that the government engaged in along the way.
Around June 1940, Adolf Hitler provided Germans in France with plentiful opportunities to exploit French weakness and to maximize tensions in the country.
On June 25, 1940, Jean Moulin, a French civil servant who served as the first President of the National Council of the Resistance
during World War II, was advised by German authorities to sign a
declaration condemning an alleged massacre of Chartres civilians by
French Senegalese troops. Moulin refused to collaborate, knowing that the massacre had in fact been caused by German bombing. He was then incarcerated by the Germans and cut his own throat with a piece of broken glass to avoid being coerced into giving information.
In Belgium, collaborators were organized into the VNV party and the DeVlag movement in Flanders, and into the Rexist movement in Wallonia. There was an active collaboration movement in the Netherlands.
Vidkun Quisling (1887–1945) was a major in the Norwegian Army and a former minister of defence. He became minister-president of Norway in 1942 and attempted to Nazify the country, but was fiercely resisted by most of the population. His name has since become synonymous with a high-profile government collaborator, now known as a quisling.
In Greece, during the last two years of the occupation, the last quisling prime minister, Ioannis Rallis, created the Security Battalions, military corps that collaborated openly with the Germans and had a strong anti-communist ideology. The Security Battalions, along with various far-right and royalist organizations and parts of the country's police forces of that era, were directly or indirectly responsible for the brutal killing of thousands of Greeks during the occupation. Contrary to what happened in other European countries, the members of these corps were never tried or punished for their crimes, owing to the Dekemvriana events that erupted immediately after the liberation, followed by the White Terror and, two years later, the Greek Civil War.
Yugoslavia
The main collaborationist regime in Yugoslavia was the Independent State of Croatia, a puppet state semi-independent of Nazi Germany. Leon Rupnik
(1880–1946) was a Slovene general who collaborated with the occupiers when he took control of the semi-independent region of Italian-occupied southern Slovenia known as the Province of Ljubljana, which came under German control in 1943.
The main collaborationists in East Yugoslavia were the German-puppet Serbian Government of National Salvation established on the German-occupied territory of Serbia, and the Yugoslav royalist Chetniks, who collaborated tactically with the Axis after 1941.
Poland
There was relatively little collaboration in Poland with Nazi Germany, a point of pride for the Polish people. The Soviet Union, however, did find individuals willing to work with it, as demonstrated notably by the Lublin government set up by the Soviets in 1944, which operated in opposition to the Polish government-in-exile.
Germany
German citizen and non-Nazi Franz Oppenhoff accepted appointment as Mayor of the German city of Aachen in 1944, under authority of the Allied military command. He was assassinated on orders from Heinrich Himmler in 1945.
Vietnam
Vietnamese émigrés and expatriates living in France drew inspiration from the Nazi occupation of the country. They believed in many of the European nationalist ideas of the time, namely a belief in an organic ethnocultural national community and an authoritarian corporatist state and economy. At the time, some Vietnamese feared that colonialism had "systematically destroyed all elements of social order ... which would have led the intellectual elite to oppose the bolshevization of the country."
When German forces invaded France in May 1940 amid World War II, the French military and government collapsed, and six to ten million people were forced to become refugees. This provoked a political response from the Vietnamese living in the country.
France also had a group of Vietnamese students and professionals in Paris called the Amicale annamite.
They expressed a heavy dislike for French colonial rule without moving forward with any explicit ideological agenda. Their aims expanded in 1943 to include improving the situation of Vietnamese soldiers interned as POWs, including better conditions at the camps, better food, health care, education, and vocational training.
More recent examples of collaboration have included institutions and individuals in Afghanistan who collaborated with the Soviet occupation until 1989 and individuals in Iraq and Afghanistan recruited by the Coalition of the Willing. In 2014 during the occupation of Crimea and ongoing War in Donbass, some Ukrainian citizens collaborated with the invading Russian forces.
Israeli–Palestinian conflict
In Palestinian society, collaboration with Israel is viewed as a serious offence and social stain and is sometimes punished (judicially or extrajudicially) by death.
In addition, during the period 2007–2009, around 30 Palestinians were sentenced to death in court on collaboration-related charges, although the sentences were not carried out.
In June 2009, Raed Sualha, a 15-year-old Palestinian boy, was
brutally tortured and hanged by his family because they suspected him of
collaborating with Israel. Authorities of the Palestinian territories launched an investigation into the case and arrested the perpetrators. Police said it was unlikely that such a young boy would have been recruited as an informer.
The Ukrainian government has had broad support from its population,
but support for Russia within Ukraine is common in Donbas. The Ukrainian
government compiled a "registry of collaborators." The Ukrainian
government says pro-Russian collaborators have acted as spotters to
assist shelling of the country. Anti-collaboration laws were enacted by
Ukrainian President Volodymyr Zelenskyy
after the invasion started, with offenders facing 15 years in prison for collaborating with Russian forces, publicly denying Russian aggression, or supporting Russia.
People belonging to the same ethnic, religious or ideological
group as the invading enemy (while being a minority in their country of
citizenship) can sympathize with the attackers, or even view them as liberators.
The reasons why people collaborate with the enemy in wartime
vary. In World War II, collaborators with Nazi Germany were found in Stalin's Soviet Union and in Western European countries, and Japanese collaborators were operating in China.
Public perceptions of collaborators
Heonik Kwon: "Anyone who studies the reality of a modern war,
especially life under prolonged military occupation, will surely
encounter stories of collaboration between the subjugated locals and the
occupying power...The cooperation is often a coerced one; people may
have no choice but to cooperate. Since the authority that demands
cooperation may have brutally harmed the locals in the process of
conquest, collaborating with this authority can be a morally explosive
issue...the history of war inevitably involves stories of
collaboration..."
Timothy Brook: "On 30 October 1940, six days after meeting with Adolf Hitler in the railway station at Montoire, Philippe Pétain
announced on French radio that 'a collaboration has been envisioned
between our two countries.' Since then, 'collaboration' has been the
word by which we denigrate political cooperation with an occupying
force."
Edilberto C. de Jesus and Carlos Quirino. "Collaboration with the Japanese was a necessary evil embraced by the internee government [at Santo Tomas Internment Camp, Philippines] as preferable to a more direct and more oppressive enemy rule."
John Hickman identifies thirteen reasons why occupied populations might hold collaborators in contempt, perceiving them as:
scapegoats for defeat
opportunistic
benefiting from their own poor decisions as leaders before the occupation
violating the norms of the traditional political order
World War II poster from the United States denouncing fifth columnists
A fifth column is any group of people who undermine a larger group from within, usually in favor of an enemy group or nation. According to Harris Mylonas
and Scott Radnitz, "fifth columns" are “domestic actors who work to
undermine the national interest, in cooperation with external rivals of
the state."
The activities of a fifth column can be overt or clandestine. Forces
gathered in secret can mobilize openly to assist an external attack.
This term is also extended to organised actions by military personnel.
Clandestine fifth column activities can involve acts of sabotage, disinformation, or espionage executed within defense lines by secret sympathizers with an external force.
Origin
The term "fifth column" originated in Spain (originally quinta columna) during the early phase of the Spanish Civil War. It gained popularity in the Loyalist faction media in early October 1936 and immediately started to spread abroad.
The exact origins of the term are not clear. Its first identified appearance was in a secret telegram sent to Berlin by the German chargé d'affaires in Alicante, Hans Hermann Völckers, dated September 30, 1936. He referred to an unidentified "supposed statement by Franco" which "is being circulated" (apparently in the Republican zone or in the Republican-held Levantine zone). In the statement, Franco allegedly claimed that there were four Nationalist columns approaching Madrid and a fifth column waiting to attack from the inside. However, the telegram was part of the secret German diplomatic correspondence and was discovered long after the civil war.
The first identified public use of the term is in the October 3, 1936, issue of the Madrid Communist daily Mundo Obrero. In a front-page article, the party propagandist Dolores Ibárruri referred to a very similar or the same statement as the one reported by Völckers but attributed it to General Emilio Mola. On the same day, the PCE activist Domingo Girón made a similar claim during a public rally. During the following days, Republican papers repeated the story but with differing detail; some attributed the phrase to General Queipo de Llano. In mid-October, the media already warned of the "famous fifth column".
Historians have never identified the original statement referred to by Völckers, Ibárruri, de Jong, and others. The transcripts of Francisco Franco's, Gonzalo Queipo de Llano's, and Emilio Mola's radio addresses have been published, but they do not contain the term,
and no other original statement containing this phrase has ever
surfaced. A British journalist who took part in Mola's press conference
on October 28, 1936, claimed that Mola referred to quinta columna on this very day, but at that time the term had already been used in the Republican press for more than three weeks.
Historiographic
works offer differing perspectives on authorship of the term. Many
scholars have no doubt about Mola's role and refer to "fifth column" as
to "a term coined in 1936 by General Emilio Mola", though they admit that the exact statement cannot be identified.
In some sources, Mola is noted as a person who used the term during an
impromptu press interview, and different though detailed versions of the
exchange are offered.
Probably the most popular version describes the theory of Mola's
authorship with a degree of doubt, either noting that it is presumed but
has never been proven, or that the phrase "is attributed" to Mola, who "apparently claimed" so,
or else noting that "la famosa quinta columna a la que parece que se
había referido el general Mola." (the famous fifth column that General
Mola seems to have referred to)
Some authors consider it possible, if not likely, that the term was invented by Communist propaganda with the purpose of either raising morale or providing justification for terror and repression; initially it might have been part of a whispering campaign, but it was later openly floated by Communist propagandists. There are also other theories afloat.
Some writers, mindful of the origin of the phrase, use it only in
reference to military operations rather than the broader and less
well-defined range of activities that sympathizers might engage in to
support an anticipated attack.
Second World War
By the late 1930s, as American involvement in the war in Europe
became more likely, the term "fifth column" was commonly used to warn of
potential sedition and disloyalty within the borders of the United States. The fear of betrayal was heightened by the rapid fall of France
in 1940, which some blamed on internal weakness and a pro-German "fifth
column". A series of photos run in the June 1940 issue of Life magazine warned of "signs of Nazi Fifth Column Everywhere". In a speech to the House of Commons that same month, Winston Churchill reassured MPs that "Parliament has given us the powers to put down Fifth Column activities with a strong hand." In July 1940, Time magazine referred to talk of a fifth column as a "national phenomenon".
In August 1940, The New York Times mentioned "the first spasm of fear engendered by the success of fifth columns in less fortunate countries". One report identified participants in Nazi "fifth columns" as "partisans of authoritarian government everywhere", citing Poland, Czechoslovakia, Norway, and the Netherlands. During the Nazi invasion of Norway, the head of the Norwegian fascist party, Vidkun Quisling,
proclaimed the formation of a new fascist government in control of
Norway, with himself as Prime Minister, by the end of the first day of
fighting. The word "quisling" soon became a byword for "collaborator" or
"traitor".
The New York Times on August 11, 1940, featured three editorial cartoons using the term. John Langdon-Davies, a British journalist who covered the Spanish Civil War, wrote an account called The Fifth Column which was published the same year. In November 1940, Ralph Thomson, reviewing Harold Lavine's Fifth Column in America, a study of Communist and fascist groups in the U.S., in The New York Times, questioned his choice of that title: "the phrase has been worked so hard that it no longer means much of anything."
Dr. Seuss cartoon in PM dated February 13, 1942, with the caption 'Waiting for the Signal from Home'
Immediately following the Japanese attack on Pearl Harbor, U.S. Secretary of the Navy Frank Knox
issued a statement that "the most effective Fifth Column work of the
entire war was done in Hawaii with the exception of Norway." In a column published in The Washington Post, dated 12 February 1942, the columnist Walter Lippmann wrote of imminent danger from actions that might be taken by Japanese Americans. Titled "The Fifth Column on the Coast", he wrote of possible attacks that could be made along the West Coast of the United States that would amplify damage inflicted by a potential attack by Japanese naval and air forces. Suspicion about an active fifth column on the coast led eventually to the internment of Japanese Americans.
During the Japanese invasion of the Philippines, an article in the Pittsburgh Post-Gazette in December 1941 said the indigenous Moro Muslims were "capable of dealing with Japanese fifth columnists and invaders alike". Another in the Vancouver Sun the following month described how the large population of Japanese immigrants in Davao
in the Philippines welcomed the invasion: "the first assault on Davao
was aided by numbers of Fifth Columnists–residents of the town".
Later usage
Australian Prime Minister Menzies proposed a federal referendum on 22 September 1951 asking voters to give the Commonwealth Government the power to make laws regarding communists and communism.
German minority organizations in Czechoslovakia formed the Sudeten German Free Corps, which aided Nazi Germany. Some claimed they were "self-defense formations" created in the aftermath of World War I and unrelated to the German invasion two decades later.
More often their origins were discounted and they were defined by the
role they played in 1938–39: "The same pattern was repeated in
Czechoslovakia. Henlein's Free Corps played in that country the part of fifth column".
In 1945, a document produced by the U.S. Department of State
compared the earlier efforts of Nazi Germany to mobilize the support of
sympathizers in foreign nations to the superior efforts of the
international communist movement at the end of World War II: "a
communist party was in fact a fifth column as much as any [German] Bund
group, except that the latter were crude and ineffective in comparison
with the Communists". Arthur M. Schlesinger Jr.,
wrote in 1949: "the special Soviet advantage—the warhead—lies in the
fifth column; and the fifth column is based on the local Communist
parties".
Zainichi Koreans living in Japan, particularly those affiliated with the organization Chongryun (which is itself affiliated with the government of North Korea)
are sometimes seen as a "fifth column" by some Japanese, and have been
the victims of verbal and physical attacks. These have occurred more
frequently since the government of Kim Jong Il acknowledged it had abducted Japanese citizens from Japan and tested ballistic missiles near the waters of and over mainland Japan.
A significant number of Israeli Arabs, who compose approximately 20% of Israel's population, identify more with the Palestinian cause than with the State of Israel or Zionism. As a result, many Israeli Jews, including politicians, rabbis, journalists, and historians, view them (and/or the main Israeli Arab political group, the Joint List) as a fifth column.
Counter-jihad
literature has sometimes portrayed Western Muslims as a "fifth column",
collectively seeking to destabilize Western nations' identity and
values for the benefit of an international Islamic movement intent on
the establishment of a caliphate in Western countries. Following the 2015 attack by French-born Muslims on the offices of Charlie Hebdo in Paris, the leader of the UK Independence Party, Nigel Farage, said that Europe had "a fifth column living within our own countries". In 2001, Dutch politician Pim Fortuyn talked about Muslim immigrants being a "fifth column" on the night he was dismissed as leader of Liveable Netherlands.
In a March 2014 address, Russian President Vladimir Putin said: "Yes, of course, they will back the so-called fifth column, national traitors – those who make money here in our country but live over there, and 'live' not in the geographical sense of the word but in their minds, in their servile mentality", and mentioned the fifth column twice more in the same speech.
The title of Ernest Hemingway's only play "The Fifth Column" (1938) is a translation of General Mola's phrase, la quinta columna. In early 1937 Hemingway had been in Madrid, reporting the war from the loyalist side, and helping make the film The Spanish Earth. He returned to the US to publicise the film and wrote the play, in the Hotel Florida in Madrid, on his next visit to Spain later that year.
In the US an Australian radio play, The Enemy Within,
proved to be very popular, though this popularity was due to the belief
that the stories of fifth column activities were based on real events.
In December 1940 the Australian censors had the series banned.
British reviewers of Agatha Christie's novel N or M?
in 1941 used the term to describe the struggle of two British partisans
of the Nazi regime working on its behalf in Britain during World War
II.
In Frank Capra's film Meet John Doe
(1941), newspaper editor Henry Connell warns the politically-naïve
protagonist, John Doe, about a businessman's plans to promote his own
political ambitions using the apolitical John Doe Clubs. Connell says to
John: "Listen, pal, this fifth-column stuff is pretty rotten, isn't
it?", identifying the businessman with anti-democratic interests in the
United States. When Doe agrees, he adds: "And you'd feel like an awful
sucker if you found yourself marching right in the middle of it,
wouldn't you?"
Alfred Hitchcock's Saboteur (1942) features Robert Cummings asking for help against "fifth columnists" conspiring to sabotage the American war effort. Soon the term was being used in popular entertainment.
Several World War II era animated shorts include the term. Cartoons of Porky Pig asked any "fifth columnists" in the audience to leave the theater immediately. In Looney Tunes' Foney Fables, the narrator of a comic fairy tale described a wolf in sheep's clothing as a "fifth columnist". There was a Merrie Melodies cartoon released in 1943 titled The Fifth-Column Mouse. Comic books also contained references to the fifth column.
Graham Greene, in The Quiet American (1955), uses the phrase "Fifth Column, Third Force, Seventh Day" in the second chapter.
In the 1959 British action film Operation Amsterdam, the term "fifth columnists" is used repeatedly to refer to Nazi sympathizing members of the Dutch Army.
The V franchise is a set of TV shows, novels and comics about an alien invasion of Earth. A group of aliens who oppose the invasion and assist the human resistance movement is called the Fifth Column.
In the episode "Flight Into the Future" from the 1960s TV show Lost In Space,
Dr. Smith was referred to as the fifth columnist of the Jupiter 2
expedition. In the first episode, he was a secret agent sent to sabotage
the mission who got caught on board at liftoff.
Robert A. Heinlein's 1941 story "The Day After Tomorrow", originally titled "Sixth Column",
refers to a fictional fifth column that
destroyed the European democracies
from within in the tragic days that led up to the final blackout of
European civilization. But this would not be a fifth column of traitors,
but a sixth column of patriots whose privilege it would be to destroy
the morale of invaders, make them afraid, unsure of themselves.
— Robert A. Heinlein, "The Day after Tomorrow (original title: Sixth Column)", Signet Paperback #T4227, Chapter 3, page 37
In Foyle's War,
Series 2 Episode 3, "War Games", one line reads, "It's the Second
salvage collection I've missed, they've got me down as a fifth
columnist."
Plausible deniability is the ability of people, typically senior officials in a formal or informal chain of command, to deny
knowledge of or responsibility for any damnable actions committed by
members of their organizational hierarchy. They may do so because of a
lack or absence of evidence that can confirm their participation, even
if they were personally involved in or at least willfully ignorant
of the actions. If illegal or otherwise disreputable and unpopular
activities become public, high-ranking officials may deny any awareness
of such acts to insulate themselves and shift the blame
onto the agents who carried out the acts, as they are confident that
their doubters will be unable to prove otherwise. The lack of evidence
to the contrary ostensibly makes the denial plausible (credible), but sometimes it merely renders any accusations unactionable.
The term typically implies forethought, such as intentionally
setting up the conditions for the plausible avoidance of responsibility
for one's future actions or knowledge. In some organizations, legal
doctrines such as command responsibility
exist to hold major parties responsible for the actions of subordinates
who are involved in heinous acts and nullify any legal protection that
their denial of involvement would carry.
In politics and espionage, deniability refers to the ability of a powerful player or intelligence agency to pass the buck and to avoid blowback
by secretly arranging for an action to be taken on its behalf by a
third party that is ostensibly unconnected with the major player. In
political campaigns, plausible deniability enables candidates to stay
clean and denounce third-party advertisements that use unethical
approaches or potentially libelous innuendo.
Although plausible deniability has existed throughout history, the term was coined by the CIA
in the early 1960s to describe the withholding of information from
senior officials to protect them from repercussions if illegal or
unpopular activities became public knowledge.[1]
Overview
Arguably,
the key concept of plausible deniability is plausibility. It is
relatively easy for a government official to issue a blanket denial of an action, and it is possible to destroy or cover up evidence after the fact, which might be sufficient to avoid criminal prosecution, for instance. However, the public might well disbelieve the denial,
particularly if there is strong circumstantial evidence or if the action
is believed to be so unlikely that the only logical explanation is that
the denial is false.
The concept is even more important in espionage. Intelligence may come from many sources, including human sources.
The exposure of information to which only a few people are privileged
may directly implicate some of the people in the disclosure. An example
is if an official is traveling secretly, and only one aide knows the
specific travel plans. If that official is assassinated during his
travels, and the circumstances of the assassination strongly suggest
that the assassin had foreknowledge of the official's travel plans, the
probable conclusion is that his aide has betrayed the official. There
may be no direct evidence linking the aide to the assassin, but
collaboration can be inferred from the facts alone, thus making the
aide's denial implausible.
History
The term's roots go back to US President Harry Truman's National Security Council
Paper 10/2 of June 18, 1948, which defined "covert operations" as "all
activities (except as noted herein) which are conducted or sponsored by
this Government against hostile foreign states or groups or in support
of friendly foreign states or groups but which are so planned and
executed that any US Government responsibility for them is not evident
to unauthorized persons and that if uncovered the US Government can
plausibly disclaim any responsibility for them." During the Eisenhower administration, NSC 10/2 was incorporated into the more-specific NSC 5412/2 "Covert Operations." NSC 5412 was declassified in 1977 and is located at the National Archives. The expression "plausibly deniable" was first used publicly by Central Intelligence Agency (CIA) Director Allen Dulles. The idea, on the other hand, is considerably older. For example, in the 19th century, Charles Babbage
described the importance of having "a few simply honest men" on a
committee who could be temporarily removed from the deliberations when
"a peculiarly delicate question arises" so that one of them could
"declare truly, if necessary, that he never was present at any meeting
at which even a questionable course had been proposed."
Church Committee
A U.S. Senate committee, the Church Committee,
in 1974–1975 conducted an investigation of the intelligence agencies.
In the course of the investigation, it was revealed that the CIA, going back to the Kennedy administration, had plotted the assassination of a number of foreign leaders, including Cuba's Fidel Castro,
but the president himself, who clearly supported such actions, was not
to be directly involved so that he could deny knowledge of it. That was
given the term "plausible denial."
Non-attribution to the United
States for covert operations was the original and principal purpose of
the so-called doctrine of "plausible denial." Evidence before the
Committee clearly demonstrates that this concept, designed to protect
the United States and its operatives from the consequences of
disclosures, has been expanded to mask decisions of the president and
his senior staff members.
— Church Committee
Plausible denial involves the creation of power structures and chains
of command loose and informal enough to be denied if necessary. The
idea was that the CIA and later other bodies could be given
controversial instructions by powerful figures, including the president
himself, but that the existence and true source of those instructions
could be denied if necessary if, for example, an operation went
disastrously wrong and it was necessary for the administration to
disclaim responsibility.
Later legislative barriers
The Hughes–Ryan Act of 1974 sought to put an end to plausible denial by requiring a presidential finding that each covert operation is important to national security, and the Intelligence Oversight Act of 1980 required Congress to be notified of all covert operations.
Both laws, however, are full of enough vague terms and escape hatches
to allow the executive branch to thwart their authors' intentions, as
was shown by the Iran–Contra affair.
Indeed, the members of Congress are in a dilemma since when they are
informed, they are in no position to stop the action, unless they leak
its existence and thereby foreclose the option of covertness.
Media reports
The
(Church Committee) conceded that to provide the United States with
"plausible denial" in the event that the anti-Castro plots were
discovered, Presidential authorization might have been subsequently
"obscured". (The Church Committee) also declared that, whatever the
extent of the knowledge, Presidents Eisenhower, Kennedy and Johnson
should bear the "ultimate responsibility" for the actions of their
subordinates.
CIA officials deliberately used Aesopian language in talking to the President and others outside the agency. (Richard Helms)
testified that he did not want to "embarrass a President" or sit around
an official table talking about "killing or murdering." The report
found this "circumlocution"
reprehensible, saying: "Failing to call dirty business by its rightful
name may have increased the risk of dirty business being done." The
committee also suggested that the system of command and control may have
been deliberately ambiguous, to give Presidents a chance for "plausible
denial."
What made the responsibility
difficult to pin down in retrospect was a sophisticated system of
institutionalized vagueness and circumlocution whereby no official - and
particularly a President - had to officially endorse questionable
activities. Unsavory orders were rarely committed to paper and what
record the committee found was shot through with references to
"removal," "the magic button"
and "the resort beyond the last resort." Thus the agency might at times
have misread instructions from on high, but it seemed more often to be
easing the burden of presidents who knew there were things they didn't
want to know. As former CIA director Richard Helms told the committee:
"The difficulty with this kind of thing, as you gentlemen are all
painfully aware, is that nobody wants to embarrass a President of the
United States."
In his testimony to the congressional committee studying the Iran–Contra affair, Vice Admiral John Poindexter
stated: "I made a deliberate decision not to ask the President, so that
I could insulate him from the decision and provide some future
deniability for the President if it ever leaked out."
Declassified government documents
A telegram from the Ambassador in Vietnam Henry Cabot Lodge Jr., to Special Assistant for National Security Affairs McGeorge Bundy on US options with respect to a possible coup, mentions plausible denial.
CIA and White House documents on covert political intervention in
the 1964 Chilean election have been declassified. The CIA's Chief of
Western Hemisphere Division, J.C. King, recommended that funds for the campaign "be provided in a fashion causing (Eduardo Frei Montalva, president of Chile) to infer United States origin of funds and yet permitting plausible denial."
Training files of the CIA's covert "Operation PBSuccess" for the 1954 coup in Guatemala describe plausible deniability. According to the National Security Archive:
"Among the documents found in the training files of Operation PBSuccess
and declassified by the Agency is a CIA document titled 'A Study of
Assassination.' A how-to guide book in the art of political killing, the
19-page manual offers detailed descriptions of the procedures,
instruments, and implementation of assassination." The manual states
that to provide plausible denial, "no assassination instructions should
ever be written or recorded."
Soviet operations
In the 1980s, the Soviet KGB ran OPERATION INFEKTION (also called "OPERATION DENVER"), which utilised the East German Stasi
and Soviet-affiliated press to spread the idea that HIV/AIDS was an
engineered bioweapon. The Stasi acquired plausible deniability on the
operation by covertly supporting biologist Jakob Segal, whose stories were picked up by international press, including "numerous bourgeois newspapers" such as the Sunday Express.
Publications in third-party countries were then cited as the
originators of the claims. Meanwhile, Soviet intelligence obtained
plausible deniability by utilising the German Stasi in the
disinformation operation.
Little green men and Wagner Group
In 2014, "Little green men" — troops without insignia carrying modern Russian military equipment, emerged at the start of the Russo-Ukrainian War, which The Moscow Times described as a tactic of plausible deniability.
The "Wagner Group" a Russian private military company
has been described as an attempt at plausible deniability for
Kremlin-backed interventions in Ukraine, Syria, and in various
interventions in Africa.
Flaws
It is an open door to the abuse of authority: it requires that the parties in question be able to claim to have acted independently, which, in the end, is tantamount to giving them license to act independently.
The denials are sometimes seen as plausible but sometimes seen through by both the media and the populace.
Plausible deniability increases the risk of misunderstanding between senior officials and their employees.
Other examples
Another
example of plausible deniability is someone who actively avoids gaining
certain knowledge of facts because it benefits that person not to know.
As an example, a lawyer
may suspect that facts exist that would hurt his case but decide not to
investigate the issue because if he has actual knowledge, the rules of
ethics might require him to reveal the facts to the opposing side.
Council on Foreign Relations
...the U.S. government may at times require a certain deniability. Private activities can provide that deniability.
— Council on Foreign Relations, Finding America's Voice: A Strategy for Reinvigorating U.S. Public Diplomacy
Use in computer networks
In
computer networks, plausible deniability often refers to a situation in
which people can deny transmitting a file, even when it is proven to
come from their computer.
That is sometimes done by setting the computer to relay certain
types of broadcasts automatically in such a way that the original
transmitter of a file is indistinguishable from those who are merely
relaying it. In that way, those who first transmitted the file can claim
that their computer had merely relayed it from elsewhere. This principle is used in the opentracker BitTorrent tracker implementation by including random IP addresses in peer lists.
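As a rough illustration of this idea (a simplified Python sketch, not the actual opentracker code; the function name and parameters are hypothetical), a tracker can pad the peer list it returns with randomly generated addresses, so a genuine uploader cannot be distinguished from the noise:

    import random

    def build_peer_list(real_peers, padding=10):
        """Mix real peers with randomly generated (fake) addresses, so an
        observer cannot tell which entries actually transmitted the file."""
        fake_peers = [
            ("{}.{}.{}.{}".format(*(random.randint(1, 254) for _ in range(4))),
             random.randint(1024, 65535))
            for _ in range(padding)
        ]
        peers = list(real_peers) + fake_peers
        random.shuffle(peers)
        return peers

    # Hypothetical usage: one real seeder hidden among random entries.
    print(build_peer_list([("192.0.2.7", 6881)]))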
In encrypted messaging protocols such as Bitmessage, every user on the network keeps a copy of every message but can only decrypt their own, and finding those requires attempting to decrypt every single message. Because everyone receives everything and the outcome of each decryption attempt is kept private, it is impossible to determine who sent a message to whom without being able to decrypt it.
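A minimal sketch of this everyone-receives-everything model, using a toy authenticated scheme rather than Bitmessage's actual protocol (all names here are illustrative): each client tries its own key against every broadcast message, and only the intended recipient's attempt succeeds, so the traffic pattern alone reveals nothing about who a message was for.

    import hashlib, hmac, os

    def _keystream(key, nonce, length):
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:length]

    def seal(key, plaintext):
        nonce = os.urandom(16)
        ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
        tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
        return nonce + ct + tag

    def try_open(key, blob):
        nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
        if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
            return None  # not ours: the decryption attempt fails silently
        return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

    # Every client scans the whole broadcast pool with its own key.
    alice, bob = os.urandom(32), os.urandom(32)
    pool = [seal(bob, b"meet at noon"), seal(alice, b"hello alice")]
    print([try_open(bob, m) for m in pool])  # Bob recovers only his own message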
It can also be done by a VPN if the host is not known.
In any case, that claim cannot be disproven without a complete decrypted log of all network connections.
Freenet file sharing
The Freenet file sharing network is another application of the idea: it obfuscates data sources and flows in order to protect operators and users of the network by preventing them, and by extension observers such as censors, from knowing where data comes from and where it is stored.
Use in cryptography
In cryptography, deniable encryption may be used to describe steganographic techniques
in which the very existence of an encrypted file or message is deniable
in the sense that an adversary cannot prove that an encrypted message
exists. In that case, the system is said to be "fully undetectable".
Some systems take this further, such as MaruTukku, FreeOTFE and (to a much lesser extent) TrueCrypt and VeraCrypt,
which nest encrypted data. The owner of the encrypted data may reveal
one or more keys to decrypt certain information from it, and then deny
that more keys exist, a statement which cannot be disproven without
knowledge of all encryption keys involved. The existence of "hidden"
data within the overtly encrypted data is then deniable in the sense that it cannot be proven to exist.
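To make the hidden-data idea concrete, here is a toy Python sketch (not how TrueCrypt or VeraCrypt actually work, and not secure; the slot layout and helper names are invented for illustration): a container holds fixed-size slots, a decoy secret is encrypted under one key, and a hidden secret under another or pure random filler. Revealing only the decoy key exposes nothing that proves the hidden data exists, because hidden ciphertext and unused random space look alike.

    import hashlib, os

    def stream_xor(key, nonce, data):
        out, counter = b"", 0
        while len(out) < len(data):
            out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, out))

    SLOT = 64  # fixed slot size so lengths leak nothing

    def make_container(decoy_key, decoy, hidden_key=None, hidden=None):
        slots = []
        for key, secret in ((decoy_key, decoy), (hidden_key, hidden)):
            if key is None:
                slots.append(os.urandom(16 + SLOT))  # pure random filler
            else:
                nonce = os.urandom(16)
                slots.append(nonce + stream_xor(key, nonce, secret.ljust(SLOT, b"\0")))
        return b"".join(slots)

    def open_slot(container, index, key):
        slot = container[index * (16 + SLOT):(index + 1) * (16 + SLOT)]
        return stream_xor(key, slot[:16], slot[16:]).rstrip(b"\0")

    k1, k2 = os.urandom(32), os.urandom(32)
    c = make_container(k1, b"harmless diary", k2, b"real secret")
    print(open_slot(c, 0, k1))  # revealing k1 shows only the decoy;
                                # slot 1 is indistinguishable from random filler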
Programming
The Underhanded C Contest
is an annual programming contest involving the creation of carefully
crafted defects, which have to be both very hard to find and plausibly
deniable as mistakes once found.
Cryptography prior to the modern age was effectively synonymous with encryption, converting readable information (plaintext) to unintelligible nonsense text (ciphertext), which can only be read by reversing the process (decryption).
The sender of an encrypted (coded) message shares the decryption
(decoding) technique only with intended recipients to preclude access
from adversaries. The cryptography literature often uses the names "Alice" (or "A") for the sender, "Bob" (or "B") for the intended recipient, and "Eve" (or "E") for the eavesdropping adversary. Since the development of rotor cipher machines in World War I and the advent of computers in World War II, cryptography methods have become increasingly complex and their applications more varied.
Modern cryptography is heavily based on mathematical theory and computer science practice; cryptographic algorithms are designed around computational hardness assumptions,
making such algorithms hard to break in actual practice by any
adversary. While it is theoretically possible to break into a
well-designed system, it is infeasible in actual practice to do so. Such
schemes, if well designed, are therefore termed "computationally
secure"; theoretical advances (e.g., improvements in integer factorization algorithms) and faster computing technology require these designs to be continually reevaluated, and if necessary, adapted. Information-theoretically secure schemes that provably cannot be broken even with unlimited computing power, such as the one-time pad, are much more difficult to use in practice than the best theoretically breakable, but computationally secure, schemes.
Alphabet shift ciphers are believed to have been used by Julius Caesar over 2,000 years ago. With a shift of k = 3, for example, the letters of the alphabet are shifted three positions in one direction to encrypt and three in the other direction to decrypt.
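As a concrete illustration, a minimal Python sketch of such a shift cipher (the function name is arbitrary; decryption is simply the opposite shift):

    import string

    def caesar(text, k):
        """Shift each letter k places forward in the alphabet (decrypt with -k)."""
        shifted = string.ascii_lowercase[k:] + string.ascii_lowercase[:k]
        table = str.maketrans(string.ascii_lowercase, shifted)
        return text.lower().translate(table)

    ct = caesar("attack at dawn", 3)   # 'dwwdfn dw gdzq'
    print(ct, caesar(ct, -3))          # shifting back by 3 recovers the plaintext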
Until modern times, cryptography referred almost exclusively to
"encryption", which is the process of converting ordinary information
(called plaintext) into an unintelligible form (called ciphertext). Decryption is the reverse, in other words, moving from the unintelligible ciphertext back to plaintext. A cipher
(or cypher) is a pair of algorithms that carry out the encryption and
the reversing decryption. The detailed operation of a cipher is
controlled both by the algorithm and, in each instance, by a "key". The
key is a secret (ideally known only to the communicants), usually a
string of characters (ideally short so it can be remembered by the
user), which is needed to decrypt the ciphertext. In formal
mathematical terms, a "cryptosystem"
is the ordered list of elements of finite possible plaintexts, finite
possible cyphertexts, finite possible keys, and the encryption and
decryption algorithms that correspond to each key. Keys are important
both formally and in actual practice, as ciphers without variable keys
can be trivially broken with only the knowledge of the cipher used and
are therefore useless (or even counter-productive) for most purposes.
Historically, ciphers were often used directly for encryption or
decryption without additional procedures such as authentication or integrity checks.
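Written out in conventional textbook notation (a standard formalization, not specific to any one source), the "ordered list of elements" above becomes:

\[
(\mathcal{P}, \mathcal{C}, \mathcal{K}, \{E_k\}_{k \in \mathcal{K}}, \{D_k\}_{k \in \mathcal{K}}), \qquad
E_k : \mathcal{P} \to \mathcal{C}, \quad D_k : \mathcal{C} \to \mathcal{P},
\]

with the correctness condition \(D_k(E_k(p)) = p\) for every plaintext \(p \in \mathcal{P}\) and every key \(k \in \mathcal{K}\). Breaking a cipher with a variable key then means recovering \(p\) (or \(k\)) from \(E_k(p)\) without knowledge of \(k\).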
There are two main types of cryptosystems: symmetric and asymmetric.
In symmetric systems, the only ones known until the 1970s, the same
secret key encrypts and decrypts a message. Data manipulation in
symmetric systems is significantly faster than in asymmetric systems.
Asymmetric systems use a "public key" to encrypt a message and a related
"private key" to decrypt it. The advantage of asymmetric systems is
that the public key can be freely published, allowing parties to
establish secure communication without having a shared secret key. In
practice, asymmetric systems are used to first exchange a secret key,
and then secure communication proceeds via a more efficient symmetric
system using that key. Examples of asymmetric systems include Diffie–Hellman key exchange, RSA (Rivest–Shamir–Adleman), ECC (Elliptic Curve Cryptography), and Post-quantum cryptography. Secure symmetric algorithms include the commonly used AES (Advanced Encryption Standard) which replaced the older DES (Data Encryption Standard). Insecure symmetric algorithms include children's language tangling schemes such as Pig Latin or other cant, and all historical cryptographic schemes, however seriously intended, prior to the invention of the one-time pad early in the 20th century.
In colloquial use, the term "code"
is often used to mean any method of encryption or concealment of
meaning. However, in cryptography, code has a more specific meaning: the
replacement of a unit of plaintext (i.e., a meaningful word or phrase)
with a code word
(for example, "wallaby" replaces "attack at dawn"). A cypher, in
contrast, is a scheme for changing or substituting an element below such
a level (a letter, a syllable, or a pair of letters, etc.) in order to
produce a cyphertext.
Cryptanalysis
is the term used for the study of methods for obtaining the meaning of
encrypted information without access to the key normally required to do
so; i.e., it is the study of how to "crack" encryption algorithms or
their implementations.
Some use the terms "cryptography" and "cryptology" interchangeably in English,
while others (including US military practice generally) use
"cryptography" to refer specifically to the use and practice of
cryptographic techniques and "cryptology" to refer to the combined study
of cryptography and cryptanalysis.
English is more flexible than several other languages in which
"cryptology" (done by cryptologists) is always used in the second sense
above. RFC2828 advises that steganography is sometimes included in cryptology.
The study of characteristics of languages that have some
application in cryptography or cryptology (e.g. frequency data, letter
combinations, universal patterns, etc.) is called cryptolinguistics.
Before the modern era, cryptography focused on message confidentiality (i.e., encryption)—conversion of messages
from a comprehensible form into an incomprehensible one and back again
at the other end, rendering it unreadable by interceptors or
eavesdroppers without secret knowledge (namely the key needed for
decryption of that message). Encryption attempted to ensure secrecy in communications, such as those of spies, military leaders, and diplomats.
In recent decades, the field has expanded beyond confidentiality
concerns to include techniques for message integrity checking,
sender/receiver identity authentication, digital signatures, interactive proofs and secure computation, among others.
The main classical cipher types are transposition ciphers,
which rearrange the order of letters in a message (e.g., 'hello world'
becomes 'ehlol owrdl' in a trivially simple rearrangement scheme), and substitution ciphers,
which systematically replace letters or groups of letters with other
letters or groups of letters (e.g., 'fly at once' becomes 'gmz bu podf'
by replacing each letter with the one following it in the Latin alphabet).[21]
Simple versions of either have never offered much confidentiality from
enterprising opponents. An early substitution cipher was the Caesar cipher, in which each letter in the plaintext was replaced by a letter some fixed number of positions further down the alphabet. Suetonius reports that Julius Caesar used it with a shift of three to communicate with his generals. Atbash is an example of an early Hebrew cipher. The earliest known use of cryptography is some carved ciphertext on stone in Egypt
(ca 1900 BCE), but this may have been done for the amusement of
literate observers rather than as a way of concealing information.
The Greeks of Classical times are said to have known of ciphers (e.g., the scytale transposition cipher claimed to have been used by the Spartan military).
Steganography (i.e., hiding even the existence of a message so as to
keep it confidential) was also first developed in ancient times. An
early example, from Herodotus, was a message tattooed on a slave's shaved head and concealed under the regrown hair. More modern examples of steganography include the use of invisible ink, microdots, and digital watermarks to conceal information.
In India, the 2000-year-old Kamasutra of Vātsyāyana
speaks of two different kinds of ciphers called Kautiliyam and
Mulavediya. In the Kautiliyam, the cipher letter substitutions are based
on phonetic relations, such as vowels becoming consonants. In the
Mulavediya, the cipher alphabet consists of pairing letters and using
the reciprocal ones.
In Sassanid Persia, there were two secret scripts, according to the Muslim author Ibn al-Nadim: the šāh-dabīrīya (literally "King's script") which was used for official correspondence, and the rāz-saharīya which was used to communicate secret messages with other countries.
David Kahn notes in The Codebreakers that modern cryptology originated among the Arabs, the first people to systematically document cryptanalytic methods. Al-Khalil (717–786) wrote the Book of Cryptographic Messages, which contains the first use of permutations and combinations to list all possible Arabic words with and without vowels.
First page of a book by Al-Kindi which discusses encryption of messages
Ciphertexts produced by a classical cipher
(and some modern ciphers) will reveal statistical information about the
plaintext, and that information can often be used to break the cipher.
After the discovery of frequency analysis, perhaps by the Arab mathematician and polymath Al-Kindi (also known as Alkindus) in the 9th century,
nearly all such ciphers could be broken by an informed attacker. Such
classical ciphers still enjoy popularity today, though mostly as puzzles (see cryptogram). Al-Kindi wrote a book on cryptography entitled Risalah fi Istikhraj al-Mu'amma (Manuscript for the Deciphering Cryptographic Messages), which described the first known use of frequency analysis cryptanalysis techniques.
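A small Python sketch of single-letter frequency analysis against a Caesar-style shift cipher: the attacker scores every possible shift against typical English letter frequencies and keeps the best fit (the frequency table below is an approximate, assumed one, and the function names are illustrative).

    from collections import Counter

    # Approximate relative frequencies of common letters in English text.
    ENGLISH = {'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
               's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'u': 2.8}

    def shift(text, k):
        return ''.join(chr((ord(c) - 97 + k) % 26 + 97) if c.isalpha() else c
                       for c in text.lower())

    def english_score(text):
        counts = Counter(c for c in text if c.isalpha())
        total = sum(counts.values()) or 1
        return sum(ENGLISH.get(c, 0) * n / total for c, n in counts.items())

    def break_caesar(ciphertext):
        # Try all 26 shifts; keep the one whose letter mix looks most like English.
        return max(((k, shift(ciphertext, k)) for k in range(26)),
                   key=lambda pair: english_score(pair[1]))

    print(break_caesar("wkh hqhpb nqrzv wkh vbvwhp"))  # shift 23 undoes a +3 encryption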
Language letter frequencies may offer little help for some extended historical encryption techniques such as homophonic cipher
that tend to flatten the frequency distribution. For those ciphers,
language letter group (or n-gram) frequencies may provide an attack.
Essentially all ciphers remained vulnerable to cryptanalysis
using the frequency analysis technique until the development of the
polyalphabetic cipher, most clearly by Leon Battista Alberti around the year 1467, though there is some indication that it was already known to Al-Kindi.
Alberti's innovation was to use different ciphers (i.e., substitution
alphabets) for various parts of a message (perhaps for each successive
plaintext letter at the limit). He also invented what was probably the
first automatic cipher device, a wheel that implemented a partial realization of his invention. In the Vigenère cipher, a polyalphabetic cipher, encryption uses a key word, which controls letter substitution depending on which letter of the key word is used. In the mid-19th century Charles Babbage showed that the Vigenère cipher was vulnerable to Kasiski examination, but this was first published about ten years later by Friedrich Kasiski.
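A short Python sketch of the Vigenère scheme (a toy illustration, not any particular historical implementation): the key word is repeated over the plaintext, and each key letter selects a different shift alphabet, which is what defeats simple single-letter frequency analysis.

    from itertools import cycle

    def vigenere(text, key, decrypt=False):
        """Encrypt (or decrypt) letters with a repeating key word; other characters pass through."""
        key_shifts = cycle(ord(k) - 97 for k in key.lower())
        out = []
        for c in text.lower():
            if c.isalpha():
                k = next(key_shifts)
                k = -k if decrypt else k
                out.append(chr((ord(c) - 97 + k) % 26 + 97))
            else:
                out.append(c)
        return ''.join(out)

    ct = vigenere("attack at dawn", "lemon")          # 'lxfopv ef rnhr'
    print(ct, vigenere(ct, "lemon", decrypt=True))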
Although frequency analysis can be a powerful and general
technique against many ciphers, encryption has still often been
effective in practice, as many a would-be cryptanalyst was unaware of
the technique. Breaking a message without using frequency analysis
essentially required knowledge of the cipher used and perhaps of the key
involved, thus making espionage, bribery, burglary, defection, etc.,
more attractive approaches to the cryptanalytically uninformed. It was
finally explicitly recognized in the 19th century that secrecy of a cipher's algorithm is neither a sensible nor a practical safeguard of message
security; in fact, it was further realized that any adequate
cryptographic scheme (including ciphers) should remain secure even if
the adversary fully understands the cipher algorithm itself. Security of
the key used should alone be sufficient for a good cipher to maintain
confidentiality under an attack. This fundamental principle was first
explicitly stated in 1883 by Auguste Kerckhoffs and is generally called Kerckhoffs's Principle; alternatively and more bluntly, it was restated by Claude Shannon, the inventor of information theory and the fundamentals of theoretical cryptography, as Shannon's Maxim—'the enemy knows the system'.
Different physical devices and aids have been used to assist with
ciphers. One of the earliest may have been the scytale of ancient
Greece, a rod supposedly used by the Spartans as an aid for a
transposition cipher. In medieval times, other aids were invented such
as the cipher grille,
which was also used for a kind of steganography. With the invention of
polyalphabetic ciphers came more sophisticated aids such as Alberti's
own cipher disk, Johannes Trithemius' tabula recta scheme, and Thomas Jefferson's wheel cypher (not publicly known, and reinvented independently by Bazeries
around 1900). Many mechanical encryption/decryption devices were
invented early in the 20th century, and several patented, among them rotor machines—famously including the Enigma machine used by the German government and military from the late 1920s and during World War II.
The ciphers implemented by better quality examples of these machine
designs brought about a substantial increase in cryptanalytic difficulty
after WWI.
Early computer-era cryptography
Cryptanalysis
of the new mechanical ciphering devices proved to be both difficult and
laborious. In the United Kingdom, cryptanalytic efforts at Bletchley Park during WWII spurred the development of more efficient means for carrying out repetitious tasks, such as military code breaking (decryption). This culminated in the development of the Colossus, the world's first fully electronic, digital, programmable computer, which assisted in the decryption of ciphers generated by the German Army's Lorenz SZ40/42 machine.
Extensive open academic research into cryptography is relatively recent, beginning in the mid-1970s. In the early 1970s IBM
personnel designed the Data Encryption Standard (DES) algorithm that
became the first federal government cryptography standard in the United
States. In 1976 Whitfield Diffie and Martin Hellman published the Diffie–Hellman key exchange algorithm. In 1977 the RSA algorithm was published in Martin Gardner's Scientific American column. Since then, cryptography has become a widely used tool in communications, computer networks, and computer security generally.
Some modern cryptographic techniques can only keep their keys secret if certain mathematical problems are intractable, such as the integer factorization or the discrete logarithm problems, so there are deep connections with abstract mathematics. There are very few cryptosystems that are proven to be unconditionally secure. The one-time pad
is one, and was proven to be so by Claude Shannon. There are a few
important algorithms that have been proven secure under certain
assumptions. For example, the infeasibility of factoring extremely large
integers is the basis for believing that RSA is secure, and some other
systems, but even so, proof of unbreakability is unavailable since the
underlying mathematical problem remains open. In practice, these are
widely used, and are believed unbreakable in practice by most competent
observers. There are systems similar to RSA, such as one by Michael O. Rabin, that are provably secure provided factoring n = pq is impossible, but the scheme is quite unusable in practice. The discrete logarithm problem is the basis for believing some other cryptosystems are secure, and again, there are related, less practical systems that are provably secure relative to the solvability or insolvability of the discrete log problem.
As well as being aware of cryptographic history, cryptographic
algorithm and system designers must also sensibly consider probable
future developments while working on their designs. For instance,
continuous improvements in computer processing power have increased the
scope of brute-force attacks, so required key lengths have similarly advanced. The potential impact of quantum computing is already being considered by some cryptographic system designers developing post-quantum cryptography.
The announced imminence of small implementations of these machines may
be making the need for preemptive caution rather more than merely
speculative.
Modern cryptography
Prior to the early 20th century, cryptography was mainly concerned with linguistic and lexicographic
patterns. Since then cryptography has broadened in scope, and now makes
extensive use of mathematical subdisciplines, including information theory, computational complexity, statistics, combinatorics, abstract algebra, number theory, and finite mathematics. Cryptography is also a branch of engineering,
but an unusual one since it deals with active, intelligent, and
malevolent opposition; other kinds of engineering (e.g., civil or
chemical engineering) need deal only with neutral natural forces. There
is also active research examining the relationship between cryptographic
problems and quantum physics.
Just as the development of digital computers and electronics
helped in cryptanalysis, it made possible much more complex ciphers.
Furthermore, computers allowed for the encryption of any kind of data
representable in any binary format, unlike classical ciphers which only
encrypted written language texts; this was new and significant. Computer
use has thus supplanted linguistic cryptography, both for cipher design
and cryptanalysis. Many computer ciphers can be characterized by their
operation on binary bit
sequences (sometimes in groups or blocks), unlike classical and
mechanical schemes, which generally manipulate traditional characters
(i.e., letters and digits) directly. However, computers have also
assisted cryptanalysis, which has compensated to some extent for
increased cipher complexity. Nonetheless, good modern ciphers have
stayed ahead of cryptanalysis; it is typically the case that use of a
quality cipher is very efficient (i.e., fast and requiring few
resources, such as memory or CPU capability), while breaking it requires
an effort many orders of magnitude larger, and vastly larger than that
required for any classical cipher, making cryptanalysis so inefficient
and impractical as to be effectively impossible.
Symmetric-key cryptography, where a single key is used for encryption and decryption
Symmetric-key cryptography refers to encryption methods in which both
the sender and receiver share the same key (or, less commonly, in which
their keys are different, but related in an easily computable way).
This was the only kind of encryption publicly known until June 1976.
One round (out of 8.5) of the IDEA cipher, used in most versions of PGP and OpenPGP compatible software for time-efficient encryption of messages
Symmetric key ciphers are implemented as either block ciphers or stream ciphers.
A block cipher enciphers input in blocks of plaintext as opposed to
individual characters, the input form used by a stream cipher.
The Data Encryption Standard (DES) and the Advanced Encryption Standard (AES) are block cipher designs that have been designated cryptography standards by the US government (though DES's designation was finally withdrawn after the AES was adopted). Despite its deprecation as an official standard, DES (especially its still-approved and much more secure triple-DES variant) remains quite popular; it is used across a wide range of applications, from ATM encryption to e-mail privacy and secure remote access.
Many other block ciphers have been designed and released, with
considerable variation in quality. Many, even some designed by capable
practitioners, have been thoroughly broken, such as FEAL.
Stream ciphers, in contrast to the 'block' type, create an
arbitrarily long stream of key material, which is combined with the
plaintext bit-by-bit or character-by-character, somewhat like the one-time pad.
In a stream cipher, the output stream is created based on a hidden
internal state that changes as the cipher operates. That internal state
is initially set up using the secret key material. RC4 was for many years a widely used stream cipher, though it is now considered insecure. Block ciphers can be operated as stream ciphers by using the block cipher as a keystream generator (in effect, a pseudorandom number generator) and XORing each bit of the plaintext with each bit of the keystream.
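The keystream idea can be sketched in a few lines of Python. In this toy counter-mode-style construction, SHA-256 stands in for a block cipher purely for illustration; it is not a secure design, and the key and nonce values are made up.

    import hashlib

    def keystream_xor(key, nonce, data):
        # Derive keystream blocks from hash(key || nonce || counter) and XOR
        # them with the data; applying the same operation again decrypts.
        out = bytearray()
        counter = 0
        while len(out) < len(data):
            block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
            out.extend(block)
            counter += 1
        return bytes(b ^ k for b, k in zip(data, out))

    ct = keystream_xor(b"sixteen byte key", b"nonce-01", b"attack at dawn")
    pt = keystream_xor(b"sixteen byte key", b"nonce-01", ct)   # b'attack at dawn'

As with the one-time pad, reusing the same key and nonce for two different messages would leak the XOR of the plaintexts, so real stream-cipher modes insist on unique nonces.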
Message authentication codes
(MACs) are much like cryptographic hash functions, except that a secret
key can be used to authenticate the hash value upon receipt; this additional complication blocks an attack scheme against bare digest algorithms, and so has been thought worth the effort. (Cryptographic hash functions themselves, a third type of cryptographic algorithm, are discussed in their own section below.)
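A minimal example of the message authentication codes described above, using Python's standard hmac module with SHA-256; the key and message are placeholders.

    import hmac, hashlib

    key = b"shared secret key"
    message = b"pay 100 units to account 42"

    # Sender computes the tag and transmits it alongside the message.
    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # Receiver recomputes the tag with the shared key and compares in constant time.
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    assert hmac.compare_digest(tag, expected)

Any change to the message (or use of the wrong key) produces a different tag, so the receiver detects tampering without any secrecy of the message itself.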
Public-key cryptography, where different keys are used for encryption and decryption.
Symmetric-key cryptosystems use the same key for encryption and
decryption of a message, although a message or group of messages can
have a different key than others. A significant disadvantage of
symmetric ciphers is the key management
necessary to use them securely. Each distinct pair of communicating
parties must, ideally, share a different key, and perhaps for each
ciphertext exchanged as well. The number of keys required increases as
the square
of the number of network members, which very quickly requires complex
key management schemes to keep them all consistent and secret.
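A quick calculation makes the scaling concrete: if every pair of parties shares its own key, n participants need n(n-1)/2 keys. The numbers below are just arithmetic, not data from any particular network.

    def pairwise_keys(n):
        # One key per unordered pair of participants.
        return n * (n - 1) // 2

    pairwise_keys(10)      # 45
    pairwise_keys(1000)    # 499500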
In a groundbreaking 1976 paper, Whitfield Diffie and Martin Hellman proposed the notion of public-key (also, more generally, called asymmetric key) cryptography in which two different but mathematically related keys are used—a public key and a private key.
A public key system is so constructed that calculation of one key (the
'private key') is computationally infeasible from the other (the 'public
key'), even though they are necessarily related. Instead, both keys are
generated secretly, as an interrelated pair. The historian David Kahn
described public-key cryptography as "the most revolutionary new
concept in the field since polyalphabetic substitution emerged in the
Renaissance".
In public-key cryptosystems, the public key may be freely
distributed, while its paired private key must remain secret. In a
public-key encryption system, the public key is used for encryption, while the private or secret key
is used for decryption. While Diffie and Hellman could not find such a
system, they showed that public-key cryptography was indeed possible by
presenting the Diffie–Hellman key exchange protocol, a solution that is now widely used in secure communications to allow two parties to secretly agree on a shared encryption key.
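The exchange can be sketched with Python's built-in modular exponentiation. The modulus below (a Mersenne prime) and the generator are illustrative assumptions only; real deployments use standardized groups of 2048 bits or more, or elliptic curves.

    import secrets

    p = 2**127 - 1        # toy prime modulus (illustration only)
    g = 3                 # assumed generator for this sketch

    a = secrets.randbelow(p - 2) + 2    # Alice's secret exponent
    b = secrets.randbelow(p - 2) + 2    # Bob's secret exponent

    A = pow(g, a, p)      # Alice publishes A
    B = pow(g, b, p)      # Bob publishes B

    # Each side combines its own secret with the other's public value.
    assert pow(B, a, p) == pow(A, b, p)   # both arrive at the same shared secret

An eavesdropper sees p, g, A, and B, but recovering the shared secret from those values is the discrete logarithm problem.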
The X.509 standard defines the most commonly used format for public key certificates.
Diffie and Hellman's publication sparked widespread academic
efforts in finding a practical public-key encryption system. This race
was finally won in 1978 by Ronald Rivest, Adi Shamir, and Len Adleman, whose solution has since become known as the RSA algorithm.
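The arithmetic behind RSA can be shown with deliberately tiny numbers (the standard textbook example); real keys use primes of roughly a thousand bits or more, together with padding schemes such as OAEP.

    p, q = 61, 53
    n = p * q                   # 3233, the public modulus
    phi = (p - 1) * (q - 1)     # 3120
    e = 17                      # public exponent, coprime to phi
    d = pow(e, -1, phi)         # private exponent, 2753 (modular inverse; Python 3.8+)

    m = 65                      # message encoded as an integer < n
    c = pow(m, e, n)            # encrypt with the public key -> 2790
    assert pow(c, d, n) == m    # decrypt with the private key recovers m

Recovering d from (n, e) alone requires factoring n into p and q, which is easy for 3233 but believed infeasible for moduli of realistic size.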
A document published in 1997 by the Government Communications Headquarters (GCHQ), a British intelligence organization, revealed that cryptographers at GCHQ had anticipated several academic developments. Reportedly, around 1970, James H. Ellis had conceived the principles of asymmetric key cryptography. In 1973, Clifford Cocks invented a solution that was very similar in design rationale to RSA. In 1974, Malcolm J. Williamson is claimed to have developed the Diffie–Hellman key exchange.
In
this example the message is only signed and not encrypted. 1) Alice
signs a message with her private key. 2) Bob can verify that Alice sent
the message and that the message has not been modified.
Public-key cryptography is also used for implementing digital signature schemes. A digital signature is reminiscent of an ordinary signature; they both have the characteristic of being easy for a user to produce, but difficult for anyone else to forge.
Digital signatures can also be permanently tied to the content of the
message being signed; they cannot then be 'moved' from one document to
another, for any attempt will be detectable. In digital signature
schemes, there are two algorithms: one for signing, in which a secret key is used to process the message (or a hash of the message, or both), and one for verification, in which the matching public key is used with the message to check the validity of the signature. RSA and DSA are two of the most popular digital signature schemes. Digital signatures are central to the operation of public key infrastructures and many network security schemes (e.g., SSL/TLS, many VPNs, etc.).
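The two-algorithm structure (sign with the private key, verify with the public key) looks like this using Ed25519 from the third-party pyca/cryptography package; the package choice and the message are assumptions made for the sketch, not part of any particular standard described above.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"release build 1.2.3"
    signature = private_key.sign(message)        # signing uses the private key

    try:
        public_key.verify(signature, message)    # verification uses the public key
    except InvalidSignature:
        print("signature or message was altered")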
Public-key algorithms are most often based on the computational complexity of "hard" problems, often from number theory. For example, the hardness of RSA is related to the integer factorization problem, while Diffie–Hellman and DSA are related to the discrete logarithm problem. The security of elliptic curve cryptography is based on number theoretic problems involving elliptic curves. Because of the difficulty of the underlying problems, most public-key algorithms involve operations such as modular
multiplication and exponentiation, which are much more computationally
expensive than the techniques used in most block ciphers, especially
with typical key sizes. As a result, public-key cryptosystems are
commonly hybrid cryptosystems,
in which a fast high-quality symmetric-key encryption algorithm is used
for the message itself, while the relevant symmetric key is sent with
the message, but encrypted using a public-key algorithm. Similarly,
hybrid signature schemes are often used, in which a cryptographic hash
function is computed, and only the resulting hash is digitally signed.
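A hybrid scheme of this kind might look as follows, again using the pyca/cryptography package (an assumed dependency, not a prescribed implementation): a fresh AES-GCM session key encrypts the message, and RSA-OAEP encrypts that session key for the recipient.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_public = recipient_private.public_key()

    # Sender: symmetric encryption of the message, public-key encryption of the key.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"the actual message", None)
    wrapped_key = recipient_public.encrypt(session_key, oaep)

    # Recipient: unwrap the session key with the private key, then decrypt the message.
    recovered_key = recipient_private.decrypt(wrapped_key, oaep)
    assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"the actual message"

Only the short session key is processed with the expensive public-key operation; the bulk of the data goes through the fast symmetric cipher.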
Cryptographic hash functions
Cryptographic hash functions are a third type of cryptographic algorithm. Unlike ciphers, they take no secret key and are not designed to be reversible.
They take a message of any length as input, and output a short,
fixed-length hash,
which can be used in (for example) a digital signature. For good hash
functions, an attacker cannot find two messages that produce the same
hash. MD4 is a long-used hash function that is now broken; MD5, a strengthened variant of MD4, is also widely used but broken in practice. The US National Security Agency developed the Secure Hash Algorithm series of MD5-like hash functions: SHA-0 was a flawed algorithm that the agency withdrew; SHA-1 is widely deployed and more secure than MD5, but cryptanalysts have identified attacks against it; the SHA-2
family improves on SHA-1, but as of 2011 attacks had been found only against reduced-round versions of it; and
the US standards authority thought it "prudent" from a security
perspective to develop a new standard to "significantly improve the
robustness of NIST's overall hash algorithm toolkit." Thus, a hash function design competition was meant to select a new U.S. national standard, to be called SHA-3, by 2012. The competition ended on October 2, 2012, when the NIST announced that Keccak would be the new SHA-3 hash algorithm.
Unlike block and stream ciphers that are invertible, cryptographic hash
functions produce a hashed output that cannot be used to retrieve the
original input data. Cryptographic hash functions are used to verify the
authenticity of data retrieved from an untrusted source or to add a
layer of security.
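The basic behaviour can be seen with Python's standard hashlib module: a fixed-length digest that changes completely for even a one-character change in the input. The inputs are placeholders.

    import hashlib

    print(hashlib.sha256(b"hello world").hexdigest())
    # b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9

    print(hashlib.sha256(b"hello world!").hexdigest())   # a completely different digest

    # SHA-3 (Keccak) is also available in the standard library.
    print(hashlib.sha3_256(b"hello world").hexdigest())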
The goal of cryptanalysis is to find some weakness or insecurity in a
cryptographic scheme, thus permitting its subversion or evasion.
It is a common misconception that every encryption method can be broken. In connection with his WWII work at Bell Labs, Claude Shannon proved that the one-time pad cipher is unbreakable, provided the key material is truly random, never reused, kept secret from all possible attackers, and of equal or greater length than the message. Most ciphers, apart from the one-time pad, can be broken with enough computational effort by brute force attack, but the amount of effort needed may be exponentially
dependent on the key size, as compared to the effort needed to make use
of the cipher. In such cases, effective security could be achieved if
it is proven that the effort required (i.e., "work factor", in Shannon's
terms) is beyond the ability of any adversary. This means it must be
shown that no efficient method (as opposed to the time-consuming brute
force method) can be found to break the cipher. Since no such proof has
been found to date, the one-time-pad remains the only theoretically
unbreakable cipher. Although well-implemented one-time-pad encryption
cannot be broken, traffic analysis is still possible.
There are a wide variety of cryptanalytic attacks, and they can
be classified in any of several ways. A common distinction turns on what
Eve (an attacker) knows and what capabilities are available. In a ciphertext-only attack,
Eve has access only to the ciphertext (good modern cryptosystems are
usually effectively immune to ciphertext-only attacks). In a known-plaintext attack, Eve has access to a ciphertext and its corresponding plaintext (or to many such pairs). In a chosen-plaintext attack, Eve may choose a plaintext and learn its corresponding ciphertext (perhaps many times); an example is gardening, used by the British during WWII. In a chosen-ciphertext attack, Eve may be able to choose ciphertexts and learn their corresponding plaintexts. Finally in a man-in-the-middle
attack Eve gets in between Alice (the sender) and Bob (the recipient),
accesses and modifies the traffic and then forwards it to the recipient. Also important, often overwhelmingly so, are mistakes (generally in the design or use of one of the protocols involved).
Poznań monument (center)
to Polish cryptanalysts whose breaking of Germany's Enigma machine
ciphers, beginning in 1932, altered the course of World War II
Cryptanalysis of symmetric-key ciphers typically involves looking for attacks against the block ciphers or stream ciphers that are more efficient than any attack that could be mounted against a perfect cipher. For example, a simple brute force attack against DES requires one known plaintext and 2^55 decryptions, trying approximately half of the possible keys, to reach a point at which chances are better than even that the key sought will have been found. But this may not be enough assurance; a linear cryptanalysis attack against DES requires 2^43 known plaintexts (with their corresponding ciphertexts) and approximately 2^43 DES operations. This is a considerable improvement over brute force attacks.
Public-key algorithms are based on the computational difficulty
of various problems. The most famous of these are the difficulty of integer factorization of semiprimes and the difficulty of calculating discrete logarithms, both of which are not yet proven to be solvable in polynomial time (P) using only a classical Turing-complete computer. Much public-key cryptanalysis concerns designing algorithms in P that can solve these problems, or using other technologies, such as quantum computers. For instance, the best-known algorithms for solving the elliptic curve-based
version of discrete logarithm are much more time-consuming than the
best-known algorithms for factoring, at least for problems of more or
less equivalent size. Thus, to achieve an equivalent strength of
encryption, techniques that depend upon the difficulty of factoring
large composite numbers, such as the RSA cryptosystem, require larger
keys than elliptic curve techniques. For this reason, public-key
cryptosystems based on elliptic curves have become popular since their
invention in the mid-1990s.
While pure cryptanalysis uses weaknesses in the algorithms
themselves, other attacks on cryptosystems are based on actual use of
the algorithms in real devices, and are called side-channel attacks.
If a cryptanalyst has access to, for example, the amount of time the
device took to encrypt a number of plaintexts or report an error in a
password or PIN character, he may be able to use a timing attack
to break a cipher that is otherwise resistant to analysis. An attacker
might also study the pattern and length of messages to derive valuable
information; this is known as traffic analysis
and can be quite useful to an alert adversary. Poor administration of a
cryptosystem, such as permitting too short keys, will make any system
vulnerable, regardless of other virtues. Social engineering and other attacks against humans (e.g., bribery, extortion, blackmail, espionage, torture, ...) are often employed because they are far more cost-effective, and can succeed in a reasonable amount of time, compared with pure cryptanalysis.
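As a small illustration of the timing attack mentioned above: a naive equality check on a secret returns as soon as the first differing byte is found, so response times can leak how much of a guess is correct; a constant-time comparison (here via Python's standard hmac.compare_digest) closes that particular leak. The token values and function names are hypothetical.

    import hmac

    def check_token_naive(supplied, expected):
        return supplied == expected          # early-exit comparison; timing can leak matches

    def check_token_constant_time(supplied, expected):
        return hmac.compare_digest(supplied, expected)   # time independent of where strings differ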
Cryptographic primitives
Much of the theoretical work in cryptography concerns cryptographic primitives—algorithms
with basic cryptographic properties—and their relationship to other
cryptographic problems. More complicated cryptographic tools are then
built from these basic primitives. These primitives provide fundamental
properties, which are used to develop more complex tools called cryptosystems or cryptographic protocols, which guarantee one or more high-level security properties. Note, however, that the distinction between cryptographic primitives
and cryptosystems is quite arbitrary; for example, the RSA algorithm
is sometimes considered a cryptosystem, and sometimes a primitive.
Typical examples of cryptographic primitives include pseudorandom functions, one-way functions, etc.
One or more cryptographic primitives are often used to develop a more complex algorithm, called a cryptographic system, or cryptosystem. Cryptosystems (e.g., El-Gamal encryption)
are designed to provide particular functionality (e.g., public key
encryption) while guaranteeing certain security properties (e.g., chosen-plaintext attack (CPA) security in the random oracle model).
Cryptosystems use the properties of the underlying cryptographic
primitives to support the system's security properties. As the
distinction between primitives and cryptosystems is somewhat arbitrary, a
sophisticated cryptosystem can be derived from a combination of several
more primitive cryptosystems. In many cases, the cryptosystem's
structure involves back and forth communication among two or more
parties in space (e.g., between the sender of a secure message and its
receiver) or across time (e.g., cryptographically protected backup data). Such cryptosystems are sometimes called cryptographic protocols.
Lightweight cryptography (LWC) concerns cryptographic algorithms developed for strictly constrained environments. The growth of the Internet of Things (IoT) has spurred research into lightweight algorithms better suited to such environments, which impose tight limits on power consumption and processing power while still demanding security. Algorithms such as PRESENT, AES, and SPECK are examples of the many LWC algorithms that have been developed to meet the standards set by the National Institute of Standards and Technology.
Cryptography is widely used on the internet to help protect user data and prevent eavesdropping. To ensure secrecy during transmission, many systems use private-key (symmetric) cryptography to protect transmitted information. With public-key systems, one can maintain secrecy without exchanging a master key or a large number of keys. Some tools, such as BitLocker and VeraCrypt, are not based on public-private key cryptography: VeraCrypt, for example, uses a password hash to derive the single symmetric key, though it can also be configured to work with public-private key pairs. The open-source encryption library OpenSSL provides widely used free encryption software and tools. The most commonly used encryption cipher is AES, which benefits from hardware acceleration on x86-based processors that support the AES-NI instruction set. A close contender is ChaCha20-Poly1305, a stream cipher construction commonly used on mobile devices, since most are ARM-based and lack the AES-NI extension.
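For example, the ChaCha20-Poly1305 construction mentioned above is available as an authenticated cipher in the pyca/cryptography package (an assumed dependency; the plaintext is a placeholder):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()
    nonce = os.urandom(12)                    # must never repeat for the same key

    ct = ChaCha20Poly1305(key).encrypt(nonce, b"user data in transit", None)
    pt = ChaCha20Poly1305(key).decrypt(nonce, ct, None)
    assert pt == b"user data in transit"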
Cybersecurity
Cryptography can be used to secure communications by encrypting them. Websites use encryption via HTTPS. "End-to-end" encryption, where only sender and receiver can read messages, is implemented for email in Pretty Good Privacy and for secure messaging in general in WhatsApp, Signal and Telegram.
Operating systems use encryption to keep passwords secret,
conceal parts of the system, and ensure that software updates are truly
from the system maker.
Instead of storing plaintext passwords, computer systems store hashes
thereof; then, when a user logs in, the system passes the given password
through a cryptographic hash function
and compares it to the hashed value on file. In this manner, neither
the system nor an attacker has at any point access to the password in
plaintext.
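A minimal sketch of that practice, using the salted, deliberately slow PBKDF2 function from Python's standard hashlib (the iteration count and password are illustrative; dedicated schemes such as scrypt or Argon2 are common alternatives):

    import hashlib, hmac, os

    def hash_password(password, salt=None):
        salt = salt or os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return salt, digest                    # store both; the password itself is discarded

    def verify_password(password, salt, stored):
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return hmac.compare_digest(candidate, stored)

    salt, stored = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, stored)
    assert not verify_password("wrong guess", salt, stored)

The per-user salt prevents precomputed (rainbow-table) attacks, and the high iteration count slows down offline guessing if the stored hashes are ever stolen.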
Encryption is sometimes applied to an entire drive. For example, University College London has implemented BitLocker (a program by Microsoft) to render drive data unreadable unless the user logs in.
Cryptography has long been of interest to intelligence gathering and law enforcement agencies. Secret communications may be criminal or even treasonous. Because of its facilitation of privacy,
and the diminution of privacy attendant on its prohibition,
cryptography is also of considerable interest to civil rights
supporters. Accordingly, there has been a history of controversial legal
issues surrounding cryptography, especially since the advent of
inexpensive computers has made widespread access to high-quality
cryptography possible.
In some countries, even the domestic use of cryptography is, or has been, restricted. Until 1999, France significantly restricted the use of cryptography domestically, though it has since relaxed many of these rules. In China and Iran, a license is still required to use cryptography. Many countries have tight restrictions on the use of cryptography. Among the more restrictive are laws in Belarus, Kazakhstan, Mongolia, Pakistan, Singapore, Tunisia, and Vietnam.
In the United States, cryptography is legal for domestic use, but there has been much conflict over legal issues related to cryptography. One particularly important issue has been the export of cryptography and cryptographic software and hardware. Probably because of the importance of cryptanalysis in World War II
and an expectation that cryptography would continue to be important for
national security, many Western governments have, at some point,
strictly regulated export of cryptography. After World War II, it was
illegal in the US to sell or distribute encryption technology overseas;
in fact, encryption was designated as auxiliary military equipment and
put on the United States Munitions List. Until the development of the personal computer, asymmetric key algorithms (i.e., public key techniques), and the Internet,
this was not especially problematic. However, as the Internet grew and
computers became more widely available, high-quality encryption
techniques became well known around the globe.
In the 1990s, there were several challenges to US export regulation of cryptography. After the source code for Philip Zimmermann's Pretty Good Privacy (PGP) encryption program found its way onto the Internet in June 1991, a complaint by RSA Security
(then called RSA Data Security, Inc.) resulted in a lengthy criminal
investigation of Zimmermann by the US Customs Service and the FBI, though no charges were ever filed. Daniel J. Bernstein, then a graduate student at UC Berkeley, brought a lawsuit against the US government challenging some aspects of the restrictions based on free speech grounds. The 1995 case Bernstein v. United States ultimately resulted in a 1999 decision that printed source code for cryptographic algorithms and systems was protected as free speech by the United States Constitution.
In 1996, thirty-nine countries signed the Wassenaar Arrangement,
an arms control treaty that deals with the export of arms and
"dual-use" technologies such as cryptography. The treaty stipulated that
the use of cryptography with short key-lengths (56-bit for symmetric
encryption, 512-bit for RSA) would no longer be export-controlled. Cryptography exports from the US became less strictly regulated as a consequence of a major relaxation in 2000; there are no longer very many restrictions on key sizes in US-exported mass-market software. Since this relaxation in US export restrictions, and because most personal computers connected to the Internet include US-sourced web browsers such as Firefox or Internet Explorer, almost every Internet user worldwide has potential access to quality cryptography via their browsers (e.g., via Transport Layer Security). The Mozilla Thunderbird and Microsoft Outlook e-mail client programs similarly can transmit and receive emails via TLS, and can send and receive email encrypted with S/MIME. Many Internet users don't realize that their basic application software contains such extensive cryptosystems.
These browsers and email programs are so ubiquitous that even
governments whose intent is to regulate civilian use of cryptography
generally don't find it practical to do much to control distribution or
use of cryptography of this quality, so even when such laws are in
force, actual enforcement is often effectively impossible.
Another contentious issue connected to cryptography in the United States is the influence of the National Security Agency on cipher development and policy. The NSA was involved with the design of DES during its development at IBM and its consideration by the National Bureau of Standards as a possible Federal Standard for cryptography. DES was designed to be resistant to differential cryptanalysis,
a powerful and general cryptanalytic technique known to the NSA and
IBM, that became publicly known only when it was rediscovered in the
late 1980s. According to Steven Levy, IBM discovered differential cryptanalysis,
but kept the technique secret at the NSA's request. The technique
became publicly known only when Biham and Shamir re-discovered and
announced it some years later. The entire affair illustrates the
difficulty of determining what resources and knowledge an attacker might
actually have.
Another instance of the NSA's involvement was the 1993 Clipper chip affair, an encryption microchip intended to be part of the Capstone cryptography-control initiative. Clipper was widely criticized by cryptographers for two reasons. The cipher algorithm (called Skipjack)
was then classified (declassified in 1998, long after the Clipper
initiative lapsed). The classified cipher caused concerns that the NSA
had deliberately made the cipher weak in order to assist its
intelligence efforts. The whole initiative was also criticized based on
its violation of Kerckhoffs's Principle, as the scheme included a special escrow key held by the government for use by law enforcement (i.e. wiretapping).
Cryptography is central to digital rights management (DRM), a group of techniques for technologically controlling use of copyrighted material, being widely implemented and deployed at the behest of some copyright holders. In 1998, U.S. President Bill Clinton signed the Digital Millennium Copyright Act
(DMCA), which criminalized all production, dissemination, and use of
certain cryptanalytic techniques and technology (now known or later
discovered); specifically, those that could be used to circumvent DRM
technological schemes.
This had a noticeable impact on the cryptography research community
since an argument can be made that any cryptanalytic research violated
the DMCA. Similar statutes have since been enacted in several countries
and regions, including the implementation in the EU Copyright Directive. Similar restrictions are called for by treaties signed by World Intellectual Property Organization member-states.
The United States Department of Justice and FBI have not enforced the DMCA as rigorously as had been feared by some, but the law, nonetheless, remains a controversial one. Niels Ferguson, a well-respected cryptography researcher, has publicly stated that he will not release some of his research into an Intel security design for fear of prosecution under the DMCA. Cryptologist Bruce Schneier has argued that the DMCA encourages vendor lock-in, while inhibiting actual measures toward cyber-security. Both Alan Cox (longtime Linux kernel developer) and Edward Felten (and some of his students at Princeton) have encountered problems related to the Act. Dmitry Sklyarov
was arrested during a visit to the US from Russia, and jailed for five
months pending trial for alleged violations of the DMCA arising from
work he had done in Russia, where the work was legal. In 2007, the
cryptographic keys responsible for Blu-ray and HD DVD content scrambling were discovered and released onto the Internet. In both cases, the Motion Picture Association of America sent out numerous DMCA takedown notices, and there was a massive Internet backlash triggered by the perceived impact of such notices on fair use and free speech.
In the United Kingdom, the Regulation of Investigatory Powers Act
gives UK police the powers to force suspects to decrypt files or hand
over passwords that protect encryption keys. Failure to comply is an
offense in its own right, punishable on conviction by a two-year jail
sentence or up to five years in cases involving national security. Successful prosecutions have occurred under the Act; the first, in 2009, resulted in a term of 13 months' imprisonment.
Similar forced disclosure laws in Australia, Finland, France, and India
compel individual suspects under investigation to hand over encryption
keys or passwords during a criminal investigation.
In the United States, the federal criminal case of United States v. Fricosu addressed whether a search warrant can compel a person to reveal an encryption passphrase or password. The Electronic Frontier Foundation (EFF) argued that this is a violation of the protection from self-incrimination given by the Fifth Amendment. In 2012, the court ruled that under the All Writs Act, the defendant was required to produce an unencrypted hard drive for the court.
In many jurisdictions, the legal status of forced disclosure remains unclear.
The 2016 FBI–Apple encryption dispute
concerns the ability of courts in the United States to compel
manufacturers' assistance in unlocking cell phones whose contents are
cryptographically protected.
As a potential counter-measure to forced disclosure some cryptographic software supports plausible deniability, where the encrypted data is indistinguishable from unused random data (for example such as that of a drive which has been securely wiped).