The just war theory (Latin: bellum iustum) is a doctrine, also referred to as a tradition, of military ethics that aims to ensure that a war is morally justifiable through a series of criteria, all of which must be met for a war to be considered just. It has been studied by military leaders, theologians, ethicists and policymakers. The criteria are split into two groups: jus ad bellum ("right to go to war") and jus in bello
("right conduct in war"). The first group of criteria concerns the
morality of going to war, and the second group of criteria concerns the
moral conduct within war. There have been calls for the inclusion of a third category of just war theory (jus post bellum)
dealing with the morality of post-war settlement and reconstruction.
The just war theory postulates the belief that war, while terrible (but less so with the right conduct), is not always the worst option. Important responsibilities, undesirable outcomes, or preventable atrocities may justify war.
Opponents of the just war theory may either be inclined to a stricter pacifist
standard (proposing that there has never been nor can there ever be a
justifiable basis for war) or they may be inclined toward a more
permissive nationalist
standard (proposing that a war need only to serve a nation's interests
to be justifiable). In many cases, philosophers state that individuals
do not need to be plagued by a guilty conscience if they are required to
fight. Some philosophers extol the virtues of the soldier while also voicing apprehension about war itself. A few, such as Rousseau, argue for insurrection against oppressive rule.
The historical aspect, or the "just war tradition", deals with
the historical body of rules or agreements that have applied in various
wars across the ages. The just war tradition also considers the writings
of various philosophers and lawyers through history, and examines both
their philosophical visions of war's ethical limits and whether their
thoughts have contributed to the body of conventions that have evolved
to guide war and warfare.
In the twenty-first century there has been significant debate
between traditional just war theorists, who largely support the existing
law of war and develop arguments to support it, and revisionists who reject many traditional assumptions, although not necessarily advocating a change in the law.
Origins
Ancient Egypt
A 2017 study found that the just war tradition can be traced as far back as ancient Egypt. Egyptian ethics of war centered on three main ideas: the cosmological role of Egypt, the pharaoh as a divine office and executor of the will of the gods, and the superiority of the Egyptian state and population over all other states and peoples.
Egyptian political theology held that the pharaoh had the exclusive
legitimacy in justly initiating a war, usually claimed to carry out the
will of the gods. Senusret I, in the Twelfth Dynasty,
claimed, "I was nursed to be a conqueror...his [Atum's] son and his
protector, he gave me to conquer what he conquered." Later pharaohs also
considered their sonship of the god Amun-Re as granting them absolute
ability to declare war on the deity's behalf. Pharaohs often visited
temples prior to initiating campaigns, where the pharaoh was believed to
receive their commands of war from the deities. For example, Kamose
claimed that "I went north because I was strong (enough) to attack the
Asiatics through the command of Amon, the just of counsels." A stele erected by Thutmose III at the Temple of Amun at Karnak "provides an unequivocal statement of the pharaoh's divine mandate to wage war on his enemies." As the period of the New Kingdom
progressed and Egypt's territorial ambitions grew, so did the invocation of just war to justify those efforts. The universal principle of Maat, signifying order and justice, was central to the Egyptian notion of just war; it placed virtually no limits on what Egypt could take, do, or use to further the ambitions of the state.
India
The Indian Hindu epic, the Mahabharata, offers the first written discussions of a "just war" (dharma-yuddha or "righteous war"). In it, one of five ruling brothers (Pandavas)
asks if the suffering caused by war can ever be justified. A long
discussion then ensues between the siblings, establishing criteria like proportionality (chariots cannot attack cavalry, only other chariots; no attacking people in distress), just means (no poisoned or barbed arrows), just cause (no attacking out of rage), and fair treatment of captives and the wounded.
In Sikhism, the term dharamyudh
describes a war that is fought for just, righteous or religious
reasons, especially in defence of one's own beliefs. Though some core
tenets in the Sikh religion are understood to emphasise peace and
nonviolence, especially before the 1606 execution of Guru Arjan by Mughal emperor Jahangir, military force may be justified if all peaceful means to settle a conflict have been exhausted, thus resulting in a dharamyudh.
East Asia
Chinese philosophy produced a massive body of work on warfare, much of it during the Zhou dynasty, especially the Warring States era.
War was justified only as a last resort and only by the rightful
sovereign; however, questioning the decision of the emperor concerning
the necessity of a military action was not permissible. The success of a
military campaign was sufficient proof that the campaign had been
righteous.
Japan did not develop its own doctrine of just war but between
the 5th and the 7th centuries drew heavily from Chinese philosophy, and
especially Confucian views. As part of the Japanese campaign to take the
northeastern part of the island of Honshu, Japanese military action was portrayed as an effort to "pacify" the Emishi people, who were likened to "bandits" and "wild-hearted wolf cubs" and accused of invading Japan's frontier lands.
Ancient Greece and Rome
The notion of just war in Europe originated in ancient Greece and was then developed in the Roman Empire.
Aristotle first introduced the concept and terminology to the Hellenic world, calling war a last resort that required conduct allowing the
restoration of peace. Aristotle argues that the cultivation of a
military is necessary and good for the purpose of self-defense, not for
conquering: "The proper object of practising military training is not in
order that men may enslave those who do not deserve slavery, but in
order that first they may themselves avoid becoming enslaved to others" (Politics, Book 7).
In ancient Rome,
a "just cause" for war might include the necessity of repelling an
invasion, or retaliation for pillaging or a breach of treaty. War was always potentially nefas ("wrong, forbidden"), and risked religious pollution and divine disfavor. A "just war" (bellum iustum) thus required a ritualized declaration by the fetial priests. More broadly, conventions of war and treaty-making were part of the ius gentium, the "law of nations", the customary moral obligations regarded as innate and universal to human beings.
Christian views
Christian theory of the Just War begins around the time of Augustine of Hippo.
The Just War theory, with some amendments, is still used by
Christians today as a guide to whether or not a war can be justified.
War may be necessary and right, even though it may not be good. In the
case of a country that has been invaded by an occupying force, war may
be the only way to restore justice.
Saint Augustine
Saint Augustine
held that individuals should not resort immediately to violence, but
God has given the sword to government for a good reason (based upon
Romans 13:4). In Contra Faustum Manichaeum book 22 sections
69–76, Augustine argues that Christians, as part of a government, need
not be ashamed of protecting peace and punishing wickedness when they
are forced to do so by a government. Augustine asserted that this was a
personal and philosophical stance: "What is here required is not a
bodily action, but an inward disposition. The sacred seat of virtue is
the heart."
Nonetheless, he asserted, peacefulness in the face of a grave wrong that
could be stopped only by violence would be a sin. Defense of one's self
or others could be a necessity, especially when it is authorized by a
legitimate authority:
They who have waged war in
obedience to the divine command, or in conformity with His laws, have
represented in their persons the public justice or the wisdom of
government, and in this capacity have put to death wicked men; such
persons have by no means violated the commandment, "Thou shalt not
kill."
While
not breaking down the conditions necessary for war to be just,
Augustine nonetheless originated the very phrase itself in his work The City of God:
But, say they, the wise man will wage Just Wars. As
if he would not all the rather lament the necessity of just wars, if he
remembers that he is a man; for if they were not just he would not wage
them, and would therefore be delivered from all wars.
Augustine further taught:
No war is undertaken by a good state except on behalf of good faith or for safety.
J. Mark Mattox writes,
In terms of the traditional
notion of jus ad bellum (justice of war, that is, the circumstances in
which wars can be justly fought), war is a coping mechanism for
righteous sovereigns who would ensure that their violent international
encounters are minimal, a reflection of the Divine Will to the greatest
extent possible, and always justified. In terms of the traditional
notion of jus in bello (justice in war, or the moral considerations
which ought to constrain the use of violence in war), war is a coping
mechanism for righteous combatants who, by divine edict, have no choice
but to subject themselves to their political masters and seek to ensure
that they execute their war-fighting duty as justly as possible.
The medieval Peace of God (Latin: pax dei)
was a 10th century mass movement in Western Europe instigated by the
clergy that granted immunity from violence for non-combatants.
Starting in the 11th century, the Truce of God (Latin: treuga dei) involved Church rules that successfully limited when and where fighting could occur: Catholic forces (e.g. of warring barons) could not fight each other on Sundays, Thursdays, holidays, or during the entirety of Lent and Advent, among other times, severely disrupting the conduct of wars. The 1179 Third Council of the Lateran adopted a version of it for the whole church.
The just war theory of Thomas Aquinas has had a lasting impact on later generations of thinkers and was part of an emerging consensus in medieval Europe on just war. In the 13th century Aquinas reflected in detail on peace and war. A Dominican friar, Aquinas contemplated the teachings of the Bible on peace and war in combination with ideas from Aristotle, Plato, Socrates, Saint Augustine and other philosophers whose writings are part of the Western canon. Aquinas' views on war drew heavily on the Decretum Gratiani, a book the Italian monk Gratian had compiled with passages from the Bible. After its publication in the 12th century, the Decretum Gratiani had been republished with commentary from Pope Innocent IV and the Dominican friar Raymond of Penafort. Other significant influences on Aquinas' just war theory were Alexander of Hales and Henry of Segusio.
In Summa Theologica Aquinas asserted that it is not always a sin
to wage war, and he set out criteria for a just war. According to
Aquinas, three requirements must be met. Firstly, the war must be waged
upon the command of a rightful sovereign.
Secondly, the war needs to be waged for just cause, on account of some
wrong the attacked have committed. Thirdly, warriors must have the right
intent, namely to promote good and to avoid evil.
Aquinas came to the conclusion that a just war could be offensive and
that injustice should not be tolerated merely to avoid war. Nevertheless,
Aquinas argued that violence must only be used as a last resort. On the
battlefield,
violence was only justified to the extent it was necessary. Soldiers
needed to avoid cruelty and a just war was limited by the conduct of
just combatants. Aquinas argued that it was only in the pursuit of
justice that the good intention of a moral act could justify negative
consequences, including the killing of the innocent during a war.
John Colet
famously preached a Lenten sermon before Henry VIII, who was preparing
for a war, quoting Cicero: "Better an unjust peace rather than the
justest war."
Erasmus of Rotterdam wrote numerous works on peace which criticized Just War theory as a smokescreen and added extra limitations, notably The Complaint of Peace and the Treatise on War (Dulce bellum inexpertis).
A leading humanist writer after the Reformation was legal theorist Hugo Grotius, whose De jure belli ac pacis reconsidered Just War and fighting wars justly.
First World War
At the beginning of the First World War,
a group of theologians in Germany published a manifesto that sought to
justify the actions of the German government. At the British
government's request, Randall Davidson, Archbishop of Canterbury,
took the lead in collaborating with a large number of other religious
leaders, including some with whom he had differed in the past, to write a
rebuttal of the Germans' contentions. Both German and British
theologians based themselves on the just war theory, each group seeking
to prove that it applied to the war waged by its own side.
Contemporary Catholic doctrine
The just war doctrine of the Catholic Church found in the 1992 Catechism of the Catholic Church, in paragraph 2309, lists four strict conditions for "legitimate defense by military force:"
The damage inflicted by the aggressor on the nation or community of nations must be lasting, grave and certain.
All other means of putting an end to it must have been shown to be impractical or ineffective.
There must be serious prospects of success.
The use of arms must not produce evils and disorders graver than the evil to be eliminated.
If this responsibility justifies
the possession of sufficient means to exercise this right to defense,
States still have the obligation to do everything possible "to ensure
that the conditions of peace exist, not only within their own territory
but throughout the world". It is important to remember that "it is one
thing to wage a war of self-defense; it is quite another to seek to
impose domination on another nation. The possession of war potential
does not justify the use of force for political or military objectives.
Nor does the mere fact that war has unfortunately broken out mean that
all is fair between the warring parties".
The Charter of the United Nations ... is based on a
generalized prohibition of a recourse to force to resolve disputes
between States, with the exception of two cases: legitimate defence and
measures taken by the Security Council within the area of its
responsibilities for maintaining peace. In every case, exercising the right to self-defence must respect "the traditional limits of necessity and proportionality".
Therefore, engaging in a preventive war without clear proof that
an attack is imminent cannot fail to raise serious moral and juridical
questions. International legitimacy for the use of armed force, on
the basis of rigorous assessment and with well-founded motivations, can
only be given by the decision of a competent body that identifies
specific situations as threats to peace and authorizes an intrusion into
the sphere of autonomy usually reserved to a State.
Pope John Paul II in an address to a group of soldiers said the following:
Peace,
as taught by Sacred Scripture and the experience of men itself, is more
than just the absence of war. And the Christian is aware that on earth a
human society that is completely and always peaceful is, unfortunately,
a utopia and that the ideologies which present it as easily attainable
only nourish vain hopes. The cause of peace will not go forward by
denying the possibility and the obligation to defend it.
Russian Orthodox Church
The War and Peace section in the Basis of the Social Concept of the Russian Orthodox Church is crucial for understanding the Russian Orthodox Church's
attitude towards war. The document offers criteria for distinguishing
between an aggressive war, which is unacceptable, and a justified war,
and it attributes the highest moral and sacred value to acts of military
bravery performed by a true believer who participates in a justified war.
Additionally, the document considers the just war criteria as developed
in Western Christianity to be applicable to Russian Orthodoxy; the
justified war theory of Western theology therefore also applies to the
Russian Orthodox Church.
In the same document, it is stated that wars have accompanied human history since the fall of man, and according to the gospel,
they will continue to accompany it. While recognizing war as evil, the
Russian Orthodox Church does not prohibit its members from participating
in hostilities if the security of their neighbours or the restoration of
trampled justice is at stake. War is considered to be
necessary but undesirable. It is also stated that the Russian Orthodox
Church has had profound respect for soldiers who gave their lives to
protect the life and security of their neighbours.
Just war tradition
The just war theory, propounded by the medieval Christian philosopher Thomas Aquinas, was developed further by legal scholars in the context of international law. Cardinal Cajetan, the jurist Francisco de Vitoria, the two Jesuit priests Luis de Molina and Francisco Suárez, as well as the humanist Hugo Grotius and the lawyer Luigi Taparelli were most influential in the formation of a just war tradition. The just war tradition, which was well established by the 19th century, found its practical application in the Hague Peace Conferences (1899 and 1907) and in the founding of the League of Nations in 1920. After the United States Congress declared war on Germany in 1917, Cardinal James Gibbons issued a letter stating that all Catholics were to support the war
because "Our Lord Jesus Christ does not stand for peace at any price...
If by Pacifism is meant the teaching that the use of force is never
justifiable, then, however well meant, it is mistaken, and it is hurtful
to the life of our country."
Just-war theorists combine a moral abhorrence towards war with a
readiness to accept that war may sometimes be necessary. The criteria of
the just-war tradition act as an aid in determining whether resorting
to arms is morally permissible. Just-war theories aim "to distinguish
between justifiable and unjustifiable uses of organized armed forces";
they attempt "to conceive of how the use of arms might be restrained,
made more humane, and ultimately directed towards the aim of
establishing lasting peace and justice".
The just war tradition addresses the morality of the use of force
in two parts: when it is right to resort to armed force (the concern of
jus ad bellum) and what is acceptable in using such force (the concern of jus in bello).
In 1869 the Russian military theorist Genrikh Antonovich Leer theorized on the advantages and potential benefits of war.
The Soviet leader Vladimir Lenin defined only three types of just war.
But picture to yourselves a slave-owner who owned 100
slaves warring against a slave-owner who owned 200 slaves for a more
"just" distribution of slaves. Clearly, the application of the term
"defensive" war, or war "for the defense of the fatherland" in such a
case would be historically false, and in practice would be sheer
deception of the common people, of philistines, of ignorant people, by
the astute slaveowners. Precisely in this way are the present-day
imperialist bourgeoisie deceiving the peoples by means of "national
ideology" and the term "defense of the fatherland" in the present war
between slave-owners for fortifying and strengthening slavery.
The anarcho-capitalist scholar Murray Rothbard (1926-1995) stated that "a just
war exists when a people tries to ward off the threat of coercive
domination by another people, or to overthrow an already-existing
domination. A war is unjust, on the other hand, when a people try
to impose domination on another people or try to retain an
already-existing coercive rule over them."
The consensus among Christians on the use of violence has
changed radically since the crusades were fought. The just war theory
prevailing for most of the last two centuries—that violence is an evil
that can, in certain situations, be condoned as the lesser of evils—is
relatively young. Although it has inherited some elements (the criteria
of legitimate authority, just cause, right intention) from the older war
theory that first evolved around AD 400, it has rejected two premises
that underpinned all medieval just wars, including crusades: first, that
violence could be employed on behalf of Christ's intentions for mankind
and could even be directly authorized by him; and second, that it was a
morally neutral force that drew whatever ethical coloring it had from
the intentions of the perpetrators.
Criteria
The just war theory has two sets of criteria, the first establishing jus ad bellum (the right to go to war), and the second establishing jus in bello (right conduct within war).
Competent authority
Only duly constituted public authorities may wage war. "A just war
must be initiated by a political authority within a political system
that allows distinctions of justice. Dictatorships (e.g. Hitler's
regime) or deceptive military actions (e.g. the 1968 US bombing of Cambodia)
are typically considered as violations of this criterion. The
importance of this condition is key. Plainly, we cannot have a genuine
process of judging a just war within a system that represses the process
of genuine justice. A just war must be initiated by a political
authority within a political system that allows distinctions of
justice".
Probability of success
According to this principle, there must be good grounds for concluding that aims of the just war are achievable. This principle emphasizes that mass violence must not be undertaken if it is unlikely to secure the just cause.
This criterion exists to avoid invasion for invasion's sake and links to
the proportionality criterion. One cannot invade if there is no chance
of actually winning. However, wars are fought with imperfect knowledge,
so one must simply be able to make a logical case that one can win;
there is no way to know the outcome in advance. These criteria move the
conversation from moral and theoretical grounds to practical grounds. Essentially, this criterion is also meant to aid coalition building and to win the approval of other state actors.
Last resort
The principle of last resort stipulates that all non-violent options
must first be exhausted before the use of force can be justified.
Diplomatic options, sanctions, and other non-military methods must be
attempted or validly ruled out before the engagement of hostilities.
Further, in regard to the amount of harm—proportionally—the principle of
last resort would support using small intervention forces first and
then escalating rather than starting a war with massive force such as carpet bombing or nuclear warfare.
Just cause
The reason for going to war needs to be just and cannot, therefore,
be solely for recapturing things taken or punishing people who have done
wrong; innocent life must be in imminent danger and intervention must
be to protect life. A contemporary view of just cause was expressed in
1993 when the US Catholic Conference said: "Force may be used only to
correct a grave, public evil, i.e., aggression or massive violation of
the basic human rights of whole populations."
Jus in bello
Once war has begun, just war theory (jus in bello) also directs how combatants must act:
Just war conduct should be governed by the principle of distinction.
The acts of war should be directed towards enemy combatants, and not
towards non-combatants
caught in circumstances that they did not create. The prohibited acts
include bombing civilian residential areas that include no legitimate military targets, committing acts of terrorism or reprisal against civilians or prisoners of war (POWs), and attacking neutral
targets. Moreover, combatants are not permitted to attack enemy
combatants who have surrendered, or who have been captured, or who are
injured and not presenting an immediate lethal threat, or who are parachuting from disabled aircraft and are not airborne forces, or who are shipwrecked.
Just war conduct should be governed by the principle of
proportionality. Combatants must make sure that the harm caused to
civilians or civilian property is not excessive in relation to the
concrete and direct military advantage anticipated by an attack on a legitimate military objective.
This principle is meant to discern the correct balance between the
restriction imposed by a corrective measure and the severity of the
nature of the prohibited act.
Just war conduct should be governed by the principle of military
necessity. An attack or action must be intended to help in the defeat of
the enemy; it must be an attack on a legitimate military objective,
and the harm caused to civilians or civilian property must be
proportional and not excessive in relation to the concrete and direct
military advantage anticipated. This principle is meant to limit
excessive and unnecessary death and destruction.
Combatants may not use weapons or other methods of warfare that are considered evil, such as mass rape, forcing enemy combatants to fight against their own side or using weapons whose effects cannot be controlled (e.g., nuclear/biological weapons).
Ending a war: Jus post bellum
In
recent years, some theorists, such as Gary Bass, Louis Iasiello and
Brian Orend, have proposed a third category within the just war theory. Jus post bellum
concerns justice after a war, including peace treaties, reconstruction,
environmental remediation, war crimes trials, and war reparations. It has also been extended to deal with the fact that some hostile actions may take place outside a traditional battlefield. Jus post bellum governs the justice of war termination and peace agreements, as well as the prosecution of war criminals and of publicly labelled terrorists. The idea has largely been invoked to help decide how prisoners taken during battle should be treated. In a modern context, governments and public opinion have used jus post bellum to justify the pursuit of labelled terrorists for the safety of the state: because the fault lies with the aggressor, the aggressor is held to have forfeited, by its own actions, the right to honourable treatment. The theory has thus been used to justify how those fighting a war treat prisoners outside the ordinary protections of war.
The moral equality of combatants (MEC) or moral equality of soldiers is the principle that soldiers fighting on both sides of a war are equally honorable, unless they commit war crimes, regardless of whether they fight for a just cause. MEC is a key element underpinning international humanitarian law (IHL)—which applies the rules of war equally to both sides—and traditional just war theory. According to philosopher Henrik Syse,
MEC presents a serious quandary because "it makes as little practical
sense to ascribe blame to individual soldiers for the cause of the war
in which they fight as it makes theoretical sense to hold the fighters
on the two sides to be fully morally equal". The moral equality of combatants has been cited in relation to the Israeli–Palestinian conflict or the U.S.-led wars in Iraq and Afghanistan.
Traditional view
MEC as a formal doctrine was articulated in Just and Unjust Wars (1977) by Michael Walzer, although earlier just war theorists such as Augustine and Aquinas
argued that soldiers should obey their leaders when fighting. There is
dispute over whether early modern just war theory promoted MEC. A full crystallization of MEC could only occur after both jus ad bellum and jus in bello were developed. Proponents of MEC argue that individual soldiers are not well-placed to determine the justness of a war.
Walzer, for example, argues that the entire responsibility for an
unjust war is borne by military and civilian leaders who choose to go to
war, rather than individual soldiers who have little say in the matter.
MEC is one of the underpinnings of international humanitarian law (IHL), which applies equally to both sides regardless of the justice of their cause. In IHL, this principle is known as equality of belligerents. This contradicts the legal principle of ex injuria jus non oritur that no one should be able to derive benefit from their illegal action. British jurist Hersch Lauterpacht
articulated the pragmatic basis of belligerent equality, stating: "it
is impossible to visualize the conduct of hostilities in which one side
would be bound by rules of warfare without benefiting from them and the
other side would benefit from them without being bound by them".
International law scholar Eliav Lieblich states that the moral
responsibility of soldiers who participate in unjust wars is "one of the
stickiest problems in the ethics of war".
Revisionist challenge
There is no equivalent to MEC in peacetime circumstances. In 2006, philosopher Jeff McMahan began to contest MEC, arguing that soldiers fighting an unjust or illegal war are not morally equal to those fighting in self-defense. Although revisionists do not favor criminal prosecution of individual soldiers who fight in an unjust war, they argue that individual soldiers should assess the legality or morality of the war they are asked to fight, and refuse to serve if it is illegal or unjust.
According to the revisionist view, a soldier or officer who knows or
strongly suspects that their side is fighting an unjust war has a moral
obligation not to fight it, unless this would entail capital punishment
or some other extreme consequence.
Opponents of MEC—sometimes grouped under the label of revisionist just war theory—nevertheless generally support the belligerent equality principle of IHL on pragmatic grounds. In his 2018 book The Crime of Aggression, Humanity, and the Soldier, law scholar Tom Dannenbaum
was one of the first to propose legal reforms based on rejection of
MEC. Dannenbaum argued that soldiers who refuse to fight illegal wars
should be allowed selective conscientious objection and be accepted as
refugees if they have to flee their country. He also argued that
soldiers fighting against a war of aggression should be recognized as
victims in postwar reparations processes.
Public opinion
A
2019 study found that the majority of Americans endorse the revisionist
view on MEC and many are even willing to allow a war crime against
noncombatants to go unpunished when committed by soldiers who are
fighting a just war. Responding to the study, Walzer cautioned that differently phrased questions might have led to different results.
Voltage fluctuations measured by the EEG bioamplifier and electrodes allow the evaluation of normal brain activity. As the electrical activity monitored by EEG originates in neurons in the underlying brain tissue, the recordings made by the electrodes on the surface of the scalp
vary in accordance with their orientation and distance to the source of
the activity. Furthermore, the value recorded is distorted by
intermediary tissues and bones, which act in a manner akin to resistors
and capacitors in an electrical circuit.
This means that not all neurons will contribute equally to an EEG
signal, with an EEG predominately reflecting the activity of cortical neurons near the electrodes on the scalp. Deep structures within the brain further away from the electrodes will not contribute directly to an EEG; these include the base of the corticalgyrus, mesial walls of the major lobes, hippocampus, thalamus, and brain stem.
A healthy human EEG will show certain patterns of activity that
correlate with how awake a person is. The range of frequencies one
observes is between 1 and 30 Hz, and amplitudes vary between 20
and 100 μV. The observed frequencies are subdivided into various groups:
alpha (8–13 Hz), beta (13–30 Hz), delta (0.5–4 Hz), and theta (4–7 Hz). Alpha waves
are observed when a person is in a state of relaxed wakefulness and are
mostly prominent over the parietal and occipital sites. During intense mental activity,
beta waves are more prominent in frontal areas as well as other
regions. If a relaxed person is told to open their eyes, one observes
alpha activity decreasing and an increase in beta activity. Theta and delta waves are not generally seen in wakefulness; if they are, it is a sign of brain dysfunction.
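The frequency bands above can be quantified from a recording by estimating its power spectrum. A minimal sketch in Python, using a synthetic signal standing in for real data (the 256 Hz sampling rate and signal amplitudes are illustrative assumptions):

```python
import numpy as np
from scipy.signal import welch

fs = 256                       # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic "relaxed wakefulness" trace: a dominant 10 Hz alpha rhythm,
# weaker 20 Hz beta activity, and background noise (amplitudes in uV).
eeg = 40 * np.sin(2 * np.pi * 10 * t) + 10 * np.sin(2 * np.pi * 20 * t)
eeg += 5 * rng.standard_normal(t.size)

# Welch's method estimates the power spectral density of the signal.
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
df = freqs[1] - freqs[0]

bands = {"delta": (0.5, 4), "theta": (4, 7), "alpha": (8, 13), "beta": (13, 30)}
power = {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
         for name, (lo, hi) in bands.items()}

dominant = max(power, key=power.get)   # alpha dominates this synthetic trace
```

Integrating the spectral density over each band gives a per-band power estimate; here the strong 10 Hz component makes alpha the dominant band, as expected for relaxed wakefulness.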
EEG can detect abnormal electrical discharges such as sharp waves, spikes, or spike-and-wave complexes, as observable in people with epilepsy; thus, it is often used to inform medical diagnosis. EEG can detect the onset and spatio-temporal (location and time) evolution of seizures and the presence of status epilepticus. It is also used to help diagnose sleep disorders, depth of anesthesia, coma, encephalopathies, cerebral hypoxia after cardiac arrest, and brain death. EEG used to be a first-line method of diagnosis for tumors, stroke, and other focal brain disorders, but this use has decreased with the advent of high-resolution anatomical imaging techniques such as magnetic resonance imaging (MRI) and computed tomography
(CT). Despite its limited spatial resolution, EEG continues to be a
valuable tool for research and diagnosis. It is one of the few mobile
techniques available and offers millisecond-range temporal resolution,
which is not possible with CT, PET, or MRI.
Derivatives of the EEG technique include evoked potentials (EP), which involves averaging the EEG activity time-locked to the presentation of a stimulus of some sort (visual, somatosensory, or auditory). Event-related potentials (ERPs) refer to averaged EEG responses that are time-locked to more complex processing of stimuli; this technique is used in cognitive science, cognitive psychology, and psychophysiological research.
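The averaging idea behind EP/ERP derivation can be sketched as follows. This is a hypothetical single-channel example with simulated data; the evoked waveform shape, amplitudes, and timings are illustrative assumptions:

```python
import numpy as np

fs = 250                                    # assumed sampling rate, Hz
rng = np.random.default_rng(0)

# Simulated continuous EEG: background noise plus a small 5 uV evoked
# response (300 ms long) inserted after each stimulus onset.
n_samples = 60 * fs
eeg = rng.normal(0, 10, n_samples)
stim_onsets = np.arange(fs, n_samples - fs, fs)   # one stimulus per second
evoked = 5 * np.hanning(int(0.3 * fs))
for onset in stim_onsets:
    eeg[onset:onset + evoked.size] += evoked

# ERP extraction: cut an epoch around each stimulus and average them.
pre, post = int(0.1 * fs), int(0.5 * fs)    # 100 ms before, 500 ms after
epochs = np.stack([eeg[o - pre:o + post] for o in stim_onsets])
erp = epochs.mean(axis=0)    # averaging cancels activity not time-locked
```

Because the background activity is not time-locked to the stimuli, it averages toward zero across epochs, while the evoked response survives; this is why a 5 μV deflection invisible in the raw trace emerges clearly in the average.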
Uses
Epilepsy
EEG is the gold standard diagnostic procedure to confirm epilepsy. The sensitivity
of a routine EEG to detect interictal epileptiform discharges at
epilepsy centers has been reported to be in the range of 29–55%.
Given the low to moderate sensitivity, a routine EEG (typically with a
duration of 20–30 minutes) can be normal in people who have epilepsy.
When an EEG shows interictal epileptiform discharges (e.g., sharp waves,
spikes, or spike-and-wave complexes), it is confirmatory of epilepsy in nearly all cases (high specificity, i.e., a low false-positive rate);
however, up to 3.5% of the general population may have epileptiform
abnormalities in an EEG without ever having had a seizure, or with a very low risk of developing epilepsy in the future.
When a routine EEG is normal and there is a high suspicion or
need to confirm epilepsy, it may be repeated or performed with a longer
duration in the epilepsy monitoring unit (EMU) or at home with an
ambulatory EEG. In addition, there are activating maneuvers such as
photic stimulation, hyperventilation and sleep deprivation that can
increase the diagnostic yield of the EEG.
Epilepsy Monitoring Unit (EMU)
At
times, a routine EEG is not sufficient to establish the diagnosis or
determine the best course of action in terms of treatment. In this case,
attempts may be made to record an EEG while a seizure is occurring. This is known as an ictal
recording, as opposed to an interictal recording, which refers to the
EEG recording between seizures. To obtain an ictal recording, a
prolonged EEG is typically performed accompanied by a time-synchronized
video and audio recording. This can be done either as an outpatient (at
home) or during a hospital admission, preferably to an Epilepsy
Monitoring Unit (EMU) with nurses and other personnel trained in the
care of patients with seizures. Outpatient ambulatory video EEGs
typically last one to three days. An admission to an Epilepsy Monitoring
Unit typically lasts several days but may last for a week or longer.
While in the hospital, seizure medications are usually withdrawn to
increase the odds that a seizure will occur during admission. For
reasons of safety, medications are not withdrawn during an EEG outside
of the hospital. Ambulatory video EEGs, therefore, have the advantage of
convenience and are less expensive than a hospital admission, but they
also have the disadvantage of a decreased probability of recording a
clinical event.
Epilepsy monitoring is often considered when patients continue
having events despite being on anti-seizure medications or if there is
concern that the patient's events have an alternate diagnosis, e.g., psychogenic non-epileptic seizures, syncope (fainting), sub-cortical movement disorders, migraine variants, stroke, etc. In cases of epileptic seizures, continuous EEG monitoring helps to characterize seizures
and localize/lateralize the region of the brain from which a seizure
originates. This can help identify appropriate non-medication treatment
options.
In clinical use, EEG traces are visually analyzed by neurologists to
look at various features. Increasingly, quantitative analysis of EEG is
being used in conjunction with visual analysis. Quantitative analysis
displays like power spectrum analysis, alpha-delta ratio, amplitude
integrated EEG, and spike detection can help quickly identify segments
of EEG that need close visual analysis or, in some cases, be used as
surrogates for quick identification of seizures in long-term recordings.
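As one concrete example, the alpha-delta ratio mentioned above can be computed per segment from the power spectrum. A hedged sketch with synthetic segments and an assumed 256 Hz sampling rate:

```python
import numpy as np
from scipy.signal import welch

def alpha_delta_ratio(segment, fs):
    """Ratio of alpha (8-13 Hz) to delta (0.5-4 Hz) band power."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), 2 * fs))
    alpha = psd[(freqs >= 8) & (freqs < 13)].sum()
    delta = psd[(freqs >= 0.5) & (freqs < 4)].sum()
    return alpha / delta

fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
noise = rng.standard_normal(t.size)

awake = 40 * np.sin(2 * np.pi * 10 * t) + noise    # alpha-dominated
slowed = 40 * np.sin(2 * np.pi * 2 * t) + noise    # delta-dominated
```

Tracking this ratio across successive segments of a long recording lets software flag epochs of slowing for closer visual review, rather than requiring a reader to inspect every page.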
Other brain disorders
An EEG might also be helpful in diagnosing or managing the following conditions, or in related clinical decisions:
Brain tumor
Brain damage from head injury
Brain dysfunction that can have a variety of causes (encephalopathy)
Serving as an adjunct test of brain death in comatose patients
Prognostication in comatose patients (in certain instances) or in
newborns with brain injury from various causes around the time of birth
Determining whether to wean anti-epileptic medications
Intensive Care Unit (ICU)
EEG can also be used in intensive care units
for brain function monitoring to monitor for non-convulsive
seizures/non-convulsive status epilepticus, to monitor the effect of
sedative/anesthesia in patients in medically induced coma (for treatment
of refractory seizures or increased intracranial pressure), and to monitor for secondary brain damage in conditions such as subarachnoid hemorrhage (currently a research method).
In cases where significant brain injury is suspected, e.g., after cardiac arrest, EEG can provide some prognostic information.
If a patient with epilepsy is being considered for resective surgery,
it is often necessary to localize the focus (source) of the epileptic
brain activity with a resolution greater than what is provided by scalp
EEG. In these cases, neurosurgeons typically implant strips and grids of
electrodes or penetrating depth electrodes under the dura mater, through either a craniotomy or a burr hole. The recording of these signals is referred to as electrocorticography
(ECoG), subdural EEG (sdEEG), intracranial EEG (icEEG), or stereotactic
EEG (sEEG). The signal recorded from ECoG is on a different scale of
activity than the brain activity recorded from scalp EEG. Low-voltage,
high-frequency components that cannot be seen easily (or at all) in
scalp EEG can be seen clearly in ECoG. Further, smaller electrodes
(which cover a smaller parcel of brain surface) allow for better spatial
resolution to narrow down the areas critical for seizure onset and
propagation. Some clinical sites record data from penetrating
microelectrodes.
Home ambulatory EEG
Sometimes
it is more convenient or clinically necessary to perform ambulatory EEG
recordings in the home of the patient. These studies typically have a
duration of 24–72 hours.
Research use
EEG and the related study of ERPs are used extensively in neuroscience, cognitive science, cognitive psychology, neurolinguistics, and psychophysiological research, as well as to study human functions such as swallowing.
EEG techniques used in research are not sufficiently standardised
for clinical use, and many ERP studies fail to report all of the
necessary processing steps for data collection and reduction,
limiting the reproducibility and replicability of many studies. Based
on a 2024 systematic literature review and meta-analysis commissioned by
the Patient-Centered Outcomes Research Institute (PCORI), EEG scans
cannot be used reliably to assist in making a clinical diagnosis of
ADHD. However, EEG continues to be used in research on mental disabilities, such as auditory processing disorder (APD), ADD, and ADHD.
Hardware costs are significantly lower than those of most other techniques.
EEG helps overcome the limited availability of technologists to provide immediate care in high-traffic hospitals.
EEG only requires a quiet room and briefcase-size equipment, whereas
fMRI, SPECT, PET, MRS, or MEG require bulky and immobile equipment. For
example, MEG requires equipment consisting of liquid helium-cooled detectors that can be used only in magnetically shielded rooms, altogether costing upwards of several million dollars; and fMRI requires the use of a 1-ton magnet in, again, a shielded room.
EEG can readily have a high temporal resolution (although
sub-millisecond resolution generates less meaningful data), because the
two to 32 data streams generated by that number of electrodes are easily
stored and processed, whereas 3D spatial technologies provide thousands
or millions of times as many input data streams, and are thus limited by
hardware and software. EEG is commonly recorded at sampling rates between 250 and 2,000 Hz in clinical and research settings.
EEG is relatively tolerant of subject movement, unlike most other
neuroimaging techniques. Methods even exist for minimizing, and in some
cases eliminating, movement artifacts in EEG data.
EEG is silent, which allows for better study of the responses to auditory stimuli.
EEG does not aggravate claustrophobia, unlike fMRI, PET, MRS, SPECT, and sometimes MEG
EEG does not involve exposure to high-intensity (>1 Tesla)
magnetic fields, as in some of the other techniques, especially MRI and
MRS. These can cause a variety of undesirable issues with the data, and
also prohibit use of these techniques with participants that have metal
implants in their body, such as metal-containing pacemakers
ERP studies can be conducted with relatively simple paradigms, compared with, e.g., block-design fMRI studies
Relatively non-invasive, in contrast to electrocorticography, which requires electrodes to be placed on the actual surface of the brain.
EEG also has some characteristics that compare favorably with behavioral testing:
EEG can detect covert processing (i.e., processing that does not require a response)
EEG can be used in subjects who are incapable of making a motor response
EEG is a method widely used in the study of sport performance, valued for its portability and lightweight design
Some ERP components can be detected even when the subject is not attending to the stimuli
Unlike other means of studying reaction time, ERPs can elucidate stages of processing (rather than just the result)
The simplicity of EEG readily provides for tracking of brain changes
during different phases of life. EEG sleep analysis can indicate
significant aspects of the timing of brain development, including
evaluating adolescent brain maturation.
The signal measured by EEG is better understood than those of
other research techniques, e.g., the BOLD response in fMRI.
Disadvantages
Low spatial resolution on the scalp. fMRI,
for example, can directly display areas of the brain that are active,
while EEG requires intense interpretation just to hypothesize what areas
are activated by a particular response.
Depending on the orientation and location of the dipole causing an
EEG change, there may be a false localization due to the inverse
problem.
EEG poorly measures neural activity that occurs below the upper layers of the brain (the cortex).
Unlike PET and MRS, EEG cannot identify specific locations in the brain at which various neurotransmitters, drugs, etc. can be found.
Often takes a long time to connect a subject to EEG, as it requires
precise placement of dozens of electrodes around the head and the use of
various gels, saline solutions, and/or pastes to maintain good
conductivity, and a cap is used to keep them in place. While the length
of time differs dependent on the specific EEG device used, as a general
rule it takes considerably less time to prepare a subject for MEG, fMRI,
MRS, and SPECT.
Signal-to-noise ratio is poor, so sophisticated data analysis and
relatively large numbers of subjects are needed to extract useful
information from EEG.
EEGs are not currently very compatible with individuals who have
coarser and/or textured hair. Even protective styles can pose issues
during testing. Researchers are currently trying to build better options
for patients and technicians alike.
Furthermore, researchers are starting to implement more
culturally-informed data collection practices to help reduce racial
biases in EEG research.
With other neuroimaging techniques
Simultaneous EEG recordings and fMRI scans have been obtained successfully,
though recording both at the same time effectively requires that
several technical difficulties be overcome, such as the presence of
ballistocardiographic artifact, MRI pulse artifact and the induction of
electrical currents in EEG wires that move within the strong magnetic
fields of the MRI. While challenging, these have been successfully
overcome in a number of studies.
MRI scanners produce detailed images by generating strong
magnetic fields that may exert potentially harmful displacement forces
and torque. These fields can also produce potentially harmful
radio-frequency heating and create image artifacts that render images
useless. Due to these potential risks, only certain medical devices can
be used in an MR environment.
Similarly, simultaneous recordings with MEG and EEG have also
been conducted, which has several advantages over using either technique
alone:
EEG requires accurate information about certain aspects of the
skull that can only be estimated, such as skull radius, and
conductivities of various skull locations. MEG does not have this issue,
and a simultaneous analysis allows this to be corrected for.
MEG and EEG both detect activity below the surface of the cortex
very poorly, and like EEG, the level of error increases with the depth
below the surface of the cortex one attempts to examine. However, the
errors are very different between the techniques, and combining them
thus allows for correction of some of this noise.
MEG has access to virtually no sources of brain activity below a few
centimetres under the cortex. EEG, on the other hand, can receive
signals from greater depth, albeit with a high degree of noise.
Combining the two makes it easier to determine what in the EEG signal
comes from the surface (since MEG is very accurate in examining signals
from the surface of the brain), and what comes from deeper in the brain,
thus allowing for analysis of deeper brain signals than either EEG or
MEG on its own.
Recently, a combined EEG/MEG (EMEG) approach has been investigated
for the purpose of source reconstruction in epilepsy diagnosis.
EEG has also been combined with positron emission tomography.
This provides the advantage of allowing researchers to see what EEG
signals are associated with different drug actions in the brain.
Recent studies using machine learning techniques such as neural networks with statistical temporal features extracted from frontal lobe EEG brainwave data have shown high levels of success in classifying mental states (Relaxed, Neutral, Concentrating), mental emotional states (Negative, Neutral, Positive) and thalamocortical dysrhythmia.
Mechanisms
The brain's electrical charge is maintained by billions of neurons. Neurons are electrically charged (or "polarized") by membrane transport proteins that pump ions across their membranes. Neurons are constantly exchanging ions with the extracellular milieu, for example to maintain resting potential and to propagate action potentials.
Ions of similar charge repel each other, and when many ions are pushed
out of many neurons at the same time, they can push their neighbours,
who push their neighbours, and so on, in a wave. This process is known
as volume conduction. When the wave of ions reaches the electrodes on
the scalp, they can push or pull electrons on the metal in the
electrodes. Since metal conducts the push and pull of electrons easily,
the difference in push or pull voltages between any two electrodes can
be measured by a voltmeter. Recording these voltages over time gives us the EEG.
The electric potential generated by an individual neuron is far too small to be picked up by EEG or MEG. EEG activity therefore always reflects the summation of the synchronous activity
of thousands or millions of neurons that have similar spatial
orientation. If the cells do not have similar spatial orientation, their
ions do not line up and create waves to be detected. Pyramidal neurons
of the cortex are thought to produce the most EEG signal because they
are well-aligned and fire together. Because voltage field gradients fall
off with the square of distance, activity from deep sources is more
difficult to detect than currents near the skull.
Scalp EEG activity shows oscillations at a variety of frequencies. Several of these oscillations have characteristic frequency ranges, spatial distributions and are associated with different states of brain functioning (e.g., waking and the various sleep stages). These oscillations represent synchronized activity
over a network of neurons. The neuronal networks underlying some of
these oscillations are understood (e.g., the thalamocortical resonance
underlying sleep spindles),
while many others are not (e.g., the system that generates the
posterior basic rhythm). Research that measures both EEG and neuron
spiking finds the relationship between the two is complex, with a
combination of EEG power in the gamma band and phase in the delta band relating most strongly to neuron spike activity.
Method
In conventional scalp EEG, the recording is obtained by placing electrodes on the scalp with a conductive gel or paste, usually after preparing the scalp area by light abrasion to reduce impedance
due to dead skin cells. Many systems use individual electrodes, each
attached to its own wire. Some systems use caps or nets
into which electrodes are embedded; this is particularly common when
high-density arrays of electrodes are needed.
Electrode locations and names are specified by the International 10–20 system
for most clinical and research applications (except when high-density
arrays are used). This system ensures that the naming of electrodes is
consistent across laboratories. In most clinical applications, 19
recording electrodes (plus ground and system reference) are used. A smaller number of electrodes are typically used when recording EEG from neonates.
Additional electrodes can be added to the standard set-up when a
clinical or research application demands increased spatial resolution
for a particular area of the brain. High-density arrays (typically via
cap or net) can contain up to 256 electrodes more-or-less evenly spaced
around the scalp.
Each electrode is connected to one input of a differential amplifier
(one amplifier per pair of electrodes); a common system reference
electrode is connected to the other input of each differential
amplifier. These amplifiers amplify the voltage between the active
electrode and the reference (typically 1,000–100,000 times, or 60–100 dB
of voltage gain). In analog EEG, the signal is then filtered (next
paragraph), and the EEG signal is output as the deflection of pens as
paper passes underneath. Most EEG systems these days, however, are
digital, and the amplified signal is digitized via an analog-to-digital converter, after being passed through an anti-aliasing filter.
Analog-to-digital sampling typically occurs at 256–512 Hz in clinical
scalp EEG; sampling rates of up to 20 kHz are used in some research
applications.
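The gain figures quoted above can be checked directly: for a voltage ratio, decibels are computed as 20·log10(ratio). A small sketch (the 256 Hz sampling rate is one of the clinical values mentioned above):

```python
import math

def voltage_gain_db(ratio):
    # dB for a voltage ratio uses 20*log10 (a power ratio would use 10*log10).
    return 20 * math.log10(ratio)

low = voltage_gain_db(1_000)      # 60 dB
high = voltage_gain_db(100_000)   # 100 dB

# The anti-aliasing filter must cut off below the Nyquist frequency,
# i.e., half the sampling rate, before analog-to-digital conversion.
fs = 256                          # Hz, low end of typical clinical sampling
nyquist = fs / 2                  # 128 Hz
```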
During the recording, a series of activation procedures may be
used. These procedures may induce normal or abnormal EEG activity that
might not otherwise be seen. These procedures include hyperventilation,
photic stimulation (with a strobe light), eye closure, mental activity,
sleep and sleep deprivation. During (inpatient) epilepsy monitoring, a
patient's typical seizure medications may be withdrawn.
The digital EEG signal is stored electronically and can be filtered for display. Typical settings for the high-pass filter and a low-pass filter are 0.5–1 Hz and 35–70 Hz respectively. The high-pass filter typically filters out slow artifact, such as electrogalvanic signals and movement artifact, whereas the low-pass filter filters out high-frequency artifacts, such as electromyographic signals. An additional notch filter
is typically used to remove artifact caused by electrical power lines
(60 Hz in the United States and 50 Hz in many other countries).
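The display-filter chain described above can be sketched with SciPy; the filter orders and the 256 Hz sampling rate are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

fs = 256  # assumed sampling rate, Hz

# 0.5 Hz high-pass for slow artifact, 70 Hz low-pass for EMG, and a
# 60 Hz notch for power-line interference (50 Hz in many other countries).
b_hp, a_hp = butter(2, 0.5 / (fs / 2), btype="highpass")
b_lp, a_lp = butter(4, 70 / (fs / 2), btype="lowpass")
b_n, a_n = iirnotch(60, Q=30, fs=fs)

def display_filter(raw):
    x = filtfilt(b_hp, a_hp, raw)   # remove slow drift / galvanic artifact
    x = filtfilt(b_lp, a_lp, x)     # remove high-frequency muscle artifact
    return filtfilt(b_n, a_n, x)    # remove mains hum
```

Using filtfilt (forward-backward filtering) keeps the filtered trace zero-phase, so waveform features are not shifted in time relative to the raw recording.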
The EEG signals can be captured with open-source hardware such as OpenBCI, and the signal can be processed by freely available EEG software such as EEGLAB or the Neurophysiological Biomarker Toolbox.
As part of an evaluation for epilepsy surgery, it may be
necessary to insert electrodes near the surface of the brain, under the
surface of the dura mater. This is accomplished via burr hole or craniotomy. This is referred to variously as "electrocorticography (ECoG)", "intracranial EEG (I-EEG)" or "subdural EEG (SD-EEG)". Depth electrodes may also be placed into brain structures, such as the amygdala or hippocampus,
which are common epileptic foci and may not be "seen"
clearly by scalp EEG. The electrocorticographic signal is processed in
the same manner as digital scalp EEG (above), with a couple of caveats.
ECoG is typically recorded at higher sampling rates than scalp EEG
because of the Nyquist theorem: the subdural signal contains a higher
predominance of high-frequency components. Also, many of the artifacts
that affect scalp EEG do not impact ECoG, and therefore display
filtering is often not needed.
A typical adult human EEG signal is about 10 μV to 100 μV in amplitude when measured from the scalp.
Since an EEG voltage signal represents a difference between the
voltages at two electrodes, the display of the EEG for the reading
encephalographer may be set up in one of several ways. The
representation of the EEG channels is referred to as a montage.
Sequential montage
Each channel (i.e., waveform) represents the difference between two
adjacent electrodes. The entire montage consists of a series of these
channels. For example, the channel "Fp1-F3" represents the difference in
voltage between the Fp1 electrode and the F3 electrode. The next
channel in the montage, "F3-C3", represents the voltage difference
between F3 and C3, and so on through the entire array of electrodes.
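A sketch of building a sequential montage from referential recordings (synthetic data; channel names follow the 10–20 system, and the 256 Hz sampling rate is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
# Referential recordings: each electrode's voltage vs. the system
# reference, one second of synthetic data per channel.
recorded = {name: rng.normal(0, 50, 256) for name in ("Fp1", "F3", "C3", "P3")}

# Sequential (bipolar) chain: each display channel is the voltage
# difference between two adjacent electrodes.
chain = ["Fp1", "F3", "C3", "P3"]
montage = {f"{a}-{b}": recorded[a] - recorded[b]
           for a, b in zip(chain, chain[1:])}
# Channels produced: "Fp1-F3", "F3-C3", "C3-P3".
```

Note the telescoping property of such a chain: summing all channels reproduces the difference between the first and last electrodes, since the intermediate terms cancel.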
Referential montage
Each channel represents the difference between a certain electrode
and a designated reference electrode. There is no standard position for
this reference; it is, however, at a different position than the
"recording" electrodes. Midline positions such as Cz, Oz, or Pz are
often used as the online reference because they do not amplify the
signal in one hemisphere over the other. Other popular offline references are:
REST reference: an offline computational reference at a point at
infinity, where the potential is zero. REST (reference electrode
standardization technique) uses the equivalent sources inside the brain
implied by a set of scalp recordings to transform recordings made with
any online or offline non-zero reference (average, linked ears, etc.)
into new recordings standardized to the zero reference at infinity.
"linked ears": which is a physical or mathematical average of electrodes attached to both earlobes or mastoids.
Average reference montage
The outputs of all of the amplifiers are summed and averaged, and
this averaged signal is used as the common reference for each channel.
Laplacian montage
Each channel represents the difference between an electrode and a weighted average of the surrounding electrodes.
When analog (paper) EEGs are used, the technologist switches between
montages during the recording in order to highlight or better
characterize certain features of the EEG. With digital EEG, all signals
are typically digitized and stored in a particular (usually referential)
montage; since any montage can be constructed mathematically from any
other, the EEG can be viewed by the electroencephalographer in any
display montage that is desired.
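Because any montage can be derived mathematically from the stored referential recording, re-referencing is just arithmetic on the channel matrix. A minimal sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
names = ["Fp1", "Fp2", "F3", "F4", "C3", "C4"]
referential = rng.normal(0, 50, (len(names), 256))   # channels x samples

# Average reference montage: subtract the instantaneous mean of all
# channels from every channel.
average_ref = referential - referential.mean(axis=0)

# A bipolar channel (e.g., Fp1-F3) is identical whichever montage it is
# derived from, because the common reference cancels in the subtraction.
bipolar_a = referential[0] - referential[2]
bipolar_b = average_ref[0] - average_ref[2]
```

This cancellation is why digital systems can store a single referential recording and let the reader view any display montage afterwards.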
The EEG is read by a clinical neurophysiologist or neurologist (depending on local custom and law regarding medical specialities),
optimally one who has specific training in the interpretation of EEGs
for clinical purposes. This is done by visual inspection of the
waveforms, called graphoelements. The use of computer signal processing
of the EEG – so-called quantitative electroencephalography – is somewhat controversial when used for clinical purposes (although there are many research uses).
Dry EEG electrodes
In the early 1990s Babak Taheri, at University of California, Davis
demonstrated the first single and also multichannel dry active
electrode arrays using micro-machining. The single channel dry EEG
electrode construction and results were published in 1994. The arrayed electrode was also demonstrated to perform well compared to silver/silver chloride electrodes. The device consisted of four sites of sensors with integrated electronics to reduce noise by impedance matching.
The advantages of such electrodes are: (1) no electrolyte used, (2) no
skin preparation, (3) significantly reduced sensor size, and (4)
compatibility with EEG monitoring systems. The active electrode array is
an integrated system made of an array of capacitive sensors with local
integrated circuitry housed in a package with batteries to power the
circuitry. This level of integration was required to achieve the
functional performance obtained by the electrode. The electrode was
tested on an electrical test bench and on human subjects in four
modalities of EEG activity, namely: (1) spontaneous EEG, (2) sensory
event-related potentials, (3) brain stem potentials, and (4) cognitive
event-related potentials. The performance of the dry electrode compared
favorably with that of the standard wet electrodes in terms of skin
preparation, no gel requirements (dry), and higher signal-to-noise
ratio.
In 1999 researchers at Case Western Reserve University, in Cleveland, Ohio, led by Hunter Peckham, used a 64-electrode EEG skullcap to return limited hand movements to quadriplegic
Jim Jatich. As Jatich concentrated on simple but opposite concepts like
up and down, his beta-rhythm EEG output was analysed using software to
identify patterns in the noise. A basic pattern was identified and used
to control a switch: Above average activity was set to on, below average
off. As well as enabling Jatich to control a computer cursor, the
signals were also used to drive the nerve controllers embedded in his
hands, restoring some movement.
In 2018, a functional dry electrode composed of a polydimethylsiloxane elastomer filled with conductive carbon nanofibers was reported. This research was conducted at the U.S. Army Research Laboratory.
EEG technology often involves applying a gel to the scalp, which
facilitates a strong signal-to-noise ratio. This results in more
reproducible and reliable experimental results. Since patients dislike
having their hair filled with gel, and the lengthy setup requires
trained staff on hand, utilizing EEG outside the laboratory setting can
be difficult. Additionally, it has been observed that wet electrode sensors' performance reduces after a span of hours. Therefore, research has been directed to developing dry and semi-dry EEG bioelectronic interfaces.
Dry electrode signals depend on mechanical contact. Therefore,
it can be difficult to obtain a usable signal because of the impedance
between the skin and the electrode. Some EEG systems attempt to circumvent this issue by applying a saline solution. Others have a semi-dry nature and release small amounts of gel upon contact with the scalp.
Another solution uses spring-loaded pin setups. These may be
uncomfortable. They may also be dangerous if used in a situation where a
patient could bump their head, since the pins could become lodged after
an impact.
Currently, headsets are available incorporating dry electrodes with up to 30 channels.
Such designs are able to compensate for some of the signal quality
degradation related to high impedances by optimizing pre-amplification,
shielding and supporting mechanics.
Limitations
EEG has several limitations. Most important is its poor spatial resolution.
EEG is most sensitive to a particular set of post-synaptic potentials:
those generated in superficial layers of the cortex, on the crests of gyri directly abutting the skull and radial to the skull. Dendrites which are deeper in the cortex, inside sulci, in midline or deep structures (such as the cingulate gyrus or hippocampus), or producing currents that are tangential to the skull, make far less contribution to the EEG signal.
EEG recordings do not directly capture axonal action potentials. An action potential can be accurately represented as a current quadrupole,
meaning that the resulting field decreases more rapidly than the ones
produced by the current dipole of post-synaptic potentials.
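A rough way to see this: in the far field a current dipole's potential falls off approximately as 1/r², while a quadrupole's falls off as 1/r³, so the quadrupolar field of an action potential fades faster with distance. An illustrative calculation (scaling only; physical constants are omitted):

```python
# Far-field scaling: doubling the distance weakens a dipole's potential
# about 4x, but a quadrupole's about 8x. Only the distance dependence
# is illustrated here.
def dipole_potential(r):
    return 1 / r ** 2

def quadrupole_potential(r):
    return 1 / r ** 3

dipole_drop = dipole_potential(1) / dipole_potential(2)              # 4.0
quadrupole_drop = quadrupole_potential(1) / quadrupole_potential(2)  # 8.0
```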
In addition, since EEGs represent averages of thousands of neurons, a
large population of cells in synchronous activity is necessary to cause a
significant deflection on the recordings. Action potentials are very
fast and, as a consequence, the chances of field summation are slim.
However, neural backpropagation,
as a typically longer dendritic current dipole, can be picked up by EEG
electrodes and is a reliable indication of the occurrence of neural
output.
Not only do EEGs capture dendritic currents almost exclusively, as
opposed to axonal currents, they also show a preference for activity in
populations of parallel dendrites transmitting current in the same
direction at the same time. Pyramidal neurons
of cortical layers II/III and V extend apical dendrites to layer I.
Currents moving up or down these processes underlie most of the signals
produced by electroencephalography.
Therefore, EEG provides information heavily biased toward select
neuron types, and generally should not be used to make claims about
global brain activity. The meninges, cerebrospinal fluid and skull "smear" the EEG signal, obscuring its intracranial source.
It is mathematically impossible to reconstruct a unique intracranial current source for a given EEG signal, as some currents produce potentials that cancel each other out. This is referred to as the inverse problem. However, much work has been done to produce remarkably good estimates of, at least, a localized electric dipole that represents the recorded currents.
EEG vis-à-vis fMRI, fNIRS, fUS and PET
EEG
has several strong points as a tool for exploring brain activity. EEGs
can detect changes over milliseconds, which is excellent considering an action potential takes approximately 0.5–130 milliseconds to propagate across a single neuron, depending on the type of neuron. Other methods of looking at brain activity, such as PET, fMRI or fUS
have time resolution between seconds and minutes. EEG measures the
brain's electrical activity directly, while other methods record changes
in blood flow (e.g., SPECT, fMRI, fUS) or metabolic activity (e.g., PET, NIRS), which are indirect markers of brain electrical activity.
EEG can be used simultaneously with fMRI or fUS so that
high-temporal-resolution data can be recorded at the same time as
high-spatial-resolution data; however, since the data derived from each
occur over a different time course, the data sets do not necessarily
represent exactly the same brain activity.
There are technical difficulties associated with combining EEG and fMRI,
including the need to remove the MRI gradient artifact present
during MRI acquisition. Furthermore, currents can be induced in moving
EEG electrode wires due to the magnetic field of the MRI.
EEG can be used simultaneously with NIRS
or fUS without major technical difficulties. There is no influence of
these modalities on each other and a combined measurement can give
useful information about electrical activity as well as hemodynamics at
medium spatial resolution.
EEG vis-à-vis MEG
EEG reflects correlated synaptic activity caused by post-synaptic potentials of cortical neurons. The ionic currents involved in the generation of fast action potentials may not contribute greatly to the averaged field potentials representing the EEG.
More specifically, the scalp electrical potentials that produce EEG are
generally thought to be caused by the extracellular ionic currents
caused by dendritic electrical activity, whereas the fields producing magnetoencephalographic signals are associated with intracellular ionic currents.
Normal activity
Human EEG with prominent resting state
activity – alpha-rhythm. Left: EEG traces (horizontal – time in seconds;
vertical – amplitudes, scale 100 μV). Right: power spectra of shown
signals (vertical lines – 10 and 20 Hz, scale is linear). Alpha-rhythm
consists of sinusoidal-like waves with frequencies in 8–12 Hz range
(11 Hz in this case) more prominent in posterior sites. Alpha range is
red at power spectrum graph.
Human EEG in resting state. Left: EEG
traces (horizontal – time in seconds; vertical – amplitudes, scale
100 μV). Right: power spectra of shown signals (vertical lines – 10 and
20 Hz, scale is linear). 80–90% of people have prominent sinusoidal-like
waves with frequencies in 8–12 Hz range – alpha rhythm. Others (like
this) lack this type of activity.
Common artifacts in human EEG. 1:
Electrooculographic artifact caused by movement of the eyeball's
muscles (related to blinking, for example): a large-amplitude, slow,
positive wave prominent in frontal electrodes. 2: Electrode artifact
caused by bad contact (and thus higher impedance) between the P3 electrode
and skin. 3: Swallowing artifact. 4: Common reference-electrode
artifact caused by bad contact between the reference electrode and skin,
seen as a huge wave similar in all channels.
One second of EEG signal
The EEG is typically described in terms of (1) rhythmic activity
and (2) transients. The rhythmic activity is divided into bands by
frequency. To some degree, these frequency bands are a matter of
nomenclature (i.e., any rhythmic activity between 8–12 Hz can be
described as "alpha"), but these designations arose because rhythmic
activity within a certain frequency range was noted to have a certain
distribution over the scalp or a certain biological significance.
Frequency bands are usually extracted using spectral methods (for
instance Welch) as implemented for instance in freely available EEG
software such as EEGLAB or the Neurophysiological Biomarker Toolbox.
Computational processing of the EEG is often named quantitative electroencephalography (qEEG).
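As a concrete illustration of the spectral approach, band power can be estimated from a Welch periodogram with standard scientific Python tools. This is a minimal sketch; the sampling rate and the synthetic 10 Hz signal are assumptions chosen for the example, not part of any particular EEG package:

```python
import numpy as np
from scipy.signal import welch

fs = 250.0                      # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic "EEG": a 10 Hz alpha-like sine plus broadband noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Welch periodogram with 2 s segments -> 0.5 Hz frequency resolution
freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))

def band_power(freqs, psd, lo, hi):
    """Approximate power in [lo, hi) Hz by summing PSD bins."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

alpha = band_power(freqs, psd, 8, 12)
beta = band_power(freqs, psd, 13, 30)
print(alpha > beta)  # the 10 Hz component dominates: True
```

Toolboxes such as EEGLAB wrap the same idea with channel handling, artifact rejection, and topographic plotting on top.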
Most of the cerebral signal observed in the scalp EEG falls in
the range of 1–20 Hz (activity below or above this range is likely to be
artifactual, under standard clinical recording techniques). Waveforms
are subdivided into bandwidths known as alpha, beta, theta, and delta to
signify the majority of the EEG used in clinical practice.
Gamma activity has been associated with:
inhibition of elicited responses (it has been found to
spike in situations where a person is actively trying to repress a
response or action);
cross-modal sensory processing (perception that combines two different senses, such as sound and sight);
and short-term memory matching of recognized objects, sounds, or tactile sensations.
A decrease in gamma-band activity may be associated with
cognitive decline, especially when related to the theta band; however,
this has not been validated as a clinical diagnostic measure.
Mu suppression could indicate that motor mirror neurons are working. Deficits in mu suppression, and thus in mirror neurons, might play a role in autism.
The practice of using only whole numbers in the definitions comes
from practical considerations in the days when only whole cycles could
be counted on paper records. This leads to gaps in the definitions, as
seen elsewhere on this page. The theoretical definitions have always
been more carefully defined to include all frequencies. Unfortunately
there is no agreement in standard reference works on what these ranges
should be: values for the upper end of alpha and lower end of
beta include 12, 13, 14 and 15. If the threshold is taken as 14 Hz, then
the slowest beta wave has about the same duration as the longest spike
(70 ms), which makes this the most useful value.
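Because of the gaps and disagreements just described, any band table in code is a convention rather than a standard. A sketch with one common set of boundaries (the 13 Hz alpha/beta threshold here is an arbitrary choice among the 12, 13, 14 and 15 Hz options cited above):

```python
# One common convention for EEG band boundaries, in Hz.
# Sources disagree; e.g. the alpha/beta threshold is variously 12-15 Hz.
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 30.0),
    "gamma": (30.0, 100.0),
}

def classify(freq_hz: float) -> str:
    """Return the band name for a frequency, or 'unclassified'."""
    for name, (lo, hi) in BANDS.items():
        if lo <= freq_hz < hi:
            return name
    return "unclassified"

print(classify(10.0))   # alpha
print(classify(14.0))   # beta (under this convention)
```

Using half-open intervals avoids the gaps that whole-number definitions leave between bands.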
Wave patterns
Delta waves
Delta is the frequency range up to 4 Hz. It tends to have the highest
amplitude and the slowest waves. It is seen normally in adults in slow-wave sleep.
It is also seen normally in babies. It may occur focally with
subcortical lesions and in general distribution with diffuse lesions,
metabolic encephalopathy, hydrocephalus, or deep midline lesions. It is
usually most prominent frontally in adults (e.g. FIRDA – frontal
intermittent rhythmic delta) and posteriorly in children (e.g. OIRDA –
occipital intermittent rhythmic delta).
Theta
is the frequency range from 4 Hz to 7 Hz. Theta is seen normally in
young children. It may be seen in drowsiness or arousal in older
children and adults; it can also be seen in meditation.
Excess theta for age represents abnormal activity. It can be seen as a
focal disturbance in focal subcortical lesions; it can be seen in
generalized distribution in diffuse disorder or metabolic encephalopathy
or deep midline disorders or some instances of hydrocephalus.
Conversely, this range has been associated with reports of relaxed,
meditative, and creative states.
Alpha is the frequency range from 8 Hz to 12 Hz. Hans Berger
named the first rhythmic EEG activity he observed the "alpha wave".
This was the "posterior basic rhythm" (also called the "posterior
dominant rhythm" or the "posterior alpha rhythm"), seen in the posterior
regions of the head on both sides, higher in amplitude on the dominant
side. It emerges with closing of the eyes and with relaxation, and
attenuates with eye opening or mental exertion. The posterior basic
rhythm is actually slower than 8 Hz in young children (therefore
technically in the theta range).
In addition to the posterior basic rhythm, there are other normal alpha rhythms such as the mu rhythm (alpha activity in the contralateral sensory and motor
cortical areas) that emerges when the hands and arms are idle; and the
"third rhythm" (alpha activity in the temporal or frontal lobes).
Alpha can be abnormal; for example, an EEG that has diffuse alpha
occurring in coma and is not responsive to external stimuli is referred
to as "alpha coma".
Beta
is the frequency range from 13 Hz to about 30 Hz. It is seen usually on
both sides in symmetrical distribution and is most evident frontally.
Beta activity is closely linked to motor behavior and is generally
attenuated during active movements.
Low-amplitude beta with multiple and varying frequencies is often
associated with active, busy or anxious thinking and active
concentration. Rhythmic beta with a dominant set of frequencies is
associated with various pathologies, such as Dup15q syndrome, and drug effects, especially benzodiazepines.
It may be absent or reduced in areas of cortical damage. It is the
dominant rhythm in patients who are alert or anxious or who have their
eyes open.
Gamma
is the frequency range approximately 30–100 Hz. Gamma rhythms are
thought to represent binding of different populations of neurons
together into a network for the purpose of carrying out a certain
cognitive or motor function.
Mu
range is 8–13 Hz and partly overlaps with other frequencies. It
reflects the synchronous firing of motor neurons in the resting state. Mu
suppression is thought to reflect motor mirror neuron systems, because
when an action is observed, the pattern extinguishes, possibly because
the normal and mirror neuronal systems "go out of sync" and interfere
with one another.
"Ultra-slow" or "near-DC"
activity is recorded using DC amplifiers in some research contexts. It
is not typically recorded in a clinical context because the signal at
these frequencies is susceptible to a number of artifacts.
Some features of the EEG are transient rather than rhythmic. Spikes and sharp waves may represent seizure activity or interictal
activity in individuals with epilepsy or a predisposition toward
epilepsy. Other transient features are normal: vertex waves and sleep
spindles are seen in normal sleep.
There are types of activity that are statistically uncommon, but
not associated with dysfunction or disease. These are often referred to
as "normal variants". The mu rhythm is an example of a normal variant.
The normal electroencephalogram (EEG) varies by age. The prenatal
and neonatal EEG are quite different from the adult EEG. Fetuses in the
third trimester and newborns display two common brain activity patterns:
"discontinuous" and "trace alternant." "Discontinuous" electrical
activity refers to sharp bursts of electrical activity followed by low
frequency waves. "Trace alternant" electrical activity describes sharp
bursts followed by short high amplitude intervals and usually indicates
quiet sleep in newborns. The EEG in childhood generally has slower frequency oscillations than the adult EEG.
The normal EEG also varies depending on state. The EEG is used along with other measurements (EOG, EMG) to define sleep stages in polysomnography.
Stage I sleep (equivalent to drowsiness in some systems) appears on the
EEG as drop-out of the posterior basic rhythm. There can be an increase
in theta frequencies. Santamaria and Chiappa cataloged the variety of
patterns associated with drowsiness. Stage II sleep is
characterized by sleep spindles – transient runs of rhythmic activity in
the 12–14 Hz range (sometimes referred to as the "sigma" band) that
have a frontal-central maximum. Most of the activity in Stage II is in
the 3–6 Hz range. Stage III and IV sleep are defined by the presence of
delta frequencies and are often referred to collectively as "slow-wave
sleep". Stages I–IV comprise non-REM (or "NREM") sleep. The EEG in REM
(rapid eye movement) sleep appears somewhat similar to the awake EEG.
EEG under general anesthesia depends on the type of anesthetic employed. With halogenated anesthetics, such as halothane, or intravenous agents, such as propofol,
a rapid (alpha or low beta), nonreactive EEG pattern is seen over most
of the scalp, especially anteriorly; in some older terminology this was
known as a WAR (widespread anterior rapid) pattern, contrasted with a
WAIS (widespread slow) pattern associated with high doses of opiates.
Anesthetic effects on EEG signals are beginning to be understood at the
level of drug actions on different kinds of synapses and the circuits
that allow synchronized neuronal activity.
Artifacts
EEG is an extremely useful technique for studying brain activity, but the signal measured is always contaminated by artifacts
which can impact the analysis of the data. An artifact is any measured
signal that does not originate within the brain. Although multiple
algorithms exist for the removal of artifacts, the problem of how to
deal with them remains an open question. The source of artifacts can be
from issues relating to the instrument, such as faulty electrodes, line
noise or high electrode impedance, or they may be from the physiology
of the subject being recorded. These can include eye blinks and
movements, cardiac activity, and muscle activity; such
artifacts are more complicated to remove. Artifacts may bias the visual
interpretation of EEG data as some may mimic cognitive activity that
could affect diagnoses of problems such as Alzheimer's disease or sleep
disorders. As such the removal of such artifacts in EEG data used for
practical applications is of the utmost importance.
Artifact removal
It
is important to be able to distinguish artifacts from genuine brain
activity in order to prevent incorrect interpretations of EEG data.
General approaches for the removal of artifacts from the data are
prevention, rejection, and cancellation. The goal of any approach is to
develop methodology capable of identifying and removing artifacts
without affecting the quality of the EEG signal. As artifact sources are
quite different, the majority of researchers focus on developing
algorithms that will identify and remove a single type of noise in the
signal. Simple filtering using a notch filter
is commonly employed to reject components with a 50/60 Hz frequency.
However, such simple filters are not an appropriate choice for dealing
with all artifacts, as for some, their frequencies overlap with the
EEG frequencies.
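A 50/60 Hz notch of this kind takes only a few lines with scipy. The sketch below simulates a contaminated signal and removes the line component; the 250 Hz sampling rate, 50 Hz line frequency and quality factor are illustrative assumptions, not fixed values:

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 250.0            # assumed sampling rate (Hz)
f0, Q = 50.0, 30.0    # line frequency (60 Hz in the Americas) and quality factor

t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
clean = np.sin(2 * np.pi * 10 * t)          # 10 Hz "alpha" rhythm
mains = 0.8 * np.sin(2 * np.pi * f0 * t)    # line-noise contamination
eeg = clean + mains + 0.1 * rng.standard_normal(t.size)

b, a = iirnotch(f0, Q, fs)
filtered = filtfilt(b, a, eeg)  # zero-phase filtering avoids phase distortion

# The 50 Hz component is strongly attenuated while the 10 Hz rhythm survives
print(np.std(filtered - clean) < np.std(eeg - clean))  # True
```

Zero-phase filtering (`filtfilt`) is used here because a one-pass IIR filter would shift the phase of the EEG rhythms, which matters for event-related analyses.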
Regression algorithms have a moderate computation cost and are
simple. They represented the most popular correction method up until the
mid-1990s when they were replaced by "blind source separation" type
methods. Regression algorithms work on the premise that all artifacts
are captured by one or more reference channels. Subtracting these
reference channels from the other contaminated channels, in either the
time or frequency domain, by estimating the impact of the reference
channels on the other channels, corrects the channels for the
artifact. Although the requirement of reference channels ultimately led
to this class of algorithm being replaced, they still represent the
benchmark against which modern algorithms are evaluated.
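The reference-channel premise can be sketched for a single contaminated channel: estimate by least squares how much of the reference (here a simulated EOG blink) leaks into the channel, then subtract it. All signals, amplitudes and the mixing coefficient below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
brain = rng.standard_normal(n)      # "true" cortical signal (unknown in practice)
eog = np.zeros(n)
eog[500:520] = 50.0                 # reference channel: a large blink deflection
eeg = brain + 0.4 * eog             # contaminated scalp channel

# Least-squares estimate of the leakage coefficient (time-domain regression)
coef = np.dot(eog, eeg) / np.dot(eog, eog)
corrected = eeg - coef * eog        # blink removed, brain signal kept

print(abs(coef - 0.4) < 0.05)                       # recovers the mixing weight
print(np.abs(corrected).max() < np.abs(eeg).max())  # blink peak is gone
```

With several reference channels, the scalar division generalizes to a multivariate least-squares fit per contaminated channel.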
Blind source separation (BSS) algorithms employed to remove artifacts include principal component analysis (PCA) and independent component analysis (ICA) and several algorithms in this class have been successful at tackling most physiological artifacts.
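Of these, PCA is the easiest to sketch: a high-amplitude artifact that projects to every channel dominates the variance, so it is largely captured by the first principal component, which can then be subtracted. The toy example below uses simulated data; real pipelines typically prefer ICA and more careful component selection:

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch, n_t = 8, 5000
brain = 0.5 * rng.standard_normal((n_ch, n_t))   # background activity
blink = np.zeros(n_t)
blink[1000:1050] = 20.0                          # large ocular artifact
mix = rng.uniform(0.5, 1.0, size=(n_ch, 1))      # blink leaks into all channels
eeg = brain + mix * blink

# PCA via SVD: the blink dominates the first principal component
X = eeg - eeg.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
cleaned = X - np.outer(U[:, 0], s[0] * Vt[0])    # drop the first component

print(np.abs(cleaned).max() < np.abs(eeg).max())  # artifact amplitude removed
```

Dropping a component inevitably removes a little genuine brain signal along with the artifact, which is one reason ICA, with its statistically independent components, is usually preferred in practice.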
Physiological artifacts
Ocular artifacts
Ocular
artifacts affect the EEG signal significantly. Eye movements change the
electric field surrounding the eyes, which distorts the electric field
over the scalp and thus the signal recorded there. Researchers disagree
on whether ocular artifacts can reasonably be described as arising from
a single generator, or whether the potentially complicated underlying
mechanisms must be understood. Three potential mechanisms have been
proposed to explain the ocular artifact.
The first is corneal retinal dipole movement which argues that an electric dipole
is formed between the cornea and retina, as the former is positively
and the latter negatively charged. When the eye moves, so does this
dipole, which affects the electric field over the scalp; this is the
most standard view. The second mechanism is retinal dipole movement,
which is similar to the first but differing in that it argues there is a
potential difference, hence dipole across the retina with the cornea
having little effect. The third mechanism is eyelid movement. It is
known that there is a change in voltage around the eyes when the eyelid
moves, even if the eyeball does not. It is thought that the eyelid can
be described as a sliding potential source and that the impact of
blinking on the recorded EEG differs from that of eye movement.
Eyelid fluttering artifacts of a characteristic type were
previously called Kappa rhythm (or Kappa waves). They are usually seen in
the prefrontal leads, that is, just over the eyes. Sometimes they are
seen with mental activity. They are usually in the theta (4–7 Hz) or
alpha (7–14 Hz) range. They were so named because they were believed to
originate from the brain. Later study revealed they were generated by
rapid fluttering of the eyelids, sometimes so minute that it was
difficult to see. They are in fact noise in the EEG reading, and should
not technically be called a rhythm or wave. Therefore, current usage in
electroencephalography refers to the phenomenon as an eyelid fluttering
artifact, rather than a Kappa rhythm (or wave).
The propagation of the ocular artifact is impacted by multiple
factors including the properties of the subject's skull, neuronal
tissues and skin, but the signal may be approximated as inversely
proportional to the square of the distance from the eyes. The electrooculogram
(EOG) consists of a series of electrodes measuring voltage changes
close to the eye and is the most common tool for dealing with the eye
movement artifact in the EEG signal.
Muscular artifacts
Another
source of artifacts is muscle movement anywhere on the body. This
particular class of artifact is usually recorded by all electrodes on
the scalp due to myogenic activity
(electrical activity generated by contracting muscle). These
artifacts have no single origin; they arise from functionally
independent muscle groups, so the characteristics of the artifact
are not constant. The observed patterns due to muscular artifacts will
change depending on subject sex, the particular muscle tissue, and its
degree of contraction. The frequency range for muscular artifacts is
wide and overlaps with every classic EEG rhythm. However, most of the
power is concentrated between 20 and 300 Hz, making the gamma band
particularly susceptible to muscular
artifacts. Some muscle artifacts may have activity with a frequency as
low as 2 Hz, so delta and theta bands may also be affected by muscle
activity. Muscular artifacts may impact sleep studies, as unconscious bruxism
(grinding of teeth) movements or snoring can seriously impact the
quality of the recorded EEG. In addition the recordings made of epilepsy
patients may be significantly impacted by the existence of muscular
artifacts.
Cardiac artifacts
The potential due to cardiac activity introduces electrocardiograph (ECG) errors in the EEG. Artifacts arising due to cardiac activity may be removed with the help of an ECG reference signal.
Other physiological artifacts
Glossokinetic
artifacts are caused by the potential difference between the base and
the tip of the tongue. Minor tongue movements can contaminate the EEG,
especially in parkinsonian and tremor disorders.
Environmental artifacts
In
addition to artifacts generated by the body, many artifacts originate
from outside the body. Movement by the patient, or even just settling of
the electrodes, may cause electrode pops, spikes originating from a momentary change in the impedance of a given electrode. Poor grounding of the EEG electrodes can cause significant 50 or 60 Hz artifact, depending on the local power system's frequency. A third source of possible interference can be the presence of an IV drip; such devices can cause rhythmic, fast, low-voltage bursts, which may be confused for spikes.
Abnormal activity
Abnormal activity can broadly be separated into epileptiform and non-epileptiform activity, and into focal or diffuse activity.
Focal epileptiform discharges represent fast, synchronous
potentials in a large number of neurons in a somewhat discrete area of
the brain. These can occur as interictal activity, between seizures, and
represent an area of cortical irritability that may be predisposed to
producing epileptic seizures. Interictal discharges are not wholly
reliable for determining whether a patient has epilepsy, nor for
localizing where a seizure might originate. (See focal epilepsy.)
Generalized epileptiform discharges often have an anterior
maximum, but these are seen synchronously throughout the entire brain.
They are strongly suggestive of a generalized epilepsy.
Focal non-epileptiform abnormal activity may occur over areas of the brain where there is focal damage of the cortex or white matter.
It often consists of an increase in slow frequency rhythms and/or a
loss of normal higher frequency rhythms. It may also appear as focal or
unilateral decrease in amplitude of the EEG signal.
Diffuse non-epileptiform abnormal activity may manifest as
diffuse abnormally slow rhythms or bilateral slowing of normal rhythms,
such as the PBR.
Intracortical encephalogram electrodes and subdural electrodes
can be used in tandem to discriminate artifact from
epileptiform and other severe neurological events.
More advanced measures of abnormal EEG signals have also recently
received attention as possible biomarkers for different disorders such
as Alzheimer's disease.
Remote communication
Systems for decoding imagined speech from EEG have applications such as in brain–computer interfaces.
EEG diagnostics
The Department of Defense (DoD), Veterans Affairs (VA), and the U.S. Army Research Laboratory (ARL) collaborated on EEG diagnostics in order to detect mild to moderate traumatic brain injury (mTBI) in combat soldiers.
Between 2000 and 2012, 75 percent of brain injuries sustained in U.S.
military operations were classified as mTBI. In response, the DoD pursued
new technologies capable of rapid, accurate, non-invasive, and
field-capable detection of mTBI.
Combat personnel often develop PTSD and mTBI in correlation. Both
conditions present with altered low-frequency brain wave oscillations.
PTSD patients present with decreases in
low-frequency oscillations, whereas mTBI injuries are linked to
increased low-frequency oscillations. Effective EEG diagnostics can
help doctors accurately identify conditions and appropriately treat
injuries in order to mitigate long-term effects.
Traditionally, clinical evaluation of EEGs involved visual
inspection of brain wave topography. Quantitative
electroencephalography (qEEG) instead applies computerized
algorithmic methods, analyzing a specific region of the brain and
transforming the data into a meaningful "power spectrum" of the area.
Accurately differentiating between mTBI and PTSD can significantly
increase positive recovery outcomes for patients especially since
long-term changes in neural communication can persist after an initial
mTBI incident.
Inexpensive
EEG devices exist for the low-cost research and consumer markets.
Recently, a few companies have miniaturized medical grade EEG technology
to create versions accessible to the general public. Some of these
companies have built commercial EEG devices retailing for less than
US$100.
In 2004 OpenEEG released its ModularEEG as open source hardware.
Compatible open source software includes a game for balancing a ball.
In 2007 NeuroSky
released the first affordable consumer-based EEG along with the game
NeuroBoy. This was also the first large-scale EEG device to use dry
sensor technology.
In 2008 the Final Fantasy developer Square Enix announced that it was partnering with NeuroSky to create a game, Judecca.
In 2009 Mattel partnered with NeuroSky to release the Mindflex, a game that used an EEG to steer a ball through an obstacle course. It is by far the best-selling consumer-based EEG to date.
In 2009 Uncle Milton Industries partnered with NeuroSky to release the Star Wars Force Trainer, a game designed to create the illusion of possessing the Force.
In 2010, NeuroSky added a blink and electromyography function to the MindSet.
In 2011, NeuroSky released the MindWave, an EEG device designed for educational purposes and games. The MindWave won the Guinness Book of World Records award for "Heaviest machine moved using a brain control interface".
In 2012, a Japanese gadget project, neurowear,
released Necomimi: a headset with motorized cat ears. The headset is a
NeuroSky MindWave unit with two motors on the headband where a cat's
ears might be. Slipcovers shaped like cat ears sit over the motors so
that, as the device registers emotional states, the ears move to reflect
them. For example, the ears fall to the sides when the wearer is relaxed
and perk up when the wearer is excited.
In 2014, OpenBCI released an eponymous open source
brain-computer interface after a successful Kickstarter campaign in
2013. The board, later renamed "Cyton", has 8 channels, expandable to 16
with the Daisy module. It supports EEG, EKG, and EMG. The Cyton board is based on the Texas Instruments ADS1299 IC
and the Arduino or PIC microcontroller, and initially cost $399
before increasing in price to $999. It uses standard metal cup
electrodes and conductive paste.
In 2015, Mind Solutions Inc released the smallest consumer BCI to date, the NeuroSync. This device functions as a dry sensor at a size no larger than a Bluetooth ear piece.
In 2021, BioSerenity
released the Neuronaute and Icecap, a single-use disposable EEG headset
that allows recording with quality equivalent to traditional cup
electrodes.
Future research
The EEG has been used for many purposes besides the conventional uses
of clinical diagnosis and conventional cognitive neuroscience. An early
use was during World War II by the U.S. Army Air Corps to screen out
pilots in danger of having seizures; long-term EEG recordings in epilepsy patients are still used today for seizure prediction. Neurofeedback remains an important extension, and in its most advanced form is also attempted as the basis of brain computer interfaces. The EEG is also used quite extensively in the field of neuromarketing.
The EEG is altered by drugs that affect brain functions, the chemicals that are the basis for psychopharmacology. Berger's early experiments recorded the effects of drugs on EEG. The science of pharmaco-electroencephalography has developed methods to identify substances that systematically alter brain functions for therapeutic and recreational use.
Honda is attempting to develop a system to enable an operator to control its Asimo robot using EEG, a technology it eventually hopes to incorporate into its automobiles.
EEGs have been used as evidence in criminal trials in the Indian state of Maharashtra. Brain Electrical Oscillation Signature Profiling (BEOS), an EEG technique, was used in the trial of State of Maharashtra v. Sharma
to show Sharma remembered using arsenic to poison her ex-fiancé,
although the reliability and scientific basis of BEOS is disputed.
Much research is currently being carried out to make
EEG devices smaller, more portable, and easier to use. So-called
"wearable EEG" is based upon creating low-power wireless collection
electronics and 'dry' electrodes which do not require a conductive gel
to be used.
Wearable EEG aims to provide small EEG devices which are present only
on the head and which can record EEG for days, weeks, or months at a
time, as ear-EEG.
Such prolonged and easy-to-use monitoring could make a step change in
the diagnosis of chronic conditions such as epilepsy, and greatly
improve the end-user acceptance of BCI systems.
Research is also being carried out on identifying specific solutions to
increase the battery lifetime of Wearable EEG devices through the use
of the data reduction approach.
In research, EEG is now often used in combination with machine learning.
EEG data are pre-processed, then passed on to machine learning
algorithms. These algorithms are then trained to recognize different
diseases like schizophrenia, epilepsy or dementia. Machine learning is also increasingly used for seizure detection.
By using machine learning, the data can be analyzed automatically. In
the long run this research is intended to build algorithms that support
physicians in their clinical practice and to provide further insights into diseases. In this vein, complexity measures of EEG data are often calculated, such as Lempel-Ziv complexity, fractal dimension, and spectral flatness. It has been shown that combining or multiplying such measures can reveal previously hidden information in EEG data.
EEG signals from musical performers were used to create instant
compositions and one CD by the Brainwave Music Project, run at the Computer Music Center at Columbia University by Brad Garton and Dave Soldier. Similarly, an hour-long recording of the brainwaves of Ann Druyan was included on the Voyager Golden Record, launched on the Voyager
probes in 1977, in case any extraterrestrial intelligence could decode
her thoughts, which included what it was like to fall in love.
History
In 1875, Richard Caton (1842–1926), a physician practicing in Liverpool, presented his findings about electrical phenomena of the exposed cerebral hemispheres of rabbits and monkeys in the British Medical Journal. In 1890, Polish physiologist Adolf Beck
published an investigation of spontaneous electrical activity of the
brain of rabbits and dogs that included rhythmic oscillations altered by
light. Beck started experiments on the electrical brain activity of
animals, placing electrodes directly on the surface of the brain to
test for sensory stimulation. His observation of fluctuating brain
activity led him to conclude the existence of brain waves.
German physiologist and psychiatrist Hans Berger (1873–1941) recorded the first human EEG in 1924.
Expanding on work previously conducted on animals by Richard Caton and
others, Berger also invented the electroencephalograph (giving the
device its name), an invention described "as one of the most surprising,
remarkable, and momentous developments in the history of clinical
neurology". His discoveries were first confirmed by British scientists Edgar Douglas Adrian and B. H. C. Matthews in 1934 and developed by them.
In 1934, Fisher and Lowenbach first demonstrated epileptiform spikes. In 1935, Gibbs, Davis and Lennox described interictal spike waves and the three cycles/s pattern of clinical absence seizures, which began the field of clinical electroencephalography. Subsequently, in 1936 Gibbs and Jasper
reported the interictal spike as the focal signature of epilepsy. The
same year, the first EEG laboratory opened at Massachusetts General
Hospital.
Franklin Offner (1911–1999), professor of biophysics at Northwestern University developed a prototype of the EEG that incorporated a piezoelectric inkwriter called a Crystograph (the whole device was typically known as the Offner Dynograph).
In 1947, the American EEG Society was founded and the first International EEG congress was held. In 1953 Aserinsky and Kleitman described REM sleep.
In the 1950s, William Grey Walter developed an adjunct to EEG called EEG topography,
which allowed for the mapping of electrical activity across the surface
of the brain. This enjoyed a brief period of popularity in the 1980s
and seemed especially promising for psychiatry. It was never accepted by
neurologists and remains primarily a research tool.
An electroencephalograph system manufactured by Beckman Instruments was used on at least one of the Project Gemini
manned spaceflights (1965–1966) to monitor the brain waves of
astronauts on the flight. It was one of many Beckman Instruments
specialized for and used by NASA.
The first instance of the use of EEG to control a physical
object, a robot, was in 1988. The robot would follow a line or stop
depending on the alpha activity of the subject. If the subject relaxed
and closed their eyes, thereby increasing alpha activity, the robot would
move; opening their eyes, and thus decreasing alpha activity, would
cause the robot to stop along the trajectory.
In October 2018, scientists connected the brains of three people
to experiment with the process of thought sharing. Five groups of three
people participated in the experiment using EEG. The success rate of
the experiment was 81%.