Liberal internationalism is a foreign policy doctrine that supports international institutions, open markets, cooperative security, and liberal democracy. At its core, it holds that states should participate in international institutions that uphold rules-based norms, promote liberal democracy, and facilitate cooperation on transnational problems (such as environmental problems, arms control, and public health).
Proponents of liberal internationalism argue that the United States' adoption of this foreign policy orientation during the 20th century improved American liberty at home, helped secure American hegemony in world politics, and facilitated the spread of liberal democracy and markets. Critics of the foreign policy doctrine (such as realists and proponents of retrenchment) argue that it tends towards military interventionism and contributes to disorder (for example, through democracy promotion and trade liberalization).
Among policymakers, liberal internationalism influenced British Foreign Secretary and Prime Minister Lord Palmerston, and was developed in the second decade of the 20th century under U.S. President Woodrow Wilson. In this form it became known as Wilsonianism. After World War I, the foreign policy doctrine of liberal internationalism was retained by the intellectual founders of the League of Nations and augmented somewhat with ideas from classical radicalism and the political party platform of the International Entente of Radical and Similar Democratic Parties. Daniel Deudney and John Ikenberry have also associated liberal internationalism with foreign policy ideas promoted by Franklin D. Roosevelt.
Paul K. MacDonald has identified diplomatic practices developed at the 1899 and 1907 Hague conferences as key repertoires of subsequent liberal internationalism.
Theory
The goal of liberal internationalism is to achieve global structures within the international system that are inclined towards promoting a liberal world order. It foresees a gradual transformation of world politics from anarchy to common institutions and the rule of law. To that end, global free trade, liberal economics and liberal political systems are all encouraged. In addition, liberal internationalists are dedicated to encouraging democracy to emerge globally. Once realized, this is expected to yield a "peace dividend", since relations between liberal states are characterized by non-violence and relations between democracies follow the pattern described by the democratic peace theory.
Liberal internationalism states that, through multilateral organizations such as the United Nations, it is possible to avoid the worst excesses of "power politics"
in relations between nations. In addition, liberal internationalists
believe that the best way to spread democracy is to treat all states
equally and cooperatively, whether they are initially democratic or not.
According to Abrahamsen, liberal internationalism provides middle powers with more opportunities to advance their economic, security, and political interests.
Examples
Examples of liberal internationalists include former British Prime Minister Tony Blair, U.S. President Barack Obama, his Secretary of State Hillary Clinton, and later Secretary of State Antony Blinken. In the US, it is often associated with the Democratic Party. Some liberal-leaning neoconservatives shifted towards liberal internationalism in the 2010s.
Multilateral institutions, such as UNDP, UNICEF, WHO, and the UN General Assembly, have also been considered examples of liberal internationalism.
According to Ikenberry and Yoichi Funabashi, one of the key pillars of liberal internationalism in practice is Japan's democratic constitution and trade-based prosperity, which make Japan a major stabilizer of the liberal international order in the Asia-Pacific.
The Truman Doctrine is an American foreign policy that pledges American "support for democracies against authoritarian threats." The doctrine originated with the primary goal of countering the growth of the Soviet bloc during the Cold War. It was announced to Congress by President Harry S. Truman on March 12, 1947, and further developed on July 4, 1948, when he pledged to oppose the communist rebellions in Greece and Soviet demands on Turkey. More generally, the Truman Doctrine implied American support for other nations threatened by Moscow. It led to the formation of NATO in 1949. Historians often use Truman's speech to Congress on March 12, 1947, to date the start of the Cold War.
Truman told Congress that "it must be the policy of the United
States to support free peoples who are resisting attempted subjugation
by armed minorities or by outside pressures." Truman contended that because totalitarian regimes coerced free peoples, they automatically represented a threat to international peace and the national security of the United States.
Truman argued that if Greece and Turkey did not receive the aid, they
would inevitably fall out of the United States' sphere of influence and
into the communist bloc, with grave consequences throughout the region.
The Truman Doctrine was informally extended to become the basis
of American Cold War policy throughout Europe and around the world. It shifted U.S. policy toward the Soviet Union from a wartime alliance to containment of Soviet expansion, as advocated by diplomat George Kennan.
At the conclusion of World War II, Turkey was pressured by the Soviet
government to allow Russian shipping to flow freely through the Turkish Straits, which connected the Black Sea to the Mediterranean.
As the Turkish government would not submit to the Soviet Union's requests, tensions arose in the region, leading to a show of naval force near the Straits. Since British assistance to Turkey had ended in 1947, the U.S. dispatched military aid to ensure that Turkey would retain chief control of the passage. Turkey received $100 million in economic and military aid, and the U.S. Navy sent the Midway-class aircraft carrier USS Franklin D. Roosevelt.
In October 1944, British and Greek forces landed in Greece following the gradual withdrawal of Axis occupation forces from the country. Despite the Caserta Agreement stipulating that all Greek resistance factions would join a new Greek army under British command, General Ronald Scobie ordered the EAM's armed wing, ELAS, to unilaterally disarm on December 1, 1944. EAM responded to the "Scobie Order" by organizing a protest rally in Athens on December 3, at which Greek security forces opened fire, killing 28 protestors. This sparked the Dekemvriana,
a series of clashes between EAM and Greek government forces along with
their British allies. It ended in EAM's defeat and disarmament under the
terms of the Treaty of Varkiza, which marked the end of ELAS and broke EAM's power. This was followed by the White Terror, a period of persecution against Greek leftists, which contributed to the outbreak of the Greek Civil War in 1946.
After the civil war broke out, guerrillas of the Communist Party of Greece (KKE) fought against the internationally recognized Greek government, which had been formed after the 1946 elections that the KKE boycotted. The British realized that the KKE was being directly funded by Josip Broz Tito in neighboring Yugoslavia. In line with the Anglo-Soviet percentages agreement, the KKE received no help from the Soviet Union, while Yugoslavia provided it with support and sanctuary against Joseph Stalin's wishes.
By late 1946, Britain informed the United States that due to its own
declining economy, it could no longer continue to provide military and
economic support to the Greek government.
In 1946–47, the United States and the Soviet Union moved from being
wartime allies to Cold War adversaries. The breakdown of Allied
cooperation in Germany provided a backdrop of escalating tensions for
the Truman Doctrine. To Truman, the growing unrest in Greece began to look like a pincer movement against the oil-rich areas of the Middle East and the warm-water ports of the Mediterranean. In February 1946, Kennan, an American diplomat in Moscow, sent his famed "Long Telegram",
which predicted the Soviets would only respond to force and that the
best way to handle them would be through a long-term strategy of
containment; that is, stopping their geographical expansion. After the
British warned that they could no longer help Greece, and following
Prime Minister Konstantinos Tsaldaris's visit to Washington in December 1946 to ask for American assistance,
the U.S. State Department formulated a plan. Aid would be given to both
Greece and Turkey, to help cool the long-standing rivalry between them.
American policy makers recognized the instability of the region,
fearing that if Greece was lost to communism, Turkey would not last
long. Similarly, if Turkey yielded to Soviet demands, the position of
Greece would be endangered. The threat of a regional domino effect therefore guided the American decision. Greece and Turkey were
strategic allies important for geographical reasons as well, for the
fall of Greece would put the Soviets on a particularly dangerous flank
for the Turks, and strengthen the Soviet Union's ability to cut off
allied supply lines in the event of war.
Truman's address
President Truman's 1947 Message to Congress, Recommending Assistance to Greece and Turkey
To pass any legislation Truman needed the support of the Republicans, who controlled both houses of Congress. The chief Republican spokesman, Senator Arthur H. Vandenberg, strongly supported Truman and overcame the doubts of isolationists such as Senator Robert A. Taft. Truman laid the groundwork for his request by having key congressional leaders meet with him, Secretary of State George Marshall, and Undersecretary of State Dean Acheson.
Acheson laid out the "domino theory" in the starkest terms, comparing a
communist state to a rotten apple that could spread its infection to an
entire barrel. Vandenberg was impressed, and advised Truman to appear
before Congress and "scare the hell out of the American people." On March 7, Acheson warned Truman that the communists in Greece could win within weeks without outside aid.
When a draft for Truman's address was circulated to policymakers,
Marshall, Kennan, and others criticized it for containing excess
"rhetoric." Truman responded that, as Vandenberg had suggested, his
request would only be approved if he played up the threat.
On March 12, 1947, Truman appeared before a joint session of Congress. In his eighteen-minute speech, he stated:
I believe it must be the policy of
the United States to support free peoples who are resisting attempted
subjugation by armed minorities or by outside pressures.
I believe that we must assist free peoples to work out their own destinies in their own way.
I believe that our help should be primarily through economic and
financial aid which is essential to economic stability and orderly
political processes.
The domestic reaction to Truman's speech was broadly positive, though
there were dissenters. Anti-communists in both parties supported both
Truman's proposed aid package and the doctrine behind it, and Collier's described it as a "popularity jackpot" for the President. Influential columnist Walter Lippmann
was more skeptical, noting the open-ended nature of Truman's pledge; he
felt so strongly that he almost came to blows while arguing with
Acheson over the doctrine. Others argued that the Greek monarchy Truman proposed to defend was itself a repressive government, rather than a democracy.
Despite these objections, the fear that there was a growing communist threat almost guaranteed the bill's passage. In May 1947, two months after Truman's request, a large majority of
Congress approved $400 million in military and economic aid to Greece
and Turkey.
Increased American aid helped the Greek government defeat the KKE, after a series of setbacks for government forces between 1946 and 1948.
The Truman Doctrine was the first in a series of containment moves by
the United States, followed by economic restoration of Western Europe
through the Marshall Plan and military containment by the creation of NATO in 1949.
Historian Eric Foner writes that the doctrine "set a precedent for American assistance to anticommunist
regimes throughout the world, no matter how undemocratic, and for the
creation of a set of global military alliances directed against the
Soviet Union."
The Truman Doctrine underpinned American Cold War policy in Europe and around the world. In the words of historian James T. Patterson:
The Truman Doctrine was a highly publicized commitment of
a sort the administration had not previously undertaken. Its sweeping
rhetoric, promising that the United States should aid all 'free people'
being subjugated, set the stage for innumerable later ventures that led
to globalistic commitments. It was in these ways a major step.
The doctrine endured, historian Dennis Merrill argues, because it
addressed broader cultural insecurity regarding modern life in a
globalized world. It dealt with Washington's concern over communism's
domino effect, it enabled a media-sensitive presentation of the doctrine
that won bipartisan
support, and it mobilized American economic power to modernize and
stabilize unstable regions without direct military intervention. It
brought nation-building activities and modernization programs to the
forefront of foreign policy.
The Truman Doctrine became a metaphor for aid to keep a nation
from communist influence. Truman used disease imagery not only to
communicate a sense of impending disaster in the spread of communism but
also to create a "rhetorical vision" of containing it by extending a
protective shield around non-communist countries throughout the world.
It echoed the "quarantine the aggressor" policy Truman's predecessor, Franklin D. Roosevelt, had sought to impose to contain German and Japanese
expansion in 1937 ("quarantine" suggested the role of public health
officials handling an infectious disease). The medical metaphor extended
beyond the immediate aims of the Truman Doctrine in that the imagery,
combined with fire and flood imagery evocative of disaster, provided the
United States with an easy transition to direct military confrontation
in later years with the Korean War and the Vietnam War.
By framing ideological differences in life or death terms, Truman was
able to garner support for this communism-containing policy.
During the Cold War, many neutral countries, either those in what is considered the Third World, or those having no formal alliance with either the United States or the Soviet Union, viewed the claim of "Free World" leadership by the United States as grandiose and illegitimate.
One of the earliest politically significant uses of the term Free World occurs in Frank Capra's World War II propaganda film series Why We Fight. In Prelude to War,
the first film of that series, the "free world" is portrayed as a white
planet, directly contrasted with the black planet called the "slave
world". The film depicts the free world as the Western Hemisphere, led
by the United States and Western Europe, and the slave world as the
Eastern Hemisphere, dominated by Nazi Germany, Fascist Italy and the Japanese Empire.
21st century usage
While "Free World" had its origins in the Cold War, the phrase is still used after the end of the Cold War and during the Global War on Terrorism. Samuel P. Huntington said the term has been replaced by the concept of the international community,
which, he argued, "has become the euphemistic collective noun
(replacing "the Free World") to give global legitimacy to actions
reflecting the interests of the United States and other Western powers."
Leadership of the Free World
United States
The "Leader of the Free World" was a colloquialism, first used during the Cold War, to describe either the United States or, more commonly, the President of the United States. The term when used in this context suggested that the United States was the principal democratic superpower, and the US president was by extension the leader of the world's democratic states, i.e. the "Free World".
But remember, we have differences with our allies all over the world.
They are family differences, and sometimes they are acute, but, by and
large, the reason we call it "free world" is because each nation in it
wants to remain independent under its own government and not under some
dictatorial form of government.
The phrase has its origin in the 1940s during the Second World War, especially through the anti-fascist Free World magazine and the US propaganda film series Why We Fight. At this time, the term was criticized for including the Soviet Union (USSR),
which critics saw as a totalitarian dictatorship. However, the term
became more widely used against the USSR and its allies during the 1950s
in the Cold War era (in the wake of the Truman Doctrine),
when the US depicted a foreign policy based on a struggle between "a
democratic alliance and a communist realm set on world domination",
according to The Atlantic. The term here was criticized again for including right-wing dictatorships such as Francoist Spain, and Nikita Khrushchev said at the 21st Congress of the Soviet Communist Party that "the so-called free world constitutes the kingdom of the dollar".
Although in decline after the mid-1970s, the term was heavily referenced in US foreign policy up until the dissolution of the Soviet Union in December 1991. After the presidency of George H. W. Bush, the term largely fell out of use, in part because of its use in rhetoric critical of US policy.
Terms implying a leadership role in the "free world" later came to be used for other persons, places, or states. However, the term is still used at times to describe the President of the United States.
European Union
On 6 May 2010, in an address to the plenary chamber of the European Parliament, then US Vice President Joe Biden stated that Brussels
had a "legitimate claim" to the title of "capital of the free world",
normally a title reserved for Washington. He added that Brussels is a
"great city which boasts 1,000 years of history and serves as capital of
Belgium, the home of many of the institutions of the European Union and the headquarters of the NATO alliance."
Germany
When Time declared German Chancellor Angela Merkel its Person of the Year for 2015, it referred to her as "Europe's most powerful leader", and the cover bore the title "Chancellor of the Free World".
Following the election of Donald Trump to the US presidency in November 2016, The New York Times called Merkel "the Liberal West's Last Defender", and a number of commentators called her "the next leader of the free world". Merkel herself rejected the description. An article by James Rubin in Politico about a White House meeting between Merkel and Trump was ironically titled "The Leader of the Free World Meets Donald Trump".
German commentators agreed with Merkel's assessment, and Friedrich Merz, a CDU politician, said that a German chancellor could never be "leader of the free world". In April 2017, columnist James Kirchick stressed the importance of the German elections
(on which "the future of the free world" depended) since America had
"abdicated its traditional role as leader of the free world by electing
Trump, the United Kingdom was turning inward after the referendum decision to leave
the European Union, and France was also traditionally unilateralist and
now had an inexperienced president"; he called Merkel "something less
than leader of the free world ... but something greater than the leader
of just another random country". References to America's abdication of its role as leader of the free world continued or increased after Donald Trump questioned the unconditional defence of NATO partners and the Paris climate accord.
Jagoda Marinić, writing for The New York Times,
noted that "Barack Obama all but literally passed on the mantle of
'leader of the free world' to Ms. Merkel (and not Mr. Trump), and most
Germans feel empowered by that new responsibility" and that Germany "is
coming to understand its role in standing up for liberal democracy in a
world turning more and more authoritarian."
Other commentators—in the United States and Europe—rejected the appellation "Leader of the Free World": some argued that there is no single leader of the 'free world'; others queried whether Merkel remained the "leader of the free world" and the champion of liberal values. Questioned about Merkel's standing following her performance in the German elections in September 2017, former US Secretary of State Hillary Clinton opined that Merkel was "the most important leader in the free world".
However, after Merkel's party suffered losses in the 2017 election and
there were delays in forming a government, the claim that Merkel is the
true leader of the free world was referred to as a "joke", described as a media phenomenon, and otherwise called into question.
When Merkel retired as Chancellor, Hillary Clinton
wrote that "she led Europe through difficult times with steadiness and
bravery, and for four long years, she was the leader of the free world."
Mentalism or sanism is discrimination and oppression directed against people based on a mental trait or condition they have, or are judged to have. It may cause harm through a combination of social inequalities, insults, indignities, and overt discrimination, examples of which include refusal of service and the denial of human rights.
Mentalism does not only describe how individuals are treated by the general public. The concept also encapsulates how individuals are treated by mental health professionals, the legal system and other institutions.
The term "sanism" was coined by Morton Birnbaum, a physician, lawyer, and mental health advocate. Judi Chamberlin coined the term "mentalism" in a chapter of the book Women Look at Psychiatry.
Definition
The terms mentalism, from "mental", and sanism, from "sane", have become established in some contexts, although concepts such as social stigma, and in some cases ableism, may be used in similar but not identical ways. While mentalism and sanism are used interchangeably, sanism is becoming predominant in certain circles, such as academia and among those who identify as mad and mad advocates, in a socio-political context where opposition to sanism is gaining ground as a movement. That movement is an act of resistance among those who identify as mad, consumer survivors, and mental health advocates. In academia, evidence of the movement can be found in the number of recent publications about sanism and social work practice.
Etymologies
The term "sanism" was coined by Morton Birnbaum during his work representing Edward Stephens, a mental health patient, in a legal case in the 1960s.
Birnbaum was a physician, lawyer and mental health advocate who helped
establish a constitutional right to treatment for psychiatric patients
along with safeguards against involuntary commitment. New York legal professor Michael L. Perlin has continued the term's use since first noticing it in 1980.
In 1975 Judi Chamberlin coined the term mentalism in a chapter of the book Women Look at Psychiatry. The term became more widely known when she used it in 1978 in her book On Our Own: Patient Controlled Alternatives to the Mental Health System, which for some time became the standard text of the psychiatric survivor movement in the US.
People began to recognize a pattern in how they were treated, a set of
assumptions which most people seemed to hold about mental (ex-)patients
regardless of whether they applied to any particular individual at any
particular time – that they were incompetent, unable to do things for
themselves, constantly in need of supervision and assistance,
unpredictable, likely to be violent or irrational etc. It was realized
that not only did the general public express mentalist ideas, so did
ex-patients, a form of internalized oppression.
As of 1998, these terms had been adopted by some consumers/survivors in the UK and the US but had not gained general currency. This left a conceptual gap, filled in part by the concept of 'stigma', which has been criticized for focusing less on institutionalized discrimination and its multiple causes and more on whether people perceive mental health issues as shameful or worse than they are. Despite that emphasis, a body of literature has demonstrated widespread discrimination across many spheres of life, including employment, parental rights, housing, immigration, insurance, health care and access to justice.
However, the use of new "isms" has also been questioned on the grounds
that they can be perceived as divisive, out of date, or a form of undue political correctness. The same criticisms, in this view, may not apply so much to broader and more accepted terms like 'discrimination' or 'social exclusion'.
There is also the umbrella term ableism,
referring to discrimination against those who are (perceived as)
disabled. In terms of the brain, there is the movement for the
recognition of neurodiversity. The term 'psychophobia' (from psyche and phobia) has occasionally been used with a similar meaning.
Social division
Mentalism
at one extreme can lead to a categorical dividing of people into an
empowered group assumed to be normal, healthy, reliable, and capable,
and a powerless group assumed to be sick, disabled, crazy,
unpredictable, and violent. This divide can justify inconsiderate
treatment of the latter group and expectations of poorer standards of
living for them, for which they may be expected to express gratitude.
Further discrimination may involve labeling some as "high-functioning" and some as "low-functioning"; while this may enable the targeting of resources, in both categories human behaviors are recast in pathological terms, according to Coni Kalinowski (a psychiatrist at the University of Nevada and Director of Mojave Community Services) and Pat Risser (a mental health consultant and self-described former recipient of mental health services).
The discrimination can be so fundamental and unquestioned that it can stop people truly empathizing
(although they may think they are) or genuinely seeing the other point
of view with respect. Some mental conditions can impair awareness and
understanding in certain ways at certain times, but mentalist
assumptions may lead others to erroneously believe that they necessarily
understand the person's situation and needs better than they do
themselves.
Reportedly even within the disability rights movement
internationally, "there is a lot of sanism", and "disability
organisations don't always 'get' mental health and don't want to be seen
as mentally defective." Conversely, those coming from the mental health
side may not view such conditions as disabilities in the same way.
Some national government-funded charities view the issue as
primarily a matter of stigmatizing attitudes within the general public,
perhaps due to people not having enough contact with those (diagnosed
with) mental illness, and one head of a schizophrenia charity has
compared mentalism to the way racism may be more prevalent when people
don't spend time together throughout life. A psychologist who runs The Living Museum, which enables current or former psychiatric patients to exhibit artwork, has referred to the attitude of the general public as psychophobia.
Clinical terminology
Mentalism may be codified in clinical terminology in subtle ways, including in the basic diagnostic categories used by psychiatry (as in the DSM or ICD). There is some ongoing debate as to which terms and criteria may communicate contempt or inferiority, rather than facilitate real understanding of people and their issues.
Some oppose the entire process as labeling
and some have responded to justifications for it – for example that it
is necessary for clinical or administrative purposes. Others argue that
most aspects could easily be expressed in a more accurate and less
offensive manner.
Some clinical terms may be used far beyond the usual narrowly defined
meanings, in a way that can obscure the regular human and social
context of people's experiences. For example, having a bad time may be
assumed to be decompensation; incarceration or solitary confinement
may be described as treatment regardless of benefit to the person;
regular activities like listening to music, engaging in exercise or
sporting activities, or being in a particular physical or social
environment (milieu), may be referred to as therapy; all sorts of responses and behaviors may be assumed to be symptoms; core adverse effects of drugs may be termed side effects.
The former director of a US-based psychiatric survivors organization focused on rights and freedoms, David Oaks, has advocated the taking back of words like "mad", "lunatic",
"crazy" or "bonkers". While acknowledging that some choose not to use
such words in any sense, he questions whether medical terms like
"mentally ill", "psychotic"
or "clinically depressed" really are more helpful or indicative of
seriousness than possible alternatives. Oaks says that for decades he
has been exploring the depths of sanism and has not yet found an end,
and suggests it may be the most pernicious 'ism' because people tend to
define themselves by their rationality and their core feelings.
One possible response is to critique conceptions of normality and the
problems associated with normative functioning around the world,
although in some ways that could also potentially constitute a form of
mentalism. After breaking his neck in a 2012 accident and subsequently retiring, Oaks has referred to himself as "PsychoQuad" on his personal blog.
British writer Clare Allen argues that even reclaimed slang terms such as "mad" are just not accurate.
In addition, she sees the commonplace misuse of concepts relating to
mental health problems – including for example jokes about people
hearing voices as if that automatically undermines their credibility –
as equivalent to racist or sexist phrases that would be considered
obviously discriminatory. She characterises such usage as indicating an
underlying psychophobia and contempt.
Blame
Interpretations of behaviors, and applications of treatments, may be done in a judgmental way because of an underlying mentalism, according to critics of psychiatry.
If a recipient of mental health services disagrees with treatment or
diagnosis, or does not change, they may be labeled as non-compliant,
uncooperative, or treatment-resistant. This is despite the fact that the
issue may be the healthcare provider's inadequate understanding of the
person or their problems, adverse medication effects, a poor match
between the treatment and the person, stigma associated with the
treatment, difficulty with access, cultural unacceptability, or many
other issues.
Mentalism may lead people to assume that someone is not aware of
what they are doing and that there is no point trying to communicate
with them, despite the fact that they may well have a level of awareness
and desire to connect even if they are acting in a seemingly irrational
or self-harming way. In addition, mental health professionals
and others may tend to equate subduing a person with treatment; a quiet
client who causes no community disturbance may be deemed improved no
matter how miserable or incapacitated that person may feel as a result.
Clinicians may blame clients for not being sufficiently motivated to work on treatment goals or recovery, or may characterize them as acting out when they disagree with something or find it upsetting. But critics
say that in the majority of cases this is actually due to the client
having been treated in a disrespectful, judgmental, or dismissive
manner. Nevertheless, such behavior may be justified by characterizing
the client as demanding, angry or needing limits. To overcome this, it
has been suggested that power-sharing should be cultivated and that when
respectful communication breaks down, the first thing that needs to be asked is whether mentalist prejudices have been expressed.
Neglect
Mentalism has been linked to negligence
in monitoring for adverse effects of medications (or other
interventions), or to viewing such effects as more acceptable than they
would be for others. This has been compared to instances of maltreatment
based on racism. Mentalism has also been linked to neglect in failing to check for, or fully respect, people's past experiences of abuse or other trauma.
Treatments that do not support choice and self-determination
may cause people to re-experience the helplessness, pain, despair, and
rage that accompanied the trauma, and yet attempts to cope with this may
be labeled as acting out, manipulation, or attention-seeking.
In addition, mentalism can lead to "poor" or "guarded"
predictions of the future for a person, which could be an overly
pessimistic view skewed by a narrow clinical experience. It could also
be made impervious to contrary evidence because those who succeed can be
discounted as having been misdiagnosed or as not having a genuine form of a disorder – the no true Scotsman
fallacy. While some mental health problems can involve very substantial
disability and can be very difficult to overcome in society,
predictions based on prejudice and stereotypes can be self-fulfilling
because individuals pick up on a message that they have no real hope, and realistic hope is said to be a key foundation of recovery. At the same time, a trait or condition might be considered more a form of individual difference
that society needs to include and adapt to, in which case a mentalist
attitude might be associated with assumptions and prejudices about what
constitutes normal society and who is deserving of adaptations, support,
or consideration.
Institutional discrimination
This may be apparent in physical separation, including separate
facilities or accommodation, or in lower standards for some than others.
Mental health professionals may find themselves drawn into systems
based on bureaucratic and financial imperatives and social control, resulting in alienation from their original values,
disappointment in "the system", and adoption of the cynical, mentalist
beliefs that may pervade an organization. However, just as employees can
be dismissed for disparaging sexual or ethnic remarks, it is argued that staff who are entrenched in negative stereotypes, attitudes, and beliefs about those labeled with mental disorders need to be removed from service organizations. A related theoretical approach, known as expressed emotion, has also focused on negative interpersonal dynamics relating to care givers, especially within families. However, the point is also made
in such views that institutional and group environments can be
challenging from all sides, and that clear boundaries and rights are
required for everyone.
The mental health professions have themselves been criticized. While social work
(also known as clinical social work) has appeared to have more
potential than others to understand and assist those using services, and
has talked a lot academically about anti-oppressive practice intended to support people facing various -isms, it has allegedly failed to address mentalism to any significant degree. The field has been accused, by social work professionals with
experience of using services themselves, of failing to help people
identify and address what is oppressing them; of unduly deferring to
psychiatric or biomedical conventions particularly in regard to those
deemed most unwell; and of failing to address its own discriminatory
practices, including its conflicts of interest in its official role
aiding the social control of patients through involuntary commitment.
In the "user/survivor" movement in England, Pete Shaughnessy, a founder of mad pride, concluded that the National Health Service is "institutionally mentalist and has a lot of soul searching to do in the new Millennium",
including addressing the prejudice of its office staff. He suggested
that when prejudice is applied by the very professionals who aspire to
eradicate it, it raises the question of whether it will ever be
eradicated. Shaughnessy committed suicide in 2002.
The psychiatric survivors movement has been described as a feminist
issue, because the problems it addresses are "important for all women
because mentalism acts as a threat to all women" and "mentalism
threatens women's families and children."
A psychiatric survivor and professional has said that "Mentalism
parallels sexism and racism in creating an oppressed underclass, in this
case of people who have received psychiatric diagnosis and treatment".
She reported that the most frequent complaint of psychiatric patients is
that nobody listens, or only selectively in the course of trying to
make a diagnosis.
On a society-wide level, mentalism has been linked to people being kept in poverty as second class citizens; to employment discrimination keeping people living on handouts; to interpersonal discrimination hindering relationships; to stereotypes
promoted through the media spreading fears of unpredictability and
dangerousness; and to people fearing to disclose or talk about their
experiences.
Law
With regard to legal protections against discrimination, mentalism may only be covered under general frameworks such as the disability discrimination acts
that are in force in some countries, and which require a person to say
that they have a disability and to prove that they meet the criteria.
In terms of the legal system itself, the law is traditionally based on technical definitions of sanity and insanity,
and so the term "sanism" may be used in response. The concept is well
known in the US legal community, being referred to in nearly 300 law
review articles between 1992 and 2013, though it is less well known in the
medical community.
Michael Perlin, Professor of Law at New York Law School,
has defined sanism as "an irrational prejudice of the same quality and
character as other irrational prejudices that cause and are reflected in
prevailing social attitudes of racism, sexism, homophobia, and ethnic
bigotry that permeates all aspects of mental disability law and affects
all participants in the mental disability law system: litigants, fact finders, counsel, and expert and lay witnesses."
Perlin notes that sanism affects the theory and practice of law
in largely invisible and socially acceptable ways, based mainly on "stereotype, myth, superstition, and deindividualization." He believes that its "corrosive effects have warped involuntary civil commitment law, institutional law, tort law, and all aspects of the criminal process (pretrial, trial and sentencing)."
According to Perlin, judges are far from immune, tending to reflect
sanist thinking that has deep roots within our culture. This results in
judicial decisions based on stereotypes in all areas of civil and criminal law, expressed in biased language and showing contempt for mental health professionals. Moreover, courts are often impatient and attribute mental problems to "weak character or poor resolve".
Sanist attitudes are prevalent in the teaching of law students,
both overtly and covertly, according to Perlin. He notes that this
impacts on the skills at the heart of lawyering such as "interviewing,
investigating, counseling and negotiating", and on every critical moment
of clinical experience: "the initial interview, case preparation, case
conferences, planning litigation (or negotiation) strategy, trial
preparation, trial and appeal."
There is also widespread discrimination by jurors, whom Perlin
characterizes as demonstrating "irrational brutality, prejudice,
hostility, and hatred" towards defendants where there is an insanity defense.
Specific sanist myths include relying on popular images of craziness;
an 'obsession' with claims that mental problems can be easily faked and
experts duped; assuming an absolute link between mental illness and
dangerousness; an 'incessant' confusion and mixing up of different legal
tests of mental status; and assuming that defendants acquitted on
insanity defenses are likely to be released quickly. Although there are
claims that neuroimaging
has some potential to help in this area, Perlin concludes that it is
very difficult to weigh the truth or relevance of such results due to
the many uncertainties and limitations, and as it may be either
disregarded or over-hyped by scientists, lawyers or in the popular
imagination. He believes "the key to an answer here is a consideration
of sanism", because to a great extent it can "overwhelm all other
evidence and all other issues in this conversation". He suggests that
"only therapeutic jurisprudence has the potential power to 'strip the sanist facade'."
He has also addressed the topic of sanism as it affects which
sexual freedoms or protections are afforded to psychiatric patients,
especially in forensic facilities.
Sanism in the legal profession can affect many people in
communities who at some point in their life struggle with some degree of
mental health problems, according to Perlin. This may unjustly limit
their ability to legally resolve issues in their communities such as:
"contract problems, property problems, domestic relations problems, and
trusts and estates problems."
Susan Fraser, a lawyer in Canada who specializes in advocating
for vulnerable people, argues that sanism is based on fear of the
unknown, reinforced by stereotypes that dehumanize
individuals. She argues that this causes the legal system to fail to properly defend patients' rights to refuse potentially harmful medications, to fail to investigate deaths in psychiatric hospitals and other institutions as rigorously as deaths elsewhere, and to fail to properly listen to and respect the voices of mental health consumers and survivors.
Education
Similar issues have been identified by Perlin in how children are dealt with in regard to learning disabilities, including in special education.
In any area of law, he points out, two of the most common sanist myths
are presuming that persons with mental disabilities are faking, or that
such persons would not be disabled if they only tried harder. In this
particular area, he concludes that labeled
children are stereotyped in a process rife with racial, class and
gender bias. Although intended to help some children, he contends that
in reality it can be not merely a double-edged sword but a triple,
quadruple or quintuple edged sword. The result of sanist prejudices and
misconceptions, in the context of academic competition, is that "we are
left with a system that is, in many important ways, stunningly
incoherent". Self-identifying Mad students often encounter sanist discrimination in post-secondary educational settings.
Oppression
A spiral of oppression experienced by some groups in society has been identified.
Firstly, oppressions occur on the basis of perceived or actual
differences (which may be related to broad group stereotypes such as
racism, sexism, classism, ageism, homophobia etc.). This can have negative physical, social, economic and psychological effects on individuals, including emotional distress
and what might be considered mental health problems. Then, society's
response to such distress may be to treat it within a system of medical
and social care rather than (also) understanding and challenging the
oppressions that gave rise to it, thus reinforcing the problem with
further oppressive attitudes and practices, which can lead to more
distress, and so on in a vicious cycle. In addition, due to coming into
contact with mental health services, people may become subject to the
oppression of mentalism, since society (and mental health services
themselves) have such negative attitudes towards people with a
psychiatric diagnosis, thus further perpetuating oppression and
discrimination.
People suffering such oppression within society may be drawn to
more radical political action, but sanist structures and attitudes have
also been identified in activist communities. This includes cliques and social hierarchies
that people with particular issues may find very difficult to break
into or be valued by. There may also be individual rejection of people
for strange behavior that is not considered culturally acceptable, or
alternatively insensitivity to emotional states including suicidality,
or denial that someone has issues if they appear to act normally.
Metamemory or Socratic awareness, a type of metacognition, is both the introspective knowledge of one's own memory capabilities (and strategies that can aid memory) and the processes involved in memory self-monitoring.
This self-awareness of memory has important implications for how people
learn and use memories. When studying, for example, students make
judgments of whether they have successfully learned the assigned
material and use these decisions, known as "judgments of learning", to
allocate study time.
History
Descartes, among other philosophers, marveled at the phenomenon of what we now know as metacognition.
"It was not so much thinking that was indisputable to Descartes, but
rather thinking about thinking. What he could not imagine was that the
person engaged in such self-reflective processing did not exist." In the late 19th century, Bowne and James contemplated, but did not scientifically examine, the relationship between memory judgments and memory performance.
During the reign of behaviorism in the mid-20th century, unobservable phenomena such as metacognition were largely ignored.
One early scientific study of metamemory was Hart's 1965 study, which
examined the accuracy of feeling of knowing (FOK). FOK occurs when an
individual feels that they have something in memory that cannot be recalled, but would be recognized if seen. Hart expanded upon limited investigations of FOK which had presupposed that FOK was accurate. The results of Hart's study indicate that FOK is indeed a relatively accurate indicator of what is in memory.
In a 1970 review of memory research, Tulving
and Madigan concluded that advances in the study of memory might
require the experimental investigation of “one of the truly unique
characteristics of human memory: its knowledge of its own knowledge”. It was around the same time that John H. Flavell coined the term "metamemory" in a discussion on the development of memory.
Since then, numerous metamemory phenomena have been studied, including
judgments of learning, feelings of knowing, knowing that you don't know,
and know vs. remember.
Nelson and Narens proposed a theoretical framework for understanding metacognition and metamemory. In this framework there are two levels: the object level (for example, cognition
and memory) and the meta level (for example, metacognition and
metamemory). Information flow from the meta level to the object level is
called control, and information flow from the object level to the meta
level is called monitoring. Both monitoring and control processes occur
in acquisition, retention, and retrieval. Examples of control processes
are allocating study time and selecting search strategies, and examples
of monitoring processes are ease of learning (EOL) and feeling of
knowing (FOK) judgments. Monitoring and control might be further divided
into subprocesses depending on the types of inputs, computations, and
outputs required at different stages of the memory process. For example,
monitoring abilities appear to be sufficiently different during
encoding-based and retrieval-based metamemory judgments to constitute
different monitoring systems.
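The monitoring/control distinction in this framework can be made concrete with a toy simulation. The sketch below is only an illustration, not a model from Nelson and Narens or the metamemory literature: the item names, the use of a single "strength" number as both memory state and judgment of learning, and the threshold and study-gain values are all invented assumptions.

```python
# Toy sketch of the Nelson & Narens two-level framework (illustrative values).

def monitor(strengths):
    # Monitoring: information flows from the object level (memory state)
    # up to the meta level, here as a judgment of learning (JOL) per item.
    return dict(strengths)

def control(judgments, threshold=0.8):
    # Control: information flows from the meta level back down to the
    # object level, here as a decision about which items to restudy.
    return [item for item, jol in judgments.items() if jol < threshold]

def study(strengths, items, gain=0.2):
    # Object-level consequence of the control decision: restudied items
    # gain memory strength.
    for item in items:
        strengths[item] = min(1.0, strengths[item] + gain)

strengths = {"pair A": 0.3, "pair B": 0.9, "pair C": 0.5}
while True:
    to_restudy = control(monitor(strengths))  # monitor, then control
    if not to_restudy:                        # all items judged learned
        break
    study(strengths, to_restudy)              # allocate more study time

print(strengths)
```

In this loop, the per-item judgment of learning plays the role of a monitoring process and the allocation of further study time plays the role of a control process, mirroring the two directions of information flow described above.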
The study of metamemory has some similarities to introspection in that it assumes that a memorizer is able to investigate and report on the contents of memory.
Current metamemory researchers acknowledge that an individual's
introspections contain both accuracies and distortions and are
interested in what this conscious monitoring (even if it is not always
accurate) reveals about the memory system.
Theories
Cue familiarity hypothesis
The cue familiarity hypothesis
was proposed by Reder and Ritter after completing a pair of experiments
which indicated that individuals can evaluate their ability to answer a
question before trying to answer it. This finding suggests that the question (cue) and not the actual memory (target) is crucial for making metamemory judgments.
Consequently, this hypothesis implies that judgments regarding
metamemory are based on an individual's level of familiarity with the
information provided in the cue.
Therefore, an individual is more likely to judge that they know the
answer to a question if they are familiar with its topic or terms and
more likely to judge that they do not know the answer to a question
which presents new or unfamiliar terms.
Accessibility hypothesis
The accessibility hypothesis suggests that metamemory judgments will be accurate when the ease of processing (accessibility) is correlated with actual memory behaviour; however, if the ease of processing is not correlated with memory in a given task, then the judgments will not be accurate.
Proposed by Koriat, the theory suggests that participants base their
judgments on retrieved information rather than basing them on the sheer
familiarity of the cues. Along with the lexical unit, people may use partial information that could be correct or incorrect.
According to Koriat, the participants themselves do not know whether
the information they are retrieving is correct or incorrect most of the
time. The quality of information retrieved depends on individual elements of that information. The individual elements of information differ in strength and speed of access to the information. Research by Vigliocco, Antonini, and Garrett (1997) and Miozzo and Caramazza (1997) showed that individuals in a tip-of-the-tongue
(TOT) state were able to retrieve partial knowledge (gender) about the
unrecalled words, providing strong evidence for the accessibility
heuristic.
Competition hypothesis
The
competition hypothesis is best described using three principles. The
first is that many brain systems are activated by visual input, and the
activations by these different inputs compete for processing access. Secondly, competition occurs in multiple brain systems and is integrated amongst these individual systems. Finally, competition can be assessed (using top-down neural priming) based on the relevant characteristics of the object at hand.
More competition, also referred to as more interfering activation, leads to poorer recall when tested.
This hypothesis contrasts with the cue-familiarity hypothesis because
objects similar to the target can influence one's FOK, not just similar
associates of the cues.
It also contrasts with the accessibility hypothesis wherein the more
accessible information is, the higher the rating, or the better the
recall. According to the competition hypothesis, less activation would result in better recall. Whereas the accessibility view predicts higher metamemory ratings in interference conditions, the competition hypothesis predicts lower ratings.
Interactive hypothesis
The
interactive hypothesis constitutes a combination of the cue familiarity
and accessibility hypotheses. According to this hypothesis, cue
familiarity is employed initially, and only once cue familiarity fails
to provide enough information to make an inference does accessibility
come into play.
This "cascade" structure accounts for differences in the time required
to make a metamemory judgment; judgments which occur quickly are based
on cue familiarity, while slower responses are based on both cue
familiarity and accessibility.
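Read procedurally, this cascade can be sketched as follows. The sketch is hypothetical: the numeric familiarity and accessibility scores and both cutoff values are invented for illustration, and real judgments are of course not literally computed this way.

```python
def metamemory_judgment(cue_familiarity, retrieve_partial_info):
    # Stage 1 (fast): judge from cue familiarity alone, per the
    # cue familiarity hypothesis.
    if cue_familiarity >= 0.7:
        return "high", "fast (cue familiarity alone)"
    if cue_familiarity <= 0.2:
        return "low", "fast (cue familiarity alone)"
    # Stage 2 (slow): familiarity was inconclusive, so fall back on
    # accessibility -- how much partial information can be retrieved.
    accessibility = retrieve_partial_info()
    level = "high" if accessibility >= 0.5 else "low"
    return level, "slow (familiarity plus accessibility)"

# A clearly familiar cue is judged quickly; a middling cue triggers retrieval.
print(metamemory_judgment(0.9, lambda: 0.0))
print(metamemory_judgment(0.4, lambda: 0.6))
```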
Phenomena
Judgment of learning
Judgments of learning (JOLs) or metamemory judgments are made when knowledge is acquired. Metamnemonic judgments are based on different sources of information, and target information is important for JOLs. Intrinsic cues (based on the target information) and mnemonic cues (based on previous JOL performance) are especially important for JOLs.
Judgments of learning can be divided into four categories:
ease-of-learning judgments, paired-associate JOLs, ease-of-recognition
judgments, and free-recall JOLs.
Ease-of-Learning Judgments: These judgments are made
before a study trial. Subjects can evaluate how much studying will be
required to learn the particular information presented to them
(typically cue-target pairs).
These judgments can be categorized as preacquisition judgments which
are made before the knowledge is stored. Little research addresses this
kind of judgment; however, evidence suggests that JOLs are at least
somewhat accurate at predicting learning rates.
Therefore, these judgments occur in advance of learning and allow
individuals to allot study time to the material that they are required
to learn.
Paired-Associate Judgment of Learning: These judgments are
made at the time of study on cue-target pairs and are responsible for
predicting later memory performance (on cued recall or cued
recognition). One example of paired-associate JOLs is the cue-target
JOL, where the subject determines the retrievability of the target when
both the cue and target of the to-be-learned pair are presented.
Another example is the cue-only JOL, where the subject must determine
the retrievability of the target when only the cue is presented at the
time of judgment.
These two types of JOLs differ in their accuracy in predicting future
performance, and delayed judgments tend to be more accurate.
Ease-of-Recognition Judgments: This type of JOL predicts the likelihood of future recognition.
Subjects are given a list of words and asked to make judgments
concerning their later ability to recognize these words as old or new in
a recognition test. This helps determine their ability to recognize the words after acquisition.
Free-Recall Judgments of Learning: This type of JOL
predicts the likelihood of future free-recall. In this situation,
subjects assess a single target item and judge the likelihood of later
free-recall. It may appear similar to ease-of-recognition judgments, but it predicts recall instead of recognition.
Feeling of knowing judgments
Feeling of Knowing (FOK)
judgments refer to the predictions an individual makes of being able to
retrieve specific information (i.e., regarding his or her knowledge for
a specific subject) and, more specifically, whether that knowledge
exists within the person's memory. These judgments are made either prior to the memory target being found
or following a failed attempt to locate the target. Consequently, FOK
judgments focus not on the actual answer to a question, but rather on
whether an individual predicts that he or she does or does not know the
answer (high and low FOK ratings respectively). FOK judgments can also
be made regarding the likelihood of remembering information later on and
have proven to give fairly accurate indications of future memory.
An example of FOK is if you can't remember the answer when someone asks
you what city you're traveling to, but you feel that you would
recognize the name if you saw it on a map of the country.
An individual's FOK judgments are not necessarily accurate, and
attributes of all three metamemory hypotheses are apparent in the
factors that influence FOK judgments and their accuracy. For example, a
person is more likely to give a higher FOK rating (indicating that they
do know the answer) when presented with questions they feel they should know the answer to.
This is in keeping with the cue familiarity hypothesis, as the
familiarity of the question terms influences the individual's judgment.
Partial retrieval also impacts FOK judgments, as proposed by the
accessibility hypothesis. The accuracy of an FOK judgment is dependent
upon the accuracy of the partial information which is retrieved.
Consequently, accurate partial information leads to accurate FOK
judgments, while inaccurate partial information leads to inaccurate FOK
judgments.
FOK judgments are also influenced by the number of memory traces linked
to the cue. When a cue is linked to fewer memory traces, resulting in a
low level of competition, a higher FOK rating is given, thus supporting
the competition hypothesis.
Certain physiological states can also influence an individual's
FOK judgments. Altitude, for instance, has been shown to reduce FOK
judgments, despite having no effect on recall. In contrast, alcohol intoxication results in reduced recall while having no effect on FOK judgments.
Knowing that you do not know
When someone asks a person a question such as "What is your name?", the
person automatically knows the answer. However, when someone asks a
person a question such as "What was the fifth dinosaur ever
discovered?", the person also automatically knows that they do not know the answer.
Knowing that one does not know is another aspect of metamemory that
enables people to respond quickly when asked a question they do not
know the answer to. In other words, people are aware that certain
information is absent from their memories, and they do not have to
search for an answer they know cannot be retrieved. One theory as to
why this knowledge of not knowing is retrieved so rapidly is
consistent with the cue-familiarity hypothesis. The cue familiarity
hypothesis states that metamemory judgments are made based on the
familiarity of the information presented in the cue.
The more familiar the information in the memory cue, the more likely a
person will make the judgment that they know that the target information
is in memory. With regard to knowing that you don't know, if the
memory cue information does not elicit any familiarity, then a person
quickly judges that the information is not stored in memory.
The right ventral prefrontal cortex and the insular cortex
are specific to "knowing that you don't know", whereas prefrontal
regions are generally more specific to the feeling of knowing. These findings suggest that a person knowing that they do not know and feeling of knowing
are two neuroanatomically dissociable features of metamemory. Moreover,
"knowing that you don't know" relies more on cue familiarity than
feeling of knowing does.
There are two basic types of "do not know" decisions. The first is a slow, low-confidence decision.
This occurs when a person has some knowledge relevant to the question
asked. This knowledge is located and evaluated to determine whether the
question can be answered based on what is stored in memory. In this
case, the relevant knowledge is not enough to answer the question.
The second occurs when a person has no knowledge relevant to the question asked: they produce a rapid response of not knowing, because the initial search for information draws a blank and stops at once.
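The distinction can be made concrete with a toy model. The sketch below is purely illustrative (the memory store and its structure are hypothetical, not taken from the studies discussed here): an unfamiliar cue that retrieves nothing yields a fast "do not know", while a cue that retrieves related but insufficient knowledge yields a slow, low-confidence one.

    def answer(cue, memory):
        related = memory.get(cue, [])  # knowledge linked to the cue
        if not related:
            # Fast path: the initial search draws a blank and stops at once.
            return "do not know (fast)"
        # Slow path: located knowledge must be evaluated piece by piece.
        for fact in related:
            if fact["answers_question"]:
                return fact["content"]
        return "do not know (slow, low confidence)"

    memory = {
        "your name": [{"content": "Alex", "answers_question": True}],
        "dinosaurs": [{"content": "extinct reptiles", "answers_question": False}],
        # "fifth dinosaur ever discovered" has no entry at all
    }

    print(answer("fifth dinosaur ever discovered", memory))  # fast response
    print(answer("dinosaurs", memory))                       # slow, low confidence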
Remember vs. know
The
quality of information that is recalled can vary greatly depending on
the information being remembered. It is important to understand the
differences between remembering something and knowing something. If
information about the learning context accompanies a memory (i.e. the
setting), it is called a "remember" experience. However, if a person
does not consciously remember the context in which he or she learned a
particular piece of information and only has the feeling of familiarity
towards it, it is called a "know" experience. It is widely believed that recognition has two underlying processes: recollection and familiarity.
The recollection process retrieves memories from one's past and can
elicit any number of associations of the prior experience ("remember").
In contrast, the familiarity process does not elicit associations with
the memory and there are no contextual details of the prior learning
occurrence ("know").
Since these two processes are dissociable, they can be affected by
different variables (i.e. when remember is affected know is not and vice
versa). For example, "remember" is affected by variables such as depth of processing, generation effects, the frequency of occurrence, divided attention at learning, and reading silently vs. aloud. In contrast, "know" is affected by repetition priming, stimulus modality, amount of maintenance rehearsal,
and suppression of focal attention. There are cases, however, where
"remember" and "know" are both affected, but in opposite directions,
with a manipulation making "remember" responses more common while
making "know" responses less common. Variables with such opposing
effects include word versus nonword memory, massed versus distributed
practice, gradual versus abrupt presentation, and learning in a way
that emphasizes similarities versus differences.
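Because a "know" response can only be observed on trials where recollection fails, researchers often estimate the two underlying processes with the independence remember/know (IRK) correction, in which familiarity is estimated as K / (1 − R). The sketch below uses hypothetical response rates:

    def irk_estimates(remember_rate, know_rate):
        # Recollection is estimated directly by the "remember" rate.
        recollection = remember_rate
        # Familiarity is conditioned on recollection failing (undefined if R == 1).
        familiarity = know_rate / (1.0 - remember_rate)
        return recollection, familiarity

    # Hypothetical rates: 40% "remember" and 30% "know" responses to old items.
    r, f = irk_estimates(0.40, 0.30)
    print(f"recollection = {r:.2f}, familiarity = {f:.2f}")  # 0.40, 0.50

The correction matters because the raw "know" rate (0.30 here) would understate familiarity: "know" can only be reported on the 60% of trials where recollection failed.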
Another aspect of the "remember" versus "know" phenomenon is hindsight bias,
also referred to as the "knew-it-all-along effect". This occurs when a
person believes an event was more predictable after it has happened.
That is, once the outcome of a situation is known, people tend to
overestimate the quality of their previous knowledge, distorting their
recollection of what they knew toward the outcome information they were
given. Some researchers believe that the original information is
distorted by the new information at the time of encoding.
The term "creeping determinism" emphasizes that it is natural for
people to integrate outcome information with the original information
to form a coherent whole of all the pertinent information.
Although informing individuals about hindsight bias before they take
part in experiments does not decrease the bias, its effects can be
reduced: when the outcome knowledge is discredited, people are better
able to accurately retrieve their original knowledge state.
Errors in differentiating between "remembering" and "knowing" can be attributed to failures of source monitoring, the process of identifying the context or source from which a particular memory or event has arisen. Such errors are more prevalent for information that is "known" rather than "remembered".
Prospective memory
It is important to be able to keep track of future intentions and
plans, and most importantly, individuals need to remember to actually
carry out such intentions and plans. This memory for future events is prospective memory.
Prospective memory includes forming the intention to carry out a
particular task in the future, deciding which action will be used to
carry it out, and deciding when to do it. Thus, prospective memory is
in continuous use in day-to-day life; it is in use, for example, when
you decide that you need to write and send a letter to a friend.
There are two types of prospective memory: event-based and time-based. Event-based prospective memory is when an environmental cue prompts you to carry out a task.
An example is when seeing a friend reminds you to ask him a question.
In contrast, time-based prospective memory occurs when you remember to
carry out a task at a specific time.
An example of this is remembering to phone your sister on her
birthday. Time-based prospective memory is more difficult than
event-based prospective memory because there is no environmental cue
prompting one to remember to carry out the task at that specific time.
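The difference between the two types can be illustrated with a rough programming analogy (the code below is a hypothetical analogy, not a model from the cited research): an event-based intention behaves like a handler fired by an external cue, whereas a time-based intention has no external trigger and must be self-initiated by checking the clock.

    import datetime

    event_intentions = {"see friend": "ask him the question"}
    time_intentions = {datetime.time(9, 0): "phone sister on her birthday"}

    def on_cue(cue):
        # Event-based: the environmental cue itself prompts the intention.
        return event_intentions.get(cue)

    def check_clock(now):
        # Time-based: nothing external prompts this check; it must be
        # self-initiated, which is why time-based tasks are harder to remember.
        return time_intentions.get(now.time().replace(second=0, microsecond=0))

    print(on_cue("see friend"))
    print(check_clock(datetime.datetime(2024, 5, 1, 9, 0, 30)))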
In some cases, impairments to prospective memory can have dire
consequences. If an individual with diabetes cannot remember to take
their medication, they might face serious health consequences.
Prospective memory also generally gets worse with age, but the elderly
can implement strategies to improve prospective memory performance.
Improving memory
Mnemonics
A mnemonic is "a word, sentence, or picture device or technique for improving or strengthening memory".
Information learnt through mnemonics makes use of a form of deep
processing: elaborative encoding. It uses mnemonic tools such as imagery
in order to encode specific information with the goal of creating an
association between the tool and the information. This leads to the
information becoming more accessible and therefore leads to better
retention. One example of a mnemonic is the method of loci, in which the memorizer associates each to-be-remembered item with a different well-known location.
Then, during retrieval, the memorizer "strolls" along the locations and
remembers each related item. Other mnemonic tools include creating
acronyms, the drawing effect (drawing something increases the
likelihood of remembering it), chunking and organisation, and imagery (associating images with the information you are trying to remember).
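As a concrete illustration of the method of loci, here is a minimal sketch with hypothetical items and locations: each item is bound to a place on a well-known route at encoding, and retrieval walks the route in order.

    route = ["front door", "hallway", "kitchen", "garden"]  # well-known locations
    items = ["milk", "stamps", "umbrella", "keys"]          # to-be-remembered items

    # Encoding: associate each item with the next location on the route.
    loci = dict(zip(route, items))

    # Retrieval: "stroll" along the route and recall the item bound to each place.
    for place in route:
        print(f"At the {place}, picture the {loci[place]}.")

The route supplies both the retrieval cues and the output order, which is what makes the technique robust for ordered lists.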
The application of a mnemonic is intentional, suggesting that in
order to successfully use a mnemonic device an individual should be
aware that the mnemonic can aid his or her memory.
Awareness of how a mnemonic facilitates one's memory is an example of
metamemory. Wimmer and Tornquist conducted an experiment in which
participants were asked to recall a set of items.
Participants were made aware of the usefulness of a mnemonic device
(categorical grouping) either before or after recall. Participants who
were made aware of the usefulness of the mnemonic before recall
(displaying metamemory for the mnemonic's usefulness) were significantly
more likely to use the mnemonic than those who were not made aware of
the mnemonic before recall.
Exceptional memory
Mnemonists are people with exceptional memory. These individuals have seemingly effortless memories and perform tasks that may seem challenging to the general population.
They seem to have beyond normal abilities to encode and retrieve
information. There is strong evidence suggesting that exceptional
performance is acquired, rather than it being a natural ability, and
that "ordinary" people can improve their memory drastically with the use
of appropriate practice and strategies such as mnemonics.
However, although these well-developed tools sometimes increase
memorisation capabilities in general, mnemonists more often than not
specialise in a single domain; one strategy does not work for all
kinds of material. Because metamemory is important for the selection
and application of strategies, it is also important for the
improvement of memory.
There are a number of mnemonists who specialise in different
areas of memory and make use of different strategies to do so. For
example, Ericsson et al. conducted a study with an undergraduate student
"S.F." who had an initial digit span of 7 (within the normal range).
This means that, on average, he was able to recall sequences of 7
random numbers after they were presented. Following more than 230 hours
of practice, S.F. was able to increase his digit span to 79. S.F.'s use
of mnemonics was essential. He used race times, ages, and dates to
categorize the numbers, creating mnemonic associations.
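Ericsson et al. report, for example, the digit group 3492 being recoded as "3 minutes and 49.2 seconds, near world-record mile time". The sketch below illustrates this kind of recoding; the second mapping is our own hypothetical example in the same style.

    digits = "3492561"

    # Chunk into groups of three or four digits, as S.F. typically did.
    chunks = [digits[:4], digits[4:]]

    # Meaningful recodings: the first follows the example reported by
    # Ericsson et al.; the second is hypothetical.
    meanings = {
        "3492": "3 minutes 49.2 seconds, near world-record mile time",
        "561": "56.1 seconds, a fast quarter-mile",
    }
    for chunk in chunks:
        print(chunk, "->", meanings[chunk])

Recoding of this kind lets a familiar knowledge domain (running times, in S.F.'s case) absorb arbitrary digits, which is why the strategy did not transfer to other materials such as letters.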
Another example of a mnemonist is Suresh Kumar Sharma, who holds the world record for reciting the most digits of pi (70,030).
Brain-imaging conducted by Tanaka et al. reveals that subjects
with exceptional performance activate some brain regions that are
different from those activated by control participants.
Some memory performance tasks in which people display exceptional
memory are chess, medicine, auditing, computer programming, bridge,
physics, sports, typing, juggling, dance, and music.
In their review, Pannu and Kaszniak reached four conclusions:
(1) there is a strong correlation between indices of
frontal lobe function or structural integrity and metamemory accuracy;
(2) the combination of frontal lobe dysfunction and poor memory severely
impairs metamemorial processes; (3) metamemory tasks vary in subject
performance levels and, quite likely, in the underlying processes these
different tasks measure; and (4) metamemory, as measured by experimental
tasks, may dissociate from basic memory retrieval processes and from
global judgments of memory.
Frontal lobe injury
Neurobiological research of metamemory is in its early stages, but recent evidence suggests that the frontal lobe
is involved. A study of patients with medial prefrontal cortex damage
showed that feeling-of-knowing judgments and memory confidence were
lower than in controls.
Studies suggest that the right frontal lobe, especially the medial
frontal area, is important for metamemory. Damage to this area is
associated with impaired metamemory, especially for weak memory traces
and effortful episodic tasks.
Korsakoff's syndrome
Individuals
with Korsakoff's syndrome, the result of thiamine deficiency in chronic
alcoholics, have damage to the dorsomedial nucleus of the thalamus and
the mammillary nuclei, as well as degeneration of the frontal lobes.
They display both amnesia and poor metamemory. Shimamura and Squire
found that while patients with Korsakoff's syndrome displayed impaired
FOK judgments, other amnesic patients did not.
HIV
Pannu and Kaszniak found that patients with HIV had impaired metamemory.
However, a later study focusing on HIV found that this impairment was
primarily caused by the general fatigue associated with the disease.
Multiple sclerosis
Multiple sclerosis (MS) causes demyelination of the central nervous system.
One study found that individuals with MS displayed impaired metamemory
for tasks that required high monitoring, but metamemory for easier tasks
was not impaired.
Other disorders
Individuals
with temporal lobe epilepsy display impaired metamemory for some tasks
and not for others, but little research has been conducted in this area.
One of the characteristics of Alzheimer's disease (AD) is
decreased memory performance, but there are inconclusive results
regarding metamemory in AD.
Metamemory impairment is commonly observed in individuals late in the
progression of AD, and some studies also find metamemory impairment
early in AD, while others do not.
Individuals with either Parkinson's disease or Huntington's disease do not appear to have impaired metamemory.
Maturation
In general, metamemory improves as children mature, but even
preschoolers can display accurate metamemory. There are three areas of
metamemory that improve with age.
1) Declarative metamemory – As children mature they gain knowledge of
memory strategies. 2) Self-control – As children mature they generally
become better at allocating study time. 3) Self-monitoring – Older
children are better than younger children at JOL and EOL judgments.
Children can be taught to improve their metamemory through instruction
programs at school. Research suggests that children with ADHD may fall behind in the development of metamemory as preschoolers.
In a recent study on metacognition, measures of metamemory (such as study time allocation) and executive function were found to decline with age.
This contradicts earlier studies, which showed no decline when
metamemory was dissociated from other forms of memory and even suggested
that metamemory could improve with age.
In a cross-sectional study, it was found that the confidence
people have in the accuracy of their memory remains relatively constant
across age groups,
despite the memory impairment that occurs in other forms of memory in
the elderly. This is likely the reason why the tip-of-the-tongue
phenomenon becomes more common with age.
Pharmacology
In a study of self-reported effects of MDMA (ecstasy) on metamemory, metamemory variables such as memory-related feelings/beliefs and self-reported memory were examined. Results suggest that drug use may cause retrospective memory
failures. Although other factors such as high anxiety levels of drug
users might contribute to memory failure, drug use can impair metamemory
abilities. Further, research has shown that the benzodiazepine lorazepam affects metamemory.
When studying four-letter nonsense words, participants given lorazepam
displayed impaired episodic short-term memory and gave lower FOK
estimates. However, lorazepam did not affect the predictive accuracy of
their FOK judgments.
Metamemory in non-humans
Metamemory
has also been researched in non-humans. As it is impossible to
administer the questionnaires used in human trials, non-human trials are
performed using match-to-sample tasks, such as Hampton's use of delayed matching-to-sample (DMTS) tasks with rhesus monkeys.
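In Hampton's procedure, after the delay the monkey could either take the memory test (earning a preferred reward if correct) or decline it for a smaller guaranteed reward; declining more often when memory is likely to be weak (for example, after long delays) is taken as evidence of memory monitoring. The sketch below is a schematic simplification of that trial logic; the reward values and the memory-strength model are hypothetical.

    import random

    def dmts_trial(memory_strength, threshold=0.5):
        # The subject monitors its memory of the sample and chooses accordingly.
        if memory_strength < threshold:
            return "decline test", 1  # small but certain reward
        correct = random.random() < memory_strength
        return "take test", (3 if correct else 0)  # preferred reward only if correct

    # Longer delays weaken the memory trace; a monitoring subject declines more.
    for delay, strength in [("short delay", 0.9), ("long delay", 0.3)]:
        print(delay, dmts_trial(strength))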
There is also evidence that metamemory can be created in AI technologies. Sudo et al.
used DMTS and reported that computational agents controlled by
artificial neural networks could evolve metamemory ability. Similarly,
despite starting from random neural networks that did not even have a
memory function, a model created by researchers at Nagoya University
was able to evolve to the point that it performed DMTS similarly to
monkeys. They reported that the neural network could examine its
memories, retain them, and adjust its outputs accordingly, without
requiring any assistance or intervention by the researchers, suggesting
that it plausibly possessed metamemory mechanisms.