Chauvinism is a form of extreme patriotism and nationalism
and a belief in national superiority and glory. It can also be defined
as "an irrational belief in the superiority or dominance of one's own
group or people".
Moreover, the chauvinist's own people are seen as unique and special,
while the rest are considered weak or inferior.
According to legend, French soldier Nicolas Chauvin was badly wounded in the Napoleonic Wars. He received a pension for his injuries but it was not enough to live on. After Napoleon abdicated, Chauvin was a fanatical Bonapartist despite the unpopularity of this view in Bourbon Restoration
France. His single-minded blind devotion to his cause, despite neglect
by his faction and harassment by its enemies, started the use of the
term.
Chauvinism has extended from its original use to include
fanatical devotion and undue partiality to any group or cause to which
one belongs, especially when such partisanship
includes prejudice against or hostility toward outsiders or rival
groups and persists even in the face of overwhelming opposition. This French quality finds its parallel in the British term jingoism, which has retained the meaning of chauvinism strictly in its original sense; that is, an attitude of belligerent nationalism.
In modern English, the word has come to be used in some quarters as shorthand for male chauvinism, a trend reflected in Merriam-Webster's Dictionary, which, as of 2018, begins its first example of use of the term chauvinism with "an attitude of superiority toward members of the opposite sex".
As nationalism
In 1945, political theorist Hannah Arendt described the concept thus:
Chauvinism is an almost natural
product of the national concept in so far as it springs directly from
the old idea of the "national mission." ... [A] nation's mission might
be interpreted precisely as bringing its light to other, less fortunate
peoples that, for whatever reason, have miraculously been left by
history without a national mission. As long as this concept did not
develop into the ideology of chauvinism and remained in the rather vague
realm of national or even nationalistic pride, it frequently resulted
in a high sense of responsibility for the welfare of backward people.
Male chauvinism
Male chauvinism is the belief that men are superior to women. The first documented use of the phrase "male chauvinism" is in the 1935 Clifford Odets play Till the Day I Die.
In the workplace
The balance of the workforce changed during World War II.
As men left their positions to enlist in the military and fight in the
war, women started replacing them. After the war ended, men returned
home to find jobs in the workplace, and male chauvinism was on the rise,
according to Cynthia B. Lloyd. Previously, men had been the main source
of labour, and they expected to return to their previous employment, but
they soon realized women had taken over many of their positions to fill
the void, says Lloyd.
Lloyd and Michael Korda have argued that as they integrated back
into the workforce, men returned to predominate, holding positions of
power while women worked as their secretaries, usually typing dictations
and answering telephone calls. This division of labor was understood
and expected, and women typically felt unable to challenge their
position or male superiors, argue Korda and Lloyd.
Causes
Chauvinism is seen by some as an influential factor in the TAT,
a psychological personality test. Through cross-examinations, the TAT
exhibits a tendency toward chauvinistic stimuli for its questions and
has the "potential for unfavorable clinical evaluation" for women.
An often-cited 1976 study by Sherwyn Woods, Some Dynamics of Male
Chauvinism, attempts to find the underlying causes of male chauvinism.
Male chauvinism was studied in the psychoanalytic
therapy of 11 men. It refers to the maintenance of fixed beliefs and
attitudes of male superiority, associated with overt or covert
depreciation of women. Challenging chauvinist attitudes often results in
anxiety or other symptoms. It is frequently not investigated in
psychotherapy because it is ego-syntonic, parallels cultural attitudes,
and because therapists often share similar bias or neurotic conflict.
Chauvinism was found to represent an attempt to ward off anxiety and
shame arising from one or more of four prime sources: unresolved
infantile strivings and regressive wishes, hostile envy of women,
oedipal anxiety, and power and dependency conflicts related to masculine
self-esteem. Mothers were more important than fathers in the development of male chauvinism, and resolution was sometimes associated with decompensation in wives.
Female chauvinism
Female chauvinism is the belief that women are morally superior to men, and is considered anti-feminist.
The term has been adopted by critics of some types or aspects of feminism; second-wave feminist Betty Friedan is a notable example. Ariel Levy used the term in a similar but opposite sense in her book Female Chauvinist Pigs, in which she argues that many young women in the United States and beyond are replicating male chauvinism and older misogynist stereotypes.
Cogito, ergo sum is a Latin philosophical proposition by René Descartes usually translated into English as "I think, therefore I am". The phrase originally appeared in French as je pense, donc je suis in his Discourse on the Method, so as to reach a wider audience than Latin would have allowed. It appeared in Latin in his later Principles of Philosophy. As Descartes explained, "we cannot doubt of our existence while we doubt...." A fuller version, articulated by Antoine Léonard Thomas, aptly captures Descartes's intent: dubito, ergo cogito, ergo sum ("I doubt, therefore I think, therefore I am"). The concept is also sometimes known as the cogito.
This proposition became a fundamental element of Western philosophy, as it purported to form a secure foundation for knowledge in the face of radical doubt.
While other knowledge could be a figment of imagination, deception, or
mistake, Descartes asserted that the very act of doubting one's own
existence served—at minimum—as proof of the reality of one's own mind;
there must be a thinking entity—in this case the self—for there to be a thought.
A standard critique of the proposition is that it presupposes an "I"
doing the thinking, so that the most Descartes was entitled to say was:
"thinking is occurring".
In Descartes's writings
Descartes first wrote the phrase in French in his 1637 Discourse on the Method. He referred to it in Latin without explicitly stating the familiar form of the phrase in his 1641 Meditations on First Philosophy. The earliest written record of the phrase in Latin is in his 1644 Principles of Philosophy,
where, in a margin note (see below), he provides a clear explanation of
his intent: "[W]e cannot doubt of our existence while we doubt". Fuller
forms of the phrase are attributable to other authors.
Discourse on the Method
The phrase first appeared (in French) in Descartes's 1637 Discourse on the Method in the first paragraph of its fourth part:
(French:) Ainsi,
à cause que nos sens nous trompent quelquefois, je voulus supposer
qu'il n'y avait aucune chose qui fût telle qu'ils nous la font imaginer;
Et parce qu'il y a des hommes qui se méprennent en raisonnant, même
touchant les plus simples matières de Géométrie, et y font des
Paralogismes, jugeant que j'étais sujet à faillir autant qu'aucun autre,
je rejetai comme fausses toutes les raisons que j'avais prises
auparavant pour Démonstrations; Et enfin, considérant que toutes les
mêmes pensées que nous avons étant éveillés nous peuvent aussi venir
quand nous dormons, sans qu'il y en ait aucune raison pour lors qui soit
vraie, je me résolus de feindre que toutes les choses qui m'étaient
jamais entrées en l'esprit n'étaient non plus vraies que les illusions
de mes songes. Mais aussitôt après je pris garde que, pendant que je
voulais ainsi penser que tout était faux, il fallait nécessairement que
moi qui le pensais fusse quelque chose; Et remarquant que cette vérité, je pense, donc je suis,
était si ferme et si assurée, que toutes les plus extravagantes
suppositions des Sceptiques n'étaient pas capables de l'ébranler, je
jugeai que je pouvais la recevoir sans scrupule pour le premier principe
de la Philosophie que je cherchais.
(English:) Accordingly, seeing that
our senses sometimes deceive us, I was willing to suppose that there
existed nothing really such as they presented to us; And because some
men err in reasoning, and fall into Paralogisms, even on the simplest
matters of Geometry, I, convinced that I was as open to error as any
other, rejected as false all the reasonings I had hitherto taken for
Demonstrations; And finally, when I considered that the very same
thoughts (presentations) which we experience when awake may also be
experienced when we are asleep, while there is at that time not one of
them true, I supposed that all the objects (presentations) that had ever
entered into my mind when awake, had in them no more truth than the
illusions of my dreams. But immediately upon this I observed that,
whilst I thus wished to think that all was false, it was absolutely
necessary that I, who thus thought, should be something; And as I
observed that this truth, I think, therefore I am,
was so certain and of such evidence that no ground of doubt, however
extravagant, could be alleged by the Sceptics capable of shaking it, I
concluded that I might, without scruple, accept it as the first principle of the philosophy of which I was in search.
Meditations on First Philosophy
In 1641, Descartes published (in Latin) Meditations on first philosophy in which he referred to the proposition, though not explicitly as "cogito, ergo sum" in Meditation II:
(Latin:) hoc pronuntiatum: ego sum, ego existo, quoties a me profertur, vel mente concipitur, necessario esse verum.
(English:) this proposition: I am, I exist, whenever it is uttered by me, or conceived by the mind, is necessarily true.
Principles of Philosophy
In 1644, Descartes published (in Latin) his Principles of Philosophy where the phrase "ego cogito, ergo sum" appears in Part 1, article 7:
(Latin:) Sic
autem rejicientes illa omnia, de quibus aliquo modo possumus dubitare,
ac etiam, falsa esse fingentes, facilè quidem, supponimus nullum esse
Deum, nullum coelum, nulla corpora; nosque etiam ipsos, non habere
manus, nec pedes, nec denique ullum corpus, non autem ideò nos qui talia
cogitamus nihil esse: repugnat enim ut putemus id quod cogitat eo ipso
tempore quo cogitat non existere. Ac proinde haec cognitio, ego cogito, ergo sum, est omnium prima & certissima, quae cuilibet ordine philosophanti occurrat.
(English:) While we thus reject all
of which we can entertain the smallest doubt, and even imagine that it
is false, we easily indeed suppose that there is neither God, nor sky,
nor bodies, and that we ourselves even have neither hands nor feet, nor,
finally, a body; but we cannot in the same way suppose that we are not
while we doubt of the truth of these things; for there is a repugnance
in conceiving that what thinks does not exist at the very time when it
thinks. Accordingly, the knowledge, I think, therefore I am, is the first and most certain that occurs to one who philosophizes orderly.
Descartes's margin note for the above paragraph is:
(Latin:) Non posse à nobis dubitari, quin existamus dum dubitamus; atque hoc esse primum, quod ordine philosophando cognoscimus.
(English:) That we cannot doubt of
our existence while we doubt, and that this is the first knowledge we
acquire when we philosophize in order.
The Search for Truth
Descartes, in a lesser-known posthumously published work, dated as written ca. 1647 and titled La Recherche de la Vérité par la Lumière Naturelle (The Search for Truth by Natural Light), wrote:
(Latin:) … [S]entio, oportere, ut quid dubitatio, quid cogitatio, quid exsistentia sit antè sciamus, quàm de veritate hujus ratiocinii : dubito, ergo sum, vel, quod idem est, cogito, ergo sum : plane simus persuasi.
(English:) … [I feel that] it is
necessary to know what doubt is, and what thought is, [what existence
is], before we can be fully persuaded of this reasoning — I doubt, therefore I am — or what is the same — I think, therefore I am.
Other forms
The proposition is sometimes given as dubito, ergo cogito, ergo sum. This fuller form was penned by the eloquent French literary critic, Antoine Léonard Thomas, in an award-winning 1765 essay in praise of Descartes, where it appeared as "Puisque je doute, je pense; puisque je pense, j'existe."
In English, this is "Since I doubt, I think; since I think, I exist";
with rearrangement and compaction, "I doubt, therefore I think,
therefore I am", or in Latin, "dubito, ergo cogito, ergo sum".
A further expansion, dubito, ergo cogito, ergo sum—res cogitans ("…—a thinking thing") extends the cogito with Descartes's statement in the subsequent Meditation, "Ego
sum res cogitans, id est dubitans, affirmans, negans, pauca
intelligens, multa ignorans, volens, nolens, imaginans etiam et sentiens
…", or, in English, "I am a thinking (conscious) thing, that is, a
being who doubts, affirms, denies, knows a few objects, and is ignorant
of many …". This has been referred to as "the expanded cogito".
Translation
Neither je pense nor cogito indicate whether the verb form corresponds to the English simple present or progressive aspect. Translation needs a larger context to determine aspect.
Following John Lyons (1982),
Vladimir Žegarac notes, "The temptation to use the simple present is
said to arise from the lack of progressive forms in Latin and French,
and from a misinterpretation of the meaning of cogito as habitual or generic." (Cf. gnomic aspect.) Ann Banfield
writes (also following Lyons), "In order for the statement on which
Descartes's argument depends to represent certain knowledge, … its tense
must be a true present—in English, a progressive, … not as 'I think'
but as 'I am thinking, in conformity with the general translation of the
Latin or French present tense in such nongeneric, nonstative contexts." Or in the words of Simon Blackburn,
"Descartes’s premise is not ‘I think’ in the sense of ‘I ski’, which
can be true even if you are not at the moment skiing. It is supposed to
be parallel to ‘I am skiing’."
Fumitaka Suzuki (2012) writes "Taking consideration of Cartesian
theory of continuous creation, which theory was developed especially in
the Meditations and in the Principles, we would assure that 'I am
thinking, therefore I am/exist' is the most appropriate English
translation of 'ego cogito, ergo sum'."
The similar translation “I am thinking, therefore I exist” of Descartes's correspondence in French (“je pense, donc je suis”) appears in The Philosophical Writings of Descartes by Cottingham et al. (1988).
The earliest known translation as "I am thinking, therefore I am" is from 1872 by Charles Porterfield Krauth.
Interpretation
As
put succinctly by Krauth (1872), "That cannot doubt which does not
think, and that cannot think which does not exist. I doubt, I think, I
exist."
The phrase cogito, ergo sum is not used in Descartes's Meditations on First Philosophy but the term "the cogito" is used to refer to an argument from it. In the Meditations, Descartes phrases the conclusion of the argument as "that the proposition, I am, I exist, is necessarily true whenever it is put forward by me or conceived in my mind." (Meditation II)
At the beginning of the second meditation, having reached what he
considers to be the ultimate level of doubt—his argument from the
existence of a deceiving god—Descartes examines his beliefs to see if
any have survived the doubt. In his belief in his own existence, he
finds that it is impossible to doubt that he exists. Even if there were
a deceiving god (or an evil demon),
one's belief in their own existence would be secure, for there is no
way one could be deceived unless one existed in order to be deceived.
But I have convinced myself that there is absolutely
nothing in the world, no sky, no earth, no minds, no bodies. Does it
now follow that I, too, do not exist? No. If I convinced myself of
something [or thought anything at all], then I certainly existed. But
there is a deceiver of supreme power and cunning who deliberately and
constantly deceives me. In that case, I, too, undoubtedly exist, if he
deceives me; and let him deceive me as much as he can, he will never
bring it about that I am nothing, so long as I think that I am
something. So, after considering everything very thoroughly, I must
finally conclude that the proposition, I am, I exist, is necessarily true whenever it is put forward by me or conceived in my mind. (AT VII 25; CSM II 16–17)
There are three important notes to keep in mind here. First, he claims only the certainty of his own
existence from the first-person point of view — he has not proved the
existence of other minds at this point. This is something that has to be
thought through by each of us for ourselves, as we follow the course of
the meditations. Second, he does not say that his existence is
necessary; he says that if he thinks, then necessarily he exists (see the instantiation principle).
Third, this proposition "I am, I exist" is held true not based on a
deduction (as mentioned above) or on empirical induction but on the
clarity and self-evidence of the proposition. Descartes does not use
this first certainty, the cogito, as a foundation upon which to
build further knowledge; rather, it is the firm ground upon which he can
stand as he works to discover further truths. As he puts it:
Archimedes used to demand just one firm and immovable
point in order to shift the entire earth; so I too can hope for great
things if I manage to find just one thing, however slight, that is
certain and unshakable. (AT VII 24; CSM II 16)
According to many Descartes specialists, including Étienne Gilson,
the goal of Descartes in establishing this first truth is to
demonstrate the capacity of his criterion — the immediate clarity and
distinctiveness of self-evident propositions — to establish true and
justified propositions despite having adopted a method of generalized
doubt. As a consequence of this demonstration, Descartes considers
science and mathematics to be justified to the extent that their
proposals are established on a similarly immediate clarity,
distinctiveness, and self-evidence that presents itself to the mind. The
originality of Descartes's thinking, therefore, is not so much in
expressing the cogito — a feat accomplished by other predecessors, as we shall see — but on using the cogito
as demonstrating the most fundamental epistemological principle, that
science and mathematics are justified by relying on clarity,
distinctiveness, and self-evidence.
In the Prolegomenon to his Principia philosophiae cartesianae, Baruch Spinoza identified "cogito, ergo sum" with "ego sum cogitans" ("I am a thinking being"), interpreting the thinking substance through his own ontological reading.
Predecessors
Although the idea expressed in cogito, ergo sum is widely attributed to Descartes, he was not the first to mention it. Plato spoke about the "knowledge of knowledge" (Greek νόησις νοήσεως nóesis noéseos) and Aristotle explains the idea at length:
But if life itself is good and pleasant (...) and if one
who sees is conscious that he sees, one who hears that he hears, one who
walks that he walks and similarly for all the other human activities
there is a faculty that is conscious of their exercise, so that whenever
we perceive, we are conscious that we perceive, and whenever we think,
we are conscious that we think, and to be conscious that we are
perceiving or thinking is to be conscious that we exist... (Nicomachean Ethics, 1170a25 ff.)
In the late sixth or early fifth century BC, Parmenides is quoted as saying "For to be aware and to be are the same" (B3). Augustine of Hippo in De Civitate Dei writes Si […] fallor, sum ("If I am mistaken, I am") (book XI, 26), and also anticipates modern refutations of the concept. Furthermore, in the Enchiridion Augustine attempts to refute skepticism
by stating, "[B]y not positively affirming that they are alive, the
skeptics ward off the appearance of error in themselves, yet they do
make errors simply by showing themselves alive; one cannot err who is
not alive. That we live is therefore not only true, but it is altogether
certain as well" (Chapter 7 section 20). In 1640 correspondence,
Descartes thanked two colleagues for drawing his attention to Augustine
and notes similarity and difference. (See CSMK III 159, 161.)
The 8th-century Hindu philosopher Adi Shankara
wrote in a similar fashion: "No one thinks, 'I am not'", arguing that
one's existence cannot be doubted, as there must be someone there to
doubt. The central idea of cogito, ergo sum is also the topic of the Mandukya Upanishad.
Spanish philosopher Gómez Pereira in his 1554 work De Inmortalitate Animae, published in 1749, wrote "nosco me aliquid noscere, & quidquid noscit, est, ergo ego sum" ("I know that I know something; whatever knows, exists; therefore I exist").
Critique
Use of "I"
In Descartes, The Project of Pure Enquiry, Bernard Williams provides a history and full evaluation of this issue. The first to raise the "I" problem was Pierre Gassendi.
He "points out that recognition that one has a set of thoughts does not
imply that one is one particular thinker or another. Were we to move from
the observation that there is thinking occurring to the attribution of
this thinking to a particular agent, we would simply assume what we set out to prove,
namely, that there exists a particular person endowed with the capacity
for thought". In other words, "the only claim that is indubitable here
is the agent-independent claim that there is cognitive activity
present". The objection, as presented by Georg Lichtenberg,
is that rather than supposing an entity that is thinking, Descartes
should have said: "thinking is occurring." That is, whatever the force
of the cogito, Descartes draws too much from it; the existence of a thinking thing, the reference of the "I," is more than the cogito can justify. Friedrich Nietzsche
criticized the phrase in that it presupposes that there is an "I", that
there is such an activity as "thinking", and that "I" know what
"thinking" is. He suggested a more appropriate phrase would be "it
thinks" wherein the "it" could be an impersonal subject as in the
sentence "It is raining."
Kierkegaard
The Danish philosopher Søren Kierkegaard calls the phrase a tautology in his Concluding Unscientific Postscript. He argues that the cogito
already presupposes the existence of "I", and therefore concluding with
existence is logically trivial. Kierkegaard's argument can be made
clearer if one extracts the premise "I think" into the premises "'x'
thinks" and "I am that 'x'", where "x" is used as a placeholder in order
to disambiguate the "I" from the thinking thing.
Here, the cogito has already assumed the "I"'s existence
as that which thinks. For Kierkegaard, Descartes is merely "developing
the content of a concept", namely that the "I", which already exists,
thinks.
As Kierkegaard argues, the proper logical flow of argument is that
existence is already assumed or presupposed in order for thinking to
occur, not that existence is concluded from that thinking.
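Kierkegaard's objection can be put schematically (an illustrative reconstruction of the argument above, not Kierkegaard's own notation):

    Premise 1:  "x" thinks.
    Premise 2:  I am that "x".
    Conclusion: Therefore, I exist.

On this reading, Premise 2 already commits the speaker to the existence of the "I" it mentions, so the conclusion merely restates what the premises presuppose; nothing new is derived.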
Williams
Bernard
Williams claims that what we are dealing with when we talk of thought,
or when we say "I am thinking," is something conceivable from a third-person perspective; namely objective "thought-events" in the former case, and an objective thinker in the latter. He argues, first, that it is impossible to make sense of "there is thinking" without relativizing it to something.
However, this something cannot be Cartesian egos, because it is
impossible to differentiate objectively between things just on the basis
of the pure content of consciousness. The obvious problem is that,
through introspection, or our experience of consciousness,
we have no way of moving to conclude the existence of any
third-personal fact, to conceive of which would require something above
and beyond just the purely subjective contents of the mind.
Heidegger
As
a critic of Cartesian subjectivity, Heidegger sought to ground human
subjectivity in death as that certainty which individualizes and
authenticates our being. As he wrote in 1927:
"This certainty, that "I myself am in that I will die," is the
basic certainty of Dasein itself. It is a genuine statement of Dasein,
while cogito sum is only the semblance of such a statement. If
such pointed formulations mean anything at all, then the appropriate
statement pertaining to Dasein in its being would have to be sum moribundus [I am in dying], moribundus not as someone gravely ill or wounded, but insofar as I am, I am moribundus. The MORIBUNDUS first gives the SUM its sense."
John Macmurray
The Scottish philosopher John Macmurray rejects the cogito
outright in order to place action at the center of a philosophical
system he entitles the Form of the Personal. "We must reject this, both
as standpoint and as method. If this be philosophy, then philosophy is a
bubble floating in an atmosphere of unreality." The reliance on thought creates an irreconcilable dualism between thought and action in which the unity
of experience is lost, thus dissolving the integrity of our selves, and
destroying any connection with reality. In order to formulate a more
adequate cogito, Macmurray proposes the substitution of "I do"
for "I think", ultimately leading to a belief in God as an agent to
whom all persons stand in relation.
Egocentrism is the inability to differentiate between self and other. More specifically, it is the inability to untangle subjective schemas from objective reality and an inability to accurately assume or understand any perspective other than one's own.
Although egocentrism and narcissism
appear similar, they are not the same. A person who is egocentric
believes they are the center of attention, like a narcissist, but does
not derive gratification from their own admiration. Both egotists
and narcissists have egos that are greatly influenced by the
approval of others, while for egocentrists this may or may not be true.
Although egocentric behaviors are less prominent in adulthood,
the existence of some forms of egocentrism in adulthood indicates that
overcoming egocentrism may be a lifelong development that never achieves
completion.
Adults appear to be less egocentric than children because they are
faster to correct from an initially egocentric perspective than
children, not because they are less likely to initially adopt an
egocentric perspective.
Therefore, egocentrism is found across the life span: in infancy, early childhood, adolescence, and adulthood. It contributes to the human cognitive development by helping children develop theory of mind and self-identity formation.
During infancy
The
main concept infants and young children learn by beginning to show
egocentrism is the fact that their thoughts, values, and behaviors are
different from those of others, also known as the theory of mind.
Initially, when children begin to have social interactions with others,
mainly their caregivers, they misinterpret themselves and the caregiver
as one entity, because they are together for long durations of time and
the caregivers often provide for the children's needs. For example, a
child may misattribute the act of their mother reaching to retrieve an
object that they point to as a sign that the two are the same entity,
when in fact they are separate individuals. As early as 15 months old, children show a mix of egocentrism and theory of mind
when an agent acts inconsistently with how the children expect him to
behave. In this study the children observed the experimenter place a
toy inside one of two boxes, but did not see when the experimenter
removed the toy from the original box and placed it in the other box,
due to obstruction by a screen. When the screen was removed the children
watched the experimenter reach to take the toy out of one of the boxes,
yet because the children did not see the switching part, they looked at
the experimenter's action much longer when she reached for the box
opposite to the one she originally put the toy in. Not only does this
show the existence of infants' memory capacity, but it also demonstrates
how they have expectations based on their knowledge, as they are
surprised when those expectations are not met.
Piaget explained that egocentrism during infancy does not mean
selfishness, self-centredness, or egotism because it refers to the
infant's understanding of the world in terms of their own motor activity
as well as an inability to understand it.
In children's social development, infancy is the period in which the
individual performs very few social functions, due to the conscious and
subconscious concern with the fulfillment of physical needs.
During childhood
According to George Butterworth and Margaret Harris, during childhood, one is usually unable to distinguish between what is subjective and objective. According to Piaget, "an egocentric child assumes that other people see, hear, and feel exactly the same as the child does."
Jean Piaget (1896–1980) developed a theory about the development of human intelligence, describing the stages of cognitive development.
He claimed that early childhood is the time of pre-operational thought,
characterized by children's inability to process logical thought.
According to Piaget, one of the main obstacles to logic that children
possess includes centration, "the tendency to focus on one aspect of a
situation to the exclusion of others."
A particular type of centration is egocentrism – literally,
"self-centeredness." Piaget claimed that young children are egocentric,
capable of contemplating the world only from their personal perspective.
For example, a three-year-old presented his mother with a model truck as
her birthday present; "he had carefully wrapped the present and gave it
to his mother with an expression that clearly showed he expected her to
love it." The three-year-old boy had not chosen the present out of
selfishness or greed; he simply failed to realize that, from his
mother's perspective, she might not enjoy the model truck as much as he would.
Piaget was concerned with two aspects of egocentricity in children: language and morality.
He believed that egocentric children use language primarily for
communication with themselves. Piaget observed that children would talk
to themselves during play, and that this egocentric speech was merely
the child's thoughts.
He believed that this speech had no special function; it was used as a
way of accompanying and reinforcing the child's current activity. He
theorized that as the child matures cognitively and socially the amount
of egocentric speech used would be reduced. However, Vygotsky felt that egocentric speech has more meaning, as it allows the child's growth in social speech and high mental development.
Piaget also believed that when communicating with others, the child
assumes that others know everything about the topic of discussion and
becomes frustrated when asked to give further detail.
Piaget also believed that egocentrism affects the child's sense of morality.
Due to egocentrism, the child is only concerned with the final outcome
of an event rather than another's intentions. For example, if someone
breaks the child's toy, the child would not forgive the other and the
child would not be able to understand that the person who broke the toy
did not intend to break it.
This phenomenon is also supported by the findings of Nelson's case
study, which examined young children's use of motives and outcomes in
forming their moral judgements.
Piaget did a test to investigate egocentrism called the mountains
study. He put children in front of a simple plaster mountain range and
then asked them to pick from four pictures the view that he, Piaget,
would see. Children younger than seven picked the picture of
the view they themselves saw and were therefore found to lack the
ability to appreciate a viewpoint different from their own. In other
words, their way of reasoning was egocentric. Only upon entering the
concrete-operational stage of development, at ages seven to twelve, did
children become less egocentric and able to appreciate viewpoints other
than their own; that is, they became capable of cognitive
perspective-taking. However, the mountains test has been criticized for
judging only the child's visuo-spatial awareness rather than
egocentrism. A follow-up study involving police dolls showed that even
young children were able to correctly say what the interviewer would
see.
It is thought that Piaget overestimated the extent of egocentrism in
children. Egocentrism is thus the child's inability to see other
people's viewpoints, not to be confused with selfishness. The child at
this stage of cognitive development assumes that their view of the world is the same as other people's.
In addition, a better-known experiment by Wimmer and Perner
(1983), called the false-belief task, demonstrates how children show their
acquisition of theory of mind (ToM) as early as 4 years old.
In this task, children see a scenario in which one character hides a
marble in a basket and walks out of the scene, and another character who
is present takes the marble out and puts it in a box. Knowing that the
first character did not see the switch, children are asked to
predict where the first character will look to find the marble. The
results show that children younger than 4 answer that the character
will look inside the box, because they themselves have privileged knowledge of
where the marble actually is. This shows egocentric thinking in early
childhood: even though the character did not see the entire scenario,
children assume it has the same knowledge as they themselves do and
should therefore look inside the box to find the marble. As
children start to acquire ToM, their ability to recognize and process
others' beliefs and values overrides the natural tendency to be
egocentric.
During adolescence
Although most research on egocentrism has focused primarily on early
childhood development, egocentrism has been found
to also occur during adolescence. David Elkind
was one of the first to discover the presence of egocentrism in
adolescence and late adolescence. He argues, "the young adolescent,
because of the physiological metamorphosis he is undergoing, is
primarily concerned with himself. Accordingly, since he fails to
differentiate between what others are thinking about and his own mental
preoccupations, he assumes that other people are obsessed with his
behavior and appearance as he is himself."
This shows that the adolescent is exhibiting egocentrism by struggling
to distinguish whether others are, in actuality, as attentive to them
as they might think, because their own thoughts are so prevalent.
Adolescents consider themselves as "unique, special, and much more
socially significant than they actually are."
Elkind also created terms to help describe the egocentric
behaviors exhibited by the adolescent population such as what he calls
an imaginary audience, the personal fable,
and the invincibility fable. When an egocentric adolescent experiences
an imaginary audience, they believe that an audience is captivated and
constantly present, to the point of being overly interested in the
egocentric individual. The personal fable refers
to the idea that many teenagers believe their thoughts, feelings, and
experiences are unique and more extreme than anyone else's.
In the invincibility fable, the adolescent believes in the idea that he
or she is immune to misfortune and cannot be harmed by things that
might defeat a normal person.
Egocentrism in adolescence is often viewed as a negative aspect of
their thinking ability because adolescents become consumed with
themselves and are unable to effectively function in society due to
their skewed version of reality and cynicism.
There are various reasons as to why adolescents experience egocentrism:
Adolescents are often faced with new social environments (for
example, starting secondary school) which require the adolescent to
protect the self, which may lead to egocentrism.
Development of the adolescent's identity may lead to the individual
experiencing a heightened sense of uniqueness, which subsequently becomes
egocentric; this manifests as the personal fable.
Parental rejection may lead to the adolescents experiencing high levels of self-consciousness, which can lead to egocentrism.
Gender differences have been found in the way egocentrism manifests.
The transient self, as defined by Elkind and Bowen in 1979, refers to an
impermanent self-image relating mainly to one-time behaviors
and temporary appearance. Adolescent females have a higher tendency
than their male peers to consider themselves different from others,
and tend to be more self-conscious in situations that involve
momentary embarrassment (e.g., going to a party with a bad haircut).
Another study conducted by Goossens and Beyers (1992) using similar
measuring instruments found that boys have stronger beliefs that they
are unique, invulnerable and sometimes omnipotent, which are typical
characteristics of the personal fable.
This again exemplifies the idea that egocentrism is present in even late adolescence.
Results from other studies have concluded that
egocentrism does not present itself in some of the same patterns as
originally found. More recent studies have found that egocentrism is
prevalent in later years of development, unlike Piaget's original
findings, which suggested that egocentrism is present only in early
childhood development.
Egocentrism is especially dominant in early adolescence, particularly
when adolescents encounter new environments, such as a new school or a
new peer group.
In addition, throughout adolescence egocentrism contributes to
the development of self-identity; in order to achieve self-identity,
adolescents go through different pathways of "crisis" and "commitment"
stages, and higher self-identity achievement was found to be correlated with heightened egocentrism.
During adulthood
The prevalence of egocentrism in the individual has been found to decrease between the ages of 15 and 16. However, adults are also susceptible to being egocentric and to reactions or behaviours that can be categorized as egocentric.
Frankenberger tested adolescents (14–18 years old) and adults (20–89 years old) on their levels of egocentrism and self-consciousness.
Egocentric tendencies were found to have extended into early adulthood,
and these tendencies were also present in the middle adult years.
Baron and Hanna looked at 152 participants and tested to see how the presence of depression affected egocentrism.
They tested adults between the ages of 18 and 25 and found that the
participants who suffered from depression showed higher levels of
egocentrism than those who did not.
Finally, Surtees and Apperly found that when adults were asked to
judge the number of dots they see and the number of dots the avatar in
the computer simulation sees, the presence of the avatar interfered with
the participants' judgment-making during the trials. Specifically,
these were the trials in which the number of dots seen by the participant
was inconsistent with the number of dots the avatar saw.
This effect on the participants diminished when the avatar was
replaced with a simple yellow or blue line, which suggested that
the avatar's person-like attributes implicitly caused the
participants to include its "vision" in their own decision-making.
Participants also made more errors when they saw prompts such as "the
avatar sees N" when N was the number of dots the participant saw and not
the avatar, which shows that egocentric thought still predominates in
quick judgments, even when adults are well aware that their
thoughts could differ from others'.
Religiocentrism
Religiocentrism or religio-centrism is defined (Corsini 1999:827) as the "conviction that a person's own religion is more important or superior to other religions." In analogy to ethnocentrism, religiocentrism is a value-neutral term for a psychological attitude.
Although the precise origins of religiocentrism and religiocentric
remain unclear, the words have been used since the early 20th century.
The American economist Adrian Augustus Holtz (1917:15) described how
early German school reforms were "carried on in a way that allowed for a
religio-centric educational system." Sinclair Lewis's Main Street (1920:307) said, "Maud Dyer was neurotic, religiocentric, faded; her emotions were moist, and her figure was unsystematic."
The related term Christocentric theologically means "forms of Christianity that concentrate on the teaching of Jesus Christ", but is sometimes used as a near synonym of religiocentric.
For instance (Hamilton 2002), "No matter where it appears,
government-sponsored Christocentrism, or even religiocentrism,
undermines this nation's ideals."
The Australian social psychologists John J. Ray and Dianne Doratis defined religiocentrism as follows:
"Ethnocentrism" is the social scientist's value-neutral
term for ethnic or racial prejudice. It refers to ethnically-based
sentiments of exclusiveness without any implication of their moral worth
or justifiability... By analogy, the term religiocentrism is derived
here to mean religiously based sentiments of
exclusiveness—beliefs that one should marry within one's own religion,
work with members of one's own religion, and in general prefer members
of one's own religion above others. This will also entail ipso facto devaluative judgments of other religions. (1971:170)
Ray and Doratis designed a groundbreaking attitude scale to measure
religiocentrism and ethnocentrism. Their religiocentrism scale comprises
33 items (for instance, "I think my religion is nearer to the truth
than any other" and "Most Moslems, Buddhists and Hindus are very stupid
and ignorant"), with five-point Likert scale psychometric response options from "Strongly agree" (scored 5) to "Strongly disagree" (scored 1). To verify internal consistency
among respondents, 11 items were reverse scored ("It makes no
difference to me what religion my friends are" is the converse of "I
think that it's better if you stick to friends of the same religion as
your own"), resulting in a reliability coefficient
of .88 among 154 first-year university students. The authors tested
attitudes among Australian fifth-form students in two Catholic and two
public schools, and discovered that neither ethnocentrism nor
religiocentrism showed any correlation with religious background. Ray
and Doratis concluded (1971:178), "Ethnocentrism, religiocentrism and
religious conservatism were all shown to be separate and distinct
factors of attitudes in their own right. They are not just three aspects
of the one thing. Religiocentric people do however tend to be both
religiously conservative and ethnocentric."
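The reverse-scoring step above can be illustrated with a short sketch: on a five-point scale, a reverse-keyed item is recoded as 6 minus the raw score, so that strong agreement with a converse item counts toward a lower total. The item names and responses below are made-up illustrations paraphrasing the two sample items quoted above, not Ray and Doratis's actual data.

```python
def reverse_score(raw: int, scale_max: int = 5) -> int:
    """Recode a reverse-keyed Likert response on a 1..scale_max scale."""
    if not 1 <= raw <= scale_max:
        raise ValueError("response out of range")
    return scale_max + 1 - raw

def scale_total(responses: dict, reverse_keyed: set) -> int:
    """Sum item scores, recoding reverse-keyed items first."""
    return sum(
        reverse_score(score) if item in reverse_keyed else score
        for item, score in responses.items()
    )

# Hypothetical two-item example (names paraphrase the sample items above):
responses = {
    "religion_nearer_truth": 4,   # direct-keyed item
    "friends_any_religion": 5,    # reverse-keyed converse item
}
total = scale_total(responses, reverse_keyed={"friends_any_religion"})
print(total)  # 4 + (6 - 5) = 5
```

Reverse-keyed items guard against acquiescence bias: a respondent who simply agrees with everything no longer scores uniformly high on the scale.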
The Hungarian-Jewish historian and anthropologist Raphael Patai mentions religiocentrism as a variable in relationships between religion and culture:
Each religion also has a definite outlook on its own
value in relation to that of other religions. Its relationship to other
religions may range from complete toleration to the complete lack of it,
with a corresponding range of self-evaluation. This variable, best
called religio-centrism (on the analogy of ethnocentrism), can serve as
an additional avenue of approach to the study of our subject. (1954:234)
Comparing Middle Eastern, Far Eastern, and Western cultures, Patai finds:
Religion in the Far East is characterized by the absence
of religio-centrism: there is a marked toleration of other religions and
a mutual borrowing and influencing; in the Middle East and in the West
there is a high degree of religio-centrism, with intolerance and scorn
of other religions: each religion is exclusive and regards itself as the
"one and only" true faith. (1954:252)
In a later survey of the potentials for world peace, Patai
differentiated the major modern religions between "theistic" and
"nontheistic".
The three great monotheistic religions of the Middle East
and the West, Judaism, Christianity, and Islam, are the foremost
theistic religions and the dominant faiths of about one half of mankind.
Common to all theistic religions is a pronounced religiocentrism,
expressed most poignantly in the conviction that one's own religion is
the one and only true one, and that all the other faiths are erroneous
and hence depreciable. In this conviction were rooted the great
religious wars which pitted, not only Muslims against Christians, but
also various Muslim sects against one another, and likewise made various
Christian denominations bitter enemies... The situation is more hopeful
in the great nontheistic religions of South, Southeast, and East Asia.
These religions, notably Hinduism, Buddhism, Jainism, Sikhism,
Confucianism, and Taoism, lack the element of self-assurance and
certainty that each is the exclusive possessor of the only truth.
(1987:24-25)
In response, Andrew Wilson, Professor of Scriptural Studies of the Unification Theological Seminary,
criticized Patai's opinion as theologically plausible but historically
erroneous, citing examples (1987:28) of "rampant communal violence
between Hindus and Buddhists in Sri Lanka and between Sikhs and Hindus
in India."
Religiocentrism has a specialized meaning for sociologists
(Chalfant, Beckley, and Palmer 1994:51). "This term is related to a
common word used in sociological literature, ethnocentrism. Similarly, we might refer to feelings of rightness and superiority resulting from religious affiliation as religiocentrism. Religiocentrism inhibits the ability of a society to achieve adaptation, integration and goal-attainment."
Mohammed Abu-Nimer, the Director of the Peacebuilding and Development Institute at American University, distinguishes between religiocentrism and "religiorelativism".
A religiorelative person is firm in his/her belief that
other religions have the right to exist and be practiced, even if such
norms and beliefs are contradictory to one's own set of religious
beliefs. Such a person is prone not to engage in violence or
discriminatory actions against the others. In contrast, a religiocentric
person is a believer who denies other religions' "truth" and who holds
an absolute truth that leaves no room for different religious practices.
Such a person becomes more prone to dehumanize, exclude, and
discriminate against other religious groups and individuals. Often, as a
result of negative and destructive exposure and experience with
conflict and war, religiocentric beliefs not only are exacerbated and
easily translated into violence against the enemy (that is, the
different other), but also actually grow and prohibit human and peaceful
contact with the other. However, there are conflict resolution and
peace-building activities and forums that can assist peace workers in
such settings to transform a religiocentric into a religiorelative
believer. (2004:497)
Abu-Nimer (2004:479-501) analyzes three typical reactions of a religiocentric person to another religion: denial (e.g., Israel not allowing Arabs to purchase or use state land), defense mechanisms ("There is no salvation outside the Church"), and minimization ("We are all the children of God").
Ethnocentrism
Ethnocentrism is a term used in social sciences and
anthropology to describe the act of judging another culture and
believing that the values and standards of one's own culture are
superior – especially with regard to language, behavior, customs, and
religion. These aspects or categories are distinctions that define each ethnicity's unique cultural identity.
The term ethnocentrism, deriving from the Greek word ethnos, meaning "nation, people, or cultural grouping," and the Latin word centrum, meaning "center," was first applied in the social sciences by American sociologist William G. Sumner. In his 1906 book, Folkways,
Sumner describes ethnocentrism as "the technical name for the view of
things in which one's own group is the center of everything, and all
others are scaled and rated with reference to it." He further
characterized ethnocentrism as often leading to pride, vanity, the belief in one's own group's superiority, and contempt for outsiders.
Over time, ethnocentrism developed alongside the progression of social understandings by people such as social theorist Theodor W. Adorno. In Adorno's The Authoritarian Personality, he and his colleagues of the Frankfurt School
established a broader definition of the term as a result of "in
group-out group differentiation", stating that ethnocentrism "combines a
positive attitude toward one's own ethnic/cultural group (the in-group)
with a negative attitude toward the other ethnic/cultural group (the
out-group)". Both of these juxtaposing attitudes are also a result of a
process known as social identification and social counter-identification.
Origins and development
The term ethnocentrism is believed by scholars to have been created by Austrian sociologist Ludwig Gumplowicz in the 19th century, although alternate theories suggest that he only popularized the concept as opposed to inventing it. He saw ethnocentrism as a phenomenon similar to the delusions of geocentrism and anthropocentrism,
defining ethnocentrism as "the reasons by virtue of which each group of
people believed it had always occupied the highest point, not only
among contemporaneous peoples and nations, but also in relation to all
peoples of the historical past."
Subsequently in the 20th century, American social scientist William G. Sumner proposed two different definitions in his 1906 book Folkways.
Sumner stated that "Ethnocentrism is the technical name for this view
of things in which one's own group is the center of everything, and all
others are scaled and rated with reference to it." In War and Other Essays (1911),
he wrote that "the sentiment of cohesion, internal comradeship, and
devotion to the in-group, which carries with it a sense of superiority
to any out-group and readiness to defend the interests of the in-group
against the out-group, is technically known as ethnocentrism."
According to Boris Bizumic, it is a popular misunderstanding that Sumner
originated the term ethnocentrism; in actuality, he brought
ethnocentrism into the mainstream of anthropology, social science, and psychology through his English-language publications.
The classifications of ethnocentrism originate from the studies of anthropology.
With its omnipresence throughout history, ethnocentrism has always been
a factor in how different cultures and groups related to one another.
Examples include how, historically, foreigners were characterized
as 'barbarians', or how China believed its nation to be the 'Empire
of the Center' and viewed foreigners as privileged subordinates.
However, ethnocentric interpretations most notably took place in the
19th century, when anthropologists began to describe and
rank various cultures according to the degree to which they had
developed significant milestones, such as monotheistic religions,
technological advancements, and other historical progressions.
Most rankings were strongly influenced by colonization and the
colonizers' belief that they were improving the societies they colonized,
ranking cultures based on the progress of their own Western societies
and what they classified as milestones. Comparisons were based mostly on
what the colonists believed to be superior and what their Western societies had accomplished.
Thomas Macaulay, an English politician in the 19th
century, attempted to validate the opinion that "one shelf of a Western
library" had more knowledge than the years of text and literature
developed by Eastern societies. Ideas developed by Charles Darwin contained ethnocentric ideals, in which societies that believed they were superior were most likely to survive and prosper.
Edward Said’s orientalist concept represented how Western reactions to
non-Western societies were based on an "unequal power relationship" that
Western peoples developed due to colonization and the influence it held
over non-Western societies.
The ethnocentric classification of "primitive" was also used by 19th- and 20th-century
anthropologists and represented how a lack of cultural and
religious understanding shaped overall reactions to non-Western
societies. The anthropologist Sir Edward Burnett Tylor wrote about "primitive" societies in Primitive Culture (1871), creating a "civilization" scale in which it was implied that ethnic cultures preceded civilized societies.
The classification "savage", known in modern terms as "tribal"
or "pre-literate", was usually employed as a derogatory term as
the "civilization" scale became more common.
Examples that demonstrate a lack of understanding include European
travelers judging different languages based on the fact that they could
not understand them and reacting negatively, as well as the intolerance
displayed by Westerners when exposed to unknown religions and
symbolisms. Georg Wilhelm Friedrich Hegel,
a German philosopher, justified Western colonization by reasoning that,
since non-Western societies were "primitive" and "uncivilized,"
their culture and history were not worth conserving and should give way
to Westernization.
Anthropologist Franz Boas
saw the flaws in this formulaic approach to ranking and interpreting
cultural development and committed himself to overturning this
inaccurate reasoning, given the many factors involved in each culture's
individual characteristics. With his methodological innovations, Boas sought to
show the error of the proposition that race determined cultural
capacity. In his 1911 book The Mind of Primitive Man, Boas wrote that:
It
is somewhat difficult for us to recognize that the value which we
attribute to our own civilization is due to the fact that we participate
in this civilization, and that it has been controlling all our actions
from the time of our birth; but it is certainly conceivable that there
may be other civilizations, based perhaps on different traditions and on
a different equilibrium of emotion and reason, which are of no less
value than ours, although it may be impossible for us to appreciate
their values without having grown up under their influence.
Together,
Boas and his colleagues propagated the certainty that there are no
inferior races or cultures. This egalitarian approach introduced the
concept of cultural relativism
to anthropology: a methodological principle for investigating and
comparing societies in as unprejudiced a way as possible, without using
the developmental scale that anthropologists at the time were implementing. Boas and anthropologist Bronislaw Malinowski argued that any human science had to transcend the ethnocentric views that could blind any scientist's ultimate conclusions.
Both had also urged anthropologists to conduct ethnographic fieldwork in order to overcome their ethnocentrism. To help, Malinowski would develop his theory of functionalism
as a guide for producing non-ethnocentric studies of different cultures.
Classic examples of anti-ethnocentric anthropology include Margaret Mead's Coming of Age in Samoa (1928), which in time has met with severe criticism for its incorrect data and generalisations, Malinowski's The Sexual Life of Savages in North-Western Melanesia (1929), and Ruth Benedict's Patterns of Culture (1934). Mead and Benedict were two of Boas's students.
Scholars generally agree that Boas developed his ideas under the influence of the German philosopher Immanuel Kant. Legend has it that, on a field trip to Baffin Island in 1883, Boas would pass the frigid nights reading Kant's Critique of Pure Reason.
In that work, Kant argued that human understanding could not be
described according to the laws that applied to the operations of
nature, and that its operations were therefore free, not determined, and
that ideas regulated human action, sometimes independent of material
interests. Following Kant, Boas pointed out the starving Eskimos who,
because of their religious beliefs, would not hunt seals to feed
themselves, thus showing that no pragmatic or material calculus
determined their values.
Causes
Ethnocentrism is believed to be a learned behavior embedded into a variety of beliefs and values of an individual or group.
Due to enculturation,
individuals in in-groups have a deeper sense of loyalty and are more
likely to follow the norms of, and develop relationships with, associated
members.
In relation to enculturation, ethnocentrism is said to be a
transgenerational problem, since stereotypes and similar perspectives can
be enforced and encouraged as time progresses.
Although loyalty can increase in-group approval, limited
interactions with other cultures can prevent individuals from developing
an understanding and appreciation of cultural differences, resulting in
greater ethnocentrism.
The social identity approach
suggests that ethnocentric beliefs are caused by a strong
identification with one's own culture that directly creates a positive
view of that culture. It is theorized by Henri Tajfel and John C. Turner that in order to maintain that positive view, people make social comparisons that cast competing cultural groups in an unfavorable light.
Alternative or opposite perspectives could cause individuals to develop naïve realism and be subject to limitations in understanding. These characteristics can also lead individuals to ethnocentrism, when referencing out-groups, and to the black sheep effect, where personal perspectives contradict those of fellow in-group members.
Realistic conflict theory
assumes that ethnocentrism happens due to "real or perceived conflict"
between groups. This also happens when a dominant group perceives
new members as a threat. Scholars
have recently demonstrated that individuals are more likely to develop
in-group identification and out-group negativity in response to
intergroup competition, conflict, or threat.
Although the causes of ethnocentric beliefs and actions can have
varying roots of context and reason, ethnocentrism has
had both negative and positive effects throughout history. Its most
detrimental effects have resulted in genocide, apartheid, slavery, and many violent conflicts. Historical examples of these negative effects of ethnocentrism are The Holocaust, the Crusades, the Trail of Tears, and the internment of Japanese Americans.
These events were a result of cultural differences reinforced
inhumanely by a superior, majority group. In his 1976 book on evolution,
The Selfish Gene, evolutionary biologist Richard Dawkins writes that "blood-feuds and inter-clan warfare are easily interpretable in terms of Hamilton's genetic theory." Simulation-based experiments in evolutionary game theory have attempted to provide an explanation for the selection of ethnocentric-strategy phenotypes.
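Such simulations are typically tag-based: each agent carries an arbitrary group marker and a strategy specifying whether it cooperates with same-tag and with different-tag partners, "ethnocentric" meaning in-group cooperation only. The sketch below is loosely in the spirit of such models (e.g., Hammond and Axelrod's); the payoff values and four-agent population are illustrative assumptions, not a reproduction of any published experiment.

```python
import itertools

B, C = 3, 1  # benefit to the recipient, cost to the cooperator (assumed values)

def cooperates(strategy, same_tag: bool) -> bool:
    """strategy = (cooperate_in_group, cooperate_out_group)."""
    in_group, out_group = strategy
    return in_group if same_tag else out_group

def total_payoffs(agents):
    """Play one-shot games round-robin; return the payoff of each agent."""
    payoffs = [0] * len(agents)
    for i, j in itertools.combinations(range(len(agents)), 2):
        (tag_i, strat_i), (tag_j, strat_j) = agents[i], agents[j]
        same = tag_i == tag_j
        if cooperates(strat_i, same):   # i pays a cost, j gets the benefit
            payoffs[i] -= C
            payoffs[j] += B
        if cooperates(strat_j, same):
            payoffs[j] -= C
            payoffs[i] += B
    return payoffs

# Two tags; "ethnocentric" = cooperate with the in-group only.
ETHNO, ALTRUIST, EGOIST = (True, False), (True, True), (False, False)
agents = [("red", ETHNO), ("red", ETHNO), ("blue", ALTRUIST), ("blue", EGOIST)]
print(total_payoffs(agents))  # → [5, 5, -3, 3]
```

In this toy round-robin, the two ethnocentric agents earn the highest payoffs: they absorb the indiscriminate altruist's cooperation while never paying the cost of helping out-group partners, which is the selection pressure such models study.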
Positive responses to ethnocentrism throughout history have
aimed to curb its callousness and broaden the
perspectives of those living in a single culture. Such organizations
include the United Nations, which aims to maintain international relations, and the Olympic Games, a celebration of sports and friendly competition between cultures.
Effects
A study in New Zealand compared how individuals associate with in-groups and out-groups and how this relates to discrimination. Strong in-group favoritism benefits the dominant groups and is distinct from out-group hostility and/or punishment.
A suggested solution is to limit the perceived threat from the
out-group, which also decreases the likelihood that those supporting the
in-groups will react negatively.
Ethnocentrism also influences consumers' preferences over which
goods they purchase. A study using several in-group and out-group
orientations has shown a correlation between national identity, consumer cosmopolitanism, consumer ethnocentrism, and the way consumers choose their products, whether imported or domestic.