Tuesday, February 20, 2024

Atomic Age

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Atomic_Age
An early nuclear power plant that used atomic energy to generate electricity

The Atomic Age, also known as the Atomic Era, is the period of history following the detonation of the first nuclear weapon, The Gadget at the Trinity test in New Mexico, on 16 July 1945, during World War II. Although nuclear chain reactions had been hypothesized in 1933 and the first artificial self-sustaining nuclear chain reaction (Chicago Pile-1) had taken place in December 1942, the Trinity test and the ensuing bombings of Hiroshima and Nagasaki that ended World War II represented the first large-scale use of nuclear technology and ushered in profound changes in sociopolitical thinking and the course of technological development.

While atomic power was promoted for a time as the epitome of progress and modernity, entering the nuclear power era also entailed frightful implications: nuclear warfare, the Cold War, mutual assured destruction, nuclear proliferation, and the risk of nuclear disaster (potentially as extreme as anthropogenic global nuclear winter), alongside beneficial civilian applications such as nuclear medicine. It is no easy matter to fully segregate peaceful uses of nuclear technology from military or terrorist uses (such as the fabrication of dirty bombs from radioactive waste), which complicated the development of a global nuclear-power export industry from the outset.

In 1973, anticipating a flourishing nuclear power industry, the United States Atomic Energy Commission predicted that, by the turn of the 21st century, one thousand reactors would be producing electricity for homes and businesses across the U.S. However, the "nuclear dream" fell far short of what was promised because nuclear technology produced a range of social problems, from the nuclear arms race to nuclear meltdowns, and the unresolved difficulties of bomb plant cleanup and civilian plant waste disposal and decommissioning. After 1973, reactor orders declined sharply as electricity demand fell and construction costs rose. Many orders and partially completed plants were cancelled.

By the late 1970s, nuclear power had suffered a remarkable international destabilization, as it was faced with economic difficulties and widespread public opposition, coming to a head with the Three Mile Island accident in 1979, and the Chernobyl disaster in 1986, both of which adversely affected the nuclear power industry for many decades.

Early years

In 1901, Frederick Soddy and Ernest Rutherford discovered that radioactivity was part of the process by which atoms changed from one kind to another, involving the release of energy. Soddy wrote in popular magazines that radioactivity was a potentially "inexhaustible" source of energy, and offered a vision of an atomic future where it would be possible to "transform a desert continent, thaw the frozen poles, and make the whole earth one smiling Garden of Eden." The promise of an "atomic age," with nuclear energy as the global, utopian technology for the satisfaction of human needs, has been a recurring theme ever since. But "Soddy also saw that atomic energy could possibly be used to create terrible new weapons".

The concept of a nuclear chain reaction was hypothesized in 1933, shortly after Chadwick's discovery of the neutron. Only a few years later, in December 1938, nuclear fission was discovered by Otto Hahn and his assistant Fritz Strassmann. Hahn understood that a "burst" of the atomic nuclei had occurred. Lise Meitner and Otto Frisch gave a full theoretical interpretation and named the process "nuclear fission". The first artificial self-sustaining nuclear chain reaction (Chicago Pile-1, or CP-1) took place in December 1942 under the leadership of Enrico Fermi.

In 1945, the pocketbook The Atomic Age heralded the untapped atomic power in everyday objects and depicted a future where fossil fuels would go unused. One science writer, David Dietz, wrote that instead of filling the gas tank of your car two or three times a week, you would travel for a year on a pellet of atomic energy the size of a vitamin pill. Glenn T. Seaborg, who chaired the Atomic Energy Commission, wrote that "there will be nuclear powered earth-to-moon shuttles, nuclear powered artificial hearts, plutonium heated swimming pools for SCUBA divers, and much more".

World War II

The phrase Atomic Age was coined by William L. Laurence, a journalist with The New York Times, who became the official journalist for the Manhattan Project which developed the first nuclear weapons. He witnessed both the Trinity test and the bombing of Nagasaki and went on to write a series of articles extolling the virtues of the new weapon. His reporting before and after the bombings helped to spur public awareness of the potential of nuclear technology and in part motivated development of the technology in the U.S. and in the Soviet Union. The Soviet Union would go on to test its first nuclear weapon in 1949.

In 1949, U.S. Atomic Energy Commission chairman, David Lilienthal stated that "atomic energy is not simply a search for new energy, but more significantly a beginning of human history in which faith in knowledge can vitalize man's whole life".

1950s

This view of downtown Las Vegas shows a mushroom cloud in the background. Scenes such as this were typical during the 1950s. From 1951 to 1962, the government conducted 100 atmospheric tests at the nearby Nevada Test Site.

The phrase gained popularity as a feeling of nuclear optimism emerged in the 1950s, in which it was believed that all power generators in the future would be atomic in nature. The atomic bomb would render all conventional explosives obsolete, and nuclear power plants would do the same for power sources such as coal and oil. There was a general feeling that everything would use a nuclear power source of some sort, in a positive and productive way, from irradiating food to preserve it to the development of nuclear medicine. There would be an age of peace and plenty in which atomic energy would "provide the power needed to desalinate water for the thirsty, irrigate the deserts for the hungry, and fuel interstellar travel deep into outer space". This use would render the Atomic Age as significant a step in technological progress as the first smelting of bronze or iron, or the commencement of the Industrial Revolution.

This included even cars, leading Ford to display the Ford Nucleon concept car to the public in 1958. There were also promises of golf balls that could always be found and of nuclear-powered aircraft, on which the U.S. federal government spent US$1.5 billion in research. Nuclear policymaking became almost a collective technocratic fantasy, or at least was driven by fantasy:

The very idea of splitting the atom had an almost magical grip on the imaginations of inventors and policymakers. As soon as someone said—in an even mildly credible way—that these things could be done, then people quickly convinced themselves ... that they would be done.

In the US, military planners "believed that demonstrating the civilian applications of the atom would also affirm the American system of private enterprise, showcase the expertise of scientists, increase personal living standards, and defend the democratic lifestyle against communism".

Some media reports predicted that, thanks to the giant nuclear power stations of the near future, electricity would soon become much cheaper and that electricity meters would be removed, because power would be "too cheap to meter."

When the Shippingport reactor went online in 1957, it produced electricity at a cost roughly ten times that of coal-fired generation. Scientists at the AEC's own Brookhaven Laboratory "wrote a 1958 report describing accident scenarios in which 3,000 people would die immediately, with another 40,000 injured".

However, Shippingport was an experimental reactor that used highly enriched uranium (unlike most power reactors) and was originally intended for a (cancelled) nuclear-powered aircraft carrier. Kenneth Nichols, a consultant for the Connecticut Yankee and Yankee Rowe nuclear power stations, wrote that while they were considered "experimental" and not expected to be competitive with coal and oil, they "became competitive because of inflation ... and the large increase in price of coal and oil." He wrote that for nuclear power stations the capital cost is the major cost factor over the life of the plant, and hence that "antinukes" try to increase costs and building time through changing regulations and lengthy hearings, so that "it takes almost twice as long to build a (U.S.-designed boiling-water or pressurised water) atomic power plant in the United States as in France, Japan, Taiwan or South Korea." Pressurised-water nuclear plants produce about 60% of France's electric power and have proven to be much cheaper than oil or coal.

Fear of possible atomic attack from the Soviet Union caused U.S. school children to participate in "duck and cover" civil defense drills.

Atomic City

During the 1950s, Las Vegas, Nevada, earned the nickname "Atomic City" as it became a hotspot where tourists gathered to watch the above-ground nuclear weapons tests taking place at the Nevada Test Site. Following the detonation of Able, one of the first atomic bombs dropped at the Nevada Test Site, the Las Vegas Chamber of Commerce began advertising the tests to tourists as an entertainment spectacle.

The detonations proved popular, and casinos throughout the city capitalised on the tests by advertising hotel rooms or rooftops that offered views of the testing site, or by planning "Dawn Bomb Parties" where people would come together to celebrate the detonations. Most parties started at midnight, and musicians would perform at the venues until 4:00 a.m., when the party would briefly stop so guests could silently watch the detonation. Some casinos capitalised on the tests further by creating so-called "atomic cocktails", a mixture of vodka, cognac, sherry and champagne.

Meanwhile, groups of tourists would drive out into the desert with family or friends to watch the detonations.

Despite the health risks associated with nuclear fallout, tourists and viewers were told simply to "shower". Later, however, many who had worked at the testing site or lived in areas exposed to nuclear fallout fell ill, with higher chances of developing cancer or suffering premature death.

1960s

By exploiting the peaceful uses of the "friendly atom" in medical applications, earth removal and, subsequently, in nuclear power plants, the nuclear industry and government sought to allay public fears about nuclear technology and promote the acceptance of nuclear weapons. At the peak of the Atomic Age, the United States government initiated Operation Plowshare, involving "peaceful nuclear explosions". The United States Atomic Energy Commission chairman announced that the Plowshares project was intended to "highlight the peaceful applications of nuclear explosive devices and thereby create a climate of world opinion that is more favorable to weapons development and tests".

Project Plowshare "was named directly from the Bible itself, specifically Micah 4:3, which states that God will beat swords into ploughshares, and spears into pruning hooks, so that no country could lift up weapons against another". Proposed uses included widening the Panama Canal, constructing a new sea-level waterway through Nicaragua nicknamed the Pan-Atomic Canal, cutting paths through mountainous areas for highways, and connecting inland river systems. Other proposals involved blasting caverns for water, natural gas, and petroleum storage. It was proposed to plant underground atomic bombs to extract shale oil in eastern Utah and western Colorado. Serious consideration was also given to using these explosives for various mining operations. One proposal suggested using nuclear blasts to connect underground aquifers in Arizona. Another plan involved surface blasting on the western slope of California's Sacramento Valley for a water transport project. However, there were many negative impacts from Project Plowshare's 27 nuclear explosions. Consequences included blighted land, relocated communities, tritium-contaminated water, radioactivity, and fallout from debris being hurled high into the atmosphere. These were ignored and downplayed until the program was terminated in 1977, due in large part to public opposition, after $770 million had been spent on the project.

The Thunderbirds TV series presented a set of vehicles that were imagined to be completely nuclear-powered, as shown in cutaways in the accompanying comic books.

The term "atomic age" was initially used in a positive, futuristic sense, but by the 1960s the threats posed by nuclear weapons had begun to edge out nuclear power as the dominant motif of the atom.

1970s to 1990s

A photograph taken in the abandoned city of Pripyat. The Chernobyl nuclear power plant can be seen on the horizon.

French advocates of nuclear power developed an aesthetic vision of nuclear technology as art to bolster support for the technology. Leclerq compares the nuclear cooling tower to some of the grandest architectural monuments of Western culture:

The age in which we live has, for the public, been marked by the nuclear engineer and the gigantic edifices he has created. For builders and visitors alike, nuclear power plants will be considered the cathedrals of the 20th century. Their syncretism mingles the conscious and the unconscious, religious fulfilment and industrial achievement, the limitations of uses of materials and boundless artistic inspiration, utopia come true and the continued search for harmony.

In 1973, the United States Atomic Energy Commission predicted that, by the turn of the 21st century, one thousand reactors would be producing electricity for homes and businesses across the USA. But after 1973, reactor orders declined sharply as electricity demand fell and construction costs rose. Many orders and partially completed plants were cancelled.

Nuclear power has proved controversial since the 1970s. Highly radioactive materials may overheat and escape from the reactor building. Nuclear waste (spent nuclear fuel) needs to be regularly removed from the reactors and disposed of safely for up to a million years, so that it does not pollute the environment. Recycling of nuclear waste has been discussed, but it creates plutonium which can be used in weapons, and in any case still leaves much unwanted waste to be stored and disposed of. Large, purpose-built facilities for long-term disposal of nuclear waste have been difficult to site, and have not yet reached fruition.

By the late 1970s, nuclear power suffered a remarkable international destabilization, as it was faced with economic difficulties and widespread public opposition, coming to a head with the Three Mile Island accident in 1979, and the Chernobyl disaster in 1986, both of which adversely affected the nuclear power industry for decades thereafter. A cover story in the 11 February 1985, issue of Forbes magazine commented on the overall management of the nuclear power program in the United States:

The failure of the U.S. nuclear power program ranks as the largest managerial disaster in business history, a disaster on a monumental scale ... only the blind, or the biased, can now think that the money has been well spent. It is a defeat for the U.S. consumer and for the competitiveness of U.S. industry, for the utilities that undertook the program and for the private enterprise system that made it possible.

So, in a period of just over 30 years, the early dramatic rise of nuclear power went into equally meteoric reverse. With no other energy technology has there been a conjunction of such rapid and revolutionary international emergence, followed so quickly by equally transformative demise.

21st century

The 2011 Fukushima Daiichi nuclear disaster in Japan, the worst nuclear accident in 25 years, displaced 50,000 households after radiation leaked into the air, soil and sea.

In the 21st century, the label of the "Atomic Age" connotes either a sense of nostalgia or naïveté, and is considered by many to have ended with the fall of the Soviet Union in 1991, though the term continues to be used by many historians to describe the era following the conclusion of the Second World War. Atomic energy and weapons continue to have a strong effect on world politics in the 21st century. The term is used by some science fiction fans to describe not only the era following the conclusion of the Second World War but also contemporary history up to the present day.

The nuclear power industry has improved the safety and performance of reactors and has proposed new, safer (but generally untested) reactor designs, but there is no guarantee that the reactors will be designed, built and operated correctly. Mistakes do occur, and the designers of reactors at Fukushima in Japan did not anticipate that a tsunami generated by an earthquake would disable the backup systems that were supposed to stabilize the reactor after the earthquake. According to UBS AG, the Fukushima I nuclear accidents have cast doubt on whether even an advanced economy like Japan can master nuclear safety. Catastrophic scenarios involving terrorist attacks are also conceivable. An interdisciplinary team from MIT has estimated that if nuclear power use tripled from 2005 to 2055 (2%–7%), at least four serious nuclear accidents would be expected in that period.
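
An estimate of this kind can be read as a simple expectation argument: the more reactor-years of operation that accumulate, the more accidents of a given per-reactor-year frequency should be expected. The Python sketch below illustrates only that arithmetic; the reactor count and accident frequency used here are hypothetical placeholders for illustration, not the MIT team's actual inputs or model.

```python
# Back-of-envelope sketch of an expected-accident calculation.
# All numbers below are illustrative assumptions, not figures from the MIT study.

def expected_accidents(avg_reactors: float, years: float,
                       accidents_per_reactor_year: float) -> float:
    """Expected serious accidents = cumulative reactor-years x assumed accident frequency."""
    reactor_years = avg_reactors * years
    return reactor_years * accidents_per_reactor_year

# Hypothetical inputs: ~700 reactors operating on average over a 50-year period,
# with an assumed serious-accident frequency of 1 per 10,000 reactor-years.
print(expected_accidents(avg_reactors=700, years=50,
                         accidents_per_reactor_year=1e-4))  # -> 3.5
```

Under these illustrative assumptions the expected count comes out at a few serious accidents over the period, which is the general shape of the estimate quoted above.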

In September 2012, in reaction to the Fukushima disaster, Japan announced that it would completely phase out nuclear power by 2030, although this goal became unlikely under the subsequent Abe administration. Germany planned to completely phase out nuclear energy by 2022, but nuclear still supplied 11.9% of its electricity in 2021. In 2022, following the Russian invasion of Ukraine, the United Kingdom pledged to build up to 8 new reactors to reduce its reliance on gas and oil, with the hope that 25% of all energy produced will come from nuclear sources.

Chronology

A large anti-nuclear demonstration was held on 6 May 1979, in Washington, D.C., when 125,000 people, including the Governor of California, attended a march and rally against nuclear power. In New York City on 23 September 1979, almost 200,000 people attended a protest against nuclear power. Anti-nuclear power protests preceded the shutdown of the Shoreham, Yankee Rowe, Millstone I, Rancho Seco, Maine Yankee, and about a dozen other nuclear power plants.

On 12 June 1982, one million people demonstrated in New York City's Central Park against nuclear weapons and for an end to the Cold War arms race. It was the largest anti-nuclear protest and the largest political demonstration in American history. International Day of Nuclear Disarmament protests were held on 20 June 1983, at 50 sites across the United States. In 1986, hundreds of people walked from Los Angeles to Washington, D.C., in the Great Peace March for Global Nuclear Disarmament. There were many Nevada Desert Experience protests and peace camps at the Nevada Test Site during the 1980s and 1990s.

On 1 May 2005, forty thousand anti-nuclear/anti-war protesters marched past the United Nations in New York, 60 years after the atomic bombings of Hiroshima and Nagasaki. This was the largest anti-nuclear rally in the U.S. for several decades.

Discovery and development

Nuclear arms deployment

"Atoms for Peace"

Three Mile Island and Chernobyl

Nuclear arms reduction

  • 8 December 1987 – The Intermediate-Range Nuclear Forces Treaty is signed in Washington, D.C. Ronald Reagan and Mikhail Gorbachev agreed, after negotiations following the 11–12 October 1986 Reykjavík Summit, to go further than a nuclear freeze: they agreed to reduce nuclear arsenals. IRBMs and SRBMs were eliminated.
  • 31 July 1991 – As the Cold War ends, the START I treaty is signed by the United States and the Soviet Union, reducing each side's deployed nuclear warheads to no more than 6,000.
  • 1993–2007 – Nuclear power is the primary source of electricity in France. Throughout this period, France produced over three quarters of its electric power from nuclear sources (78.8%), the highest percentage in the world at the time.
  • 1993 – The Megatons to Megawatts Program is agreed upon by Russia and the United States and begins to be implemented in 1995. By its completion in 2013, five hundred tonnes of uranium derived from 20,000 Russian nuclear warheads had been converted from weapons-grade to reactor-grade uranium and used in United States nuclear plants to generate electricity, providing about 10% of U.S. electrical power (50% of its nuclear power) over the 1995–2013 period.
  • 2006 – Patrick Moore, an early member of Greenpeace, and environmentalists such as Stewart Brand suggest the deployment of more advanced nuclear power technology for electric power generation (such as pebble-bed reactors) to combat global warming.
  • 21 November 2006 – Implementation of the ITER fusion power reactor project near Cadarache, France, begins. Construction was expected to be completed in 2016, with the hope that the research conducted there would allow the introduction of practical commercial fusion power plants by 2050.
  • 2006–2009 – A number of nuclear engineers begin to suggest that, to combat global warming, it would be more efficient to build nuclear reactors that operate on the thorium cycle.
  • 8 April 2010 – The New START treaty is signed by the United States and Russia in Prague. It mandates the eventual reduction by both sides to no more than 1,550 deployed strategic nuclear weapons each.

Fukushima

Influence on popular culture

Cover of Atomic War number one, November 1952

Identity (social science)

From Wikipedia, the free encyclopedia
 
Identity is the set of qualities, beliefs, personality traits, appearance, and/or expressions that characterize a person or a group.

Identity emerges during childhood as children start to comprehend their self-concept, and it remains a consistent aspect throughout different stages of life. Identity is shaped by social and cultural factors and how others perceive and acknowledge one's characteristics. The etymology of the term "identity" from the Latin noun identitas emphasizes an individual's mental image of themselves and their "sameness with others". Identity encompasses various aspects such as occupational, religious, national, ethnic or racial, gender, educational, generational, and political identities, among others.

Identity serves multiple functions, acting as a "self-regulatory structure" that provides meaning, direction, and a sense of self-control. It fosters internal harmony and serves as a behavioral compass, enabling individuals to orient themselves towards the future and establish long-term goals. As an active process, it profoundly influences an individual's capacity to adapt to life events and achieve a state of well-being. However, identity originates from traits or attributes that individuals may have little or no control over, such as their family background or ethnicity.

In sociology, emphasis is placed on collective identity, in which an individual's identity is strongly associated with role-behavior or the collection of group memberships that define them. According to Peter Burke, "Identities tell us who we are and they announce to others who we are." Identities subsequently guide behavior, leading "fathers" to behave like "fathers" and "nurses" to act like "nurses."

In psychology, the term "identity" is most commonly used to describe personal identity, or the distinctive qualities or traits that make an individual unique. Identities are strongly associated with self-concept, self-image (one's mental model of oneself), self-esteem, and individuality. Individuals' identities are situated, but also contextual, situationally adaptive and changing. Despite their fluid character, identities often feel as if they are stable ubiquitous categories defining an individual, because of their grounding in the sense of personal identity (the sense of being a continuous and persistent self).

Usage

Mark Mazower noted in 1998: "At some point in the 1970s this term ["identity"] was borrowed from social psychology and applied with abandon to societies, nations and groups."

In psychology

Erik Erikson (1902–94) became one of the earliest psychologists to take an explicit interest in identity. An essential feature of Erikson's theory of psychosocial development was the idea of the ego identity (often referred to as the self), which is described as an individual's personal sense of continuity. He suggested that people can attain this feeling throughout their lives as they develop, and that it is an ongoing process. The ego identity consists of two main features: one's personal characteristics and development, and the culmination of social and cultural factors and roles that impact one's identity. In his theory, Erikson describes eight distinct stages across the lifespan, each characterized by a conflict between the inner, personal world and the outer, social world of an individual. Erikson identified the conflict of identity as occurring primarily during adolescence and described potential outcomes that depend on how one deals with this conflict. Those who do not manage a resynthesis of childhood identifications are seen as being in a state of 'identity diffusion', whereas those who retain their given identities unquestioned have 'foreclosed' identities. On some readings of Erikson, the development of a strong ego identity, along with the proper integration into a stable society and culture, leads to a stronger sense of identity in general. Accordingly, a deficiency in either of these factors may increase the chance of an identity crisis or confusion.

The "Neo-Eriksonian" identity status paradigm emerged in 1966, driven largely by the work of James Marcia. This model focuses on the concepts of exploration and commitment. The central idea is that an individual's sense of identity is determined in large part by the degrees to which a person has made certain explorations and the extent to which they have commitments to those explorations or a particular identity. A person may display either relative weakness or strength in terms of both exploration and commitments. When assigned categories, there were four possible results: identity diffusion, identity foreclosure, identity moratorium, and identity achievement. Diffusion is when a person avoids or refuses both exploration and making a commitment. Foreclosure occurs when a person does make a commitment to a particular identity but neglected to explore other options. Identity moratorium is when a person avoids or postpones making a commitment but is still actively exploring their options and different identities. Lastly, identity achievement is when a person has both explored many possibilities and has committed to their identity.

Although the self is distinct from identity, the literature of self-psychology can offer some insight into how identity is maintained. From the vantage point of self-psychology, there are two areas of interest: the processes by which a self is formed (the "I"), and the actual content of the schemata which compose the self-concept (the "Me"). In the latter field, theorists have shown interest in relating the self-concept to self-esteem, the differences between complex and simple ways of organizing self-knowledge, and the links between those organizing principles and the processing of information.

Weinreich's identity variant similarly includes the categories of identity diffusion, foreclosure and crisis, but with a somewhat different emphasis. Here, with respect to identity diffusion for example, an optimal level is interpreted as the norm, as it is unrealistic to expect an individual to resolve all their conflicted identifications with others; therefore we should be alert to individuals with levels which are much higher or lower than the norm – highly diffused individuals are classified as diffused, and those with low levels as foreclosed or defensive. Weinreich applies the identity variant in a framework which also allows for the transition from one variant to another by way of biographical experiences and the resolution of conflicted identifications situated in various contexts – for example, an adolescent going through a family break-up may be in one state, whereas later, in a stable marriage with a secure professional role, they may be in another. Hence, though there is continuity, there is also development and change.

Laing's definition of identity closely follows Erikson's, in emphasising the past, present and future components of the experienced self. He also develops the concept of the "metaperspective of self", i.e. the self's perception of the other's view of self, which has been found to be extremely important in clinical contexts such as anorexia nervosa. Harré also conceptualises components of self/identity: the "person" (the unique being I am to myself and others), aspects of self (a totality of attributes, including beliefs about one's own characteristics and life history), and the personal characteristics displayed to others.

In social psychology

At a general level, self-psychology is compelled to investigate the question of how the personal self relates to the social environment. To the extent that these theories place themselves in the tradition of "psychological" social psychology, they focus on explaining an individual's actions within a group in terms of mental events and states. However, some "sociological" social psychology theories go further by attempting to deal with the issue of identity at both the levels of individual cognition and of collective behaviour.

Collective identity

Many people gain a sense of positive self-esteem from their identity groups, which furthers a sense of community and belonging. Another issue that researchers have attempted to address is the question of why people engage in discrimination, i.e., why they tend to favour those they consider a part of their "in-group" over those considered to be outsiders. Both questions have been given extensive attention by researchers working in the social identity tradition. For example, in work relating to social identity theory, it has been shown that merely crafting a cognitive distinction between in-groups and out-groups can lead to subtle effects on people's evaluations of others.

Different social situations also compel people to attach themselves to different self-identities which may cause some to feel marginalized, switch between different groups and self-identifications, or reinterpret certain identity components. These different selves lead to constructed images dichotomized between what people want to be (the ideal self) and how others see them (the limited self). Educational background and occupational status and roles significantly influence identity formation in this regard.

Identity formation strategies

Another issue of interest in social psychology is related to the notion that there are certain identity formation strategies which a person may use to adapt to the social world. Cote and Levine developed a typology of the different manners of behavior that individuals may adopt. Their typology includes:

Cote and Levine's identity formation strategy typology

  • Refuser – Psychological signs: develops cognitive blocks that prevent adoption of adult role-schemas. Personality signs: engages in childlike behavior. Social signs: shows extensive dependency upon others and no meaningful engagement with the community of adults.
  • Drifter – Psychological signs: possesses greater psychological resources than the Refuser (i.e., intelligence, charisma). Personality signs: is apathetic toward applying those psychological resources. Social signs: has no meaningful engagement with or commitment to adult communities.
  • Searcher – Psychological signs: has a sense of dissatisfaction due to high personal and social expectations. Personality signs: shows disdain for imperfections within the community. Social signs: interacts to some degree with role-models, but ultimately these relationships are abandoned.
  • Guardian – Psychological signs: possesses clear personal values and attitudes, but also a deep fear of change. Personality signs: sense of personal identity is almost exhausted by sense of social identity. Social signs: has an extremely rigid sense of social identity and strong identification with adult communities.
  • Resolver – Psychological signs: consciously desires self-growth. Personality signs: accepts personal skills and competencies and uses them actively. Social signs: is responsive to communities that provide opportunity for self-growth.

Kenneth Gergen formulated additional classifications, which include the strategic manipulator, the pastiche personality, and the relational self. The strategic manipulator is a person who begins to regard all senses of identity merely as role-playing exercises, and who gradually becomes alienated from their social self. The pastiche personality abandons all aspirations toward a true or "essential" identity, instead viewing social interactions as opportunities to play out, and hence become, the roles they play. Finally, the relational self is a perspective by which persons abandon all sense of exclusive self, and view all sense of identity in terms of social engagement with others. For Gergen, these strategies follow one another in phases, and they are linked to the increase in popularity of postmodern culture and the rise of telecommunications technology.

In social anthropology

Anthropologists have most frequently employed the term identity to refer to this idea of selfhood in a loosely Eriksonian way: properties based on the uniqueness and individuality which make a person distinct from others. Identity became of more interest to anthropologists with the emergence of modern concerns with ethnicity and social movements in the 1970s. This was reinforced by an appreciation, following the trend in sociological thought, of the manner in which the individual is affected by and contributes to the overall social context. At the same time, the Eriksonian approach to identity remained in force, with the result that identity has continued until recently to be used in a largely socio-historical way to refer to qualities of sameness in relation to a person's connection to others and to a particular group of people.

Two broad approaches to identity can be distinguished. The first favours a primordialist approach, which takes the sense of self and belonging to a collective group as a fixed thing, defined by objective criteria such as common ancestry and common biological characteristics. The second, rooted in social constructionist theory, takes the view that identity is formed by a predominantly political choice of certain characteristics. In so doing, it questions the idea that identity is a natural given, characterised by fixed, supposedly objective criteria. Both approaches need to be understood in their respective political and historical contexts, characterised by debate on issues of class, race and ethnicity. While they have been criticized, they continue to exert an influence on approaches to the conceptualisation of identity today.

These different explorations of 'identity' demonstrate how difficult a concept it is to pin down. Since identity is a virtual thing, it is impossible to define it empirically. Discussions of identity use the term with different meanings, from fundamental and abiding sameness to fluidity, contingency, negotiation, and so on. Brubaker and Cooper note a tendency in many scholars to confuse identity as a category of practice and as a category of analysis. Indeed, many scholars demonstrate a tendency to follow their own preconceptions of identity, following more or less the frameworks listed above, rather than taking into account the mechanisms by which the concept is crystallised as reality. In this environment, some analysts, such as Brubaker and Cooper, have suggested doing away with the concept completely. Others, by contrast, have sought to introduce alternative concepts in an attempt to capture the dynamic and fluid qualities of human social self-expression. Stuart Hall, for example, suggests treating identity as a process, to take into account the reality of diverse and ever-changing social experience. Some scholars have introduced the idea of identification, whereby identity is perceived as made up of different components that are 'identified' and interpreted by individuals. The construction of an individual sense of self is achieved by personal choices regarding who and what to associate with. Such approaches are liberating in their recognition of the role of the individual in social interaction and the construction of identity.

Anthropologists have contributed to the debate by shifting the focus of research: One of the first challenges for the researcher wishing to carry out empirical research in this area is to identify an appropriate analytical tool. The concept of boundaries is useful here for demonstrating how identity works. In the same way as Barth, in his approach to ethnicity, advocated the critical focus for investigation as being "the ethnic boundary that defines the group rather than the cultural stuff that it encloses", social anthropologists such as Cohen and Bray have shifted the focus of analytical study from identity to the boundaries that are used for purposes of identification. If identity is a kind of virtual site in which the dynamic processes and markers used for identification are made apparent, boundaries provide the framework on which this virtual site is built. They concentrated on how the idea of community belonging is differently constructed by individual members and how individuals within the group conceive ethnic boundaries.

As a non-directive and flexible analytical tool, the concept of boundaries helps both to map and to define the changeability and mutability that are characteristic of people's experiences of the self in society. While identity is a volatile, flexible and abstract 'thing', its manifestations and the ways in which it is exercised are often open to view. Identity is made evident through the use of markers such as language, dress, behaviour and choice of space, whose effect depends on their recognition by other social beings. Markers help to create the boundaries that define similarities or differences between the marker wearer and the marker perceivers; their effectiveness depends on a shared understanding of their meaning. In a social context, misunderstandings can arise due to a misinterpretation of the significance of specific markers. Equally, an individual can use markers of identity to exert influence on other people without necessarily fulfilling all the criteria that an external observer might typically associate with such an abstract identity.

Boundaries can be inclusive or exclusive depending on how they are perceived by other people. An exclusive boundary arises, for example, when a person adopts a marker that imposes restrictions on the behaviour of others. An inclusive boundary is created, by contrast, by the use of a marker with which other people are ready and able to associate. At the same time, however, an inclusive boundary will also impose restrictions on the people it has included by limiting their inclusion within other boundaries. An example of this is the use of a particular language by a newcomer in a room full of people speaking various languages. Some people may understand the language used by this person while others may not. Those who do not understand it might take the newcomer's use of this particular language merely as a neutral sign of identity. But they might also perceive it as imposing an exclusive boundary that is meant to mark them off from the person. On the other hand, those who do understand the newcomer's language could take it as an inclusive boundary, through which the newcomer associates themself with them to the exclusion of the other people present. Equally, however, it is possible that people who do understand the newcomer but who also speak another language may not want to speak the newcomer's language and so see their marker as an imposition and a negative boundary. The newcomer may or may not be aware of this, depending on whether they themselves know other languages or are conscious of the plurilingual quality of the people there and respectful of it.

In religion

A religious identity is the set of beliefs and practices generally held by an individual, involving adherence to codified beliefs and rituals and study of ancestral or cultural traditions, writings, history, mythology, and faith and mystical experience. Religious identity refers to the personal practices related to communal faith along with rituals and communication stemming from such conviction. This identity formation begins with an association in the parents' religious contacts, and individuation requires that the person chooses the same or different religious identity than that of their parents.

The Parable of the Lost Sheep is one of the parables of Jesus. It is about a shepherd who leaves his flock of ninety-nine sheep in order to find the one which is lost. The parable is an example of the rediscovery of identity. Its aim is to lay bare the nature of the divine response to the recovery of the lost, with the lost sheep representing a lost human being.

Christian meditation is a specific form of personality formation, though the term is often used by certain practitioners to describe various forms of prayer and the process of coming to know and contemplate God.

In Western culture, personal and secular identity have been deeply influenced by the formation of Christianity. Throughout history, various Western thinkers who contributed to the development of European identity were influenced by classical cultures and incorporated elements of Greek as well as Jewish culture, giving rise to movements such as Philhellenism and Philosemitism.

Implications

Due to the multiple functions of identity, which include self-regulation, self-concept, personal control, meaning, and direction, its implications are woven into many aspects of life.

Identity changes

Contexts Influencing Identity Changes

Identity transformations can occur in various contexts, some of which include:

  1. Career Change: When individuals undergo significant shifts in their career paths or occupational identities, they face the challenge of redefining themselves within a new professional context.
  2. Gender Identity Transition: Individuals experiencing gender dysphoria may embark on a journey to align their lives with their true gender identity. This process involves profound personal and social changes to establish an authentic sense of self.
  3. National Immigration: Relocating to a new country necessitates adaptation to unfamiliar societal norms, leading to adjustments in cultural, social, and occupational identities.
  4. Identity Change due to Climate Migration: In the face of environmental challenges and forced displacement, individuals may experience shifts in their identity as they adapt to new geographical locations and cultural contexts.
  5. Adoption: Adoption entails exploring alternative familial features and reconciling with the experience of being adopted, which can significantly impact an individual's self-identity.
  6. Illness Diagnosis: The diagnosis of an illness can provoke an identity shift, altering an individual's self-perception and influencing how they navigate life. Additionally, illnesses may result in changes in abilities, which can affect occupational identity and require adaptations.

Immigration and identity

Immigration and acculturation often lead to shifts in social identity. The extent of this change depends on the disparities between the individual's heritage culture and the culture of the host country, as well as the level of adoption of the new culture versus the retention of the heritage culture. However, the effects of immigration and acculturation on identity can be moderated if the person possesses a strong personal identity. This established personal identity can serve as an "anchor" and play a "protective role" during the process of social and cultural identity transformations that occur.

Occupational identity

Identity is an ongoing and dynamic process that impacts an individual's ability to navigate life's challenges and cultivate a fulfilling existence. Within this process, occupation emerges as a significant factor that allows individuals to express and maintain their identity. Occupation encompasses not only careers or jobs but also activities such as travel, volunteering, sports, or caregiving. However, when individuals face limitations in their ability to participate or engage in meaningful activities, such as due to illness, it poses a threat to the active process and continued development of identity. Feeling socially unproductive can have detrimental effects on one's social identity. Importantly, the relationship between occupation and identity is bidirectional; occupation contributes to the formation of identity, while identity shapes decisions regarding occupational choices. Furthermore, individuals inherently seek a sense of control over their chosen occupation and strive to avoid stigmatizing labels that may undermine their occupational identity.

Navigating stigma and occupational identity

In the realm of occupational identity, individuals make choices regarding employment based on the stigma associated with certain jobs. Likewise, those already working in stigmatized occupations may employ personal rationalization to justify their career path. Factors such as workplace satisfaction and overall quality of life play significant roles in these decisions. Individuals in such jobs face the challenge of forging an identity that aligns with their values and beliefs. Crafting a positive self-concept becomes more arduous when societal standards label their work as "dirty" or undesirable. Consequently, some individuals opt not to define themselves solely by their occupation but strive for a holistic identity that encompasses all aspects of their lives, beyond their job or work. On the other hand, individuals whose identity strongly hinges on their occupation may experience a crisis if they become unable to perform their chosen work. Therefore, occupational identity necessitates an active and adaptable process that ensures both adaptation and continuity amid shifting circumstances.

Factors shaping the concept of identity

The modern notion of personal identity as a distinct and unique characteristic of individuals has evolved relatively recently in history, beginning with the first passports in the early 1900s and later becoming more popular as a social science term in the 1950s. Several factors have influenced its evolution, including:

  1. Protestant Influence: In Western societies, the Protestant tradition has underscored individuals' responsibility for their own soul or spiritual well-being, contributing to a heightened focus on personal identity.
  2. Development of Psychology: The emergence of psychology as a separate field of knowledge and study starting in the 19th century has played a significant role in shaping our understanding of identity.
  3. Rise of Privacy: The Renaissance era witnessed a growing sense of privacy, leading to increased attention and importance placed on individual identities.
  4. Specialization in Work: The industrial period brought about a shift from undifferentiated roles in feudal systems to specialized worker roles. This change impacted how individuals identified themselves in relation to their occupations.
  5. Occupation and Identity: The concept of occupation as a crucial aspect of identity was introduced by Christiansen in 1999, highlighting the influence of employment and work roles on an individual's sense of self.
  6. Focus on Gender Identity: There has been an increased emphasis on gender identity, including issues related to gender dysphoria and transgender experiences. These discussions have contributed to a broader understanding of diverse identities.
  7. Relevance of Identity in Personality Pathology: Understanding and assessing personality pathology has highlighted the significance of identity problems in comprehending individuals' psychological well-being.

Equality (mathematics)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Equality_...