Monday, July 11, 2022

Essentialism

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Essentialism

Essentialism is the view that objects have a set of attributes that are necessary to their identity. In early Western thought, Plato's idealism held that all things have such an "essence"—an "idea" or "form". In Categories, Aristotle similarly proposed that all objects have a substance that, as George Lakoff put it, "make the thing what it is, and without which it would be not that kind of thing". The contrary view—non-essentialism—denies the need to posit such an "essence".

Essentialism has been controversial from its beginning. Plato, in the Parmenides dialogue, depicts Socrates questioning the notion, suggesting that if we accept the idea that every beautiful thing or just action partakes of an essence to be beautiful or just, we must also accept the "existence of separate essences for hair, mud, and dirt". In biology and other natural sciences, essentialism provided the rationale for taxonomy at least until the time of Charles Darwin; the role and importance of essentialism in biology is still a matter of debate.

Historically, beliefs which posit that social identities such as ethnicity, nationality or gender are essential characteristics have in many cases been shown to have destructive or harmful results. Some have argued that essentialist thinking lies at the core of many reductive, discriminatory or extremist ideologies. Psychological essentialism is also correlated with racial prejudice. In medical sciences, essentialism can lead to a reified view of identities—for example assuming that differences in hypertension in African American populations are due to racial differences rather than social causes—leading to fallacious conclusions and potentially unequal treatment. Older social theories were often conceptually essentialist.

In philosophy

An essence characterizes a substance or a form, in the sense of the forms and ideas in Platonic idealism. It is permanent, unalterable, and eternal, and is present in every possible world. Classical humanism has an essentialist conception of the human, in its endorsement of the notion of an eternal and unchangeable human nature. This has been criticized by Kierkegaard, Marx, Heidegger, Sartre, and many other existential and materialist thinkers.

In Plato's philosophy (in particular, the Timaeus and the Philebus), things were said to come into being by the action of a demiurge who works to form chaos into ordered entities. Many definitions of essence hark back to the ancient Greek hylomorphic understanding of the formation of the things. According to that account, the structure and real existence of any thing can be understood by analogy to an artefact produced by a craftsperson. The craftsperson requires hyle (timber or wood) and a model, plan or idea in their own mind, according to which the wood is worked to give it the indicated contour or form (morphe). Aristotle was the first to use the terms hyle and morphe. According to his explanation, all entities have two aspects: "matter" and "form". It is the particular form imposed that gives some matter its identity—its quiddity or "whatness" (i.e., its "what it is").

Plato was one of the first essentialists, postulating the concept of ideal forms—an abstract entity of which individual objects are mere facsimiles. To give an example: the ideal form of a circle is a perfect circle, something that is physically impossible to make manifest; yet the circles we draw and observe clearly have some idea in common—the ideal form. Plato proposed that these ideas are eternal and vastly superior to their manifestations, and that we understand these manifestations in the material world by comparing and relating them to their respective ideal form. Plato's forms are regarded as patriarchs to essentialist dogma simply because they are a case of what is intrinsic and a-contextual of objects—the abstract properties that make them what they are. (For more on forms, read Plato's parable of the cave.)

Karl Popper splits the ambiguous term realism into essentialism and realism. He uses essentialism whenever he means the opposite of nominalism, and realism only as opposed to idealism. Popper himself is a realist as opposed to an idealist, but a methodological nominalist as opposed to an essentialist. For example, statements like "a puppy is a young dog" should be read from right to left, as an answer to "What shall we call a young dog?"; never from left to right as an answer to "What is a puppy?"

Metaphysical essentialism

Essentialism, in its broadest sense, is any philosophy that acknowledges the primacy of essence. Unlike existentialism, which posits "being" as the fundamental reality, the essentialist ontology must be approached from a metaphysical perspective. Empirical knowledge is developed from experience of a relational universe whose components and attributes are defined and measured in terms of intellectually constructed laws. Thus, for the scientist, reality is explored as an evolutionary system of diverse entities, the order of which is determined by the principle of causality.

Plato believed that the universe was perfect and that its observed imperfections came from man's limited perception of it. For Plato, there were two realities: the "essential" or ideal and the "perceived". Aristotle (384–322 BC) applied the term essence to that which things in a category have in common and without which they cannot be members of that category (for example, rationality is the essence of man; without rationality a creature cannot be a man). In his critique of Aristotle's philosophy, Bertrand Russell said that his concept of essence transferred to metaphysics what was only a verbal convenience and that it confused the properties of language with the properties of the world. In fact, a thing's "essence" consisted in those defining properties without which we could not use the name for it. Although the concept of essence was "hopelessly muddled", it became part of every philosophy until modern times.

The Egyptian-born philosopher Plotinus (204–270 AD) brought idealism to the Roman Empire as Neoplatonism, and with it the concept that not only do all existents emanate from a "primary essence" but that the mind plays an active role in shaping or ordering the objects of perception, rather than passively receiving empirical data.

Despite the metaphysical basis for the term, academics in science, aesthetics, heuristics, psychology, and gender-based sociological studies have advanced their causes under the banner of essentialism. Possibly the clearest definition for this philosophy was offered by gay/lesbian rights advocate Diana Fuss, who wrote: "Essentialism is most commonly understood as a belief in the real, true essence of things, the invariable and fixed properties which define the 'whatness' of a given entity." Metaphysical essentialism stands diametrically opposed to existential realism in that finite existence is only differentiated appearance, whereas "ultimate reality" is held to be absolute essence.

In psychology

Paul Bloom attempts to explain why people will pay more in an auction for the clothing of celebrities if the clothing is unwashed. He believes the answer to this and many other questions is that people cannot help but think of objects as containing a sort of "essence" that can be influenced.

There is a difference between metaphysical essentialism (see above) and psychological essentialism, the latter referring not to an actual claim about the world but a claim about a way of representing entities in cognitions (Medin, 1989). Influential in this area is Susan Gelman, who has outlined many domains in which children and adults construe classes of entities, particularly biological entities, in essentialist terms—i.e., as if they had an immutable underlying essence which can be used to predict unobserved similarities between members of that class. (Toosi & Ambady, 2011). This causal relationship is unidirectional; an observable feature of an entity does not define the underlying essence (Dar-Nimrod & Heine, 2011).

In developmental psychology

Essentialism has emerged as an important concept in psychology, particularly developmental psychology. Gelman and Kremer (1991) studied the extent to which children aged 4 to 7 demonstrate essentialism. Children were able to identify the cause of behaviour in living and non-living objects, understanding that underlying essences predicted observable behaviours. Participants could correctly describe living objects' behaviour as self-perpetuated and non-living objects' behaviour as the result of an adult influencing the object's actions. This is a biological way of representing essential features in cognition. Understanding the underlying causal mechanism for behaviour suggests essentialist thinking (Rangel and Keller, 2011). Younger children were unable to identify causal mechanisms of behaviour whereas older children were able to, which suggests that essentialism is rooted in cognitive development. It can be argued that there is a shift in the way children represent entities, from not understanding the causal mechanism of the underlying essence to showing sufficient understanding (Demoulin, Leyens & Yzerbyt, 2006).

There are four key criteria that constitute essentialist thinking. The first facet is the aforementioned individual causal mechanisms (del Rio & Strasser, 2011). The second is innate potential: the assumption that an object will fulfill its predetermined course of development (Kanovsky, 2007). According to this criterion, essences predict developments in entities that will occur throughout their lifespan. The third is immutability (Holtz & Wagner, 2009): altering the superficial appearance of an object does not remove its essence, and observable changes in an entity's features are not salient enough to alter its essential characteristics. The fourth is inductive potential (Birnbaum, Deeb, Segall, Ben-Aliyahu & Diesendruck, 2010). This suggests that entities may share common features but are essentially different. However similar two beings may be, their characteristics will be at most analogous, differing most importantly in essences.

The implications of psychological essentialism are numerous. Prejudiced individuals have been found to endorse exceptionally essentialist ways of thinking, suggesting that essentialism may perpetuate exclusion among social groups (Morton, Hornsey & Postmes, 2009). For example, essentialism of nationality has been linked to anti-immigration attitudes (Rad & Ginges, 2018). In multiple studies in India and the United States, Rad & Ginges (2018) showed that in the lay view, a person's nationality is considerably fixed at birth, even if that person is adopted and raised by a family of another nationality from day one and never told about their origin. This may be due to an over-extension of an essential-biological mode of thinking stemming from cognitive development. Paul Bloom of Yale University has stated that "one of the most exciting ideas in cognitive science is the theory that people have a default assumption that things, people and events have invisible essences that make them what they are. Experimental psychologists have argued that essentialism underlies our understanding of the physical and social worlds, and developmental and cross-cultural psychologists have proposed that it is instinctive and universal. We are natural-born essentialists." Scholars suggest that the categorical nature of essentialist thinking predicts the use of stereotypes and can be targeted in the application of stereotype prevention (Bastian & Haslam, 2006).

In ethics

Classical essentialists claim that some things are wrong in an absolute sense. For example, murder breaks a universal, objective and natural moral law and not merely an advantageous, socially or ethically constructed one.

Many modern essentialists claim that right and wrong are moral boundaries that are individually constructed; in other words, things that are ethically right or wrong are actions that the individual deems to be beneficial or harmful, respectively. 

In biology

Before evolution was developed as a scientific theory, there existed an essentialist view of biology that posited all species to be unchanging throughout time. The historian Mary P. Winsor has argued that biologists such as Louis Agassiz in the 19th century believed that taxa such as species and genus were fixed, reflecting the mind of the creator. Some religious opponents of evolution continue to maintain this view of biology.

Recent work by historians of systematic biology has, however, cast doubt upon this view of pre-Darwinian thinkers. Winsor, Ron Amundson and Staffan Müller-Wille have each argued that in fact the usual suspects (such as Linnaeus and the Ideal Morphologists) were very far from being essentialists. The so-called "essentialism story" (or "myth") in biology appears to be the result of conflating the views expressed by philosophers from Aristotle through to John Stuart Mill and William Whewell in the immediately pre-Darwinian period, who used biological examples, with the use in biology of terms like species.

Gender essentialism

In feminist theory and gender studies, gender essentialism is the attribution of fixed essences to men and women—the idea that men and women are fundamentally different continues to be a matter of contention. Women's essence is assumed to be universal and is generally identified with those characteristics viewed as being specifically feminine. These ideas of femininity are usually biologized and are often preoccupied with psychological characteristics, such as nurturance, empathy, support, and non-competitiveness. Feminist theorist Elizabeth Grosz states in her 1995 publication Space, time and perversion: essays on the politics of bodies that essentialism "entails the belief that those characteristics defined as women's essence are shared in common by all women at all times. It implies a limit of the variations and possibilities of change—it is not possible for a subject to act in a manner contrary to her essence. Her essence underlies all the apparent variations differentiating women from each other. Essentialism thus refers to the existence of fixed characteristics, given attributes, and ahistorical functions that limit the possibilities of change and thus of social reorganization."

Gender essentialism is pervasive in popular culture, as illustrated by the #1 New York Times best seller Men Are from Mars, Women Are from Venus, but this essentialism is routinely critiqued in introductory women's studies textbooks such as Women: Images & Realities.

Starting in the 1980s, some feminist writers have put forward essentialist theories about gender and science. Evelyn Fox Keller, Sandra Harding, and Nancy Tuana argued that the modern scientific enterprise is inherently patriarchal and incompatible with women's nature. Other feminist scholars, such as Ann Hibner Koblitz, Lenore Blum, Mary Gray, Mary Beth Ruskai, Pnina Abir-Am, and Dorinda Outram, have criticized those theories for ignoring the diverse nature of scientific research and the tremendous variation in women's experiences in different cultures and historical periods.

In historiography

Essentialism in history as a field of study entails discerning and listing essential cultural characteristics of a particular nation or culture, in the belief that a people or culture can be understood in this way. Sometimes such essentialism leads to claims of a praiseworthy national or cultural identity, or to its opposite, the condemnation of a culture based on presumed essential characteristics. Herodotus, for example, claims that Egyptian culture is essentially feminized and possesses a "softness" which has made Egypt easy to conquer. To what extent Herodotus was an essentialist is a matter of debate; he is also credited with not essentializing the concept of the Athenian identity, or differences between the Greeks and the Persians that are the subject of his Histories.

Essentialism had been operative in colonialism as well as in critiques of colonialism.

Post-colonial theorists such as Edward Said insisted that essentialism was the "defining mode" of "Western" historiography and ethnography until the nineteenth century and even after, according to Touraj Atabaki, manifesting itself in the historiography of the Middle East and Central Asia as Eurocentrism, over-generalization, and reductionism.

Today, most historians, social scientists and humanists reject methodologies associated with essentialism, though some have argued that certain varieties of essentialism may be useful or even necessary.

Language ideology

From Wikipedia, the free encyclopedia

Language ideology (also known as linguistic ideology or language attitude) is, within anthropology (especially linguistic anthropology), sociolinguistics, and cross-cultural studies, any set of beliefs about languages as they are used in their social worlds. When recognized and explored, language ideologies expose how the speakers' linguistic beliefs are linked to the broader social and cultural systems to which they belong, illustrating how the systems beget such beliefs. By doing so, language ideologies link implicit and explicit assumptions about a language or language in general to their social experience as well as their political and economic interests. Language ideologies are conceptualizations about languages, speakers, and discursive practices. Like other kinds of ideologies, language ideologies are influenced by political and moral interests, and they are shaped in a cultural setting.

Applications and approaches

Definitions

Scholars have noted difficulty in attempting to delimit the scope, meaning, and applications of language ideology. Paul Kroskrity, a linguistic anthropologist, describes language ideology as a "cluster concept, consisting of a number of converging dimensions" with several "partially overlapping but analytically distinguishable layers of significance", and cites that in the existing scholarship on language ideology "there is no particular unity . . . no core literature, and a range of definitions." One of the broadest definitions is offered by Alan Rumsey, who describes language ideologies as "shared bodies of commonsense notions about the nature of language in the world." This definition is seen by Kroskrity as unsatisfactory, however, because "it fails to problematize language ideological variation and therefore promotes an overly homogeneous view of language ideologies within a cultural group." Emphasizing the role of speakers' awareness in influencing language structure, Michael Silverstein defines linguistic ideologies as "sets of beliefs about language articulated by users as a rationalization or justification of perceived language structure and use." Definitions that place greater emphasis on sociocultural factors include Shirley Heath's characterization of language ideologies as "self-evident ideas and objectives a group holds concerning roles of language in the social experiences of members as they contribute to the expression of the group", as well as Judith Irvine's definition of the concept as "the cultural system of ideas about social and linguistic relationships, together with their loading of moral and political interests."

Critical vs. neutral approaches

The basic division in studies of language ideology is between neutral and critical approaches to ideology. In neutral approaches to language ideology, beliefs or ideas about a language are understood to be shaped by the cultural systems in which it is embedded, but no variation within or across these systems is identified. Often, a single ideology will be identified in such cases. Characterizations of language ideology as representative of one community or culture, such as those routinely documented in ethnographic research, are common examples of neutral approaches to language ideology.

Critical approaches to language ideology explore the capacity for language and linguistic ideologies to be used as strategies for maintaining social power and domination. They are described by Kathryn Woolard and Bambi Schieffelin as studies of "some aspects of representation and social cognition, with particular social origins or functional and formal characteristics." Although such studies are often noted for their discussions of language politics and the intersection between language and social class, the crucial difference between these approaches to language ideology and neutral understandings of the concept is that the former emphasize the existence of variability and contradiction both within and amongst ideologies, while the latter approach ideology as a conception on its own terms.

Areas of inquiry

Language use and structure

Many scholars have argued that ideology plays a role in shaping and influencing linguistic structures and speech forms. Michael Silverstein, for example, sees speakers’ awareness of language and their rationalizations of its structure and use as critical factors that often shape the evolution of a language's structure. According to Silverstein, the ideologies speakers possess regarding language mediate the variation that occurs due to their imperfect and limited awareness of linguistic structures, resulting in the regularization of any variation that is rationalized by any sufficiently dominant or culturally widespread ideologies. This is demonstrated by such linguistic changes as the rejection of “he” as the generic pronoun in English, which coincided with the rise of the feminist movement in the second half of the twentieth century. In this instance, the accepted usage of the masculine pronoun as the generic form came to be understood as a linguistic symbol of patriarchal and male-dominated society, and the growing sentiment opposing these conditions motivated some speakers to stop using “he” as the generic pronoun in favor of the construction “he or she.” This rejection of generic “he” was rationalized by the growing desire for gender equality and women's empowerment, which was sufficiently culturally prevalent to regularize the change.

Alan Rumsey also sees linguistic ideologies as playing a role in shaping the structure of a language, describing a circular process of reciprocal influence where a language's structure conditions the ideologies that affect it, which in turn reinforce and expand this structure, altering the language "in the name of making it more like itself." This process is exemplified by the excessive glottalization of consonants by bilingual speakers of moribund varieties of Xinca, who effectively altered the structure of this language in order to make it more distinct from Spanish. These speakers glottalized consonants in situations where more competent speakers of Xinca would not, both because they were less familiar with the phonological rules of the language and because they wished to distinguish themselves from the socially dominant Spanish speakers, who viewed glottalized consonants as "exotic."

Ethnography of speaking

Studies of "ways of speaking" within specific communities have been recognized as especially productive sites of research in language ideology. They often include a community's own theory of speech as a part of their ethnography, which allows for the documentation of explicit language ideologies on a community-wide level or in “the neutral sense of cultural conceptions.” A study of language socialization practices in Dominica, for example, revealed that local notions of personhood, status, and authority are associated with the strategic usage of Patwa and English in the course of the adult-child interaction. The use of Patwa by children is largely forbidden by adults due to a perception that it inhibits the acquisition of English, thus restricting social mobility, which in turn has imbued Patwa with a significant measure of covert prestige and rendered it a powerful tool for children to utilize in order to defy authority. Thus there are many competing ideologies of Patwa in Dominica: one which encourages a shift away from Patwa usage and another which contributes to its maintenance.

Linguistic ideologies in speech act theory

J. L. Austin and John Searle's speech act theory has been described by several ethnographers, anthropologists, and linguists as being based in a specifically Western linguistic ideology that renders it inapplicable in certain ethnographic contexts. Jef Verschueren characterized speech act theory as privileging "a privatized view of language that emphasizes the psychological state of the speaker while downplaying the social consequences of speech," while Michael Silverstein argued that the theory's ideas about language "acts" and "forces" are "projections of covert categories typical in the metapragmatic discourse of languages such as English." Scholars have subsequently used these critiques of speech act theory to caution against positioning linguistic theories as universally applicable, noting that any account of language will reflect the linguistic ideologies held by those who develop it.

Language contact and multilingualism

Several scholars have noted that sites of cultural contact promote the development of new linguistic forms that draw on diverse language varieties and ideologies at an accelerated rate. According to Miki Makihara and Bambi Schieffelin, it becomes necessary during times of cultural contact for speakers to actively negotiate language ideologies and to consciously reflect on language use. This articulation of ideology is essential to prevent misconceptions of meaning and intentions between cultures, and provides a link between sociocultural and linguistic processes in contact situations.

Language policy and standardization

The establishment of a standard language has many implications in the realms of politics and power. Recent examinations of language ideologies have resulted in the conception of "standard" as a matter of ideology rather than fact, raising questions such as "how doctrines of linguistic correctness and incorrectness are rationalized and how they are related to doctrines of the inherent representational power, beauty, and expressiveness of language as a valued mode of action."

Language policy

Governmental policies often reflect the tension between two contrasting types of language ideologies: ideologies that conceive of language as a resource, problem, or right and ideologies that conceive of language as pluralistic phenomena. The linguistic policies that emerge in such instances often reflect a compromise between both types of ideologies. According to Blommaert and Verschueren, this compromise is often reinterpreted as a single, unified ideology, evidenced by the many European societies characterized by a language ideological homogenism.

Ideologies of linguistic purism

Purist language ideologies, or ideologies of linguistic conservatism, can close off languages to nonnative sources of innovation, usually when such sources are perceived as socially or politically threatening to the target language. Among the Tewa, for example, the influence of theocratic institutions and of ritualized linguistic forms in other domains of Tewa society has led to a strong resistance to the extensive borrowing and shift that neighboring speech communities have experienced. According to Paul Kroskrity, this is due to a "dominant language ideology" through which ceremonial Kiva speech is elevated to a linguistic ideal, and the cultural preferences it embodies—regulation by convention, indigenous purism, strict compartmentalization, and linguistic indexing of identity—are recursively projected onto the Tewa language as a whole.

Alexandra Jaffe points out that language purism is often part of “essentializing discourses” that can lead to stigmatizing habitual language practices like code-switching and depict contact-induced linguistic changes as forms of cultural deficiency.

Standard language ideology

As defined by Rosina Lippi-Green, standard language ideology is "a bias toward an abstract, idealized homogeneous language, which is imposed and maintained by dominant institutions and which has as its model the written language, but which is drawn primarily from the spoken language of the upper middle class." According to Lippi-Green, part of this ideology is a belief that standard languages are internally consistent. Linguists generally agree, however, that variation is intrinsic to all spoken language, including standard varieties.

Standard language ideology is strongly connected with the concepts of linguistic purism and prescriptivism. It is also linked with linguicism (linguistic discrimination).

Literacy

Literacy cannot be strictly defined in technical terms; rather, it is a set of practices determined by a community's language ideology. It can be interpreted in many ways that are determined by political, social, and economic forces. According to Kathryn Woolard and Bambi Schieffelin, literacy traditions are closely linked to social control in most societies. The typical European literacy ideology, for example, recognizes literacy solely in an alphabetic capacity.

Kaluli literacy development

In the 1960s, missionaries arrived in Papua New Guinea and exposed the Kaluli to Christianity and modernization, part of which was accomplished through the introduction of literacy. The Kaluli primers that were introduced by the missionaries promoted Westernization, which effectively served to strip the vernacular language from cultural practices and from discourse in church and school. Readers written in the 1970s used derogatory terms to refer to the Kaluli and depicted their practices as inferior, motivating the Kaluli to change their self-perceptions and orient themselves towards Western values. The missionaries' control of these authoritative books and of this new "technology of language literacy" gave them the power to effect culture change and reshape Kaluli ideology into that of modern Christianity.

Orthography

Orthographic systems always carry historical, cultural, and political meaning that is grounded in ideology. Orthographic debates are focused on political and social issues rather than on linguistic discrepancies, which can make for intense debates characterized by ideologically charged stances and symbolically important decisions.

Classroom practice/second language acquisition

"Language ideologies are not confined merely to ideas or beliefs, but rather is extended to include the very language practices through which our ideas or notions are enacted" (Razfar, 2005). Teachers display their language ideologies in classroom instruction through various practices such as correction or repair, affective alignment, metadiscourse, and narrative (see Razfar & Rumenapp, 2013, p. 289). The study of ideology seeks to uncover the hidden world of students and teachers to shed light on the fundamental forces that shape and give meaning to their actions and interactions.

Subatomic particle

From Wikipedia, the free encyclopedia
 
A proton, a composite particle, is made of two up quarks and one down quark, which are elementary particles

In physical sciences, a subatomic particle is a particle that composes an atom. According to the Standard Model of particle physics, a subatomic particle can be either a composite particle, which is composed of other particles (for example, a proton, neutron, or meson), or an elementary particle, which is not composed of other particles (for example, an electron, photon, or muon). Particle physics and nuclear physics study these particles and how they interact.
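As the caption above notes, a proton is made of two up quarks and one down quark; its charge of +1 (in units of the elementary charge) follows from summing the quark charges of +2/3 (up) and −1/3 (down). A minimal sketch of that arithmetic, using exact fractions (an illustrative helper, not standard physics-library code):

```python
from fractions import Fraction

# Electric charges of the up and down quarks, in units of the elementary charge e
QUARK_CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

def composite_charge(quark_content: str) -> Fraction:
    """Sum the charges of the listed quarks ('u' and 'd' only, for brevity)."""
    return sum(QUARK_CHARGE[q] for q in quark_content)

print(composite_charge("uud"))  # proton  -> 1
print(composite_charge("udd"))  # neutron -> 0
```

The same sum over "udd" gives the neutron's charge of zero: 2/3 − 1/3 − 1/3 = 0.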

Experiments show that light can behave like a stream of particles (called photons) as well as exhibiting wave-like properties. This led to the concept of wave–particle duality to reflect that quantum-scale particles behave like both particles and waves; they are sometimes described as "wavicles" to reflect this.

Another concept, the uncertainty principle, states that some of their properties taken together, such as their simultaneous position and momentum, cannot be measured exactly. The wave–particle duality has been shown to apply not only to photons but to more massive particles as well.
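
As a rough numerical illustration (the constants and the electron-confinement scenario are additions of mine, not from the article), the uncertainty bound Δx·Δp ≥ ħ/2 can be evaluated directly:

```python
# Illustrative sketch: the Heisenberg bound delta_x * delta_p >= hbar / 2,
# evaluated for a particle confined to roughly an atomic diameter.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_momentum_uncertainty(delta_x_m: float) -> float:
    """Smallest possible momentum uncertainty (kg*m/s) for a given
    position uncertainty, from delta_x * delta_p >= hbar / 2."""
    return HBAR / (2.0 * delta_x_m)

dp = min_momentum_uncertainty(1e-10)  # confined to ~1 angstrom
print(f"minimum delta_p: {dp:.3e} kg*m/s")
```

The tighter the confinement in position, the larger the unavoidable spread in momentum.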

Interactions of particles in the framework of quantum field theory are understood as creation and annihilation of quanta of corresponding fundamental interactions. This blends particle physics with field theory.

Even among particle physicists, the exact definition of a particle varies, and professionals have offered a number of competing formal definitions.

Classification

By composition

Subatomic particles are either "elementary", i.e. not made of multiple other particles, or "composite" and made of more than one elementary particle bound together.

The elementary particles of the Standard Model are the quarks, the leptons, the gauge bosons, and the Higgs boson.

The Standard Model classification of particles

All of these have now been discovered by experiments, with the latest being the top quark (1995), tau neutrino (2000), and Higgs boson (2012).

Various extensions of the Standard Model predict the existence of an elementary graviton particle and many other elementary particles, but none have been discovered as of 2021.

Hadrons

Nearly all composite particles contain multiple quarks (and/or antiquarks) bound together by gluons (with a few exceptions with no quarks, such as positronium and muonium). Those containing few (≤ 5) quarks (including antiquarks) are called hadrons. Due to a property known as color confinement, quarks are never found singly but always occur in hadrons containing multiple quarks. The hadrons are divided by number of quarks (including antiquarks) into the baryons containing an odd number of quarks (almost always 3), of which the proton and neutron (the two nucleons) are by far the best known; and the mesons containing an even number of quarks (almost always 2, one quark and one antiquark), of which the pions and kaons are the best known.

Except for the proton and neutron, all other hadrons are unstable and decay into other particles in microseconds or less. A proton is made of two up quarks and one down quark, while the neutron is made of two down quarks and one up quark. These commonly bind together into an atomic nucleus, e.g. a helium-4 nucleus is composed of two protons and two neutrons. Most hadrons do not live long enough to bind into nucleus-like composites; those that do (other than the proton and neutron) form exotic nuclei.

By statistics

Any subatomic particle, like any particle in the three-dimensional space that obeys the laws of quantum mechanics, can be either a boson (with integer spin) or a fermion (with odd half-integer spin).

In the Standard Model, all the elementary fermions have spin 1/2, and are divided into the quarks which carry color charge and therefore feel the strong interaction, and the leptons which do not. The elementary bosons comprise the gauge bosons (photon, W and Z, gluons) with spin 1, while the Higgs boson is the only elementary particle with spin zero.

The hypothetical graviton is required theoretically to have spin 2, but is not part of the Standard Model. Some extensions such as supersymmetry predict additional elementary particles with spin 3/2, but none have been discovered as of 2021.

Due to the laws for spin of composite particles, the baryons (3 quarks) have spin either 1/2 or 3/2, and are therefore fermions; the mesons (2 quarks) have integer spin of either 0 or 1, and are therefore bosons.
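
The even/odd rule above can be written down directly. A minimal sketch (the function name is my own, not from the article): a composite of an odd number of spin-1/2 quarks has half-integer total spin and is a fermion; an even number gives integer spin and a boson.

```python
# Sketch: classify a hadron's quantum statistics by its quark count
# (antiquarks count towards the total as well).

def hadron_statistics(n_quarks: int) -> str:
    """Odd quark count -> half-integer spin (fermion);
    even quark count -> integer spin (boson)."""
    return "fermion" if n_quarks % 2 == 1 else "boson"

print(hadron_statistics(3))  # baryon, e.g. proton -> fermion
print(hadron_statistics(2))  # meson, e.g. pion -> boson
```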

By mass

In special relativity, the energy of a particle at rest equals its mass times the speed of light squared, E = mc². That is, mass can be expressed in terms of energy and vice versa. If a particle has a frame of reference in which it lies at rest, then it has a positive rest mass and is referred to as massive.
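
As a worked example of E = mc² (the constants below are standard CODATA-style values quoted from memory, not taken from the article), the electron's rest energy comes out at the familiar ~0.511 MeV:

```python
# Sketch: rest energy E = m * c^2, converted from joules to MeV.

C = 299_792_458.0        # speed of light, m/s (exact by definition)
EV = 1.602176634e-19     # joules per electronvolt (exact by definition)

def rest_energy_mev(mass_kg: float) -> float:
    """Rest energy in MeV for a particle of the given mass."""
    return mass_kg * C**2 / EV / 1e6

electron_mass = 9.1093837015e-31  # kg
print(f"electron rest energy: {rest_energy_mev(electron_mass):.3f} MeV")
```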

All composite particles are massive. Baryons (meaning "heavy") tend to have greater mass than mesons (meaning "intermediate"), which in turn tend to be heavier than leptons (meaning "lightweight"), but the heaviest lepton (the tau particle) is heavier than the two lightest flavours of baryons (nucleons). It is also certain that any particle with an electric charge is massive.

When originally defined in the 1950s, the terms baryons, mesons and leptons referred to masses; however, after the quark model became accepted in the 1970s, it was recognised that baryons are composites of three quarks, mesons are composites of one quark and one antiquark, while leptons are elementary and are defined as the elementary fermions with no color charge.

All massless particles (particles whose invariant mass is zero) are elementary. These include the photon and gluon, although the latter cannot be isolated.

By decay

Most subatomic particles are not stable. All mesons, as well as all baryons except the proton, decay by either the strong or the weak force. Protons are not known to decay, although whether they are "truly" stable is unknown, since some important Grand Unified Theories (GUTs) require that they eventually decay. The muon and the tau lepton, as well as their antiparticles, decay by the weak force. Neutrinos (and antineutrinos) do not decay, but the related phenomenon of neutrino oscillations is thought to exist even in a vacuum. The electron and its antiparticle, the positron, are theoretically stable due to charge conservation, unless a lighter particle having a magnitude of electric charge ≤ e exists (which is unlikely).

Other properties

All observable subatomic particles have an electric charge that is an integer multiple of the elementary charge e. The Standard Model's quarks have "non-integer" electric charges, namely, multiples of 1⁄3 e, but quarks (and other combinations with non-integer electric charge) cannot be isolated due to color confinement. For baryons, mesons, and their antiparticles, the constituent quarks' charges sum to an integer multiple of e.
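
The charge arithmetic can be checked exactly with rational numbers. A small sketch (the hadron contents match the article; the quark charge values are the standard ones, and the helper name is my own):

```python
# Sketch: constituent quark charges, in units of e, always sum to an
# integer for baryons (3 quarks) and mesons (quark + antiquark).

from fractions import Fraction

QUARK_CHARGE = {  # charge in units of e; antiquarks carry the opposite sign
    "u": Fraction(2, 3), "d": Fraction(-1, 3), "s": Fraction(-1, 3),
    "anti-u": Fraction(-2, 3), "anti-d": Fraction(1, 3),
}

def total_charge(quarks):
    """Sum the constituent quark charges, in units of e."""
    return sum(QUARK_CHARGE[q] for q in quarks)

print(total_charge(["u", "u", "d"]))   # proton  -> 1
print(total_charge(["u", "d", "d"]))   # neutron -> 0
print(total_charge(["u", "anti-d"]))   # pi+     -> 1
```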

Through the work of Albert Einstein, Satyendra Nath Bose, Louis de Broglie, and many others, current scientific theory holds that all particles also have a wave nature. This has been verified not only for elementary particles but also for compound particles like atoms and even molecules. In fact, according to traditional formulations of non-relativistic quantum mechanics, wave–particle duality applies to all objects, even macroscopic ones; although the wave properties of macroscopic objects cannot be detected due to their small wavelengths.
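
The point about macroscopic wavelengths can be made concrete with de Broglie's relation λ = h/p. A minimal sketch (the example masses and speeds are illustrative assumptions of mine): a slow electron has a wavelength comparable to atomic dimensions, while a thrown baseball's wavelength is some 24 orders of magnitude smaller than a proton.

```python
# Sketch: de Broglie wavelength lambda = h / (m * v) for a
# non-relativistic particle.

H = 6.62607015e-34  # Planck constant, J*s

def de_broglie_wavelength(mass_kg: float, speed_m_s: float) -> float:
    """Wavelength in metres for a non-relativistic massive particle."""
    return H / (mass_kg * speed_m_s)

electron = de_broglie_wavelength(9.109e-31, 1e6)  # slow electron, ~0.7 nm
baseball = de_broglie_wavelength(0.145, 40.0)     # ~1e-34 m, undetectable
print(f"electron: {electron:.2e} m, baseball: {baseball:.2e} m")
```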

Interactions between particles have been scrutinized for many centuries, and a few simple laws underpin how particles behave in collisions and interactions. The most fundamental of these are the laws of conservation of energy and conservation of momentum, which let us make calculations of particle interactions on scales of magnitude that range from stars to quarks. These are the prerequisite basics of Newtonian mechanics, a series of statements and equations in Philosophiae Naturalis Principia Mathematica, originally published in 1687.

Dividing an atom

The negatively charged electron has a mass equal to about 1⁄1837 of that of a hydrogen atom. The remainder of the hydrogen atom's mass comes from the positively charged proton. The atomic number of an element is the number of protons in its nucleus. Neutrons are neutral particles having a mass slightly greater than that of the proton. Different isotopes of the same element contain the same number of protons but differing numbers of neutrons. The mass number of an isotope is the total number of nucleons (neutrons and protons collectively).
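
The isotope bookkeeping above reduces to simple addition. A minimal sketch (the function and the hydrogen-isotope example are my own illustration; the nuclide data are standard facts):

```python
# Sketch: mass number A = Z (protons) + N (neutrons).
# Isotopes of an element share Z but differ in N.

def mass_number(protons: int, neutrons: int) -> int:
    """Total nucleon count of a nuclide."""
    return protons + neutrons

# The three hydrogen isotopes all have Z = 1:
for name, neutrons in [("protium", 0), ("deuterium", 1), ("tritium", 2)]:
    print(name, "A =", mass_number(1, neutrons))
```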

Chemistry concerns itself with how electron sharing binds atoms into structures such as crystals and molecules. The subatomic particles considered important in the understanding of chemistry are the electron, the proton, and the neutron. Nuclear physics deals with how protons and neutrons arrange themselves in nuclei. The study of subatomic particles, atoms and molecules, and their structure and interactions, requires quantum mechanics. Analyzing processes that change the numbers and types of particles requires quantum field theory. The study of subatomic particles per se is called particle physics. The term high-energy physics is nearly synonymous with "particle physics", since the creation of particles requires high energies: it occurs only as a result of cosmic rays, or in particle accelerators. Particle phenomenology systematizes the knowledge about subatomic particles obtained from these experiments.

History

The term "subatomic particle" is largely a retronym of the 1960s, used to distinguish a large number of baryons and mesons (which comprise hadrons) from particles that are now thought to be truly elementary. Before that, hadrons were usually classified as "elementary" because their composition was unknown.

A list of important discoveries follows:

- Electron (e−): elementary (lepton). Theorized by G. Johnstone Stoney (1874); discovered by J. J. Thomson (1897). The minimum unit of electrical charge, for which Stoney suggested the name in 1891.
- Alpha particle (α): composite (atomic nucleus). Discovered by Ernest Rutherford (1899), without prior theoretical prediction. Proven by Rutherford and Thomas Royds in 1907 to be helium nuclei.
- Photon (γ): elementary (quantum). Theorized by Max Planck (1900); discovered by Albert Einstein (1905), and by Ernest Rutherford (1899) as γ rays. Necessary to solve the thermodynamic problem of black-body radiation.
- Proton (p): composite (baryon). Theorized by William Prout (1815); discovered by Ernest Rutherford (1919, named 1920). The nucleus of ¹H.
- Neutron (n): composite (baryon). Theorized by Santiago Antúnez de Mayolo (c. 1924); discovered by James Chadwick (1932). The second nucleon.
- Antiparticles: theorized by Paul Dirac (1928); discovered by Carl D. Anderson (e⁺, 1932). The revised explanation uses CPT symmetry.
- Pions (π): composite (mesons). Theorized by Hideki Yukawa (1935); discovered by César Lattes, Giuseppe Occhialini, and Cecil Powell (1947). Explain the nuclear force between nucleons; the first meson (by modern definition) to be discovered.
- Muon (μ): elementary (lepton). Discovered by Carl D. Anderson (1936), without prior theoretical prediction. Called a "meson" at first, but today classed as a lepton.
- Kaons (K): composite (mesons). Discovered by G. D. Rochester and C. C. Butler (1947) in cosmic rays, without prior theoretical prediction. The first strange particle.
- Lambda baryons (Λ): composite (baryons). Discovered at the University of Melbourne (Λ⁰, 1950), without prior theoretical prediction. The first hyperon discovered.
- Neutrino (ν): elementary (lepton). Theorized by Wolfgang Pauli (1930) and named by Enrico Fermi; discovered by Clyde Cowan and Frederick Reines (electron neutrino, 1956). Solved the problem of the energy spectrum of beta decay.
- Quarks (u, d, s): elementary. Theorized by Murray Gell-Mann and George Zweig (1964). No particular confirmation event for the quark model.
- Charm quark (c): elementary (quark). Theorized by Sheldon Glashow, John Iliopoulos, and Luciano Maiani (1970); discovered by B. Richter et al. and S. C. C. Ting et al. (J/ψ, 1974).
- Bottom quark (b): elementary (quark). Theorized by Makoto Kobayashi and Toshihide Maskawa (1973); discovered by Leon M. Lederman et al. (ϒ, 1977).
- Gluons: elementary (quanta). Theorized by Harald Fritzsch and Murray Gell-Mann (1972); discovered at DESY (1979).
- Weak gauge bosons (W±, Z⁰): elementary (quanta). Theorized by Glashow, Weinberg, and Salam (1968); discovered at CERN (1983). Properties verified through the 1990s.
- Top quark (t): elementary (quark). Theorized by Makoto Kobayashi and Toshihide Maskawa (1973); discovered at Fermilab (1995). Does not hadronize, but is necessary to complete the Standard Model.
- Higgs boson: elementary (quantum). Theorized by Peter Higgs et al. (1964); discovered at CERN (2012). Thought to be confirmed in 2013; more evidence found in 2014.
- Tetraquark: composite. Candidate Zc(3900) observed in 2013, yet to be confirmed as a tetraquark. A new class of hadrons.
- Pentaquark: composite. Yet another class of hadrons; as of 2019, several are thought to exist.
- Graviton: elementary (quantum). Theorized by Albert Einstein (1916); undiscovered. The interpretation of a gravitational wave as particles is controversial.
- Magnetic monopole: elementary (unclassified). Theorized by Paul Dirac (1931); undiscovered.

 

Algorithmic information theory

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Algorithmic_information_theory ...