
Sunday, December 21, 2025

Electron microscope

From Wikipedia, the free encyclopedia
A modern transmission electron microscope (TITAN)

An electron microscope is a microscope that uses a beam of electrons as a source of illumination. It uses electron optics that are analogous to the glass lenses of an optical light microscope to control the electron beam, for instance focusing it to produce magnified images or electron diffraction patterns. As the wavelength of an electron can be up to 100,000 times smaller than that of visible light, electron microscopes have a much higher resolution of about 0.1 nm, which compares to about 200 nm for light microscopes.
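As a rough check of these figures (a back-of-the-envelope illustration; the exact ratio depends on the accelerating voltage assumed), the relativistically corrected de Broglie wavelength of an electron accelerated through a potential V is

\[ \lambda = \frac{h}{\sqrt{2 m_0 e V \left(1 + \frac{eV}{2 m_0 c^2}\right)}} \]

For V = 100 kV this gives \lambda \approx 3.7 pm, about 150,000 times shorter than 550 nm green light, consistent with the "up to 100,000 times smaller" statement above.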

This article contains some general information, mainly about transmission and scanning electron microscopes; additional details on the individual instrument types can be found in their own articles.

History

Many developments laid the groundwork for the electron optics used in microscopes. One significant step was the work of Hertz in 1883, who made a cathode-ray tube with electrostatic and magnetic deflection, demonstrating manipulation of the direction of an electron beam. Others included the focusing of electrons by an axial magnetic field by Emil Wiechert in 1899, improved oxide-coated cathodes that produced more electrons by Arthur Wehnelt in 1905, and the development of the electromagnetic lens by Hans Busch in 1926. According to Dennis Gabor, the physicist Leó Szilárd tried in 1928 to convince him to build an electron microscope, for which Szilárd had filed a patent.

Reproduction of an early electron microscope constructed by Ernst Ruska in the 1930s

To this day the issue of who invented the transmission electron microscope is controversial. In 1928, at the Technische Hochschule in Charlottenburg (now Technische Universität Berlin), Adolf Matthias (Professor of High Voltage Technology and Electrical Installations) appointed Max Knoll to lead a team of researchers to advance research on electron beams and cathode-ray oscilloscopes. The team consisted of several PhD students, including Ernst Ruska. In 1931, Max Knoll and Ernst Ruska successfully generated magnified images of mesh grids placed over an anode aperture. The device, a replica of which is shown in the figure, used two magnetic lenses to achieve higher magnifications; it was the first electron microscope. (Max Knoll died in 1969, so he did not receive a share of the 1986 Nobel prize for the invention of electron microscopes.)

Apparently independent of this effort was work at Siemens-Schuckert by Reinhold Rüdenberg. On the basis of his patents (U.S. Patents No. 2,058,914 and No. 2,070,318, both filed in 1932), he is the inventor of the electron microscope, but it is not clear when he had a working instrument. He stated in a very brief article in 1932 that Siemens had been working on this for some years before the patents were filed, claiming that his effort was parallel to the university development. He died in 1961, so, like Max Knoll, he was not eligible for a share of the 1986 Nobel prize.

In the following year, 1933, Ruska and Knoll built the first electron microscope that exceeded the resolution of an optical (light) microscope. Four years later, in 1937, Siemens financed the work of Ernst Ruska and Bodo von Borries, and employed Helmut Ruska, Ernst's brother, to develop applications for the microscope, especially with biological specimens. Also in 1937, Manfred von Ardenne pioneered the scanning electron microscope. Siemens produced the first commercial electron microscope in 1938. The first North American electron microscopes were constructed in the 1930s, at Washington State University by Anderson and Fitzsimmons and at the University of Toronto by Eli Franklin Burton and students Cecil Hall, James Hillier, and Albert Prebus. Siemens produced a transmission electron microscope (TEM) in 1939. Although current transmission electron microscopes are capable of magnifications of two million times, as scientific instruments they remain similar in design to the early instruments, but with much improved optics.

In the 1940s, high-resolution electron microscopes were developed, enabling greater magnification and resolution. By 1965, Albert Crewe at the University of Chicago introduced the scanning transmission electron microscope using a field emission source, enabling scanning microscopes at high resolution. By the early 1980s improvements in mechanical stability as well as the use of higher accelerating voltages enabled imaging of materials at the atomic scale. In the 1980s, the field emission gun became common for electron microscopes, improving the image quality due to the additional coherence and lower chromatic aberrations. The 2000s were marked by advancements in aberration-corrected electron microscopy, allowing for significant improvements in resolution and clarity of images.

Types of electron microscopes

Transmission electron microscope (TEM)


The original form of the electron microscope, the transmission electron microscope (TEM), uses a high voltage electron beam to illuminate the specimen and create an image. An electron beam is produced by an electron gun, with the electrons typically having energies in the range 20 to 400 keV, focused by electromagnetic lenses, and transmitted through a thin specimen. When it emerges from the specimen, the electron beam carries information about the structure of the specimen that is then magnified by the lenses of the microscope. The spatial variation in this information (the "image") may be viewed by projecting the magnified electron image onto a detector. For example, the image may be viewed directly by an operator using a fluorescent viewing screen coated with a phosphor or scintillator material such as zinc sulfide. More commonly a high-resolution phosphor is coupled by means of a lens optical system or a fibre optic light-guide to the sensor of a digital camera. A different approach is to use a direct electron detector which has no scintillator, which addresses some of the limitations of scintillator-coupled cameras.

For many years the resolution of TEMs was limited by aberrations of the electron optics, primarily the spherical aberration. In most recent instruments hardware correctors can reduce spherical aberration and other aberrations, improving the resolution in high-resolution transmission electron microscopy (HRTEM) to below 0.5 angstrom (50 picometres), enabling magnifications of more than 50 million times. The ability of HRTEM to determine the positions of atoms within materials is useful for many areas of research and development.

Scanning electron microscope (SEM)

An SEM produces images by probing the specimen with a focused electron beam that is scanned across the specimen (raster scanning). When the electron beam interacts with the specimen, it loses energy and is scattered in different directions by a variety of mechanisms. These interactions lead to, among other events, emission of low-energy secondary electrons and high-energy backscattered electrons, light emission (cathodoluminescence) or X-ray emission. All of these signals carry information about the specimen, such as its surface topography and composition. The image displayed by an SEM maps the varying intensity of one of these signals, with each position in the image corresponding to the position of the beam on the specimen when the signal was generated.
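To make the raster-scanning idea concrete, the short Python sketch below assembles an image pixel by pixel, exactly as described: one detector reading per beam position. The specimen_signal function is a made-up stand-in for whichever signal is being recorded (secondary electrons, backscattered electrons, X-rays), not a model of real beam physics.

import numpy as np

def specimen_signal(x, y):
    """Hypothetical detector response at beam position (x, y), for illustration."""
    return 0.5 + 0.5 * np.sin(10 * x) * np.cos(10 * y)

def raster_scan(nx=256, ny=256):
    image = np.zeros((ny, nx))
    for iy in range(ny):           # slow scan axis
        for ix in range(nx):       # fast scan axis
            x, y = ix / nx, iy / ny
            # one pixel per beam dwell: the image is a map of signal vs. position
            image[iy, ix] = specimen_signal(x, y)
    return image

image = raster_scan()
print(image.shape, float(image.min()), float(image.max()))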

TESCAN S8000X SEM

SEMs are different from TEMs in that they use electrons with much lower energy, generally below 20 keV, while TEMs generally use electrons with energies in the range of 80–300 keV. Thus, the electron sources and optics of the two microscopes have different designs, and they are normally separate instruments.

Scanning transmission electron microscope (STEM)

A STEM combines features of both a TEM and a SEM by rastering a focused incident probe across a specimen, but now mainly using the electrons which are transmitted through the sample. Many types of imaging are common to both TEM and STEM, but some such as annular dark-field imaging and other analytical techniques are much easier to perform with higher spatial resolutions in a STEM instrument. One drawback is that image data is acquired in serial rather than in parallel fashion.

Main operating modes

An image of an ant in an SEM

The most common methods of obtaining images in an electron microscope involve selecting different directions for the electrons that have been transmitted through a sample, and/or electrons of different energies. There are a very large number of methods of doing this, although not all are very common.

Secondary electrons

Electron–matter interaction volume and types of signal generated in a SEM

In a SEM the signals result from interactions of the electron beam with atoms within the sample. The most common mode is to use the secondary electrons (SE) to produce images. Secondary electrons have very low energies, on the order of 50 eV, which limits their mean free path in solid matter to a few nanometers below the sample surface. The electrons are detected by an Everhart–Thornley detector, which is a type of collector-scintillator-photomultiplier system. The signal from secondary electrons tends to be highly localized at the point of impact of the primary electron beam, making it possible to collect images of the sample surface with a resolution of better than 1 nm, and with specialized instruments at the atomic scale.

The brightness of the signal depends on the number of secondary electrons reaching the detector. If the beam enters the sample perpendicular to the surface, then the electrons come out symmetrically about the axis of the beam. As the angle of incidence increases, the interaction volume from which they come increases and the "escape" distance from one side of the beam decreases, resulting in more secondary electrons being emitted from the sample. Thus steep surfaces and edges tend to be brighter than flat surfaces, which results in images with a well-defined, three-dimensional appearance that is similar to a reflected light image.
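A commonly quoted first-order approximation for this effect (an idealization rather than an exact law) is that the secondary-electron yield \delta grows with the secant of the tilt angle \theta between the beam and the surface normal:

\[ \delta(\theta) \approx \frac{\delta_0}{\cos\theta} \]

so tilting a surface from 0° to 60° roughly doubles the yield, which is why edges and steep facets appear bright.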

Backscattered electrons

Backscattered electrons (BSE) are those emitted back out from the specimen due to beam-specimen interactions in which the electrons undergo elastic and inelastic scattering. They are conventionally defined as having energies from 50 eV up to the energy of the primary beam. Backscattered electrons can be used both for imaging and to form an electron backscatter diffraction (EBSD) image; the latter can be used to determine the crystallography of the specimen.

Electron backscatter diffraction pattern of a (001) silicon single crystal, taken at 20 kV using an Oxford S2 detector

Heavy elements (high atomic number) backscatter electrons more strongly than light elements (low atomic number) and thus appear brighter in the image; BSE images can therefore be used to detect areas with different chemical compositions. To optimize the signal, dedicated backscattered electron detectors are positioned above the sample in a "doughnut" type arrangement, concentric with the electron beam, maximizing the solid angle of collection. BSE detectors are usually either scintillator or semiconductor types. When all parts of the detector are used to collect electrons symmetrically about the beam, atomic number contrast is produced. However, strong topographic contrast is produced by collecting back-scattered electrons from one side above the specimen using an asymmetrical, directional BSE detector; the resulting contrast appears as if the topography were illuminated from that side. Semiconductor detectors can be made in radial segments that can be switched in or out to control the type of contrast produced and its directionality.
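How strongly backscattering favors heavy elements can be illustrated with Reuter's empirical fit for the backscatter coefficient at normal incidence, a polynomial approximation from the SEM literature used here only to show the trend behind atomic-number contrast:

def backscatter_coefficient(z: int) -> float:
    """Reuter's empirical fit for the backscatter coefficient eta(Z) (approximate)."""
    return -0.0254 + 0.016 * z - 1.86e-4 * z**2 + 8.3e-7 * z**3

for name, z in [("carbon", 6), ("iron", 26), ("gold", 79)]:
    print(f"{name:6s} (Z={z:2d}): eta ~ {backscatter_coefficient(z):.2f}")

# carbon ~0.06, iron ~0.28, gold ~0.49: gold returns roughly eight times more
# backscattered electrons than carbon, so it appears much brighter in a BSE image.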

Diffraction contrast imaging

Diffraction contrast uses the variation in the direction of the diffracted electrons, their amplitude, or both, as a function of position as the contrast mechanism. It is one of the simplest ways to image in a transmission electron microscope, and is widely used.

The idea is to use an objective aperture below the sample and select only one or a range of different diffracted directions, then use these to form an image. When the aperture includes the incident beam direction the images are called bright field, since in the absence of any sample the field of view would be uniformly bright. When the aperture excludes the incident beam the images are called dark field, since similarly without a sample the image would be uniformly dark. One variant of this is called weak-beam dark-field microscopy, and can be used to obtain high resolution images of defects such as dislocations.
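The aperture selection can be caricatured numerically. In the toy Fourier-optics sketch below (a schematic analogy, not a simulation of a real microscope; the phase-grating "specimen" is invented for the example), the objective aperture becomes a circular mask in the diffraction plane, and moving the mask from the direct beam to a diffracted beam switches the image from bright field to dark field:

import numpy as np

n = 256
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
specimen = np.exp(1j * 0.5 * np.sin(20 * np.pi * x))   # weak phase grating

diffraction = np.fft.fftshift(np.fft.fft2(specimen))   # "diffraction plane"
kx, ky = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2)

def image_through_aperture(center, radius=6):
    # circular "objective aperture" selecting beams near `center`
    mask = (kx - center[0]) ** 2 + (ky - center[1]) ** 2 <= radius ** 2
    wave = np.fft.ifft2(np.fft.ifftshift(diffraction * mask))
    return np.abs(wave) ** 2

bright_field = image_through_aperture((0, 0))    # aperture includes direct beam
dark_field = image_through_aperture((20, 0))     # aperture on one diffracted beam
print(bright_field.mean(), dark_field.mean())    # dark field is much dimmer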

High resolution imaging

High-resolution image of CuTe

In high-resolution transmission electron microscopy (also sometimes called high-resolution electron microscopy), a number of different diffracted beams are allowed through the objective aperture. These interfere, leading to images which represent the atomic structure of the material. The selected beams can include the incident beam direction; with scanning transmission electron microscopes they are typically a range of diffracted beams excluding the incident beam. Depending upon how thick the samples are and the aberrations of the microscope, these images can either be directly interpreted in terms of the positions of columns of atoms, or require a more careful analysis using calculations of the multiple scattering of the electrons and the effect of the contrast transfer function of the microscope.
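The effect of the contrast transfer function can be written down compactly; in the common simplified form that keeps only the defocus \Delta f and the spherical-aberration coefficient C_s (sign conventions vary between texts):

\[ \mathrm{CTF}(k) = \sin\chi(k), \qquad \chi(k) = \pi\lambda\,\Delta f\,k^2 + \frac{\pi}{2} C_s \lambda^3 k^4 \]

Contrast at spatial frequency k is multiplied by this oscillating factor, which is why HRTEM images cannot always be read off directly as atom positions; the Scherzer defocus \Delta f \approx -1.2\sqrt{C_s\lambda} is often chosen to make the first passband as wide as possible.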

There are many other imaging variants that can also lead to atomic-level information. Electron holography uses the interference of electrons which have passed through the sample with a reference beam. 4D STEM collects diffraction data at each point using a scanning instrument, then processes them to produce different types of images.

X-ray microanalysis

EDS spectrum of the mineral crust of the vent shrimp Rimicaris exoculata. Most of these peaks are K-alpha and K-beta lines; one peak is from the L shell of iron.

X-ray microanalysis is a method of obtaining local chemical information within electron microscopes of all types, although it is most commonly used in scanning instruments. When high-energy electrons interact with atoms they can knock out electrons, particularly those in the inner shells (core electrons). The resulting vacancies are then filled by valence electrons, and the energy difference between the valence and core states can be released as an X-ray, which is detected by a spectrometer. The energies of these X-rays are characteristic of the atomic species, so local chemistry can be probed.
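The element specificity follows Moseley's law; for K-alpha lines the photon energy is approximately

\[ E_{K\alpha} \approx 10.2\ \mathrm{eV} \times (Z-1)^2 \]

so iron (Z = 26), for example, gives roughly 10.2 eV × 625 ≈ 6.4 keV, matching the Fe K-alpha energy seen in EDS spectra such as the one captioned above. (The coefficient is the usual Rydberg-based approximation; measured line energies deviate slightly.)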

EELS

Experimental electron energy loss spectrum, showing the major features: zero-loss peak, plasmon peaks and core loss edge.

Similar to X-ray microanalysis, the energies of electrons which have been transmitted through a sample can be analyzed, yielding information ranging from details of the local electronic structure to chemical composition.

Electron diffraction

Transmission electron microscopes can be used in electron diffraction mode, where a map of the angles of the electrons leaving the sample is produced. The advantages of electron diffraction over X-ray crystallography lie primarily in the size of the crystals. In X-ray crystallography, crystals are commonly visible to the naked eye and are generally hundreds of micrometers in length. In comparison, crystals for electron diffraction must be less than a few hundred nanometers in thickness, and there is no lower limit on their size. Additionally, electron diffraction is done in a TEM, which can also be used to obtain other types of information, rather than requiring a separate instrument.
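Quantitatively, spot positions in a diffraction pattern are converted to lattice spacings through Bragg's law, which in the small-angle limit typical of TEM reduces to the camera equation:

\[ \lambda = 2d\sin\theta \quad\Rightarrow\quad R\,d \approx \lambda L \]

where d is the lattice-plane spacing, R is the measured distance of a spot from the central beam, and L is the camera length. With \lambda of only a few picometres, Bragg angles are well under a degree, justifying the approximation.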

Variations in CBED with thickness for Si (001)

There are many variants on electron diffraction, depending upon exactly what type of illumination conditions are used. If a parallel beam is used with an aperture to limit the region exposed to the electrons, then sharp diffraction features are normally observed, a technique called selected area electron diffraction; this is often the main technique used. Another common approach uses conical illumination and is called convergent beam electron diffraction (CBED), which is good for determining the symmetry of materials. A third is precession electron diffraction, in which a parallel beam is spun around a large angle, producing a type of averaged diffraction pattern in which multiple scattering is often reduced.

Other electron microscope techniques

Aberration corrected instruments

Scanning transmission electron microscope equipped with a 3rd-order spherical aberration corrector

Aberration-corrected transmission electron microscopy (AC-TEM) is the general term for electron microscopes where electron-optical components are introduced to reduce the aberrations that would otherwise limit the resolution of the images. Historically, electron microscopes had quite severe aberrations, and until about the start of the 21st century resolution was limited: the atomic structure of materials could be imaged only if the atoms were far enough apart. Around the turn of the century, the electron-optical components were coupled with computer control of the lenses and their alignment, enabling correction of aberrations. The first demonstration of aberration correction in TEM mode was by Harald Rose and Maximilian Haider in 1998 using a hexapole corrector, and in STEM mode by Ondrej Krivanek and Niklas Dellby in 1999 using a quadrupole/octupole corrector.

As of 2025, correction of geometric aberrations is standard in many commercial electron microscopes, and such corrected instruments are extensively used in many different areas of science. Similar correctors have also been used at much lower energies for LEEM instruments.

Sample preparation

An insect coated in gold for viewing with a scanning electron microscope (SEM)

Most samples cannot be observed directly in an electron microscope; they need to be prepared to stabilize them and enhance contrast. Preparation techniques differ vastly depending on the sample, the specific qualities to be observed, and the microscope used. Details can be found in the articles on the individual microscope types.

Disadvantages

JEOL transmission and scanning electron microscope made in the mid-1970s

Electron microscopes are expensive to build and maintain. Microscopes designed to achieve high resolutions must be housed in stable buildings (sometimes underground) with special services such as magnetic-field-cancelling systems and anti-vibration mounts.

The samples largely have to be viewed in vacuum, as the molecules that make up air would scatter the electrons. An exception is liquid-phase electron microscopy using either a closed liquid cell or an environmental chamber, for example, in the environmental scanning electron microscope, which allows hydrated samples to be viewed in a low-pressure (up to 20 Torr or 2.7 kPa) wet environment. Various techniques for in situ electron microscopy of gaseous samples have also been developed.

Pleolipoviral virion (HRPV-6)

Samples of hydrated materials, including almost all biological specimens, have to be prepared in various ways to stabilize them, reduce their thickness (ultrathin sectioning) and increase their electron optical contrast (staining). These processes may result in artifacts, but these can usually be identified by comparing the results obtained by using radically different specimen preparation methods. Since the 1980s, analysis of cryofixed, vitrified specimens has also become increasingly used.

Many samples suffer from radiation damage, which can change internal structures. The damage can be radiolytic, ballistic (for instance, through collision cascades), or both, and it can be a severe issue for biological samples.

There's Plenty of Room at the Bottom

Miniaturization (publ. 1961) included Feynman's lecture as its final chapter.

"There's Plenty of Room at the Bottom: An Invitation to Enter a New Field of Physics" was a lecture given by physicist Richard Feynman at the annual American Physical Society meeting at Caltech on December 29, 1959. Feynman considered the possibility of direct manipulation of individual atoms as a more robust form of synthetic chemistry than those used at the time. Versions of the talk were reprinted in a few popular magazines, but it went largely unnoticed until the 1980s.

The title references the popular quote "There is always room at the top." attributed to Daniel Webster (who is thought to have said this phrase in response to warnings against becoming a lawyer, which was seen as an oversaturated field in the 19th century).

Conception

Feynman considered some ramifications of a general ability to manipulate matter on an atomic scale. He was particularly interested in the possibilities of denser computer circuitry and microscopes that could see things much smaller than is possible with scanning electron microscopes. These ideas were later realized by the use of the scanning tunneling microscope, the atomic force microscope and other examples of scanning probe microscopy and storage systems such as Millipede.

Feynman also suggested that it should be possible, in principle, to make nanoscale machines that "arrange the atoms the way we want" and do chemical synthesis by mechanical manipulation.

He also presented the possibility of "swallowing the doctor", an idea that he credited in the essay to his friend and graduate student Albert Hibbs. This concept involved building a tiny, swallowable surgical robot.

As a thought experiment, he proposed developing a set of one-quarter-scale manipulator hands controlled by the hands of a human operator, to build one-quarter-scale machine tools analogous to those found in any machine shop. This set of small tools would then be used by the small hands to build and operate ten sets of one-sixteenth-scale hands and tools, and so forth, culminating in perhaps a billion tiny factories to achieve massively parallel operations. He used the analogy of a pantograph as a way of scaling down items. This idea was anticipated in part, down to the microscale, by science fiction author Robert A. Heinlein in his 1942 story Waldo.

Feynman's vision of a medical use for nanotechnology by swallowing the doctor may be partially achieved by the ribosome, which functions as a biological machine. Such protein domain dynamics can only now be seen by neutron spin echo spectroscopy.

As the sizes got smaller, the tools would have to be redesigned, because the relative strength of various forces would change: gravity would become less important, while surface effects such as Van der Waals attraction and surface tension would become more important. Feynman mentioned these scaling issues during his talk. Nobody has yet attempted to implement this thought experiment, although some types of biological enzymes and enzyme complexes (especially ribosomes) function chemically in a way close to Feynman's vision. Feynman also mentioned in his lecture that it might be better eventually to use glass or plastic because their greater uniformity would avoid problems at very small scales (metals and crystals are separated into domains where the lattice structure prevails). This could be a good reason to make machines and electronics out of glass and plastic. At present, there are electronic components made of both materials. In glass, there are optical fiber cables that carry and amplify light. In plastic, field-effect transistors are being made with polymers such as polythiophene, which becomes an electrical conductor when oxidized.

Challenges

At the meeting Feynman concluded his talk with two challenges, offering a prize of $1000 for the first person to solve each one. The first challenge involved the construction of a tiny motor, which, to Feynman's surprise, was achieved by November 1960 by Caltech graduate William McLellan, a meticulous craftsman, using conventional tools. The motor met the conditions, but did not advance the field. The second challenge involved the possibility of scaling down letters small enough to fit the entire Encyclopædia Britannica on the head of a pin, by writing the information from a book page on a surface 1/25,000 smaller in linear scale. In 1985, Tom Newman, a Stanford graduate student, successfully reduced the first paragraph of A Tale of Two Cities to 1/25,000 of its linear size, and collected the second Feynman prize. Newman's thesis adviser, R. Fabian Pease, had read the paper in 1966, but it was another graduate student in the lab, Ken Polasko, who had recently read it, who suggested attempting the challenge. Newman had been looking for an arbitrary pattern with which to demonstrate their technology; he said, "Text was ideal because it has so many different shapes."
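As a rough sanity check of the 1/25,000 figure (the 3 mm letter height and 0.25 nm atomic spacing below are assumptions of this illustration, not numbers from the talk):

\[ \frac{3\ \mathrm{mm}}{25{,}000} = 120\ \mathrm{nm} \approx 480 \times 0.25\ \mathrm{nm} \]

so each reduced letter would still span hundreds of atoms, leaving room, in principle, to write it.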

Reception

The New Scientist reported "the scientific audience was captivated." Feynman had "spun the idea off the top of his mind" without even "notes from beforehand". There were no copies of the speech available. A "foresighted admirer" brought a tape recorder and an edited transcript, without Feynman's jokes, was made for publication by Caltech. In February 1960, Caltech's Engineering and Science published the speech. In addition to excerpts in The New Scientist, versions were printed in The Saturday Review and Popular Science. Newspapers announced the winning of the first challenge. The lecture was included as the final chapter in the 1961 book, Miniaturization.

Impact

K. Eric Drexler later took the Feynman concept of a billion tiny factories and added the idea that they could make more copies of themselves, via computer control instead of control by a human operator, in his 1986 book Engines of Creation: The Coming Era of Nanotechnology.

After Feynman's death, scholars studying the historical development of nanotechnology have concluded that his role in catalyzing nanotechnology research was not highly rated by many people active in the nascent field in the 1980s and 1990s. Chris Toumey, a cultural anthropologist at the University of South Carolina, has reconstructed the history of the publication and republication of Feynman's talk, along with the record of citations to "Plenty of Room" in the scientific literature.

In Toumey's 2008 article "Reading Feynman into Nanotechnology", he found 11 versions of the publication of "Plenty of Room", plus two instances of a closely related talk, which Feynman called "Plenty of Room, Revisited" and which was published under the name "Infinitesimal Machinery". Also in Toumey's references are videotapes of that second talk. The journal Nature Nanotechnology dedicated an issue in 2009 to the subject.

Toumey found that the published versions of Feynman's talk had a negligible influence in the twenty years after it was first published, as measured by citations in the scientific literature, and not much more influence in the decade after the scanning tunneling microscope was invented in 1981. Interest in "Plenty of Room" in the scientific literature greatly increased in the early 1990s. This is probably because the term "nanotechnology" gained serious attention just before that time, following its use by Drexler in his 1986 book, Engines of Creation: The Coming Era of Nanotechnology, which cited Feynman, and in a cover article headlined "Nanotechnology", published later that year in a mass-circulation science-oriented magazine, OMNI. The journal Nanotechnology was launched in 1989; the famous Eigler-Schweizer experiment, precisely manipulating 35 xenon atoms, was published in Nature in April 1990; and Science had a special issue on nanotechnology in November 1991. These and other developments hint that the retroactive rediscovery of "Plenty of Room" gave nanotechnology a packaged history that provided an early date of December 1959, plus a connection to Richard Feynman.

Toumey's analysis also includes comments from scientists in nanotechnology who say that "Plenty of Room" did not influence their early work, and most of them had not read it until a later date.

Feynman's stature as a Nobel laureate and an important figure in 20th-century science helped advocates of nanotechnology. It provided a valuable intellectual link to the past. More concretely, his stature and concept of atomically precise fabrication played a role in securing funding for nanotechnology research, illustrated by President Clinton's January 2000 speech calling for a federal program:

My budget supports a major new National Nanotechnology Initiative, worth $500 million. Caltech is no stranger to the idea of nanotechnology, the ability to manipulate matter at the atomic and molecular level. Over 40 years ago, Caltech's own Richard Feynman asked, "What would happen if we could arrange the atoms one by one the way we want them?"

The version of the Nanotechnology Research and Development Act that the House passed in May 2003 called for a study of the technical feasibility of molecular manufacturing, but this study was removed to safeguard funding of less controversial research before it was passed by the Senate and signed into law by President George W. Bush on December 3, 2003.

In 2016, a group of researchers of TU Delft and INL reported the storage of a paragraph of Feynman's talk using binary code where every bit was made with a single atomic vacancy. Using a scanning tunnelling microscope to manipulate thousands of atoms, the researchers crafted the text:

But I am not afraid to consider the final question as to whether, ultimately – in the great future – we can arrange the atoms the way we want; the very atoms, all the way down! What would happen if we could arrange the atoms one by one the way we want them (within reason, of course; you can't put them so that they are chemically unstable, for example). Up to now, we have been content to dig in the ground to find minerals. We heat them and we do things on a large scale with them, and we hope to get a pure substance with just so much impurity, and so on. But we must always accept some atomic arrangement that nature gives us. We haven't got anything, say, with a "checkerboard" arrangement, with the impurity atoms precisely arranged 1,000 angstroms apart, or in some other particular pattern.

This text uses exactly 1 kibibyte, i.e., 8,192 bits, each encoded by a single atomic vacancy, constituting the first atomic kibibyte, with a storage density 500 times greater than state-of-the-art approaches. Writing the text required arranging "the atoms the way we want" in a checkerboard pattern. This self-referential tribute to Feynman's vision was covered both by scientific journals and by mainstream media.

White supremacy

From Wikipedia, the free encyclopedia

White supremacy is the belief that white people are superior to those of other races. The belief favors the maintenance and defense of any power and privilege held by white people. White supremacy has roots in the now-discredited doctrine of scientific racism and was a key justification for European colonialism.

As a political ideology, it imposes and maintains cultural, social, political, historical or institutional domination by white people and non-white supporters. In the past, this ideology had been put into effect through socioeconomic and legal structures such as the Atlantic slave trade, European colonial labor and social practices, the Scramble for Africa, Jim Crow laws in the United States, the activities of the Native Land Court in New Zealand, the White Australia policies from the 1890s to the mid-1970s, and apartheid in South Africa. This ideology is also today present among neo-Confederates.

White supremacy underlies a spectrum of contemporary movements including white nationalism, white separatism, neo-Nazism, and the Christian Identity movement. In the United States, white supremacy is primarily associated with Aryan Nations, White Aryan Resistance, and the Ku Klux Klan. The Proud Boys are considered an implicitly white supremacist organization, despite denying their association with white supremacy. In recent years, websites such as Twitter (known as X since July 2023), Reddit, and Stormfront have contributed to increased activity and interest in white supremacy.

Not all white-supremacist organizations have the same objectives, and while some may uphold a Nordicist ideal of whiteness, others are more broadly white supremacist, including members of Southern European and Eastern European descent. Different groups of white supremacists identify various racial, ethnic, religious, and other enemies, most commonly those of Sub-Saharan African ancestry, Indigenous peoples, people of Asian descent, multiracial people, MENA people, Jews, Muslims, and LGBTQ+ people.

In academic usage, particularly in critical race theory or intersectionality, "white supremacy" also refers to a social system in which white people enjoy structural advantages (privilege) over other ethnic groups, on both a collective and individual level, despite formal legal equality.

The theory of white adjacency posits that some groups of non-White people are more closely aligned with White people than others, which affords them some degree of white privilege.

History

White supremacy has ideological foundations that date back to 18th-century scientific racism, the predominant paradigm of human variation that shaped international relations and racial policy from the latter part of the Age of Enlightenment until the late 20th century.

United States

White men pose for a photograph of the 1920 Duluth, Minnesota lynchings. Two of the black victims are still hanging while the third is on the ground. Lynchings were often public spectacles for the white community to celebrate white supremacy in the U.S., and photos were often sold as postcards.
Ku Klux Klan parade in Washington, D.C. in 1926

Early history

White supremacy was dominant in the United States both before and after the American Civil War, and it persisted for decades after the Reconstruction era. The Virginia Slave Codes of 1705 socially segregated white colonists from black enslaved persons, making them disparate groups and hindering their ability to unite. The Virginia aristocracy feared unity among the commoners and wished to prevent a repeat of events such as Bacon's Rebellion, which had occurred 29 years earlier. Prior to the Civil War, many wealthy white Americans owned slaves; they tried to justify their economic exploitation of black people by creating a "scientific" theory of white superiority and black inferiority. One such slave owner, future president Thomas Jefferson, wrote in 1785 that blacks were "inferior to the whites in the endowments of body and mind." In the antebellum South, four million slaves were denied freedom. The outbreak of the Civil War saw the desire to uphold white supremacy being cited as a cause for state secession and the formation of the Confederate States of America. In an 1890 editorial about Native Americans and the American Indian Wars, author L. Frank Baum wrote: "The Whites, by law of conquest, by justice of civilization, are masters of the American continent, and the best safety of the frontier settlements will be secured by the total annihilation of the few remaining Indians."

The Naturalization Act of 1790 limited U.S. citizenship to whites only. In some parts of the United States, many people who were considered non-white were disenfranchised, barred from government office, and prevented from holding most government jobs well into the second half of the 20th century. Professor Leland T. Saito of the University of Southern California writes: "Throughout the history of the United States, race has been used by whites for legitimizing and creating difference and social, economic and political exclusion."

20th century

The denial of social and political freedom to minorities continued into the mid-20th century, resulting in the civil rights movement. The movement was spurred by the lynching of Emmett Till, a 14-year-old boy. David Jackson writes it was the image of the "murdered child's ravaged body, that forced the world to reckon with the brutality of American racism."

Sociologist Stephen Klineberg has stated that U.S. immigration laws prior to 1965 clearly "declared that Northern Europeans are a superior subspecies of the white race". The Immigration and Nationality Act of 1965 opened entry to the U.S. to non-Germanic groups and significantly altered the demographic mix in the U.S. as a result. Thirty-eight U.S. states at one time banned interracial marriage through anti-miscegenation laws; the last 16 states had such laws in place until 1967, when they were invalidated by the Supreme Court of the United States' decision in Loving v. Virginia. These mid-century gains had a major impact on white Americans' political views; segregation and white racial superiority, which had been publicly endorsed in the 1940s, became minority views within the white community by the mid-1970s, declining to single-digit support in 1990s polls. For sociologist Howard Winant, these shifts marked the end of "monolithic white supremacy" in the United States.

After the mid-1960s, white supremacy remained an important ideology to the American far-right. According to Kathleen Belew, a historian of race and racism in the United States, white militancy shifted after the Vietnam War from supporting the existing racial order to a more radical position (self-described as "white power" or "white nationalism") committed to overthrowing the United States government and establishing a white homeland. Such anti-government militia organizations are one of three major strands of violent right-wing movements in the United States, with white-supremacist groups (such as the Ku Klux Klan, neo-Nazi organizations, and racist skinheads) and a religious fundamentalist movement (such as Christian Identity) being the other two.

21st century

The presidential campaign of Donald Trump led to a surge of interest in white supremacy and white nationalism in the United States, bringing increased media attention and new members to the movement; his campaign enjoyed widespread support among white supremacists and white nationalists.

Some academics argue that the outcome of the 2016 United States presidential election, and the many controversies which surrounded it, reflect the ongoing influence of white supremacy in the United States. Educators, literary theorists, and other political experts have raised similar questions, connecting the scapegoating of disenfranchised populations to white superiority.

British Commonwealth

There has been debate whether Winston Churchill, who was voted "the greatest ever Briton" in 2002, was "a racist and white supremacist". In the context of rejecting the Arab wish to stop Jewish immigration to Palestine, he said:

I do not admit that the dog in the manger has the final right to the manger, though he may have lain there for a very long time. I do not admit that right. I do not admit for instance that a great wrong has been done to the Red Indians of America or the black people of Australia. I do not admit that a wrong has been done to those people by the fact that a stronger race, a higher-grade race or at any rate a more worldly-wise race ... has come in and taken their place.

British historian Richard Toye, author of Churchill's Empire, concluded that "Churchill did think that white people were superior."

South Africa

A number of Southern African nations experienced severe racial tension and conflict during global decolonization, particularly as white Africans of European ancestry fought to protect their preferential social and political status. Racial segregation in South Africa began in colonial times under the Dutch Empire. It continued when the British took over the Cape of Good Hope in 1795. Apartheid was introduced as an officially structured policy by the Afrikaner-dominated National Party after the general election of 1948. Apartheid's legislation divided inhabitants into four racial groups – "black", "white", "coloured", and "Indian", with coloured divided into several sub-classifications. In 1970, the Afrikaner-run government abolished non-white political representation, and starting that year black people were deprived of South African citizenship. South Africa abolished apartheid in 1991.

Rhodesia

In Rhodesia a predominantly white government issued its own unilateral declaration of independence from the United Kingdom in 1965 during an ultimately unsuccessful attempt to avoid majority rule. Following the Rhodesian Bush War which was fought by African nationalists, Rhodesian prime minister Ian Smith acceded to biracial political representation in 1978 and the state achieved recognition from the United Kingdom as Zimbabwe in 1980.

Germany

Nazism promoted the idea of a superior Germanic people or Aryan race in Germany during the early 20th century. Notions of white supremacy and Aryan racial superiority were combined in the 19th century, with white supremacists maintaining the belief that white people were members of an Aryan "master race" that was superior to other races, particularly the Jews, who were described as the "Semitic race", as well as the Slavs and the Gypsies, whom they associated with "cultural sterility". Arthur de Gobineau, a French racial theorist and aristocrat, blamed the fall of the ancien régime in France on racial degeneracy caused by racial intermixing, which he argued had destroyed the "purity" of the Nordic or Germanic race. Gobineau's theories, which attracted a strong following in Germany, emphasized the existence of an irreconcilable polarity between Aryan or Germanic peoples and Jewish culture.

As the Nazi Party's chief racial theorist, Alfred Rosenberg oversaw the construction of a human racial "ladder" that justified Hitler's racial and ethnic policies. Rosenberg promoted the Nordic theory, which regarded Nordics as the "master race", superior to all others, including other Aryans (Indo-Europeans). Rosenberg took the racial term Untermensch from the title of Klansman Lothrop Stoddard's 1922 book The Revolt Against Civilization: The Menace of the Under-man; it was later adopted by the Nazis from that book's German version, Der Kulturumsturz: Die Drohung des Untermenschen (1925). Rosenberg was the leading Nazi who attributed the concept of the East-European "under man" to Stoddard. An advocate of the U.S. immigration laws that favored Northern Europeans, Stoddard wrote primarily on the alleged dangers posed by "colored" peoples to white civilization, publishing The Rising Tide of Color Against White World-Supremacy in 1920. Arguing in 1925 for a restrictive entry system for Germany, Hitler wrote of his admiration for America's immigration laws: "The American Union categorically refuses the immigration of physically unhealthy elements, and simply excludes the immigration of certain races."

German praise for America's institutional racism, previously found in Hitler's Mein Kampf, was continuous throughout the early 1930s. Nazi lawyers were advocates of the use of American models. Race-based U.S. citizenship and anti-miscegenation laws directly inspired the Nazis' two principal Nuremberg racial laws—the Citizenship Law and the Blood Law. To preserve the Aryan or Nordic race, the Nazis introduced the Nuremberg Laws in 1935, which forbade sexual relations and marriages between Germans and Jews, and later between Germans and Romani and Slavs. The Nazis used the Mendelian inheritance theory to argue that social traits were innate, claiming that there was a racial nature associated with certain general traits, such as inventiveness or criminal behavior.

According to the 2012 annual report of Germany's interior intelligence service, the Federal Office for the Protection of the Constitution, at the time there were 26,000 right-wing extremists living in Germany, including 6,000 neo-Nazis.

Australia and New Zealand

Fifty-one people died in two consecutive terrorist attacks carried out by an Australian white supremacist at the Al Noor Mosque and the Linwood Islamic Centre on March 15, 2019. Prime Minister Jacinda Ardern described the attacks as "one of New Zealand's darkest days". On August 27, 2020, the shooter was sentenced to life without parole.

In 2016, there was a rise in debate over the appropriateness of the naming of Massey University in Palmerston North after William Massey, whom many historians and critics have described as a white supremacist. Lecturer Steve Elers was a leading proponent of the idea that Massey was an avowed white supremacist, given Massey "made several anti-Chinese racist statements in the public domain" and intensified the New Zealand head tax. In 1921, Massey wrote in the Evening Post: "New Zealanders are probably the purest Anglo-Saxon population in the British Empire. Nature intended New Zealand to be a white man's country, and it must be kept as such. The strain of Polynesian will be no detriment". This is one of many quotes attributed to him regarded as being openly racist.

Ideologies and movements

Supporters of Nordicism consider the "Nordic peoples" to be a superior race. By the early 19th century, white supremacy was attached to emerging theories of racial hierarchy. The German philosopher Arthur Schopenhauer attributed cultural primacy to the white race:

The highest civilization and culture, apart from the ancient Hindus and Egyptians, are found exclusively among the white races; and even with many dark peoples, the ruling caste or race is fairer in colour than the rest and has, therefore, evidently immigrated, for example, the Brahmins, the Incas, and the rulers of the South Sea Islands. All this is due to the fact that necessity is the mother of invention because those tribes that emigrated early to the north, and there gradually became white, had to develop all their intellectual powers and invent and perfect all the arts in their struggle with need, want and misery, which in their many forms were brought about by the climate.

The Good Citizen 1926, published by Pillar of Fire Church

The eugenicist Madison Grant argued in his 1916 book, The Passing of the Great Race, that the Nordic race had been responsible for most of humanity's great achievements, and that admixture was "race suicide". In this book, Europeans who are not of Germanic origin but have Nordic characteristics such as blonde/red hair and blue/green/gray eyes, were considered to be a Nordic admixture and suitable for Aryanization.

Members of the second Ku Klux Klan at a rally in 1923

In the United States, the groups most associated with the white-supremacist movement are the Ku Klux Klan (KKK), Aryan Nations, and White Aryan Resistance, all of which are also considered to be antisemitic. The Proud Boys, despite claiming non-association with white supremacy, have been described in academic contexts as such. Many white-supremacist groups are based on the concept of preserving genetic purity, and do not focus solely on discrimination based on skin color. The KKK's reasons for supporting racial segregation are not primarily based on religious ideals, but some Klan groups are openly Protestant. The 1915 silent drama film The Birth of a Nation followed the rising racial, economic, political, and geographic tensions leading up to the Emancipation Proclamation and the Southern Reconstruction era that saw the genesis of the Ku Klux Klan.

Nazi Germany promulgated white supremacy based on the belief that the Aryan race, or the Germans, were the master race. It was combined with a eugenics programme that aimed for racial hygiene through compulsory sterilization of sick individuals and extermination of Untermenschen ("subhumans"): Slavs, Jews and Romani, which eventually culminated in the Holocaust.

Christian Identity is another movement closely tied to white supremacy. Some white supremacists identify themselves as Odinists, although many Odinists reject white supremacy. Some white-supremacist groups, such as the South African Boeremag, conflate elements of Christianity and Odinism. Creativity (formerly known as "The World Church of the Creator") is atheistic and it denounces Christianity and other theistic religions. Aside from this, its ideology is similar to that of many Christian Identity groups because it believes in the antisemitic conspiracy theory that there is a "Jewish conspiracy" in control of governments, the banking industry and the media. Matthew F. Hale, founder of the World Church of the Creator, has published articles stating that all races other than white are "mud races", which is what the group's religion teaches.

The white-supremacist ideology has become associated with a racist faction of the skinhead subculture, despite the fact that when the skinhead culture first developed in the United Kingdom in the late 1960s, it was heavily influenced by black fashions and music, especially Jamaican reggae and ska, and African American soul music.

White-supremacist recruitment activities are primarily conducted at a grassroots level as well as on the Internet. Widespread access to the Internet has led to a dramatic increase in white-supremacist websites. The Internet provides a venue for open expression of white-supremacist ideas at little social cost because people who post the information are able to remain anonymous.

White nationalism

White separatism

A map showing the suggested boundaries of the Northwest Territorial Imperative in red

White separatism is a political and social movement that seeks the separation of white people from people of other races and ethnicities. This may include the establishment of a white ethnostate by removing non-whites from existing communities or by forming new communities elsewhere.

Most modern researchers do not view white separatism as distinct from white-supremacist beliefs. The Anti-Defamation League defines white separatism as "a form of white supremacy"; the Southern Poverty Law Center defines both white nationalism and white separatism as "ideologies based on white supremacy." Facebook has banned content that is openly white nationalist or white separatist because "white nationalism and white separatism cannot be meaningfully separated from white supremacy and organized hate groups".

Use of the term to self-identify has been criticized as a dishonest rhetorical ploy. The Anti-Defamation League argues that white supremacists use the phrase because they believe it has fewer negative connotations than the term white supremacist.

Dobratz and Shanks-Meile reported that adherents usually reject marriage "outside the white race". They argued for the existence of "a distinction between the white supremacist's desire to dominate (as in apartheid, slavery, or segregation) and complete separation by race". They argued that this is a matter of pragmatism, because, while many white supremacists are also white separatists, contemporary white separatists reject the view that returning to a system of segregation is possible or desirable in the United States.

Academic use of the term

The term white supremacy is used in some academic studies of racial power to denote a system of structural or societal racism which privileges white people over others, regardless of the presence or the absence of racial hatred. According to this definition, white racial advantages occur at both a collective and an individual level (ceteris paribus, i.e., when individuals are compared who do not differ relevantly except in ethnicity). Legal scholar Frances Lee Ansley explains this definition as follows:

By "white supremacy" I do not mean to allude only to the self-conscious racism of white supremacist hate groups. I refer instead to a political, economic and cultural system in which whites overwhelmingly control power and material resources, conscious and unconscious ideas of white superiority and entitlement are widespread, and relations of white dominance and non-white subordination are daily reenacted across a broad array of institutions and social settings.

This and similar definitions have been adopted or proposed by Charles W. Mills, bell hooks, David Gillborn, Jessie Daniels, and Neely Fuller Jr., and they are widely used in critical race theory and intersectional feminism. Some anti-racist educators, such as Betita Martinez and the Challenging White Supremacy workshop, also use the term in this way. The term expresses historic continuities between a pre–civil rights movement era of open white supremacy and the current racial power structure of the United States. It also expresses the visceral impact of structural racism through "provocative and brutal" language that characterizes racism as "nefarious, global, systemic, and constant". Academic users of the term sometimes prefer it to racism because it allows for a distinction to be drawn between racist feelings and white racial advantage or privilege. John McWhorter, a specialist in language and race relations, explains the gradual replacement of "racism" by "white supremacy" by the fact that "potent terms need refreshment, especially when heavily used", drawing a parallel with the replacement of "chauvinist" by "sexist".

Other intellectuals have criticized the term's recent rise in popularity among leftist activists as counterproductive. John McWhorter has described the use of "white supremacy" as straying from its commonly accepted meaning to encompass less extreme issues, thereby cheapening the term and potentially derailing productive discussion. Political columnist Kevin Drum attributes the term's growing popularity to frequent use by Ta-Nehisi Coates, describing it as a "terrible fad" that fails to convey nuance. He claims that the term should be reserved for those who are trying to promote the idea that whites are inherently superior to blacks and not used to characterize less blatantly racist beliefs or actions. The academic use of the term to refer to systemic racism has been criticized by Conor Friedersdorf for the confusion that it creates for the general public, inasmuch as it differs from the more common dictionary definition; he argues that it is likely to alienate those that it hopes to convince.
