
Monday, November 3, 2025

Introduction to quantum mechanics

Quantum mechanics is the study of matter and matter's interactions with energy on the scale of atomic and subatomic particles. By contrast, classical physics explains matter and energy only on a scale familiar to human experience, including the behavior of astronomical bodies such as the Moon. Classical physics is still used in much of modern science and technology. However, towards the end of the 19th century, scientists discovered phenomena in both the large (macro) and the small (micro) worlds that classical physics could not explain. The desire to resolve inconsistencies between observed phenomena and classical theory led to a revolution in physics, a shift in the original scientific paradigm: the development of quantum mechanics.

Many aspects of quantum mechanics yield unexpected, counterintuitive results. They can seem paradoxical because they describe behavior quite different from that seen at larger scales. In the words of quantum physicist Richard Feynman, quantum mechanics deals with "nature as She is—absurd". Features of quantum mechanics often defy simple explanations in everyday language. One example of this is the uncertainty principle: precise measurements of position cannot be combined with precise measurements of velocity. Another example is entanglement: a measurement made on one particle (such as an electron that is measured to have spin 'up') will correlate with a measurement on a second particle (an electron will be found to have spin 'down') if the two particles have a shared history. This will apply even if it is impossible for the result of the first measurement to have been transmitted to the second particle before the second measurement takes place.

Quantum mechanics helps people understand chemistry, because it explains how atoms interact with each other and form molecules. Many remarkable phenomena can be explained using quantum mechanics, like superfluidity. For example, if liquid helium cooled to a temperature near absolute zero is placed in a container, it spontaneously flows up and over the rim of its container; this is an effect which cannot be explained by classical physics.

History

James Clerk Maxwell's unification of the equations governing electricity, magnetism, and light in the late 19th century led to experiments on the interaction of light and matter. Some of these experiments had aspects which could not be explained until quantum mechanics emerged in the early part of the 20th century.

Evidence of quanta from the photoelectric effect

The seeds of the quantum revolution appear in the discovery by J.J. Thomson in 1897 that cathode rays were not continuous but "corpuscles" (electrons). Electrons had been named just six years earlier as part of the emerging theory of atoms. In 1900, Max Planck, unconvinced by the atomic theory, discovered that he needed discrete entities like atoms or electrons to explain black-body radiation.

Black-body radiation intensity vs color and temperature. The rainbow bar represents visible light; 5000 K objects are "white hot" by mixing differing colors of visible light. To the right is the invisible infrared. Classical theory (black curve for 5000 K) fails to predict the colors; the other curves are correctly predicted by quantum theories.

Very hot – red hot or white hot – objects look similar when heated to the same temperature. This look results from a common curve of light intensity at different frequencies (colors), which is called black-body radiation. White hot objects have intensity across many colors in the visible range. Frequencies just below the visible range are infrared light, which also transfers heat. Continuous wave theories of light and matter cannot explain the black-body radiation curve. Planck spread the heat energy among individual "oscillators" of an undefined character but with discrete energy capacity; this model explained black-body radiation.
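
In its standard modern form, written here for reference, the quantum result that Planck's discrete oscillators lead to is the black-body spectral radiance

\[ B_\nu(T) = \frac{2 h \nu^{3}}{c^{2}} \, \frac{1}{e^{h\nu / k_{\mathrm B} T} - 1}, \]

where \(\nu\) is the frequency, \(T\) the temperature, \(h\) the Planck constant, \(k_{\mathrm B}\) the Boltzmann constant, and \(c\) the speed of light. The classical (Rayleigh–Jeans) expression lacks the exponential denominator, grows without bound at high frequencies, and so cannot reproduce the observed curve.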

At the time, electrons, atoms, and discrete oscillators were all exotic ideas to explain exotic phenomena. But in 1905 Albert Einstein proposed that light was also corpuscular, consisting of "energy quanta", in contradiction to the established science of light as a continuous wave, stretching back a hundred years to Thomas Young's work on diffraction.

Einstein's revolutionary proposal started by reanalyzing Planck's black-body theory, arriving at the same conclusions by using the new "energy quanta". Einstein then showed how energy quanta connected to Thomson's electron. In 1902, Philipp Lenard directed light from an arc lamp onto freshly cleaned metal plates housed in an evacuated glass tube. He measured the electric current coming off the metal plate, at higher and lower intensities of light and for different metals. Lenard showed that the amount of current – the number of electrons – depended on the intensity of the light, but that the velocity of these electrons did not depend on intensity. This is the photoelectric effect. The continuous wave theories of the time predicted that more light intensity would accelerate the same amount of current to higher velocity, contrary to this experiment. Einstein's energy quanta explained the increase in current: one electron is ejected for each quantum, so more quanta mean more electrons.

Einstein then predicted that the electron velocity would increase in direct proportion to the light frequency above a fixed value that depended upon the metal. Here the idea is that energy in energy-quanta depends upon the light frequency; the energy transferred to the electron comes in proportion to the light frequency. The type of metal gives a barrier, the fixed value, that the electrons must climb over to exit their atoms, to be emitted from the metal surface and be measured.
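
In modern notation, this prediction is usually written

\[ K_{\max} = h f - \varphi, \]

where \(K_{\max}\) is the maximum kinetic energy of an ejected electron, \(h\) is the Planck constant, \(f\) the frequency of the light, and the work function \(\varphi\) is the fixed, metal-dependent barrier described above. If \(hf\) is below \(\varphi\), no electrons are emitted no matter how intense the light.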

Ten years elapsed before Millikan's definitive experiment verified Einstein's prediction. During that time many scientists rejected the revolutionary idea of quanta. But Planck's and Einstein's concept was in the air and soon began to affect other physics and quantum theories.

Quantization of bound electrons in atoms

Experiments with light and matter in the late 1800s uncovered a reproducible but puzzling regularity. When light was shone through purified gases, certain frequencies (colors) did not pass. These dark absorption 'lines' followed a distinctive pattern: the gaps between the lines decreased steadily. By 1889, the Rydberg formula predicted the lines for hydrogen gas using only a constant number and the integers to index the lines. The origin of this regularity was unknown. Solving this mystery would eventually become the first major step toward quantum mechanics.
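
For hydrogen, the Rydberg formula takes the form

\[ \frac{1}{\lambda} = R_{\mathrm H} \left( \frac{1}{n_1^{2}} - \frac{1}{n_2^{2}} \right), \qquad n_2 > n_1, \]

where \(\lambda\) is the wavelength of a line, \(R_{\mathrm H}\) is the Rydberg constant, and the integers \(n_1\) and \(n_2\) index the lines: one constant plus whole numbers reproduces the entire pattern.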

Throughout the 19th century evidence grew for the atomic nature of matter. With Thomson's discovery of the electron in 1897, scientists began the search for a model of the interior of the atom. Thomson proposed negative electrons swimming in a pool of positive charge. Between 1908 and 1911, Rutherford showed that the positive part was concentrated in a region only 1/3000th of the diameter of the atom.

Models of "planetary" electrons orbiting a nuclear "Sun" were proposed, but cannot explain why the electron does not simply fall into the positive charge. In 1913 Niels Bohr and Ernest Rutherford connected the new atom models to the mystery of the Rydberg formula: the orbital radius of the electrons were constrained and the resulting energy differences matched the energy differences in the absorption lines. This meant that absorption and emission of light from atoms was energy quantized: only specific energies that matched the difference in orbital energy would be emitted or absorbed.

Trading one mystery – the regular pattern of the Rydberg formula – for another mystery – constraints on electron orbits – might not seem like a big advance, but the new atom model summarized many other experimental findings. The quantization of the photoelectric effect and now the quantization of the electron orbits set the stage for the final revolution.

Throughout both the early and the modern eras of quantum mechanics, the concept that classical mechanics must be valid macroscopically constrained possible quantum models. This concept was formalized by Bohr in 1923 as the correspondence principle. It requires quantum theory to converge to classical physics in the appropriate limits. A related concept is Ehrenfest's theorem, which shows that the average values obtained from quantum mechanics (e.g. position and momentum) obey classical laws.

Quantization of spin

Stern–Gerlach experiment: Silver atoms travelling through an inhomogeneous magnetic field, and being deflected up or down depending on their spin; (1) furnace, (2) beam of silver atoms, (3) inhomogeneous magnetic field, (4) classically expected result, (5) observed result

In 1922 Otto Stern and Walther Gerlach demonstrated that the magnetic properties of silver atoms defy classical explanation, the work contributing to Stern’s 1943 Nobel Prize in Physics. They fired a beam of silver atoms through a magnetic field. According to classical physics, the atoms should have emerged in a spray, with a continuous range of directions. Instead, the beam separated into two, and only two, diverging streams of atoms. Unlike the other quantum effects known at the time, this striking result involves the state of a single atom. In 1927, T.E. Phipps and J.B. Taylor obtained a similar, but less pronounced effect using hydrogen atoms in their ground state, thereby eliminating any doubts that may have been caused by the use of silver atoms.

In 1924, Wolfgang Pauli called it "two-valuedness not describable classically" and associated it with electrons in the outermost shell. These experiments led Samuel Goudsmit and George Uhlenbeck, working under the advice of Paul Ehrenfest, to propose in 1925 that the effect arises from the spin of the electron.

Quantization of matter

In 1924 Louis de Broglie proposed that electrons in an atom are constrained not in "orbits" but as standing waves. In detail his solution did not work, but his hypothesis – that the electron "corpuscle" moves in the atom as a wave – spurred Erwin Schrödinger to develop a wave equation for electrons; when applied to hydrogen, it accurately reproduced the Rydberg formula.

Example original electron diffraction photograph from the laboratory of G. P. Thomson, recorded 1925–1927

Max Born's 1924 paper "Zur Quantenmechanik" was the first use of the words "quantum mechanics" in print. His later work included developing quantum collision models; in a footnote to a 1926 paper he proposed the Born rule connecting theoretical models to experiment.

In 1927 at Bell Labs, Clinton Davisson and Lester Germer fired slow-moving electrons at a crystalline nickel target and observed a diffraction pattern, indicating the wave nature of the electron; the theory was fully explained by Hans Bethe. In a similar experiment, George Paget Thomson and Alexander Reid fired electrons at thin celluloid foils and later metal films, observed diffraction rings, and independently demonstrated the matter-wave nature of electrons.

Further developments

In 1928 Paul Dirac published his relativistic wave equation, which simultaneously incorporated relativity, predicted antimatter, and provided a complete theory for the Stern–Gerlach result. These successes launched a new fundamental understanding of our world at small scale: quantum mechanics.

Planck and Einstein started the revolution with quanta that broke down the continuous models of matter and light. Twenty years later "corpuscles" like electrons came to be modeled as continuous waves. This result came to be called wave-particle duality, an iconic idea that, along with the uncertainty principle, sets quantum mechanics apart from older models of physics.

Quantum radiation, quantum fields

In 1923 Compton demonstrated that the Planck-Einstein energy quanta from light also had momentum; three years later the "energy quanta" got a new name "photon". Despite its role in almost all stages of the quantum revolution, no explicit model for light quanta existed until 1927 when Paul Dirac began work on a quantum theory of radiation that became quantum electrodynamics. Over the following decades this work evolved into quantum field theory, the basis for modern quantum optics and particle physics.

Wave–particle duality

The concept of wave–particle duality says that neither the classical concept of "particle" nor of "wave" can fully describe the behavior of quantum-scale objects, either photons or matter. Wave–particle duality is an example of the principle of complementarity in quantum physics. An elegant example of wave-particle duality is the double-slit experiment.

The diffraction pattern produced when light is shone through one slit (top) and the interference pattern produced by two slits (bottom). Both patterns show oscillations due to the wave nature of light. The double slit pattern is more dramatic.

In the double-slit experiment, as originally performed by Thomas Young in 1803, and then Augustin Fresnel a decade later, a beam of light is directed through two narrow, closely spaced slits, producing an interference pattern of light and dark bands on a screen. The same behavior can be demonstrated in water waves: the double-slit experiment was seen as a demonstration of the wave nature of light.

Variations of the double-slit experiment have been performed using electrons, atoms, and even large molecules, and the same type of interference pattern is seen. Thus it has been demonstrated that all matter possesses wave characteristics.

If the source intensity is turned down, the same interference pattern will slowly build up, one "count" or particle (e.g. photon or electron) at a time. The quantum system acts as a wave when passing through the double slits, but as a particle when it is detected. This is a typical feature of quantum complementarity: a quantum system acts as a wave in an experiment to measure its wave-like properties, and like a particle in an experiment to measure its particle-like properties. The point on the detector screen where any individual particle shows up is the result of a random process. However, the distribution pattern of many individual particles mimics the diffraction pattern produced by waves.
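
How individual random detections build up a wave pattern can be illustrated with a short simulation. The sketch below is purely illustrative: the wavelength, slit spacing, and screen distance are made-up values, and the ideal two-slit fringe formula is used as the probability distribution for where each "count" lands.

    import numpy as np

    # Illustrative (made-up) double-slit parameters
    wavelength = 500e-9    # 500 nm light
    slit_sep = 50e-6       # 50 micrometre slit separation
    screen_dist = 1.0      # 1 m from slits to screen

    # Ideal two-slit fringe intensity across the screen
    x = np.linspace(-0.02, 0.02, 2000)                    # screen positions (m)
    phase = np.pi * slit_sep * x / (wavelength * screen_dist)
    intensity = np.cos(phase) ** 2

    # Treat the normalized intensity as a probability distribution and draw
    # one detection at a time; each lands at a single random position.
    prob = intensity / intensity.sum()
    rng = np.random.default_rng(0)
    hits = rng.choice(x, size=10_000, p=prob)

    # A histogram of many single-particle detections reproduces the fringes.
    counts, edges = np.histogram(hits, bins=100)
    print(counts)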

Uncertainty principle

Werner Heisenberg at the age of 26. Heisenberg won the Nobel Prize in Physics in 1932 for the work he did in the late 1920s.

Suppose it is desired to measure the position and speed of an object—for example, a car going through a radar speed trap. It can be assumed that the car has a definite position and speed at a particular moment in time. How accurately these values can be measured depends on the quality of the measuring equipment. If the precision of the measuring equipment is improved, it provides a result closer to the true value. It might be assumed that the speed of the car and its position could be operationally defined and measured simultaneously, as precisely as might be desired.

In 1927, Heisenberg proved that this last assumption is not correct. Quantum mechanics shows that certain pairs of physical properties, for example, position and speed, cannot be simultaneously measured, nor defined in operational terms, to arbitrary precision: the more precisely one property is measured, or defined in operational terms, the less precisely can the other be thus treated. This statement is known as the uncertainty principle. The uncertainty principle is not only a statement about the accuracy of our measuring equipment but, more deeply, is about the conceptual nature of the measured quantities—the assumption that the car had simultaneously defined position and speed does not work in quantum mechanics. On a scale of cars and people, these uncertainties are negligible, but when dealing with atoms and electrons they become critical.

Heisenberg gave, as an illustration, the measurement of the position and momentum of an electron using a photon of light. In measuring the electron's position, the higher the frequency of the photon, the more accurate is the measurement of the position of the impact of the photon with the electron, but the greater is the disturbance of the electron. This is because from the impact with the photon, the electron absorbs a random amount of energy, rendering the measurement obtained of its momentum increasingly uncertain, for one is necessarily measuring its post-impact disturbed momentum from the collision products and not its original momentum (momentum which should be simultaneously measured with position). With a photon of lower frequency, the disturbance (and hence uncertainty) in the momentum is less, but so is the accuracy of the measurement of the position of the impact.

At the heart of the uncertainty principle is a fact that for any mathematical analysis in the position and velocity domains, achieving a sharper (more precise) curve in the position domain can only be done at the expense of a more gradual (less precise) curve in the speed domain, and vice versa. More sharpness in the position domain requires contributions from more frequencies in the speed domain to create the narrower curve, and vice versa. It is a fundamental tradeoff inherent in any such related or complementary measurements, but it is only really noticeable at the smallest scales, near the size of elementary particles, where the Planck constant sets the limit.

The uncertainty principle shows mathematically that the product of the uncertainty in the position and momentum of a particle (momentum is velocity multiplied by mass) can never be less than a certain value, and that this value is related to the Planck constant.
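
Stated as a formula, the relation reads

\[ \Delta x \, \Delta p \ge \frac{\hbar}{2}, \]

where \(\Delta x\) and \(\Delta p\) are the uncertainties in position and momentum and \(\hbar = h / 2\pi\) is the reduced Planck constant. Because \(\hbar\) is of order \(10^{-34}\) joule-seconds, the bound is far below anything measurable for cars or people, but it dominates for electrons.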

Wave function collapse

Wave function collapse means that a measurement has forced or converted a quantum (probabilistic or potential) state into a definite measured value. This phenomenon is seen only in quantum mechanics, not in classical mechanics.

For example, before a photon actually "shows up" on a detection screen it can be described only with a set of probabilities for where it might show up. When it does appear, for instance in the CCD of an electronic camera, the time and space where it interacted with the device are known within very tight limits. However, the photon has disappeared in the process of being captured (measured), and its quantum wave function has disappeared with it. In its place, some macroscopic physical change in the detection screen has appeared, e.g., an exposed spot in a sheet of photographic film, or a change in electric potential in some cell of a CCD.

Eigenstates and eigenvalues

Because of the uncertainty principle, statements about both the position and momentum of particles can assign only a probability that the position or momentum has some numerical value. Therefore, it is necessary to formulate clearly the difference between the state of something indeterminate, such as an electron in a probability cloud, and the state of something having a definite value. When an object can definitely be "pinned-down" in some respect, it is said to possess an eigenstate.

In the Stern–Gerlach experiment discussed above, the quantum model predicts two possible values of spin for the atom compared to the magnetic axis. These two eigenstates are named arbitrarily 'up' and 'down'. The quantum model predicts these states will be measured with equal probability, but no intermediate values will be seen. This is what the Stern–Gerlach experiment shows.

The eigenstates of spin about the vertical axis are not simultaneously eigenstates of spin about the horizontal axis, so this atom has an equal probability of being found to have either value of spin about the horizontal axis. As described in the section above, measuring the spin about the horizontal axis can allow an atom that was measured as spin up to later be measured as spin down: measuring its spin about the horizontal axis collapses its wave function into one of the eigenstates of this measurement, which means it is no longer in an eigenstate of spin about the vertical axis, so it can take either value.
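
A minimal sketch of this behaviour, assuming ideal measurements and ignoring all experimental detail, can be written as a simulation of sequential spin measurements: an intervening horizontal measurement erases the result of the earlier vertical one, so a repeated vertical measurement can again give either value.

    import random

    def measure(state, axis):
        """Idealized spin-1/2 measurement.

        `state` is (axis, value) for the eigenstate the atom is currently in.
        Measuring along the same axis reproduces the stored value; measuring
        along a perpendicular axis gives 'up' or 'down' with equal probability
        and collapses the atom into the new eigenstate.
        """
        prev_axis, prev_value = state
        if axis == prev_axis:
            value = prev_value                     # eigenstate: definite outcome
        else:
            value = random.choice(["up", "down"])  # perpendicular axis: 50/50
        return value, (axis, value)                # collapsed post-measurement state

    random.seed(1)
    atom = ("z", "up")             # prepared spin-up along the vertical (z) axis
    v1, atom = measure(atom, "z")  # always "up"
    h, atom = measure(atom, "x")   # "up" or "down" with equal probability
    v2, atom = measure(atom, "z")  # may now be "down": the horizontal measurement
                                   # destroyed the earlier vertical eigenstate
    print(v1, h, v2)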

The Pauli exclusion principle

Wolfgang Pauli

In 1924, Wolfgang Pauli proposed a new quantum degree of freedom (or quantum number), with two possible values, to resolve inconsistencies between observed molecular spectra and the predictions of quantum mechanics. In particular, the spectrum of atomic hydrogen had a doublet, or pair of lines differing by a small amount, where only one line was expected. Pauli formulated his exclusion principle, stating, "There cannot exist an atom in such a quantum state that two electrons within [it] have the same set of quantum numbers."

A year later, Uhlenbeck and Goudsmit identified Pauli's new degree of freedom with the property called spin whose effects were observed in the Stern–Gerlach experiment.

Dirac wave equation

Paul Dirac (1902–1984)

In 1928, Paul Dirac extended the Pauli equation, which described spinning electrons, to account for special relativity. The result was a theory that dealt properly with events, such as the speed at which an electron orbits the nucleus, occurring at a substantial fraction of the speed of light. By using the simplest electromagnetic interaction, Dirac was able to predict the value of the magnetic moment associated with the electron's spin and found the experimentally observed value, which was too large to be that of a spinning charged sphere governed by classical physics. He was able to solve for the spectral lines of the hydrogen atom and to reproduce from physical first principles Sommerfeld's successful formula for the fine structure of the hydrogen spectrum.

Dirac's equations sometimes yielded a negative value for energy, for which he proposed a novel solution: he posited the existence of an antielectron and a dynamical vacuum. This led to the many-particle quantum field theory.

Quantum entanglement

In quantum physics, a group of particles can interact or be created together in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. This is known as quantum entanglement.

An early landmark in the study of entanglement was the Einstein–Podolsky–Rosen (EPR) paradox, a thought experiment proposed by Albert Einstein, Boris Podolsky and Nathan Rosen which argues that the description of physical reality provided by quantum mechanics is incomplete. In a 1935 paper titled "Can Quantum-Mechanical Description of Physical Reality be Considered Complete?", they argued for the existence of "elements of reality" that were not part of quantum theory, and speculated that it should be possible to construct a theory containing these hidden variables.

The thought experiment involves a pair of particles prepared in what would later become known as an entangled state. Einstein, Podolsky, and Rosen pointed out that, in this state, if the position of the first particle were measured, the result of measuring the position of the second particle could be predicted. If instead the momentum of the first particle were measured, then the result of measuring the momentum of the second particle could be predicted. They argued that no action taken on the first particle could instantaneously affect the other, since this would involve information being transmitted faster than light, which is forbidden by the theory of relativity. They invoked a principle, later known as the "EPR criterion of reality", positing that: "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of reality corresponding to that quantity." From this, they inferred that the second particle must have a definite value of both position and of momentum prior to either quantity being measured. But quantum mechanics considers these two observables incompatible and thus does not associate simultaneous values for both to any system. Einstein, Podolsky, and Rosen therefore concluded that quantum theory does not provide a complete description of reality. In the same year, Erwin Schrödinger used the word "entanglement" and declared: "I would not call that one but rather the characteristic trait of quantum mechanics."

The Irish physicist John Stewart Bell carried the analysis of quantum entanglement much further. He deduced that if measurements are performed independently on the two separated particles of an entangled pair, then the assumption that the outcomes depend upon hidden variables within each half implies a mathematical constraint on how the outcomes on the two measurements are correlated. This constraint would later be named the Bell inequality. Bell then showed that quantum physics predicts correlations that violate this inequality. Consequently, the only way that hidden variables could explain the predictions of quantum physics is if they are "nonlocal", which is to say that somehow the two particles are able to interact instantaneously no matter how widely they ever become separated. Performing experiments like those that Bell suggested, physicists have found that nature obeys quantum mechanics and violates Bell inequalities. In other words, the results of these experiments are incompatible with any local hidden variable theory.
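
The most widely tested version is the CHSH form of the Bell inequality. For measurement settings a, a′ on one particle and b, b′ on the other, any local hidden-variable theory requires

\[ |S| = \left| E(a,b) - E(a,b') + E(a',b) + E(a',b') \right| \le 2, \]

where E denotes the correlation between the two outcomes, while quantum mechanics predicts values of |S| up to \(2\sqrt{2} \approx 2.83\) for suitably chosen entangled states and settings. Experiments observe these larger values.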

Quantum field theory

The idea of quantum field theory began in the late 1920s with British physicist Paul Dirac, when he attempted to quantize the energy of the electromagnetic field, just as the energy of an electron in the hydrogen atom is quantized in quantum mechanics. Quantization is a procedure for constructing a quantum theory starting from a classical theory.

Merriam-Webster defines a field in physics as "a region or space in which a given effect (such as magnetism) exists". Other effects that manifest themselves as fields are gravitation and static electricity. In 2008, physicist Richard Hammond wrote:

Sometimes we distinguish between quantum mechanics (QM) and quantum field theory (QFT). QM refers to a system in which the number of particles is fixed, and the fields (such as the electromechanical field) are continuous classical entities. QFT ... goes a step further and allows for the creation and annihilation of particles ...

He added, however, that quantum mechanics is often used to refer to "the entire notion of quantum view".

In 1931, Dirac proposed the existence of particles that later became known as antimatter. Dirac shared the Nobel Prize in Physics for 1933 with Schrödinger "for the discovery of new productive forms of atomic theory".

Quantum electrodynamics

Quantum electrodynamics (QED) is the name of the quantum theory of the electromagnetic force. Understanding QED begins with understanding electromagnetism. Electromagnetism can be called "electrodynamics" because it is a dynamic interaction between electrical and magnetic forces. Electromagnetism begins with the electric charge.

Electric charges are the sources of, and create, electric fields. An electric field is a field that exerts a force on any particles that carry electric charges, at any point in space. This includes the electron, proton, and even quarks, among others. As a force is exerted, electric charges move, a current flows, and a magnetic field is produced. The changing magnetic field, in turn, causes electric current (often moving electrons). The physical description of interacting charged particles, electrical currents, electrical fields, and magnetic fields is called electromagnetism.

In 1928 Paul Dirac produced a relativistic quantum theory of electromagnetism. This was the progenitor to modern quantum electrodynamics, in that it had essential ingredients of the modern theory. However, the problem of unsolvable infinities developed in this relativistic quantum theory. Years later, renormalization largely solved this problem. Initially viewed as a provisional, suspect procedure by some of its originators, renormalization eventually was embraced as an important and self-consistent tool in QED and other fields of physics. Also, in the late 1940s Feynman diagrams provided a way to make predictions with QED by finding a probability amplitude for each possible way that an interaction could occur. The diagrams showed in particular that the electromagnetic force is the exchange of photons between interacting particles.

The Lamb shift is an example of a quantum electrodynamics prediction that has been experimentally verified. It is an effect whereby the quantum nature of the electromagnetic field makes the energy levels in an atom or ion deviate slightly from what they would otherwise be. As a result, spectral lines may shift or split.

Similarly, within a freely propagating electromagnetic wave, the current can also be just an abstract displacement current, instead of involving charge carriers. In QED, its full description makes essential use of short-lived virtual particles. There, QED again validates an earlier, rather mysterious concept.

Standard Model

The Standard Model of particle physics is the quantum field theory that describes three of the four known fundamental forces (electromagnetic, weak and strong interactions – excluding gravity) in the universe and classifies all known elementary particles. It was developed in stages throughout the latter half of the 20th century, through the work of many scientists worldwide, with the current formulation being finalized in the mid-1970s upon experimental confirmation of the existence of quarks. Since then, the discoveries of the top quark (1995), the tau neutrino (2000), and the Higgs boson (2012) have added further credence to the Standard Model. In addition, the Standard Model has predicted various properties of weak neutral currents and the W and Z bosons with great accuracy.

Although the Standard Model is believed to be theoretically self-consistent and has demonstrated success in providing experimental predictions, it leaves some physical phenomena unexplained and so falls short of being a complete theory of fundamental interactions. For example, it does not fully explain baryon asymmetry, incorporate the full theory of gravitation as described by general relativity, or account for the universe's accelerating expansion as possibly described by dark energy. The model does not contain any viable dark matter particle that possesses all of the required properties deduced from observational cosmology. It also does not incorporate neutrino oscillations and their non-zero masses. Accordingly, it is used as a basis for building more exotic models that incorporate hypothetical particles, extra dimensions, and elaborate symmetries (such as supersymmetry) to explain experimental results at variance with the Standard Model, such as the existence of dark matter and neutrino oscillations.

Interpretations

The physical measurements, equations, and predictions pertinent to quantum mechanics are all consistent and hold a very high level of confirmation. However, the question of what these abstract models say about the underlying nature of the real world has received competing answers. These interpretations are widely varying and sometimes somewhat abstract. For instance, the Copenhagen interpretation states that before a measurement, statements about a particle's properties are completely meaningless, while the many-worlds interpretation describes the existence of a multiverse made up of every possible universe.

Light behaves in some aspects like particles and in other aspects like waves. Matter—the "stuff" of the universe consisting of particles such as electrons and atoms—exhibits wavelike behavior too. Some light sources, such as neon lights, give off only certain specific frequencies of light, a small set of distinct pure colors determined by neon's atomic structure. Quantum mechanics shows that light, along with all other forms of electromagnetic radiation, comes in discrete units, called photons, and predicts its spectral energies (corresponding to pure colors), and the intensities of its light beams. A single photon is a quantum, or smallest observable particle, of the electromagnetic field. A partial photon is never experimentally observed. More broadly, quantum mechanics shows that many properties of objects, such as position, speed, and angular momentum, that appeared continuous in the zoomed-out view of classical mechanics, turn out to be (in the very tiny, zoomed-in scale of quantum mechanics) quantized. Such properties of elementary particles are required to take on one of a set of small, discrete allowable values, and since the gap between these values is also small, the discontinuities are only apparent at very tiny (atomic) scales.

Applications

Everyday applications

The relationship between the frequency of electromagnetic radiation and the energy of each photon is why ultraviolet light can cause sunburn, but visible or infrared light cannot. A photon of ultraviolet light delivers a high amount of energy—enough to contribute to cellular damage such as occurs in a sunburn. A photon of infrared light delivers less energy—only enough to warm one's skin. So, an infrared lamp can warm a large surface, perhaps large enough to keep people comfortable in a cold room, but it cannot give anyone a sunburn.
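
A rough worked comparison using the photon energy relation \(E = hf = hc/\lambda\), with round illustrative wavelengths, makes the difference concrete:

\[ E_{\mathrm{UV}} \approx \frac{1240\ \mathrm{eV\,nm}}{300\ \mathrm{nm}} \approx 4.1\ \mathrm{eV}, \qquad E_{\mathrm{IR}} \approx \frac{1240\ \mathrm{eV\,nm}}{1000\ \mathrm{nm}} \approx 1.2\ \mathrm{eV}. \]

A few electronvolts is comparable to the energy of a chemical bond, which is why a single ultraviolet photon can damage molecules in skin while an infrared photon mostly just deposits heat.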

Technological applications

Applications of quantum mechanics include the laser, the transistor, the electron microscope, and magnetic resonance imaging. A special class of quantum mechanical applications is related to macroscopic quantum phenomena such as superfluid helium and superconductors. The study of semiconductors led to the invention of the diode and the transistor, which are indispensable for modern electronics.

In even a simple light switch, quantum tunneling is absolutely vital, as otherwise the electrons in the electric current could not penetrate the potential barrier made up of a layer of oxide. Flash memory chips found in USB drives also use quantum tunneling, to erase their memory cells.

Big cat

 
Big cats
The genus Panthera, from top to bottom: the tiger, the lion, the jaguar, the leopard, and the snow leopard.
Scientific classification
Kingdom: Animalia
Phylum: Chordata
Class: Mammalia
Order: Carnivora
Superfamily: Feloidea
Family: Felidae

The term "big cat" is used by zoologists to mean any of the five living members of the genus Panthera (the tiger, lion, jaguar, leopard, and snow leopard). In non-scientific contexts, "big cat" can also mean any member of the cat family that is considered "big", including animals like cheetahs and cougars that taxonomically fall under the small cats.

All cats are members of the Felidae family, sharing similar musculature, cardiovascular systems, skeletal frames, and behaviour. Both the cheetah and cougar differ physically from fellow big cats, and to a greater extent, other small cats. As obligate carnivores, big cats are considered apex predators, topping their food chain without natural predators of their own. Native ranges include the Americas, Africa, and Asia; the ranges of the leopard and tiger also extend into Europe, specifically in Russia.


Evolution

It is estimated that the ancestors of most big cats split away from the Felinae about 6.37 million years ago. The Felinae, on the other hand, comprises mostly small to medium-sized cats, including domestic cats, but also some larger cats such as the cougar and cheetah.

A 2010 study published in Molecular Phylogenetics and Evolution has given insight into the exact evolutionary relationships among members of genus Panthera. The study reveals that the snow leopard and the tiger are sister species, while the lion, leopard, and jaguar are more closely related to each other. The tiger and snow leopard diverged from the ancestral big cats approximately 3.9 Ma. The tiger then evolved into a unique species towards the end of the Pliocene epoch, approximately 3.2 Ma. The ancestor of the lion, leopard, and jaguar split from other big cats between 4.3 and 3.8 Ma. Between 3.6 and 2.5 Ma, the jaguar diverged from the ancestor of lions and leopards. Lions and leopards split from one another approximately 2 Ma. The earliest Panthera fossil, P. blytheae, dating to 4.1–5.95 Ma, was discovered in southwest Tibet.

Description and abilities

Roaring

The ability to roar comes from an elongated and specially adapted larynx and hyoid apparatus. The larynx is attached to the hyoid bone, which hangs from a chain of bones – the tympanohyal, stylohyal, epihyal, and ceratohyal – located in the mandible and skull. In the larynx, the vocal folds provide the structure needed to stretch the ligament to a length that creates the roar. This tissue is made of thick collagen and elastic fiber that becomes denser as it approaches the epithelial mucosal lining. When this large pad folds it creates a low natural frequency, causing the cartilage walls of the larynx to vibrate; as it vibrates, the airflow moves from high to low resistance, producing the roar.

The lion's larynx is the longest, giving it the most robust roar. In good conditions, the roar can be heard 8 or even 10 km (5 or 6 mi) away. All five extant members of the genus Panthera have this elongated hyoid, but owing to differences in the larynx the snow leopard cannot roar. Unlike the roaring cats in its family, the snow leopard lacks the large pad of fibro-elastic tissue that allows for a large vocal fold.

Weight range

The range of weights exhibited by the species is large. At the bottom, adult snow leopards usually weigh 22 to 55 kg (49 to 121 lb), with an exceptional specimen reaching 75 kg (165 lb).

Male and female lions typically weigh 150–250 kg (330–550 lb) and 110–182 kg (243–401 lb) respectively, and male and female tigers 100–306 kg (220–675 lb) and 75–167 kg (165–368 lb) respectively. Exceptionally heavy male lions and tigers have been recorded to exceed 306 kg (675 lb) in the wilderness, and weigh around 450 kg (990 lb) in captivity.

The liger, a hybrid of a lion and tiger, can grow to be much larger than its parent species. In particular, a liger called 'Nook' is reported to have weighed over 550 kg (1,210 lb).

Interaction with humans

Conservation

An animal sanctuary provides a refuge for animals to live out their natural lives in a protected environment. Usually, these animal sanctuaries are the organizations which provide a home to big cats whose private owners are no longer able or willing to care for their big cats. However, the use of the word sanctuary in an organization's name is by itself no guarantee that it is a true animal sanctuary in the sense of a refuge. To be accepted by the United States Fish and Wildlife Service (FWS) as a bona fide animal sanctuary and to be eligible for an exemption from the prohibition of interstate movement of big cats under the Captive Wildlife Safety Act (CWSA), organizations must meet the following criteria:

  • Must be a non-profit entity that is tax-exempt under section 501(a) of the Internal Revenue Code
  • Cannot engage in commercial trade in big cat species, including their offspring, parts, and products made from them
  • Cannot breed big cats
  • Cannot allow direct contact between big cats and the public at their facilities
  • Must keep records of transactions involving covered cats
  • Must allow the service to inspect their facilities, records, and animals at reasonable hours

Internationally, a variety of regulations are placed on big cat possession. In Austria, big cats may only be owned in a qualified zoo which is overseen by a zoologist or veterinarian. Requirements must also be met for enclosures, feeding, and training practices. Both Russia and South Africa regulate private ownership of big cats native to each country. Some countries, including Denmark, Thailand and India, prohibit all private ownership of big cats.

Threats

The members of the Panthera genus are classified as some level of threatened by the IUCN Red List: the lion, leopard and snow leopard are categorized as Vulnerable; the tiger is listed as Endangered; and the jaguar is listed as Near Threatened. Cheetahs are also classified as Vulnerable, and the cougar is of Least Concern. All species currently have populations that are decreasing. The principal threats to big cats vary by geographic location but primarily consist of habitat destruction and poaching. In Africa, many big cats are hunted by pastoralists or government "problem animal control" officers. Certain protected areas exist that shelter large and exceptionally visible populations of African leopards, lions and cheetahs, such as Botswana's Chobe, Kenya's Masai Mara, and Tanzania's Serengeti; outside these conservation areas, hunting poses the dominant threat to large carnivores.

In the United States, 19 states have banned ownership of big cats and other dangerous exotic animals as pets, and the Captive Wildlife Safety Act bans the interstate sale and transportation of these animals. The initial Captive Wildlife Safety Act (CWSA) was signed into law on December 19, 2003. To address problems associated with the increasing trade in certain big cat species, the CWSA regulations were strengthened by a law passed on September 17, 2007. The big cat species addressed in these regulations are the lion, tiger, leopard, snow leopard, clouded leopard, cheetah, jaguar, cougar, and any hybrid of these species (liger, tigon, etc.). Private ownership is not prohibited, but the law makes it illegal to transport, sell, or purchase such animals in interstate or foreign commerce. Although these regulations seem to provide a strong legal framework for controlling the commerce involving big cats, international organizations such as the World Wildlife Fund (WWF) have encouraged the U.S. to further strengthen these laws. The WWF is concerned that weaknesses in the existing U.S. regulations could be unintentionally helping to fuel the black market for tiger parts.

Last universal common ancestor

Phylogenetic tree linking all major groups of living organisms, namely the bacteria, archaea, and eukarya, as proposed by Woese et al. 1990, with the last universal common ancestor (LUCA) shown at the root

The last universal common ancestor (LUCA) is the hypothesized common ancestral cell population from which all subsequent life forms descend, including Bacteria, Archaea, and Eukarya. The cell had a lipid bilayer; it possessed the genetic code and ribosomes which translated from DNA or RNA to proteins. Although the timing of the LUCA cannot be definitively constrained, most studies suggest that the LUCA existed by 3.5 billion years ago, and possibly as early as 4.3 billion years ago or earlier. The nature of this point or stage of divergence remains a topic of research.

All earlier forms of life preceding this divergence and all extant organisms are generally thought to share common ancestry. On the basis of a formal statistical test, this theory of a universal common ancestry (UCA) is supported in preference to competing multiple-ancestry hypotheses. The first universal common ancestor (FUCA) is a hypothetical non-cellular ancestor to LUCA and other now-extinct sister lineages.

Whether the genesis of viruses falls before or after the LUCA, as well as the diversity of extant viruses and their hosts, remains a subject of investigation.

While no fossil evidence of the LUCA exists, the detailed biochemical similarity of all current life (divided into the three domains) makes its existence widely accepted by biochemists. Its characteristics can be inferred from shared features of modern genomes. These genes describe a complex life form with many co-adapted features, including transcription and translation mechanisms to convert information from DNA to mRNA to proteins.

Historical background

A tree of life, like this one from Charles Darwin's notebooks c. July 1837, implies a single common ancestor at its root (labelled "1").

A phylogenetic tree directly portrays the idea of evolution by descent from a single ancestor. An early tree of life was sketched by Jean-Baptiste Lamarck in his Philosophie zoologique in 1809. Charles Darwin more famously proposed the theory of universal common descent through an evolutionary process in his book On the Origin of Species in 1859: "Therefore I should infer from analogy that probably all the organic beings which have ever lived on this earth have descended from some one primordial form, into which life was first breathed." The last sentence of the book begins with a restatement of the hypothesis:

There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one ...

The term "last universal common ancestor" or "LUCA" was first used in the 1990s for such a primordial organism.

Inferring LUCA's features

Biochemical mechanisms

While the anatomy of the LUCA cannot be reconstructed with certainty, its biochemical mechanisms can be deduced and described in some detail, based on properties shared by currently living organisms as well as genetic analysis.

The LUCA certainly had genes and a genetic code. Its genetic material was most likely DNA, so that it lived after the RNA world. The DNA was kept double-stranded by an enzyme, DNA polymerase, which recognises the structure and directionality of DNA. The integrity of the DNA was maintained by a group of repair enzymes including DNA topoisomerase. If the genetic code was based on dual-stranded DNA, it was expressed by copying the information to single-stranded RNA. The RNA was produced by a DNA-dependent RNA polymerase using nucleotides similar to those of DNA. It had multiple DNA-binding proteins, such as histone-fold proteins. The genetic code was expressed into proteins. These were assembled from 20 free amino acids by translation of a messenger RNA via a mechanism of ribosomes, transfer RNAs, and a group of related proteins.

Although LUCA was likely not capable of sexual interaction, gene functions were present that promoted the transfer of DNA between individuals of the population to facilitate genetic recombination. Homologous gene products that promote genetic recombination are present in bacteria, archaea and eukaryota, such as the RecA protein in bacteria, the RadA protein in archaea, and the Rad51 and Dmc1 proteins in eukaryota.

The functionality of LUCA as well as evidence for the early evolution of membrane-dependent biological systems together suggest that LUCA had cellularity and cell membranes. As for the cell's structure, it contained a water-based cytoplasm effectively enclosed by a lipid bilayer membrane; it was capable of reproducing by cell division. It tended to exclude sodium and concentrate potassium by means of specific ion transporters (or ion pumps). The cell multiplied by duplicating all its contents followed by cellular division. The cell used chemiosmosis to produce energy. It also reduced CO2 and oxidized H2 (methanogenesis or acetogenesis) via acetyl-thioesters.

By phylogenetic bracketing, analysis of the presumed LUCA's offspring groups, LUCA appears to have been a small, single-celled organism. It likely had a ring-shaped coil of DNA floating freely within the cell. Morphologically, it would likely not have stood out within a mixed population of small modern-day bacteria. The originator of the three-domain system, Carl Woese, stated that in its genetic machinery, the LUCA would have been a "simpler, more rudimentary entity than the individual ancestors that spawned the three [domains] (and their descendants)".

Because both bacteria and archaea have differences in the structure of phospholipids and cell wall, ion pumping, most proteins involved in DNA replication, and glycolysis, it is inferred that LUCA had a permeable membrane without an ion pump. The emergence of Na+/H+ antiporters likely led to the evolution of impermeable membranes present in eukaryotes, archaea, and bacteria. It is stated that "The late and independent evolution of glycolysis but not gluconeogenesis is entirely consistent with LUCA being powered by natural proton gradients across leaky membranes. Several discordant traits are likely to be linked to the late evolution of cell membranes, notably the cell wall, whose synthesis depends on the membrane and DNA replication". Although LUCA likely had DNA, it is unknown if it could replicate DNA and is suggested that it "might just have been a chemically stable repository for RNA-based replication". It is likely that the permeable membrane of LUCA was composed of archaeal lipids (isoprenoids) and bacterial lipids (fatty acids). Isoprenoids would have enhanced stabilization of LUCA's membrane in the surrounding extreme habitat. Nick Lane and coauthors state that "The advantages and disadvantages of incorporating isoprenoids into cell membranes in different microenvironments may have driven membrane divergence, with the later biosynthesis of phospholipids giving rise to the unique G1P and G3P headgroups of archaea and bacteria respectively. If so, the properties conferred by membrane isoprenoids place the lipid divide as early as the origin of life".

A 2024 study suggests that LUCA's genome was similar in size to that of modern prokaryotes, coding for some 2,600 proteins; that it respired anaerobically and was an acetogen; and that it had an early CRISPR-Cas-based antiviral immune system.

An anaerobic thermophile

A direct way to infer LUCA's genome would be to find genes common to all surviving descendants, but little can be learnt by this approach, as there are only about 30 such genes. They are mostly genes for ribosomal proteins, showing that LUCA had the genetic code. Many other LUCA genes have been lost in later lineages over 4 billion years of evolution.
 
Three ways to infer genes present in LUCA: universal presence, presence in both the Bacterial and Archaeal domains, and presence in two phyla in both domains. The first yields, as stated, only about 30 genes; the second, some 11,000, with lateral gene transfer (LGT) very likely; the third, 355 genes probably in LUCA, since they were found in at least two phyla in both domains, making LGT an unlikely explanation.

An alternative to the search for "universal" traits is to use genome analysis to identify phylogenetically ancient genes. This gives a picture of a LUCA that could live in a geochemically harsh environment and is like modern prokaryotes. Analysis of biochemical pathways implies the same sort of chemistry as does phylogenetic analysis.

LUCA systems and environment, including the Wood–Ljungdahl or reductive acetyl–CoA pathway to fix carbon, and most likely DNA complete with the genetic code and enzymes to replicate it, transcribe it to RNA, and translate it to proteins.

In 2016, Madeline C. Weiss and colleagues genetically analyzed 6.1 million protein-coding genes and 286,514 protein clusters from sequenced prokaryotic genomes representing many phylogenetic trees, and identified 355 protein clusters that were probably common to the LUCA. The results of their analysis are highly specific, though debated. They depict LUCA as "anaerobic, CO2-fixing, H2-dependent with a Wood–Ljungdahl pathway (the reductive acetyl-coenzyme A pathway), N2-fixing and thermophilic. LUCA's biochemistry was replete with FeS clusters and radical reaction mechanisms." The cofactors also reveal "dependence upon transition metals, flavins, S-adenosyl methionine, coenzyme A, ferredoxin, molybdopterin, corrins and selenium. Its genetic code required nucleoside modifications and S-adenosylmethionine-dependent methylations." They show that methanogens and clostridia were basal, near the root of the phylogenetic tree, in the 355 protein lineages examined, and that the LUCA may therefore have inhabited an anaerobic hydrothermal vent setting in a geochemically active environment rich in H2, CO2, and iron, where ocean water interacted with hot magma beneath the ocean floor. It is even inferred that LUCA also grew from H2 and CO2 via the reverse incomplete Krebs cycle. Other metabolic pathways inferred in LUCA are the pentose phosphate pathway, glycolysis, and gluconeogenesis. Even if phylogenetic evidence may point to a hydrothermal vent environment for a thermophilic LUCA, this does not constitute evidence that the origin of life took place at a hydrothermal vent since mass extinctions may have removed previously existing branches of life.

The LUCA used the Wood–Ljungdahl or reductive acetyl–CoA pathway to fix carbon, if it was an autotroph, or to respire anaerobically, if it was a heterotroph.

Weiss and colleagues write that "Experiments ... demonstrate that ... acetyl-CoA pathway [chemicals used in anaerobic respiration] formate, methanol, acetyl moieties, and even pyruvate arise spontaneously ... from CO2, native metals, and water", a combination present in hydrothermal vents.

An experiment shows that Zn2+, Cr3+, and Fe can promote 6 of the 11 reactions of an ancient anabolic pathway called the reverse Krebs cycle in acidic conditions which implies that LUCA might have inhabited either hydrothermal vents or acidic metal-rich hydrothermal fields.

Undersampled protein families

Some other researchers have challenged Weiss et al.'s 2016 conclusions. Sarah Berkemer and Shawn McGlynn argue that Weiss et al. undersampled the families of proteins, so that the phylogenetic trees were not complete and failed to describe the evolution of proteins correctly. There are two risks in attempting to attribute LUCA's environment from near-universal gene distribution (as in Weiss et al. 2016). On the one hand, it risks misattributing convergence or horizontal gene transfer events to vertical descent; on the other hand, it risks misattributing potential LUCA gene families as horizontal gene transfer events. A phylogenomic and geochemical analysis of a set of proteins that probably traced to the LUCA shows that it had K+-dependent GTPases, and that the ionic composition and concentration of its intracellular fluid was seemingly a high K+/Na+ ratio with NH4+, Fe2+, Co2+, Ni2+, Mg2+, Mn2+, Zn2+, pyrophosphate, and PO43−, which would imply a terrestrial hot spring habitat. It possibly had a phosphate-based metabolism. Further, these proteins were unrelated to autotrophy (the ability of an organism to create its own organic matter), suggesting that the LUCA had a heterotrophic lifestyle (consuming organic matter) and that its growth was dependent on organic matter produced by the physical environment.

The presence of the energy-handling enzymes CODH/acetyl-coenzyme A synthase in LUCA could be compatible not only with being an autotroph but also with life as a mixotroph or heterotroph. Weiss et al. in 2018 replied that no enzyme defines a trophic lifestyle, and that heterotrophs evolved from autotrophs.

A 2024 study directly estimated the order in which amino acids were added into the genetic code from early protein domain sequences. A total of 969 protein domains were classified as present in LUCA, including 101 domain sequences that dated back to the even-older pre-LUCA communities. 88% of the protein domains annotated as LUCA or pre-LUCA were confirmed by Moody et al. 2024, by being associated with proteins that are more than 50% likely to be present in LUCA. It found that amino acids that bind metals, and those that contain sulphur, came early in the genetic code. The study suggests that sulphur metabolism and catalysis involving metals were important elements of life at the time of LUCA.

Possibly a mesophile

Several lines of evidence suggest that LUCA was non-thermophilic. The content of G + C nucleotide pairs (compared to the occurrence of A + T pairs) can indicate an organism's thermal optimum, as G–C pairs are more thermally stable due to an additional hydrogen bond and therefore occur more frequently in the rRNA of thermophiles; however, this enrichment is not seen in LUCA's reconstructed rRNA.

The identification of thermophilic genes in the LUCA has been challenged, as they may instead represent genes that evolved later in archaea or bacteria, then migrated between these via horizontal gene transfer, as in Woese's 1998 hypothesis. For instance, the thermophile-specific topoisomerase, reverse gyrase, was initially attributed to LUCA before an exhaustive phylogenetic study revealed a more recent origin of this enzyme followed by extensive horizontal gene transfer. LUCA could have been a mesophile that fixed CO2 and relied on H2, and lived close to hydrothermal vents.

Further evidence that LUCA was mesophilic comes from the amino acid composition of its proteins. The abundance of the amino acids I, V, Y, W, R, E, and L (denoted IVYWREL) in an organism's proteins is correlated with its optimal growth temperature. According to phylogenetic analysis, the IVYWREL content of LUCA's proteins suggests its optimal growth temperature was below 50°C.
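To make the correlation concrete, here is a minimal Python sketch that computes the IVYWREL fraction of a protein sequence and converts it to a rough growth-temperature estimate via a simple linear relation; the sequence and the slope/intercept values are illustrative placeholders, not the regression actually fitted in the studies discussed above.

IVYWREL = set("IVYWREL")

def ivywrel_fraction(protein: str) -> float:
    """Fraction of residues belonging to the IVYWREL set."""
    protein = protein.upper()
    return sum(aa in IVYWREL for aa in protein) / len(protein)

def estimate_ogt(fraction: float, slope: float = 937.0, intercept: float = -335.0) -> float:
    # Illustrative linear fit: optimal growth temperature (deg C) as a
    # function of IVYWREL fraction; coefficients are placeholders.
    return slope * fraction + intercept

seq = "MKALIGDSEVNQWTRPHLEA"  # hypothetical ancestral-like fragment
f = ivywrel_fraction(seq)
print(f"IVYWREL fraction: {f:.2f}, estimated growth optimum: {estimate_ogt(f):.0f} deg C")

Under this toy relation, an IVYWREL fraction of about 0.40 yields a mesophilic optimum near 40°C, in line with the sub-50°C estimate above.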

Evidence that bacteria and archaea each independently underwent phases of increased and subsequently decreased thermotolerance suggests a dramatic post-LUCA climate shift that affected both populations, and would explain the apparent pervasiveness of thermotolerance genes.

Age

Studies from 2000 to 2018 have suggested an increasingly ancient time for the LUCA. In 2000, estimates of the LUCA's age ranged from 3.8 to 3.5 bya (billion years ago) in the Paleoarchean, a few hundred million years before the earliest fossil evidence of life, for which candidates range in age from 4.28 to 3.48 bya. This placed the origin of the first forms of life shortly after the Late Heavy Bombardment which was thought to have repeatedly sterilized Earth's surface. However, a 2018 study by Holly Betts and colleagues applied a molecular clock model to the genomic and fossil record (102 species, 29 common protein-coding genes, mostly ribosomal), concluding that LUCA preceded the Late Heavy Bombardment (making the LUCA over 3.9 bya). A 2022 study suggested an age of around 4.2–3.6 bya for the LUCA. A 2024 study suggested that the LUCA lived around 4.2 bya (with a confidence interval of 4.33–4.09 bya).
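The molecular-clock reasoning behind such dates can be illustrated with a very simple strict-clock calculation in Python; the distance and rate values below are hypothetical placeholders, and real analyses such as Betts et al.'s use relaxed clocks with fossil calibrations rather than this toy formula.

def strict_clock_divergence_time(distance: float, rate: float) -> float:
    # Under a strict clock, genetic distance accumulates along both lineages,
    # so the time since divergence is distance / (2 * rate).
    return distance / (2.0 * rate)

d = 1.5        # hypothetical substitutions per site between two lineages
r = 2.0e-10    # hypothetical substitutions per site per year
t = strict_clock_divergence_time(d, r)
print(f"Estimated divergence: {t / 1e9:.2f} billion years ago")  # about 3.75 bya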

Root of the tree of life

2005 tree of life showing horizontal gene transfers between branches including (coloured lines) the symbiogenesis of plastids and mitochondria. "Horizontal gene transfer and how it has impacted the evolution of life is presented through a web connecting bifurcating branches that complicate, yet do not erase, the tree of life".

In 1990, a novel concept of the tree of life was presented, dividing the living world into three stems, classified as the domains Bacteria, Archaea, and Eukarya. It was the first tree founded exclusively on molecular phylogenetics, and the first to include the evolution of microorganisms. It has been called a "universal phylogenetic tree in rooted form". This tree and its rooting became the subject of debate.

In the meantime, numerous modifications of this tree have been suggested, mainly concerning the role and importance of horizontal gene transfer for its rooting and early ramifications. Since heredity occurs both vertically and horizontally, the tree of life may have been more weblike or netlike in its early phase and more treelike once it had grown into three stems. Presumably horizontal gene transfer decreased with growing cell stability.

A modified version of the tree, based on several molecular studies, has its root between a monophyletic domain Bacteria and a clade formed by Archaea and Eukaryota. A small minority of studies place the root in the domain Bacteria, in the phylum Bacillota, or state that the phylum Chloroflexota (formerly Chloroflexi) is basal to a clade comprising Archaea, Eukaryotes, and the rest of the bacteria (as proposed by Thomas Cavalier-Smith). Metagenomic analyses recover a two-domain system with the domains Archaea and Bacteria; in this view of the tree of life, Eukaryotes are derived from Archaea. Once the later gene pool of LUCA's descendants shared a common framework, the AT/GC rule and the standard twenty amino acids, horizontal gene transfer would have become feasible and could have been common.

The nature of LUCA remains disputed. In 1994, on the basis of primordial metabolism (as discussed by Wächtershäuser), Otto Kandler proposed a successive divergence of the three domains of life from a multiphenotypical population of pre-cells, reached by gradual evolutionary improvements (cellularization). The phenotypically diverse pre-cells of this population were metabolising, self-reproducing entities exhibiting frequent mutual exchange of genetic information. Thus, in this scenario there was no "first cell". It may explain the unity and, at the same time, the partition into three lines (the three domains) of life. Kandler's pre-cell theory is supported by Wächtershäuser. In 1998, Carl Woese, based on the RNA world concept, proposed that no individual organism could be considered a LUCA, and that the genetic heritage of all modern organisms derived through horizontal gene transfer among an ancient community of organisms. Other authors concur that there was a "complex collective genome" at the time of the LUCA, and that horizontal gene transfer was important in the evolution of later groups; Nicolas Glansdorff states that LUCA "was in a metabolically and morphologically heterogeneous community, constantly shuffling around genetic material" and "remained an evolutionary entity, though loosely defined and constantly changing, as long as this promiscuity lasted."

The theory of a universal common ancestry of life is widely accepted. In 2010, based on "the vast array of molecular sequences now available from all domains of life," D. L. Theobald published a "formal test" of universal common ancestry (UCA). This deals with the common descent of all extant terrestrial organisms, each being a genealogical descendant of a single species from the distant past. His formal test favoured the existence of a universal common ancestry over a wide class of alternative hypotheses that included horizontal gene transfer. Basic biochemical principles imply that all organisms do have a common ancestry.
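The logic of such a "formal test" can be sketched as penalized model comparison: each hypothesis (for example, a single common ancestry versus independent origins) is scored by how well it explains the sequence data, with a penalty for extra parameters. The Python sketch below uses the Akaike information criterion with made-up log-likelihoods and parameter counts; it illustrates the selection logic only and is not Theobald's actual analysis or values.

def aic(log_likelihood: float, n_params: int) -> float:
    # Akaike information criterion: lower scores indicate a better trade-off
    # between fit and model complexity.
    return 2 * n_params - 2 * log_likelihood

models = {
    "universal common ancestry": aic(log_likelihood=-10000.0, n_params=120),
    "independent origins": aic(log_likelihood=-10400.0, n_params=150),
}
best = min(models, key=models.get)
print(f"Preferred model: {best}")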

A proposed non-cellular ancestor to LUCA is the First universal common ancestor (FUCA). FUCA would therefore be the ancestor of every modern cell as well as of ancient, now-extinct cellular lineages that did not descend from LUCA. FUCA is assumed to have had descendants other than LUCA, none of which have modern descendants. Some genes of these ancient, now-extinct cell lineages are thought to have been horizontally transferred into the genome of early descendants of LUCA.

LUCA and viruses

The origin of viruses remains disputed. Since viruses need host cells for their replication, it is likely that they emerged after the formation of cells. Viruses may even have multiple origins and different types of viruses may have evolved independently over the history of life. There are different hypotheses for the origins of viruses, for instance an early viral origin from the RNA world or a later viral origin from selfish DNA.

Based on how viruses are currently distributed across the bacteria and archaea, the LUCA is suspected of having been prey to multiple viruses, ancestral to those that now have those two domains as their hosts. Furthermore, extensive virus evolution seems to have preceded the LUCA, since the jelly-roll structure of capsid proteins is shared by RNA and DNA viruses across all three domains of life. LUCA's viruses were probably mainly dsDNA viruses in the groups called Duplodnaviria and Varidnaviria. Two other single-stranded DNA virus groups within the Monodnaviria, the Microviridae and the Tubulavirales, likely infected the last bacterial common ancestor. The last archaeal common ancestor was probably host to spindle-shaped viruses. All of these could well have affected the LUCA, in which case each must since have been lost in the host domain where it is no longer extant. By contrast, RNA viruses do not appear to have been important parasites of LUCA, even though straightforward thinking might have envisaged viruses as beginning with RNA viruses directly derived from an RNA world. Instead, by the time the LUCA lived, RNA viruses had probably already been out-competed by DNA viruses.

LUCA might have been the ancestor to some viruses, as it might have had at least two descendants: LUCELLA, the Last Universal Cellular Ancestor, the ancestor to all cells, and the archaic virocell ancestor, the ancestor to large-to-medium-sized DNA viruses. Viruses might have evolved before LUCA but after the First universal common ancestor (FUCA), according to the reduction hypothesis, where giant viruses evolved from primordial cells that became parasitic.
