
Tuesday, January 25, 2022

Missing baryon problem

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Missing_baryon_problem

In cosmology, the missing baryon problem is an observed discrepancy between the amount of baryonic matter detected from shortly after the Big Bang and from more recent epochs. Observations of the cosmic microwave background and Big Bang nucleosynthesis studies have set constraints on the abundance of baryons in the early universe, finding that baryonic matter accounts for approximately 4.8% of the energy contents of the Universe. At the same time, a census of baryons in the recent observable universe has found that observed baryonic matter accounts for less than half of that amount. This discrepancy is commonly known as the missing baryon problem. The missing baryon problem is different from the dark matter problem, which is non-baryonic in nature.

Early universe measurements

The abundance of baryonic matter in the early universe can be obtained indirectly from two independent methods:

  • The theory of Big Bang nucleosynthesis, which predicts the observed relative abundances of the chemical elements in the recent universe. Higher numbers of baryons in the early universe should produce higher ratios of helium, lithium, and heavier elements relative to hydrogen. Agreement with observed abundances requires that baryonic matter makes up between 4–5% of the universe's critical density.
  • Detailed analysis of the small fluctuations (anisotropies) in the cosmic microwave background (CMB), especially the second peak of the CMB power spectrum. Baryonic matter interacts with photons and therefore leaves a visible imprint on the CMB. CMB analysis also yields a baryon fraction on the order of 5%.

The CMB constraint is much more precise than the BBN constraint, but the two are in agreement.
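As a back-of-the-envelope check, the ~4.8% baryon fraction above can be converted into a mean baryon number density. The Hubble constant value used here (H0 ≈ 70 km/s/Mpc) is an assumption for illustration, not a figure from the text:

```python
import math

G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
m_p = 1.673e-27             # proton mass, kg
H0 = 70 * 1000 / 3.086e22   # assumed Hubble constant, ~70 km/s/Mpc in s^-1
Omega_b = 0.048             # baryon fraction from CMB/BBN (~4.8%)

rho_crit = 3 * H0**2 / (8 * math.pi * G)  # critical density, kg/m^3
rho_b = Omega_b * rho_crit                # mean baryonic mass density
n_b = rho_b / m_p                         # baryons per cubic meter

print(f"critical density ~ {rho_crit:.2e} kg/m^3")
print(f"mean baryon density ~ {n_b:.2f} per m^3")
```

The answer, roughly a quarter of a baryon per cubic meter, shows just how diffuse the matter being counted is.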

Late universe observations

The density of baryonic matter can be obtained directly by summing up all the known baryonic matter. This is highly nontrivial, since although luminous matter such as stars and galaxies are easily summed, baryonic matter can also exist in highly non-luminous form, such as black holes, planets, and highly diffuse interstellar gas. Nonetheless it can still be done, using a range of techniques:

  • Making use of the Lyman-alpha forest: clouds of diffuse baryonic gas are visible when backlit by a bright background source such as a quasar. The resulting absorption spectra can be used to infer the mass of gas between the source and the observer.
  • Gravitational microlensing. If a planet or other dark object moves between the observer and a faraway source, the image of the source is distorted. The mass of the dark object can be inferred based on the amount of distortion.
  • Sunyaev–Zel'dovich effect. The interaction between CMB photons and free electrons leaves an imprint in the CMB. This effect is sensitive to all free electrons independently of their temperature or the density of the surrounding medium, and thus it can be used to study baryonic matter otherwise not hot enough to be detected.
Generated image of the cosmic web which contains warm-hot regions where the missing baryons have been detected.

Prior to 2017, censuses placed roughly 10% of baryons inside galaxies and 50–60% in the circum-galactic medium, leaving 30–40% unaccounted for; in other words, only about 60–70% of the theoretically predicted baryons had been located.

Large scale galaxy surveys in the 2000s revealed a baryon deficit. This led theorists to reexamine their models and predict that gas must flow between galaxies and galaxy clusters.

Warm-Hot intergalactic medium

The Lambda-CDM model of the Big Bang predicts that matter between galaxies in the universe is distributed in web-like formations with a low density (1–10 particles per cubic meter), known as the warm-hot intergalactic medium (WHIM). Cosmological hydrodynamical simulations predict that a fraction of the missing baryons are located in galactic haloes at temperatures of 10^6 K and in the WHIM at temperatures of 10^5–10^7 K, with recent observations providing strong support. The WHIM is composed of three states:

  • A warm state with temperatures of 10^5–10^5.7 K. Neutral hydrogen is present in this state. (Observed via O VI absorption lines.)
  • A hot state with temperatures of 10^5.7–10^6.3 K. (Observed via O VII in soft X-rays.)
  • A very hot state with temperatures of 10^6.3–10^7 K. It contains very little hydrogen or hydrogen-like metals, and is mostly present near the outskirts of galaxy clusters.

The warm phase of the WHIM had been previously detected and composes around 15% of the baryon content. The WHIM is mostly composed of ionized hydrogen, which produces no absorption lines; this makes it difficult for astronomers to detect baryons in the WHIM directly. Instead, it is easier to detect the WHIM through absorption by highly ionized oxygen such as O VI and O VII.
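The three temperature ranges above can be encoded in a small helper function (a toy classifier; the boundaries are taken directly from the list):

```python
def whim_phase(T):
    """Classify a WHIM temperature (kelvin) into the three listed states."""
    if 1e5 <= T < 10**5.7:
        return "warm"       # neutral hydrogen still present; traced by O VI
    elif 10**5.7 <= T < 10**6.3:
        return "hot"        # traced by O VII in soft X-rays
    elif 10**6.3 <= T <= 1e7:
        return "very hot"   # mostly near the outskirts of galaxy clusters
    return "outside WHIM range"

print(whim_phase(3e5))  # warm
print(whim_phase(1e6))  # hot
```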

Universe composition

The distribution of known baryons in the universe.

The census of known baryons in the universe tallied to around 60% of total baryons until the resolution of the missing baryon problem. This is distinct from the composition of the entire universe, which includes dark energy and dark matter, of which baryonic matter composes only 5%. Around 7% of baryons exist in stars and galaxies, while most exist around galaxies or galaxy clusters. The Lyman-alpha forest contains around 28% of the baryons. The warm phase of the WHIM was detected by soft X-ray absorption in 2012, establishing 15% of the total baryon content. The intracluster medium (ICM) accounts for around 4% of the total baryon content. It is composed mostly of ionized hydrogen and makes up about 10% of a galaxy cluster's total mass, the rest being dark matter. The ICM is low density, with around 10^-3 particles per cm^3. The circum-galactic medium (CGM) was confirmed in 2003 by Chandra and XMM-Newton. The CGM is a large sphere surrounding galaxies with a radius greater than 70–200 kpc. The CGM accounts for 5% of the total baryons in the universe.
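Summing the components quoted above reproduces the pre-resolution census of roughly 60% (a simple tally; category names are paraphrased from the text):

```python
# Baryon budget components as percentages of the predicted total
census = {
    "stars and galaxies": 7,
    "Lyman-alpha forest": 28,
    "WHIM (warm phase)": 15,
    "intracluster medium": 4,
    "circum-galactic medium": 5,
}

total = sum(census.values())
print(f"accounted for: {total}% of predicted baryons")  # ~60%, the rest "missing"
```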

Detection methods

There are three main methods of detecting the WHIM where the missing baryons lie: the Sunyaev-Zel'dovich effect, Lyman-alpha emission lines, and metal absorption lines.

Sunyaev-Zel'dovich effect

The thermal Sunyaev-Zel'dovich (tSZ) effect occurs when photons from the CMB inverse-Compton scatter off ionized gas; for detecting baryons, the relevant scattering is off the ionized gas of the WHIM. The y-parameter quantifies the strength of the tSZ effect and is defined as:

y = (k_B σ_T / m_e c²) ∫ n_e T_e dl,

where k_B is Boltzmann's constant, σ_T is the Thomson cross-section, n_e is the electron number density, m_e c² is the electron rest mass energy, and T_e is the electron temperature. Finding the y-parameter and overlaying it with a map of cosmic filaments from millions of galaxies allows astronomers to find the weak signal from the WHIM. The y-parameter signal from a galaxy pair is overlaid on a model for the galaxy halos; subtracting the halo model reveals a residual signal between the two galaxies, which is the filament. To ensure the signal does not come from any other source, astronomers compare against a control simulation and so determine that the source must be the WHIM.
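As an illustration, the y-parameter for a single filament can be estimated from the definition above. The electron density, temperature, and path length used here are assumed order-of-magnitude values typical of WHIM filaments, not figures from the text:

```python
# Physical constants (SI)
sigma_T = 6.652e-29   # Thomson cross-section, m^2
k_B = 1.381e-23       # Boltzmann constant, J/K
m_e_c2 = 8.187e-14    # electron rest mass energy, J

# Assumed filament properties (order-of-magnitude WHIM values)
n_e = 10.0            # electron density, m^-3 (~1e-5 cm^-3)
T_e = 1e7             # electron temperature, K
L = 10 * 3.086e22     # path length through filament: 10 Mpc in meters

# For uniform gas the line-of-sight integral reduces to a product:
# y = (k_B sigma_T / m_e c^2) * n_e * T_e * L
y = (k_B * sigma_T / m_e_c2) * n_e * T_e * L
print(f"y ~ {y:.1e}")
```

The result is of order 10^-7, far below the single-object detection threshold, which is why the signal only emerges after stacking many galaxy pairs.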

Lyman-Alpha emission

The Lyman-alpha (Lyα) emission lines are detected from hydrogen in cosmic filaments. A source, such as a quasar, ionizes the hydrogen in the filament; as the gas recombines, it emits detectable Lyα photons.

Metal absorption lines

Highly ionized oxygen such as O⁺⁶, O⁺⁷, and O⁺⁸ produces absorption lines in the soft X-rays at energies of 0.6–0.8 keV. The column density of these lines can be derived as:

N_ion ≈ f_ion (Ω_b ρ_c / m_H)(c / H_0),

where f_ion is the abundance of the particular oxygen ion, H_0 is Hubble's constant, and ρ_c = 3H_0²/(8πG) is the critical density.
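A rough column-density estimate along a filament sight line can be built up directly from a hydrogen column and abundance factors. All input values here (hydrogen density, path length, metallicity, and O VII ionization fraction) are assumed illustrative numbers, not figures from the text:

```python
# Assumed sight line through a WHIM filament
n_H = 1e-5          # hydrogen density, cm^-3
L = 10 * 3.086e24   # path length: 10 Mpc in cm
O_H_solar = 4.9e-4  # solar oxygen abundance by number (assumed)
Z = 0.1             # WHIM metallicity, fraction of solar (assumed)
f_OVII = 0.5        # fraction of oxygen in the O VII state (assumed)

N_H = n_H * L                          # hydrogen column density, cm^-2
N_OVII = N_H * O_H_solar * Z * f_OVII  # O VII column density, cm^-2
print(f"N_H    ~ {N_H:.1e} cm^-2")
print(f"N_OVII ~ {N_OVII:.1e} cm^-2")
```

Columns of order 10^15–10^16 cm^-2 are what make the O VII absorption lines marginally detectable in soft X-rays.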

Claimed resolution

The missing baryon problem was proclaimed to be solved in 2017 when two groups of scientists who were working independently found evidence for the location of missing baryons in intergalactic matter. The missing baryons had been postulated to exist as hot strands between galaxy pairs (WHIM). Since the strands are diffuse and they are not hot enough to emit x-rays, they are difficult to detect. The groups used the thermal Sunyaev–Zeldovich effect to measure the density of the strands in the local Universe. If baryons are present there, then some amount of energy should be lost when light from the cosmic microwave background scatters off of them. These show up as very dim patches in the CMB. The patches are too dim to see directly, but when overlaid with the visible galaxy distribution, become detectable. The density of the strands comes up to about 30% of the baryonic density, the exact amount needed to solve the problem. Despite their success, these works only set constraints on the distribution of baryons between nearby galaxies, failing to provide a complete picture of cosmic gas in the late universe.

Using observations of the kinematic Sunyaev-Zel'dovich effect spanning more than 90% of the lifetime of the Universe, in 2021 astrophysicists found that approximately 50% of all baryonic matter is outside dark matter haloes, filling the space between galaxies. Together with the amount of baryons inside galaxies and surrounding these, the total amount of baryons in the late time Universe is now compatible with early Universe measurements.

Current state

Currently, many groups have observed the intergalactic medium and circum-galactic medium to obtain more measurements and observations of baryons to support the leading observations. Baryons have more or less been found, so groups are working to detect them to a higher level of significance. Methods used have included soft X-ray, OVI, OVII, and OVIII absorption.

In 2020 astrophysicists reported the first direct X-ray emissions measurement of baryonic matter of cosmic web filaments, strengthening empirical support for the recent solution to the problem.  The missing baryon problem has been resolved but research groups are working to detect the WHIM using varying methods to confirm results.

Baryon asymmetry

From Wikipedia, the free encyclopedia

In physical cosmology, the baryon asymmetry problem, also known as the matter asymmetry problem or the matter–antimatter asymmetry problem, is the observed imbalance in baryonic matter (the type of matter experienced in everyday life) and antibaryonic matter in the observable universe. Neither the standard model of particle physics, nor the theory of general relativity provides a known explanation for why this should be so, and it is a natural assumption that the universe is neutral with all conserved charges. The Big Bang should have produced equal amounts of matter and antimatter. Since this does not seem to have been the case, it is likely some physical laws must have acted differently or did not exist for matter and antimatter. Several competing hypotheses exist to explain the imbalance of matter and antimatter that resulted in baryogenesis. However, there is as yet no consensus theory to explain the phenomenon, which has been described as "one of the great mysteries in physics".

Sakharov conditions

In 1967, Andrei Sakharov proposed a set of three necessary conditions that a baryon-generating interaction must satisfy to produce matter and antimatter at different rates. These conditions were inspired by the recent discoveries of the cosmic background radiation and CP violation in the neutral kaon system. The three necessary "Sakharov conditions" are:

  • baryon number violation;
  • C-symmetry and CP-symmetry violation;
  • interactions out of thermal equilibrium.

Baryon number violation

Baryon number violation is a necessary condition to produce an excess of baryons over anti-baryons. But C-symmetry violation is also needed so that the interactions which produce more baryons than anti-baryons will not be counterbalanced by interactions which produce more anti-baryons than baryons. CP-symmetry violation is similarly required because otherwise equal numbers of left-handed baryons and right-handed anti-baryons would be produced, as well as equal numbers of left-handed anti-baryons and right-handed baryons. Finally, the interactions must be out of thermal equilibrium, since otherwise CPT symmetry would assure compensation between processes increasing and decreasing the baryon number.

Currently, there is no experimental evidence of particle interactions where the conservation of baryon number is broken perturbatively: this would appear to suggest that all observed particle reactions have equal baryon number before and after. Mathematically, the commutator of the baryon number quantum operator with the (perturbative) Standard Model Hamiltonian is zero: [B, H] = 0. However, the Standard Model is known to violate the conservation of baryon number non-perturbatively: a global U(1) anomaly. To account for baryon violation in baryogenesis, such events (including proton decay) can occur in Grand Unified Theories (GUTs) and supersymmetric (SUSY) models via hypothetical massive bosons such as the X boson.

CP-symmetry violation

The second condition for generating baryon asymmetry—violation of charge-parity symmetry—is that a process is able to happen at a different rate to its antimatter counterpart. In the Standard Model, CP violation appears as a complex phase in the quark mixing matrix of the weak interaction. There may also be a non-zero CP-violating phase in the neutrino mixing matrix, but this is currently unmeasured. The first in a series of basic physics principles to be violated was parity through Chien-Shiung Wu's experiment. This led to CP violation being verified in the 1964 Fitch–Cronin experiment with neutral kaons, which resulted in the 1980 Nobel Prize in physics (direct CP violation, that is violation of CP symmetry in a decay process, was discovered later, in 1999). Due to CPT symmetry, violation of CP symmetry demands violation of time inversion symmetry, or T-symmetry. Despite the allowance for CP violation in the Standard Model, it is insufficient to account for the observed baryon asymmetry of the universe given the limits on baryon number violation, meaning that beyond-Standard Model sources are needed.

A possible new source of CP violation was found at the Large Hadron Collider (LHC) by the LHCb collaboration during the first three years of LHC operations (beginning March 2010). The experiment analyzed the decays of two particles, the bottom lambda baryon (Λb0) and its antiparticle, and compared the distributions of decay products. The data showed an asymmetry of up to 20% of CP-violation sensitive quantities, implying a breaking of CP-symmetry. This analysis will need to be confirmed by more data from subsequent runs of the LHC.

Interactions out of thermal equilibrium

In the out-of-equilibrium decay scenario, the last condition states that the rate of a reaction which generates baryon-asymmetry must be less than the rate of expansion of the universe. In this situation the particles and their corresponding antiparticles do not achieve thermal equilibrium due to rapid expansion decreasing the occurrence of pair-annihilation.

Other explanations

Regions of the universe where antimatter dominates

Another possible explanation of the apparent baryon asymmetry is that matter and antimatter are essentially separated into different, widely distant regions of the universe. The formation of antimatter galaxies was originally thought to explain the baryon asymmetry, as from a distance, antimatter atoms are indistinguishable from matter atoms; both produce light (photons) in the same way. Along the boundary between matter and antimatter regions, however, annihilation (and the subsequent production of gamma radiation) would be detectable, depending on its distance and the density of matter and antimatter. Such boundaries, if they exist, would likely lie in deep intergalactic space. The density of matter in intergalactic space is reasonably well established at about one atom per cubic meter. Assuming this is a typical density near a boundary, the gamma ray luminosity of the boundary interaction zone can be calculated. No such zones have been detected, but 30 years of research have placed bounds on how far they might be. On the basis of such analyses, it is now deemed unlikely that any region within the observable universe is dominated by antimatter.

One attempt to explain the lack of observable interfaces between matter and antimatter dominated regions is that they are separated by a Leidenfrost layer of very hot matter created by the energy released from annihilation. This is similar to the manner in which water may be separated from a hot plate by a layer of evaporated vapor, delaying the evaporation of more water.

Electric dipole moment

The presence of an electric dipole moment (EDM) in any fundamental particle would violate both parity (P) and time (T) symmetries. As such, an EDM would allow matter and antimatter to decay at different rates leading to a possible matter–antimatter asymmetry as observed today. Many experiments are currently being conducted to measure the EDM of various physical particles. All measurements are currently consistent with no dipole moment. However, the results do place rigorous constraints on the amount of symmetry violation that a physical model can permit. The most recent EDM limit, published in 2014, was that of the ACME Collaboration, which measured the EDM of the electron using a pulsed beam of thorium monoxide (ThO) molecules.

Mirror anti-universe

The Big Bang generated a universe–antiuniverse pair, our universe flows forward in time, while our mirror counterpart flows backward.

The state of the universe, as it is, does not violate the CPT symmetry, because the Big Bang could be considered as a double-sided event, both classically and quantum mechanically, consisting of a universe-antiuniverse pair. This means that this universe is the charge (C), parity (P) and time (T) image of the anti-universe. This pair emerged from the Big Bang epoch not directly into a hot, radiation-dominated era. The antiuniverse would flow back in time from the Big Bang, becoming bigger as it does so, and would be also dominated by antimatter. Its spatial properties are inverted compared to those in our universe, a situation analogous to creating electron-positron pairs in a vacuum. This model, devised by physicists from the Perimeter Institute for Theoretical Physics in Canada, proposes that temperature fluctuations in the cosmic microwave background (CMB) are due to the quantum-mechanical nature of space-time near the Big Bang singularity. This means that a point in the future of our universe and a point in the distant past of the antiuniverse would provide fixed classical points, while all possible quantum-based permutations would exist in between. Quantum uncertainty causes the universe and antiuniverse to not be exact mirror images of each other.

This model has not shown if it can reproduce certain observations regarding the inflation scenario, such as explaining the uniformity of the cosmos on large scales. However, it provides a natural and straightforward explanation for dark matter. Such a universe-antiuniverse pair would produce large numbers of superheavy neutrinos, also known as sterile neutrinos. These neutrinos might also be the source of recently observed bursts of high-energy cosmic rays.

Baryon asymmetry parameter

The challenges to the physics theories are then to explain how to produce the predominance of matter over antimatter, and also the magnitude of this asymmetry. An important quantifier is the asymmetry parameter,

η = (n_B − n_B̄) / n_γ.

This quantity relates the overall number density difference between baryons and antibaryons (n_B and n_B̄, respectively) to the number density of cosmic background radiation photons, n_γ.
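Plugging in round numbers gives the observed value η ≈ 6×10^-10. The photon density is the ~411 per cm³ quoted later in the text; the net baryon density is an assumed estimate of roughly 0.25 per m³:

```python
n_gamma = 411.0  # CMB photons per cm^3 (value quoted later in the text)
n_B = 2.5e-7     # net baryons per cm^3 (~0.25 per m^3, an assumed estimate)

# The antibaryon density today is negligible, so n_B - n_Bbar ~ n_B
eta = n_B / n_gamma
print(f"eta ~ {eta:.1e}")
```

The tiny result, roughly one extra baryon per billion photons, is the asymmetry that baryogenesis models must explain.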

According to the Big Bang model, matter decoupled from the cosmic background radiation (CBR) at a temperature of roughly 3000 kelvin, corresponding to an average kinetic energy of 3000 K / (10.08×10^3 K/eV) = 0.3 eV. After the decoupling, the total number of CBR photons remains constant. Therefore, due to space-time expansion, the photon density decreases. The photon density at equilibrium temperature T, per cubic centimeter, is given by

n_γ = (2ζ(3)/π²) (k_B T / ħc)³,

with k_B as the Boltzmann constant, ħ as the Planck constant divided by 2π, c as the speed of light in vacuum, and ζ(3) as Apéry's constant. At the current CBR photon temperature of 2.725 K, this corresponds to a photon density n_γ of around 411 CBR photons per cubic centimeter.
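The photon-density formula can be evaluated directly at T = 2.725 K to recover the quoted value:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
hbar = 1.054572e-34  # reduced Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
zeta3 = 1.2020569    # Apery's constant, zeta(3)
T = 2.725            # current CBR temperature, K

# n_gamma = (2 zeta(3) / pi^2) * (k_B T / (hbar c))^3
n_gamma = (2 * zeta3 / math.pi**2) * (k_B * T / (hbar * c))**3  # per m^3
print(f"n_gamma ~ {n_gamma / 1e6:.1f} photons per cm^3")  # close to 411
```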

Therefore, the asymmetry parameter η, as defined above, is not the "good" parameter. Instead, the preferred asymmetry parameter uses the entropy density s,

η_s = (n_B − n_B̄) / s,

because the entropy density of the universe remained reasonably constant throughout most of its evolution. The entropy density is

s = S/V = (p + ρ)/T = (2π²/45) g_*(T) T³,

with p and ρ as the pressure and density from the energy density tensor T^{μν}, and g_* as the effective number of degrees of freedom for "massless" particles (inasmuch as mc² ≪ k_B T holds) at temperature T,

g_*(T) = Σ_bosons g_i (T_i/T)³ + (7/8) Σ_fermions g_j (T_j/T)³,

for bosons and fermions with g_i and g_j degrees of freedom at temperatures T_i and T_j respectively. Presently, s = 7.04 n_γ.
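The present-day relation s = 7.04 n_γ follows from the g_* formula with photons (2 bosonic degrees of freedom) and three neutrino species (6 fermionic degrees of freedom at T_ν = (4/11)^(1/3) T_γ). This is a standard cosmology result, sketched here as a check:

```python
import math

zeta3 = 1.2020569  # Apery's constant, zeta(3)

# Effective entropy degrees of freedom today:
# photons contribute 2; neutrinos contribute (7/8) * 6 * (T_nu/T_gamma)^3,
# where (T_nu/T_gamma)^3 = 4/11 after electron-positron annihilation.
g_star_s = 2 + (7 / 8) * 6 * (4 / 11)  # = 43/11

# In natural units, s = (2 pi^2 / 45) g_* T^3 and n_gamma = (2 zeta(3)/pi^2) T^3,
# so their ratio is pi^4 / (45 zeta(3)) * g_*
ratio = (math.pi**4 / (45 * zeta3)) * g_star_s
print(f"g*_s = {g_star_s:.3f}")      # 43/11 ~ 3.909
print(f"s / n_gamma = {ratio:.2f}")  # ~7.04
```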

Laterality

From Wikipedia, the free encyclopedia

The term laterality refers to the preference most humans show for one side of their body over the other. Examples include left-handedness/right-handedness and left/right-footedness; it may also refer to the primary use of the left or right hemisphere in the brain. It may also apply to animals or plants. The majority of tests have been conducted on humans, specifically to determine the effects on language.

Human

The majority of humans are right-handed. Many are also right-sided in general (that is, they prefer to use their right eye, right foot and right ear if forced to make a choice between the two). The reasons for this are not fully understood, but it is thought that because the left cerebral hemisphere of the brain controls the right side of the body, the right side is generally stronger; it is suggested that the left cerebral hemisphere is dominant over the right in most humans because in 90-92% of all humans, the left hemisphere is the language hemisphere.

Human cultures are predominantly right-handed, and so the right-sided trend may be socially as well as biologically enforced. This is quite apparent from a quick survey of languages. The English word "left" comes from the Anglo-Saxon word lyft which means "weak" or "useless". Similarly, the French word for left, gauche, is also used to mean "awkward" or "tactless", and sinistra, the Latin word from which the English word "sinister" was derived, means "left". Similarly, in many cultures the word for "right" also means "correct". The English word "right" comes from the Anglo-Saxon word riht which also means "straight" or "correct."

This linguistic and social bias is not restricted to European cultures: for example, Chinese characters are designed for right-handers to write, and no significant left-handed culture has ever been found in the world.

When a person is forced to use the hand opposite of the hand that they would naturally use, this is known as forced laterality, or more specifically forced dextrality. A study done by the Department of Neurology at Keele University, North Staffordshire Royal Infirmary suggests that forced dextrality may be part of the reason that the percentage of left-handed people decreases with the higher age groups, both because the effects of pressures toward right-handedness are cumulative over time (hence increasing with age for any given person subjected to them) and because the prevalence of such pressure is decreasing, such that fewer members of younger generations face any such pressure to begin with.

Ambidexterity is when a person has approximately equal skill with both hands and/or both sides of the body. True ambidexterity is very rare. Although a small number of people can write competently with both hands and use both sides of their body well, even these people usually show preference for one side of their body over the other. However, this preference is not necessarily consistent for all activities. Some people may, for instance, use their right hand for writing, and their left hand for playing racket sports and eating.

Also, it is not uncommon that people preferring to use the right hand prefer to use the left leg, e.g. when using a shovel, kicking a ball, or operating control pedals. In many cases, this may be because they are disposed toward left-handedness but have been trained for right-handedness, a pattern usually referred to as "cross-dominance". In the sport of cricket, some players may find that they are more comfortable bowling with their left or right hand, but batting with the other hand.

Approximate statistics are below:

Laterality of motor and sensory control has been the subject of recent intense study and review. It turns out that the hemisphere of speech is the hemisphere of action in general and that the command hemisphere is located either in the right or the left hemisphere (never in both). Around 80% of people are left hemispheric for speech and the remainder are right hemispheric: ninety percent of right-handers are left hemispheric for speech, but only 50% of left-handers are right hemispheric for speech (the remainder are left hemispheric). The reaction time of the neurally dominant side of the body (the side opposite to the major hemisphere or the command center, as just defined) is shorter than that of the opposite side by an interval equal to the interhemispheric transfer time. Thus, one in five persons has a handedness that is the opposite of what they are wired for (per laterality of command center or brainedness, as determined by the reaction time study mentioned above).

Different expressions

  • Board footedness: The stance in a boardsport is not necessarily the same as the normal-footedness of the person. In skateboarding and other board sports, a “goofy footed” stance is one with the right foot leading. A stance with the left foot forward is called “regular” or “normal” stance.
  • Jump and spin: Direction of rotation in figure skating jumps and spins is not necessarily the same as the footedness or the handedness of each person. A skater can jump and spin counter-clockwise (the most common direction), yet be left-footed and left-handed.
  • Ocular dominance: The eye preferred when binocular vision is not possible, as through a keyhole or monocular microscope.

Speech

Cerebral dominance or specialization has been studied in relation to a variety of human functions. With speech in particular, many studies have been used as evidence that it is generally localized in the left hemisphere. Research comparing the effects of lesions in the two hemispheres, split-brain patients, and perceptual asymmetries have aided in the knowledge of speech lateralization. In one particular study, the left hemisphere's sensitivity to differences in rapidly changing sound cues was noted (Annett, 1991). This has real world implication, since very fine acoustic discriminations are needed to comprehend and produce speech signals. In an electrical stimulation demonstration performed by Ojemann and Mateer (1979), the exposed cortex was mapped revealing the same cortical sites were activated in phoneme discrimination and mouth movement sequences (Annett, 1991).

As suggested by Kimura (1975, 1982), left hemisphere speech lateralization might be based upon a preference for movement sequences as demonstrated by American Sign Language (ASL) studies. Since ASL requires intricate hand movements for language communication, it was proposed that skilled hand motions and speech require sequences of action over time. In deaf patients suffering from a left hemispheric stroke and damage, noticeable losses in their abilities to sign were noted. These cases were compared to studies of normal speakers with dysphasias located at lesioned areas similar to the deaf patients. In the same study, deaf patients with right hemispheric lesions did not display any significant loss of signing nor any decreased capacity for motor sequencing (Annett, 1991).

According to one theory, known as the acoustic laterality theory, the physical properties of certain speech sounds are what determine laterality to the left hemisphere. Stop consonants, for example t, p, or k, leave a defined silent period at the end of words that can easily be distinguished. This theory postulates that changing sounds such as these are preferentially processed by the left hemisphere. Because the right ear is responsible for transmission of sounds to the left hemisphere, it is capable of perceiving these sounds with rapid changes. This right ear advantage in hearing and speech laterality was evidenced in dichotic listening studies. Magnetic imaging results from one such study showed greater left hemisphere activation when actual words were presented as opposed to pseudowords (Shtyrov, Pihko, and Pulvermuller, 2005). Two important aspects of speech recognition are phonetic cues, such as formant patterning, and prosody cues, such as intonation, accent, and emotional state of the speaker (Imaizumi, Koichi, Kiritani, Hosoi & Tonoike, 1998).

In a study done with both monolinguals and bilinguals, which took into account language experience, second language proficiency, and onset of bilingualism among other variables, researchers were able to demonstrate left hemispheric dominance. In addition, bilinguals that began speaking a second language early in life demonstrated bilateral hemispheric involvement. The findings of this study were able to predict differing patterns of cerebral language lateralization in adulthood (Hull & Vaid, 2006).

In other animals

It has been shown that cerebral lateralization is a widespread phenomenon in the animal kingdom. Functional and structural differences between left and right brain hemispheres can be found in many other vertebrates and also in invertebrates.

It has been proposed that negative, withdrawal-associated emotions are processed predominantly by the right hemisphere, whereas the left hemisphere is largely responsible for processing positive, approach-related emotions. This has been called the "laterality-valence hypothesis".

One sub-set of laterality in animals is limb dominance. Preferential limb use for specific tasks has been shown in species including chimpanzees, mice, bats, wallabies, parrots, chickens and toads.

Another form of laterality is hemispheric dominance for processing conspecific vocalizations, reported for chimpanzees, sea lions, dogs, zebra finches and Bengalese finches.

In mice

In mice (Mus musculus), laterality in paw usage has been shown to be a learned behavior (rather than inherited), due to which, in any population, half of the mice become left-handed while the other half become right-handed. The learning occurs by a gradual reinforcement of randomly occurring weak asymmetries in paw choice early in training, even when training in an unbiased world. Meanwhile, reinforcement relies on short-term and long-term memory skills that are strain-dependent, causing strains to differ in the degree of laterality of their individuals. Long-term memory of previously gained laterality in handedness due to training is heavily diminished in mice with an absent corpus callosum and reduced hippocampal commissure. Regardless of the amount of past training and consequent biasing of paw choice, there is a degree of randomness in paw choice that is not removed by training, which may provide adaptability to changing environments.
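The learning mechanism described above, reinforcement of small random initial asymmetries, can be illustrated with a toy simulation. The urn-style update rule and all parameters are assumptions for illustration, not the model used in the studies:

```python
import random

def train_mouse(rng, trials=200):
    """Toy urn model: both paws start with equal weight; each chosen paw
    is reinforced, so early random streaks get amplified over training."""
    left, right = 1.0, 1.0
    for _ in range(trials):
        if rng.random() < left / (left + right):
            left += 1.0   # left paw chosen and reinforced
        else:
            right += 1.0  # right paw chosen and reinforced
    return "L" if left > right else "R"

rng = random.Random(42)
population = [train_mouse(rng) for _ in range(1000)]
frac_left = population.count("L") / len(population)
print(f"left-handed fraction: {frac_left:.2f}")  # close to 0.5 by symmetry
```

Each simulated mouse ends up strongly biased toward one paw, yet the population as a whole splits roughly half and half, mirroring the observed outcome of learned, unbiased laterality.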

In other mammals

Domestic horses (Equus caballus) exhibit laterality in at least two areas of neural organization, i.e. sensory and motor. In thoroughbreds, the strength of motor laterality increases with age. Horses under 4 years old have a preference to initially use the right nostril during olfaction. Along with olfaction, French horses show an eye laterality when looking at novel objects. There is a correlation between their score on an emotional index and eye preference; horses with higher emotionality are more likely to look with their left eye. The less emotive French saddlebreds glance at novel objects using the right eye; however, this tendency is absent in trotters, although the emotive index is the same for both breeds. Racehorses exhibit laterality in stride patterns as well. They use their preferred stride pattern at all times, whether racing or not, unless forced to change it while turning, injured, or fatigued.

In domestic dogs (Canis familiaris), there is a correlation between motor laterality and noise sensitivity: a lack of paw preference is associated with noise-related fearfulness (Branson and Rogers, 2006). Fearfulness is an undesirable trait in guide dogs; therefore, testing for laterality can be a useful predictor of a successful guide dog. Knowing a guide dog's laterality can also be useful for training, because the dog may be better at walking to the left or the right of its blind owner.

Domestic cats (Felis catus) show individual handedness when reaching for static food. In one study, 46% preferred to use the right paw, 44% the left, and 10% were ambi-lateral; 60% used one paw 100% of the time. There was no difference between male and female cats in the proportions of left and right paw preferences. In moving-target reaching tests, cats show a left-sided behavioural asymmetry. One study indicates that laterality in this species is strongly related to temperament; individuals with stronger paw preferences are rated as more confident, affectionate, active, and friendly.

Chimpanzees show right-handedness in certain conditions. This is expressed at the population level for females, but not males. The complexity of the task has a dominant effect on handedness in chimps.

Cattle use visual/brain lateralisation in their visual scanning of novel and familiar stimuli. Domestic cattle prefer to view novel stimuli with the left eye (similar to horses, Australian magpies, chicks, toads and fish) but use the right eye for viewing familiar stimuli.

Schreibers' long-fingered bat is lateralized at the population level and shows a left-hand bias for climbing or grasping.

In some types of mastodon, differing tusk lengths in fossil remains indicate laterality.

In marsupials

Marsupials are fundamentally different from other mammals in that they lack a corpus callosum. However, wild kangaroos and other macropod marsupials have a left-hand preference for everyday tasks. Left-handedness is particularly apparent in the red kangaroo (Macropus rufus) and the eastern gray kangaroo (Macropus giganteus). The red-necked wallaby (Macropus rufogriseus) preferentially uses the left hand for behaviours that involve fine manipulation, but the right for behaviours that require more physical strength. There is less evidence for handedness in arboreal species.

In birds

Parrots tend to favor one foot when grasping objects (for example, fruit when feeding). Some studies indicate that most parrots are left-footed.

The Australian magpie (Gymnorhina tibicen) uses both left-eye and right-eye laterality when performing anti-predator responses, which include mobbing. Prior to withdrawing from a potential predator, Australian magpies view the animal with the left eye (85%), but prior to approaching, the right eye is used (72%). The left eye is used prior to jumping (73%) and prior to circling (65%) the predator, as well as during circling (58%) and for high alert inspection of the predator (72%). The researchers commented that "mobbing and perhaps circling are agonistic responses controlled by the LE[left eye]/right hemisphere, as also seen in other species. Alert inspection involves detailed examination of the predator and likely high levels of fear, known to be right hemisphere function."

Yellow-legged gull (Larus michahellis) chicks show laterality when reverting from a supine to prone posture, and also in pecking at a dummy parental bill to beg for food. Lateralization occurs at both the population and individual level in the reverting response and at the individual level in begging. Females have a leftward preference in the righting response, indicating this is sex dependent. Laterality in the begging response in chicks varies according to laying order and matches variation in egg androgens concentration. 

In fish

Laterality determines the organisation of rainbowfish (Melanotaenia spp.) schools. These fish demonstrate an individual eye preference when examining their reflection in a mirror. Fish that show a right-eye preference in the mirror test prefer to be on the left side of the school. Conversely, fish that show a left-eye preference in the mirror test, or are non-lateralised, prefer to be slightly to the right side of the school. This behaviour varies with the species and sex of the school.

In amphibians

Three species of toad, the common toad (Bufo bufo), the green toad (Bufo viridis) and the cane toad (Bufo marinus), show stronger escape and defensive responses when a model predator is placed on their left side than on their right. Emei music frogs (Babina daunchina) have a right-ear preference for positive or neutral signals, such as a conspecific's advertisement call and white noise, but a left-ear preference for negative signals such as a predatory attack.

In invertebrates

The Mediterranean fruit fly (Ceratitis capitata) exhibits left-biased population-level lateralisation of aggressive displays (boxing with forelegs and wing strikes), with no sex differences. In ants, scouts of Temnothorax albipennis (rock ant) show behavioural lateralization when exploring unknown nest sites, with a population-level bias to prefer left turns. One possible reason for this is that the ant's environment is partly maze-like, and consistently turning in one direction is a good way to search and exit mazes without getting lost. This turning bias is correlated with slight asymmetries in the ants' compound eyes (differential ommatidia count).

Sequence homology

From Wikipedia, the free encyclopedia
 
Gene phylogeny as red and blue branches within grey species phylogeny. Top: An ancestral gene duplication produces two paralogs (histone H1.1 and 1.2). A speciation event produces orthologs in the two daughter species (human and chimpanzee). Bottom: in a separate species (E. coli), a gene has a similar function (histone-like nucleoid-structuring protein) but has a separate evolutionary origin and so is an analog.

Sequence homology is the biological homology between DNA, RNA, or protein sequences, defined in terms of shared ancestry in the evolutionary history of life. Two segments of DNA can have shared ancestry because of three phenomena: either a speciation event (orthologs), or a duplication event (paralogs), or else a horizontal (or lateral) gene transfer event (xenologs).

Homology among DNA, RNA, or proteins is typically inferred from their nucleotide or amino acid sequence similarity. Significant similarity is strong evidence that two sequences are related by evolutionary changes from a common ancestral sequence. Alignments of multiple sequences are used to indicate which regions of each sequence are homologous.
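To make the alignment step concrete, the following is a minimal sketch of global pairwise alignment (the Needleman–Wunsch algorithm), which is the basic procedure underlying the multiple-sequence alignments described above. The scoring values (match +1, mismatch −1, gap −1) and the example sequences are illustrative assumptions, not taken from any specific tool.

```python
# Minimal Needleman-Wunsch global alignment sketch.
# Scoring (match +1, mismatch -1, gap -1) is illustrative only.

def global_align(a: str, b: str, match=1, mismatch=-1, gap=-1):
    n, m = len(a), len(b)
    # DP table: score[i][j] = best score aligning a[:i] with b[:j].
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    # Traceback to recover one optimal alignment.
    out_a, out_b = [], []
    i, j = n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (
                match if a[i - 1] == b[j - 1] else mismatch):
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append("-"); i -= 1
        else:
            out_a.append("-"); out_b.append(b[j - 1]); j -= 1
    return "".join(reversed(out_a)), "".join(reversed(out_b)), score[n][m]

aligned_a, aligned_b, s = global_align("GATTACA", "GCATGCU")
```

In practice, real alignment tools use substitution matrices (e.g. BLOSUM for proteins) and affine gap penalties rather than the flat scores above, but the dynamic-programming structure is the same.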

Identity, similarity, and conservation

A sequence alignment of mammalian histone proteins. Sequences are the middle 120-180 amino acid residues of the proteins. Residues that are conserved across all sequences are highlighted in grey. The key below denotes conserved sequence (*), conservative mutations (:), semi-conservative mutations (.), and non-conservative mutations ( ).

The term "percent homology" is often used to mean "sequence similarity", that is, the percentage of identical residues (percent identity) or the percentage of residues conserved with similar physicochemical properties (percent similarity), e.g. leucine and isoleucine. Based on the definition of homology given above, this terminology is incorrect: sequence similarity is the observation, while homology is the conclusion drawn from it. Sequences are either homologous or not.[3] The term "percent homology" is therefore a misnomer.
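The distinction between percent identity and percent similarity can be illustrated with a small sketch. The aligned sequences and the similarity groups below are simplified assumptions for illustration; real tools score similarity with substitution matrices such as BLOSUM62.

```python
# Sketch: percent identity vs. percent similarity for two pre-aligned
# sequences. Similarity groups are a simplified illustration, not a
# standard substitution matrix.

SIMILAR_GROUPS = [
    set("ILVM"),  # aliphatic/hydrophobic (e.g. leucine vs. isoleucine)
    set("FYW"),   # aromatic
    set("KRH"),   # basic
    set("DE"),    # acidic
    set("ST"),    # hydroxyl-bearing
]

def percent_identity(a: str, b: str) -> float:
    """Percentage of aligned positions with identical residues."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
    return 100.0 * matches / len(a)

def percent_similarity(a: str, b: str) -> float:
    """Percentage of positions identical or in the same property group."""
    assert len(a) == len(b)
    def similar(x, y):
        if x == y and x != "-":
            return True
        return any(x in g and y in g for g in SIMILAR_GROUPS)
    return 100.0 * sum(1 for x, y in zip(a, b) if similar(x, y)) / len(a)

# Hypothetical aligned fragments ("-" marks a gap).
seq1 = "MKTLIVA-GL"
seq2 = "MKSLLVAAGL"
```

Here the T/S and I/L positions count toward similarity but not identity, so the similarity percentage exceeds the identity percentage. Either number is an observation about the sequences; whether the sequences are homologous remains a separate inference.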

As with morphological and anatomical structures, sequence similarity might occur because of convergent evolution, or, as with shorter sequences, by chance, meaning that they are not homologous. Homologous sequence regions are also called conserved. This is not to be confused with conservation in amino acid sequences, where the amino acid at a specific position has been substituted with a different one that has functionally equivalent physicochemical properties.

Partial homology can occur where a segment of the compared sequences has a shared origin, while the rest does not. Such partial homology may result from a gene fusion event.

Orthology

Top: An ancestral gene duplicates to produce two paralogs (Genes A and B). A speciation event produces orthologs in the two daughter species. Bottom: in a separate species, an unrelated gene has a similar function (Gene C) but has a separate evolutionary origin and so is an analog.

Homologous sequences are orthologous if they are inferred to be descended from the same ancestral sequence separated by a speciation event: when a species diverges into two separate species, the copies of a single gene in the two resulting species are said to be orthologous. Orthologs, or orthologous genes, are genes in different species that originated by vertical descent from a single gene of the last common ancestor. The term "ortholog" was coined in 1970 by the molecular evolutionist Walter Fitch.

For instance, the plant Flu regulatory protein is present both in Arabidopsis (multicellular higher plant) and Chlamydomonas (single cell green algae). The Chlamydomonas version is more complex: it crosses the membrane twice rather than once, contains additional domains and undergoes alternative splicing. However it can fully substitute the much simpler Arabidopsis protein, if transferred from algae to plant genome by means of genetic engineering. Significant sequence similarity and shared functional domains indicate that these two genes are orthologous genes, inherited from the shared ancestor.

Orthology is strictly defined in terms of ancestry. Given that the exact ancestry of genes in different organisms is difficult to ascertain due to gene duplication and genome rearrangement events, the strongest evidence that two similar genes are orthologous is usually found by carrying out phylogenetic analysis of the gene lineage. Orthologs often, but not always, have the same function.

Orthologous sequences provide useful information in taxonomic classification and phylogenetic studies of organisms. The pattern of genetic divergence can be used to trace the relatedness of organisms. Two organisms that are very closely related are likely to display very similar DNA sequences between two orthologs. Conversely, an organism that is further removed evolutionarily from another organism is likely to display a greater divergence in the sequence of the orthologs being studied.
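A crude way to quantify the divergence described above is the uncorrected p-distance: the proportion of sites that differ between two aligned orthologs. The sketch below uses short, invented sequences purely for illustration; real phylogenetic studies use corrected distance models (e.g. Jukes–Cantor) or likelihood methods.

```python
# Sketch: uncorrected p-distance between aligned orthologous sequences
# as a rough proxy for evolutionary divergence (illustrative data only).

def p_distance(a: str, b: str) -> float:
    """Proportion of aligned sites at which the two sequences differ."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Hypothetical orthologs of a short gene fragment in three species.
human = "ATGGCGTACG"
chimp = "ATGGCGTACA"  # one substitution relative to human
mouse = "ATGACGCACA"  # more diverged

# Closely related species show a smaller distance between orthologs.
assert p_distance(human, chimp) < p_distance(human, mouse)
```

Consistent with the text, the closely related pair (human–chimp in this toy example) shows lower divergence between orthologs than the more distant pair.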

Databases of orthologous genes

Given their tremendous importance for biology and bioinformatics, orthologous genes have been organized in several specialized databases that provide tools to identify and analyze orthologous gene sequences. These resources employ approaches that can be generally classified into those that use heuristic analysis of all pairwise sequence comparisons, and those that use phylogenetic methods. Sequence comparison methods were first pioneered in the COGs database in 1997. These methods have been extended and automated in the following databases:

  • AYbRAH: Analyzing Yeasts by Reconstructing Ancestry of Homologs
  • eggNOG
  • GreenPhylDB for plants
  • InParanoid focuses on pairwise ortholog relationships
  • OHNOLOGS is a repository of the genes retained from whole genome duplications in the vertebrate genomes including human and mouse.
  • OMA
  • OrthoDB appreciates that the orthology concept is relative to different speciation points by providing a hierarchy of orthologs along the species tree.
  • OrthoInspector is a repository of orthologous genes for 4753 organisms covering the three domains of life
  • OrthologID
  • OrthoMaM for mammals
  • OrthoMCL
  • Roundup

Tree-based phylogenetic approaches aim to distinguish speciation from gene duplication events by comparing gene trees with species trees, and are implemented in a number of databases and software tools.

A third category of hybrid approaches uses both heuristic and phylogenetic methods to construct clusters and determine trees, for example:

  • EnsemblCompara GeneTrees
  • HomoloGene
  • Ortholuge

Paralogy

Paralogous genes are genes that are related via duplication events in the last common ancestor (LCA) of the species being compared. They result from the mutation of duplicated genes during separate speciation events. When descendants of the LCA share mutated homologs of the original duplicated genes, those genes are considered paralogs.

As an example, in the LCA one gene (gene A) may be duplicated to make a separate, similar gene (gene B); these two genes will continue to be passed on to subsequent generations. During one speciation event, one environment favors a mutation in gene A (yielding gene A1), producing a new species with genes A1 and B. In a separate speciation event, another environment favors a mutation in gene B (yielding gene B1), giving rise to a new species with genes A and B1. The descendants' genes A1 and B1 are paralogous to each other because they are homologs related via a duplication event in the last common ancestor of the two species.

Additional classifications of paralogs include alloparalogs (out-paralogs) and symparalogs (in-paralogs). Alloparalogs are paralogs that evolved from gene duplications that preceded the given speciation event; in other words, they evolved from duplication events that happened in the LCA of the organisms being compared. The example above is an example of alloparalogy. Symparalogs are paralogs that evolved from gene duplication of paralogous genes in subsequent speciation events. Continuing the example above, if the descendant with genes A1 and B underwent another speciation event in which gene A1 duplicated, the new species would have genes B, A1a, and A1b. In this example, genes A1a and A1b are symparalogs.

Vertebrate Hox genes are organized in sets of paralogs. Each Hox cluster (HoxA, HoxB, etc.) is on a different chromosome. For instance, the human HoxA cluster is on chromosome 7. The mouse HoxA cluster shown here has 11 paralogous genes (2 are missing).

Paralogous genes can shape the structure of whole genomes and thus explain genome evolution to a large extent. Examples include the Homeobox (Hox) genes in animals. These genes not only underwent gene duplications within chromosomes but also whole genome duplications. As a result, Hox genes in most vertebrates are clustered across multiple chromosomes with the HoxA-D clusters being the best studied.

Another example is the globin genes, which encode myoglobin and hemoglobin and are considered ancient paralogs. Similarly, the four known classes of hemoglobins (hemoglobin A, hemoglobin A2, hemoglobin B, and hemoglobin F) are paralogs of each other. While each of these proteins serves the same basic function of oxygen transport, they have already diverged slightly in function: fetal hemoglobin (hemoglobin F) has a higher affinity for oxygen than adult hemoglobin. Function is not always conserved, however. Human angiogenin diverged from ribonuclease, for example, and while the two paralogs remain similar in tertiary structure, their functions within the cell are now quite different.

It is often asserted that orthologs are more functionally similar than paralogs of similar divergence, but several papers have challenged this notion.

Regulation

Paralogs are often regulated differently, e.g. by having different tissue-specific expression patterns (see Hox genes). However, they can also be regulated differently on the protein level. For instance, Bacillus subtilis encodes two paralogues of glutamate dehydrogenase: GudB is constitutively transcribed whereas RocG is tightly regulated. In their active, oligomeric states, both enzymes show similar enzymatic rates. However, swaps of enzymes and promoters cause severe fitness losses, thus indicating promoter–enzyme coevolution. Characterization of the proteins shows that, compared to RocG, GudB's enzymatic activity is highly dependent on glutamate and pH.

Paralogous chromosomal regions

Sometimes, large regions of chromosomes share gene content similar to other chromosomal regions within the same genome. They are well characterised in the human genome, where they have been used as evidence to support the 2R hypothesis. Sets of duplicated, triplicated and quadruplicated genes, with the related genes on different chromosomes, are deduced to be remnants from genome or chromosomal duplications. A set of paralogy regions is together called a paralogon. Well-studied sets of paralogy regions include regions of human chromosome 2, 7, 12 and 17 containing Hox gene clusters, collagen genes, keratin genes and other duplicated genes, regions of human chromosomes 4, 5, 8 and 10 containing neuropeptide receptor genes, NK class homeobox genes and many more gene families, and parts of human chromosomes 13, 4, 5 and X containing the ParaHox genes and their neighbors. The Major histocompatibility complex (MHC) on human chromosome 6 has paralogy regions on chromosomes 1, 9 and 19. Much of the human genome seems to be assignable to paralogy regions.

Ohnology

A whole genome duplication event produces a genome with two ohnolog copies of each gene.
 
A speciation event produces orthologs of a gene in the two daughter species. A horizontal gene transfer event from one species to another adds a xenolog of the gene to its genome.
 
A speciation event produces orthologs of a gene in the two daughter species. Subsequent hybridisation of those species generates a hybrid genome with a homoeolog copy of each gene from both species.

Ohnologous genes are paralogous genes that have originated by a process of 2R whole-genome duplication. The name was first given in honour of Susumu Ohno by Ken Wolfe. Ohnologues are useful for evolutionary analysis because all ohnologues in a genome have been diverging for the same length of time (since their common origin in the whole genome duplication). Ohnologues are also known to show greater association with cancers, dominant genetic disorders, and pathogenic copy number variations.

Xenology

Homologs resulting from horizontal gene transfer between two organisms are termed xenologs. Xenologs can have different functions if the new environment is vastly different for the horizontally moving gene. In general, though, xenologs typically have similar function in both organisms. The term was coined by Walter Fitch.

Homoeology

Homoeologous (also spelled homeologous) chromosomes or parts of chromosomes are those brought together following inter-species hybridization and allopolyploidization to form a hybrid genome, and whose relationship was completely homologous in an ancestral species. In allopolyploids, the homologous chromosomes within each parental sub-genome should pair faithfully during meiosis, leading to disomic inheritance; however in some allopolyploids, the homoeologous chromosomes of the parental genomes may be nearly as similar to one another as the homologous chromosomes, leading to tetrasomic inheritance (four chromosomes pairing at meiosis), intergenomic recombination, and reduced fertility.

Gametology

Gametology denotes the relationship between homologous genes on non-recombining, opposite sex chromosomes. The term was coined by García-Moreno and Mindell (2000). Gametologs result from the origination of genetic sex determination and barriers to recombination between sex chromosomes. Examples of gametologs include CHDW and CHDZ in birds.

Delayed-choice quantum eraser

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser A delayed-cho...