
Monday, May 29, 2023

Quantum information

From Wikipedia, the free encyclopedia
Optical lattices use lasers to separate rubidium atoms (red) for use as information bits in neutral-atom quantum processors—prototype devices which designers are trying to develop into full-fledged quantum computers.

Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information theory, and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of Von Neumann entropy and the general computational term.

It is an interdisciplinary field that involves quantum mechanics, computer science, information theory, philosophy and cryptography among other fields. Its study is also relevant to disciplines such as cognitive science, psychology and neuroscience. Its main focus is extracting information from matter at the microscopic scale. Observation in science is one of the most important ways of acquiring information, and measurement is required in order to quantify the observation, making this crucial to the scientific method. In quantum mechanics, due to the uncertainty principle, non-commuting observables cannot be precisely measured simultaneously, as an eigenstate in one basis is not an eigenstate in the other basis. According to the eigenstate–eigenvalue link, an observable is well-defined (definite) when the state of the system is an eigenstate of the observable. Since any two non-commuting observables are not simultaneously well-defined, a quantum state can never contain definitive information about both non-commuting observables.

Information is something physical that is encoded in the state of a quantum system. While quantum mechanics deals with examining properties of matter at the microscopic level, quantum information science focuses on extracting information from those properties, and quantum computation manipulates and processes information – performs logical operations – using quantum information processing techniques.

Quantum information, like classical information, can be processed using digital computers, transmitted from one location to another, manipulated with algorithms, and analyzed with computer science and mathematics. Just as the bit is the basic unit of classical information, the qubit is the basic unit of quantum information. Quantum information can be measured using von Neumann entropy.

Recently, the field of quantum computing has become an active research area because of its potential to disrupt modern computation, communication, and cryptography.

History and development

Development from fundamental quantum mechanics

The history of quantum information theory began at the turn of the 20th century when classical physics was revolutionized into quantum physics. The theories of classical physics were predicting absurdities such as the ultraviolet catastrophe, or electrons spiraling into the nucleus. At first these problems were brushed aside by adding ad hoc hypotheses to classical physics. Soon, it became apparent that a new theory must be created in order to make sense of these absurdities, and the theory of quantum mechanics was born.

Quantum mechanics was formulated by Schrödinger using wave mechanics and Heisenberg using matrix mechanics. The equivalence of these methods was proven later. Their formulations described the dynamics of microscopic systems but had several unsatisfactory aspects in describing measurement processes. Von Neumann formulated quantum theory using operator algebra in a way that described measurement as well as dynamics. These studies emphasized the philosophical aspects of measurement rather than a quantitative approach to extracting information via measurements.


Evolution of states and operators in the Schrödinger (S), Heisenberg (H), and interaction (I) pictures:

                   S          H          I
Ket state          evolves    constant   evolves
Observable         constant   evolves    evolves
Density matrix     evolves    constant   evolves

Development from communication

In the 1960s, Stratonovich, Helstrom and Gordon proposed a formulation of optical communications using quantum mechanics. This was the first historical appearance of quantum information theory. They mainly studied error probabilities and channel capacities for communication. Later, Alexander Holevo obtained an upper bound on the communication rate for the transmission of a classical message via a quantum channel (the Holevo bound).

Development from atomic physics and relativity

In the 1970s, techniques for manipulating single-atom quantum states, such as the atom trap and the scanning tunneling microscope, began to be developed, making it possible to isolate single atoms and arrange them in arrays. Prior to these developments, precise control over single quantum systems was not possible, and experiments utilized coarser, simultaneous control over a large number of quantum systems. The development of viable single-state manipulation techniques led to increased interest in the field of quantum information and computation.

In the 1980s, interest arose in whether it might be possible to use quantum effects to disprove Einstein's theory of relativity. If it were possible to clone an unknown quantum state, it would be possible to use entangled quantum states to transmit information faster than the speed of light, disproving Einstein's theory. However, the no-cloning theorem showed that such cloning is impossible. The theorem was one of the earliest results of quantum information theory.

Development from cryptography

Despite all the excitement and interest over studying isolated quantum systems and trying to find a way to circumvent the theory of relativity, research in quantum information theory became stagnant in the 1980s. However, around the same time another field began to apply quantum mechanics to information and computation: cryptography. In a general sense, cryptography is the problem of doing communication or computation involving two or more parties who may not trust one another.

Bennett and Brassard developed a communication channel on which it is impossible to eavesdrop without being detected, a way of communicating secretly at long distances using the BB84 quantum cryptographic protocol. The key idea was the use of the fundamental principle of quantum mechanics that observation disturbs the observed: introducing an eavesdropper into a secure communication line will immediately let the two communicating parties know of the eavesdropper's presence.

Development from computer science and mathematics

Alan Turing's revolutionary idea of a programmable computer, the Turing machine, showed that any real-world computation can be translated into an equivalent computation involving a Turing machine. This is known as the Church–Turing thesis.

Soon enough, the first computers were made and computer hardware grew at such a fast pace that the growth, through experience in production, was codified into an empirical relationship called Moore's law. This 'law' is a projective trend that states that the number of transistors in an integrated circuit doubles every two years. As transistors became smaller and smaller in order to pack more computing power per unit area, quantum effects started to show up in the electronics, resulting in inadvertent interference. This led to the advent of quantum computing, which uses quantum mechanics to design algorithms.

At this point, quantum computers showed promise of being much faster than classical computers for certain specific problems. One such example problem was developed by David Deutsch and Richard Jozsa, known as the Deutsch–Jozsa algorithm. This problem, however, held little to no practical application. In 1994, Peter Shor came up with a very important and practical problem: finding the prime factors of an integer. This factoring problem (together with the related discrete logarithm problem) can be solved efficiently on a quantum computer, while no efficient classical algorithm is known, suggesting that quantum computers are more powerful than classical computers for certain tasks.
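
To make the Deutsch–Jozsa separation concrete, here is a minimal statevector simulation of its single-qubit case (Deutsch's algorithm) in plain numpy; the oracle construction and function names are our own illustrative choices, not any library's API. A single query to the oracle decides whether f is constant or balanced, where a classical algorithm needs two evaluations.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)

def oracle(f):
    """Build U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Decide 'constant' vs 'balanced' with a single oracle query."""
    state = np.kron([1, 0], [0, 1])        # |0>|1>
    state = np.kron(H, H) @ state          # Hadamard on both qubits
    state = oracle(f) @ state              # the one oracle query
    state = np.kron(H, I2) @ state         # Hadamard on the first qubit
    p1 = np.abs(state[2])**2 + np.abs(state[3])**2   # P(first qubit = 1)
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # -> constant
print(deutsch(lambda x: x))      # -> balanced
print(deutsch(lambda x: 1 - x))  # -> balanced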

Development from information theory

Around the time computer science was making a revolution, so were information theory and communication, through Claude Shannon. Shannon developed two fundamental theorems of information theory: the noiseless channel coding theorem and the noisy channel coding theorem. He also showed that error-correcting codes could be used to protect information being sent.

Quantum information theory followed a similar trajectory: in 1995, Benjamin Schumacher proved an analogue of Shannon's noiseless coding theorem using the qubit. A theory of error correction also developed, which allows quantum computers to compute efficiently in the presence of noise and to communicate reliably over noisy quantum channels.

Qubits and information theory

Quantum information differs strongly from classical information, epitomized by the bit, in many striking and unfamiliar ways. While the fundamental unit of classical information is the bit, the most basic unit of quantum information is the qubit. Classical information is measured using Shannon entropy, while the quantum mechanical analogue is von Neumann entropy. Given a statistical ensemble of quantum mechanical systems with density matrix \rho, it is given by

S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho)

Many of the same entropy measures in classical information theory can also be generalized to the quantum case, such as Holevo entropy and the conditional quantum entropy.

Unlike classical digital states (which are discrete), a qubit is continuous-valued, describable by a direction on the Bloch sphere. Despite being continuously valued in this way, a qubit is the smallest possible unit of quantum information, and it is impossible to measure its value precisely. Five famous theorems describe the limits on manipulation of quantum information.

  1. no-teleportation theorem, which states that a qubit cannot be (wholly) converted into classical bits; that is, it cannot be fully "read".
  2. no-cloning theorem, which prevents an arbitrary qubit from being copied.
  3. no-deleting theorem, which prevents an arbitrary qubit from being deleted.
  4. no-broadcast theorem, which prevents an arbitrary qubit from being delivered to multiple recipients, although it can be transported from place to place (e.g. via quantum teleportation).
  5. no-hiding theorem, which demonstrates the conservation of quantum information.

These theorems are proven from unitarity, which according to Leonard Susskind is the technical term for the statement that quantum information within the universe is conserved. The five theorems open up possibilities in quantum information processing.

Quantum information processing

The state of a qubit contains all of its information. This state is frequently expressed as a vector on the Bloch sphere. The state can be changed by applying linear transformations, or quantum gates, to it. These unitary transformations are described as rotations on the Bloch sphere. While classical gates correspond to the familiar operations of Boolean logic, quantum gates are physical unitary operators.
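
As a small illustration of gates as Bloch-sphere rotations, the following sketch (plain numpy; the helper names are ours) applies a rotation about the y-axis to |0> and reads off the Bloch coordinates as expectation values of the Pauli operators.

import numpy as np

# Pauli operators
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])

def ry(theta):
    """Unitary rotation by angle theta about the Bloch-sphere y-axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def bloch(psi):
    """Bloch vector (x, y, z) of a pure state |psi>."""
    return tuple(float(np.real(np.conj(psi) @ P @ psi)) for P in (X, Y, Z))

psi = np.array([1, 0], dtype=complex)   # |0>, the north pole
print(bloch(psi))                       # (0.0, 0.0, 1.0)

psi = ry(np.pi / 2) @ psi               # rotate 90 degrees: |0> -> |+>
print(bloch(psi))                       # (1.0, 0.0, ~0.0), the equator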

  • Due to the volatility of quantum systems and the impossibility of copying states, the storing of quantum information is much more difficult than storing classical information. Nevertheless, with the use of quantum error correction quantum information can still be reliably stored in principle. The existence of quantum error correcting codes has also led to the possibility of fault-tolerant quantum computation.
  • Classical bits can be encoded into and subsequently retrieved from configurations of qubits, through the use of quantum gates. By itself, a single qubit can convey no more than one bit of accessible classical information about its preparation. This is Holevo's theorem. However, in superdense coding a sender, by acting on one of two entangled qubits, can convey two bits of accessible information about their joint state to a receiver (see the sketch after this list).
  • Quantum information can be moved about, in a quantum channel, analogous to the concept of a classical communications channel. Quantum messages have a finite size, measured in qubits; quantum channels have a finite channel capacity, measured in qubits per second.
  • Quantum information, and changes in quantum information, can be quantitatively measured by using an analogue of Shannon entropy, called the von Neumann entropy.
  • In some cases quantum algorithms can be used to perform computations faster than in any known classical algorithm. The most famous example of this is Shor's algorithm that can factor numbers in polynomial time, compared to the best classical algorithms that take sub-exponential time. As factorization is an important part of the safety of RSA encryption, Shor's algorithm sparked the new field of post-quantum cryptography that tries to find encryption schemes that remain safe even when quantum computers are in play. Other examples of algorithms that demonstrate quantum supremacy include Grover's search algorithm, where the quantum algorithm gives a quadratic speed-up over the best possible classical algorithm. The complexity class of problems efficiently solvable by a quantum computer is known as BQP.
  • Quantum key distribution (QKD) allows unconditionally secure transmission of classical information, unlike classical encryption, which can always be broken in principle, if not in practice. Note that certain subtle points regarding the security of QKD are still hotly debated.
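
The superdense-coding item above can be verified with a short statevector sketch, assuming the standard gate set (numpy only; function names are illustrative): Alice encodes two classical bits by acting only on her half of a shared Bell pair, and Bob recovers both bits with a Bell-basis measurement.

import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

def superdense(b1, b2):
    """Send classical bits (b1, b2) by acting on one qubit of a Bell pair."""
    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # (|00> + |11>)/sqrt(2)
    # Alice encodes: Z for the first bit, X for the second, on her qubit only
    gate = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b2)
    state = np.kron(gate, I2) @ bell
    # Bob decodes with a Bell measurement: CNOT, then H on Alice's qubit
    state = np.kron(H, I2) @ (CNOT @ state)
    outcome = int(np.argmax(np.abs(state) ** 2))      # deterministic here
    return outcome >> 1, outcome & 1

for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert superdense(*bits) == bits
print("all four two-bit messages decoded correctly")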

The study of all of the above topics and differences comprises quantum information theory.

Relation to quantum mechanics

Quantum mechanics is the study of how microscopic physical systems change dynamically in nature. In the field of quantum information theory, the quantum systems studied are abstracted away from any real world counterpart. A qubit might for instance physically be a photon in a linear optical quantum computer, an ion in a trapped ion quantum computer, or it might be a large collection of atoms as in a superconducting quantum computer. Regardless of the physical implementation, the limits and features of qubits implied by quantum information theory hold as all these systems are mathematically described by the same apparatus of density matrices over the complex numbers. Another important difference with quantum mechanics is that, while quantum mechanics often studies infinite-dimensional systems such as a harmonic oscillator, quantum information theory concerns itself with both continuous-variable systems and finite-dimensional systems.

Entropy and information

Entropy measures the uncertainty in the state of a physical system. Entropy can be studied from the point of view of both the classical and quantum information theories.

Classical information theory

Classical information is based on the concepts of information laid out by Claude Shannon. Classical information, in principle, can be stored as binary strings of bits. Any system having two distinguishable states can serve as a bit.

Shannon entropy

Shannon entropy is the quantification of the information gained by measuring the value of a random variable. Another way of thinking about it is by looking at the uncertainty of a system prior to measurement. As a result, entropy, as pictured by Shannon, can be seen either as a measure of the uncertainty prior to making a measurement or as a measure of information gained after making said measurement.

Shannon entropy, written as a functional of a discrete probability distribution p_1, \ldots, p_n associated with events 1, \ldots, n, can be seen as the average information associated with this set of events, in units of bits:

H(p_1, \ldots, p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i

This definition of entropy can be used to quantify the physical resources required to store the output of an information source. The ways of interpreting Shannon entropy discussed above are usually only meaningful when the number of samples of an experiment is large.
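
A minimal sketch of this definition in Python (the function name is our own):

import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2(p_i), in bits; zero-probability terms contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits
print(shannon_entropy([1.0]))        # certain outcome: 0.0 bits gained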

Rényi entropy

The Rényi entropy is a generalization of Shannon entropy defined above. The Rényi entropy of order r, written as a function of a discrete probability distribution p_1, \ldots, p_n associated with events 1, \ldots, n, is defined as:

H_r(p_1, \ldots, p_n) = \frac{1}{1-r} \log_2 \sum_{i=1}^{n} p_i^r

for 0 < r < \infty and r \neq 1.

We arrive at the definition of Shannon entropy from Rényi when r \to 1, of Hartley entropy (or max-entropy) when r \to 0, and min-entropy when r \to \infty.
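
The limiting cases can be checked numerically with a small sketch (Python; the values of r near the limits are stand-ins, since the limits themselves are excluded from the definition):

import math

def renyi_entropy(p, r):
    """Renyi entropy of order r (0 < r < inf, r != 1), in bits."""
    return math.log2(sum(pi ** r for pi in p)) / (1 - r)

p = [0.7, 0.2, 0.1]
print(renyi_entropy(p, 1 + 1e-9))   # r -> 1: approaches Shannon entropy (~1.157)
print(renyi_entropy(p, 1e-9))       # r -> 0: approaches Hartley entropy, log2(3)
print(renyi_entropy(p, 100))        # large r: approaches min-entropy, -log2(0.7)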

Quantum information theory

Quantum information theory is largely an extension of classical information theory to quantum systems. Classical information is produced when measurements of quantum systems are made.

Von Neumann entropy

One interpretation of Shannon entropy was the uncertainty associated with a probability distribution. When we want to describe the information or the uncertainty of a quantum state, the probability distribution is simply swapped out for a density operator \rho:

S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho) = -\sum_i \lambda_i \log_2 \lambda_i

where the \lambda_i are the eigenvalues of \rho.

Von Neumann entropy plays a similar role in quantum information to that of Shannon entropy in classical information.
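
A minimal numerical sketch of this formula (Python with numpy; the function name is ours): a pure state has zero entropy, while the maximally mixed qubit attains one bit.

import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lambda_i log2(lambda_i) over the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)       # rho is Hermitian
    evals = evals[evals > 1e-12]          # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

pure = np.array([[1, 0], [0, 0]])         # |0><0|, a pure state
mixed = np.eye(2) / 2                     # maximally mixed single qubit
print(von_neumann_entropy(pure))          # 0.0
print(von_neumann_entropy(mixed))         # 1.0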

Applications

Quantum communication

Quantum communication is one of the applications of quantum physics and quantum information. Famous theorems such as the no-cloning theorem illustrate important properties of quantum communication. Dense coding and quantum teleportation are also applications of quantum communication; they are two opposite ways to communicate using qubits. Teleportation transfers one qubit from Alice to Bob by communicating two classical bits, under the assumption that Alice and Bob share a Bell state; dense coding transfers two classical bits from Alice to Bob by using one qubit, again under the same assumption that Alice and Bob share a Bell state.

Quantum key distribution

One of the best known applications of quantum cryptography is quantum key distribution, which provides a theoretical solution to the security issue of exchanging a classical key. The advantage of quantum key distribution is that it is impossible to copy a quantum key because of the no-cloning theorem. If someone tries to read the encoded data, the quantum state being transmitted will change. This can be used to detect eavesdropping.

BB84

The first quantum key distribution scheme, BB84, was developed by Charles Bennett and Gilles Brassard in 1984. It is usually explained as a method of securely communicating a private key from one party to another for use in one-time pad encryption.
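
The logic of BB84 (random bases, sifting, and the disturbance an intercept-resend eavesdropper introduces) can be illustrated with a toy classical simulation. The structure and the roughly 25% sifted-key error rate under attack follow from the protocol, but all names below are our own.

import random

def bb84(n, eavesdrop=False):
    """Toy BB84: returns Alice's and Bob's sifted keys over n photons."""
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]
    bob_bases   = [random.choice("+x") for _ in range(n)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:                      # intercept-resend attack
            eve_basis = random.choice("+x")
            if eve_basis != a_basis:       # Eve's wrong basis randomizes the bit
                bit = random.randint(0, 1)
            a_basis = eve_basis            # photon is re-sent in Eve's basis
        if b_basis != a_basis:             # wrong basis: Bob's result is random
            bit = random.randint(0, 1)
        bob_bits.append(bit)

    # Sifting: keep only positions where Alice's and Bob's bases agree
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

a, b = bb84(10000)
print("no Eve:  ", sum(x != y for x, y in zip(a, b)) / len(a))   # ~0.00
a, b = bb84(10000, eavesdrop=True)
print("with Eve:", sum(x != y for x, y in zip(a, b)) / len(a))   # ~0.25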

E91

E91 was devised by Artur Ekert in 1991. His scheme uses entangled pairs of photons. These pairs can be created by Alice, by Bob, or by a third party, including the eavesdropper Eve. One of the photons is distributed to Alice and the other to Bob, so that each ends up with one photon from the pair.

This scheme relies on two properties of quantum entanglement:

  1. The entangled states are perfectly correlated, which means that if Alice and Bob both measure their particles having either a vertical or horizontal polarization, they always get the same answer with 100% probability. The same is true if they both measure any other pair of complementary (orthogonal) polarizations. This necessitates that the two distant parties have exact directionality synchronization. However, according to quantum mechanics the outcome is completely random, so it is impossible for Alice to predict whether she will get vertical polarization or horizontal polarization results.
  2. Any attempt at eavesdropping by Eve destroys this quantum entanglement in a way that Alice and Bob can detect.

B92

B92 is a simpler version of BB84.

The main difference between B92 and BB84:

  • B92 only needs two states
  • BB84 needs four polarization states

Like BB84, Alice transmits to Bob a string of photons encoded with randomly chosen bits, but this time the bit value determines the state she must send. Bob still randomly chooses a basis by which to measure, but if he chooses the wrong basis, he will not measure anything conclusive, an outcome guaranteed by quantum mechanics because the two signal states are non-orthogonal. Bob can simply tell Alice after each bit she sends whether or not he measured it conclusively.
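
A toy simulation of this sifting logic, under the common textbook choice of |0> and |+> as the two signal states (an illustrative assumption; function names are ours): about a quarter of the sent bits yield a conclusive result, and those bits agree between Alice and Bob.

import random

def b92(n):
    """Toy B92: bit 0 sent as |0>, bit 1 as |+>; only two states needed."""
    alice_key, bob_key = [], []
    for _ in range(n):
        bit = random.randint(0, 1)
        basis = random.choice("ZX")               # Bob picks a basis at random
        if basis == "Z":
            outcome = 0 if bit == 0 else random.randint(0, 1)
            conclusive, decoded = (outcome == 1), 1   # |0> can never give 1
        else:
            outcome = 0 if bit == 1 else random.randint(0, 1)  # 0='+', 1='-'
            conclusive, decoded = (outcome == 1), 0   # |+> can never give '-'
        if conclusive:                            # Bob announces only this fact
            alice_key.append(bit)
            bob_key.append(decoded)
    return alice_key, bob_key

a, b = b92(10000)
print(len(a) / 10000, a == b)    # ~0.25 of bits kept; the kept keys agree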

Quantum computation

The most widely used model in quantum computation is the quantum circuit, which is based on the quantum bit, or "qubit". A qubit is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0 quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are measured, the result is always either a 0 or a 1; the probabilities of these two outcomes depend on the quantum state that the qubits were in immediately prior to the measurement.
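
A minimal sketch of this measurement rule (the Born rule) in Python with numpy, using an arbitrary example state:

import numpy as np

rng = np.random.default_rng(0)

psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # superposition of |0> and |1>
probs = np.abs(psi) ** 2                       # Born rule: (0.3, 0.7)
samples = rng.choice([0, 1], size=10000, p=probs)
print(probs, samples.mean())                   # outcomes average ~0.7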

Any quantum computation algorithm can be represented as a network of quantum logic gates.

Quantum decoherence

If a quantum system were perfectly isolated, it would maintain coherence indefinitely, but it would be impossible to manipulate or investigate it. If it is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time; this process is called quantum decoherence. As a result of this process, quantum behavior is apparently lost, just as energy appears to be lost by friction in classical mechanics.
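
A toy dephasing model makes this apparent loss concrete. In the sketch below (numpy; the dephasing time T2 and its value are illustrative assumptions), coupling to the environment exponentially suppresses the off-diagonal coherences of a qubit's density matrix while the populations survive.

import numpy as np

def dephase(rho, t, T2):
    """Suppress coherences (off-diagonals) by exp(-t/T2); keep populations."""
    out = rho.copy()
    decay = np.exp(-t / T2)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

plus = np.array([[0.5, 0.5], [0.5, 0.5]])      # |+><+|, fully coherent
for t in (0.0, 1.0, 5.0):
    print(t, "\n", dephase(plus, t, T2=1.0))   # drifts toward the mixed state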

Quantum error correction

Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is essential if one is to achieve fault-tolerant quantum computation that can deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty quantum preparation, and faulty measurements.

Peter Shor first discovered this method of formulating a quantum error correcting code by storing the information of one qubit onto a highly entangled state of nine qubits. A quantum error correcting code protects quantum information against errors.
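
Shor's nine-qubit code is too long for a short example, but the closely related three-qubit bit-flip code below (numpy; helper names are ours) shows the core idea under the assumption of at most one bit-flip error: parity syndromes identify the flipped qubit without revealing the encoded amplitudes.

import numpy as np

def apply_x(state, qubit):
    """Flip `qubit` (0 = leftmost of three) in an 8-amplitude statevector."""
    out = np.zeros_like(state)
    for i in range(8):
        out[i ^ (1 << (2 - qubit))] = state[i]
    return out

def encode(alpha, beta):
    """alpha|000> + beta|111>: one logical qubit in three physical qubits."""
    state = np.zeros(8, dtype=complex)
    state[0b000], state[0b111] = alpha, beta
    return state

def correct(state):
    """Compute the two parity syndromes (equal on both branches of the
    code state) and undo a single bit flip if one is indicated."""
    i = int(np.flatnonzero(state)[0])      # any populated basis label
    bits = [(i >> 2) & 1, (i >> 1) & 1, i & 1]
    s1, s2 = bits[0] ^ bits[1], bits[1] ^ bits[2]
    flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get((s1, s2))
    return apply_x(state, flipped) if flipped is not None else state

logical = encode(0.6, 0.8)
for q in range(3):                          # try an X error on each qubit
    damaged = apply_x(logical, q)
    assert np.allclose(correct(damaged), logical)
print("single bit-flip on any qubit corrected")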

Radical (chemistry)

From Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Radical_(chemistry)
 
The hydroxyl radical, Lewis structure shown, contains one unpaired electron.
 
Hydroxide ion compared to a hydroxyl radical

In chemistry, a radical, also known as a free radical, is an atom, molecule, or ion that has at least one unpaired valence electron. With some exceptions, these unpaired electrons make radicals highly chemically reactive. Many radicals spontaneously dimerize. Most organic radicals have short lifetimes.

A notable example of a radical is the hydroxyl radical (HO·), a molecule that has one unpaired electron on the oxygen atom. Two other examples are triplet oxygen and triplet carbene (CH2), which have two unpaired electrons.

Radicals may be generated in a number of ways, but typical methods involve redox reactions. Ionizing radiation, heat, electrical discharges, and electrolysis are known to produce radicals. Radicals are intermediates in many chemical reactions, more so than is apparent from the balanced equations.

Radicals are important in combustion, atmospheric chemistry, polymerization, plasma chemistry, biochemistry, and many other chemical processes. A majority of natural products are generated by radical-generating enzymes. In living organisms, the radicals superoxide and nitric oxide and their reaction products regulate many processes, such as control of vascular tone and thus blood pressure. They also play a key role in the intermediary metabolism of various biological compounds. Such radicals can even be messengers in a process dubbed redox signaling. A radical may be trapped within a solvent cage or be otherwise bound.

Formation

Radicals are either (1) formed from spin-paired molecules or (2) from other radicals. Radicals are formed from spin-paired molecules through homolysis of weak bonds or electron transfer, also known as reduction. Radicals are formed from other radicals through substitution, addition, and elimination reactions.

Homolysis of a bromine molecule producing two bromine radicals

Radical formation from spin-paired molecules

Homolysis

Homolysis of dibenzoyl peroxide producing two benzoyloxy radicals

Homolysis makes two new radicals from a spin-paired molecule by breaking a covalent bond, leaving each of the fragments with one of the electrons in the bond. Because breaking a chemical bond requires energy, homolysis occurs under the addition of heat or light. The bond dissociation energy associated with homolysis depends on the stability of a given compound, and some weak bonds are able to homolyze at relatively lower temperatures.

Some homolysis reactions are particularly important because they serve as an initiator for other radical reactions. One such example is the homolysis of halogens, which occurs under light and serves as the driving force for radical halogenation reactions.

Another notable reaction is the homolysis of dibenzoyl peroxide, which results in the formation of two benzoyloxy radicals and acts as an initiator for many radical reactions.

Reduction of a ketone to form a ketyl radical

Reduction

Radicals can also form when a single electron is added to a spin-paired molecule, resulting in an electron transfer. This reaction, also called reduction, usually takes place with an alkali metal donating an electron to another spin-paired molecule.

Radical formation from other radicals

Abstraction

Radical abstraction between a benzoyloxy radical and hydrogen bromide
 
Radical addition of a bromine radical to a substituted alkene

Hydrogen abstraction describes when a hydrogen atom is removed from a hydrogen donor molecule (e.g. tin or silicon hydride) with its one electron. Abstraction produces a new radical and a new spin-paired molecule. This is different from homolysis, which results in two radicals from a single spin-paired molecule and doesn't include a radical as its reactant. Hydrogen abstraction is a fundamental process in radical chemistry because it serves as the final propagation step in many chemical reactions, converting carbon radicals into stable molecules. The figure to the right shows a radical abstraction between a benzoyloxy radical and a hydrogen bromide molecule, resulting in the production of a benzoic acid molecule and a bromine radical.

Addition

Radical addition describes when a radical is added to a spin-paired molecule to form a new radical. The figure on the right shows the addition of a bromine radical to an alkene. Radical addition follows the anti-Markovnikov rule, where the substituent is added to the less substituted carbon atom.

Elimination

Radical elimination can be viewed as the reverse of radical addition. In radical elimination, an unstable radical compound breaks down into a spin-paired molecule and a new radical compound. Shown below is an example of a radical elimination reaction, where a benzoyloxy radical breaks down into a phenyl radical and a carbon dioxide molecule.

A radical elimination reaction of a benzoyloxy radical

Stability

Stability of organic radicals

The radical derived from α-tocopherol

Although organic radicals are generally stable intrinsically (in isolation), practically speaking their existence is only transient because they tend to dimerize. Some are quite long-lived. Generally organic radicals are stabilized by any or all of these factors: presence of electronegativity, delocalization, and steric hindrance. The compound 2,2,6,6-tetramethylpiperidinyloxyl illustrates the combination of all three factors. It is a commercially available solid that, aside from being magnetic, behaves like a normal organic compound.

Electronegativity

Organic radicals are inherently electron deficient, thus the greater the electronegativity of the atom on which the unpaired electron resides, the less stable the radical. Among carbon, nitrogen, and oxygen, for example, a carbon-centered radical is the most stable and an oxygen-centered radical the least stable.

Electronegativity also factors into the stability of carbon atoms of different hybridizations. Greater s-character correlates to higher electronegativity of the carbon atom (due to the close proximity of s orbitals to the nucleus), and the greater the electronegativity the less stable a radical. sp-hybridized carbons (50% s-character) form the least stable radicals compared to sp3-hybridized carbons (25% s-character) which form the most stable radicals.

Delocalization

The delocalization of electrons across the structure of a radical, also known as its ability to form one or more resonance structures, allows for the electron-deficiency to be spread over several atoms, minimizing instability. Delocalization usually occurs in the presence of electron-donating groups, such as hydroxyl groups (−OH), ethers (−OR), adjacent alkenes, and amines (−NH2 or −NR2), or electron-withdrawing groups, such as C=O or C≡N.

Molecular orbital diagram of a radical with an electron-donating group

Delocalization effects can also be understood using molecular orbital theory as a lens, more specifically, by examining the intramolecular interaction of the unpaired electron with a donating group's pair of electrons or the empty π* orbital of an electron-withdrawing group in the form of a molecular orbital diagram. The HOMO of a radical is singly-occupied hence the orbital is aptly referred to as the SOMO, or the Singly-Occupied Molecular Orbital. For an electron-donating group, the SOMO interacts with the lower energy lone pair to form a new lower-energy filled bonding-orbital and a singly-filled new SOMO, higher in energy than the original. While the energy of the unpaired electron has increased, the decrease in energy of the lone pair forming the new bonding orbital outweighs the increase in energy of the new SOMO, resulting in a net decrease of the energy of the molecule. Therefore, electron-donating groups help stabilize radicals.

Molecular orbital diagram of a radical with an electron-withdrawing group

With a group that is instead electron-withdrawing, the SOMO then interacts with the empty π* orbital. There are no electrons occupying the higher energy orbital formed, while a new SOMO forms that is lower in energy. This results in a lower energy and higher stability of the radical species. Both donating groups and withdrawing groups stabilize radicals.

The relative stabilities of tertiary, secondary, primary and methyl radicals.

Another well-known albeit weaker form of delocalization is hyperconjugation. In radical chemistry, radicals are stabilized by hyperconjugation with adjacent alkyl groups. The donation of sigma (σ) C−H bonds into the partially empty radical orbitals helps to differentiate the stabilities of radicals on tertiary, secondary, and primary carbons. Tertiary carbon radicals have three σ C-H bonds that donate, secondary radicals only two, and primary radicals only one. Therefore, tertiary radicals are the most stable and primary radicals the least stable.

Steric hindrance

Radical form of N-hydroxypiperidine

Most simply, the greater the steric hindrance, the more difficult it is for reactions to take place, and the radical form is favored by default. For example, compare the hydrogen-abstracted form of N-hydroxypiperidine to the molecule TEMPO. TEMPO, or (2,2,6,6-tetramethylpiperidin-1-yl)oxyl, is too sterically hindered by the additional methyl groups to react, making it stable enough to be sold commercially in its radical form. N-Hydroxypiperidine, however, does not have the four methyl groups to impede the way of a reacting molecule, so its radical form is unstable.

Facile H-atom donors

The stability of many (or most) organic radicals is not indicated by their isolability but is manifested in their ability to function as donors of H. This property reflects a weakened bond to hydrogen, usually O−H but sometimes N−H or C−H. This behavior is important because these H donors serve as antioxidants in biology and in commerce. Illustrative is α-tocopherol (vitamin E). The tocopherol radical itself is insufficiently stable for isolation, but the parent molecule is a highly effective hydrogen-atom donor. The C−H bond is weakened in triphenylmethyl (trityl) derivatives.

2,2,6,6-Tetramethylpiperidinyloxyl is an example of a robust organic radical.

Inorganic radicals

A large variety of inorganic radicals are stable and in fact isolable. Examples include most first-row transition metal complexes.

With regard to main group radicals, the most abundant radical in the universe is also the most abundant chemical in the universe, H·. Most main group radicals are not, however, isolable, despite their intrinsic stability. Hydrogen radicals, for example, combine eagerly to form H2. Nitric oxide (NO) is a well known example of an isolable inorganic radical. Fremy's salt (potassium nitrosodisulfonate, (KSO3)2NO) is a related example. Many thiazyl radicals are known, despite the limited extent of π resonance stabilization.

Many radicals can be envisioned as the products of breaking of covalent bonds by homolysis. The homolytic bond dissociation energies, usually abbreviated as "ΔH°", are a measure of bond strength. Splitting H2 into 2 H·, for example, requires a ΔH° of +435 kJ/mol, while splitting Cl2 into two Cl· requires a ΔH° of +243 kJ/mol. For weak bonds, homolysis can be induced thermally. Strong bonds require high-energy photons or even flames to induce homolysis.
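
Using the ΔH° values just quoted, a short calculation shows which photons carry enough energy per mole to homolyze each bond (the constants are standard; the helper function is our own):

# Longest wavelength of light able to homolyze a bond, from its ΔH° (kJ/mol)
N_A = 6.022e23        # Avogadro's number, 1/mol
h   = 6.626e-34       # Planck constant, J*s
c   = 2.998e8         # speed of light, m/s

def max_wavelength_nm(bde_kj_per_mol):
    """lambda = N_A * h * c / ΔH°, converted to nanometers."""
    return N_A * h * c / (bde_kj_per_mol * 1e3) * 1e9

print(max_wavelength_nm(435))   # H-H:   ~275 nm (ultraviolet)
print(max_wavelength_nm(243))   # Cl-Cl: ~492 nm (visible light suffices)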

Diradicals

Diradicals are molecules containing two radical centers. Dioxygen (O2) is an important example of a stable diradical. Singlet oxygen, the lowest-energy non-radical state of dioxygen, is less stable than the diradical because of Hund's rule of maximum multiplicity. The relative stability of the oxygen diradical is primarily due to the spin-forbidden nature of the triplet-singlet transition required for it to grab electrons, i.e., "oxidize". The diradical state of oxygen also results in its paramagnetic character, which is demonstrated by its attraction to an external magnet. Diradicals can also occur in metal-oxo complexes, lending themselves to studies of spin-forbidden reactions in transition metal chemistry. Carbenes in their triplet state can be viewed as diradicals centred on the same atom; while these are usually highly reactive, persistent carbenes are known, with N-heterocyclic carbenes being the most common example.

Triplet carbenes and nitrenes are diradicals. Their chemical properties are distinct from the properties of their singlet analogues.

Occurrence of radicals

Combustion

Spectrum of the blue flame from a butane torch showing excited molecular radical band emission and Swan bands
 

A familiar radical reaction is combustion. The oxygen molecule is a stable diradical, best represented by ·O–O·. Because the spins of the electrons are parallel, this molecule is stable. While the ground state of oxygen is this unreactive spin-unpaired (triplet) diradical, an extremely reactive spin-paired (singlet) state is available. For combustion to occur, the energy barrier between these states must be overcome. This barrier can be overcome by heat, requiring high temperatures. The triplet-singlet transition is also "forbidden", which presents an additional barrier to the reaction. It also means molecular oxygen is relatively unreactive at room temperature except in the presence of a catalytic heavy atom such as iron or copper.

Combustion consists of various radical chain reactions that the singlet radical can initiate. The flammability of a given material strongly depends on the concentration of radicals that must be obtained before initiation and propagation reactions dominate leading to combustion of the material. Once the combustible material has been consumed, termination reactions again dominate and the flame dies out. As indicated, promotion of propagation or termination reactions alters flammability. For example, because lead itself deactivates radicals in the gasoline-air mixture, tetraethyl lead was once commonly added to gasoline. This prevents the combustion from initiating in an uncontrolled manner or in unburnt residues (engine knocking) or premature ignition (preignition).

When a hydrocarbon is burned, a large number of different oxygen radicals are involved. Initially, hydroperoxyl radicals (HOO·) are formed. These then react further to give organic hydroperoxides that break up into hydroxyl radicals (HO·).

Polymerization

Many polymerization reactions are initiated by radicals. Polymerization involves an initial radical adding to a non-radical (usually an alkene) to give new radicals. This process is the basis of the radical chain reaction. The art of polymerization entails the method by which the initiating radical is introduced. For example, methyl methacrylate (MMA) can be polymerized to produce poly(methyl methacrylate) (PMMA, Plexiglas or Perspex) via a repeating series of radical addition steps:

Radical intermediates in the formation of polymethacrylate (plexiglas or perspex)

Newer radical polymerization methods are known as living radical polymerization. Variants include reversible addition-fragmentation chain transfer (RAFT) and atom transfer radical polymerization (ATRP).

Being a prevalent radical, O2 reacts with many organic compounds to generate radicals together with the hydroperoxide radical. Drying oils and alkyd paints harden due to radical crosslinking initiated by oxygen from the atmosphere.

Atmospheric radicals

The most common radical in the lower atmosphere is molecular dioxygen. Photodissociation of source molecules produces other radicals. In the lower atmosphere, important radicals are produced by the photodissociation of nitrogen dioxide to an oxygen atom and nitric oxide (see eq. 1.1 below), which plays a key role in smog formation, and by the photodissociation of ozone to give the excited oxygen atom O(1D) (see eq. 1.2 below). The net and return reactions are also shown (eq. 1.3 and eq. 1.4, respectively).

NO2 + hν → NO + O   (eq. 1.1)

O3 + hν → O(1D) + O2   (eq. 1.2)

NO2 + O2 + hν → NO + O3   (eq. 1.3, net)

NO + O3 → NO2 + O2   (eq. 1.4, return)

In the upper atmosphere, the photodissociation of normally unreactive chlorofluorocarbons (CFCs) by solar ultraviolet radiation is an important source of radicals (see eq. 2.1 below). These reactions give the chlorine radical, Cl·, which catalyzes the conversion of ozone to O2, thus facilitating ozone depletion (eq. 2.2–2.4 below).

CFCl3 + hν → CFCl2· + Cl·   (eq. 2.1)

Cl· + O3 → ClO· + O2   (eq. 2.2)

O3 + hν → O2 + O   (eq. 2.3)

ClO· + O → Cl· + O2   (eq. 2.4)

2 O3 → 3 O2   (eq. 2.5, net)

Such reactions cause the depletion of the ozone layer, especially since the chlorine radical is free to engage in another reaction chain; consequently, the use of chlorofluorocarbons as refrigerants has been restricted.

In biology

Structure of the deoxyadenosyl radical, a common biosynthetic intermediate
 
An approximate structure of lignin, which constitutes about 30% of plant matter. It is formed by radical reactions.

Radicals play important roles in biology. Many of these are necessary for life, such as the intracellular killing of bacteria by phagocytic cells such as granulocytes and macrophages. Radicals are involved in cell signalling processes, known as redox signaling. For example, radical attack of linoleic acid produces a series of 13-hydroxyoctadecadienoic acids and 9-hydroxyoctadecadienoic acids, which may act to regulate localized tissue inflammatory and/or healing responses, pain perception, and the proliferation of malignant cells. Radical attacks on arachidonic acid and docosahexaenoic acid produce a similar but broader array of signaling products.

Radicals may also be involved in Parkinson's disease, senile and drug-induced deafness, schizophrenia, and Alzheimer's. The classic free-radical syndrome, the iron-storage disease hemochromatosis, is typically associated with a constellation of free-radical-related symptoms including movement disorder, psychosis, skin pigmentary melanin abnormalities, deafness, arthritis, and diabetes mellitus. The free-radical theory of aging proposes that radicals underlie the aging process itself. Similarly, the process of mitohormesis suggests that repeated exposure to radicals may extend life span.

Because radicals are necessary for life, the body has a number of mechanisms to minimize radical-induced damage and to repair damage that occurs, such as the enzymes superoxide dismutase, catalase, glutathione peroxidase and glutathione reductase. In addition, antioxidants play a key role in these defense mechanisms. These are often the three vitamins, vitamin A, vitamin C and vitamin E, and polyphenol antioxidants. Furthermore, there is good evidence indicating that bilirubin and uric acid can act as antioxidants to help neutralize certain radicals. Bilirubin comes from the breakdown of red blood cells' contents, while uric acid is a breakdown product of purines. Too much bilirubin, though, can lead to jaundice, which could eventually damage the central nervous system, while too much uric acid causes gout.

Reactive oxygen species

Reactive oxygen species or ROS are species such as superoxide, hydrogen peroxide, and hydroxyl radical, commonly associated with cell damage. ROS form as a natural by-product of the normal metabolism of oxygen and have important roles in cell signaling. Two important oxygen-centered radicals are superoxide and hydroxyl radical. They derive from molecular oxygen under reducing conditions. However, because of their reactivity, these same radicals can participate in unwanted side reactions resulting in cell damage. Excessive amounts of these radicals can lead to cell injury and death, which may contribute to many diseases such as cancer, stroke, myocardial infarction, diabetes, and other major disorders. Many forms of cancer are thought to be the result of reactions between radicals and DNA, potentially resulting in mutations that can adversely affect the cell cycle and potentially lead to malignancy. Some of the symptoms of aging such as atherosclerosis are also attributed to radical-induced oxidation of cholesterol to 7-ketocholesterol. In addition, radicals contribute to alcohol-induced liver damage, perhaps more than alcohol itself. Radicals produced by cigarette smoke are implicated in inactivation of alpha 1-antitrypsin in the lung. This process promotes the development of emphysema.

Oxybenzone has been found to form radicals in sunlight, and therefore may be associated with cell damage as well. This only occurred when it was combined with other ingredients commonly found in sunscreens, like titanium oxide and octyl methoxycinnamate.

ROS attack the polyunsaturated fatty acid, linoleic acid, to form a series of 13-hydroxyoctadecadienoic acid and 9-hydroxyoctadecadienoic acid products that serve as signaling molecules that may trigger responses that counter the tissue injury which caused their formation. ROS attacks other polyunsaturated fatty acids, e.g. arachidonic acid and docosahexaenoic acid, to produce a similar series of signaling products.

Reactive oxygen species are also used in controlled reactions involving singlet dioxygen, known as type II photooxygenation reactions, after Dexter energy transfer (triplet-triplet annihilation) from natural triplet dioxygen and the triplet excited state of a photosensitizer. Typical chemical transformations with this singlet dioxygen species involve, among others, conversion of cellulosic biowaste into new polymethine dyes.

History and nomenclature

Moses Gomberg (1866–1947), the founder of radical chemistry

Until late in the 20th century the word "radical" was used in chemistry to indicate any connected group of atoms, such as a methyl group or a carboxyl, whether it was part of a larger molecule or a molecule on its own. A radical is often known as an R group. The qualifier "free" was then needed to specify the unbound case. Following recent nomenclature revisions, a part of a larger molecule is now called a functional group or substituent, and "radical" now implies "free". However, the old nomenclature may still appear in some books.

The term radical was already in use when the now obsolete radical theory was developed. Louis-Bernard Guyton de Morveau introduced the phrase "radical" in 1785 and the phrase was employed by Antoine Lavoisier in 1789 in his Traité Élémentaire de Chimie. A radical was then identified as the root base of certain acids (the Latin word "radix" meaning "root"). Historically, the term radical in radical theory was also used for bound parts of the molecule, especially when they remain unchanged in reactions. These are now called functional groups. For example, methyl alcohol was described as consisting of a methyl "radical" and a hydroxyl "radical". Neither are radicals in the modern chemical sense, as they are permanently bound to each other, and have no unpaired, reactive electrons; however, they can be observed as radicals in mass spectrometry when broken apart by irradiation with energetic electrons.

In a modern context the first organic (carbon–containing) radical identified was the triphenylmethyl radical, (C6H5)3C. This species was discovered by Moses Gomberg in 1900. In 1933 Morris S. Kharasch and Frank Mayo proposed that free radicals were responsible for anti-Markovnikov addition of hydrogen bromide to allyl bromide.

In most fields of chemistry, the historical definition of radicals contends that the molecules have nonzero electron spin. However, in fields including spectroscopy and astrochemistry, the definition is slightly different. Gerhard Herzberg, who won the Nobel prize for his research into the electron structure and geometry of radicals, suggested a looser definition of free radicals: "any transient (chemically unstable) species (atom, molecule, or ion)". The main point of his suggestion is that there are many chemically unstable molecules that have zero spin, such as C2, C3, CH2 and so on. This definition is more convenient for discussions of transient chemical processes and astrochemistry; therefore researchers in these fields prefer to use this loose definition.

Depiction in chemical reactions

In chemical equations, radicals are frequently denoted by a dot placed immediately to the right of the atomic symbol or molecular formula as follows:

Cl2 + hν → 2 Cl·

Radical reaction mechanisms use single-headed arrows to depict the movement of single electrons.


The homolytic cleavage of the breaking bond is drawn with a 'fish-hook' arrow to distinguish from the usual movement of two electrons depicted by a standard curly arrow. The second electron of the breaking bond also moves to pair up with the attacking radical electron; this is not explicitly indicated in this case.

Radicals also take part in radical addition and radical substitution as reactive intermediates. Chain reactions involving radicals can usually be divided into three distinct processes. These are initiation, propagation, and termination.

  • Initiation reactions are those that result in a net increase in the number of radicals. They may involve the formation of radicals from stable species as in Reaction 1 above or they may involve reactions of radicals with stable species to form more radicals.
  • Propagation reactions are those reactions involving radicals in which the total number of radicals remains the same.
  • Termination reactions are those reactions resulting in a net decrease in the number of radicals. Typically two radicals combine to form a more stable species, for example:
    2 Cl· → Cl2

Inequality (mathematics)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Inequality...