
Monday, May 29, 2023

Combustion

From Wikipedia, the free encyclopedia
The flames caused as a result of a fuel undergoing combustion (burning)
 

Combustion, or burning, is a high-temperature exothermic redox chemical reaction between a fuel (the reductant) and an oxidant, usually atmospheric oxygen, that produces oxidized, often gaseous products, in a mixture termed smoke. Combustion does not always result in fire, because a flame is only visible when substances undergoing combustion vaporize, but when it does, a flame is a characteristic indicator of the reaction. While the activation energy must be overcome to initiate combustion (e.g., using a lit match to light a fire), the heat from a flame may provide enough energy to make the reaction self-sustaining.

Combustion is often a complicated sequence of elementary radical reactions. Solid fuels, such as wood and coal, first undergo endothermic pyrolysis to produce gaseous fuels whose combustion then supplies the heat required to produce more of them. Combustion is often hot enough that incandescent light in the form of either glowing or a flame is produced. A simple example can be seen in the combustion of hydrogen and oxygen into water vapor, a reaction which is commonly used to fuel rocket engines. This reaction releases 242 kJ/mol of heat and reduces the enthalpy accordingly (at constant temperature and pressure):

2 H2(g) + O2(g) → 2 H2O(g)
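As a quick back-of-the-envelope illustration of that figure, the short Python sketch below converts the per-mole heat release into heat released per kilogram of hydrogen; the 242 kJ/mol value and the molar mass of H2 are the only inputs, and the function name is our own.

```python
# Rough heat release from hydrogen combustion, using the 242 kJ/mol figure
# quoted above (per mole of H2 burned to water vapour).

DELTA_H_KJ_PER_MOL = 242.0   # kJ released per mol of H2
M_H2_G_PER_MOL = 2.016       # molar mass of H2, g/mol

def heat_released_mj(mass_h2_kg: float) -> float:
    """Heat released (MJ) by completely burning `mass_h2_kg` of hydrogen."""
    moles = mass_h2_kg * 1000.0 / M_H2_G_PER_MOL
    return moles * DELTA_H_KJ_PER_MOL / 1000.0  # kJ -> MJ

if __name__ == "__main__":
    print(f"{heat_released_mj(1.0):.0f} MJ per kg of H2")  # ~120 MJ/kg
```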

Uncatalyzed combustion in air requires relatively high temperatures. Complete combustion is stoichiometric concerning the fuel: there is no remaining fuel and, ideally, no residual oxidant. Thermodynamically, the chemical equilibrium of combustion in air is overwhelmingly on the side of the products. However, complete combustion is almost impossible to achieve, since the chemical equilibrium is not necessarily reached and the products may contain unburnt species such as carbon monoxide, hydrogen and even carbon (soot or ash). Thus, the produced smoke is usually toxic and contains unburned or partially oxidized products. Any combustion at high temperatures in atmospheric air, which is 78 percent nitrogen, will also create small amounts of several nitrogen oxides, commonly referred to as NOx, since the combustion of nitrogen is thermodynamically favored at high, but not low, temperatures. Since burning is rarely clean, flue gas cleaning or catalytic converters may be required by law.

Fires occur naturally, ignited by lightning strikes or by volcanic products. Combustion (fire) was the first controlled chemical reaction discovered by humans, in the form of campfires and bonfires, and continues to be the main method to produce energy for humanity. Usually, the fuel is carbon, hydrocarbons, or more complicated mixtures such as wood that contain partially oxidized hydrocarbons. The thermal energy produced from the combustion of either fossil fuels such as coal or oil, or from renewable fuels such as firewood, is harvested for diverse uses such as cooking, production of electricity or industrial or domestic heating. Combustion is also currently the only reaction used to power rockets. Combustion is also used to destroy (incinerate) waste, both nonhazardous and hazardous.

Oxidants for combustion have high oxidation potential and include atmospheric or pure oxygen, chlorine, fluorine, chlorine trifluoride, nitrous oxide and nitric acid. For instance, hydrogen burns in chlorine to form hydrogen chloride with the liberation of heat and light characteristic of combustion. Although usually not catalyzed, combustion can be catalyzed by platinum or vanadium, as in the contact process.

Types

Complete and incomplete

Complete

The combustion of methane, a hydrocarbon.

In complete combustion, the reactant burns in oxygen and produces a limited number of products. When a hydrocarbon burns in oxygen, the reaction will primarily yield carbon dioxide and water. When elements are burned, the products are primarily the most common oxides. Carbon will yield carbon dioxide, sulfur will yield sulfur dioxide, and iron will yield iron(III) oxide. Nitrogen is not considered to be a combustible substance when oxygen is the oxidant. Still, small amounts of various nitrogen oxides (commonly designated NOx species) form when air is the oxidant.

Combustion is not necessarily favorable to the maximum degree of oxidation, and it can be temperature-dependent. For example, sulfur trioxide is not produced quantitatively by the combustion of sulfur. NOx species appear in significant amounts above about 2,800 °F (1,540 °C), and more is produced at higher temperatures. The amount of NOx is also a function of oxygen excess.

In most industrial applications and in fires, air is the source of oxygen (O2). In air, each mole of oxygen is mixed with approximately 3.71 mol of nitrogen. Nitrogen does not take part in combustion, but at high temperatures some nitrogen will be converted to NOx (mostly NO, with much smaller amounts of NO2). On the other hand, when there is insufficient oxygen to combust the fuel completely, some fuel carbon is converted to carbon monoxide, and some of the hydrogen remains unreacted. A complete set of equations for the combustion of a hydrocarbon in air therefore requires an additional calculation for the distribution of oxygen between the carbon and hydrogen in the fuel.

The amount of air required for complete combustion is known as the "theoretical air" or "stoichiometric air". Air supplied above this value, which in practice is needed for optimal combustion, is known as the "excess air", and can vary from 5% for a natural gas boiler, to 40% for anthracite coal, to 300% for a gas turbine.
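As a rough illustration of these definitions, the sketch below computes the theoretical air and the actual air supplied at a given excess-air percentage for a simple hydrocarbon CxHy. It assumes dry air with 20.95 vol% O2 and complete combustion; the helper names are our own.

```python
# "Theoretical air" and actual air supply for a hydrocarbon fuel CxHy,
# following the stoichiometry described above. Assumes dry air with
# 20.95 vol% O2, i.e. ~4.77 mol of air per mol of O2.

MOL_AIR_PER_MOL_O2 = 1.0 / 0.2095  # ~4.77

def theoretical_air(x: int, y: int) -> float:
    """Moles of air needed for complete combustion of one mole of CxHy."""
    o2_needed = x + y / 4.0          # CxHy + (x + y/4) O2 -> x CO2 + (y/2) H2O
    return o2_needed * MOL_AIR_PER_MOL_O2

def actual_air(x: int, y: int, excess_percent: float) -> float:
    """Air supplied when running with the given percentage of excess air."""
    return theoretical_air(x, y) * (1.0 + excess_percent / 100.0)

if __name__ == "__main__":
    # Natural gas approximated as methane (CH4), run with 5% excess air:
    print(f"theoretical air: {theoretical_air(1, 4):.2f} mol per mol fuel")
    print(f"with 5% excess:  {actual_air(1, 4, 5.0):.2f} mol per mol fuel")
```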

Incomplete

Incomplete combustion will occur when there is not enough oxygen to allow the fuel to react completely to produce carbon dioxide and water. It also happens when the combustion is quenched by a heat sink, such as a solid surface or flame trap. As is the case with complete combustion, water is produced by incomplete combustion; however, carbon and carbon monoxide are produced instead of carbon dioxide.

For most fuels, such as diesel oil, coal, or wood, pyrolysis occurs before combustion. In incomplete combustion, products of pyrolysis remain unburnt and contaminate the smoke with noxious particulate matter and gases. Partially oxidized compounds are also a concern; partial oxidation of ethanol can produce harmful acetaldehyde, and carbon can produce toxic carbon monoxide.

The designs of combustion devices can improve the quality of combustion, such as burners and internal combustion engines. Further improvements are achievable by catalytic after-burning devices (such as catalytic converters) or by the simple partial return of the exhaust gases into the combustion process. Such devices are required by environmental legislation for cars in most countries. They may be necessary to enable large combustion devices, such as thermal power stations, to reach legal emission standards.

The degree of combustion can be measured and analyzed with test equipment. HVAC contractors, firefighters and engineers use combustion analyzers to test the efficiency of a burner during the combustion process. Also, the efficiency of an internal combustion engine can be measured in this way, and some U.S. states and local municipalities use combustion analysis to define and rate the efficiency of vehicles on the road today.

Carbon monoxide is one of the products of incomplete combustion. The formation of carbon monoxide releases less heat than the formation of carbon dioxide, so complete combustion is greatly preferred, especially as carbon monoxide is a poisonous gas. When breathed, carbon monoxide takes the place of oxygen and combines with some of the hemoglobin in the blood, rendering it unable to transport oxygen.

Nitrogen oxides and sulfur dioxide produced during combustion combine with water and oxygen in the atmosphere, creating nitric and sulfuric acids, which return to Earth's surface as acid deposition, or "acid rain." Acid deposition harms aquatic organisms and kills trees. By converting nutrients such as calcium and phosphorus into forms less available to plants, it reduces the productivity of ecosystems and farms. An additional problem associated with nitrogen oxides is that, along with hydrocarbon pollutants, they contribute to the formation of ground-level ozone, a major component of smog.

Human health problems

Breathing carbon monoxide causes headache, dizziness, vomiting, and nausea. If carbon monoxide levels are high enough, humans become unconscious or die. Exposure to moderate and high levels of carbon monoxide over long periods is positively correlated with the risk of heart disease. People who survive severe carbon monoxide poisoning may suffer long-term health problems. Carbon monoxide from the air is absorbed in the lungs and then binds with hemoglobin in red blood cells, reducing their capacity to carry oxygen throughout the body.

Smoldering

Smoldering is the slow, low-temperature, flameless form of combustion, sustained by the heat evolved when oxygen directly attacks the surface of a condensed-phase fuel. It is a typically incomplete combustion reaction. Solid materials that can sustain a smoldering reaction include coal, cellulose, wood, cotton, tobacco, peat, duff, humus, synthetic foams, charring polymers (including polyurethane foam) and dust. Common examples of smoldering phenomena are the initiation of residential fires on upholstered furniture by weak heat sources (e.g., a cigarette, a short-circuited wire) and the persistent combustion of biomass behind the flaming fronts of wildfires.

Rapid

Rapid combustion is a form of combustion, otherwise known as a fire, in which large amounts of heat and light energy are released, often resulting in a flame. It is used in machinery such as internal combustion engines and in thermobaric weapons. Such combustion is frequently called an explosion, though for an internal combustion engine this is inaccurate: an internal combustion engine nominally operates on a controlled rapid burn. When the fuel-air mixture in an internal combustion engine actually explodes, that is known as detonation.

Spontaneous

Spontaneous combustion is a type of combustion that occurs by self-heating (increase in temperature due to exothermic internal reactions), followed by thermal runaway (self-heating which rapidly accelerates to high temperatures) and finally, ignition. For example, phosphorus self-ignites at room temperature without the application of heat. Organic materials undergoing bacterial composting can generate enough heat to reach the point of combustion.

Turbulent

Combustion resulting in a turbulent flame is the type most used for industrial applications (e.g., gas turbines and gasoline engines) because the turbulence helps the mixing process between the fuel and oxidizer.

Micro-gravity

Colourized gray-scale composite image of the individual frames from a video of a backlit fuel droplet burning in microgravity.

The term 'micro' gravity refers to a gravitational state that is 'low' (i.e., 'micro' in the sense of 'small' and not necessarily a millionth of Earth's normal gravity) such that the influence of buoyancy on physical processes may be considered small relative to other flow processes that would be present at normal gravity. In such an environment, the thermal and flow transport dynamics can behave quite differently than in normal gravity conditions (e.g., a candle's flame takes the shape of a sphere). Microgravity combustion research contributes to the understanding of a wide variety of aspects that are relevant to both the environment of a spacecraft (e.g., fire dynamics relevant to crew safety on the International Space Station) and terrestrial (Earth-based) conditions (e.g., droplet combustion dynamics to assist developing new fuel blends for improved combustion, materials fabrication processes, thermal management of electronic systems, multiphase flow boiling dynamics, and many others).

Micro-combustion

Combustion processes that happen in very small volumes are considered micro-combustion. The high surface-to-volume ratio increases specific heat loss. Quenching distance plays a vital role in stabilizing the flame in such combustion chambers.

Chemical equations

Stoichiometric combustion of a hydrocarbon in oxygen

Generally, the chemical equation for stoichiometric combustion of a hydrocarbon in oxygen is:

CxHy + z O2 → x CO2 + (y/2) H2O

where z = x + y/4.

For example, the stoichiometric burning of propane in oxygen is:

C3H8 + 5 O2 → 3 CO2 + 4 H2O

Stoichiometric combustion of a hydrocarbon in air

If the stoichiometric combustion takes place using air as the oxygen source, the nitrogen present in the air (Atmosphere of Earth) can be added to the equation (although it does not react) to show the stoichiometric composition of the fuel in air and the composition of the resultant flue gas. Treating all non-oxygen components in air as nitrogen gives a 'nitrogen' to oxygen ratio of 3.77, i.e. (100% - O2%) / O2% where O2% is 20.95% vol:

CxHy + z O2 + 3.77z N2 → x CO2 + (y/2) H2O + 3.77z N2

where z = x + y/4.

For example, the stoichiometric combustion of propane (C3H8) in air is:

C3H8 + 5 O2 + 18.87 N2 → 3 CO2 + 4 H2O + 18.87 N2

The stoichiometric composition of propane in air is 1 / (1 + 5 + 18.87) = 4.02% vol.
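The propane figures above can be reproduced with a few lines of Python; beyond the stoichiometry itself, the only assumption is the 3.77 nitrogen-to-oxygen ratio from the text, and the function name is our own.

```python
# Stoichiometric volume fraction of a hydrocarbon CxHy in air:
# z = x + y/4 moles of O2 and 3.77*z moles of "nitrogen" accompany
# each mole of fuel, so the fuel fraction is 1 / (1 + z + 3.77*z).

N2_PER_O2 = (100.0 - 20.95) / 20.95  # ~3.77, all non-oxygen treated as nitrogen

def stoichiometric_fuel_fraction(x: int, y: int) -> float:
    """Stoichiometric volume fraction of CxHy in air."""
    z = x + y / 4.0                     # mol O2 per mol fuel
    return 1.0 / (1.0 + z + N2_PER_O2 * z)

if __name__ == "__main__":
    # Propane, C3H8: z = 5, so 1 / (1 + 5 + 18.87) ~= 4.02 vol%
    print(f"{100 * stoichiometric_fuel_fraction(3, 8):.2f} vol%")
```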

The stoichiometric combustion reaction for CαHβOγ in air:

CαHβOγ + (α + β/4 − γ/2) (O2 + 3.77 N2) → α CO2 + (β/2) H2O + 3.77 (α + β/4 − γ/2) N2

The stoichiometric combustion reaction for CαHβOγSδ:

CαHβOγSδ + (α + β/4 − γ/2 + δ) (O2 + 3.77 N2) → α CO2 + (β/2) H2O + δ SO2 + 3.77 (α + β/4 − γ/2 + δ) N2

The stoichiometric combustion reaction for CαHβOγNδSε:

CαHβOγNδSε + (α + β/4 − γ/2 + ε) (O2 + 3.77 N2) → α CO2 + (β/2) H2O + ε SO2 + (δ/2 + 3.77 (α + β/4 − γ/2 + ε)) N2

The stoichiometric combustion reaction for CαHβOγFδ:

CαHβOγFδ + (α + (β − δ)/4 − γ/2) (O2 + 3.77 N2) → α CO2 + ((β − δ)/2) H2O + δ HF + 3.77 (α + (β − δ)/4 − γ/2) N2

Trace combustion products

Various other substances begin to appear in significant amounts in combustion products when the flame temperature is above about 1600 K. When excess air is used, nitrogen may oxidize to NO and, to a much lesser extent, to NO2. CO forms by disproportionation of CO2, and H2 and OH form by disproportionation of H2O.

For example, when 1 mol of propane is burned with 28.6 mol of air (120% of the stoichiometric amount), the combustion products contain 3.3% O2. At 1400 K, the equilibrium combustion products contain 0.03% NO and 0.002% OH. At 1800 K, the combustion products contain 0.17% NO, 0.05% OH, 0.01% CO, and 0.004% H2.

Diesel engines are run with an excess of oxygen to combust small particles that tend to form with only a stoichiometric amount of oxygen, necessarily producing nitrogen oxide emissions. Both the United States and the European Union enforce limits on vehicle nitrogen oxide emissions, which necessitate the use of special catalytic converters or treatment of the exhaust with urea (see Diesel exhaust fluid).

Incomplete combustion of a hydrocarbon in oxygen

The incomplete (partial) combustion of a hydrocarbon with oxygen produces a gas mixture containing mainly CO2, CO, H2O, and H2. Such gas mixtures are commonly prepared for use as protective atmospheres for the heat-treatment of metals and for gas carburizing. The general reaction equation for incomplete combustion of one mole of a hydrocarbon in oxygen is:

CxHy + z O2 → a CO2 + b CO + c H2O + d H2

When z falls below roughly 50% of the stoichiometric value, CH4 can become an important combustion product; when z falls below roughly 35% of the stoichiometric value, elemental carbon may become stable.

The products of incomplete combustion can be calculated with the aid of a material balance, together with the assumption that the combustion products reach equilibrium. For example, in the combustion of one mole of propane (C3H8) with four moles of O2, seven moles of combustion gas are formed, and z is 80% of the stoichiometric value. The three elemental balance equations are:

  • Carbon: a + b = 3
  • Hydrogen: 2c + 2d = 8
  • Oxygen: 2a + b + c = 8

These three equations are insufficient in themselves to calculate the combustion gas composition. However, at the equilibrium position, the water-gas shift reaction gives another equation:

CO + H2O ⇌ CO2 + H2;   Keq = [CO2][H2] / ([CO][H2O])

For example, at 1200 K the value of Keq is 0.728. Solving, the combustion gas consists of 42.4% H2O, 29.0% CO2, 14.7% H2, and 13.9% CO. Carbon becomes a stable phase at 1200 K and 1 atm pressure when z is less than 30% of the stoichiometric value, at which point the combustion products contain more than 98% H2 and CO and about 0.5% CH4.
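The worked example above can be reproduced numerically. The sketch below (a minimal illustration, with helper names of our own choosing) reduces the three element balances plus the water-gas shift equilibrium to a quadratic in the CO2 yield and solves it for the propane case.

```python
import math

# Burning 1 mol of propane (C3H8) with 4 mol O2 (80% of stoichiometric) and
# closing the system with the water-gas shift equilibrium, Keq = 0.728 at
# 1200 K. Products assumed: a CO2 + b CO + c H2O + d H2.

def incomplete_combustion(x, y, z, keq):
    """Return moles (CO2, CO, H2O, H2) for CxHy burned with z mol of O2."""
    # Element balances give b = x - a, c = 2z - x - a, d = y/2 - c, and
    # Keq = (a*d)/(b*c) then reduces to a quadratic in a.
    p = y / 2 - 2 * z + x          # d = p + a
    q, r = x, 2 * z - x            # b = q - a, c = r - a
    A = 1 - keq
    B = p + keq * (q + r)
    C = -keq * q * r
    a = (-B + math.sqrt(B * B - 4 * A * C)) / (2 * A)
    return a, q - a, r - a, p + a

if __name__ == "__main__":
    co2, co, h2o, h2 = incomplete_combustion(3, 8, 4, 0.728)
    total = co2 + co + h2o + h2
    for name, n in [("H2O", h2o), ("CO2", co2), ("H2", h2), ("CO", co)]:
        # ~42.4, 29.0, 14.7 and 13.8 percent: the quoted composition to rounding
        print(f"{name}: {100 * n / total:.1f}%")
```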

Fuels

Substances or materials which undergo combustion are called fuels. The most common examples are natural gas, propane, kerosene, diesel, petrol, charcoal, coal, wood, etc.

Liquid fuels

Combustion of a liquid fuel in an oxidizing atmosphere actually happens in the gas phase. It is the vapor that burns, not the liquid. Therefore, a liquid will normally catch fire only above a certain temperature: its flash point. The flash point of liquid fuel is the lowest temperature at which it can form an ignitable mix with air. It is the minimum temperature at which there is enough evaporated fuel in the air to start combustion.

Gaseous fuels

Combustion of gaseous fuels may occur through one of four distinctive types of burning: diffusion flame, premixed flame, autoignitive reaction front, or as a detonation. The type of burning that actually occurs depends on the degree to which the fuel and oxidizer are mixed prior to heating: for example, a diffusion flame is formed if the fuel and oxidizer are separated initially, whereas a premixed flame is formed otherwise. Similarly, the type of burning also depends on the pressure: a detonation, for example, is an autoignitive reaction front coupled to a strong shock wave giving it its characteristic high-pressure peak and high detonation velocity.

Solid fuels

A general scheme of polymer combustion

The act of combustion consists of three relatively distinct but overlapping phases:

  • Preheating phase, when the unburned fuel is heated up to its flash point and then fire point. Flammable gases start being evolved in a process similar to dry distillation.
  • Distillation phase or gaseous phase, when the mix of evolved flammable gases with oxygen is ignited. Energy is produced in the form of heat and light. Flames are often visible. Heat transfer from the combustion to the solid maintains the evolution of flammable vapours.
  • Charcoal phase or solid phase, when the output of flammable gases from the material is too low for the persistent presence of flame and the charred fuel does not burn rapidly and just glows and later only smoulders.

Combustion management

Efficient process heating requires recovery of the largest possible part of a fuel's heat of combustion into the material being processed. There are many avenues of loss in the operation of a heating process. Typically, the dominant loss is sensible heat leaving with the offgas (i.e., the flue gas). The temperature and quantity of offgas indicate its heat content (enthalpy), so keeping its quantity low minimizes heat loss.

In a perfect furnace, the combustion air flow would be matched to the fuel flow to give each fuel molecule the exact amount of oxygen needed to cause complete combustion. However, in the real world, combustion does not proceed in a perfect manner. Unburned fuel (usually CO and H2) discharged from the system represents a heating value loss (as well as a safety hazard). Since combustibles are undesirable in the offgas, while the presence of unreacted oxygen there presents minimal safety and environmental concerns, the first principle of combustion management is to provide more oxygen than is theoretically needed to ensure that all the fuel burns. For methane (CH4) combustion, for example, slightly more than two molecules of oxygen are required.

The second principle of combustion management, however, is to not use too much oxygen. The correct amount of oxygen requires three types of measurement: first, active control of air and fuel flow; second, offgas oxygen measurement; and third, measurement of offgas combustibles. For each heating process, there exists an optimum condition of minimal offgas heat loss with acceptable levels of combustibles concentration. Minimizing excess oxygen pays an additional benefit: for a given offgas temperature, the NOx level is lowest when excess oxygen is kept lowest.

Adherence to these two principles is furthered by making material and heat balances on the combustion process. The material balance directly relates the air/fuel ratio to the percentage of O2 in the combustion gas. The heat balance relates the heat available for the charge to the overall net heat produced by fuel combustion. Additional material and heat balances can be made to quantify the thermal advantage from preheating the combustion air, or enriching it in oxygen.
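As an illustration of such a material balance, the sketch below relates the excess-air percentage to the O2 percentage in the dry flue gas, assuming complete combustion of a generic hydrocarbon CxHy and the 3.77 nitrogen-to-oxygen ratio used earlier; the function name is our own. Measured flue-gas O2 can then be compared against this curve to infer the actual air/fuel ratio.

```python
# Dry flue-gas O2 percentage as a function of excess air, for complete
# combustion of a hydrocarbon CxHy in air.

N2_PER_O2 = 3.77  # mol "nitrogen" per mol O2 in air

def dry_flue_gas_o2_percent(x: int, y: int, excess_air_percent: float) -> float:
    z = x + y / 4.0                      # stoichiometric O2, mol per mol fuel
    e = excess_air_percent / 100.0
    co2 = x
    o2 = e * z                           # unreacted oxygen
    n2 = N2_PER_O2 * z * (1.0 + e)
    dry_total = co2 + o2 + n2            # water is excluded on a dry basis
    return 100.0 * o2 / dry_total

if __name__ == "__main__":
    for excess in (0, 5, 10, 20):
        print(f"{excess:>3}% excess air -> "
              f"{dry_flue_gas_o2_percent(1, 4, excess):.1f}% O2 (methane)")
```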

Reaction mechanism

Combustion in oxygen is a chain reaction in which many distinct radical intermediates participate. The high energy required for initiation is explained by the unusual structure of the dioxygen molecule. The lowest-energy configuration of the dioxygen molecule is a stable, relatively unreactive diradical in a triplet spin state. Bonding can be described with three bonding electron pairs and two antibonding electrons, with spins aligned, such that the molecule has nonzero total angular momentum. Most fuels, on the other hand, are in a singlet state, with paired spins and zero total angular momentum. Interaction between the two is quantum mechanically a "forbidden transition", i.e. possible with a very low probability. To initiate combustion, energy is required to force dioxygen into a spin-paired state, or singlet oxygen. This intermediate is extremely reactive. The energy is supplied as heat, and the reaction then produces additional heat, which allows it to continue.

Combustion of hydrocarbons is thought to be initiated by hydrogen atom abstraction (not proton abstraction) from the fuel to oxygen, to give a hydroperoxide radical (HOO). This reacts further to give hydroperoxides, which break up to give hydroxyl radicals. There are a great variety of these processes that produce fuel radicals and oxidizing radicals. Oxidizing species include singlet oxygen, hydroxyl, monatomic oxygen, and hydroperoxyl. Such intermediates are short-lived and cannot be isolated. However, non-radical intermediates are stable and are produced in incomplete combustion. An example is acetaldehyde produced in the combustion of ethanol. An intermediate in the combustion of carbon and hydrocarbons, carbon monoxide, is of special importance because it is a poisonous gas, but also economically useful for the production of syngas.

Solid and heavy liquid fuels also undergo a great number of pyrolysis reactions that give more easily oxidized, gaseous fuels. These reactions are endothermic and require constant energy input from the ongoing combustion reactions. A lack of oxygen or other improperly designed conditions result in these noxious and carcinogenic pyrolysis products being emitted as thick, black smoke.

The rate of combustion is the amount of a material that undergoes combustion over a period of time. It can be expressed in grams per second (g/s) or kilograms per second (kg/s).

Detailed descriptions of combustion processes, from the chemical kinetics perspective, require the formulation of large and intricate webs of elementary reactions.[29] For instance, combustion of hydrocarbon fuels typically involves hundreds of chemical species reacting according to thousands of reactions.

The inclusion of such mechanisms within computational flow solvers still represents a challenging task, mainly in two respects. First, the number of degrees of freedom (proportional to the number of chemical species) can be dramatically large; second, the source term due to reactions introduces a wide spread of disparate time scales which makes the whole dynamical system stiff. As a result, the direct numerical simulation of turbulent reactive flows with heavy fuels soon becomes intractable even for modern supercomputers.
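The stiffness problem can be felt even on a toy mechanism. The sketch below uses the classic Robertson chemical-kinetics test problem, which is not a combustion mechanism but has rate constants spanning many orders of magnitude, together with SciPy's implicit Radau solver; an explicit solver such as RK45 would need vastly more (and much smaller) steps over the same interval.

```python
from scipy.integrate import solve_ivp

# Robertson problem: three species, rate constants 0.04, 3e7 and 1e4,
# a standard illustration of stiffness in chemical kinetics.

def robertson(t, y):
    y1, y2, y3 = y
    dy1 = -0.04 * y1 + 1.0e4 * y2 * y3
    dy2 = 0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2
    dy3 = 3.0e7 * y2 ** 2
    return [dy1, dy2, dy3]

if __name__ == "__main__":
    sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                    method="Radau", rtol=1e-6, atol=1e-10)
    print("steps taken:", sol.t.size)
    print("final composition:", sol.y[:, -1])
```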

Therefore, a plethora of methodologies have been devised for reducing the complexity of combustion mechanisms without resorting to high detail levels. Examples are provided by:

  • The Relaxation Redistribution Method (RRM)
  • The Intrinsic Low-Dimensional Manifold (ILDM) approach and further developments
  • The invariant-constrained equilibrium edge preimage curve method
  • A few variational approaches
  • The Computational Singular Perturbation (CSP) method and further developments
  • The Rate-Controlled Constrained Equilibrium (RCCE) and Quasi-Equilibrium Manifold (QEM) approach
  • The G-Scheme
  • The Method of Invariant Grids (MIG)

Kinetic modelling

Kinetic modelling may be explored for insight into the reaction mechanisms of thermal decomposition in the combustion of different materials, using, for instance, thermogravimetric analysis.
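A minimal sketch of such an analysis is shown below, assuming simple first-order Arrhenius kinetics and using made-up rate constants in place of real thermogravimetric data; the activation energy is recovered from a linear fit of ln k against 1/T.

```python
import numpy as np

# Estimate an activation energy Ea and pre-exponential factor A from rate
# constants at a few temperatures, via the Arrhenius relation k = A*exp(-Ea/RT).
# The k values below are hypothetical, for illustration only.

R = 8.314  # gas constant, J/(mol K)

temps_K = np.array([550.0, 600.0, 650.0, 700.0])
k_per_s = np.array([2.1e-4, 1.8e-3, 1.1e-2, 5.2e-2])  # hypothetical rate constants

slope, intercept = np.polyfit(1.0 / temps_K, np.log(k_per_s), 1)
Ea = -slope * R            # activation energy, J/mol
A = np.exp(intercept)      # pre-exponential factor, 1/s

print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, A ~ {A:.2e} 1/s")
```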

Temperature

Antoine Lavoisier conducting an experiment related to combustion generated by amplified sunlight.

Assuming perfect combustion conditions, such as complete combustion under adiabatic conditions (i.e., no heat loss or gain), the adiabatic combustion temperature can be determined. The formula that yields this temperature is based on the first law of thermodynamics and takes note of the fact that the heat of combustion is used entirely for heating the fuel, the combustion air or oxygen, and the combustion product gases (commonly referred to as the flue gas).

In the case of fossil fuels burnt in air, the combustion temperature depends on all of the following:

  • the heating value;
  • the stoichiometric air to fuel ratio λ;
  • the specific heat capacity of fuel and air;
  • the air and fuel inlet temperatures.

The adiabatic combustion temperature (also known as the adiabatic flame temperature) increases for higher heating values and inlet air and fuel temperatures and for stoichiometric air ratios approaching one.

Most commonly, the adiabatic combustion temperatures for coals are around 2,200 °C (3,992 °F) (for inlet air and fuel at ambient temperatures and for λ = 1.0), around 2,150 °C (3,902 °F) for oil and 2,000 °C (3,632 °F) for natural gas.
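A crude estimate of these temperatures can be obtained from the energy balance described above. The sketch below assumes a constant average flue-gas heat capacity and an illustrative air/fuel ratio, and it ignores dissociation and the temperature dependence of heat capacities, so it lands somewhat above the quoted values.

```python
# Rough adiabatic flame temperature: the fuel's lower heating value is assumed
# to go entirely into heating the flue gas at a constant average heat capacity.

def adiabatic_flame_temp_c(lhv_mj_per_kg: float,
                           air_fuel_ratio: float,
                           cp_flue_kj_per_kg_k: float = 1.3,
                           t_inlet_c: float = 25.0) -> float:
    flue_gas_per_kg_fuel = 1.0 + air_fuel_ratio           # kg flue gas per kg fuel
    delta_t = lhv_mj_per_kg * 1000.0 / (flue_gas_per_kg_fuel * cp_flue_kj_per_kg_k)
    return t_inlet_c + delta_t

if __name__ == "__main__":
    # Methane: LHV ~50 MJ/kg, stoichiometric air/fuel ratio ~17.2 kg/kg (assumed)
    print(f"{adiabatic_flame_temp_c(50.0, 17.2):.0f} degC")
```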

In industrial fired heaters, power station steam generators, and large gas-fired turbines, the more common way of expressing the usage of more than the stoichiometric combustion air is percent excess combustion air. For example, excess combustion air of 15 percent means that 15 percent more than the required stoichiometric air is being used.

Instabilities

Combustion instabilities are typically violent pressure oscillations in a combustion chamber. These pressure oscillations can be as high as 180 dB, and long-term exposure to these cyclic pressure and thermal loads reduces the life of engine components. In rockets, such as the F-1 engine used in the Saturn V program, instabilities led to massive damage to the combustion chamber and surrounding components. This problem was solved by redesigning the fuel injector. In liquid jet engines, the droplet size and distribution can be used to attenuate the instabilities. Combustion instabilities are a major concern in ground-based gas turbine engines because of NOx emissions. The tendency is to run lean, with an equivalence ratio less than 1, to reduce the combustion temperature and thus reduce the NOx emissions; however, running the combustion lean makes it very susceptible to combustion instability.

The Rayleigh Criterion is the basis for analysis of thermoacoustic combustion instability and is evaluated using the Rayleigh Index over one cycle of instability:

G(x) = (1/T) ∫ q′(x, t) p′(x, t) dt   (integrated over one cycle of period T)

where q' is the heat release rate perturbation and p' is the pressure fluctuation. When the heat release oscillations are in phase with the pressure oscillations, the Rayleigh Index is positive and the magnitude of the thermoacoustic instability is maximised. On the other hand, if the Rayleigh Index is negative, then thermoacoustic damping occurs. The Rayleigh Criterion implies that thermoacoustic instability can be optimally controlled by having heat release oscillations 180 degrees out of phase with pressure oscillations at the same frequency. This minimizes the Rayleigh Index.
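In discretized form, the Rayleigh Index is simply the cycle-average of the product q′(t)·p′(t). The sketch below evaluates it for synthetic in-phase and out-of-phase sinusoids to show the sign convention described above; the sampling parameters are arbitrary.

```python
import numpy as np

# Rayleigh index from sampled heat-release and pressure fluctuations: positive
# when the two are in phase (instability is driven), negative when they are
# 180 degrees out of phase (thermoacoustic damping).

def rayleigh_index(q_prime: np.ndarray, p_prime: np.ndarray, dt: float,
                   period: float) -> float:
    return np.trapz(q_prime * p_prime, dx=dt) / period

if __name__ == "__main__":
    f = 200.0                                        # oscillation frequency, Hz
    t = np.linspace(0.0, 1.0 / f, 1000, endpoint=False)
    dt = t[1] - t[0]
    p = np.sin(2 * np.pi * f * t)                    # pressure fluctuation
    q_in_phase = np.sin(2 * np.pi * f * t)           # heat release, in phase
    q_anti_phase = np.sin(2 * np.pi * f * t + np.pi) # 180 degrees out of phase
    print("in phase:     ", rayleigh_index(q_in_phase, p, dt, 1.0 / f))
    print("out of phase: ", rayleigh_index(q_anti_phase, p, dt, 1.0 / f))
```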

Categorical quantum mechanics

From Wikipedia, the free encyclopedia

Categorical quantum mechanics is the study of quantum foundations and quantum information using paradigms from mathematics and computer science, notably monoidal category theory. The primitive objects of study are physical processes, and the different ways that these can be composed. It was pioneered in 2004 by Samson Abramsky and Bob Coecke. Categorical quantum mechanics is entry 18M40 in MSC2020.

Mathematical setup

Mathematically, the basic setup is captured by a dagger symmetric monoidal category: composition of morphisms models sequential composition of processes, and the tensor product describes parallel composition of processes. The role of the dagger is to assign to each state a corresponding test. These can then be adorned with more structure to study various aspects.
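One concrete, deliberately pedestrian way to see this structure is in the category of finite-dimensional Hilbert spaces, where processes are complex matrices: sequential composition is matrix multiplication, parallel composition is the Kronecker product, and the dagger is the conjugate transpose. The sketch below, with helper names of our own choosing, checks the defining compatibility (g∘f)† = f†∘g†.

```python
import numpy as np

# Processes as complex matrices, with the three pieces of structure used above.

def compose(g: np.ndarray, f: np.ndarray) -> np.ndarray:
    """Sequential composition: first apply f, then g."""
    return g @ f

def tensor(f: np.ndarray, g: np.ndarray) -> np.ndarray:
    """Parallel composition of two processes."""
    return np.kron(f, g)

def dagger(f: np.ndarray) -> np.ndarray:
    """The dagger: conjugate transpose, turning a state into a test."""
    return f.conj().T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    g = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    lhs = dagger(compose(g, f))
    rhs = compose(dagger(f), dagger(g))
    print(np.allclose(lhs, rhs))  # True: (g.f)^dagger = f^dagger . g^dagger
```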

A substantial portion of the mathematical backbone to this approach is drawn from Australian category theory, most notably from work by Max Kelly and M. L. Laplaza, Andre Joyal and Ross Street, A. Carboni and R. F. C. Walters, and Steve Lack. Modern textbooks include Categories for quantum theory and Picturing quantum processes.

Diagrammatic calculus

One of the most notable features of categorical quantum mechanics is that the compositional structure can be faithfully captured by string diagrams.

An illustration of the diagrammatic calculus: the quantum teleportation protocol as modeled in categorical quantum mechanics.

These diagrammatic languages can be traced back to Penrose graphical notation, developed in the early 1970s. Diagrammatic reasoning had been used before in quantum information science in the quantum circuit model; in categorical quantum mechanics, however, primitive gates like the CNOT-gate arise as composites of more basic algebras, resulting in a much more compact calculus. In particular, the ZX-calculus has sprung forth from categorical quantum mechanics as a diagrammatic counterpart to conventional linear algebraic reasoning about quantum gates. The ZX-calculus consists of a set of generators representing the common Pauli quantum gates and the Hadamard gate, equipped with a set of graphical rewrite rules governing their interaction. Although a standard set of rewrite rules has not yet been established, some versions have been proven to be complete, meaning that any equation that holds between two quantum circuits represented as diagrams can be proven using the rewrite rules. The ZX-calculus has been used to study, for instance, measurement-based quantum computing.
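The claim that CNOT arises as a composite of more basic algebraic pieces can be checked directly in ordinary linear algebra. In the sketch below, the matrices are the standard interpretations of the phase-free Z "copy" and X "merge" spiders (the particular wire layout and scalar factor are assumptions of this sketch); composing the two spiders reproduces the CNOT matrix.

```python
import numpy as np

# CNOT as a composite of a Z-basis copy on the control wire and an X-basis
# merge on the target wire, up to a scalar factor of sqrt(2).

I2 = np.eye(2)

# Z spider, 1 input -> 2 outputs: |0> -> |00>, |1> -> |11>
Z_copy = np.array([[1, 0],
                   [0, 0],
                   [0, 0],
                   [0, 1]], dtype=complex)

# X spider, 2 inputs -> 1 output: |+><++| + |-><--|, i.e. XOR up to 1/sqrt(2)
X_merge = np.array([[1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=complex) / np.sqrt(2)

# Copy the control, then merge the copy into the target
cnot_from_spiders = np.sqrt(2) * np.kron(I2, X_merge) @ np.kron(Z_copy, I2)

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

print(np.allclose(cnot_from_spiders, CNOT))  # True
```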

Branches of activity

Axiomatization and new models

One of the main successes of the categorical quantum mechanics research program is that from seemingly weak abstract constraints on the compositional structure, it turned out to be possible to derive many quantum mechanical phenomena. In contrast to earlier axiomatic approaches, which aimed to reconstruct Hilbert space quantum theory from reasonable assumptions, this attitude of not aiming for a complete axiomatization may lead to new interesting models that describe quantum phenomena, which could be of use when crafting future theories.

Completeness and representation results

There are several theorems relating the abstract setting of categorical quantum mechanics to traditional settings for quantum mechanics.

  • Completeness of the diagrammatic calculus: an equality of morphisms can be proved in the category of finite-dimensional Hilbert spaces if and only if it can be proved in the graphical language of dagger compact closed categories.
  • Dagger commutative Frobenius algebras in the category of finite-dimensional Hilbert spaces correspond to orthogonal bases. A version of this correspondence also holds in arbitrary dimension.
  • Certain extra axioms guarantee that the scalars embed into the field of complex numbers, namely the existence of finite dagger biproducts and dagger equalizers, well-pointedness, and a cardinality restriction on the scalars.
  • Certain extra axioms on top of the previous guarantee that a dagger symmetric monoidal category embeds into the category of Hilbert spaces, namely if every dagger monic is a dagger kernel. In that case the scalars form an involutive field instead of just embedding in one. If the category is compact, the embedding lands in finite-dimensional Hilbert spaces.
  • Six axioms characterize the category of Hilbert spaces completely, fulfilling the reconstruction programme. Two of these axioms concern a dagger and a tensor product; a third concerns biproducts.
  • Special dagger commutative Frobenius algebras in the category of sets and relations correspond to discrete abelian groupoids.
  • Finding complementary basis structures in the category of sets and relations corresponds to solving combinatorial problems involving Latin squares.
  • Dagger commutative Frobenius algebras on qubits must be either special or antispecial, relating to the fact that maximally entangled tripartite states are SLOCC-equivalent to either the GHZ or the W state.

Categorical quantum mechanics as logic

Categorical quantum mechanics can also be seen as a type theoretic form of quantum logic that, in contrast to traditional quantum logic, supports formal deductive reasoning. There exists software that supports and automates this reasoning.

There is another connection between categorical quantum mechanics and quantum logic, as subobjects in dagger kernel categories and dagger complemented biproduct categories form orthomodular lattices. In fact, the former setting allows logical quantifiers, the existence of which was never satisfactorily addressed in traditional quantum logic.

Categorical quantum mechanics as foundation for quantum mechanics

Categorical quantum mechanics allows a description of more general theories than quantum theory. This enables one to study which features single out quantum theory in contrast to other non-physical theories, hopefully providing some insight into the nature of quantum theory. For example, the framework allows a succinct compositional description of Spekkens' toy theory that allows one to pinpoint which structural ingredient causes it to be different from quantum theory.

Categorical quantum mechanics and DisCoCat

The DisCoCat framework applies categorical quantum mechanics to natural language processing. The types of a pregroup grammar are interpreted as quantum systems, i.e. as objects of a dagger compact category. The grammatical derivations are interpreted as quantum processes, e.g. a transitive verb takes its subject and object as input and produces a sentence as output. Function words such as determiners, prepositions, relative pronouns, coordinators, etc. can be modeled using the same Frobenius algebras that model classical communication. This can be understood as a monoidal functor from grammar to quantum processes, a formal analogy which led to the development of quantum natural language processing.

Television set

From Wikipedia, the free encyclopedia

A modern television displaying the Wikipedia homepage with poor color balance.

A television set or television receiver, more commonly called the television, TV, TV set, telly, tele, or tube, is a large device that combines a tuner, display, and loudspeakers, for the purpose of viewing and hearing television broadcasts, or as a computer monitor. Introduced in the late 1920s in mechanical form, television sets became a popular consumer product after World War II in electronic form, using cathode ray tube (CRT) technology. The addition of color to broadcast television after 1953 further increased the popularity of television sets in the 1960s, and an outdoor antenna became a common feature of suburban homes. The ubiquitous television set became the display device for the first recorded media for consumer use in the 1970s, such as Betamax and VHS; these were later succeeded by DVD. It has been used as a display device since the first generation of home computers (e.g. Timex Sinclair 1000) and dedicated video game consoles (e.g. Atari) in the 1980s. By the early 2010s, flat-panel television incorporating liquid-crystal display (LCD) technology, especially LED-backlit LCD technology, largely replaced CRT and other display technologies. Modern flat panel TVs are typically capable of high-definition display (720p, 1080i, 1080p, 4K, 8K) and can also play content from a USB device. Starting in the late 2010s, most flat panel TVs began to offer 4K and 8K resolutions.

History

Early television

RCA 630-TS, the first mass-produced electronic television set, which sold in 1946–1947

Mechanical televisions were commercially sold from 1928 to 1934 in the United Kingdom, France, the United States, and the Soviet Union. The earliest commercially made televisions were radios with the addition of a television device consisting of a neon tube behind a mechanically spinning disk with a spiral of apertures that produced a red postage-stamp size image, enlarged to twice that size by a magnifying glass. The Baird "Televisor" (sold in 1930–1933 in the UK) is considered the first mass-produced television, selling about a thousand units.

In 1926, Kenjiro Takayanagi demonstrated the first TV system that employed a cathode ray tube (CRT) display, at Hamamatsu Industrial High School in Japan. This was the first working example of a fully electronic television receiver. His research toward creating a production model was halted by the US after Japan lost World War II.

The first commercially made electronic televisions with cathode ray tubes were manufactured by Telefunken in Germany in 1934, followed by other makers in France (1936), Britain (1936), and the US (1938). The cheapest model with a 12-inch (30 cm) screen was $445 (equivalent to $9,251 in 2022). An estimated 19,000 electronic televisions were manufactured in Britain, and about 1,600 in Germany, before World War II. About 7,000–8,000 electronic sets were made in the U.S. before the War Production Board halted manufacture in April 1942; production resumed in August 1945. Television usage in the western world skyrocketed after World War II with the lifting of the manufacturing freeze, war-related technological advances, the drop in television prices caused by mass production, increased leisure time, and additional disposable income. While only 0.5% of U.S. households had a television in 1946, 55.7% had one in 1954, and 90% by 1962. In Britain, there were 15,000 television households in 1947, 1.4 million in 1952, and 15.1 million by 1968.

Transistorized television

Portable boombox television by Sharp Corporation

Early electronic television sets were large and bulky, with analog circuits made of vacuum tubes. As an example, the RCA CT-100 color TV set used 36 vacuum tubes. Following the invention of the first working transistor at Bell Labs, Sony founder Masaru Ibuka predicted in 1952 that the transition to electronic circuits made of transistors would lead to smaller and more portable television sets. The first fully transistorized, portable solid-state television set was the 8-inch Sony TV8-301, developed in 1959 and released in 1960. By the 1970s, television manufacturers had used this push for miniaturization to create small, console-styled sets which their salesmen could easily transport, spreading demand for television sets into rural areas. The first fully transistorized color TV set, the HMV Colourmaster Model 2700, was released in 1967 by the British Radio Corporation. This began the transformation of television viewership from a communal viewing experience to a solitary viewing experience. By 1960, Sony had sold over 4 million portable television sets worldwide.

The MOSFET (metal–oxide–semiconductor field-effect transistor, or MOS transistor) was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959, and presented in 1960. RCA Laboratories researchers W.M. Austin, J.A. Dean, D.M. Griswold and O.P. Hart in 1966 proposed the use of the MOSFET in television circuits, including RF amplifier, low-level video, chroma and AGC circuits. The MOSFET was later widely adopted for most television circuits.

By the late 1960s and early 1970s, color television had come into wide use. In Britain, BBC1, BBC2 and ITV were regularly broadcasting in colour by 1969.

Portable boombox televisions have existed since at least the early 1980s.

LCD television

Samsung widescreen LCD television and DVD player

Building on the work of Mohamed M. Atalla and Dawon Kahng on the MOSFET, Paul K. Weimer at RCA developed the thin-film transistor (TFT) in 1962. It was a type of MOSFET distinct from the standard bulk MOSFET. The idea of a TFT-based liquid-crystal display (LCD) was conceived by Bernard Lechner of RCA Laboratories in 1968. Lechner, F. J. Marlowe, E. O. Nester and J. Tults demonstrated the concept in 1968 with a dynamic scattering LCD that used standard discrete MOSFETs.

In 1973, T. Peter Brody, J. A. Asars and G. D. Dixon at Westinghouse Research Laboratories demonstrated the first thin-film-transistor liquid-crystal display (TFT LCD). Brody and Fang-Chen Luo demonstrated the first flat active-matrix liquid-crystal display (AM LCD) in 1974.

By 1982, pocket LCD TVs based on AM LCD technology were developed in Japan. The 2.1-inch Epson ET-10 (Epson Elf) was the first color LCD pocket TV, released in 1984. In 1988, a Sharp research team led by engineer T. Nagayasu demonstrated a 14-inch full-color LCD display, which convinced the electronics industry that LCD would eventually replace cathode-ray tube (CRT) as the standard television display technology. The first wall-mountable TV was introduced by Sharp Corporation in 1992.

During the first decade of the 21st century, CRT "picture tube" display technology was almost entirely supplanted worldwide by flat-panel displays: first plasma displays around 1997, then LCDs. By the early 2010s, LCD TVs, which increasingly used LED-backlit LCDs, accounted for the overwhelming majority of television sets being manufactured.

TV sizes

Cambridge's Clive Sinclair created a mini TV in 1967 which could be held in the palm of a hand and was the world's smallest television at the time, though it never took off commercially because the design was complex. In 2019, Samsung launched the largest television to date at 292 inches, which is around 24 feet. The average size of TVs has grown over time.

Display

Television sets may employ one of several available display technologies. As of mid-2019, LCDs overwhelmingly predominate in new merchandise, but OLED displays are claiming an increasing market share as they become more affordable and DLP technology continues to offer some advantages in projection systems. The production of plasma and CRT displays has been completely discontinued.

There are four primary competing TV technologies:

  • CRT
  • LCD (multiple variations of LCD screens are called QLED, quantum dot, LED, LCD TN, LCD IPS, LCD PLS, LCD VA, etc.)
  • OLED
  • Plasma

CRT

A 14-inch cathode ray tube showing its deflection coils and electron guns

The cathode ray tube (CRT) is a vacuum tube containing a so-called electron gun (or three for a color television) and a fluorescent screen where the television image is displayed. The electron gun accelerates electrons in a beam which is deflected in both the vertical and horizontal directions using varying electric or (usually, in television sets) magnetic fields, in order to scan a raster image onto the fluorescent screen. The CRT requires an evacuated glass envelope, which is rather deep (well over half of the screen size), fairly heavy, and breakable. As a matter of radiation safety, both the face (panel) and back (funnel) were made of thick lead glass in order to reduce human exposure to harmful ionizing radiation (in the form of x-rays) produced when electrons accelerated by a high voltage (10–30 kV) strike the screen. By the early 1970s, most color TVs replaced leaded glass in the face panel with vitrified strontium oxide glass, which also blocked x-ray emissions but allowed better color visibility. This also eliminated the need for cadmium phosphors in earlier color televisions. Leaded glass, which is less expensive, continued to be used in the funnel glass, which is not visible to the consumer.

In television sets (and most computer monitors that used CRTs), the entire screen area is scanned repetitively (completing a full frame 25 or 30 times a second) in a fixed pattern called a raster. The image information is received in real-time from a video signal which controls the electrical current supplying the electron gun, or in color television each of the three electron guns whose beams land on phosphors of the three primary colors (red, green, and blue). Except in the very early days of television, magnetic deflection has been used to scan the image onto the face of the CRT; this involves a varying current applied to both the vertical and horizontal deflection coils placed around the neck of the tube just beyond the electron gun(s).

DLP

The Christie Mirage 5000, a 2001 DLP projector.

Digital Light Processing (DLP) is a type of video projector technology that uses a digital micromirror device. Some DLPs have a TV tuner, which makes them a type of TV display. It was originally developed in 1987 by Larry Hornbeck of Texas Instruments. While the DLP imaging device was invented by Texas Instruments, the first DLP-based projector was introduced by Digital Projection Ltd in 1997. Digital Projection and Texas Instruments were both awarded Emmy Awards in 1998 for the DLP projector technology. DLP is used in a variety of display applications, from traditional static displays to interactive displays, and also in non-traditional embedded applications including medical, security, and industrial uses.

DLP technology is used in DLP front projectors (standalone projection units, primarily for classrooms and business), DLP rear-projection television sets, and digital signs. It is also used in about 85% of digital cinema projection, and in additive manufacturing as a light source in some SLA 3D printers to cure resins into solid 3D objects.

Rear projection

Rear-projection televisions (RPTVs) became very popular in the early days of television, when it was not yet practical to produce picture tubes with a large display size. In 1936, nine inches would have been regarded as the largest convenient size for a tube mounted horizontally in the television cabinet, owing to its required length: the low deflection angles of CRTs produced in the era meant that CRTs with large front sizes would also have needed to be very deep, so such CRTs were installed at an angle to reduce the cabinet depth of the TV set. Twelve-inch tubes and TV sets were available, but the tubes were so long (deep) that they were mounted vertically and viewed via a mirror in the top of the TV set cabinet, usually under a hinged lid; this reduced the depth of the set considerably but made it taller. These mirror-lid televisions were large pieces of furniture.

As a solution, Philips introduced a television set in 1937 that relied on back-projecting an image from a 4½-inch tube onto a 25-inch screen. This required the tube to be driven very hard (at unusually high voltages and currents, see Cathode-ray tube § Projection CRTs) to produce an extremely bright image on its fluorescent screen. Further, Philips decided to use a green phosphor on the tube face, as it was brighter than the white phosphors of the day. In fact these early tubes were not up to the job, and by November of that year Philips decided that it was cheaper to buy the sets back than to provide replacement tubes under warranty every couple of weeks or so. Substantial improvements were very quickly made to these small tubes, and a more satisfactory tube design was available the following year, helped by Philips's decision to use a smaller screen size of 23 inches. In 1950 a more efficient 2½-inch tube with vastly improved technology and a more efficient white phosphor, along with smaller and less demanding screen sizes, was able to provide an acceptable image, though the life of these tubes was still shorter than that of contemporary direct-view tubes. As cathode ray tube technology improved during the 1950s, producing larger and larger screen sizes and, later on, (more or less) rectangular tubes, the rear-projection system was obsolete before the end of the decade.

However, in the early to mid 2000s RPTV systems made a comeback as a cheaper alternative to contemporary LCD and plasma TVs. They were larger and lighter than contemporary CRT TVs and had a flat screen like LCD and plasma sets, but unlike LCD and plasma, RPTVs were often dimmer, had lower contrast ratios and narrower viewing angles, and their image quality was affected by room lighting and suffered in comparison with direct-view CRTs; they were also still bulky like CRTs. These TVs worked by having a DLP, LCoS or LCD projector at the bottom of the unit and using a mirror to project the image onto a screen. The screen may be a Fresnel lens to increase brightness at the cost of viewing angles. Some early units used CRT projectors and were heavy, weighing up to 500 pounds. Most RPTVs used ultra-high-performance (UHP) lamps as their light source, which required periodic replacement, partly because they dimmed with use but mainly because the bulb glass became weaker with ageing, to the point where the bulb could eventually shatter, often damaging the projection system. Sets that used CRTs or lasers did not require such replacement.

Plasma

A plasma display panel (PDP) is a type of flat panel display common in large TV displays 30 inches (76 cm) or larger. They are called "plasma" displays because the technology utilizes small cells containing electrically charged ionized gases; in essence, these are chambers similar to those in fluorescent lamps. Around 2014, television manufacturers were largely phasing out plasma TVs, because plasma TVs had become more expensive and more difficult to make in 4K resolution compared with LED-backlit LCDs.

LCD

A generic LCD TV, with speakers on either side of the screen.

Liquid-crystal-display televisions (LCD TVs) are television sets that use liquid-crystal displays to produce images. LCD televisions are much thinner and lighter than cathode ray tubes (CRTs) of similar display size and are available in much larger sizes (e.g., 90-inch diagonal). When manufacturing costs fell, this combination of features made LCDs practical for television receivers.

In 2007, LCD televisions surpassed sales of CRT-based televisions globally for the first time, and their sales figures relative to other technologies accelerated. LCD TVs quickly displaced the only major competitors in the large-screen market, the plasma display panel and rear-projection television. In the mid-2010s LCDs became, by far, the most widely produced and sold television display type.

LCDs also have disadvantages. Other technologies address these weaknesses, including OLEDs, FED and SED.

OLED

An OLED TV.

An OLED (organic light-emitting diode) is a light-emitting diode (LED) in which the emissive electroluminescent layer is a film of organic compound which emits light in response to an electric current. This layer of organic semiconductor is situated between two electrodes. Generally, at least one of these electrodes is transparent. OLEDs are used to create digital displays in devices such as television screens. They are also used for computer monitors and portable systems such as mobile phones, handheld game consoles and PDAs.

There are two main families of OLED: those based on small molecules and those employing polymers. Adding mobile ions to an OLED creates a light-emitting electrochemical cell or LEC, which has a slightly different mode of operation. OLED displays can use either passive-matrix (PMOLED) or active-matrix addressing schemes. Active-matrix OLEDs (AMOLED) require a thin-film transistor backplane to switch each individual pixel on or off, but allow for higher resolution and larger display sizes.

An OLED display works without a backlight. Thus, it can display deep black levels and can be thinner and lighter than a liquid crystal display (LCD). In low ambient light conditions such as a dark room, an OLED screen can achieve a higher contrast ratio than an LCD, whether the LCD uses cold cathode fluorescent lamps or LED backlight.

Television types

While most televisions are designed for consumers in the household, there are several markets that demand variations including hospitality, healthcare, and other commercial settings.

Hospitality television

Televisions made for the hospitality industry are part of an establishment's internal television system designed to be used by its guests. Therefore, settings menus are hidden and locked by a password. Other common software features include volume limiting, customizable power-on splash image, and channel hiding. These TVs are typically controlled by a set-back box using one of the data ports on the rear of the TV. The set back box may offer channel lists, pay per view, video on demand, and casting from a smart phone or tablet.

Hospitality spaces are insecure with respect to content piracy, so many content providers require the use of Digital rights management. Hospitality TVs decrypt the industry standard Pro:Idiom when no set back box is used. While H.264 is not part of the ATSC 1.0 standard in North America, TV content in hospitality can include H.264 encoded video, so hospitality TVs include H.264 decoding. Managing dozens or hundreds of TVs can be time consuming, so hospitality TVs can be cloned by storing settings on a USB drive and restoring those settings quickly. Additionally, server-based and cloud-based management systems can monitor and configure an entire fleet of TVs.

Healthcare television

Healthcare televisions include the provisions of hospitality TVs with additional features for usability and safety. They are designed for use in a healthcare setting in which the user may have limited mobility and audio/visual impairment. A key feature is the pillow speaker connection. Pillow speakers combine nurse call functions, TV remote control and a speaker for audio. In multiple occupancy rooms where several TVs are used in close proximity, the televisions can be programmed to respond to a remote control with unique codes so that each remote only controls one TV. Smaller TVs, also called bedside infotainment systems, have a full function keypad below the screen. This allows direct interaction without the use of a pillow speaker or remote. These TVs typically have antimicrobial surfaces and can withstand daily cleaning using disinfectants. In the US, the UL safety standard for televisions, UL62636-1, contains a special section (annex DVB) which outlines additional safety requirements for televisions used in healthcare.

Outdoor television

Outdoor television sets are designed for outdoor use and are usually found in the outdoor sections of bars, sports fields, or other community facilities. Most outdoor televisions use high-definition television technology. Their bodies are more robust. The screens are designed to remain clearly visible even in sunny outdoor lighting, and they have anti-reflective coatings to prevent glare. They are weather-resistant and often also have anti-theft brackets. Outdoor TV models can also be connected with BD players and PVRs for greater functionality.

Replacing

46-inch LCD television set in a cardboard box 1.20 m high, 70 cm wide and 25 cm deep. Such packages are difficult to handle and expensive to send via commercial carriers, which makes selling used TVs cumbersome.

In the United States, the average consumer replaces their television every 6.9 years, but research suggests that due to advanced software and apps, the replacement cycle may be shortening.

Recycling and disposal

Due to recent changes in electronic waste legislation, economical and environmentally friendly television disposal has been made increasingly more available in the form of television recycling. Challenges with recycling television sets include proper HAZMAT disposal, landfill pollution, and illegal international trade.

Major manufacturers

Consumer Reports product testing, with LCD and plasma television sets

Global LCD TV market share statistics for 2016.

Rank | Manufacturer        | Market share (%) | Headquarters
1    | Samsung Electronics | 20.2             | Suwon, South Korea
2    | LG Electronics      | 12.1             | Seoul, South Korea
3    | TCL Technology      | 9.0              | Huizhou, China
4    | Hisense             | 6.1              | Qingdao, China
5    | Sony                | 5.6              | Tokyo, Japan
7    | Skyworth            | 3.8              | Shenzhen, China
8    | Vizio Inc.          | 3.7              | Irvine, United States
9    | Changhong           | 3.2              | Mianyang, China
10   | Haier               | 3.0              | Qingdao, China
11   | Others              | 27.2             |

Entropy (information theory)

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Entropy_(information_theory) In info...