
Sunday, July 14, 2019

Ergodic theory

From Wikipedia, the free encyclopedia
 Ergodic theory (Greek: έργον ergon "work", όδος hodos "way") is a branch of mathematics that studies dynamical systems with an invariant measure and related problems. Its initial development was motivated by problems of statistical physics.

A central concern of ergodic theory is the behavior of a dynamical system when it is allowed to run for a long time. The first result in this direction is the Poincaré recurrence theorem, which claims that almost all points in any subset of the phase space eventually revisit the set. More precise information is provided by various ergodic theorems which assert that, under certain conditions, the time average of a function along the trajectories exists almost everywhere and is related to the space average. Two of the most important theorems are those of Birkhoff (1931) and von Neumann which assert the existence of a time average along each trajectory. For the special class of ergodic systems, this time average is the same for almost all initial points: statistically speaking, the system that evolves for a long time "forgets" its initial state. Stronger properties, such as mixing and equidistribution, have also been extensively studied.

The problem of metric classification of systems is another important part of the abstract ergodic theory. An outstanding role in ergodic theory and its applications to stochastic processes is played by the various notions of entropy for dynamical systems.

The concepts of ergodicity and the ergodic hypothesis are central to applications of ergodic theory. The underlying idea is that for certain systems the time average of their properties is equal to the average over the entire space. Applications of ergodic theory to other parts of mathematics usually involve establishing ergodicity properties for systems of special kind. In geometry, methods of ergodic theory have been used to study the geodesic flow on Riemannian manifolds, starting with the results of Eberhard Hopf for Riemann surfaces of negative curvature. Markov chains form a common context for applications in probability theory. Ergodic theory has fruitful connections with harmonic analysis, Lie theory (representation theory, lattices in algebraic groups), and number theory (the theory of diophantine approximations, L-functions).

Ergodic transformations

Ergodic theory is often concerned with ergodic transformations. The intuition behind such transformations, which act on a given set, is that they do a thorough job "stirring" the elements of that set (e.g., if the set is a quantity of hot oatmeal in a bowl, and if a spoonful of syrup is dropped into the bowl, then iterations of the inverse of an ergodic transformation of the oatmeal will not allow the syrup to remain in a local subregion of the oatmeal, but will distribute the syrup evenly throughout. At the same time, these iterations will not compress or dilate any portion of the oatmeal: they preserve the measure that is density.) Here is the formal definition. 

Let T : X → X be a measure-preserving transformation on a measure space (X, Σ, μ), with μ(X) = 1. Then T is ergodic if for every E in Σ with T⁻¹(E) = E, either μ(E) = 0 or μ(E) = 1.

Examples

Evolution of an ensemble of classical systems in phase space (top). The systems are massive particles in a one-dimensional potential well (red curve, lower figure). The initially compact ensemble becomes swirled up over time and "spread around" phase space. This is however not ergodic behaviour since the systems do not visit the left-hand potential well.
  • An irrational rotation of the circle R/Z, T: x → x + θ, where θ is irrational, is ergodic. This transformation has even stronger properties of unique ergodicity, minimality, and equidistribution (a numerical sketch of this equidistribution appears after this list). By contrast, if θ = p/q is rational (in lowest terms) then T is periodic, with period q, and thus cannot be ergodic: for any interval I of length a, 0 < a < 1/q, its orbit under T (that is, the union of I, T(I), ..., T^(q−1)(I), which contains the image of I under any number of applications of T) is a T-invariant mod 0 set that is a union of q intervals of length a, hence it has measure qa strictly between 0 and 1.
  • Let G be a compact abelian group, μ the normalized Haar measure, and T a group automorphism of G. Let G* be the Pontryagin dual group, consisting of the continuous characters of G, and T* be the corresponding adjoint automorphism of G*. The automorphism T is ergodic if and only if the equality (T*)ⁿ(χ) = χ is possible only when n = 0 or χ is the trivial character of G. In particular, if G is the n-dimensional torus and the automorphism T is represented by a unimodular matrix A then T is ergodic if and only if no eigenvalue of A is a root of unity.
  • A Bernoulli shift is ergodic. More generally, ergodicity of the shift transformation associated with a sequence of i.i.d. random variables and some more general stationary processes follows from Kolmogorov's zero–one law.
  • Ergodicity of a continuous dynamical system means that its trajectories "spread around" the phase space. A system with a compact phase space which has a non-constant first integral cannot be ergodic. This applies, in particular, to Hamiltonian systems with a first integral I functionally independent from the Hamilton function H and a compact level set X = {(p,q): H(p,q) = E} of constant energy. Liouville's theorem implies the existence of a finite invariant measure on X, but the dynamics of the system is constrained to the level sets of I on X, hence the system possesses invariant sets of positive but less than full measure. A property of continuous dynamical systems that is the opposite of ergodicity is complete integrability.
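
As a hedged numerical illustration of the irrational-rotation example above (not part of the original article), the Python sketch below compares the fraction of time an orbit of x → x + θ (mod 1) spends in an interval with that interval's length; the specific angle θ = √2 − 1, the interval [0.2, 0.5), and the iteration count are arbitrary choices for illustration.

    import math

    def time_fraction_in_interval(theta, a, b, n_steps=200_000, x0=0.0):
        """Fraction of the first n_steps iterates of x -> x + theta (mod 1)
        that land in [a, b); for irrational theta this approaches b - a."""
        x, hits = x0, 0
        for _ in range(n_steps):
            if a <= x < b:
                hits += 1
            x = (x + theta) % 1.0
        return hits / n_steps

    theta = math.sqrt(2) - 1                              # irrational angle: orbit equidistributes
    print(time_fraction_in_interval(theta, 0.2, 0.5))     # ~0.3, the length of [0.2, 0.5)
    print(time_fraction_in_interval(0.25, 0.2, 0.5))      # rational angle: periodic orbit, gives 0.25 here

For the rational angle 1/4 the orbit starting at 0 visits only {0, 0.25, 0.5, 0.75}, so the time fraction in [0.2, 0.5) is 0.25 rather than the interval's length, matching the non-ergodic case described above.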

Ergodic theorems

Let T: X → X be a measure-preserving transformation on a measure space (X, Σ, μ) and suppose ƒ is a μ-integrable function, i.e. ƒ ∈ L¹(μ). Then we define the following averages:
Time average: This is defined as the average (if it exists) over iterations of T starting from some initial point x:
ƒ̂(x) = lim_{n→∞} (1/n) ∑_{k=0}^{n−1} ƒ(Tᵏx).
Space average: If μ(X) is finite and nonzero, we can consider the space or phase average of ƒ:
ƒ̄ = (1/μ(X)) ∫_X ƒ dμ.
In general the time average and space average may be different. But if the transformation is ergodic, and the measure is invariant, then the time average is equal to the space average almost everywhere. This is the celebrated ergodic theorem, in an abstract form due to George David Birkhoff. (Actually, Birkhoff's paper considers not the abstract general case but only the case of dynamical systems arising from differential equations on a smooth manifold.) The equidistribution theorem is a special case of the ergodic theorem, dealing specifically with the distribution of probabilities on the unit interval. 

More precisely, the pointwise or strong ergodic theorem states that the limit in the definition of the time average of ƒ exists for almost every x and that the (almost everywhere defined) limit function ƒ̂ is integrable:
ƒ̂ ∈ L¹(μ).
Furthermore, ƒ̂ is T-invariant, that is to say
ƒ̂ ∘ T = ƒ̂
holds almost everywhere, and if μ(X) is finite, then the normalization is the same:
∫ ƒ̂ dμ = ∫ ƒ dμ.
In particular, if T is ergodic, then ƒ̂ must be a constant (almost everywhere), and so one has that
ƒ̂ = (1/μ(X)) ∫ ƒ dμ
almost everywhere. Joining the first to the last claim and assuming that μ(X) is finite and nonzero, one has that
lim_{n→∞} (1/n) ∑_{k=0}^{n−1} ƒ(Tᵏx) = (1/μ(X)) ∫_X ƒ dμ
for almost all x, i.e., for all x except for a set of measure zero. 

For an ergodic transformation, the time average equals the space average almost surely. 

As an example, assume that the measure space (X, Σ, μ) models the particles of a gas as above, and let ƒ(x) denote the velocity of the particle at position x. Then the pointwise ergodic theorem says that the average velocity of all particles at some given time is equal to the average velocity of one particle over time. 
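
As a hedged numerical illustration (not part of the original article), the sketch below estimates a Birkhoff time average along an orbit of the logistic map T(x) = 4x(1 − x), which is known to preserve, and be ergodic for, the arcsine measure dμ = dx/(π√(x(1 − x))) on [0, 1]; for ƒ(x) = x the corresponding space average is 1/2. The starting point and iteration count are arbitrary illustrative choices, and floating-point iteration is only an approximation of the true dynamics.

    def logistic_time_average(x0=0.123, n_steps=1_000_000):
        """Birkhoff average of f(x) = x along an orbit of T(x) = 4x(1 - x)."""
        x, total = x0, 0.0
        for _ in range(n_steps):
            total += x                   # f(x) = x
            x = 4.0 * x * (1.0 - x)
        return total / n_steps

    # The space average of f(x) = x with respect to the invariant arcsine measure is 0.5;
    # by the pointwise ergodic theorem the time average should be close to that
    # for almost every starting point.
    print(logistic_time_average())       # ~0.5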

A generalization of Birkhoff's theorem is Kingman's subadditive ergodic theorem.

Probabilistic formulation: Birkhoff–Khinchin theorem

Birkhoff–Khinchin theorem. Let ƒ be measurable, E(|ƒ|) < ∞, and T be a measure-preserving map. Then with probability 1:
lim_{n→∞} (1/n) ∑_{k=0}^{n−1} ƒ(Tᵏx) = E(ƒ | 𝒞)(x),
where E(ƒ | 𝒞) is the conditional expectation given the σ-algebra 𝒞 of invariant sets of T.

Corollary (Pointwise Ergodic Theorem): In particular, if T is also ergodic, then 𝒞 is the trivial σ-algebra, and thus with probability 1:
lim_{n→∞} (1/n) ∑_{k=0}^{n−1} ƒ(Tᵏx) = E(ƒ).

Mean ergodic theorem

Von Neumann's mean ergodic theorem holds in Hilbert spaces.

Let U be a unitary operator on a Hilbert space H; more generally, an isometric linear operator (that is, a not necessarily surjective linear operator satisfying ‖Ux‖ = ‖x‖ for all x in H, or equivalently, satisfying U*U = I, but not necessarily UU* = I). Let P be the orthogonal projection onto {ψ ∈ H | Uψ = ψ} = ker(I − U). 

Then, for any x in H, we have:
lim_{N→∞} (1/N) ∑_{n=0}^{N−1} Uⁿx = Px,
where the limit is with respect to the norm on H. In other words, the sequence of averages
(1/N) ∑_{n=0}^{N−1} Uⁿ
converges to P in the strong operator topology.

Indeed, it is not difficult to see that in this case any x ∈ H admits an orthogonal decomposition into parts from ker(I − U) and the closure of ran(I − U) respectively. The former part is invariant under all the partial sums as N grows, while for the latter part, applied to a vector of the form y − Uy, the telescoping sum gives:
(1/N) ∑_{n=0}^{N−1} Uⁿ(y − Uy) = (1/N)(y − Uᴺy), which tends to 0 in norm as N → ∞.
This theorem specializes to the case in which the Hilbert space H consists of L² functions on a measure space and U is an operator of the form
(Uƒ)(x) = ƒ(Tx),
where T is a measure-preserving endomorphism of X, thought of in applications as representing a time-step of a discrete dynamical system. The ergodic theorem then asserts that the average behavior of a function ƒ over sufficiently large time-scales is approximated by the orthogonal component of ƒ which is time-invariant. 

In another form of the mean ergodic theorem, let Ut be a strongly continuous one-parameter group of unitary operators on H. Then the operator
(1/T) ∫_0^T Uₜ dt
converges in the strong operator topology as T → ∞. In fact, this result also extends to the case of strongly continuous one-parameter semigroup of contractive operators on a reflexive space.

Remark: Some intuition for the mean ergodic theorem can be developed by considering the case where complex numbers of unit length are regarded as unitary transformations on the complex plane (by left multiplication). If we pick a single complex number of unit length (which we think of as U), it is intuitive that its powers will fill up the circle. Since the circle is symmetric around 0, it makes sense that the averages of the powers of U will converge to 0. Also, 0 is the only fixed point of U, and so the projection onto the space of fixed points must be the zero operator (which agrees with the limit just described).
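
A hedged numerical sketch of this remark (not part of the original article): treating a unit complex number u as a unitary operator on the one-dimensional Hilbert space C, the Cesàro averages of its powers should tend to 0 whenever u ≠ 1 (no nonzero fixed vectors) and to 1 when u = 1. The particular angle 0.7 and the number of terms are arbitrary.

    import cmath

    def cesaro_average_of_powers(u, n_terms=100_000):
        """(1/N) * sum of u**n for n = 0..N-1, for a unit complex number u."""
        total, power = 0.0 + 0.0j, 1.0 + 0.0j
        for _ in range(n_terms):
            total += power
            power *= u
        return total / n_terms

    u = cmath.exp(1j * 0.7)                      # a unit complex number other than 1
    print(abs(cesaro_average_of_powers(u)))      # ~0: the projection onto fixed vectors is 0
    print(cesaro_average_of_powers(1.0 + 0j))    # (1+0j): every vector is fixed, so P is the identity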

Convergence of the ergodic means in the Lp norms

Let (X, Σ, μ) be as above a probability space with a measure-preserving transformation T, and let 1 ≤ p ≤ ∞. The conditional expectation with respect to the sub-σ-algebra ΣT of the T-invariant sets is a linear projector ET of norm 1 of the Banach space Lp(X, Σ, μ) onto its closed subspace Lp(X, ΣT, μ). The latter may also be characterized as the space of all T-invariant Lp-functions on X. The ergodic means, as linear operators on Lp(X, Σ, μ), also have unit operator norm and, as a simple consequence of the Birkhoff–Khinchin theorem, converge to the projector ET in the strong operator topology of Lp if 1 ≤ p < ∞, and in the weak operator topology if p = ∞. More is true: if 1 < p ≤ ∞, then the Wiener–Yoshida–Kakutani ergodic dominated convergence theorem states that the ergodic means of ƒ ∈ Lp are dominated in Lp; however, if ƒ ∈ L1, the ergodic means may fail to be equidominated in L1. Finally, if ƒ is assumed to be in the Zygmund class, that is, |ƒ| log⁺(|ƒ|) is integrable, then the ergodic means are even dominated in L1.

Sojourn time

Let (X, Σ, μ) be a measure space such that μ(X) is finite and nonzero. The time spent in a measurable set A is called the sojourn time. An immediate consequence of the ergodic theorem is that, in an ergodic system, the relative measure of A is equal to the mean sojourn time:
μ(A)/μ(X) = lim_{n→∞} (1/n) ∑_{k=0}^{n−1} χ_A(Tᵏx)
for all x except for a set of measure zero, where χ_A is the indicator function of A.

The occurrence times of a measurable set A is defined as the set k1, k2, k3, ..., of times k such that Tᵏ(x) is in A, sorted in increasing order. The differences between consecutive occurrence times Ri = ki − ki−1 are called the recurrence times of A. Another consequence of the ergodic theorem is that the average recurrence time of A is inversely proportional to the measure of A, assuming that the initial point x is in A, so that k0 = 0:
(R1 + ⋯ + Rn)/n → μ(X)/μ(A)   (almost surely).
That is, the smaller A is, the longer it takes to return to it.
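
As a hedged numerical check (not part of the original article), the sketch below estimates the mean recurrence time to the interval A = [0, 0.1) under the irrational rotation x → x + θ (mod 1) with Lebesgue measure (so μ(X) = 1, μ(A) = 0.1, and the expected mean recurrence time is about 10). The angle, the interval, and the iteration count are arbitrary illustrative choices.

    import math

    def mean_recurrence_time(theta, a, b, n_steps=500_000):
        """Average gap between successive visits of the orbit of x -> x + theta (mod 1)
        to the interval [a, b), starting from a point inside it (so k0 = 0)."""
        x = a                          # start inside A
        last_visit, gaps = 0, []
        for k in range(1, n_steps + 1):
            x = (x + theta) % 1.0
            if a <= x < b:
                gaps.append(k - last_visit)
                last_visit = k
        return sum(gaps) / len(gaps)

    theta = math.sqrt(2) - 1
    print(mean_recurrence_time(theta, 0.0, 0.1))   # ~10 = 1/mu(A) for mu(A) = 0.1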

Ergodic flows on manifolds

The ergodicity of the geodesic flow on compact Riemann surfaces of variable negative curvature and on compact manifolds of constant negative curvature of any dimension was proved by Eberhard Hopf in 1939, although special cases had been studied earlier: see, for example, Hadamard's billiards (1898) and Artin's billiard (1924). The relation between geodesic flows on Riemann surfaces and one-parameter subgroups on SL(2, R) was described in 1952 by S. V. Fomin and I. M. Gelfand. The article on Anosov flows provides an example of ergodic flows on SL(2, R) and on Riemann surfaces of negative curvature. Much of the development described there generalizes to hyperbolic manifolds, since they can be viewed as quotients of the hyperbolic space by the action of a lattice in the semisimple Lie group SO(n,1). Ergodicity of the geodesic flow on Riemannian symmetric spaces was demonstrated by F. I. Mautner in 1957. In 1967 D. V. Anosov and Ya. G. Sinai proved ergodicity of the geodesic flow on compact manifolds of variable negative sectional curvature. A simple criterion for the ergodicity of a homogeneous flow on a homogeneous space of a semisimple Lie group was given by Calvin C. Moore in 1966. Many of the theorems and results from this area of study are typical of rigidity theory.

In the 1930s G. A. Hedlund proved that the horocycle flow on a compact hyperbolic surface is minimal and ergodic. Unique ergodicity of the flow was established by Hillel Furstenberg in 1972. Ratner's theorems provide a major generalization of ergodicity for unipotent flows on the homogeneous spaces of the form Γ \ G, where G is a Lie group and Γ is a lattice in G.

In the last 20 years, there have been many works trying to find a measure-classification theorem similar to Ratner's theorems but for diagonalizable actions, motivated by conjectures of Furstenberg and Margulis. An important partial result (solving those conjectures with an extra assumption of positive entropy) was proved by Elon Lindenstrauss, and he was awarded the Fields medal in 2010 for this result.

Heat death of the universe

From Wikipedia, the free encyclopedia

The heat death of the universe, also known as the Big Chill or Big Freeze, is a conjecture on the ultimate fate of the universe, which suggests the universe would evolve to a state of no thermodynamic free energy and would therefore be unable to sustain processes that increase entropy. Heat death does not imply any particular absolute temperature; it only requires that temperature differences or other processes may no longer be exploited to perform work. In the language of physics, this is when the universe reaches thermodynamic equilibrium (maximum entropy).

If the topology of the universe is open or flat, or if dark energy is a positive cosmological constant (both of which are consistent with current data), the universe will continue expanding forever, and a heat death is expected to occur, with the universe cooling to approach equilibrium at a very low temperature after a very long time period.

The hypothesis of heat death stems from the ideas of William Thomson, 1st Baron Kelvin (Lord Kelvin), who in the 1850s took the theory of heat as mechanical energy loss in nature (as embodied in the first two laws of thermodynamics) and extrapolated it to larger processes on a universal scale.

Origins of the idea

The idea of heat death stems from the second law of thermodynamics, of which one version states that entropy tends to increase in an isolated system. From this, the hypothesis implies that if the universe lasts for a sufficient time, it will asymptotically approach a state where all energy is evenly distributed. In other words, according to this hypothesis, there is a tendency in nature to the dissipation (energy transformation) of mechanical energy (motion) into thermal energy; hence, by extrapolation, there exists the view that, in time, the mechanical movement of the universe will run down as work is converted to heat because of the second law. 

The conjecture that all bodies in the universe cool off, eventually becoming too cold to support life, seems to have been first put forward by the French astronomer Jean Sylvain Bailly in 1777 in his writings on the history of astronomy and in the ensuing correspondence with Voltaire. In Bailly's view, all planets have an internal heat and are now at some particular stage of cooling. Jupiter, for instance, is still too hot for life to arise there for thousands of years, while the Moon is already too cold. The final state, in this view, is described as one of "equilibrium" in which all motion ceases.

The idea of heat death as a consequence of the laws of thermodynamics, however, was first proposed in loose terms beginning in 1851 by William Thomson, who theorized further on the mechanical energy loss views of Sadi Carnot (1824), James Joule (1843), and Rudolf Clausius (1850). Thomson's views were then elaborated on more definitively over the next decade by Hermann von Helmholtz and William Rankine.

History

The idea of heat death of the universe derives from discussion of the application of the first two laws of thermodynamics to universal processes. Specifically, in 1851, William Thomson outlined the view, as based on recent experiments on the dynamical theory of heat: "heat is not a substance, but a dynamical form of mechanical effect, we perceive that there must be an equivalence between mechanical work and heat, as between cause and effect."

Lord Kelvin originated the idea of universal heat death in 1852.
 
In 1852, Thomson published On a Universal Tendency in Nature to the Dissipation of Mechanical Energy, in which he outlined the rudiments of the second law of thermodynamics summarized by the view that mechanical motion and the energy used to create that motion will naturally tend to dissipate or run down. The ideas in this paper, in relation to their application to the age of the Sun and the dynamics of the universal operation, attracted the likes of William Rankine and Hermann von Helmholtz. The three of them were said to have exchanged ideas on this subject. In 1862, Thomson published "On the age of the Sun’s heat", an article in which he reiterated his fundamental beliefs in the indestructibility of energy (the first law) and the universal dissipation of energy (the second law), leading to diffusion of heat, cessation of useful motion (work), and exhaustion of potential energy through the material universe, while clarifying his view of the consequences for the universe as a whole. In a key paragraph, Thomson wrote:
The result would inevitably be a state of universal rest and death, if the universe were finite and left to obey existing laws. But it is impossible to conceive a limit to the extent of matter in the universe; and therefore science points rather to an endless progress, through an endless space, of action involving the transformation of potential energy into palpable motion and hence into heat, than to a single finite mechanism, running down like a clock, and stopping for ever.
In the years following Thomson's 1852 and 1862 papers, Helmholtz and Rankine both credited Thomson with the idea, but read further into his papers by publishing views stating that Thomson argued that the universe will end in a "heat death" (Helmholtz) which will be the "end of all physical phenomena" (Rankine).

Current status

Proposals about the final state of the universe depend on the assumptions made about its ultimate fate, and these assumptions have varied considerably over the late 20th century and early 21st century. In a hypothesized "open" or "flat" universe that continues expanding indefinitely, either a heat death or a Big Rip is expected to eventually occur. If the cosmological constant is zero, the universe will approach absolute zero temperature over a very long timescale. However, if the cosmological constant is positive, as appears to be the case in recent observations, the temperature will asymptote to a non-zero positive value, and the universe will approach a state of maximum entropy in which no further work is possible.

If a Big Rip does not happen long before that, the "heat death" situation could be avoided if there is a method or mechanism to regenerate hydrogen atoms from radiation, dark matter, dark energy, zero-point energy, or other sources so that star formation and heat transfer can continue to avoid a gradual running down of the universe due to the conversion of matter into energy and heavier elements in stellar processes and the absorption of matter by black holes and their subsequent evaporation as Hawking radiation.

Time frame for heat death

From the Big Bang through the present day, matter and dark matter in the universe are thought to have been concentrated in stars, galaxies, and galaxy clusters, and are presumed to continue to be so well into the future. Therefore, the universe is not in thermodynamic equilibrium, and objects can do physical work. The decay time for a supermassive black hole of roughly 1 galaxy mass (10^11 solar masses) due to Hawking radiation is on the order of 10^100 years, so entropy can be produced until at least that time. Some monster black holes in the universe are predicted to continue to grow up to perhaps 10^14 solar masses during the collapse of superclusters of galaxies. Even these would evaporate over a timescale of up to 10^106 years. After that time, the universe enters the so-called Dark Era and is expected to consist chiefly of a dilute gas of photons and leptons. With only very diffuse matter remaining, activity in the universe will have tailed off dramatically, with extremely low energy levels and extremely long timescales. Speculatively, it is possible that the universe may enter a second inflationary epoch, or, assuming that the current vacuum state is a false vacuum, the vacuum may decay into a lower-energy state. It is also possible that entropy production will cease and the universe will reach heat death. Another universe could possibly be created by random quantum fluctuations or quantum tunneling over a vastly longer timescale. Over vast periods of time, a spontaneous entropy decrease would eventually occur via the Poincaré recurrence theorem, thermal fluctuations, and the fluctuation theorem. Such a scenario, however, has been described as "highly speculative, probably wrong, [and] completely untestable". Sean M. Carroll, originally an advocate of this idea, no longer supports it.
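
As a rough, hedged order-of-magnitude check (not part of the original article) on the 10^100-year figure above, one can insert a 10^11-solar-mass black hole into the standard Hawking evaporation estimate t ≈ 5120·π·G²·M³/(ħ·c⁴); the constants and formula are the textbook values, and the result is only meant to reproduce the order of magnitude quoted in the text.

    import math

    # Physical constants (SI units)
    G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    hbar  = 1.055e-34    # reduced Planck constant, J s
    c     = 2.998e8      # speed of light, m/s
    M_sun = 1.989e30     # solar mass, kg
    year  = 3.156e7      # seconds per year

    def hawking_evaporation_time_years(mass_kg):
        """Order-of-magnitude Hawking evaporation time, t ~ 5120*pi*G^2*M^3/(hbar*c^4)."""
        t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)
        return t_seconds / year

    print(f"{hawking_evaporation_time_years(1e11 * M_sun):.1e}")   # ~2e+100 years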

Controversies

Max Planck wrote that the phrase "entropy of the universe" has no meaning because it admits of no accurate definition. More recently, Grandy writes: "It is rather presumptuous to speak of the entropy of a universe about which we still understand so little, and we wonder how one might define thermodynamic entropy for a universe and its major constituents that have never been in equilibrium in their entire existence." According to Tisza: "If an isolated system is not in equilibrium, we cannot associate an entropy with it." Buchdahl writes of "the entirely unjustifiable assumption that the universe can be treated as a closed thermodynamic system". According to Gallavotti: "... there is no universally accepted notion of entropy for systems out of equilibrium, even when in a stationary state." Discussing the question of entropy for non-equilibrium states in general, Lieb and Yngvason express their opinion as follows: "Despite the fact that most physicists believe in such a nonequilibrium entropy, it has so far proved impossible to define it in a clearly satisfactory way." In Landsberg's opinion: "The third misconception is that thermodynamics, and in particular, the concept of entropy, can without further enquiry be applied to the whole universe. ... These questions have a certain fascination, but the answers are speculations, and lie beyond the scope of this book."

A recent analysis of entropy states, "The entropy of a general gravitational field is still not known", and, "gravitational entropy is difficult to quantify". The analysis considers several possible assumptions that would be needed for estimates and suggests that the observable universe has more entropy than previously thought. This is because the analysis concludes that supermassive black holes are the largest contributor. Lee Smolin goes further: "It has long been known that gravity is important for keeping the universe out of thermal equilibrium. Gravitationally bound systems have negative specific heat—that is, the velocities of their components increase when energy is removed. ... Such a system does not evolve toward a homogeneous equilibrium state. Instead it becomes increasingly structured and heterogeneous as it fragments into subsystems."

Saturday, July 13, 2019

Severe weather

From Wikipedia, the free encyclopedia

Various forms of severe weather

Severe weather refers to any dangerous meteorological phenomena with the potential to cause damage, serious social disruption, or loss of human life. Types of severe weather phenomena vary, depending on the latitude, altitude, topography, and atmospheric conditions. High winds, hail, excessive precipitation, and wildfires are forms and effects of severe weather, as are thunderstorms, downbursts, tornadoes, waterspouts, tropical cyclones, and extratropical cyclones. Regional and seasonal severe weather phenomena include blizzards (snowstorms), ice storms, and duststorms.

Terminology

Meteorologists generally define severe weather as any aspect of the weather that poses risks to life or property, or that requires the intervention of authorities. A narrower definition of severe weather is any weather phenomenon relating to severe thunderstorms.

According to the World Meteorological Organization (WMO), severe weather can be categorized into two groups: general severe weather and localized severe weather. Nor'easters, European wind storms, and the phenomena that accompany them form over wide geographic areas. These occurrences are classified as general severe weather. Downbursts and tornadoes are more localized and therefore have a more limited geographic effect. These forms of weather are classified as localized severe weather. The term severe weather is technically not the same phenomenon as extreme weather. Extreme weather describes unusual weather events that are at the extremes of the historical distribution for a given area.

Causes

This graphic shows the conditions favorable for certain organized thunderstorm complexes, based upon CAPE and vertical wind shear values.
 
Organized severe weather occurs from the same conditions that generate ordinary thunderstorms: atmospheric moisture, lift (often from thermals), and instability. A wide variety of conditions cause severe weather. Several factors can convert thunderstorms into severe weather. For example, a pool of cold air aloft may aid in the development of large hail from an otherwise innocuous appearing thunderstorm. However, the most severe hail and tornadoes are produced by supercell thunderstorms, and the worst downbursts and derechos (straight-line winds) are produced by bow echoes. Both of these types of storms tend to form in environments high in wind shear.

Floods, hurricanes, tornadoes, and thunderstorms are considered to be the most destructive weather-related hazards; researchers develop models to predict their most frequent and possible locations. This information is used to notify affected areas and save lives.

Categories

Diagram showing ingredients needed for severe weather. The red arrow shows the position of the low level jet stream, while the blue arrow shows the location of the upper level jet stream
 
Severe thunderstorms can be assessed in three different categories. These are "approaching severe", "severe", and "significantly severe". 

Approaching severe is defined as hail between 1/2 and 1 inch (13 to 25 mm) in diameter or winds between 50 and 58 mph (80–93 km/h). In the United States, such storms will usually warrant a Significant Weather Alert.

Severe is defined as hail 1 to 2 inches (25 to 51 mm) diameter, winds 58 to 75 miles per hour (93 to 121 km/h), or an F1 tornado.

Significant severe is defined as hail 2 inches (51 mm) in diameter or larger, winds 75 M.P.H. (65 knots, 120 km/h) or more, or a tornado of strength EF2 or stronger. 
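
The three thresholds above lend themselves to a small classifier. The sketch below is only an illustration of the published thresholds quoted in this section, not an official National Weather Service tool; hail size is in inches, wind speed in mph, and tornado strength on the (E)F scale.

    def classify_severe_thunderstorm(hail_in=0.0, wind_mph=0.0, tornado_ef=None):
        """Rough severity category per the thresholds quoted above (illustrative only)."""
        if hail_in >= 2 or wind_mph >= 75 or (tornado_ef is not None and tornado_ef >= 2):
            return "significantly severe"
        if hail_in >= 1 or wind_mph >= 58 or (tornado_ef is not None and tornado_ef >= 1):
            return "severe"
        if hail_in >= 0.5 or wind_mph >= 50:
            return "approaching severe"
        return "sub-severe"

    print(classify_severe_thunderstorm(hail_in=1.25))               # severe
    print(classify_severe_thunderstorm(wind_mph=80))                # significantly severe
    print(classify_severe_thunderstorm(hail_in=0.75, wind_mph=40))  # approaching severe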

Both severe and significant severe events warrant a severe thunderstorm warning from the United States National Weather Service (excluding flash floods), Environment Canada, the Australian Bureau of Meteorology, or the Meteorological Service of New Zealand if the event occurs in those countries. If a tornado is occurring (a tornado has been seen by spotters) or is imminent (Doppler weather radar has observed strong rotation in a storm, indicating an incipient tornado), the severe thunderstorm warning will be superseded by a tornado warning in the United States and Canada.

A severe weather outbreak is typically considered to occur when ten or more tornadoes, some of them likely long-tracked and violent, and many reports of large hail or damaging wind occur within one or more consecutive days. Severity is also dependent on the size of the geographic area affected, whether it covers hundreds or thousands of square kilometers.

High winds

Panorama of a strong shelf cloud, which can precede the onset of high winds
 
High winds are known to cause damage, depending upon their strength.

Wind speeds as low as 23 knots (43 km/h) may lead to power outages when tree branches fall and disrupt power lines. Some species of trees are more vulnerable to winds. Trees with shallow roots are more prone to uproot, and brittle trees such as eucalyptus, sea hibiscus, and avocado are more prone to branch damage.

Wind gusts may cause poorly designed suspension bridges to sway. When wind gusts harmonize with the frequency of the swaying bridge, the bridge may fail as occurred with the Tacoma Narrows Bridge in 1940.

Hurricane-force winds, caused by individual thunderstorms, thunderstorm complexes, derechos, tornadoes, extratropical cyclones, or tropical cyclones can destroy mobile homes and structurally damage buildings with foundations. Winds of this strength due to downslope winds off terrain have been known to shatter windows and sandblast paint from cars.

Once winds exceed 135 knots (250 km/h) within strong tropical cyclones and tornadoes, homes completely collapse, and significant damage is done to larger buildings. Total destruction of man-made structures occurs when winds reach 175 knots (324 km/h). The Saffir–Simpson scale for cyclones and the Enhanced Fujita scale (TORRO scale in Europe) for tornadoes were developed to help estimate wind speed from the damage they cause.

Tornado

The F5 tornado that struck Elie, Manitoba, Canada in 2007.
 
A tornado is a dangerous rotating column of air in contact with both the surface of the earth and the base of a cumulonimbus cloud (thundercloud) or, in rare cases, a cumulus cloud. Tornadoes come in many sizes but typically form a visible condensation funnel whose narrowest end reaches the earth and is surrounded by a cloud of debris and dust.

Tornadoes' wind speeds generally average between 40 miles per hour (64 km/h) and 110 miles per hour (180 km/h). They are approximately 250 feet (76 m) across and travel a few miles (kilometers) before dissipating. Some attain wind speeds in excess of 300 miles per hour (480 km/h), may stretch more than two miles (3.2 km) across, and maintain contact with the ground for dozens of miles (more than 100 km).

Tornadoes, despite being one of the most destructive weather phenomena, are generally short-lived. A long-lived tornado generally lasts no more than an hour, but some have been known to last for 2 hours or longer (for example, the Tri-State Tornado). Due to their relatively short duration, less information is known about the development and formation of tornadoes.

Downburst and derecho

Downbursts are created within thunderstorms by significantly rain-cooled air, which, upon reaching ground level, spreads out in all directions and produces strong winds. Unlike winds in a tornado, winds in a downburst are not rotational but are directed outwards from the point where they strike land or water.

Illustration of a microburst. The air moves in a downward motion until it hits ground level. It then spreads outward in all directions.
 
"Dry downbursts" are associated with thunderstorms with very little precipitation, while wet downbursts are generated by thunderstorms with large amounts of rainfall. Microbursts are very small downbursts with winds that extend up to 2.5 miles (4 km) from their source, while macrobursts are large-scale downbursts with winds that extend in excess of 2.5 miles (4 km). The heat burst is created by vertical currents on the backside of old outflow boundaries and squall lines where rainfall is lacking. Heat bursts generate significantly higher temperatures due to the lack of rain-cooled air in their formation. Derechos are longer, usually stronger, forms of downburst winds characterized by straight-lined windstorms.

Downbursts create vertical wind shear or microbursts, which are dangerous to aviation. These convective downbursts can produce damaging winds, lasting 5 to 30 minutes, with wind speeds as high as 168 mph (75 m/s), and cause tornado-like damage on the ground. Downbursts also occur much more frequently than tornadoes, with ten downburst damage reports for every one tornado.

Squall line

Cyclonic vortex over Pennsylvania with a trailing squall line.

A squall line is an elongated line of severe thunderstorms that can form along or ahead of a cold front. The squall line typically contains heavy precipitation, hail, frequent lightning, strong straight line winds, and possibly tornadoes or waterspouts. Severe weather in the form of strong straight-line winds can be expected in areas where the squall line forms a bow echo, in the farthest portion of the bow. Tornadoes can be found along waves within a line echo wave pattern (LEWP) where mesoscale low-pressure areas are present. Intense bow echoes responsible for widespread, extensive wind damage are called derechos, and move quickly over large territories. A wake low or a mesoscale low-pressure area forms behind the rain shield (a high pressure system under the rain canopy) of a mature squall line and is sometimes associated with a heat burst.

Squall lines often cause severe straight-line wind damage, and most non-tornadic wind damage is caused by squall lines. Although the primary danger from squall lines is straight-line winds, some squall lines also contain weak tornadoes.

Tropical cyclone

Hurricane Isabel (2003) as seen from orbit during Expedition 7 of the International Space Station.

Very high winds can be caused by mature tropical cyclones (called hurricanes in the United States and Canada and typhoons in eastern Asia). A tropical cyclone's heavy surf created by such winds may cause harm to marine life either close to or upon the surface of the water, such as coral reefs. Coastal regions may receive significant damage from a tropical cyclone while inland regions are relatively safe from the strong winds, due to their rapid dissipation over land. However, severe flooding can occur even far inland because of high amounts of rain from tropical cyclones and their remnants.

Waterspout

Formation of numerous waterspouts in the Great Lakes region.
 
Waterspouts are generally defined as tornadoes or non-supercell tornadoes that develop over bodies of water.

Waterspouts typically do not do much damage because they occur over open water, but they are capable of traveling over land. Vegetation, weakly constructed buildings, and other infrastructure may be damaged or destroyed by waterspouts. Waterspouts do not generally last long over terrestrial environments, as the friction produced quickly dissipates the winds. Strong horizontal winds disturb the vortex, causing waterspouts to dissipate. While not generally as dangerous as "classic" tornadoes, waterspouts can overturn boats, and they can cause severe damage to larger ships.

Strong extratropical cyclones

GOES-13 imagery of an intense Nor'easter that impacted the Northeast US on 26 March 2014 and produced recorded gusts of over 101 mph

European windstorms are severe local windstorms that develop from winds off the North Atlantic. These windstorms are commonly associated with destructive extratropical cyclones and their low-pressure frontal systems. European windstorms occur mainly in autumn and winter.

A synoptic-scale extratropical storm along the East Coast of the United States and Atlantic Canada is called a Nor'easter. They are so named because their winds come from the northeast, especially in the coastal areas of the Northeastern United States and Atlantic Canada. More specifically, it describes a low-pressure area whose center of rotation is just off the East Coast and whose leading winds in the left forward quadrant rotate onto land from the northeast. Nor'easters may cause coastal flooding, coastal erosion, and hurricane-force winds.

Dust storm

A dust storm is an unusual form of windstorm characterized by large quantities of sand and dust particles carried by the wind. Dust storms frequently develop during periods of drought, or over arid and semi-arid regions. 

A massive dust storm cloud (Haboob) is close to enveloping a military camp as it rolls over Al Asad Airbase, Iraq, just before nightfall on 27 April 2005.
 
Dust storms have numerous hazards and are capable of causing deaths. Visibility may be reduced dramatically, so risks of vehicle and aircraft crashes are possible. Additionally, the particulates may reduce oxygen intake by the lungs, potentially resulting in suffocation. Damage can also be inflicted upon the eyes due to abrasion.

Dust storms can cause many issues for agricultural industries as well. Soil erosion is one of the most common hazards and decreases arable land. Dust and sand particles can cause severe weathering of buildings and rock formations. Nearby bodies of water may be polluted by settling dust and sand, killing aquatic organisms. A decrease in exposure to sunlight can affect plant growth, and a decrease in infrared radiation may cause decreased temperatures.

Wildfires

Wildfire in Yellowstone National Park produces a pyrocumulus cloud
 
The most common cause of wildfires varies throughout the world. In the United States, Canada, and Northwest China, lightning is the major source of ignition. In other parts of the world, human involvement is a major contributor. For instance, in Mexico, Central America, South America, Africa, Southeast Asia, Fiji, and New Zealand, wildfires can be attributed to human activities such as animal husbandry, agriculture, and land-conversion burning. Human carelessness is a major cause of wildfires in China and in the Mediterranean Basin. In Australia, the source of wildfires can be traced to both lightning strikes and human activities such as machinery sparks and cast-away cigarette butts. Wildfires have a rapid forward rate of spread (FROS) when burning through dense, uninterrupted fuels. They can move as fast as 10.8 kilometers per hour (6.7 mph) in forests and 22 kilometers per hour (14 mph) in grasslands. Wildfires can advance tangential to the main front to form a flanking front, or burn in the opposite direction of the main front by backing.

Wildfires may also spread by jumping or spotting as winds and vertical convection columns carry firebrands (hot wood embers) and other burning materials through the air over roads, rivers, and other barriers that may otherwise act as firebreaks. Torching and fires in tree canopies encourage spotting, and dry ground fuels that surround a wildfire are especially vulnerable to ignition from firebrands. Spotting can create spot fires as hot embers and firebrands ignite fuels downwind from the fire. In Australian bushfires, spot fires are known to occur as far as 10 kilometers (6 mi) from the fire front. Since the mid-1980s, earlier snowmelt and associated warming has also been associated with an increase in length and severity of the wildfire season in the Western United States.

Hail

A large hailstone with concentric rings
 
Any form of thunderstorm that produces precipitating hailstones is known as a hailstorm. Hailstorms are generally capable of developing in any geographic area where thunderclouds (cumulonimbus) are present, although they are most frequent in tropical and monsoon regions. The updrafts and downdrafts within cumulonimbus clouds cause water molecules to freeze and solidify, creating hailstones and other forms of solid precipitation. As hailstones grow denser and heavier, they eventually become too heavy for the cloud's updrafts to support and fall towards the ground. The downdrafts in cumulonimbus clouds can also increase the speed of the falling hailstones. The term "hailstorm" is usually used to describe the existence of significant quantities or size of hailstones. 

Hailstones can cause serious damage, notably to automobiles, aircraft, skylights, glass-roofed structures, livestock, and crops. Rarely, massive hailstones have been known to cause concussions or fatal head trauma. Hailstorms have been the cause of costly and deadly events throughout history. One of the earliest recorded incidents occurred around the 12th century in Wellesbourne, Britain. The largest hailstone in terms of maximum circumference and length ever recorded in the United States fell in 2003 in Aurora, Nebraska, USA. The hailstone had a diameter of 7 inches (18 cm) and a circumference of 18.75 inches (47.6 cm).

Heavy rainfall and flooding

A flash flood caused by a severe thunderstorm

Heavy rainfall can lead to a number of hazards, most of which are floods or hazards resulting from floods. Flooding is the inundation of areas that are not normally under water. It is typically divided into three classes: river flooding, which relates to rivers rising outside their normal banks; flash flooding, the process by which a landscape, often in urban and arid environments, is subjected to rapid floods; and coastal flooding, which can be caused by strong winds from tropical or non-tropical cyclones. Meteorologically, excessive rains occur within a plume of air with high amounts of moisture (also known as an atmospheric river) which is directed around an upper-level cold-core low or a tropical cyclone. Flash flooding frequently occurs with slow-moving thunderstorms and is usually caused by the heavy liquid precipitation that accompanies them. Flash floods are most common in densely populated urban environments, where fewer plants and bodies of water are present to absorb and contain the extra water. Flash flooding can be hazardous to small infrastructure, such as bridges, and to weakly constructed buildings. Plants and crops in agricultural areas can be destroyed by the force of raging water. Automobiles parked in affected areas can also be displaced. Soil erosion can occur as well, exposing risks of landslides. Like all forms of flooding, flash flooding can also spread waterborne and insect-borne diseases caused by microorganisms. Flash flooding can be caused by extensive rainfall released by tropical cyclones of any strength or by the sudden thawing effect of ice dams.

Monsoons

Seasonal wind shifts lead to long-lasting wet seasons which produce the bulk of annual precipitation in areas such as Southeast Asia, Australia, Western Africa, eastern South America, Mexico, and the Philippines. Widespread flooding occurs if rainfall is excessive, which can lead to landslides and mudflows in mountainous areas. Floods cause rivers to exceed their capacity with nearby buildings becoming submerged. Flooding may be exacerbated if there are fires during the previous dry season. This may cause soils which are sandy or composed of loam to become hydrophobic and repel water.

Government organizations help their residents deal with wet-season floods through floodplain mapping and information on erosion control. Mapping is conducted to help determine areas that may be more prone to flooding. Erosion control instructions are provided through outreach over the telephone or the internet.

Flood waters that occur during monsoon seasons can often host numerous protozoan, bacterial, and viral microorganisms. Mosquitoes and flies will lay their eggs within the contaminated bodies of water. These disease agents may cause foodborne and waterborne infections. Diseases associated with exposure to flood waters include malaria, cholera, typhoid, hepatitis A, and the common cold. Trench foot infections may also occur when people are exposed to flooded areas for extended periods of time.

Tropical cyclone

The damage caused by Hurricane Andrew is a good example of the damage caused by a category 5 Tropical cyclone
 
A tropical cyclone is a storm system characterized by a low-pressure center and numerous thunderstorms that produce strong winds and flooding rain. A tropical cyclone feeds on heat released when moist air rises, resulting in condensation of water vapor contained in the moist air. Tropical cyclones may produce torrential rain, high waves, and damaging storm surge. Heavy rains produce significant inland flooding. Storm surges may produce extensive coastal flooding up to 40 kilometres (25 mi) from the coastline. 

Although cyclones take an enormous toll in lives and personal property, they are also important factors in the precipitation regimes of areas they affect. They bring much-needed precipitation to otherwise dry regions. Areas in their path can receive a year's worth of rainfall from a tropical cyclone passage. Tropical cyclones can also relieve drought conditions. They also carry heat and energy away from the tropics and transport it toward temperate latitudes, which makes them an important part of the global atmospheric circulation mechanism. As a result, tropical cyclones help to maintain equilibrium in the Earth's troposphere.

Severe winter weather

Heavy snowfall

Damage caused by Lake Storm "Aphid" in October 2006
 
When extratropical cyclones deposit heavy, wet snow with a snow-water equivalent (SWE) ratio of between 6:1 and 12:1 and a weight in excess of 10 pounds per square foot (~50 kg/m2) onto trees or electricity lines, significant damage may occur on a scale usually associated with strong tropical cyclones. An avalanche can occur with a sudden thermal or mechanical impact on snow that has accumulated on a mountain, which causes the snow to rush downhill suddenly. Preceding an avalanche is a phenomenon known as an avalanche wind, caused by the approaching avalanche itself, which adds to its destructive potential. Large amounts of snow which accumulate on top of man-made structures can lead to structural failure. During snowmelt, acidic precipitation which previously fell into the snow pack is released and harms marine life.

Lake-effect snow is produced in the winter in the shape of one or more elongated bands. This occurs when cold winds move across long expanses of warmer lake water, providing energy and picking up water vapor which freezes and is deposited on the lee shores. 

Conditions within blizzards often include large quantities of blowing snow and strong winds which may significantly reduce visibility. Reduced visibility may leave people on foot disoriented, resulting in extended exposure to the blizzard and an increased chance of becoming lost. The strong winds associated with blizzards create wind chill that can result in frostbite and hypothermia. The strong winds present in blizzards are capable of damaging plants and may cause power outages, frozen pipes, and cut-off fuel lines.

Strong extratropical cyclones

The precipitation pattern of Nor'easters is similar to other mature extratropical storms. Nor'easters can cause heavy rain or snow, either within their comma-head precipitation pattern or along their trailing cold or stationary front. Nor'easters can occur at any time of the year but are mostly known for their presence in the winter season. Severe European windstorms are often characterized by heavy precipitation as well.

Ice storm

Trees that have been destroyed by an ice storm.
 
An ice storm is also known as a silver storm, referring to the color of the freezing precipitation. Ice storms are caused by liquid precipitation which freezes upon contact with cold surfaces and leads to the gradual development of a thickening layer of ice. The accumulations of ice during the storm can be extremely destructive. Trees and vegetation can be destroyed and in turn may bring down power lines, causing the loss of heat and communication lines. Roofs of buildings and automobiles may be severely damaged. Gas pipes can become frozen or even damaged, causing gas leaks. Avalanches may develop due to the extra weight of the ice present. Visibility can be reduced dramatically. The aftermath of an ice storm may result in severe flooding due to sudden thawing, with large quantities of displaced water, especially near lakes, rivers, and other bodies of water.

Heat and drought

Drought

Crops in Australia that have failed due to drought conditions.
 
Another form of severe weather is drought, which is a prolonged period of persistently dry weather (that is, absence of precipitation). Although droughts do not develop or progress as quickly as other forms of severe weather, their effects can be just as deadly; in fact, droughts are classified and measured based upon these effects. Droughts have a variety of severe effects; they can cause crops to fail, and they can severely deplete water resources, sometimes interfering with human life. A drought in the 1930s known as the Dust Bowl affected 50 million acres of farmland in the central United States. In economic terms, they can cost many billions of dollars: a drought in the United States in 1988 caused over $40 billion in losses, exceeding the economic totals of Hurricane Andrew, the Great Flood of 1993, and the 1989 Loma Prieta earthquake. In addition to the other severe effects, the dry conditions caused by droughts also significantly increase the risk of wildfires.

Heat waves

A map indicating above-normal temperatures in Europe in 2003
 
Although official definitions vary, a heat wave is generally defined as a prolonged period of excessive heat. Although heat waves do not cause as much economic damage as other types of severe weather, they are extremely dangerous to humans and animals: according to the United States National Weather Service, the average total number of heat-related fatalities each year is higher than the combined total fatalities for floods, tornadoes, lightning strikes, and hurricanes. In Australia, heat waves cause more fatalities than any other type of severe weather. As with droughts, plants can be severely affected by heat waves, which are often accompanied by dry conditions and can cause plants to lose their moisture and die. Heat waves are often more severe when combined with high humidity.

Classical radicalism

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Cla...