Updated May 11 with text added at the end.
You only have to compare Sea Surface Temperatures (SST) from HadSST3
with estimates of Global Mean Surface Temperature (GMST) from HadCRUT4
and RSS.
This first graph shows how global SST has varied since 1850. There are
obvious changepoints where warming or cooling periods begin.
This graph shows in green HadCRUT4 estimates of global surface
temperature, combining ocean SST with near-surface air temperatures over
land. The blue line, from RSS, tracks lower-tropospheric air temperatures
measured by satellites, not near the surface but in the lowest few
kilometers of the atmosphere. Finally, the red line is again HadSST3
global SST. All lines use 30-month averages to reduce annual noise and
display longer-term patterns.
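For readers who want to reproduce that smoothing, a centered 30-month rolling mean is a one-liner in pandas. A minimal sketch, assuming the monthly HadSST3 anomalies have been saved to a CSV file (the file and column names here are illustrative, not from the original post):

```python
import pandas as pd

# Monthly global-mean SST anomalies; file and column names are
# placeholders for wherever you keep the HadSST3 series.
sst = pd.read_csv("hadsst3_monthly.csv", index_col="date",
                  parse_dates=True)["anomaly"]

# Centered 30-month rolling mean, as used for the graphs above.
# min_periods keeps estimates near the ends of the record.
sst_smooth = sst.rolling(window=30, center=True, min_periods=15).mean()
```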
Strikingly, SST and GMST track each other closely from the beginning
until about 1980. Then GMST diverges, warming more than global SST.
Satellite TLT shows the same patterns but with less warming than the
surface. Curious about the post-1980 patterns, I looked into HadSST3
and found that Northern Hemisphere (NH) SST warmed much more strongly
during that period.
This graph shows how warming from circulations in the Northern
Pacific and Northern Atlantic has driven GMST since 1980. It also
suggests that since 2005 NH SST has stopped increasing and may turn
toward cooling.
Surface Heat Flux from Ocean to Air
Now one can read convoluted explanations about how rising CO2 in the
atmosphere causes land surface heating that is then transported over
the ocean, raising SST. But the interface between ocean and air is
well described and measured. Not surprisingly, it is the warmer ocean
water that sends heat into the atmosphere, not the other way around.
The graph displays measurements of heat flux in the subtropics during a
21-day period in November. Shortwave solar energy, shown in green and
labeled radiative, is stored in the upper 200 meters of the ocean. The
upper panel shows the rise in SST due to net incoming energy. The
yellow line shows latent heat cooling the ocean (lowering SST) and
transferring heat upward, driving convection.
From An Investigation of Turbulent Heat Exchange in the Subtropics by James B. Edson:
“One can think of the ocean as a capacitor for the MJO (Madden-Julian
Oscillation), where the energy is being accumulated when there is a net
heat flux into the ocean (here occurring to approximately November 24)
after which it is released to the atmosphere during the active phase of
the MJO under high winds and large latent heat exchange.”
http://www.onr.navy.mil/reports/FY13/mmedson.pdf
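The latent heat term described above is the kind of quantity routinely estimated with a bulk aerodynamic formula, LH = ρ_air · L_v · C_E · U · (q_s − q_a). The sketch below uses typical subtropical values; the transfer coefficient and humidities are illustrative assumptions, not numbers from the Edson report:

```python
def latent_heat_flux(wind_speed, q_sea, q_air,
                     rho_air=1.22, l_v=2.45e6, c_e=1.2e-3):
    """Bulk estimate of ocean-to-air latent heat flux (W/m^2).

    wind_speed -- near-surface wind speed (m/s)
    q_sea      -- saturation specific humidity at the SST (kg/kg)
    q_air      -- specific humidity of the overlying air (kg/kg)
    """
    return rho_air * l_v * c_e * wind_speed * (q_sea - q_air)

# A 7 m/s wind over water 4 g/kg moister than the overlying air
# gives a flux on the order of 100 W/m^2 out of the ocean.
print(latent_heat_flux(7.0, 0.020, 0.016))
```

Under the MJO-active conditions Edson describes, with high winds and a large air-sea humidity difference, the same formula yields several hundred W/m².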
Conclusion
As we see in the graphs, ocean circulations change sea surface
temperatures, which in turn change global land and sea temperatures.
Thus, oceans make climate by making temperature changes.
In another post I describe how oceans also drive precipitation, the
other main determinant of climate. Oceans make rain, and the processes
for distributing rain over land are shown here: https://rclutz.wordpress.com/2015/04/30/here-comes-the-rain-again/
And a word from Dr. William Gray:
“Changes in the ocean’s deep circulation currents appears to be, by
far, the best physical explanation for the observed global surface
temperature changes (see Gray 2009, 2011, 2012, 2012). It seems
ridiculous to me for both the AGW advocates and us skeptics to so
closely monitor current weather and short-time climate change as
indication of CO2’s influence on our climate. This assumes that the much
more dominant natural climate changes that have always occurred are no
longer in operation or have relevance.”
http://www.icecap.us/
Indeed, Oceans Make Climate, or as Dr. Arnd Bernaerts put it:
“Climate is the continuation of oceans by other means.”
Update May 11, 2015
Kenneth Richards provided some supporting references in a comment at
Paul Homewood’s site. They are certainly on point, especially this one:
“Examining data sets of surface heat flux during the last few decades
for the same region, we find that the SST warming was not a consequence
of atmospheric heat flux forcing. Conversely, we suggest that long-term
SST warming drives changes in atmosphere parameters at the sea surface,
most notably an increase in latent heat flux, and that an acceleration
of the hydrological cycle induces a strengthening of the trade winds and
an acceleration of the Hadley circulation.”
That quote is from Servain et al., which is unfortunately behind a paywall. The paper is discussed here:
http://hockeyschtick.blogspot.ca/2014/09/new-paper-finds-climate-of-tropical.html
Full comment from Richards:
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-13-00651.1
The surface of the world’s oceans has been warming since the beginning
of industrialization. In addition to this, multidecadal sea surface
temperature (SST) variations of internal [natural] origin exist.
Evidence suggests that the North Atlantic Ocean exhibits the strongest
multidecadal SST variations and that these variations are connected to
the overturning circulation. This work investigates the extent to which
these internal multidecadal variations have contributed to enhancing or
diminishing the trend induced by the external radiative forcing,
globally and in the North Atlantic. A model study is carried out wherein
the analyses of a long control simulation with constant radiative
forcing at preindustrial level and of an ensemble of simulations with
historical forcing from 1850 until 2005 are combined. First, it is noted
that global SST trends calculated from the different historical
simulations are similar, while there is a large disagreement between the
North Atlantic SST trends. Then the control simulation is analyzed,
where a relationship between SST anomalies and anomalies in the Atlantic
meridional overturning circulation (AMOC) for multidecadal and longer
time scales is identified. This relationship enables the extraction of
the AMOC-related SST variability from each individual member of the
ensemble of historical simulations and then the calculation of the SST
trends with the AMOC-related variability excluded. For the global SST
trends this causes only a little difference while SST trends with
AMOC-related variability excluded for the North Atlantic show closer
agreement than with the AMOC-related variability included. From this it
is concluded that AMOC [Atlantic meridional overturning circulation]
variability has contributed significantly to North Atlantic SST trends
since the mid nineteenth century.
http://link.springer.com/article/10.1007%2Fs00382-014-2168-7
After a decrease of SST by about 1 °C during 1964–1975, most apparent in
the northern tropical region, the entire tropical basin warmed up. That
warming was the most substantial (>1 °C) in the eastern tropical
ocean and in the longitudinal band of the intertropical convergence
zone. Examining data sets of surface heat flux during the last few
decades for the same region, we find that the SST [sea surface
temperature] warming was not a consequence of atmospheric heat flux
forcing [greenhouse gases]. Conversely, we suggest that long-term SST
warming drives changes in atmosphere parameters at the sea surface, most
notably an increase in latent heat flux, and that an acceleration of
the hydrological cycle induces a strengthening of the trade winds and an
acceleration of the Hadley circulation. These trends are also
accompanied by rising sea levels and upper ocean heat content over
similar multi-decadal time scales in the tropical Atlantic. Though more
work is needed to fully understand these long term trends, especially
what happens from the mid-1970’s, it is likely that changes in ocean
circulation involving some combination of the Atlantic meridional
overturning circulation [AMOC] and the subtropical cells are required to
explain the observations.
http://www.nature.com/ncomms/2014/141208/ncomms6752/full/ncomms6752.html
The Atlantic Meridional Overturning Circulation (AMOC) is a key
component of the global climate system, responsible for a large fraction
of the 1.3 PW northward heat transport in the Atlantic basin. Numerical
modelling experiments suggest that without a vigorous AMOC, surface air
temperature in the North Atlantic region would cool by around 1–3 °C,
with enhanced local cooling of up to 8 °C in regions with large sea-ice
changes. Substantial weakening of the AMOC would also cause a southward
shift of the inter-tropical convergence zone, encouraging Sahelian
drought, and dynamic changes in sea level of up to 80 cm along the
coasts of North America and Europe.
Monday, May 11, 2015
Hottest year ever? Giant clam reveals Middle Ages were warmer than today
Original link: http://wattsupwiththat.com/2015/01/05/hottest-year-ever-giant-clam-reveals-middle-ages-were-warmer-than-today/
While government science and media begin the ramp-up to claim 2014 as the “hottest year ever,” the South China Sea’s biggest bivalve shows that the Middle Ages were warmer than today, when carbon dioxide levels were lower.
From the Chinese Academy of Sciences:
Two recent papers, one in Earth-Science Reviews and the other in Chinese Science Bulletin, have studied key chemical contents in micro-drilled giant clam shells and coral samples to demonstrate that in the South China Sea the warm period of the Middle Ages was warmer than the present.
The scientists examined the ratio of strontium to calcium content and heavy oxygen isotopes, both of which are sensitive recorders of sea surface temperatures past and present. The aragonite (a form of calcium carbonate) of the Tridacna gigas clam shell is so fine-grained that daily growth lines are exposed by micro-drilling with an exceptionally fine drill bit, allowing an exceptionally detailed time series of sea-temperature changes to be compiled – a feat of detection worthy of Sherlock Holmes himself.
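Proxy thermometers like Sr/Ca are, to a good approximation, linear in temperature, so converting a measured ratio to SST means inverting a fitted calibration line. A minimal sketch with placeholder coefficients; real calibrations are fitted per species and site against instrumental records, and the numbers below are illustrative only:

```python
def srca_to_sst(sr_ca, a=10.5, b=-0.06):
    """Invert a linear Sr/Ca thermometer of the form Sr/Ca = a + b*SST.

    sr_ca -- measured ratio in mmol/mol
    a, b  -- placeholder calibration coefficients (illustrative only)
    Returns SST in degrees C.
    """
    return (sr_ca - a) / b

print(srca_to_sst(9.0))  # 25.0 deg C under the placeholder calibration
```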
By using overlaps between successive generations of giant clams and corals, the three scientists – Hong Yan of the Institute of Earth Environment, Chinese Academy of Sciences, Willie Soon of the Harvard-Smithsonian Center for Astrophysics and Yuhong Wang of Fudan University, Shanghai – reconstructed a record of sea-surface temperature changes going back 2500 years.
The Roman and Mediaeval Warm Periods both showed up prominently in the western Pacific and East Asia. Sea surface temperatures varied considerably over the 2500-year period.
Changing patterns of winter and summer temperature variation were also detected, disproving the notion that until the warming of the 20th century there had been little change in global temperatures for at least 1000 years, and confirming that – at least in the South China Sea – there is nothing exceptional about today’s temperatures.
Dr. Yan said: “This new paper adds further material to the substantial body of real-world proxy evidence establishing that today’s global temperature is within natural ranges of past changes.” Dr. Soon added: “The UN’s climate panel should never have trusted the claim that the medieval warm period was mainly a European phenomenon. It was clearly warm in South China Sea too.”
Sunday, May 10, 2015
Scientific American Demonstrates How To Commit Major Science Fraud
Original link: https://stevengoddard.wordpress.com/2015/05/10/scientific-american-demonstrates-how-to-commit-major-science-fraud/
Scientific American published this map showing an increase in heavy rains since 1958, and blamed them on global warming.
Now check out the spectacular fraud being committed. The author cherry-picked the start date of 1958 because it was the minimum in the US climate record. In fact, heavy rainfall events were much more common in the early 20th century, when temperatures were cooler.
Global Warming May Mean More Downpours like in Oklahoma – Scientific American
There is no correlation between US temperature and heavy precipitation events
The author is engaged in scientific malfeasance, in an effort to mislead Scientific American readers and direct them to the wrong conclusion.
Dark energy
From Wikipedia, the free encyclopedia
In physical cosmology and astronomy, dark energy is an unknown form of energy which is hypothesized to permeate all of space, tending to accelerate the expansion of the universe.[1] Dark energy is the most accepted hypothesis to explain the observations since the 1990s indicating that the universe is expanding at an accelerating rate. According to the Planck mission team, and based on the standard model of cosmology, on a mass–energy equivalence basis, the observable universe contains 26.8% dark matter, 68.3% dark energy (for a total of 95.1%) and 4.9% ordinary matter.[2][3][4][5] Again on a mass–energy equivalence basis, the density of dark energy (6.91 × 10⁻²⁷ kg/m³) is very low, much less than the density of ordinary matter or dark matter within galaxies. However, it comes to dominate the mass–energy of the universe because it is uniform across space.[6]
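That density figure can be checked against the standard definition of the critical density, ρ_crit = 3H₀²/8πG. A sketch assuming round-number parameters (H₀ = 70 km/s/Mpc and a 70% dark-energy fraction; the 6.91 × 10⁻²⁷ kg/m³ quoted above corresponds to slightly different values):

```python
import math

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
H0 = 70.0 * 1000 / 3.086e22      # 70 km/s/Mpc converted to s^-1

rho_crit = 3 * H0**2 / (8 * math.pi * G)   # kg/m^3, ~9.2e-27
rho_dark_energy = 0.7 * rho_crit           # ~6.5e-27 kg/m^3

print(f"critical: {rho_crit:.2e}, dark energy: {rho_dark_energy:.2e}")
```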
Two proposed forms for dark energy are the cosmological constant, a constant energy density filling space homogeneously,[7] and scalar fields such as quintessence or moduli, dynamic quantities whose energy density can vary in time and space. Contributions from scalar fields that are constant in space are usually also included in the cosmological constant. The cosmological constant can be formulated to be equivalent to vacuum energy. Scalar fields that do change in space can be difficult to distinguish from a cosmological constant because the change may be extremely slow.
High-precision measurements of the expansion of the universe are required to understand how the expansion rate changes over time and space. In general relativity, the evolution of the expansion rate is parameterized by the cosmological equation of state (the relationship between temperature, pressure, and combined matter, energy, and vacuum energy density for any region of space). Measuring the equation of state for dark energy is one of the biggest efforts in observational cosmology today.
Adding the cosmological constant to cosmology's standard FLRW metric leads to the Lambda-CDM model, which has been referred to as the "standard model" of cosmology because of its precise agreement with observations. Dark energy has been used as a crucial ingredient in a recent attempt to formulate a cyclic model for the universe.[8]
Nature of dark energy
Many things about the nature of dark energy remain matters of speculation. The evidence for dark energy is indirect but comes from three independent sources:
- Distance measurements and their relation to redshift, which suggest the universe has expanded more in the last half of its life.[9]
- The theoretical need for a type of additional energy that is not matter or dark matter to form the observationally flat universe (absence of any detectable global curvature).
- It can be inferred from measures of large scale wave-patterns of mass density in the universe.
Effect of dark energy: a small constant negative pressure of vacuum
Independently of its actual nature, dark energy would need to have a strong negative pressure (acting repulsively) in order to explain the observed acceleration of the expansion of the universe.

According to general relativity, the pressure within a substance contributes to its gravitational attraction for other things just as its mass density does. This happens because the physical quantity that causes matter to generate gravitational effects is the stress–energy tensor, which contains both the energy (or matter) density of a substance and its pressure and viscosity.
In the Friedmann–Lemaître–Robertson–Walker metric, it can be shown that a strong constant negative pressure in all the universe causes an acceleration in universe expansion if the universe is already expanding, or a deceleration in universe contraction if the universe is already contracting. More exactly, the second derivative of the universe scale factor, ä, is positive if the equation of state of the universe is such that w < −1/3 (see Friedmann equations).
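In standard notation the chain of reasoning is short; a sketch (not text from the Wikipedia article) using the second Friedmann equation:

```latex
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right)
\;\Longrightarrow\;
\ddot{a} > 0 \iff p < -\frac{\rho c^{2}}{3}
\iff w \equiv \frac{p}{\rho c^{2}} < -\frac{1}{3}.
```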
This accelerating expansion effect is sometimes labeled "gravitational repulsion", which is a colorful but possibly confusing expression. In fact a negative pressure does not influence the gravitational interaction between masses—which remains attractive—but rather alters the overall evolution of the universe at the cosmological scale, typically resulting in the accelerating expansion of the universe despite the attraction among the masses present in the universe.
The acceleration is simply a function of dark energy density. Dark energy is persistent: its density remains constant (experimentally, within a factor of 1:10), i.e. it does not get diluted when space expands.
Evidence of existence
Supernovae
In 1998, published observations of Type Ia supernovae ("one-A") by the High-Z Supernova Search Team[10] followed in 1999 by the Supernova Cosmology Project[11] suggested that the expansion of the universe is accelerating.[12] The 2011 Nobel Prize in Physics was awarded to Saul Perlmutter, Brian P. Schmidt and Adam G. Riess for their leadership in the discovery.[13][14]
Since then, these observations have been corroborated by several independent sources. Measurements of the cosmic microwave background, gravitational lensing, and the large-scale structure of the cosmos, as well as improved measurements of supernovae, have been consistent with the Lambda-CDM model.[15] Some people argue that the only indication for the existence of dark energy is observations of distance measurements and associated redshifts; cosmic microwave background anisotropies and baryon acoustic oscillations show only that redshifts are larger than expected from a "dusty" Friedmann–Lemaître universe and the locally measured Hubble constant.[16]
Supernovae are useful for cosmology because they are excellent standard candles across cosmological distances. They allow the expansion history of the universe to be measured by looking at the relationship between the distance to an object and its redshift, which indicates how fast it is receding from us. The relationship is roughly linear, according to Hubble's law. It is relatively easy to measure redshift, but finding the distance to an object is more difficult. Usually, astronomers use standard candles: objects for which the intrinsic brightness, the absolute magnitude, is known. This allows the object's distance to be measured from its actual observed brightness, or apparent magnitude. Type Ia supernovae are the best-known standard candles across cosmological distances because of their extreme and consistent luminosity.
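The standard-candle logic reduces to the distance modulus m − M = 5 log₁₀(d / 10 pc). A minimal sketch; the magnitudes in the example are illustrative:

```python
def luminosity_distance_pc(m_apparent, m_absolute):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d/10pc)."""
    return 10 ** ((m_apparent - m_absolute) / 5 + 1)

# Type Ia supernovae peak near absolute magnitude M ~ -19.3, so one
# observed at apparent magnitude m = 24 lies at roughly 4.6 Gpc.
print(luminosity_distance_pc(24.0, -19.3) / 1e9)  # distance in Gpc
```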
Recent observations of supernovae are consistent with a universe made up 71.3% of dark energy and 27.4% of a combination of dark matter and baryonic matter.[17]
Cosmic microwave background
The existence of dark energy, in whatever form, is needed to reconcile the measured geometry of space with the total amount of matter in the universe. Measurements of cosmic microwave background (CMB) anisotropies indicate that the universe is close to flat. For the shape of the universe to be flat, the mass/energy density of the universe must be equal to the critical density. The total amount of matter in the universe (including baryons and dark matter), as measured from the CMB spectrum, accounts for only about 30% of the critical density. This implies the existence of an additional form of energy to account for the remaining 70%.[15] The Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft seven-year analysis estimated a universe made up of 72.8% dark energy, 22.7% dark matter and 4.5% ordinary matter.[4] Work done in 2013 based on the Planck spacecraft observations of the CMB gave a more accurate estimate of 68.3% of dark energy, 26.8% of dark matter and 4.9% of ordinary matter.[18]
Large-scale structure
The theory of large-scale structure, which governs the formation of structures in the universe (stars, quasars, galaxies and galaxy groups and clusters), also suggests that the density of matter in the universe is only 30% of the critical density.

A 2011 survey, the WiggleZ galaxy survey of more than 200,000 galaxies, provided further evidence for the existence of dark energy, although the exact physics behind it remains unknown.[19][20] The WiggleZ survey from the Australian Astronomical Observatory scanned the galaxies to determine their redshifts. Then, by exploiting the fact that baryon acoustic oscillations have left regularly spaced voids ~150 Mpc in diameter, surrounded by galaxies, the voids were used as standard rulers to determine distances to galaxies as far as 2,000 Mpc (redshift 0.6). This allowed astronomers to determine more accurately the speeds of galaxies from their redshift and distance. The data confirmed cosmic acceleration up to half of the age of the universe (7 billion years) and constrained its inhomogeneity to 1 part in 10.[20] This provides a confirmation of cosmic acceleration independent of supernovae.
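A standard ruler turns an angle into a distance: if a feature of known comoving size L subtends an angle θ on the sky, its distance is roughly d ≈ L/θ in the small-angle limit. A toy sketch with illustrative numbers (not values from the WiggleZ analysis):

```python
import math

def standard_ruler_distance(size_mpc, angle_deg):
    """Small-angle distance estimate d = L / theta for a standard ruler."""
    return size_mpc / math.radians(angle_deg)

# A ~150 Mpc BAO void subtending ~4.3 degrees sits ~2,000 Mpc away.
print(standard_ruler_distance(150.0, 4.3))
```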
Late-time integrated Sachs-Wolfe effect
Accelerated cosmic expansion causes gravitational potential wells and hills to flatten as photons pass through them, producing cold spots and hot spots on the CMB aligned with vast supervoids and superclusters. This so-called late-time Integrated Sachs–Wolfe effect (ISW) is a direct signal of dark energy in a flat universe.[21] It was reported at high significance in 2008 by Ho et al.[22] and Giannantonio et al.[23]

Observational Hubble constant data
A new approach to test evidence of dark energy through observational Hubble constant (H(z)) data (OHD) has gained significant attention in recent years.[24][25][26][27] The Hubble constant is measured as a function of cosmological redshift. OHD directly tracks the expansion history of the universe by taking passively evolving early-type galaxies as “cosmic chronometers”.[28] This approach thus provides standard clocks in the universe. The core of the idea is measuring the differential age evolution of these cosmic chronometers as a function of redshift, which provides a direct estimate of the Hubble parameter H(z) = −1/(1+z) · dz/dt ≈ −1/(1+z) · Δz/Δt. The merit of this approach is clear: the reliance on a differential quantity, Δz/Δt, minimizes many common issues and systematic effects; and, as a direct measurement of the Hubble parameter rather than of its integral (as with supernovae and baryon acoustic oscillations), it brings more information and is appealing in computation. For these reasons, it has been widely used to examine the accelerated cosmic expansion and to study properties of dark energy.
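The differential-age recipe is easy to state in code. A sketch assuming arrays of redshifts and inferred galaxy ages are already in hand (in a real analysis these come from spectral fitting of early-type galaxies):

```python
import numpy as np

def hubble_from_chronometers(z, t):
    """H(z) ~ -1/(1+z) * dz/dt from cosmic-chronometer ages.

    z -- redshifts of passively evolving galaxies
    t -- their inferred ages at those redshifts, in seconds
    Returns H evaluated at midpoint redshifts, in s^-1.
    """
    z, t = np.asarray(z), np.asarray(t)
    dz_dt = np.diff(z) / np.diff(t)        # negative: older means lower z
    z_mid = 0.5 * (z[:-1] + z[1:])
    return -dz_dt / (1.0 + z_mid)
```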
Theories of explanation

Cosmological constant
The simplest explanation for dark energy is that it is simply the "cost of having space": that is, a volume of space has some intrinsic, fundamental energy. This is the cosmological constant, sometimes called Lambda (hence Lambda-CDM model) after the Greek letter Λ, the symbol used to represent this quantity mathematically. Since energy and mass are related by E = mc², Einstein's theory of general relativity predicts that this energy will have a gravitational effect. It is sometimes called a vacuum energy because it is the energy density of empty vacuum. In fact, most theories of particle physics predict vacuum fluctuations that would give the vacuum this sort of energy. This is related to the Casimir effect, in which there is a small suction into regions where virtual particles are geometrically inhibited from forming (e.g. between plates with tiny separation). The cosmological constant is estimated by cosmologists to be on the order of 10⁻²⁹ g/cm³, or about 10⁻¹²⁰ in reduced Planck units[citation needed]. Particle physics predicts a natural value of 1 in reduced Planck units, leading to a large discrepancy.
The cosmological constant has negative pressure equal to its energy density and so causes the expansion of the universe to accelerate. The reason why a cosmological constant has negative pressure can be seen from classical thermodynamics: energy must be lost from inside a container to do work on the container. A change in volume dV requires work done equal to a change of energy −P dV, where P is the pressure. But the amount of energy in a container full of vacuum actually increases when the volume increases (dV is positive), because the energy is equal to ρV, where ρ (rho) is the energy density of the cosmological constant. Therefore, P is negative and, in fact, P = −ρ.
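Written out, the argument in the paragraph above is three lines (a sketch in the article's own notation, in units where c = 1):

```latex
dE = -P\,dV, \qquad E = \rho V \ (\rho\ \text{constant})
\;\Longrightarrow\; dE = \rho\,dV
\;\Longrightarrow\; P = -\rho.
```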
A major outstanding problem is that most quantum field theories predict a huge cosmological constant from the energy of the quantum vacuum, more than 100 orders of magnitude too large.[7] This would need to be cancelled almost, but not exactly, by an equally large term of the opposite sign. Some supersymmetric theories require a cosmological constant that is exactly zero,[citation needed] which does not help because supersymmetry must be broken. The present scientific consensus amounts to extrapolating the empirical evidence where it is relevant to predictions, and fine-tuning theories until a more elegant solution is found. Technically, this amounts to checking theories against macroscopic observations. Unfortunately, as the known error-margin in the constant predicts the fate of the universe more than its present state, many such "deeper" questions remain unknown.
In spite of its problems, the cosmological constant is in many respects the most economical solution to the problem of cosmic acceleration. One number successfully explains a multitude of observations. Thus, the current standard model of cosmology, the Lambda-CDM model, includes the cosmological constant as an essential feature.
Quintessence
In quintessence models of dark energy, the observed acceleration of the scale factor is caused by the potential energy of a dynamical field, referred to as the quintessence field. Quintessence differs from the cosmological constant in that it can vary in space and time. In order for it not to clump and form structure like matter, the field must be very light so that it has a large Compton wavelength.

No evidence of quintessence is yet available, but it has not been ruled out either. It generally predicts a slightly slower acceleration of the expansion of the universe than the cosmological constant. Some scientists think that the best evidence for quintessence would come from violations of Einstein's equivalence principle and variation of the fundamental constants in space or time.[citation needed] Scalar fields are predicted by the standard model and string theory, but an analogous problem to the cosmological constant problem (or the problem of constructing models of cosmic inflation) occurs: renormalization theory predicts that scalar fields should acquire large masses.
The cosmic coincidence problem asks why the cosmic acceleration began when it did. If cosmic acceleration began earlier in the universe, structures such as galaxies would never have had time to form and life, at least as we know it, would never have had a chance to exist. Proponents of the anthropic principle view this as support for their arguments. However, many models of quintessence have a so-called tracker behavior, which solves this problem. In these models, the quintessence field has a density which closely tracks (but is less than) the radiation density until matter-radiation equality, which triggers quintessence to start behaving as dark energy, eventually dominating the universe. This naturally sets the low energy scale of the dark energy.[citation needed]
In 2004, when scientists fit the evolution of dark energy with the cosmological data, they found that the equation of state had possibly crossed the cosmological constant boundary (w = −1) from above to below. A no-go theorem has been proved showing that models crossing this boundary require at least two degrees of freedom. This scenario is the so-called quintom scenario.
Some special cases of quintessence are phantom energy, in which the energy density of quintessence actually increases with time, and k-essence (short for kinetic quintessence) which has a non-standard form of kinetic energy. They can have unusual properties: phantom energy, for example, can cause a Big Rip.
Alternative ideas
Some alternatives to dark energy aim to explain the observational data by a more refined use of established theories, focusing, for example, on the gravitational effects of density inhomogeneities, or on consequences of electroweak symmetry breaking in the early universe. If we are located in an emptier-than-average region of space, the observed cosmic expansion rate could be mistaken for a variation in time, or acceleration.[29][30][31][32] A different approach uses a cosmological extension of the equivalence principle to show how space might appear to be expanding more rapidly in the voids surrounding our local cluster. While weak, such effects considered cumulatively over billions of years could become significant, creating the illusion of cosmic acceleration, and making it appear as if we live in a Hubble bubble.[33][34][35]

Another class of theories attempts to come up with an all-encompassing theory of both dark matter and dark energy as a single phenomenon that modifies the laws of gravity at various scales. An example of this type of theory is the theory of dark fluid. Another class of theories that unifies dark matter and dark energy is that of covariant theories of modified gravity. These theories alter the dynamics of space-time such that the modified dynamics account for what has been attributed to the presence of dark energy and dark matter.[36]
A 2011 paper in the journal Physical Review D by Christos Tsagas, a cosmologist at Aristotle University of Thessaloniki in Greece, argued that the apparent accelerated expansion of the universe may be an illusion caused by our motion relative to the rest of the universe. The paper cites data showing that the 2.5-billion-light-year-wide region of space we are inside of is moving very quickly relative to everything around it. If the theory is confirmed, then dark energy would not exist (but the "dark flow" still might).[37][38]
Some theorists think that dark energy and cosmic acceleration are a failure of general relativity on very large scales, larger than superclusters.[citation needed] However most attempts at modifying general relativity have turned out to be either equivalent to theories of quintessence, or inconsistent with observations.[citation needed] Other ideas for dark energy have come from string theory, brane cosmology and the holographic principle, but have not yet proved[citation needed] as compelling as quintessence and the cosmological constant.
On string theory, an article in the journal Nature described:
String theories, popular with many particle physicists, make it possible, even desirable, to think that the observable universe is just one of 10⁵⁰⁰ universes in a grander multiverse, says Leonard Susskind, a cosmologist at Stanford University in California. The vacuum energy will have different values in different universes, and in many or most it might indeed be vast. But it must be small in ours because it is only in such a universe that observers such as ourselves can evolve.

Paul Steinhardt in the same article criticizes string theory's explanation of dark energy, stating "...Anthropics and randomness don't explain anything... I am disappointed with what most theorists are willing to accept".[39]
Another set of proposals is based on the possibility of a double metric tensor for space-time.[40][41] It has been argued that time reversed solutions in general relativity require such double metric for consistency, and that both dark matter and dark energy can be understood in terms of time reversed solutions of general relativity.[42]
It has been shown that if inertia is assumed to be due to the effect of horizons on Unruh radiation then this predicts galaxy rotation and a cosmic acceleration similar to that observed.[43]
Implications for the fate of the universe
Cosmologists estimate that the acceleration began roughly 5 billion years ago. Before that, it is thought that the expansion was decelerating, due to the attractive influence of dark matter and baryons. The density of dark matter in an expanding universe decreases more quickly than dark energy, and eventually the dark energy dominates.

Specifically, when the volume of the universe doubles, the density of dark matter is halved, but the density of dark energy is nearly unchanged (it is exactly constant in the case of a cosmological constant).
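The scaling behind that statement is simple enough to tabulate. A sketch using illustrative present-day fractions of 0.3 matter and 0.7 dark energy, in units of today's critical density:

```python
# Matter dilutes as a^-3; a cosmological constant does not dilute.
for a in [0.5, 1.0, 2.0, 4.0]:
    rho_matter = 0.3 * a**-3
    rho_lambda = 0.7
    print(f"a = {a:>3}: matter = {rho_matter:6.3f}, lambda = {rho_lambda}")

# Doubling the volume means a -> 2**(1/3) * a, which halves the
# matter density while leaving the dark-energy density unchanged.
```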
If the acceleration continues indefinitely, the ultimate result will be that galaxies outside the local supercluster will have a line-of-sight velocity that continually increases with time, eventually far exceeding the speed of light.[44] This is not a violation of special relativity because the notion of "velocity" used here is different from that of velocity in a local inertial frame of reference, which is still constrained to be less than the speed of light for any massive object (see Uses of the proper distance for a discussion of the subtleties of defining any notion of relative velocity in cosmology). Because the Hubble parameter is decreasing with time, there can actually be cases where a galaxy that is receding from us faster than light does manage to emit a signal which reaches us eventually.[45][46]
However, because of the accelerating expansion, it is projected that most galaxies will eventually cross a type of cosmological event horizon where any light they emit past that point will never be able to reach us at any time in the infinite future[47] because the light never reaches a point where its "peculiar velocity" toward us exceeds the expansion velocity away from us (these two notions of velocity are also discussed in Uses of the proper distance). Assuming the dark energy is constant (a cosmological constant), the current distance to this cosmological event horizon is about 16 billion light years, meaning that a signal from an event happening at present would eventually be able to reach us in the future if the event were less than 16 billion light years away, but the signal would never reach us if the event were more than 16 billion light years away.[46]
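That ~16 billion light-year figure is a convergent integral over the future expansion, d_EH = c ∫₁^∞ da / (a² H(a)). A numerical sketch for flat ΛCDM with illustrative parameters (H₀ = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7, constant dark energy):

```python
import numpy as np
from scipy.integrate import quad

c = 299792.458                              # speed of light, km/s
H0 = 70.0                                   # km/s/Mpc (assumed)
E = lambda a: np.sqrt(0.3 * a**-3 + 0.7)    # H(a)/H0 for flat LCDM

integral, _ = quad(lambda a: 1.0 / (a**2 * E(a)), 1.0, np.inf)
d_eh_mpc = (c / H0) * integral              # comoving horizon, Mpc

print(d_eh_mpc * 3.262e6 / 1e9, "Gly")      # roughly 16 Gly
```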
As galaxies approach the point of crossing this cosmological event horizon, the light from them will become more and more redshifted, to the point where the wavelength becomes too large to detect in practice and the galaxies appear to vanish completely[48][49] (see Future of an expanding universe). The Earth, the Milky Way, and the Virgo Supercluster[contradictory], however, would remain virtually undisturbed while the rest of the universe recedes and disappears from view. In this scenario, the local supercluster would ultimately suffer heat death, just as was thought for the flat, matter-dominated universe before measurements of cosmic acceleration.
There are some very speculative ideas about the future of the universe. One suggests that phantom energy causes divergent expansion, which would imply that the effective force of dark energy continues growing until it dominates all other forces in the universe. Under this scenario, dark energy would ultimately tear apart all gravitationally bound structures, including galaxies and solar systems, and eventually overcome the electrical and nuclear forces to tear apart atoms themselves, ending the universe in a "Big Rip". On the other hand, dark energy might dissipate with time or even become attractive. Such uncertainties leave open the possibility that gravity might yet rule the day and lead to a universe that contracts in on itself in a "Big Crunch".[50] Some scenarios, such as the cyclic model, suggest this could be the case. It is also possible the universe may never have an end and continue in its present state forever (see The Second Law as a law of disorder). While these ideas are not supported by observations, they are not ruled out.
History of discovery and previous speculation
The cosmological constant was first proposed by Einstein as a mechanism to obtain a solution of the gravitational field equation that would lead to a static universe, effectively using dark energy to balance gravity.[51] Not only was the mechanism an inelegant example of fine-tuning but it was also later realized that Einstein's static universe would actually be unstable because local inhomogeneities would ultimately lead to either the runaway expansion or contraction of the universe. The equilibrium is unstable: If the universe expands slightly, then the expansion releases vacuum energy, which causes yet more expansion. Likewise, a universe which contracts slightly will continue contracting. These sorts of disturbances are inevitable, due to the uneven distribution of matter throughout the universe. More importantly, observations made by Edwin Hubble in 1929 showed that the universe appears to be expanding and not static at all. Einstein reportedly referred to his failure to predict the idea of a dynamic universe, in contrast to a static universe, as his greatest blunder.[52]

Alan Guth and Alexei Starobinsky proposed in 1980 that a negative pressure field, similar in concept to dark energy, could drive cosmic inflation in the very early universe. Inflation postulates that some repulsive force, qualitatively similar to dark energy, resulted in an enormous and exponential expansion of the universe slightly after the Big Bang. Such expansion is an essential feature of most current models of the Big Bang. However, inflation must have occurred at a much higher energy density than the dark energy we observe today and is thought to have completely ended when the universe was just a fraction of a second old. It is unclear what relation, if any, exists between dark energy and inflation. Even after inflationary models became accepted, the cosmological constant was thought to be irrelevant to the current universe.
Nearly all inflation models predict that the total (matter+energy) density of the universe should be very close to the critical density. During the 1980s, most cosmological research focused on models with critical density in matter only, usually 95% cold dark matter and 5% ordinary matter (baryons). These models were found to be successful at forming realistic galaxies and clusters, but some problems appeared in the late 1980s: notably, the model required a value for the Hubble constant lower than preferred by observations, and the model under-predicted observations of large-scale galaxy clustering. These difficulties became stronger after the discovery of anisotropy in the cosmic microwave background by the COBE spacecraft in 1992, and several modified CDM models came under active study through the mid-1990s: these included the Lambda-CDM model and a mixed cold/hot dark matter model. The first direct evidence for dark energy came from supernova observations in 1998 of accelerated expansion in Riess et al.[10] and in Perlmutter et al.,[11] and the Lambda-CDM model then became the leading model. Soon after, dark energy was supported by independent observations: in 2000, the BOOMERanG and Maxima cosmic microwave background experiments observed the first acoustic peak in the CMB, showing that the total (matter+energy) density is close to 100% of critical density. Then in 2001, the 2dF Galaxy Redshift Survey gave strong evidence that the matter density is around 30% of critical. The large difference between these two supports a smooth component of dark energy making up the difference. Much more precise measurements from WMAP in 2003–2010 have continued to support the standard model and give more accurate measurements of the key parameters.
The term "dark energy", echoing Fritz Zwicky's "dark matter" from the 1930s, was coined by Michael Turner in 1998.[53]
As of 2013, the Lambda-CDM model is consistent with a series of increasingly rigorous cosmological observations, including the Planck spacecraft and the Supernova Legacy Survey. First results from the SNLS reveal that the average behavior (i.e., equation of state) of dark energy behaves like Einstein's cosmological constant to a precision of 10%.[54] Recent results from the Hubble Space Telescope Higher-Z Team indicate that dark energy has been present for at least 9 billion years and during the period preceding cosmic acceleration.
The UN is using climate change as a tool not an issue
Written by Maurice Newman, The Australian on .
Original link: http://www.climatechangedispatch.com/the-un-is-using-climate-change-as-a-tool-not-an-issue.html
It’s a well-kept secret, but 95 per cent of the climate models we are told prove the link between human CO2 emissions and catastrophic global warming have been found, after nearly two decades of temperature stasis, to be in error. It’s not surprising.
We have been subjected to extravagance from climate catastrophists for close to 50 years.
In January 1970, Life magazine, based on “solid scientific evidence”, claimed that by 1985 air pollution would reduce the sunlight reaching the Earth by half. In fact, across that period sunlight fell by between 3 per cent and 5 per cent. In a 1971 speech, Paul Ehrlich said: “If I were a gambler I would take even money that England will not exist in the year 2000.”
Fast forward to March 2000 and David Viner, senior research scientist at the Climatic Research Unit, University of East Anglia, told The Independent, “Snowfalls are now a thing of the past.” In December 2010, the Mail Online reported, “Coldest December since records began as temperatures plummet to minus 10C bringing travel chaos across Britain”.
We’ve had our own busted predictions. Perhaps the most preposterous was climate alarmist Tim Flannery’s 2005 observation: “If the computer records are right, these drought conditions will become permanent in eastern Australia.” Subsequent rainfall and severe flooding have shown the records or his analysis are wrong. We’ve swallowed dud prediction after dud prediction. What’s more, the Intergovernmental Panel on Climate Change, which we were instructed was the gold standard on global warming, has been exposed repeatedly for misrepresentation and shoddy methods.
Weather bureaus appear to have “homogenised” data to suit narratives. NASA’s claim that 2014 was the warmest year on record was revised, after challenge, to only 38 per cent probability. Extreme weather events, once blamed on global warming, no longer are, as their frequency and intensity decline.
Why then, with such little evidence, does the UN insist the world spend
hundreds of billions of dollars a year on futile climate change
policies? Perhaps Christiana Figueres, executive secretary of the UN
Framework Convention on Climate Change, has the answer?
In Brussels last February she said, “This is the first time in the history of mankind that we are setting ourselves the task of intentionally, within a defined period of time, to change the economic development model that has been reigning for at least 150 years since the Industrial Revolution.”
In other words, the real agenda is concentrated political authority. Global warming is the hook.
Figueres is on record saying democracy is a poor political system for fighting global warming. Communist China, she says, is the best model. This is not about facts or logic. It’s about a new world order under the control of the UN. It is opposed to capitalism and freedom and has made environmental catastrophism a household topic to achieve its objective.
Figueres says that, unlike the Industrial Revolution, “This is a centralised transformation that is taking place.” She sees the US partisan divide on global warming as “very detrimental”. Of course. In her authoritarian world there will be no room for debate or disagreement.
Make no mistake, climate change is a must-win battlefield for authoritarians and fellow travellers. As Timothy Wirth, president of the UN Foundation, says: “Even if the (climate change) theory is wrong, we will be doing the right thing in terms of economic and environmental policy.”
Having gained so much ground, eco-catastrophists won’t let up. After all, they have captured the UN and are extremely well funded. They have a hugely powerful ally in the White House. They have successfully enlisted compliant academics and an obedient and gullible mainstream media (the ABC and Fairfax in Australia) to push the scriptures regardless of evidence.
They will continue to present the climate change movement as an independent, spontaneous consensus of concerned scientists, politicians and citizens who believe human activity is “extremely likely” to be the dominant cause of global warming. (“Extremely likely” is a scientific term?)
And they will keep mobilising public opinion using fear and appeals to morality. UN support will be assured through promised wealth redistribution from the West, even though its anti-growth policy prescriptions will needlessly prolong poverty, hunger, sickness and illiteracy for the world’s poorest.
Figueres said at a climate summit in Melbourne recently that she was “truly counting on Australia’s leadership” to ensure most coal stayed in the ground.
Hopefully, like India’s Prime Minister Narendra Modi, Tony Abbott isn’t listening. India knows the importance of cheap energy and is set to overtake China as the world’s leading importer of coal. Even Germany is commissioning more coal-fired power stations than at any time in the past 20 years.
There is a real chance Figueres and those who share her centralised power ambitions will succeed. As the UN’s December climate change conference in Paris approaches, Australia will be pressed to sign even more futile job-destroying climate change treaties.
Resisting will be politically difficult. But resist we should. We are already paying an unnecessary social and economic price for empty gestures. Enough is enough.
Maurice Newman is chairman of the Prime Minister’s Business Advisory Council. The views expressed here are his own.
6 Things You Probably Didn’t Know About Monsanto
Seed manufacturer Monsanto Company has been the target of a lot of criticism over the past few years, including a couple of articles that I wrote when I first started writing for Forward Progressives. In 2013, the first annual March Against Monsanto took place, supposedly in response to the failure of California Proposition 37 in 2012, which would have mandated labeling of foods that came from seeds that were genetically enhanced.
After a couple of articles on the subject in which I expressed concern over certain Monsanto practices, I was urged by people who have a background in science to “do some research” – so since then I’ve spent literally hundreds of hours researching not only Monsanto itself, but GE technology as well. As a result of this research, I came to the conclusion that Monsanto – and genetic engineering of seed technology – isn’t the horrible Frankenstein experiment the March Against Monsanto crowd would have people believe.
Here are a few of the things that I’ve learned that I want to share with you.
6. You’ll often hear about how Monsanto is suing farmers for alleged cross-contamination. However, out of the hundreds of thousands of farmers the company sells seed to annually, it sued only 144 between 1997 and 2010, and those suits were for violating its patent rights or contract with the company. The company also notes that of all those lawsuits, only 9 have gone to trial, and any recovered funds are donated.
Even though Monsanto’s policy of enforcing patents might seem strict, other companies that sell biotechnology-enhanced seeds enforce their patents too. In other words, it’s not some evil plot; it’s simply a business being a business. “Monsanto is not the only seed company that enforces its intellectual property rights,” Baucum said. “Pioneer, Agripro, Syngenta, all these companies have engaged in enforcement actions against other people who had violated their rights in seed and seed products they’re creating.”

Baucum also said people should weigh the small number of lawsuits against the “hundreds of thousands of people” to whom the company has licensed seed over the past ten years.
Overall, both Baucum and Reat agree growers are usually more than willing to settle matters with Monsanto representatives in a polite, respectable way out-of-court. “A lot of times growers are worried that Monsanto is going to take their farm, but we will do everything possible to reach a settlement in these matters,” Reat said.
Whether the farmer settles directly with Monsanto or the case goes to trial, the proceeds are donated to youth leadership initiatives, including scholarship programs. (Source)

5. Monsanto is not the only seed company out there that uses biotechnology to modify seed lines to create plants that are more resistant to drought and pests. Dow, Syngenta and Pioneer are just a few other companies that do the same thing, but you will probably never hear a March Against Monsanto activist talk about them. I wonder why that is?
4. Monsanto received a 100 percent rating from the Human Rights campaign for LGBT equality in the workplace in 2013, and this wasn’t a one-time fluke either.
It’s the fourth consecutive time the company has been designated a “Best Places to Work for LGBT Equality” by the Human Rights Campaign.
The campaign’s Corporate Equality Index rates companies based on LGBT-friendly policies and procedures. Monsanto, for example, offers domestic partner and transgender-inclusive health care coverage to employees.
“We are proud of our company’s diversity and our focus on inclusion to insure that every voice is heard and every person is treated equally as these are critical to our success,” said Nicole Ringenberg, Monsanto vice president and controller, as well as the executive sponsor for the company’s LGBT employee network, Encompass. “We’re thrilled to share the news that we are being recognized again by the Human Rights Campaign.”

3. Monsanto and GE technology have often been blamed for the decline of the monarch butterfly, but the actual decline is due to farming practices that have killed off much of the plant the butterflies depend on: the milkweed.
Milkweed is the only plant on which monarch butterflies will lay their eggs, and it is the primary food source for monarch caterpillars. Despite its necessity to the species, the plant decreased 21 percent in the United States between 1995 and 2013. Scientists, conservationists, and butterfly enthusiasts are encouraging people to grow the plant in their own yards and gardens.

Monsanto has since pledged $4 million to restore the habitat needed for monarch butterflies and is encouraging people to leave patches of milkweed intact whenever possible.
2. Monsanto is often vilified by Big Organic activists (yes, organic is a very real industry, with a global market of $63 billion. Been to Whole Foods lately?) as trying to starve or poison the world, but they’ve actually done a lot to combat hunger and promote agriculture in the developing world, including countries like Bangladesh.
Monsanto actually supports common sense labeling laws, but does not agree with labels lobbied for by the organic industry which attempts to vilify GE technology despite the absence of any scientifically proven risks – and no, the retracted Seralini study doesn’t count.
1. Monsanto doesn’t control the United States government, including the FDA. You’re thinking of defense contractors, the oil industry, and Wall Street. While it is true they’re a multi-billion dollar company and may have some lobbyists, they pale in comparison to companies like Exxon, Lockheed Martin, Verizon, or Goldman Sachs.
In the interest of fairness, Monsanto isn’t a perfect company. In their past, they’ve been involved in lawsuits over PCBs contaminating creeks from their chemical division Solutia, which was spun off in 1997 and is now owned by the Eastman Chemical Company (which itself was spun off from Eastman Kodak in 1994). Another surprising fact is that their transition from a chemical company that notoriously produced Agent Orange for the United States (along with other companies) as well as some other environmentally-damaging products, to a bio-tech corporation was partially steered by one Mitt Romney who worked for Bain Capital at the time. In other words, they’ve moved from a polluting chemical giant to a player in the green biotech world along with companies like Dow, Pioneer, Syngenta and others with the help of a former presidential candidate.
Many are also concerned about the active ingredient in Monsanto’s Roundup weed killer, saying it has been linked to cancer – but there’s more to that story as well. The U.S. Environmental Protection Agency’s official stance currently is that there’s no evidence that glyphosate can be linked to cancer. However, the UN’s International Agency for Research on Cancer declared in March that glyphosate “probably” raises the risk of cancer in people exposed. What’s important to note here is that the levels of exposure according to the UN’s findings have to be extremely high and sustained over some period of time, which means farmworkers are the main group who would be most at risk – if there is a risk. Due to the new findings from the UN, the EPA is reviewing glyphosate’s risks and expects to release a new assessment later this year, which could include new restrictions on use but will not call for an outright ban on the chemical.
Another common claim by the organics industry and their blogs is that GMOs are killing off the bees. That claim is false. Neonicotinoid pesticides (which are not a feature of GE seeds) have been implicated as a possible culprit in colony collapse, along with a variety of parasites and fungi. It’s also worth pointing out that rotenone, a plant-derived pesticide long used in the organic industry, has likely killed its fair share of pollinating insects. When I was a kid and we did organic farming, we used rotenone pretty heavily on crops, and nobody ever discussed how toxic the pesticide could be to the ecosystem, including bees.
Now, I understand that the vast majority of my audience is left of center and most of them laugh at conservatives who believe climate change is a liberal conspiracy, despite the fact that science has shown over and over again that climate change is a very real thing. What is very troubling is that these same people who believe Fox News viewers are idiots for denying climate change or evolution deny science themselves when it comes to topics like GE technology or even vaccines. It is also important to point out that corporations can and do change over time based on science, profit, and public image. That’s why Monsanto isn’t producing industrial insulators these days; it has forgone PCBs and now designs new strains of vegetables, including organic vegetables bred using traditional cross-breeding methods.
I support sustainable, locally sourced produce and independent farmers wholeheartedly every chance I get. Trust me, I’d much rather spend my money on food where I can go see the farm from which it came, but that’s because I prefer food that hasn’t been in a storage locker for months or possibly produced in countries using slave labor. However, it’s arrogant and ignorant to insist that farmers and consumers in developing countries follow first world ideals, which aren’t based on ethical concerns, but rather on the inability of some to understand science, business law, or how capitalism works.
There are important things for us as liberals and progressives to work on, including making sure that all Americans have access to fresh, nutritious food – but demonizing a company out of uninformed fear peddled by conspiracy nuts and snake oil salesmen like Dr. Oz takes us backwards, not forward.