
Wednesday, May 13, 2015

Researchers create first neural-network chip built just with memristors

May 07, 2015 by Bob Yirka
Original link:  http://phys.org/news/2015-05-neural-network-chip-built-memristors.html#inlRlv


A memristive neural network. The cartoon depicts a fragment of Prezioso and colleagues’ artificial neural network, which consists of crossing horizontal and vertical wires that have memristor devices (yellow) at the junctions.

(Phys.org)—A team of researchers working at the University of California (and one from Stony Brook University) has for the first time created a neural-network chip that was built using just memristors. In their paper published in the journal Nature, the team describes how they built their chip and what capabilities it has.

Memristors may sound like something from a sci-fi movie, but they actually exist—they are electronic analog memory devices modeled on human neurons and synapses. Human consciousness, some believe, is in reality nothing more than an advanced form of memory retention and processing, and it is analog, as opposed to computers, which of course are digital. The idea for memristors was first dreamed up by University of California professor Leon Chua back in 1971, but it was not until 2008 that a team working at Hewlett-Packard first built one. Since then, a lot of research has gone into studying the technology, but until now, no one had ever built a neural-network chip based exclusively on them.

Up till now, most neural networks have been software based. Google, Facebook and IBM, for example, are all working on computer systems running such learning networks, mostly meant to pick faces out of a crowd or to answer a question phrased by a human. While the gains in such technology have been obvious, the limiting factor is the hardware—as neural networks grow in size and complexity, they begin to tax the abilities of even the fastest computers. The next step, most in the field believe, is to replace transistors with memristors—each of which is able to learn, on its own, in ways similar to how neurons in the brain learn when presented with something new. Putting them on a chip would of course reduce the overhead needed to run such a network.

The new chip, the team reports, was created using transistor-free metal-oxide memristor crossbars and represents a basic neural network able to perform just one task—to learn and recognize patterns in very simple 3 × 3-pixel black and white images. The experimental chip, they add, is an important step towards the creation of larger neural networks that tap the real power of memristors. It also opens the possibility of building computers in lock-step with advances in research into how exactly our neurons work at their most basic level.
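
As a rough illustration of why crossbars suit neural networks, here is a minimal Python sketch of the analog dot product a memristor crossbar performs in hardware; the array size, conductances, and voltages are invented for the example and are not taken from the paper.

import numpy as np

# Idealized sketch: each crossbar junction has a conductance G[i, j]. Applying
# read voltages V[i] to the rows makes each column j collect a current
# I[j] = sum_i G[i, j] * V[i] -- a vector-matrix product done by physics.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(9, 3))   # 9 inputs (a 3 x 3 image), 3 outputs
V = rng.choice([0.0, 0.1], size=9)         # read voltages encoding a pixel pattern

I = V @ G                                  # column currents = analog dot products
print(I)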


More information: Training and operation of an integrated neuromorphic network based on metal-oxide memristors, Nature 521, 61–64 (07 May 2015) doi:10.1038/nature14441

Abstract
Despite much progress in semiconductor integrated circuit technology, the extreme complexity of the human cerebral cortex, with its approximately 10^14 synapses, makes the hardware implementation of neuromorphic networks with a comparable number of devices exceptionally challenging. To provide comparable complexity while operating much faster and with manageable power dissipation, networks based on circuits combining complementary metal-oxide-semiconductors (CMOSs) and adjustable two-terminal resistive devices (memristors) have been developed. In such circuits, the usual CMOS stack is augmented with one or several crossbar layers, with memristors at each crosspoint. There have recently been notable improvements in the fabrication of such memristive crossbars and their integration with CMOS circuits, including first demonstrations of their vertical integration. Separately, discrete memristors have been used as artificial synapses in neuromorphic networks. Very recently, such experiments have been extended to crossbar arrays of phase-change memristive devices. The adjustment of such devices, however, requires an additional transistor at each crosspoint, and hence these devices are much harder to scale than metal-oxide memristors, whose nonlinear current–voltage curves enable transistor-free operation. Here we report the experimental implementation of transistor-free metal-oxide memristor crossbars, with device variability sufficiently low to allow operation of integrated neural networks, in a simple network: a single-layer perceptron (an algorithm for linear classification). The network can be taught in situ using a coarse-grain variety of the delta rule algorithm to perform the perfect classification of 3 × 3-pixel black/white images into three classes (representing letters). This demonstration is an important step towards much larger and more complex memristive neuromorphic networks.
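
For a feel of the training scheme the abstract describes, here is a toy Python version of a single-layer perceptron trained with a coarse-grained delta rule on 3 × 3 black/white inputs and three outputs. The patterns, step size, and activation are illustrative stand-ins, not the stylized letters or voltage-pulse scheme used in the actual chip.

import numpy as np

# Three made-up 3x3 patterns, one per class (flattened row by row).
patterns = {
    0: [1, 1, 1, 0, 0, 1, 1, 1, 1],
    1: [1, 0, 1, 1, 1, 1, 1, 0, 1],
    2: [1, 1, 1, 1, 0, 0, 1, 1, 1],
}
X = np.array(list(patterns.values()), dtype=float)
T = np.eye(3)[list(patterns.keys())] * 2 - 1      # targets in {-1, +1}

W = np.zeros((9, 3))                              # "synaptic weights" (conductances on the chip)
b = np.zeros(3)
eta = 0.1                                         # fixed step, like a coarse write pulse

for epoch in range(50):
    for x, t in zip(X, T):
        out = np.tanh(x @ W + b)                  # soft response of the three outputs
        delta = t - out                           # delta rule: the error drives the update
        W += eta * np.outer(x, np.sign(delta))    # coarse-grained, fixed-size adjustments
        b += eta * np.sign(delta)

print(np.argmax(X @ W + b, axis=1))               # prints [0 1 2] once trained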

Study reveals how rivers regulate global carbon cycle

Original link:  http://phys.org/news/2015-05-reveals-rivers-global-carbon.html


River Kvirila at Sachkhere, Georgia. Credit: Wikipedia

Humans concerned about climate change are working to find ways of capturing excess carbon dioxide (CO2) from the atmosphere and sequestering it in the Earth. But Nature has its own methods for the removal and long-term storage of carbon, including the world's river systems, which transport decaying organic material and eroded rock from land to the ocean.
While river transport of carbon to the ocean is not on a scale that will bail humans out of our CO2 problem, we don't actually know how much carbon the world's rivers routinely flush into the ocean - an important piece of the global carbon cycle.

But in a study published May 14 in the journal Nature, scientists from Woods Hole Oceanographic Institution (WHOI) calculated the first direct estimate of how much carbon, and in what form, is exported to the ocean by rivers. The estimate will help modelers predict how the carbon export from global rivers may shift as Earth's climate changes.

“The world’s rivers act as Earth’s circulatory system, flushing carbon from land to the ocean and helping reduce the amount that returns to the atmosphere in the form of heat-trapping gases,” said lead author and geochemist Valier Galy. “Some of that carbon—'new' carbon—is from decomposed plant and soil material that is washed into the river and then out to sea. But some of it comes from carbon that has long been stored in the environment in the form of rocks— 'old' carbon—that have been eroded by weather and the force of the river.”

The scientists, who included Bernhard Peucker-Ehrenbrink and Timothy Eglinton (now at ETH Zürich), amassed data on sediments flowing out of 43 river systems all over the world, which cumulatively account for 20 percent of the total sediments discharged by rivers. The representative rivers also encompassed a broad range of climates, vegetation, geological conditions, and levels of disturbance by people.

From these river sediment flow measurements, the research team calculated the amounts of carbon-containing plant and rock debris that each river exported. They estimated that the world's rivers annually transport 200 megatons (200 million tons) of carbon to the ocean. That equals about 0.02 percent of the total mass of carbon in the atmosphere. It may not seem like much, but over 1,000 to 10,000 years it adds up to a significant amount of carbon (20 to 200 percent of the atmosphere's carbon) extracted from the atmosphere.
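
A quick back-of-envelope check of those figures; the atmospheric carbon stock of roughly 850 billion tons is an assumed round number used for illustration, not a value from the study.

# Rough check of the proportions quoted above.
riverine_export = 200e6        # tons of carbon per year (200 megatons, from the study)
atmospheric_carbon = 850e9     # tons of carbon in the atmosphere (assumed round figure)

annual_fraction = riverine_export / atmospheric_carbon
print(f"{annual_fraction:.2%} of atmospheric carbon per year")        # ~0.02%
for years in (1_000, 10_000):
    # compare with the 20 to 200 percent range cited in the article
    print(f"over {years:,} years: ~{annual_fraction * years:.0%}")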

Generally, plants convert CO2 from the atmosphere into organic carbon via photosynthesis. But most of this carbon eventually returns to the atmosphere when plant material (or the animals that eat plants) decomposes. A small fraction of this material, however, ends up in rivers, which carry it out to sea. There, some of it settles to the seafloor, where it is buried and disconnected from the atmosphere for millions of years before eventually making its way back to the surface in the form of rocks.

At the same time, rivers also erode carbon-containing rocks into particles carried downstream. The process exposes carbon to air, oxidizing the previously locked-up carbon into carbon dioxide that can leak back out to the atmosphere. Until now, scientists had no way to distinguish how much of the carbon whisked away by rivers comes from either the biospheric or petrogenic (rock) sources. Without this information, scientists' ability to model or quantitatively predict carbon sequestration under different scenarios was limited.

To solve this dilemma, the scientists found a novel way to distinguish for the first time the sources of that carbon—either from eroded rocks or from decomposed plant and soil material. They analyzed the amounts of carbon-14, a radioactive isotope, in the river particles. Carbon-14 decays away within about 60,000 years, so it is present only in material that came from living things, and not rocks. Subtracting the portion of particles that did not contain carbon-14, the scientists calculated the percentage that was derived from the terrestrial biosphere: about 80 percent.

But even though biospheric carbon is the major source of carbon exported by rivers, the scientists also discovered that rivers surrounded by greater amounts of vegetation didn't necessarily transport more carbon to the ocean. Instead, the export was "primarily controlled by the capacity of rivers to mobilize and transport" particles. Erosion is the key factor—the more erosion occurs along the river, the more carbon it transfers to sea and sequesters from the air.

"The atmosphere is a small reservoir of carbon compared to rocks, soils, the biosphere, and the ocean," the scientists wrote in Nature. "As such, its size is sensitive to small imbalances in the exchange with and between these larger reservoirs."

The new study gives scientists a firmer handle on measuring the important, and heretofore elusive, role of global rivers in the planetary carbon cycle, and enhances their ability to predict how riverine carbon export may shift as Earth's climate changes.

"This study will provide geochemical modelers with new insights on an important link between the global carbon and water cycles," says Don Rice, program director in the National Science Foundation's Division of Ocean Sciences, a major funder of the research.


More information: Global carbon export from the terrestrial biosphere controlled by erosion, Nature, DOI: 10.1038/nature14400

A metal composite that will (literally) float your boat

Original link:  http://phys.org/news/2015-05-metal-composite-literally-boat.html
The first metal matrix syntactic foam is so light it can float, strong enough to withstand the rigors of a marine environment, and resistant to heat, which makes it a candidate for automakers seeking to shed weight to improve fuel economy. The composite was the work of NYU Polytechnic School of Engineering Professor Nikhil Gupta and Deep Springs Technology (DST) in collaboration with the U.S. Army Research Laboratory. Credit: NYU Polytechnic School of Engineering


Researchers have demonstrated a new metal matrix composite that is so light that it can float on water. A boat made of such lightweight composites will not sink despite damage to its structure. The new material also promises to improve automotive fuel economy because it combines light weight with heat resistance.

Although syntactic foams have been around for many years, this is the first development of a lightweight metal matrix syntactic foam. It is the work of a team of researchers from Deep Springs Technology (DST) and the New York University Polytechnic School of Engineering.

Their magnesium alloy matrix composite is reinforced with hollow particles and has a density of only 0.92 grams per cubic centimeter compared to 1.0 g/cc of water. Not only does it have a density lower than that of water, it is strong enough to withstand the rigorous conditions faced in the marine environment.
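
By Archimedes' principle a floating solid sinks until it displaces its own weight of water, so the submerged fraction is simply the density ratio. A one-line check using the densities quoted above:

# Floating depth follows directly from the density ratio.
rho_composite = 0.92   # g/cc, the magnesium-alloy syntactic foam
rho_water = 1.00       # g/cc, fresh water (seawater, ~1.025, would float it slightly higher)

print(f"floats with {rho_composite / rho_water:.0%} of its volume submerged")   # ~92%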

Significant efforts in recent years have focused on developing lightweight polymer matrix composites to replace heavier metal-based components in automobiles and marine vessels. The technology for the new composite is very close to maturation and could be put into prototypes for testing within three years. Amphibious vehicles such as the Ultra Heavy-lift Amphibious Connector (UHAC) being developed by the U.S. Marine Corps can especially benefit from the light weight and high buoyancy offered by the new syntactic foams, the researchers explained.

"This new development of very light metal matrix composites can swing the pendulum back in favor of metallic materials," forecasted Nikhil Gupta, an NYU School of Engineering professor in the Department of Mechanical and Aerospace Engineering and the study's co-author. "The ability of metals to withstand higher temperatures can be a huge advantage for these composites in engine and exhaust components, quite apart from structural parts."

The syntactic foam made by DST and NYU captures the lightness of foams, but adds substantial strength. The secret of this syntactic foam starts with a matrix made of a magnesium alloy, which is then turned into foam by adding strong, lightweight silicon carbide hollow spheres developed and manufactured by DST. A single sphere's shell can withstand pressure of over 25,000 pounds per square inch (PSI) before it ruptures—one hundred times the maximum pressure in a fire hose.

The hollow particles also offer impact protection to the syntactic foam because each shell acts like an energy absorber during its fracture. The composite can be customized for density and other properties by adding more or fewer shells into the metal matrix to fit the requirements of the application. This concept can also be used with other magnesium alloys that are non-flammable.
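
A simple rule-of-mixtures estimate shows how the sphere loading sets the overall density. The endmember densities below are assumed round values chosen for illustration, not figures reported by the researchers.

# Volume-weighted average of matrix and hollow-sphere densities (rule of mixtures).
rho_matrix = 1.8    # g/cc, assumed magnesium-alloy density
rho_sphere = 0.6    # g/cc, assumed effective density of a hollow SiC sphere

def composite_density(sphere_volume_fraction):
    vf = sphere_volume_fraction
    return vf * rho_sphere + (1.0 - vf) * rho_matrix

for vf in (0.4, 0.6, 0.73, 0.8):
    print(f"sphere fraction {vf:.0%}: {composite_density(vf):.2f} g/cc")
# With these assumed endmembers, roughly 73% spheres lands near the 0.92 g/cc quoted above.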

The new composite has potential applications in boat flooring, automobile parts, and buoyancy modules as well as vehicle armor.

The authors recently published their findings in the International Journal of Impact Engineering. "Dynamic Properties of Silicon Carbide Hollow Particle Filled Magnesium Alloy (AZ91D) Matrix Syntactic Foams" is available here.

Genetically Engineering Almost Anything

Original link:  http://www.pbs.org/wgbh/nova/next/evolution/crispr-gene-drives/

When it comes to genetic engineering, we’re amateurs. Sure, we’ve known about DNA’s structure for more than 60 years, we first sequenced every A, T, C, and G in our bodies more than a decade ago, and we’re becoming increasingly adept at modifying the genes of a growing number of organisms.

But compared with what’s coming next, all that will seem like child’s play. A new technology just announced today has the potential to wipe out diseases, turn back evolutionary clocks, and reengineer entire ecosystems, for better or worse. Because of how deeply this could affect us all, the scientists behind it want to start a discussion now, before all the pieces come together over the next few months or years. This is a scientific discovery being played out in real time.

Scientists have figured out how to use a cell's DNA repair mechanisms to spread traits throughout a population.
Today, researchers aren’t just dropping in new genes; they’re deftly adding, subtracting, and rewriting them using a series of tools that have become ever more versatile and easier to use. In the last few years, our ability to edit genomes has improved at a shockingly rapid clip. So rapid, in fact, that one of the easiest and most popular tools, known as CRISPR-Cas9, is just two years old. Researchers once spent months, even years, attempting to rewrite an organism’s DNA. Now they spend days.
Soon, though, scientists will begin combining gene editing with gene drives, so-called selfish genes that appear more frequently in offspring than normal genes, which have about a 50-50 chance of being passed on. With gene drives—so named because they drive a gene through a population—researchers just have to slip a new gene into a drive system and let nature take care of the rest. Subsequent generations of whatever species we choose to modify—frogs, weeds, mosquitoes—will have more and more individuals with that gene until, eventually, it’s everywhere.
Cas9-based gene drives could be one of the most powerful technologies ever discovered by humankind. “This is one of the most exciting confluences of different theoretical approaches in science I’ve ever seen,” says Arthur Caplan, a bioethicist at New York University. “It merges population genetics, genetic engineering, molecular genetics, into an unbelievably powerful tool.”

We’re not there yet, but we’re extraordinarily close. “Essentially, we have done all of the pieces, sometimes in the same relevant species,” says Kevin Esvelt, a postdoc at Harvard University and the wunderkind behind the new technology. “It’s just no one has put it all together.”

It’s only a matter of time, though. The field is progressing rapidly. “We could easily have laboratory tests within the next few months and then field tests not long after that,” says George Church, a professor at Harvard University and Esvelt’s advisor. “That’s if everybody thinks it’s a good idea.”

It’s likely not everyone will think this is a good idea. “There are clearly people who will object,” Caplan says. “I think the technique will be incredibly controversial.” Which is why Esvelt, Church, and their collaborators are publishing papers now, before the different parts of the puzzle have been assembled into a working whole.

“If we’re going to talk about it at all in advance, rather than in the past tense,” Church says, “now is the time.”

“Deleterious Genes”

The first organism Esvelt wants to modify is the malaria-carrying mosquito Anopheles gambiae. While his approach is novel, the idea of controlling mosquito populations through genetic modification has actually been around since the late 1970s. Then, Edward F. Knipling, an entomologist with the U.S. Department of Agriculture, published a substantial handbook with a chapter titled “Use of Insects for Their Own Destruction.” One technique, he wrote, would be to modify certain individuals to carry “deleterious genes” that could be passed on generation after generation until they pervaded the entire population. It was an idea before its time. Knipling was on the right track, but he and his contemporaries lacked the tools to see it through.

The concept surfaced a few more times before being picked up by Austin Burt, an evolutionary biologist and population geneticist at Imperial College London. It was the late 1990s, and Burt was busy with his yeast cells, studying their so-called homing endonucleases, enzymes that facilitate the copying of genes that code for themselves. Self-perpetuating genes, if you will. “Through those studies, gradually, I became more and more familiar with endonucleases, and I came across the idea that you might be able to change them to recognize new sequences,” Burt recalls.

Other scientists were investigating endonucleases, too, but not in the way Burt was. “The people who were thinking along those lines, molecular biologists, were thinking about using these things for gene therapy,” Burt says. “My background in population biology led me to think about how they could be used to control populations that were particularly harmful.”

In 2003, Burt penned an influential article that set the course for an entire field: We should be using homing endonucleases, a type of gene drive, to modify malaria-carrying mosquitoes, he said, not ourselves. Burt saw two ways of going about it—one, modify a mosquito’s genome to make it less hospitable to malaria, and two, skew the sex ratio of mosquito populations so there are no females for the males to reproduce with. In the following years, Burt and his collaborators tested both in the lab and with computer models before they settled on sex ratio distortion. (Making mosquitoes less hospitable to malaria would likely be a stopgap measure at best; the Plasmodium protozoans could evolve to cope with the genetic changes, just like they have evolved resistance to drugs.)

Burt has spent the last 11 years refining various endonucleases, playing with different scenarios of inheritance, and surveying people in malaria-infested regions. Now, he finally feels like he is closing in on his ultimate goal.
“There’s a lot to be done still,” he says. “But on the scale of years, not months or decades.”

Cheating Natural Selection

Cas9-based gene drives could compress that timeline even further. One half of the equation—gene drives—are the literal driving force behind proposed population-scale genetic engineering projects. They essentially let us exploit evolution to force a desired gene into every individual of a species. “To anthropomorphize horribly, the goal of a gene is to spread itself as much as possible,” Esvelt says. “And in order to do that, it wants to cheat inheritance as thoroughly as it can.” Gene drives are that cheat.

Without gene drives, traits in genetically-engineered organisms released into the wild are vulnerable to dilution through natural selection. For organisms that have two parents and two sets of chromosomes (which includes humans, many plants, and most animals), traits typically have only a 50-50 chance of being inherited, give or take a few percent. Genes inserted by humans face those odds when it comes time to being passed on. But when it comes to survival in the wild, a genetically modified organism’s odds are often less than 50-50. Engineered traits may be beneficial to humans, but ultimately they tend to be detrimental to the organism without human assistance. Even some of the most painstakingly engineered transgenes will be gradually but inexorably eroded by natural selection.
Some naturally occurring genes, though, have over millions of years learned how to cheat the system, inflating their odds of being inherited. Burt’s “selfish” endonucleases are one example. They take advantage of the cell’s own repair machinery to ensure that they show up on both chromosomes in a pair, giving them better than 50-50 odds when it comes time to reproduce.
 
A gene drive (blue) always ends up in all offspring, even if only one parent has it. That means that, given enough generations, it will eventually spread through the entire population.
 
Here’s how it generally works. The term “gene drive” is fairly generic, describing a number of different systems, but one example involves genes that code for an endonuclease—an enzyme which acts like a pair of molecular scissors—sitting in the middle of a longer sequence of DNA that the endonuclease is programmed to recognize. If one chromosome in a pair contains a gene drive but the other doesn’t, the endonuclease cuts the second chromosome’s DNA where the endonuclease code appears in the first.

The broken strands of DNA trigger the cell’s repair mechanisms. In certain species and circumstances, the cell unwittingly uses the first chromosome as a template to repair the second. The repair machinery, seeing the loose ends that bookend the gene drive sequence, thinks the middle part—the code for the endonuclease—is missing and copies it onto the broken chromosome. Now both chromosomes have the complete gene drive. The next time the cell divides, splitting its chromosomes between the two new cells, both new cells will end up with a copy of the gene drive, too. If the entire process works properly, the gene drive’s odds of inheritance aren’t 50%, but 100%.
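
The arithmetic of that cheat is easy to see in a toy simulation. The Python sketch below compares an ideal, 100%-efficient drive with ordinary Mendelian inheritance under random mating; the population size, seeding, constant population, and absence of fitness costs are all illustrative simplifications, not parameters from any real proposal.

import random

def allele_frequency_over_time(pop_size=10_000, seeded=100, generations=12, drive=True):
    # genotype = number of drive (or marked) alleles an individual carries: 0, 1 or 2
    pop = [2] * seeded + [0] * (pop_size - seeded)
    history = []
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            child = 0
            for parent in (random.choice(pop), random.choice(pop)):
                if parent == 2:
                    child += 1                     # homozygote always transmits the allele
                elif parent == 1:
                    # heterozygote: an ideal drive homes onto the partner chromosome,
                    # otherwise it is ordinary 50/50 Mendelian segregation
                    child += 1 if drive else random.randint(0, 1)
            new_pop.append(child)
        pop = new_pop
        history.append(sum(pop) / (2 * pop_size))  # allele frequency this generation
    return history

print("with drive:    ", [round(f, 2) for f in allele_frequency_over_time(drive=True)])
print("Mendelian only:", [round(f, 2) for f in allele_frequency_over_time(drive=False)])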

Here, a mosquito with a gene drive (blue) mates with a mosquito without one (grey). In the offspring, one chromosome will have the drive. The endonuclease then slices into the drive-free DNA. When the strand gets repaired, the cell's machinery uses the drive chromosome as a template, unwittingly copying the drive into the break.

Most natural gene drives are picky about where on a strand of DNA they’ll cut, so they need to be modified if they’re to be useful for genetic engineering. For the last few years, geneticists have tried using genome-editing tools to build custom gene drives, but the process was laborious and expensive. With the discovery of CRISPR-Cas9 as a genome editing tool in 2012, though, that barrier evaporated. CRISPR is an ancient bacterial immune system which identifies the DNA of invading viruses and sends in an endonuclease, like Cas9, to chew it up. Researchers quickly realized that Cas9 could easily be reprogrammed to recognize nearly any sequence of DNA. All that’s needed is the right RNA sequence—easily ordered and shipped overnight—which Cas9 uses to search a strand of DNA for where to cut. This flexibility, Esvelt says, “lets us target, and therefore edit, pretty much anything we want.” And quickly.
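
As a toy illustration of that "search a strand of DNA for where to cut" step, the snippet below scans an invented sequence for a 20-letter guide match followed by an NGG PAM motif; both the guide and the target are made up for the example.

import re

guide = "GATTACAGATTACAGATTAC"                      # hypothetical 20-nt spacer sequence
genome = "TTTGCCGATTACAGATTACAGATTACTGGAAACCC"      # hypothetical stretch of target DNA

# Cas9 needs the protospacer to sit next to an NGG PAM; it cuts ~3 bp upstream of the PAM.
pattern = re.compile(re.escape(guide) + "[ACGT]GG")
for m in pattern.finditer(genome):
    cut = m.start() + len(guide) - 3
    print(f"guide matches at position {m.start()}, blunt cut between {cut - 1} and {cut}")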

Gene drives and Cas9 are each powerful on their own, but together they could significantly change biology. CRISPR-Cas9 allows researchers to edit genomes with unprecedented speed, and gene drives allow engineered genes to cheat the system, even if the altered gene weakens the organism. Simply by being coupled to a gene drive, an engineered gene can race throughout a population before it is weeded out. “Eventually, natural selection will win,” Esvelt says, but “gene drives just let us get ahead of the game.”

Beyond Mosquitoes

If there’s anywhere we could use a jump start, it’s in the fight against malaria. Each year, the disease kills over 200,000 people and sickens over 200 million more, most of whom are in Africa. The best new drugs we have to fight it are losing ground; the Plasmodium parasite is evolving resistance too quickly. And we’re nowhere close to releasing an effective vaccine. The direct costs of treating the disease are estimated at $12 billion, and the economies of affected countries grew 1.3% less per year, a substantial amount.

Which is why Esvelt and Burt are both so intently focused on the disease. “If we target the mosquito, we don’t have to face resistance on the parasite itself. The idea is, we can just take out the vector and stop all transmission. It might even lead to eradication,” Esvelt says.

Esvelt initially mulled over the idea of building Cas9-based gene drives in mosquitoes to do just that. He took the idea to Flaminia Catteruccia, a professor who studies malaria at the Harvard School of Public Health, and the two grew increasingly certain that such a system would not only work, but work well. As their discussions progressed, though, Esvelt realized they were “missing the forest for the trees.” Controlling malaria-carrying mosquitoes was just the start. Cas9-based gene drives were the real breakthrough. “If it lets us do this for mosquitoes, what is to stop us from potentially doing it for almost anything that is sexually reproducing?” he realized.
In theory, nothing. But in reality, the system works best on fast-reproducing species, Esvelt says. Short generation times allow the trait to spread throughout a population more quickly. Mosquitoes are a perfect test case. If everything were to work perfectly, deleterious traits could sweep through populations of malaria-carrying mosquitoes in as few as five years, wiping them off the map.

Other noxious species could be candidates, too. Certain invasive species, like mosquitoes in Hawaii or Asian carp in the Great Lakes, could be targeted with Cas9-based gene drives to either reduce their numbers or eliminate them completely. Agricultural weeds like horseweed that have evolved resistance to glyphosate, a herbicide that is broken down quickly in the soil, could have their susceptibility to the compound reintroduced, enabling more farmers to adopt no-till practices, which help conserve topsoil. And in the more distant future, Esvelt says, weeds could even be engineered to introduce vulnerabilities to completely benign substances, eliminating the need for toxic pesticides. The possibilities seem endless.

The Decision

Before any of that can happen, though, Esvelt and Church are adamant that the public help decide whether the research should move forward. “What we have here is potentially a general tool for altering wild populations,” Esvelt says. “We really want to make sure that we proceed down this path—if we decide to proceed down this path—as safely and responsibly as possible.”

To kickstart the conversation, they partnered with the MIT political scientist Kenneth Oye and others to convene a series of workshops on the technology. “I thought it might be useful to get into the room people with slightly different material interests,” Oye says, so they invited regulators, nonprofits, companies, and environmental groups. The idea, he says, was to get people to meet several times, to gain trust before “decisions harden.” Despite the diverse viewpoints, Oye says there was surprising agreement among participants about what the important outstanding questions were.

As the discussion enters the public sphere, tensions are certain to intensify. “I don’t care if it’s a weed or a blight, people still are going to say this is way too massive a genetic engineering project,” Caplan says. “Secondly, it’s altering things that are inherited, and that’s always been a bright line for genetic engineering.” Safety, too, will undoubtedly be a concern. As the power of a tool increases, so does its potential for catastrophe, and Cas9-based gene drives could be extraordinarily powerful.

There’s also little in the way of precedent that we can use as a guide. Our experience with genetically modified foods would seem to be a good place to start, but they are relatively niche organisms that are heavily dependent on water and fertilizer. It’s pretty easy to keep them contained to a field. Not so with wild organisms; their potential to spread isn’t as limited.

Aware of this, Esvelt and his colleagues are proposing a number of safeguards, including reversal drives that can undo earlier engineered genes. “We need to really make sure those work if we’re proposing to build a drive that is intended to modify a wild population,” Esvelt says.

There are still other possible hurdles to surmount—lab-grown mosquitoes may not interbreed with wild ones, for example—but given how close this technology is to prime time, Caplan suggests researchers hew to a few initial ethical guidelines. One, use species that are detrimental to human health and don’t appear to fill a unique niche in the wild. (Malaria-carrying mosquitoes seem to fit that description.) Two, do as much work as possible using computer models. And three, researchers should continue to be transparent about their progress, as they have been. “I think the whole thing is hugely exciting,” Caplan says. “But the time to really get cracking on the legal/ethical infrastructure for this technology is right now.”

Church agrees, though he’s also optimistic about the potential for Cas9-based gene drives. “I think we need to be cautious with all new technologies, especially all new technologies that are messing with nature in some way or another. But there’s also a risk of doing nothing,” Church says. “We have a population of 7 billion people. You have to deal with the environmental consequences of that.”

Tuesday, May 12, 2015

22 Very Inconvenient Climate Truths


Original link (for more detail):  http://wattsupwiththat.com/2015/05/12/22-very-inconvenient-climate-truths/

Here are 22 good reasons not to believe the statements made by the Intergovernmental Panel on Climate Change (IPCC)
Guest essay by Jean-Pierre Bardinet.

According to the official statements of the IPCC, “Science is clear” and non-believers cannot be trusted.

Quick action is needed! For more than 30 years we have been told that we must act quickly and that after the next three or five years it will be too late (or even after the next 500 days according to the French Minister of foreign affairs speaking in 2014) and the Planet will be beyond salvation and become a frying pan -on fire- if we do not drastically reduce our emissions of CO2, at any cost, even at the cost of economic decline, ruin and misery.

But anyone with some scientific background who takes pains to study the topics at hand is quickly led to conclude that the arguments of the IPCC are inaccurate, for many reasons of which here is a non-exhaustive list.

The 22 Inconvenient Truths

1. The Mean Global Temperature has been stable since 1997, despite a continuous increase of the CO2 content of the air: how could one say that the increase of the CO2 content of the air is the cause of the increase of the temperature? (discussion: p. 4)

2. 57% of the cumulative anthropic emissions since the beginning of the Industrial Revolution have been emitted since 1997, yet the temperature has been stable. How can one maintain that anthropic CO2 emissions (or cumulative anthropic emissions) cause an increase of the Mean Global Temperature?

[Note 1: since 1880 the only one period where Global Mean Temperature and CO2 content of the air increased simultaneously has been 1978-1997. From 1910 to 1940, the Global Mean Temperature increased at about the same rate as over 1978-1997, while CO2 anthropic emissions were almost negligible. Over 1950-1978 while CO2 anthropic emissions increased rapidly the Global Mean Temperature dropped. From Vostok and other ice cores we know that it’s the increase of the temperature that drives the subsequent increase of the CO2 content of the air, thanks to ocean out-gassing, and not the opposite. The same process is still at work nowadays] (discussion: p. 7)

3. The amount of CO2 of the air from anthropic emissions is today no more than 6% of the total CO2 in the air (as shown by the isotopic ratios 13C/12C) instead of the 25% to 30% said by IPCC. (discussion: p. 9)

4. The lifetime of CO2 molecules in the atmosphere is about 5 years instead of the 100 years said by IPCC. (discussion: p. 10)

5. The changes of the Mean Global Temperature are more or less sinusoidal with a well defined 60 year period. We are at a maximum of the sinusoid(s) and hence the next years should be cooler as has been observed after 1950. (discussion: p. 12)

6. The absorption of the radiation from the surface by the CO2 of the air is nearly saturated. Measuring with a spectrometer what is left of the radiation from a broadband infrared source (say a black body heated to 1000°C) after it crosses the equivalent of some tens or hundreds of meters of air shows that the main CO2 bands (4.3 µm and 15 µm) have been replaced by the emission spectrum of the CO2, which is radiated at the temperature of the trace gas. (discussion: p. 14)

7. In some geological periods the CO2 content of the air has been up to 20 times today’s content, and there has been no runaway temperature increase! Why would our CO2 emissions have a cataclysmic impact? The laws of Nature are the same whatever the place and the time. (discussion: p. 17)

8. The sea level is increasing by about 1.3 mm/year according to the data of the tide-gauges (after correction of the emergence or subsidence of the rock to which the tide gauge is attached, nowadays precisely known thanks to high precision GPS instrumentation); no acceleration has been observed during the last decades; the raw measurements at Brest since 1846 and at Marseille since the 1880s are slightly less than 1.3 mm/year. (discussion: p. 18)

9. The “hot spot” in the inter-tropical high troposphere is, according to all “models” and to the IPCC reports, the indubitable proof of the water vapour feedback amplification of the warming: it has not been observed and does not exist. (discussion: p. 20)

10. The water vapour content of the air has been roughly constant since more than 50 years but the humidity of the upper layers of the troposphere has been decreasing: the IPCC foretold the opposite to assert its “positive water vapour feedback” with increasing CO2. The observed “feedback” is negative. (discussion: p.22)

11. The maximum surface of the Antarctic ice-pack has been increasing every year since we have satellite observations. (discussion: p. 24)

12. The sum of the surfaces of the Arctic and Antarctic icepacks is about constant, their trends are phase-opposite; hence their total albedo is about constant. (discussion: p. 25)

13. The measurements from the 3000 oceanic ARGO buoys since 2003 may suggest a slight decrease of the oceanic heat content between the surface and a depth 700 m with very significant regional differences. (discussion: p. 27)

14. The observed outgoing longwave emission (or thermal infrared) of the globe is increasing, contrary to what models say on a would-be “radiative imbalance”; the “blanket” effect of CO2 or CH4 “greenhouse gases” is not seen. (discussion:p. 29)

15. The Stefan Boltzmann formula does not apply to gases, as they are neither black bodies, nor grey bodies: why does the IPCC community use it for gases ? (discussion: p. 30)

16. The trace gases absorb the radiation of the surface and radiate at the temperature of the air, which is, at some height, most of the time slightly lower than that of the surface. The trace gases cannot “heat the surface,” according to the second principle of thermodynamics, which prohibits heat transfer from a cooler body to a warmer body. (discussion: p. 32)

17. The temperatures have always driven the CO2 content of the air, never the reverse. Nowadays the net increment of the CO2 content of the air follows very closely the inter-tropical temperature anomaly. (discussion: p. 33)

18. The CLOUD project at the European Center for Nuclear Research is probing the Svensmark-Shaviv hypothesis on the role of cosmic rays modulated by the solar magnetic field on the low cloud coverage; the first and encouraging results have been published in Nature. (discussion: p. 36)

19. Numerical “Climate models” are not consistent regarding cloud coverage which is the main driver of the surface temperatures. Project Earthshine (Earthshine is the ghostly glow of the dark side of the Moon) has been measuring changes of the terrestrial albedo in relation to cloud coverage data; according to cloud coverage data available since 1983, the albedo of the Earth has decreased from 1984 to 1998, then increased up to 2004 in sync with the Mean Global Temperature. (discussion: p. 37)

20. The forecasts of the “climate models” are diverging more and more from the observations. A model is not a scientific proof of a fact and if proven false by observations (or falsified) it must be discarded, or audited and corrected. We are still waiting for the IPCC models to be discarded or revised; but alas IPCC uses the models financed by the taxpayers both to “prove” attributions to greenhouse gas and to support forecasts of doom. (discussion: p. 40)

21. As said by IPCC in its TAR (2001) “we are dealing with a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.” Has this state of affairs changed since 2001? Surely not for scientific reasons. (discussion: p. 43)

22. Last but not least, the IPCC is neither a scientific organization nor an independent organization: the summary for policy makers, the only part of the report read by international organizations, politicians and the media, is written under the very close supervision of the representatives of the countries and of the non-governmental pressure groups.

The governing body of the IPCC is made up of a minority of scientists, almost all of them promoters of the environmentalist ideology, and a majority of state representatives and non-governmental green organizations. (discussion: p. 46)

Appendix

Jean Poitou and François-Marie Bréon are distinguished members of the climate establishment and authors of parts of the IPCC Fifth Assessment Report (AR5).

Jean Poitou is a physicist, a graduate of the Ecole Supérieure de Physique et Chimie (a physics and chemistry engineering college), and a climatologist at the Laboratory of Climate and Environment Sciences at IPSL, a joint research lab of CEA, CNRS, and UVSQ (*). He has written a book on climate for secondary-school teachers.

François-Marie Bréon, at CEA since 1993, has published 85 articles, is Directeur de Recherche at CNRS, and is an author of the 2013 IPCC report; he has been scientific manager of the ICARE group (CNES, CNRS, University of Lille) and of the POLDER and MicroCarb space missions.

The con in consensus: Climate change consensus among the misinformed is not worth much


Original link:  http://business.financialpost.com/fp-comment/climate-change-consensus-among-the-misinformed-is-not-worth-much


Not only is there no 97 per cent consensus among climate scientists; many misunderstand core issues

In the lead-up to the Paris climate summit, massive activist pressure is on all governments, especially Canada’s, to fall in line with the global warming agenda and accept emission targets that could seriously harm our economy. One of the most powerful rhetorical weapons being deployed is the claim that 97 per cent of the world’s scientists agree what the problem is and what we have to do about it. In the face of such near-unanimity, it would be understandable if Prime Minister Harper and the Canadian government were simply to capitulate and throw Canada’s economy under the climate change bandwagon. But it would be a tragedy because the 97 per cent claim is a fabrication.

Like so much else in the climate change debate, one needs to check the numbers. First of all, on what exactly are 97 per cent of experts supposed to agree? In 2013 President Obama sent out a tweet claiming 97 per cent of climate experts believe global warming is “real, man-made and dangerous.” As it turns out the survey he was referring to didn’t ask that question, so he was basically making it up. At a recent debate in New Orleans I heard climate activist Bill McKibben claim there was a consensus that greenhouse gases are “a grave danger.” But when challenged for the source of his claim, he promptly withdrew it.

The Intergovernmental Panel on Climate Change asserts the conclusion that most (more than 50 per cent) of the post-1950 global warming is due to human activity, chiefly greenhouse gas emissions and land use change. But it does not survey its own contributors, let alone anyone else, so we do not know how many experts agree with it. And the statement, even if true, does not imply that we face a crisis requiring massive restructuring of the worldwide economy. In fact it is consistent with the view that the benefits of fossil fuel use greatly outweigh the climate-related costs.

One commonly-cited survey asked if carbon dioxide is a greenhouse gas and human activities contribute to climate change. But these are trivial statements that even many IPCC skeptics agree with. And again, both statements are consistent with the view that climate change is harmless. So there are no policy implications of such surveys, regardless of the level of agreement.


The most highly-cited paper supposedly found 97 per cent of published scientific studies support man-made global warming. But in addition to poor survey methodology, that tabulation is often misrepresented. Most papers (66 per cent) actually took no position. Of the remaining 34 per cent, 33 per cent supported at least a weak human contribution to global warming. So divide 33 by 34 and you get 97 per cent, but this is unremarkable since the 33 per cent includes many papers that critique key elements of the IPCC position.
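
Written out, the arithmetic the column describes is just this (the percentages are those quoted above):

no_position = 66.0       # per cent of abstracts taking no position
took_position = 34.0     # per cent taking any position at all
endorsed = 33.0          # per cent endorsing at least a weak human contribution

print(f"headline figure: {endorsed / took_position:.0%}")          # ~97 per cent
print(f"share of all abstracts endorsing: {endorsed / 100:.0%}")   # 33 per cent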

Two recent surveys shed more light on what atmospheric scientists actually think. Bear in mind that on a topic as complex as climate change, a survey is hardly a reliable guide to scientific truth, but if you want to know how many people agree with your view, a survey is the only way to find out.

In 2012 the American Meteorological Society (AMS) surveyed its 7,000 members, receiving 1,862 responses. Of those, only 52 per cent said they think global warming over the 20th century has happened and is mostly manmade (the IPCC position). The remaining 48 per cent either think it happened but natural causes explain at least half of it, or it didn’t happen, or they don’t know. Furthermore, 53 per cent agree that there is conflict among AMS members on the question.

So no sign of a 97 per cent consensus. Not only do about half reject the IPCC conclusion, more than half acknowledge that their profession is split on the issue.

The Netherlands Environmental Agency recently published a survey of international climate experts. 6550 questionnaires were sent out, and 1868 responses were received, a similar sample and response rate to the AMS survey. In this case the questions referred only to the post-1950 period. 66 per cent agreed with the IPCC that global warming has happened and humans are mostly responsible. The rest either don’t know or think human influence was not dominant. So again, no 97 per cent consensus behind the IPCC.

But the Dutch survey is even more interesting because of the questions it raises about the level of knowledge of the respondents. Although all were described as “climate experts,” a large fraction only work in connected fields such as policy analysis, health and engineering, and may not follow the primary physical science literature.


Regarding the recent slowdown in warming, here is what the IPCC said: “The observed global mean surface temperature (GMST) has shown a much smaller increasing linear trend over the past 15 years than over the past 30 to 60 years.” Yet 46 per cent of the Dutch survey respondents – nearly half – believe the warming trend has stayed the same or increased. And only 25 per cent agreed that global warming has been less than projected over the past 15 to 20 years, even though the IPCC reported that 111 out of 114 model projections overestimated warming since 1998.

Three quarters of respondents disagreed or strongly disagreed with the statement “Climate is chaotic and cannot be predicted.” Here is what the IPCC said in its 2001 report: “In climate research and modelling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”

Looking in further detail, there are other interesting ways in which the so-called experts are unaware of unresolved discrepancies between models and observations on issues like warming in the tropical troposphere and overall climate sensitivity.
 
What can we take away from all this? First, lots of people get called “climate experts” and contribute to the appearance of consensus, without necessarily being knowledgeable about core issues. A consensus among the misinformed is not worth much.
Second, it is obvious that the “97 per cent” mantra is untrue. The underlying issues are so complex it is ludicrous to expect unanimity. The near 50/50 split among AMS members on the role of greenhouse gases is a much more accurate picture of the situation. The phony claim of 97 per cent consensus is mere political rhetoric aimed at stifling debate and intimidating people into silence.

The Canadian government has the unenviable task of defending the interest of the energy producers and consumers of a cold, thinly-populated country, in the face of furious, deafening global warming alarmism. Some of the worst of it is now emanating from the highest places. Barack Obama’s website (barackobama.com) says “97 per cent of climate scientists agree that climate change is real and man-made…Find the deniers near you — and call them out today.” How nice. But what we really need to call out is the use of false propaganda and demagoguery to derail factual debate and careful consideration of all facets of the most complex scientific and policy issue of our time.

Ross McKitrick is a professor of economics at the University of Guelph, a senior fellow at the Fraser Institute and an adjunct scholar of the Cato Institute.

Monday, May 11, 2015

Empirical Evidence: Oceans Make Climate

Updated May 11 with text added at the end.
You only have to compare Sea Surface Temperatures (SST) from HADSST3 with estimates of Global Mean Surface Temperatures (GMST) from Hadcrut4 and RSS.

This first graph shows how global SST has varied since 1850. There are obvious changepoints where the warming or cooling periods have occurred.

This graph shows in green Hadcrut4 estimates of global surface temperature, combining ocean SST with near-surface air temperatures over land. The blue line from RSS tracks lower-tropospheric air temperatures measured by satellites, not near the surface but many meters higher. Finally, the red line is again HADSST3 global SST. All lines use 30-month averages to reduce annual noise and display longer-term patterns.
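
For readers who want to reproduce this kind of comparison, here is a sketch of the smoothing step in Python; the file names and column labels are placeholders, since each dataset has its own format and needs its own parsing.

import pandas as pd

def smooth(series, window=30):
    """Centered 30-month rolling mean, to damp annual noise as described above."""
    return series.rolling(window, center=True, min_periods=window).mean()

smoothed = {}
for name in ("hadsst3", "hadcrut4", "rss_tlt"):            # placeholder file stems
    df = pd.read_csv(f"{name}.csv", parse_dates=["date"], index_col="date")
    smoothed[name] = smooth(df["anomaly"])                 # assumed column name

ax = pd.DataFrame(smoothed).plot(title="30-month smoothed anomalies")
ax.set_ylabel("Temperature anomaly (°C)")
ax.figure.savefig("sst_vs_gmst.png")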

Strikingly, SST and GMST track each other closely from the beginning until about 1980. Then GMST diverges, warming more than global SST. Satellite TLT shows the same patterns but with less warming than the surface. Curious about the post-1980 divergence, I looked into HADSST3 and found that NH SST warmed much more strongly during that period.

This graph shows how warming from circulations in the Northern Pacific and Northern Atlantic drove GMST since 1980. And it suggests that since 2005 NH SST is no longer increasing, and may turn toward cooling.

Surface Heat Flux from Ocean to Air

Now one can read convoluted explanations about how rising CO2 in the atmosphere can cause land surface heating which is then transported over the ocean and causes higher SST. But the interface between ocean and air is well described and measured. Not surprisingly it is the warmer ocean water sending heat into the atmosphere, and not the other way around.

The graph displays measures of heat flux in the sub-tropics during a 21-day period in November. Shortwave solar energy, shown above in green and labeled radiative, is stored in the upper 200 meters of the ocean. The upper panel shows the rise in SST (sea surface temperature) due to net incoming energy. The yellow shows latent heat cooling the ocean (lowering SST) and transferring heat upward, driving convection.
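
The bookkeeping behind that figure is the standard surface heat budget: the net flux into the ocean is the shortwave gain minus the longwave, latent, and sensible losses. A small sketch with illustrative magnitudes (not values from the campaign):

def net_surface_flux(shortwave, longwave, latent, sensible):
    """Positive result = ocean gaining heat (SST tends to rise); all terms in W/m^2."""
    return shortwave - longwave - latent - sensible

calm = net_surface_flux(shortwave=220, longwave=50, latent=80, sensible=10)
windy_mjo = net_surface_flux(shortwave=180, longwave=55, latent=250, sensible=25)

print(f"calm conditions: {calm:+} W/m^2 (ocean charging like a capacitor)")
print(f"active MJO, high winds: {windy_mjo:+} W/m^2 (stored heat released to the air)")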

From “An Investigation of Turbulent Heat Exchange in the Subtropics” by James B. Edson:

“One can think of the ocean as a capacitor for the MJO (Madden-Julian Oscillation), where the energy is being accumulated when there is a net heat flux into the ocean (here occurring to approximately November 24) after which it is released to the atmosphere during the active phase of the MJO under high winds and large latent heat exchange.”

http://www.onr.navy.mil/reports/FY13/mmedson.pdf

Conclusion

As we see in the graphs, ocean circulations change sea surface temperatures, which then cause global land and sea temperatures to change. Thus, oceans make climate by driving temperature changes.

On another post I describe how oceans also drive precipitation, the other main determinant of climate. Oceans make rain, and the processes for distributing rain over land are shown here: https://rclutz.wordpress.com/2015/04/30/here-comes-the-rain-again/

And a word from Dr. William Gray:

“Changes in the ocean’s deep circulation currents appears to be, by far, the best physical explanation for the observed global surface temperature changes (see Gray 2009, 2011, 2012, 2012). It seems ridiculous to me for both the AGW advocates and us skeptics to so closely monitor current weather and short-time climate change as indication of CO2’s influence on our climate. This assumes that the much more dominant natural climate changes that have always occurred are no longer in operation or have relevance.”

http://www.icecap.us/
 Indeed, Oceans Make Climate, or as Dr. Arnd Bernaerts put it:
“Climate is the continuation of oceans by other means.”
Update May 11, 2015

Kenneth Richards provided some supporting references in a comment at Paul Homewood’s site. They are certainly on point, especially this one:

“Examining data sets of surface heat flux during the last few decades for the same region, we find that the SST warming was not a consequence of atmospheric heat flux forcing. Conversely, we suggest that long-term SST warming drives changes in atmosphere parameters at the sea surface, most notably an increase in latent heat flux, and that an acceleration of the hydrological cycle induces a strengthening of the trade winds and an acceleration of the Hadley circulation.”

That quote is from Servain et al, unfortunately behind a paywall.  The paper is discussed here:
http://hockeyschtick.blogspot.ca/2014/09/new-paper-finds-climate-of-tropical.html

Full comment from Richards:
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-13-00651.1

The surface of the world’s oceans has been warming since the beginning of industrialization. In addition to this, multidecadal sea surface temperature (SST) variations of internal [natural] origin exist. Evidence suggests that the North Atlantic Ocean exhibits the strongest multidecadal SST variations and that these variations are connected to the overturning circulation. This work investigates the extent to which these internal multidecadal variations have contributed to enhancing or diminishing the trend induced by the external radiative forcing, globally and in the North Atlantic. A model study is carried out wherein the analyses of a long control simulation with constant radiative forcing at preindustrial level and of an ensemble of simulations with historical forcing from 1850 until 2005 are combined. First, it is noted that global SST trends calculated from the different historical simulations are similar, while there is a large disagreement between the North Atlantic SST trends. Then the control simulation is analyzed, where a relationship between SST anomalies and anomalies in the Atlantic meridional overturning circulation (AMOC) for multidecadal and longer time scales is identified. This relationship enables the extraction of the AMOC-related SST variability from each individual member of the ensemble of historical simulations and then the calculation of the SST trends with the AMOC-related variability excluded. For the global SST trends this causes only a little difference while SST trends with AMOC-related variability excluded for the North Atlantic show closer agreement than with the AMOC-related variability included. From this it is concluded that AMOC [Atlantic meridional overturning circulation] variability has contributed significantly to North Atlantic SST trends since the mid nineteenth century.

 http://link.springer.com/article/10.1007%2Fs00382-014-2168-7
After a decrease of SST by about 1 °C during 1964–1975, most apparent in the northern tropical region, the entire tropical basin warmed up. That warming was the most substantial (>1 °C) in the eastern tropical ocean and in the longitudinal band of the intertropical convergence zone. Examining data sets of surface heat flux during the last few decades for the same region, we find that the SST [sea surface temperature] warming was not a consequence of atmospheric heat flux forcing [greenhouse gases]. Conversely, we suggest that long-term SST warming drives changes in atmosphere parameters at the sea surface, most notably an increase in latent heat flux, and that an acceleration of the hydrological cycle induces a strengthening of the trade winds and an acceleration of the Hadley circulation. These trends are also accompanied by rising sea levels and upper ocean heat content over similar multi-decadal time scales in the tropical Atlantic. Though more work is needed to fully understand these long term trends, especially what happens from the mid-1970’s, it is likely that changes in ocean circulation involving some combination of the Atlantic meridional overtuning circulation [AMOC] and the subtropical cells are required to explain the observations.

 http://www.nature.com/ncomms/2014/141208/ncomms6752/full/ncomms6752.html
The Atlantic Meridional Overturning Circulation (AMOC) is a key component of the global climate system, responsible for a large fraction of the 1.3 PW northward heat transport in the Atlantic basin. Numerical modelling experiments suggest that without a vigorous AMOC, surface air temperature in the North Atlantic region would cool by around 1–3 °C, with enhanced local cooling of up to 8 °C in regions with large sea-ice changes. Substantial weakening of the AMOC would also cause a southward shift of the inter-tropical convergence zone, encouraging Sahelian drought, and dynamic changes in sea level of up to 80 cm along the coasts of North America and Europe.
