
Thursday, January 30, 2014

Vestas says record powerful wind turbine in operation

    

A photo taken on June 29, 2012 shows a Vestas wind turbine near Baekmarksbro in Jutland
Danish wind technology giant Vestas said on Thursday that the world's most powerful wind turbine has begun operating, sweeping an area equivalent to three football fields. [DJS -- that's space for only three or four turbines per sq. kilometer, not to mention the pollution in a poor part of southern China where all the "dirty and toxic" industrial and mining work is done to build them -- but what do we care about China?]
A prototype of the group's first V164 8-megawatt offshore wind turbine has successfully produced its first electricity, the Aarhus-based group said.
"We expect that it will reduce the cost of energy for our customers," spokesman Michael Zarin said.[Only if you don't count the massive tax breaks and direct subsidies they get -- nobody pays for that, right?]
"You can have fewer turbines to have the same amount of electricity. ... You can save a lot of the expense on things like the foundations, the cabling or the substation," he added.[To produce the same amount of electricity as what?  A 1,000,000 megawatt standard nuclear or fossil plant?  I -- well, there's just this thing called lying, which you can away with here because so many of us are so scientifically illiterate, and too lazy to check facts before rousing up opinions.]
The 8 megawatt turbine, which will be the flagship product for a joint venture between Vestas and Mitsubishi Heavy Industries, has the capacity to produce electricity for 7,500 European households. [Let's see. 7,500 households is a very small town. A single, decent-sized city -- we're not even close to the entire country -- will have some 100 times that many households, plus industry and business of all sizes, and public transportation. So we will need 100-200 of these behemoths, using up a total of 25-60 sq kilometers of land, or a square 5-8 kilometers on a side. Actually more, because you will need space between them. With all that spare land Europe has, especially kilometers and kilometers of flat, windy, unpopulated country, this should be no problem. Even when you scale it to national and continental levels. Oh, and those poor Chinese workers don't mind dying and suffering at 100-200 times the rate -- much more, in the end -- they did for one turbine. And once again, with all that infinite taxpayer money, no problem with finances.]
It's been installed on land at the Danish National Test Centre for Large Wind Turbines in Oesterild in northwestern Denmark. Vestas said serial production could begin in 2015 if there is enough demand.
The most powerful onshore wind turbine on the market is currently the 7.5 megawatt E-126 by Germany's Enercon, while the largest offshore turbines are the 6 megawatt models produced by Germany's Siemens and France's Alstom. [Interesting, Germany is. Although lauded for the greatest use of wind power and other "renewables", they are still, year after year, among the top CO2 emitters of the West, especially per capita, because of all the coal and oil they have to burn to sustain their lifestyle and security. Even with their nuclear plants helping -- I know, they're all so dangerous, we have to get rid of them! -- as the people who know nothing about the history, science, and technology of nuclear will holler; even those don't help much. Perhaps trying to support thousands of utterly stupendous, economically unsustainable turbines will collapse the German -- and then European -- economy, and they can make work out of tearing down the plants that actually served their electricity needs. Brilliant!]
 
Competition in the sector is fierce: South Korea's Samsung Heavy Industries installed a 7 megawatt offshore wind prototype turbine in Scotland last year.
France's Areva and Spain's Gamesa said last week they were holding talks on combining their offshore wind turbine activities, and that they planned to accelerate development of an 8 megawatt turbine.

PubChemRDF is Launched

PubChem Blog

News, updates and tutorials about PubChem


 
Posted by Peter Murray-Rust (@petermurrayrust)
http://pubchemblog.ncbi.nlm.nih.gov/2014/01/30/pubchemrdf-is-launched/ 

Introducing PubChemRDF!
The PubChemRDF project encodes PubChem information using the Resource Description Framework (RDF).  One of the aims of the PubChemRDF project is to help researchers work with PubChem data on local computing resources using semantic web technologies.  Another aim is to harness ontological frameworks to help facilitate PubChem data sharing, analysis, and integration with resources external to the National Center for Biotechnology Information (NCBI) and across scientific domains.

What is RDF?
RDF stands for Resource Description Framework and constitutes a family of World Wide Web Consortium (W3C) specifications for data interchange on the Web. RDF breaks down knowledge into machine-readable discrete pieces, called “triples.” Each “triple” is organized as a trio of “subject-predicate-object.” For example, in the phrase “atorvastatin may treat hypercholesterolemia,” the subject is “atorvastatin,” the predicate is “may treat,” and the object is “hypercholesterolemia.” RDF uses a Uniform Resource Identifier (URI) to name each part of the “subject-predicate-object” triple. A URI looks just like a typical web URL.
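To make the triple idea concrete, here is a minimal sketch using the Python rdflib library; the example.org URIs are placeholders invented for illustration, not actual PubChemRDF vocabulary.

    from rdflib import Graph, Namespace

    EX = Namespace("http://example.org/")  # placeholder namespace for illustration

    g = Graph()
    # One triple: subject "atorvastatin", predicate "may treat",
    # object "hypercholesterolemia".
    g.add((EX.atorvastatin, EX.may_treat, EX.hypercholesterolemia))

    # Serialize the graph in Turtle syntax to see the triple written out.
    print(g.serialize(format="turtle"))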

RDF is a core part of semantic web standards.  As an extension of the existing World Wide Web, the semantic web attempts to make it easier for users to find, share, and combine information.  The semantic web leverages the following technologies: Extensible Markup Language (XML), which provides syntax for RDF; Web Ontology Language (OWL), which extends the ability of RDF to encode information; Resource Description Framework (RDF), which expresses knowledge; and the RDF query language (SPARQL), which enables query and manipulation of RDF content.

How can PubChemRDF help your research?
PubChem users have frequently expressed interest in having a downloadable, schema-less database. PubChemRDF enables NoSQL database access and query of PubChem data.  Using PubChemRDF, one can download the desired RDF-formatted data files from the PubChem FTP site, import them into a triplestore, and query them through a SPARQL query interface. There are a number of open-source and commercial triplestores, such as Apache Jena TDB and OpenLink Virtuoso (a list of triplestores can be found here: http://en.wikipedia.org/wiki/Triplestore).
Besides triplestores, PubChemRDF data can also be loaded into RDF-aware graph databases such as Neo4j, where graph traversal algorithms can be used to query the RDF graphs. Last but not least, the ontological representation of the PubChem knowledge base allows logical inference, such as forward/backward chaining.
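As a minimal sketch of that workflow, the snippet below parses a downloaded PubChemRDF file into an in-memory rdflib graph (a lightweight stand-in for a full triplestore) and runs a SPARQL query over it. The file name is hypothetical, and the SIO "has attribute" predicate linking compounds to descriptors is an assumption; consult the PubChemRDF release notes for the actual file layout and vocabulary.

    from rdflib import Graph

    # Parse one downloaded PubChemRDF file (hypothetical file name).
    g = Graph()
    g.parse("pc_compound2descriptor.ttl", format="turtle")

    # SPARQL query; sio:SIO_000008 ("has attribute") is assumed here to be
    # the predicate linking a compound to its descriptors.
    query = """
        PREFIX sio: <http://semanticscience.org/resource/>
        SELECT ?compound ?descriptor
        WHERE { ?compound sio:SIO_000008 ?descriptor . }
        LIMIT 10
    """
    for row in g.query(query):
        print(row.compound, row.descriptor)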

The RDF data on the PubChem FTP site is arranged in such a way that you only need to download the type of information in which you are interested, so you can avoid downloading parts of the PubChem data you will not use.  For example, if you are just interested in computed chemical properties, you only need to download PubChemRDF data in the compound descriptor subdomain. In addition to bulk download, PubChemRDF also provides programmatic data access through a RESTful interface.
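For the RESTful route, a sketch like the following could fetch the RDF for a single compound; the URL pattern is an assumption based on the documented interface (CID 2244 is aspirin), so check the PubChemRDF release notes for the authoritative scheme.

    import requests

    # Assumed URL pattern for the PubChemRDF RESTful interface; verify against
    # the PubChemRDF release notes. CID 2244 is aspirin.
    url = "https://pubchem.ncbi.nlm.nih.gov/rest/rdf/compound/CID2244.ttl"

    resp = requests.get(url, timeout=30)
    resp.raise_for_status()

    # Save the Turtle-formatted RDF locally for loading into a triplestore.
    with open("CID2244.ttl", "wb") as f:
        f.write(resp.content)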

Where can you learn more about this?
To get an overview of the PubChemRDF project, please view this presentation.  To learn more about detailed aspects of PubChemRDF and how to use it, please view this presentation. The PubChemRDF Release Notes provide additional technical information about the project.

Additional blog posts will follow on PubChemRDF project topics, including the FTP site layout, the RESTful interface, and ways to utilize PubChemRDF for research purposes, including SPARQL queries.

Wednesday, January 29, 2014

Extending Fermat to Other Powers


January 29th, David Strumfels, A Medley of Potpourri blog

We all know Fermat's famous Last Theorem, solved almost 20 years ago (!) by Andrew Wiles (using mathematical techniques that Fermat did not have, so Fermat's own claimed proof remains a mystery to history).  The theorem states that no three positive integers a, b, and c can satisfy the equation a^n + b^n = c^n for any integer value of n greater than two.  For n = 2, of course, solutions abound: 3^2 + 4^2 = 5^2, and 5^2 + 12^2 = 13^2.

I'd been toying around with variations of combining various integers raised to various powers for some time (when you enjoy doing something you don't notice how much time), when I observed that:
3^3 + 4^3 + 5^3 = 6^3.  In other words, here I had found (undoubtedly not for the first time in history) a group of four positive integers, a, b, c, and d, satisfying the equation a^n + b^n + c^n = d^n when n = 3.

But does Fermat's modified Theorem apply here too?  Can n never exceed three?  Could a similar method be used to prove this?  And furthermore, does the fact that squares and cubes have these relationships mean they keep on going up the line, infinitely?  E.g., can five integers, raised to the fourth power, be found with this relationship?  And on and on?  (The five-integer, fourth-power set is false for 2, 3, 4, 5, 6, if you are curious; try it.)

Now I am no mathematician, but a little math instinct tells me something interesting is going on here.  A very large theorem, encompassing Fermat's and our third-power analogue and possibly beyond, feels ... well, like a genuine mathematical conjecture at least, if I use the word correctly.  First, I shall look for a fourth-power analogue, for if it doesn't exist then I am blowing smoke.  (I have to assume it will be found, if at all, with fairly small integers, as with the second and third powers.)
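For anyone who wants to join the hunt, here is a small brute-force sketch of mine (not part of the original post) that searches for k positive integers whose n-th powers sum to another n-th power. Fair warning: a known fourth-power solution, 30^4 + 120^4 + 272^4 + 315^4 = 353^4 (Norrie, 1911), involves terms far larger than the second- and third-power cases, so a quick run at n = 4 with small bounds may return nothing.

    from itertools import combinations_with_replacement

    def find_power_sum(n, k, limit):
        """Search for k positive integers below `limit` whose n-th powers
        sum to a perfect n-th power; return (terms, root) or None."""
        # The root d satisfies d^n <= k * limit^n, so d < limit * k is a safe bound.
        roots = {i ** n: i for i in range(1, limit * k)}
        for combo in combinations_with_replacement(range(1, limit), k):
            total = sum(i ** n for i in combo)
            if total in roots:
                return combo, roots[total]
        return None

    # n = 3: a solution appears quickly; this search order finds
    # 1^3 + 6^3 + 8^3 = 9^3 before the 3^3 + 4^3 + 5^3 = 6^3 example.
    print(find_power_sum(3, 3, 30))

    # n = 4: nothing turns up below this small bound, as far as this search sees.
    print(find_power_sum(4, 4, 60))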

I will work on this, and feel free to give it your all too, if you want to.  That's all for now.

David J. Strumfels

Abuse of Statistics in Obscuring ~2000-2013 Warming Plateau

David J Strumfels Again, note the plateau over the last 10-12 years. For that period we are told and shown that this decade-plus is the hottest period for centuries, and is the result of over a century's worth of warming. Almost any peak in it, like 2013, 2010, or 2005, will be among the hottest years over those centuries, but just due to statistics. Clearly, from ~2000 onwards, however, the warming has plateaued for some reason, possibly a ~30-year cycle in global temperature (this can also be seen in the 1880-1910 and 1940-1975 cooling periods, interspersed among stronger warming trends). It is nowhere near a straight warming line throughout the last century and into this, although AGW proponents and others often try to fit lines. A mathematical sine wave combined with a straight line describes the warming much better -- and even predicted the (albeit, perhaps temporary) hiatus in warming in the 21st century; all supportive of this particular model (also, a straight line starting at 1975-80 would predict a world today ~0.2C warmer, while the sine-modified line is right on target).
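For the curious, here is a hedged sketch of that sine-plus-line fit, using synthetic temperatures rather than any real record, with an assumed ~60-year full cycle so that warming and cooling half-cycles each last ~30 years; all numbers are illustrative only.

    import numpy as np
    from scipy.optimize import curve_fit

    def sine_plus_line(t, a, b, c, period, t0):
        """Linear warming trend plus a sinusoidal multidecadal oscillation."""
        return a + b * t + c * np.sin(2 * np.pi * (t - t0) / period)

    # Synthetic "anomaly" series standing in for a real temperature record:
    # a 0.005 C/yr trend plus a 0.1 C oscillation with a 60-year period.
    years = np.arange(1880, 2014, dtype=float)
    rng = np.random.default_rng(0)
    temps = (0.005 * (years - 1880)
             + 0.1 * np.sin(2 * np.pi * (years - 1905) / 60)
             + rng.normal(0, 0.05, years.size))

    # Initial guesses matter: the model is nonlinear in period and t0.
    p0 = [0.0, 0.005, 0.1, 60.0, 1900.0]
    params, _ = curve_fit(sine_plus_line, years, temps, p0=p0)
    print("fitted trend (C/yr): %.4f, fitted period (yr): %.1f"
          % (params[1], params[3]))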
 
Written in response to:
Asteroid Initiatives @AsteroidEnergy 12m
RT @EarthVitalSigns 2013 global surface temp tied for 7th warmest year on record http://1.usa.gov/Mg6Pe7 ‪#‎NASA‬ pic.twitter.com/aygxwowGA0
David J Strumfels I've explained the fallacy behind these statements enough that you should recognize them for the misleading propaganda they are.  If not, look above.

First Weather Map of Brown Dwarf

ESO’s VLT charts surface of nearest brown dwarf
29 January 2014
ESO's Very Large Telescope has been used to create the first ever map of the weather on the surface of the nearest brown dwarf to Earth. An international team has made a chart of the dark and light features on WISE J104915.57-531906.1B, which is informally known as Luhman 16B and is one of two recently discovered brown dwarfs forming a pair only six light-years from the Sun. The new results are being published in the 30 January 2014 issue of the journal Nature.
Brown dwarfs fill the gap between giant gas planets, such as Jupiter and Saturn, and faint cool stars. They do not contain enough mass to initiate nuclear fusion in their cores and can only glow feebly at infrared wavelengths of light. The first confirmed brown dwarf was only found twenty years ago and only a few hundred of these elusive objects are known.
The closest brown dwarfs to the Solar System form a pair called Luhman 16AB [1] that lies just six light-years from Earth in the southern constellation of Vela (The Sail). This pair is the third closest system to the Earth, after Alpha Centauri and Barnard's Star, but it was only discovered in early 2013. The fainter component, Luhman 16B, had already been found to be changing slightly in brightness every few hours as it rotated — a clue that it might have marked surface features.
Now astronomers have used the power of ESO's Very Large Telescope (VLT) not just to image these brown dwarfs, but to map out dark and light features on the surface of Luhman 16B.
Ian Crossfield (Max Planck Institute for Astronomy, Heidelberg, Germany), the lead author of the new paper, sums up the results: “Previous observations suggested that brown dwarfs might have mottled surfaces, but now we can actually map them. Soon, we will be able to watch cloud patterns form, evolve, and dissipate on this brown dwarf — eventually, exometeorologists may be able to predict whether a visitor to Luhman 16B could expect clear or cloudy skies.”
To map the surface the astronomers used a clever technique. They observed the brown dwarfs using the CRIRES instrument on the VLT. This allowed them not just to see the changing brightness as Luhman 16B rotated, but also to see whether dark and light features were moving away from, or towards the observer. By combining all this information they could recreate a map of the dark and light patches of the surface.
The atmospheres of brown dwarfs are very similar to those of hot gas giant exoplanets, so by studying comparatively easy-to-observe brown dwarfs [2] astronomers can also learn more about the atmospheres of young, giant planets — many of which will be found in the near future with the new SPHERE instrument that will be installed on the VLT in 2014.
Crossfield ends on a personal note: “Our brown dwarf map helps bring us one step closer to the goal of understanding weather patterns in other solar systems. From an early age I was brought up to appreciate the beauty and utility of maps. It's exciting that we're starting to map objects out beyond the Solar System!”

Notes

[1] This pair was discovered by the American astronomer Kevin Luhman on images from the WISE infrared survey satellite. It is formally known as WISE J104915.57-531906.1, but a shorter form was suggested as being much more convenient. As Luhman had already discovered fifteen double stars the name Luhman 16 was adopted. Following the usual conventions for naming double stars, Luhman 16A is the brighter of the two components, the secondary is named Luhman 16B and the pair is referred to as Luhman 16AB.
[2] Hot Jupiter exoplanets lie very close to their parent stars, which are much brighter. This makes it almost impossible to observe the faint glow from the planet, which is swamped by starlight. But in the case of brown dwarfs there is nothing to overwhelm the dim glow from the object itself, so it is much easier to make sensitive measurements.

More information

This research was presented in a paper, “A Global Cloud Map of the Nearest Known Brown Dwarf”, by Ian Crossfield et al. to appear in the journal Nature.
The team is composed of I. J. M. Crossfield (Max Planck Institute for Astronomy [MPIA], Heidelberg, Germany), B. Biller (MPIA; Institute for Astronomy, University of Edinburgh, United Kingdom), J. Schlieder (MPIA), N. R. Deacon (MPIA), M. Bonnefoy (MPIA; IPAG, Grenoble, France), D. Homeier (CRAL-ENS, Lyon, France), F. Allard (CRAL-ENS), E. Buenzli (MPIA), Th. Henning (MPIA), W. Brandner (MPIA), B. Goldman (MPIA) and T. Kopytova (MPIA; International Max-Planck Research School for Astronomy and Cosmic Physics at the University of Heidelberg, Germany).
ESO is the foremost intergovernmental astronomy organisation in Europe and the world's most productive ground-based astronomical observatory by far. It is supported by 15 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world's most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world's largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is the European partner of a revolutionary astronomical telescope ALMA, the largest astronomical project in existence. ESO is currently planning the 39-metre European Extremely Large optical/near-infrared Telescope, the E-ELT, which will become “the world's biggest eye on the sky”.


Contacts

Ian Crossfield
Max Planck Institute for Astronomy
Heidelberg, Germany
Tel: +49 6221 528 406
Email: ianc@mpia.de
Richard Hook
ESO Public Information Officer
Garching bei München, Germany
Tel: +49 89 3200 6655
Cell: +49 151 1537 3591
Email: rhook@eso.org

Is Industrial Hemp The Ultimate Energy Crop?

By Thomas Prade, Swedish University of Agricultural Sciences

Bioenergy is currently the fastest growing source of renewable energy. Cultivating energy crops on arable land can decrease dependency on depleting fossil resources and it can mitigate climate change.
But some biofuel crops have bad environmental effects: they use too much water, displace people and create more emissions than they save. This has led to a demand for high-yielding energy crops with low environmental impact. Industrial hemp is said to be just that.

Enthusiasts have been promoting the use of industrial hemp for producing bioenergy for a long time now. With its potentially high biomass yield and its suitability to fit into existing crop rotations, hemp could not only complement but exceed other available energy crops.
Hemp, Cannabis sativa, originates from western Asia and India and from there spread around the globe. For centuries, fibres were used to make ropes, sails, cloth and paper, while the seeds were used for protein-rich food and feed. Interest in hemp declined when other fibres such as sisal and jute replaced hemp in the 19th century.

Abuse of hemp as a drug led to the prohibition of its cultivation by the United Nations in 1961. When prohibition was revoked in the 1990s in the European Union, Canada and later in Australia, industrially used hemp emerged again.

This time, the car industry’s interest in light, natural fibre promoted its use. For such industrial use, modern varieties with insignificant content of psychoactive compounds are grown. Nonetheless, industrial hemp cultivation is still prohibited in some industrialised countries like Norway and the USA.

Energy use of industrial hemp is today very limited. There are few countries in which hemp has been commercialised as an energy crop. Sweden is one, and has a small commercial production of hemp briquettes. Hemp briquettes are more expensive than wood-based briquettes, but sell reasonably well on regional markets.

Large-scale energy uses of hemp have also been suggested.

Biogas production from hemp could compete with production from maize, especially in cold climate regions such as Northern Europe and Canada. Ethanol production is possible from the whole hemp plant, and biodiesel can be produced from the oil pressed from hemp seeds. Biodiesel production from hemp seed oil has been shown to overall have a much lower environmental impact than fossil diesel.

Indeed, the environmental benefits of hemp have been praised highly, since hemp cultivation requires very limited amounts of pesticide. Few insect pests are known to exist in hemp crops and fungal diseases are rare.

Since hemp plants shade the ground quickly after sowing, they can outgrow weeds, a trait especially interesting to organic farmers. Still, a weed-free seedbed is required. And without nitrogen fertilisation hemp won't grow as vigorously as is often suggested.

So, as with any other crop, it takes good agricultural practice to grow hemp right.

Hemp has a broad climate range and has been cultivated successfully from as far north as Iceland to warmer, more tropical regions. Flickr: Gregory Jordan
Being an annual crop, hemp functions very well in crop rotations. Here it may function as a break crop, reducing the occurrence of pests, particularly in cereal production. Farmers interested in cultivating energy crops are often hesitant about tying fields into the production of perennial energy crops such as willow. Due to the high self-tolerance of hemp, cultivation over two to three years in the same field does not lead to significant biomass yield losses.

Small-scale production of hemp briquettes has also proven economically feasible. However, using whole-crop hemp (or any other crop) for energy production is not the whole solution.
It is more environmentally friendly to first use the fibres, oils or other compounds of hemp, and then produce energy from the residues. Even the energy in fibre products can be recovered when the products become waste.

Recycling plant nutrients to the field, such as in biogas residue, can contribute to lower greenhouse gas emissions from crop production.

Sustainable bioenergy production is not easy, and a diversity of crops will be needed. Industrial hemp is not the ultimate energy crop. Still, if cultivated on good soil with decent fertilisation, hemp can certainly be an environmentally sound crop for bioenergy production and for other industrial uses as well.

Thomas Prade receives funding from the Swedish Farmers’ Foundation for Agricultural Research, the EU commission, the Skåne Regional Council and Partnership Alnarp.
The Conversation

This article was originally published at The Conversation.

Read the original article.

Monday, January 27, 2014

Study examines the development of children’s prelife reasoning

Boston University / January 28, 2014 / Cognition
Most people, regardless of race, religion or culture, believe they are immortal. That is, people believe that part of themselves–some indelible core, soul or essence–will transcend the body’s death and live forever.  But what is this essence?  Why do we believe it survives?  And why is this belief so unshakable?

A new Boston University study led by postdoctoral fellow Natalie Emmons and published in the January 16, 2014 online edition of Child Development sheds light on these profound questions by examining children’s ideas about “prelife,” the time before conception.  By interviewing 283 children from two distinct cultures in Ecuador, Emmons’s research suggests that our bias toward immortality is a part of human intuition that naturally emerges early in life.  And the part of us that is eternal, we believe, is not our skills or ability to reason, but rather our hopes, desires and emotions.  We are, in fact, what we feel.

Emmons’ study fits into a growing body of work examining the cognitive roots of religion.  Although religion is a dominant force across cultures, science has made little headway in examining whether religious belief–such as the human tendency to believe in a creator–may actually be hard-wired into our brains.

“This work shows that it’s possible for science to study religious belief,” said Deborah Kelemen, an Associate Professor of Psychology at Boston University and co-author of the paper.  “At the same time, it helps us understand some universal aspects of human cognition and the structure of the mind.”

Most studies on immortality or “eternalist” beliefs have focused on people’s views of the afterlife.  Studies have found that both children and adults believe that bodily needs, such as hunger and thirst, end when people die, but mental capacities, such as thinking or feeling sad, continue in some form. 
But these afterlife studies leave one critical question unanswered: where do these beliefs come from?   Researchers have long suspected that people develop ideas about the afterlife through cultural exposure, like television or movies, or through religious instruction.  But perhaps, thought Emmons, these ideas of immortality actually emerge from our intuition.  Just as children learn to talk without formal instruction, maybe they also intuit that part of their mind could exist apart from their body.
Emmons tackled this question by focusing on “prelife,” the period before conception, since few cultures have beliefs or views on the subject.   “By focusing on prelife, we could see if culture causes these beliefs to appear, or if they appear spontaneously,” said Emmons.

“I think it’s a brilliant idea,” said Paul Bloom, a Professor of Psychology and Cognitive Science at Yale who was not involved with the study.  “One persistent belief is that children learn these ideas through school or church.  That’s what makes the prelife research so cool.  It’s a very clever way to get at children’s beliefs on a topic where they aren’t given answers ahead of time.”

Emmons interviewed children from an indigenous Shuar village in the Amazon Basin of Ecuador.  She chose the group because they have no cultural prelife beliefs, and she suspected that indigenous children, who have regular exposure to birth and death through hunting and farming, would have a more rational, biologically-based view of the time before they were conceived.  For comparison, she also interviewed children from an urban area near Quito, Ecuador.  Most of the urban children were Roman Catholic, a religion that teaches that life begins only at conception.  If cultural influences were paramount, reasoned Emmons, both urban and indigenous children should reject the idea of life before birth.

Emmons showed the children drawings of a baby, a young woman, and the same woman while pregnant, then asked a series of questions about the child’s abilities, thoughts and emotions during each period: as babies, in the womb, and before conception.

The results were surprising.  Both groups gave remarkably similar answers, despite their radically different cultures.  The children reasoned that their bodies didn’t exist before birth, and that they didn’t have the ability to think or remember. However, both groups also said that their emotions and desires existed before they were born. For example, while children generally reported that they didn’t have eyes and couldn’t see things before birth, they often reported being happy that they would soon meet their mother, or sad that they were apart from their family.

“They didn’t even realize they were contradicting themselves,” said Emmons. “Even kids who had biological knowledge about reproduction still seemed to think that they had existed in some sort of eternal form.  And that form really seemed to be about emotions and desires.”

Why would humans have evolved this seemingly universal belief in the eternal existence of our emotions?  Emmons said that this human trait might be a by-product of our highly developed social reasoning. “We’re really good at figuring out what people are thinking, what their emotions are, what their desires are,” she said.  We tend to see people as the sum of their mental states, and desires and emotions may be particularly helpful when predicting their behavior.  Because this ability is so useful and so powerful, it flows over into other parts of our thinking.  We sometimes see connections where potentially none exist, we hope there’s a master plan for the universe, we see purpose when there is none, and we imagine that a soul survives without a body.

These ideas, while nonscientific, are natural and deep-seated. “I study these things for a living but even find myself defaulting to them. I know that my mind is a product of my brain but I still like to think of myself as something independent of my body,” said Emmons.

“We have the ability to reflect and reason scientifically, and we have the ability to reason based on our gut and intuition,” she added.  “And depending on the situation, one may be more useful than the other.”

The Strategic Sourceror: 3 strategies for boosting sustainability across company operations


The Strategic Sourceror
The Strategic Sourceror is a news outlet & blog dedicated to procurement, finance & strategic sourcing professionals. We cover industry news, procurement solutions and best practices without heavily focusing on software solutions and providers. The Strategic Sourceror covers topics such as: cost reduction, strategic sourcing, purchasing best practices, spend management, mergers & acquisitions, supply chain innovations, commodity pricing and general procurement news.

3 strategies for boosting sustainability across company operations

on Friday, January 24, 2014
Segregating sustainable product sourcing practices from the larger company strategy and culture can be an easy mistake to make. However, businesses that have long been working to decrease the environmental impact of their procurement processes are well aware that isolated attempts at sustainability are often left behind or discredited over time. What many companies find they need is an emphasis on eco-friendliness that is central to their operations rather than peripheral.
While each firm's approach will differ based on its unique supplier network, some tried and true principles can be adapted and applied across the board. Here are three strategies that can help ensure sustainable sourcing is as effective and fully integrated as possible.

1. Increase communication

In order to avoid sustainability initiatives taking place in a vacuum, they need to be understood across the company. Visibility and communication have been a critical part of the success that Campbell Soup has enjoyed with its green initiatives. Dave Stangis, the company's vice president of public affairs and corporate responsibility, recently gave Sustainable Brands an instructive example of this principle.
"Someone in our communications department aims to tell a better sustainability story from our plant level; he's taken this objective to understand what sustainability work is happening in our 30 manufacturing plants around the world and help communicate that better externally," Stangis told the news source.

Without a solid understanding of sustainable practices internally, it's extremely difficult to communicate these efforts to the public in a cohesive way.

2. Make strategic partnerships

Firms also need to bear in mind that consumers will judge them based on the business relationships they foster. Growing companies looking to make acquisitions, for instance, need to do so with sustainability in mind. According to Sustainable Brands, Campbell chose to acquire Plum Organics, a well regarded company with a reputation for its green mission and employees' involvement in those goals.

3. Consider supplier practices

Lastly, it's key that companies consider the processes of their distribution and sourcing partners as an extension of their own operations. Suppliers need to be factored in when firms calculate their environmental impact and plan for improvements. Sustainability consultant Bill Barry recently began working with book publisher Macmillan to lower carbon emissions across the company's production chain, GreenBiz reported. Barry helped the firm calculate and plan to reduce its direct and indirect environmental impacts by 65 percent over the next five years. He did this by helping Macmillan restructure its paper milling, transportation and other partnerships.

Sustainable supplier management is critical, rather than optional, for companies that are serious about green procurement.

What Killed the Woolly Mammoth?

Professor finds some evidence to support a comet collision as the trigger for the Younger Dryas, which may have contributed to megafauna extinction
Monday, January 27, 2014 - 11:30
Santa Barbara, CA

Nanodiamond textures observed with high-resolution transmission electron microscopy: A) star twin; B) multiple linear twins. Photo credit: Bement et al.
The excavation at Bull Creek, Okla., shows the paleosol — ancient buried soil, the dark black layer in the side of the cliff — that formed during the Younger Dryas. Pictured: Alexander Simms.
Could a comet have been responsible for the extinction of North America’s megafauna — woolly mammoths, giant ground sloths and saber-tooth tigers? UC Santa Barbara’s James Kennett, professor emeritus in the Department of Earth Science, posited that such an extraterrestrial event occurred 12,900 years ago.
Originally published in 2007, Kennett’s controversial Younger Dryas Boundary (YDB) hypothesis suggests that a comet collision precipitated the Younger Dryas period of global cooling, which, in turn, contributed to the extinction of many animals and altered human adaptations. The nanodiamond is one type of material that could result from an extraterrestrial collision, and the presence of nanodiamonds along Bull Creek in the Oklahoma Panhandle lends credence to the YDB hypothesis.

More recently, another group of earth scientists, including UCSB's Alexander Simms and alumna Hanna Alexander, re-examined the distribution of nanodiamonds in Bull Creek's sedimentological record to see if they could reproduce the original study's evidence supporting the YDB hypothesis. Their findings appear in the Proceedings of the National Academy of Sciences.

“We were able to replicate some of their results and we did find nanodiamonds right at the Younger Dryas Boundary,” said Simms, an associate professor in UCSB’s Department of Earth Science.
“However, we also found a second spike of nanodiamonds more recently in the sedimentary record, sometime within the past 3,000 years.”

The researchers analyzed 49 sediment samples representing different time periods and environmental and climatic settings, and identified high levels of nanodiamonds immediately below and just above YDB deposits and in late-Holocene near-surface deposits. The Holocene began at the end of the Pleistocene 11,700 years ago and continues to the present. The researchers found that the presence of nanodiamonds is not caused by environmental setting, soil formation, cultural activities, other climate changes or the amount of time in which the landscape is stable. The discovery of high concentrations of nanodiamonds from two distinct time periods suggests that whatever process produced the elevated concentrations of nanodiamonds at the onset of the Younger Dryas may have also been active in recent millennia at Bull Creek.

“Nanodiamonds are found in high abundances at the YDB, giving some support to that theory,” Simms said. “However, we did find it at one other site, which may or may not be caused by a smaller but similar event nearby.”

A “recent” meteorite impact did occur near Bull Creek but scientists don’t know exactly when. The fact that the study’s second nanodiamond spike occurred sometime during the past 3,000 years suggests that the distribution of nanodiamonds is not unique to the Younger Dryas.

Contact Info: 

Julie Cohen
julie.cohen@ucsb.edu
(805) 893-7220
Source: http://www.news.ucsb.edu/node/013899/what-killed-woolly-mammoth

Sensitivity of carbon cycle to tropical temperature variations has doubled

Jan 26, 2014
The tropical carbon cycle has become twice as sensitive to temperature variations over the past 50 years, new research has revealed.

The research shows that a one degree rise in tropical temperature leads to around two billion extra tonnes of carbon being released per year into the atmosphere from tropical ecosystems, compared with the same tropical warming in the 1960s and 1970s.
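Restated as a back-of-envelope sensitivity (my notation, not the paper's), with F the CO2 flux from tropical ecosystems to the atmosphere and T the tropical temperature anomaly:

    \gamma \equiv \frac{\Delta F}{\Delta T}, \qquad
    \gamma_{1960\text{-}70\mathrm{s}} \approx 1~\mathrm{GtC\,yr^{-1}\,{}^{\circ}C^{-1}}
    \quad \longrightarrow \quad
    \gamma_{\mathrm{recent}} \approx 2~\mathrm{GtC\,yr^{-1}\,{}^{\circ}C^{-1}}

i.e., the "two-fold increase" in the paper's title is exactly this doubling of gamma.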

Professor Pierre Friedlingstein and Professor Peter Cox, from the University of Exeter, collaborated with an international team of researchers from China, Germany, France and the USA, to produce the new study, which is published in the leading academic journal Nature.

Existing Earth System Model simulations indicate that the ability of tropical land ecosystems to store carbon will decline over the 21st century. However, these models are unable to capture the increase in the sensitivity of atmospheric carbon dioxide to tropical temperature that is reported in this new study.

Research published last year by Professors Cox and Friedlingstein showed that these variations in atmospheric carbon dioxide can reveal the sensitivity of tropical ecosystems to future climate change.
Taken together, these studies suggest that the sensitivity of tropical ecosystems to climate change has increased substantially in recent decades.

Professor Cox, from the College of Engineering, Mathematics and Physical Sciences, said "The year-to-year variation in atmospheric carbon dioxide is a very useful way to monitor how tropical ecosystems are responding to climate.

"The increase in variability in the last few decades suggests that tropical ecosystems have become more vulnerable to warming".

Professor Friedlingstein, an expert in carbon cycle studies, added: "Current land carbon cycle models do not show this increase over the last 50 years, perhaps because these models underestimate emerging drought effects on tropical ecosystems".

The lead author of the study, Xuhui Wang of Peking University, added: "This enhancement is very unlikely to have resulted from chance, and may provide a new perspective on a possible shift in the terrestrial carbon cycle over the past five decades".
More information: "A two-fold increase of carbon cycle sensitivity to tropical temperature variations," Nature, DOI: 10.1038/nature12915

Picture of how our climate is affected by greenhouse gases is a 'cloudy' one -- ScienceDaily


Date:
January 26, 2014
Source:
Hebrew University of Jerusalem
Summary:
The warming effect of human-induced greenhouse gases is a given, but to what extent can we predict its future influence? That is an issue on which science is making progress, but the answers are still far from exact, say researchers.
 


Recent studies have revealed a highly complicated picture of aerosol-cloud interactions.
Credit: © Maksim Shebeko / Fotolia
 
The warming effect of human-induced greenhouse gases is a given, but to what extent can we predict its future influence? That is an issue on which science is making progress, but the answers are still far from exact, say researchers from the Hebrew University of Jerusalem, the US and Australia who have studied the issue and whose work has just appeared in the journal Science.

Indeed, one could say that the picture is a "cloudy" one, since the determination of the greenhouse gas effect involves multifaceted interactions with cloud cover.

To some extent, aerosols -- particles that float in the air, caused by dust or pollution -- counteract part of the harmful effects of climate warming by increasing the amount of sunlight reflected from clouds back into space. However, the ways in which these aerosols affect climate through their interaction with clouds are complex and incompletely captured by climate models, say the researchers. As a result, the radiative forcing (that is, the disturbance to Earth's "energy budget" from the sun) caused by human activities is highly uncertain, making it difficult to predict the extent of global warming.

And while advances have led to a more detailed understanding of aerosol-cloud interactions and their effects on climate, further progress is hampered by limited observational capabilities and coarse climate models, says Prof. Daniel Rosenfeld of the Fredy and Nadine Herrmann Institute of Earth Sciences at the Hebrew University of Jerusalem, author of the article in Science. Rosenfeld wrote this article in cooperation with Dr. Steven Sherwood of the University of New South Wales, Sydney, Dr. Robert Wood of the University of Washington, Seattle, and Dr. Leo Donner of the US National Oceanic and Atmospheric Administration.

Their recent studies have revealed a much more complicated picture of aerosol-cloud interactions than considered previously. Depending on the meteorological circumstances, aerosols can have dramatic effects of either increasing or decreasing the cloud sun-deflecting effect, the researchers say. Furthermore, little is known about the unperturbed aerosol level that existed in the preindustrial era. This reference level is very important for estimating the radiative forcing from aerosols.

Also needing further clarification is the response of the cloud cover and organization to the loss of water by rainfall. Understanding of the formation of ice and its interactions with liquid droplets is even more limited, mainly due to poor ability to measure the ice-nucleating activity of aerosols and the subsequent ice-forming processes in clouds.

Explicit computer simulations of these processes even at the scale of a whole cloud or multi-cloud system, let alone that of the planet, require hundreds of hours on the most powerful computers available. Therefore, a sufficiently accurate simulation of these processes at a global scale is still impractical.
Recently, however, researchers have been able to create groundbreaking simulations in which models were formulated presenting simplified schemes of cloud-aerosol interactions. This approach offers the potential for model runs that resolve clouds on a global scale for time scales up to several years, but climate simulations on a scale of a century are still not feasible. The models are also too coarse to resolve many of the fundamental aerosol-cloud processes at the scales on which they actually occur. Improved observational tests are essential for validating the results of simulations and ensuring that modeling developments are on the right track, say the researchers.

While it is unfortunate that further progress on understanding aerosol-cloud interactions and their effects on climate is limited by inadequate observational tools and models, achieving the required improvement in observations and simulations is within technological reach, the researchers emphasize, provided that the financial resources are invested. The level of effort, they say, should match the socioeconomic importance of what the results could provide: lower uncertainty in measuring human-made climate forcing and better understanding and predictions of future impacts of aerosols on our weather and climate.

Story Source:
The above story is based on materials provided by Hebrew University of Jerusalem. Note: Materials may be edited for content and length.

Journal Reference:
  1. D. Rosenfeld, S. Sherwood, R. Wood, L. Donner. Climate Effects of Aerosol-Cloud Interactions. Science, 2014; 343 (6169): 379 DOI: 10.1126/science.1247490

Global Warming Alarmists Caught Doctoring '97-Percent Consensus' Claims

                   
Peter Ferrara


Global warming graphic
 (Photo credit: Wikipedia)
 
Global warming alarmists and their allies in the liberal media have been caught doctoring the results of a widely cited paper asserting there is a 97-percent scientific consensus regarding human-caused global warming. After taking a closer look at the paper, investigative journalists report the authors’ claims of a 97-percent consensus relied on the authors misclassifying the papers of some of the world’s most prominent global warming skeptics. At the same time, the authors deliberately presented a meaningless survey question so they could twist the responses to fit their own preconceived global warming alarmism.

Global warming alarmist John Cook, founder of the misleadingly named blog site Skeptical Science, published a paper with several other global warming alarmists claiming they reviewed nearly 12,000 abstracts of studies published in the peer-reviewed climate literature. Cook reported that he and his colleagues found that 97 percent of the papers that expressed a position on human-caused global warming “endorsed the consensus position that humans are causing global warming.”
As is the case with other ‘surveys’ alleging an overwhelming scientific consensus on global warming, the question surveyed had absolutely nothing to do with the issues of contention between global warming alarmists and global warming skeptics. The question Cook and his alarmist colleagues surveyed was simply whether humans have caused some global warming. The question is meaningless regarding the global warming debate because most skeptics as well as most alarmists believe humans have caused some global warming. The issue of contention dividing alarmists and skeptics is whether humans are causing global warming of such negative severity as to constitute a crisis demanding concerted action.

Either through idiocy, ignorance, or both, global warming alarmists and the liberal media have been reporting that the Cook study shows a 97 percent consensus that humans are causing a global warming crisis. However, that was clearly not the question surveyed.

Investigative journalists at Popular Technology looked into precisely which papers were classified within Cook’s asserted 97 percent. The investigative journalists found Cook and his colleagues strikingly classified papers by such prominent, vigorous skeptics as Willie Soon, Craig Idso, Nicola Scafetta, Nir Shaviv, Nils-Axel Morner and Alan Carlin as supporting the 97-percent consensus.
Cook and his colleagues, for example, classified a peer-reviewed paper by scientist Craig Idso as explicitly supporting the ‘consensus’ position on global warming “without minimizing” the asserted severity of global warming. When Popular Technology asked Idso whether this was an accurate characterization of his paper, Idso responded, “That is not an accurate representation of my paper.

“The papers examined how the rise in atmospheric CO2 could be inducing a phase advance in the spring portion of the atmosphere’s seasonal CO2 cycle. Other literature had previously claimed a measured advance was due to rising temperatures, but we showed that it was quite likely the rise in atmospheric CO2 itself was responsible for the lion’s share of the change. It would be incorrect to claim that our paper was an endorsement of CO2-induced global warming.”

When Popular Technology asked physicist Nicola Scafetta whether Cook and his colleagues accurately classified one of his peer-reviewed papers as supporting the ‘consensus’ position, Scafetta similarly criticized the Skeptical Science classification.

“Cook et al. (2013) is based on a straw man argument because it does not correctly define the IPCC AGW theory, which is NOT that human emissions have contributed 50%+ of the global warming since 1900 but that almost 90-100% of the observed global warming was induced by human emission,” Scafetta responded. “What my papers say is that the IPCC [United Nations Intergovernmental Panel on Climate Change] view is erroneous because about 40-70% of the global warming observed from 1900 to 2000 was induced by the sun.”

“What is observed right now is utter dishonesty by the IPCC advocates. … They are gradually engaging in a metamorphosis process to save face. … And in this way they will get the credit that they do not merit, and continue in defaming critics like me that actually demonstrated such a fact since 2005/2006,” Scafetta added.

Astrophysicist Nir Shaviv similarly objected to Cook and colleagues claiming he explicitly supported the ‘consensus’ position about human-induced global warming. Asked if Cook and colleagues accurately represented his paper, Shaviv responded, “Nope… it is not an accurate representation. The paper shows that if cosmic rays are included in empirical climate sensitivity analyses, then one finds that different time scales consistently give a low climate sensitivity. i.e., it supports the idea that cosmic rays affect the climate and that climate sensitivity is low. This means that part of the 20th century [warming] should be attributed to the increased solar activity and that 21st century warming under a business as usual scenario should be low (about 1°C).”

“I couldn’t write these things more explicitly in the paper because of the refereeing, however, you don’t have to be a genius to reach these conclusions from the paper,” Shaviv added.

To manufacture their misleading asserted consensus, Cook and his colleagues also misclassified various papers as taking “no position” on human-caused global warming. When Cook and his colleagues determined a paper took no position on the issue, they simply pretended, for the purpose of their 97-percent claim, that the paper did not exist.

Morner, a sea level scientist, told Popular Technology that Cook classifying one of his papers as “no position” was “Certainly not correct and certainly misleading. The paper is strongly against AGW [anthropogenic global warming], and documents its absence in the sea level observational facts. Also, it invalidates the mode of sea level handling by the IPCC.”

Soon, an astrophysicist, similarly objected to Cook classifying his paper as “no position.”

“I am sure that this rating of no position on AGW by CO2 is nowhere accurate nor correct,” said Soon.

“I hope my scientific views and conclusions are clear to anyone that will spend time reading our papers. Cook et al. (2013) is not the study to read if you want to find out about what we say and conclude in our own scientific works,” Soon emphasized.

Viewing the Cook paper in the best possible light, Cook and colleagues can perhaps claim a small amount of wiggle room in their classifications because the explicit wording of the question they analyzed is simply whether humans have caused some global warming. By restricting the question to such a minimalist, largely irrelevant question in the global warming debate and then demanding an explicit, unsolicited refutation of the assertion in order to classify a paper as a ‘consensus’ contrarian, Cook and colleagues misleadingly induce people to believe 97 percent of publishing scientists believe in a global warming crisis when that is simply not the case.

Misleading the public about consensus opinion regarding global warming, of course, is precisely what the Cook paper sought to accomplish. This is a tried and true ruse perfected by global warming alarmists. Global warming alarmists use their own biased, subjective judgment to misclassify published papers according to criteria that are largely irrelevant to the central issues in the global warming debate. Then, by carefully parsing the language of their survey questions and their published results, the alarmists encourage the media and fellow global warming alarmists to cite these biased, subjective, totally irrelevant surveys as conclusive evidence for the lie that nearly all scientists believe humans are creating a global warming crisis.

These biased, misleading, and totally irrelevant “surveys” form the best “evidence” global warming alarmists can muster in the global warming debate. And this truly shows how embarrassingly feeble their alarmist theory really is.

Sunday, January 26, 2014

Climate change needs a new kind of scientist

Jan 17 2014

Scientific discoveries of recent decades have generated a wealth of knowledge on forests and climate change spanning many different sectors and disciplines. Sustainable development, poverty eradication, the rights of indigenous and local communities to land and resources, conservation of biodiversity, governance, water management, pollution (and all the policies and economic factors related to these sectors) are just some of the issues that scientists studying the relationship between forests and climate change must consider.

Such knowledge generation has also laid the foundation for a broader mission to assist in developing integrated solutions. This is not something that we, as climate scientists, have been traditionally trained to do.

It’s clear that developing integrated solutions to such complex problems will require a new kind of climate scientist. A scientist who can think across biophysical and social disciplines. A scientist who can work across scales to engage all members of society in their research. A scientist who can understand the policy implications of their work.

One group that seems particularly enthusiastic to take on such a role is young researchers. In this article, I’ll go through a few examples where young researchers have led the “out of the box” thinking needed to tackle climate change problems.

Interdisciplinary science is at the heart of CIFOR’s Global Comparative Study on REDD+, which aims to inform policy makers, practitioners and donors about what works in reducing emissions from deforestation and forest degradation, and enhancement of forest carbon stocks (REDD+) in tropical countries.

In this study, a diverse group of foresters, biologists, sociologists, economists, political scientists, and anthropologists works together to understand how REDD+ can be implemented effectively, efficiently, equitably, and promote both social and environmental co-benefits.

I help coordinate a component of the study that focuses on measuring the impacts of subnational REDD+ initiatives. Through this part of the study, we have collected data in 170 villages with over 4,000 families in 6 countries: Brazil, Peru, Cameroon, Tanzania, Vietnam and Indonesia.
From 2010 to 2012, we hired nearly 80 undergraduate, Masters and PhD students in Latin America to collect baseline data on livelihoods and land-use at multiple sites across the Amazon. These young scientists have been critical in helping us share project knowledge in different ways.

KNOWLEDGE SHARING
As I was finishing my PhD six years ago, I joined forces with a bunch of graduate students who were thinking about how, within our confined academic research environment, we could share knowledge in different ways.

The “knowledge exchange pyramid” (Fig 1) outlines how graduate students can exchange knowledge with local stakeholders during research.

There are three levels of knowledge exchange: (1) information sharing; (2) skill building; and (3) knowledge generation. The black circle represents the researchers while the white circle represents local stakeholders — communities, practitioners, policymakers.
Source: Duchelle, A.E., K. Biedenweg, C. Lucas, A. Virapongse, J. Radachowsky, D. Wojcik, M. Londres, W.L. Bartels, D. Alvira and K.A. Kainer. 2009. Graduate students and knowledge exchange with local stakeholders: Possibilities and preparation. Biotropica 41(5): 578-585.

At the base of the pyramid (the simplest form of knowledge exchange) is information sharing – a primarily one-way transmission of ideas to stakeholders using presentations, brochures and posters. Our experience showed that these tools are particularly appropriate when time is limited, specific facts need to be shared, and information is not controversial.

If your goal is to change attitudes, you need to ensure that stakeholders are given a more active role in interpreting information, through community forums, presentations allowing discussion time, or short workshops.

An example of information sharing is returning research results to local stakeholders. One year after the teams collected baseline data for the Global Comparative Study on REDD+, we went back to all of the research sites and shared the results with the local communities and with the organizations (NGOs and governments) that implement the REDD+ activities.

While many researchers do go back to the places where they collected their data to share results, it has bothered me for years when you go to communities where you know there have been multiple research groups and they say to you, ‘you guys are the first group that come back with the information’.


Sharing results with local stakeholders is an incredibly important learning process and it makes good scientific sense. It allows community members to interpret the information and verify survey data before final analysis. We have found that when a community says ‘that doesn’t make sense, why would it be that way?’, it helps us rethink our interpretation of the data. All it takes is some time, some creativity, and a bit of money.

In the Global Comparative Study on REDD+, I envisaged us returning the results in a pretty conventional way. What blew me away was the innovation of our young researchers – they used art, games and even theatre to make the science interesting and relevant to local communities.
In the second part of the pyramid (slightly more complex knowledge exchange) is skill building, which encourages stakeholders to use knowledge to develop new skills. This is often a response to local demands for skills such as data collection and analysis, grant writing, or manuscript preparation.
While conducting research in Ucayali, Peru, forest communities requested training in Global Positioning Systems (GPS) to help them locate and record specific trees in timber harvest areas required by their forest management plans. They also wanted to learn how to measure timber so they could determine a reasonable purchase price (and ensure they weren't being swindled when it came time to sell the tree trunks or boards).

Skill building activities require more time, resources, and preparation than information sharing but can be incredibly important for building trust with communities. They can also be fun (friendly soccer games are a common feature of fieldwork in Latin America!)
At the highest level of the pyramid is knowledge generation, which includes communities, practitioners or policy makers as partners in different aspects of the research process (one example is “action research”). Together with the graduate researcher, they can create the research questions, implement the research, and analyze and disseminate the results.
While this is the most innovative type of knowledge exchange, it is also the hardest for young researchers to get involved in. Graduate students and their research partners will need to invest a lot of time and energy as well as obtain institutional support.

I know a Brazilian researcher who, before and during her Masters in the University of Florida's Tropical Conservation and Development Program, developed long-term action research on the ecology of locally important tree species with one remote community in the Amazon estuary. She involved community members in all steps of the research process, from setting research priorities to collecting data and training.

After assessing the research findings, community members took several actions. They more than doubled the number of local volunteers collecting data (and diversified to include youth, women, and community leaders), they presented research findings at community meetings and they shared those findings with nearby communities struggling to improve their livelihoods in a sustainable way.
Engaging local stakeholders in research is possible at any level. Young researchers are the ones who can help us think of new and innovative ways to do this.

MAKE USE OF THIS INFORMATION
If you are in academia, encourage your students to embark on this kind of knowledge exchange in their research. Plug them into your networks and create courses to help broaden their skill sets.
If you are a practitioner, welcome students and young researchers into your work. Be willing to develop research with them, be willing to learn from students and help them become better professionals.

If you are a donor, support the research that shows real and genuine knowledge exchange with relevant stakeholders.

If you are a student or young researcher, recognize the very unique moment that you are in right now in your career and expand your skill set. Some of the conventional academic pressures are less placed on you at early stages of your career, so use the opportunity you have now to engage and innovate.

This article was first published on ForestsClimateChange.org

Antidisestablishmentarianism

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Antidisestablishmentarianism Arms of...