
Sunday, February 2, 2014

Starving hives: Pesticides cause bees to collect 57 percent less pollen, study says

Published time: February 02, 2014 21:15

Reuters / Leonhard Foeger

In a spin-off of their earlier study, a team of British scientists has revealed how the neurotoxic chemicals in agricultural neonicotinoid pesticides impair one of the bees’ most basic functions – the gathering of pollen from flowers.

“Pollen is the only source of protein that bees have, and it is vital for rearing their young. Collecting it is fiddly, slow work for the bees and intoxicated bees become much worse at it. Without much pollen, nests will inevitably struggle,” explained University of Sussex professor Dave Goulson, who has led the study. His comments were made in a statement released alongside the research.
Goulson’s latest paper, “Field realistic doses of pesticide imidacloprid reduce bumblebee pollen foraging efficiency,” was published at the end of January in the peer-reviewed journal Ecotoxicology.

The scientists exposed some of the studied bees to low doses of imidacloprid and tracked their movement with the help of electronic tags. Unexposed bees were also tracked, and each insect flying out and returning to a hive was weighed to find out the amount of pollen it gathered.

It turned out that bees exposed to the neonicotinoid brought back pollen on only 40 percent of their trips, as opposed to the 63 percent of trips on which their “healthy” counterparts returned with pollen.
Exposed bees also cut the amount of pollen gathered per load by nearly a third. Overall, the comparative study showed that hives exposed to the pesticide received 57 percent less pollen.
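The two reported effects compound: a lower success rate per trip and a smaller load per successful trip. A back-of-the-envelope sketch of how they combine into the hive-level figure (assuming, purely for illustration, that the reported averages apply uniformly across trips):

```python
# Rough reconstruction of how the per-trip numbers compound into the
# hive-level shortfall (illustrative assumptions, not the study's model).

exposed_success = 0.40   # fraction of trips on which exposed bees returned with pollen
control_success = 0.63   # same fraction for unexposed bees
load_ratio = 2.0 / 3.0   # exposed bees gathered roughly a third less pollen per load

relative_intake = (exposed_success * load_ratio) / control_success
print(f"exposed hives receive about {1 - relative_intake:.0%} less pollen")  # ~58%
```

That lands within a point of the reported 57 percent, suggesting the hive-level figure is essentially these two per-trip effects multiplied together.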

“Even near-infinitesimal doses of these neurotoxins seem to be enough to mess up the ability of bees to gather food. Given the vital importance of bumblebees as pollinators, this is surely a cause for concern,” Hannah Feltham of the University of Stirling, another member of the research team, stated.

For the bees themselves, that cut represented a sharp decline in the amount of food reaching the hive’s population.

Feltham believed the study adds “another piece to the jigsaw” of why the bees have been in sharp decline lately.

Three controversial neonicotinoids have been temporarily banned in the European Union after the European Food Safety Authority carried out a peer review of several studies showing that the widely used pesticides could harm bee populations.

“It is unclear what will happen when the [EU ban] expires, as the agrochemical companies that produce them are in a legal dispute with the EU over their decision. Our new study adds to the weight of evidence for making the ban permanent,” Goulson said.

But the dispute over the role of pesticides in so-called Colony Collapse Disorder (CCD) – the mass die-off of bee colonies – is far from over, as the reaction to the study has shown.

“This is a very important study, because it provides further detail on how bumblebee foraging is made less efficient by exposure to imidacloprid at these levels,” said Lynn Dicks, an ecologist at the University of Cambridge.

However, she then questioned the “field-realistic” dose of chemical used by the UK scientists in their study.

“The [levels in this study], particularly the pollen level, are at the upper end of what is found in the field, and likely to be higher than what bumblebee colonies are actually exposed to, because they don’t feed exclusively on oilseed rape,” Dicks argued.

Pesticide manufacturers appeared to be even more dismissive of the study’s results, comparing it to a practice of force-feeding in laboratory conditions.

“It would appear the bumble bees are essentially force-fed relatively high levels of the pesticide in sugar solutions, rather than allowing them to forage on plants treated with a seed treatment. Real field studies, such as those being initiated this autumn in the UK, will give more realistic data on this subject,” said Julian Little, a spokesman for Bayer AG, the major German producer of imidacloprid.

Whether such open-field tests can provide more balanced data is another issue the researchers have been arguing over. Some say that properly controlled field trials are difficult to conduct, as neonicotinoids are already in wide use and bees range over wide areas to gather pollen.

Obama's Great Conflation and What it Means for You

February 2, 2014

If there’s one dead-of-winter public spectacle even more soul-sapping and self-congratulatory than the Grammys—now taking its cues, however well-intentioned, from the late Rev. Sun Myung Moon by staging mass weddings—it’s the annual State of the Union address (due to be delivered tonight at 9 p.m. ET).

Like high school graduation speeches, State of the Union addresses are typically forgotten in real time, even as they are being delivered. Perhaps realizing his time in office is dwindling down with little to show for it, Obama will take a page from Lady Gaga at the 2011 Grammys and emerge from a translucent egg.

Alas, that’s as unlikely as his declaring an end to the federal war on pot. By all accounts, Obama will instead talk a lot about economic inequality, the increasing spread in income and wealth between the richest and poorest Americans that he calls the “defining challenge of our time” and that has only gotten worse on his watch.

If past pronouncements are any indication, the president will immediately—and erroneously—conflate growing income inequality with reduced economic mobility. As he said in a speech last December, “The problem is that alongside increased inequality, we’ve seen diminished levels of upward mobility in recent years.”

This is flatly wrong. Research published last week by economists at Harvard (Raj Chetty, Nathaniel Hendren) and Berkeley (Patrick Kline, Emmanuel Saez) concludes that rates of mobility among income quintiles have not in fact changed in decades. As the Washington Post summarized it,
“Children growing up in America today are just as likely—no more, no less—to climb the economic ladder as children born more than a half-century ago, a team of economists reported Thursday.”

While noting large variations in mobility based on geographic location and other factors (the biggest being “the fraction of single parents in the area”), Chetty, et al. conclude “a child born in the bottom fifth of the income distribution has a 7.8% chance of reaching the top fifth in the U.S. as a whole.”

That chance at going from bottom to top may strike you as unacceptably low—it does me, for sure—but the larger point is that it hasn’t changed over time. Elsewhere, the researchers show similarly constant rates of mobility for people born into middle and higher-income quintiles. Growing inequality doesn’t mean that mobility has declined, much less stopped altogether, and policies designed to level or redistribute income won’t increase mobility (if they even succeed at actually squeezing income disparities).
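For a sense of scale on that 7.8 percent figure: if family background had no effect at all, a child from any quintile would have a 20 percent chance of ending up in the top fifth. A quick illustrative comparison (my arithmetic, not the researchers'):

```python
# Compare the observed bottom-to-top transition rate against the 20%
# baseline that perfectly equal odds would imply (illustrative only).
bottom_to_top = 0.078   # Chetty et al.: chance a bottom-fifth child reaches the top fifth
equal_odds = 1 / 5      # each quintile equally likely under "perfect" mobility

print(f"observed {bottom_to_top:.1%} vs. equal-odds baseline {equal_odds:.0%}")
print(f"shortfall: {equal_odds / bottom_to_top:.1f}x below the equal-odds baseline")
```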

It’s important to stress that the new study by Chetty et al. simply confirms what other researchers have been finding for years. For instance, Scott Winship, who has worked at Pew and Brookings and now is a scholar at the Manhattan Institute, compared mobility for Americans born in the early 1960s and early 1980s. He found “that upward mobility from poverty to the middle class rose from 51 percent to 57 percent between the early-'60s cohorts and the early-'80s ones. Rather than assert that mobility has increased, I want to simply say—at this stage of my research (which is ongoing)—that it has not declined.”

As Winship told me in a 2012 interview, “You can be concerned that there’s not enough [economic] mobility or enough opportunity, but you don’t have to also believe that things are getting worse.” Winship also underscored what is clear from the past 50 years or more: It’s actually incredibly hard to figure out exactly how to increase mobility rates.

Tonight, don’t expect President Obama to cite any research showing that mobility has remained constant. Instead, expect him to echo his December speech, which was filled with lines about “a dangerous and growing inequality and lack of upward mobility that has jeopardized middle-class America’s basic bargain—that if you work hard, you have a chance to get ahead.”

From a political perspective, the erroneous but strategic conflation of inequality and mobility makes obvious sense. After all, if mobility is as alive and well as it has been in the post-war era, then the sense of urgency the president needs to sell any legislation is largely undercut. As important, constant mobility rates also make a mockery of the president’s long-preferred strategy of redistributing income from the top of the income ladder down to the lower rungs. Whether he’s talking to Joe the Plumber (god, that seems like a different planet, doesn’t it?) or addressing Congress, Obama rarely misses an opportunity to ask richer Americans to “do a little bit more.”

But as it stands, the United States already has one of the most progressive tax systems in the world. Even the liberals at Wonkblog grant that much. The real problem, they and others note, is that rather than give direct cash payments to the less well-off, the U.S. prefers to dole out favors via tax breaks that are far more likely to benefit the wealthy and not the middle or lower classes (think mortgage-interest deductions on not just one but two homes).

Don’t expect Obama to talk seriously about reining in tax breaks or reforming entitlements that benefit the wealthy even as he says they must pay “a bit more.” In fact, don’t expect anything new from tonight’s speech. This is a president who is long on revealed truth and exceptionally short of wisdom borne out of his experience in office.

Instead, get ready for a long list of calls to maintain and increase many programs that have been in place since before Obama took office: extending unemployment benefits (without paying for them by, say, cutting defense spending), making it easier for people to buy or stay in homes whose prices are inflated by government policies, increasing access to higher education in ways that continue to push prices far above the rate of inflation, and pumping more money into a broken K-12 education system whose per-pupil costs rise as results stay flat (certainly the president won’t call for giving parents and children the right to choose their own schools).

In short, expect Obama to invoke income inequality and supposed declines in upward mobility as a way of maintaining a status quo that has managed to increase inequality without affecting mobility rates.

The upside to tonight’s speech? We all will have forgotten it by the weekend, when we still might be talking about the Grammys’ mass weddings.

Nick Gillespie is the editor in chief of Reason.com and Reason TV and the co-author of The Declaration of Independents: How Libertarian Politics Can Fix What's Wrong With America, just out in paperback.

George Lakoff on Communication: "Liberals Do Everything Wrong"

Progressives have got it wrong — and if they don't start to get it right, the conservatives will maintain the upper hand.
Photo Credit: By Pop!Tech (Flickr: Pop!Tech 2008 - George Lakoff) [CC-BY-2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons
 
"The progressive mindset is screwing up the world. The progressive mindset is guaranteeing no progress on global warming. The progressive mindset is saying, 'Yes, fracking is fine.' The progressive mindset is saying, 'Yes, genetically modified organisms are OK', when, in fact, they're horrible, and the progressive mindset doesn't know how to describe how horrible they are. There's a difference between progressive morality, which is great, and the progressive mindset, which is half OK and half awful."

George Lakoff, professor of cognitive science at the University of California, Berkeley, has been working on moral frames for 50 years. In Thinking Points: Communicating Our American Values and Vision, he gives this précis: "Framing is not primarily about politics or political messaging or communication. It is far more fundamental than that: frames are the mental structures that allow human beings to understand reality – and sometimes to create what we take to be reality. But frames do have an enormous bearing on politics … they structure our ideas and concepts, they shape the way we reason … For the most part, our use of frames is unconscious and automatic."

Lakoff is affable and generous. In public meetings he greets every question with: "That is an extremely good question." But he cannot keep the frustration out of his voice: the left, he argues, is losing the political argument – every year, it cedes more ground to the right, under the mistaken impression that this will bring everything closer to the centre. In fact, there is no centre: the more progressives capitulate, the more boldly the conservatives express their vision, and the further to the right the mainstream moves. The reason is that conservatives speak from an authentic moral position, and appeal to voters' values. Liberals try to argue against them using evidence; they are embarrassed by emotionality. They think that if you can just demonstrate to voters how their self-interest is served by a socially egalitarian position, that will work, and everyone will vote for them and the debate will be over. In fact, Lakoff asserts, voters don't vote for bald self-interest; self-interest fails to ignite, it inspires nothing – progressives, of all people, ought to understand this.

When he talks about the collapse of the left, he clearly doesn't mean that those parties have disintegrated: they could be in government, as the Democrats are in the US. But their vision of progressive politics is compromised and weak. So in the UK there have been racist "Go home" vans and there is an immigration bill going through parliament, unopposed, that mandates doctors, the DVLA, banks and landlords to interrogate the immigration status of us all; Hungary has vigilante groups attacking Roma, and its government recently tried to criminalise homelessness; the leaders of the Golden Dawn in Greece have only just been arrested, having been flirting with fascism since the collapse of the eurozone. We see, time and again, people in need being dehumanised, in a way that seems like a throwback to 60 or 70 years ago. Nobody could say the left was winning.

Lakoff predicted all this in Moral Politics, first published in 1996. In it, he warned that "if liberals do not concern themselves very seriously and very quickly with the unity of their own philosophy and with morality and the family, they will not merely continue to lose elections but will as well bear responsibility for the success of conservatives in turning back the clock of progress in America."
Since then, the left has cleaved moderately well to established principles around the politics of the individual – women are equal, racism is wrong, homophobia is wrong. But everything else: a fair day's work for a fair day's pay, the essential dignity of all humans, even if they're foreign people or young people, education as a public good, the natural world as a treasure rather than an instrument of our convenience, the existence of motives besides profit, the pointlessness and poison of privatisation, the profundity, worth and purpose of pooling resources … this stuff is an embarrassment to centre-left parties, even when they're in government, let alone when they're in opposition. When unions reference these ideas, they are dismissed as dinosaurs.


Breastfeeding Is Now Required By Law In The United Arab Emirates

Updated: 01/30/2014 11:59 pm EST
The Emirates' Federal National Council has passed a clause, part of their new Child Rights Law, requiring new moms to breastfeed their babies for two full years, The National reports. Now, men can sue their wives if they don't breastfeed.

According to The National, there was a "marathon debate" over the legislation, but it was ultimately decided that it is every child's right to be breastfed.
Research has found many benefits of breastfeeding for baby, from reducing the risk of obesity to better language and motor development.

However, not all new moms are able to nurse. In those instances, if a woman is prevented from breastfeeding for health reasons, the council will provide her with a wet nurse. It's unclear exactly how a mother's inability to breastfeed will be determined, though. Carrie Murphy at Mommyish raises some additional questions about those exceptions:
Where do the wet nurses come from? Do they live with UAE women and their families? How and who determines if you need one? Who pays their salary? .... And what about formula? Will it be sold in the country? Will it be contraband? Will you need a prescription for it? Some babies actually need formula rather than breast milk and some babies can’t digest anything with milk at all, either formula OR breast milk.
Council members are trying to improve rights for working moms to make the legislation more practical. But, unsurprisingly, mothers' support groups have raised issues that go beyond logistics.

Because breastfeeding is universally accepted as the healthiest option for moms and babies, new mothers already face great pressure to nurse, explains Out of the Blues, a Dubai-based group whose purpose is to help women suffering from postnatal illness, many of whom have trouble breastfeeding.
In an article, also in The National, Out of the Blues writes:
New mothers are extremely vulnerable and need more support, encouragement and education. It is our opinion that, while encouraging women to breastfeed is a laudable aim, it is by supporting those who can and want to breastfeed, and not by punishing those who can’t, that we will reap the benefits we all want to see in our society.
Here in the U.S., breastfeeding has never been legally required, but politicians have intervened to increase the number of babies being breastfed. In 2012, former NYC Mayor Bloomberg introduced "Latch On NYC," a program that encouraged hospitals to make it difficult for new moms to obtain formula "goody bags." Instead of being handed traditional take-home bottles, mothers had to request formula like medication and listen to a lecture from hospital staff discouraging formula feeding unless absolutely necessary.

At the time, the initiative faced its own backlash. Many argued that Bloomberg's tactics would make mothers feel guilty, and as blogger Lenore Skenazy put it, "suck the choice out of parenting."

Marie-Claire Bakker, a member of U.S.-based breastfeeding support group, La Leche League, has echoed those sentiments in response to the Emirates' new legislation. “At this vulnerable time, to think of criminalising a new mother who, for whatever reason, is struggling with breastfeeding is not helpful ... She needs informed support, not threats," she told the National.

We Need To Regulate Guns, Not Women's Bodies


This past week marked the forty-first anniversary of Roe v. Wade, the Supreme Court decision that expanded the right to privacy to include a woman’s right to an abortion. It also marked yet another tragic shooting, this one at a mall in Columbia, Md., that left three dead, including the gunman, who committed suicide. On the surface, it may seem like abortion and gun violence don’t have anything in common, but the way these issues have historically been framed — abortion as murder and the right to bear arms as essential — reflects how tightly we clutch our guns and Bibles in an effort to maintain founding principles, ones whose merit should be challenged based on our ever-evolving society.
If there’s anything that needs comprehensive reform, it’s current gun laws — not abortion rights.
Our nation is no stranger to gun violence — in fact, it’s often our bedfellow, and the facts are startling. In the first nine months of 2013, there were six mass shootings in the United States; at least 20 have occurred since President Obama was first elected. What’s even more startling is that half of the deadliest shootings in our nation have occurred since 2007.

After 26 people — including 20 children — were slaughtered at Sandy Hook Elementary School in Newtown, Connecticut in December 2012, the media, public and, to a degree, political response had the same message: it’s time for comprehensive gun control laws to be enacted. But we hear the same message after every mass shooting — we heard it after the shootings at Columbine, Virginia Tech, Aurora, Tucson, the Washington Navy Yard, the Sikh Temple, the Seattle coffee shop, Fort Hood and more. While mass shootings are the form of gun violence that most frequently dominates the media, it’s important to note that a plethora of U.S. cities have higher rates of gun violence than entire nations.

Given the consistent gun violence in our country, it’s getting harder and harder to support the Second Amendment without recognizing that it badly needs to be reevaluated. As Harvey Wasserman wrote for The Huffington Post in 2011 after the attempted assassination of Rep. Gabby Giffords (D-AZ), the Second Amendment “is granted only in the context of a well-regulated militia and thus the security of a free state.” He uses the National Guard as an example of this kind of organization, not armed individuals seeking personal retribution. Wasserman argues that the “murder and mayhem” that has escalated “has been made possible by the claim to a Constitutional right that is not there,” meaning that outside the ‘well-regulated militia,’ the Second Amendment holds little relevancy, especially in our current society.

If you take all this into consideration, paired with the fact that nine in 10 Americans support expanded background checks on guns, it’s logical to assume something is being done to curb gun violence, right?

Wrong. Just over a month ago, The Huffington Post reported that despite proposed legislation, including a bill that would have enacted comprehensive background checks, Congress has not passed any gun control laws since the Sandy Hook shooting in 2012.
 
In a stark paradox, the Guttmacher Institute reported that in the first half of 2013, 43 state provisions were enacted that restrict access to abortion. By the end of 2013, NARAL Pro-Choice America reports, a total of 53 anti-choice measures had been enacted across 24 states, despite a recent poll that found Americans support Roe’s landmark decision. You can’t debate the facts, and if these numbers are any indication of our legislators’ priorities, they’re more afraid of women with reproductive rights than of a nation with lax gun control laws.

Since Roe was decided, the anti-choice movement has been carefully executing a plan to turn back the clock on women’s reproductive rights. This effort can best be described as an assault on women’s bodies but is often presented by the anti-choice community as a “safety concern.” It’s a known fact that abortion is one of the safest medical procedures, but opponents have created myths to drum up support for their restrictions, like abortion being linked to breast cancer (it isn’t) and a fetus being able to feel pain at 20 weeks (it can’t), in an all-out attempt to frame abortions as unsafe, murderous and morally negligent. Statistics prove that restricting abortion is what makes it unsafe, but facts, shmacts — who needs them when we’ve got all-male Congressional panels debating women’s health care (hint: not this guy)?

It seems pretty asinine that we live in a country whose state legislators are more likely to restrict abortion than enact comprehensive gun control laws (I’m looking at you, Rick Perry). Take Texas, for example. Last summer, the state imposed catastrophic abortion restrictions under the omnibus anti-choice bill HB 2 that have closed clinics and left patients to travel hundreds of miles to get an abortion. Texas is currently in a reproductive state of emergency but gun rights advocates fear not, because it’s easier to get a gun than an abortion.

How’s that for warped logic, especially when you consider that family planning and women’s reproductive services are good for the economy? Restricting access to reproductive health care can force women to carry unwanted or unintended pregnancies to term, thus limiting their ability to fully participate in the workforce and contribute to the economy. A woman’s right to bodily autonomy and agency surrounding her decisions about pregnancy isn’t just good economics, it’s necessary as we build a more gender-equal society. Viewing women as primarily baby-making receptacles does nothing for equality. If anything, it promotes the idea that a woman’s place is in the kitchen, not the workforce.

Efforts that restrict access to abortion and reproductive rights but uphold dangerous gun laws magnify the disconnect between public opinion and actual representation, and reflect the desperation of status quo stakeholders to maintain the three Ps: power, privilege and the patriarchy.
There’s no bigger threat to these structures than disrupting their history of power but our current social, political and economic environment calls for change. If we want to save more American lives, it’s not women’s bodies that need regulation — it’s gun laws.


by Nancy Owano
A woman wearing a face mask walks on an overpass in Beijing on January 16, 2014

(Phys.org) —A Friday report in Nature News covers a well-publicized topic, the air quality in Beijing. That may seem like rather old news, but the report carries new information on the city's troubling air quality. Scientists looking for information about the potential for pathogens and allergens in Beijing's air turned to "metagenomics" as their study tool. The research team described what they were seeking in their paper, "Inhalable Microorganisms in Beijing's PM2.5 and PM10 Pollutants during a Severe Smog Event," published in Environmental Science & Technology. "While the physical and chemical properties of PM pollutants have been extensively studied, much less is known about the inhalable microorganisms. Most existing data on airborne microbial communities using 16S or 18S rRNA gene sequencing to categorize bacteria or fungi into the family or genus levels do not provide information on their allergenic and pathogenic potentials. Here we employed metagenomic methods to analyze the microbial composition of Beijing's PM pollutants during a severe January smog event."

They took 14 air samples over seven consecutive days. Using genome sequencing and comparing the results against a large gene database, they found about 1,300 different microbial species in the heavy smog of early last year. What about their findings? Most of the microbes they found were benign, but a few were responsible for allergies and respiratory disease. As Nature News reported, the most abundant species identified was Geodermatophilus obscurus, a common soil bacterium. Streptococcus pneumoniae, which can cause pneumonia, was also part of the brew, along with Aspergillus fumigatus, a fungal allergen, and other bacteria typically found in faeces. "Our results," wrote the researchers, "suggested that the majority of the inhalable microorganisms were soil-associated and nonpathogenic to humans. Nevertheless, the sequences of several respiratory microbial allergens and pathogens were identified and their relative abundance appeared to have increased with increased concentrations of PM pollution."
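To make the method concrete, here is a minimal sketch of the kind of analysis the paper describes: tally sequencing reads per species, convert them to relative abundances for each air sample, and check whether a pathogen's share rises with PM2.5. All read counts and PM values below are hypothetical placeholders, not the study's data:

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical reads-per-taxon for three air samples (not the study's data)
samples = {
    "day1": {"G. obscurus": 9000, "S. pneumoniae": 40,  "A. fumigatus": 60},
    "day2": {"G. obscurus": 8500, "S. pneumoniae": 90,  "A. fumigatus": 110},
    "day3": {"G. obscurus": 8000, "S. pneumoniae": 160, "A. fumigatus": 180},
}
pm25 = [150.0, 300.0, 500.0]  # hypothetical PM2.5 readings (µg/m3), one per sample

taxon = "S. pneumoniae"
shares = [counts[taxon] / sum(counts.values()) for counts in samples.values()]
print(f"correlation with PM2.5: {correlation(pm25, shares):+.2f}")  # positive => tracks smog
```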

The authors suggested that their findings may provide an important reference for environmental scientists, health workers and city planners. The researchers also suggested, according to Nature News, that clinical studies explore signs of the same microbes in the sputum of patients with respiratory tract infections, to assess whether smoggier days lead to more infections.
Metagenomics, which can analyze a community's genetic material regardless of whether its member organisms can be cultured in the laboratory, is recognized as a powerful approach. Nature Reviews describes metagenomics as "based on the genomic analysis of microbial DNA that is extracted directly from communities in environmental samples."
Explore further: Delhi says air 'not as bad' as Beijing after smog scrutiny
More information: Environmental Science & Technology paper: pubs.acs.org/doi/abs/10.1021/es4048472

Nature News story: www.nature.com/news/beijing-smog-contains-witches-brew-of-microbes-1.14640
Journal reference: Environmental Science & Technology

Saturday, February 1, 2014

The State of Obama's Energy Thinking

“And when our children’s children look us in the eye,” said President Obama in his State of the Union address, “and ask if we did all we could to leave them a safer, more stable world, with new sources of energy, I want us to be able to say yes, we did.”

I do, too.

But the way to do this is the exact opposite of Obama’s prescribed policies of restricting fossil fuel use and giving energy welfare to producers of unreliable energy from solar and wind. Just consider the last three decades.

When I was born 33 years ago, politicians and leaders were telling my parents’ generation the same thing we’re being told today: that for the sake of future generations, we need to end our supposed addiction to fossil fuels.

Fossil fuels, they explained, were a fast-depleting resource that would inevitably run out. Fossil fuels, they explained, would inevitably make our environment dirtier and dirtier. And of course, fossil fuels would change the climate drastically and for the worse. It was the responsibility of my parents’ generation, they were told, to drastically restrict fossil fuels and live on solar and wind energy.
My generation should be eternally grateful they did the exact opposite.

Since 1980, when I was born, fossil fuel use here and around the world has only grown. In the U.S., between 1980 and 2012 the consumption of oil increased 3.9%, the consumption of natural gas increased 28.5%, and the consumption of coal increased 12.6%. (Had it not been for the economic downturn these numbers would be higher.)

Globally, oil consumption increased 38.6%, natural gas increased 130.5%, and coal increased 106.8%. (Source: BP Statistical Review of World Energy, June 2013)

My parents’ generation was told to expect disastrous consequences.

In 1980, the “Global 2000 Report to the President” wrote: “If present trends continue, . . . the world in 2000 will be more crowded, more polluted, less stable ecologically, and more vulnerable to disruption than the world we live in now. Serious stresses involving population, resources, and environment are clearly visible ahead.”

In 1989, the New Yorker’s star journalist Bill McKibben, claiming to represent a scientific consensus, prophesied:
We stand at the end of an era—the hundred years’ binge on oil, gas, and coal…The choice of doing nothing—of continuing to burn ever more oil and coal—is not a choice, in other words. It will lead us, if not straight to hell, then straight to a place with a similar temperature.
Al Gore, just before becoming Vice President, said the use of fossil fuels put us on the verge of an “ecological holocaust.”

What actually happened? Thanks in large part to our increasing use of fossil fuel energy and the technologies they power, life has gotten a lot better across the board for billions of people around the globe.

Life expectancy is way up. World life expectancy at birth was just 63 years in 1980; it has since risen to over 70. The US, already far above average at 73 in 1980, today enjoys an average life expectancy of 78. Mankind's infant mortality rate is less than half of what it was in 1980 (down from 80 to 35 per 1,000 live births).

Malnutrition and undernourishment have plummeted. Access to electricity, a proxy for development and health, is constantly increasing. Access to improved water sources, a necessity for basic hygiene and human health, has reached ever-increasing portions of mankind, especially in poor countries (1990 = 76%, 2012 = 89%).

GDP per capita has constantly increased. The percentage of people who live on $2 a day in South Asia, a region with massive fossil fuel use, has been steadily on the decline from 87% in the early 1980s to 67% in 2010. (Source: World Bank Data.)

And then there is the statistic that the climate doomsayers will never acknowledge. Thanks to energy and technology, climate-related deaths have been plummeting for decades: you are 50X less likely to die from a climate-related cause (heat wave, cold snap, drought, flood, storm) than you would have been 80 years ago.

We are the longest-living, best-fed, healthiest, richest generation of humans on this planet and there is unending potential for further progress.

If, that is, we don’t listen to the anti-fossil-fuel “experts.”

Here’s a scary story. In the 1970s, arguably the most revered intellectuals on energy and environment were men named Amory Lovins and Paul Ehrlich. (They are still revered, which, for reasons that will become apparent within a paragraph, is a moral crime.)

In 1977, Lovins, considered an energy wunderkind for his supposedly innovative criticisms of fossil fuels and his support of solar power and reduced energy use, explained that we already used too much energy. And in particular, the kind of energy we least needed was . . . electricity: “[W]e don’t need any more big electric generating stations. We already have about twice as much electricity as we can use to advantage.”

Environmentalist legend Paul Ehrlich had famously declared “The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.” Thanks in large part to a surge in energy use that led to a massive alleviation of global hunger, that prediction did not come to pass. But it might have, had we followed Ehrlich’s advice: “Except in special circumstances, all construction of power generating facilities should cease immediately, and power companies should be forbidden to encourage people to use more power. Power is much too cheap. It should certainly be made more expensive and perhaps rationed, in order to reduce its frivolous use.”

Had we listened to these two “experts,” billions of people—ultimately, every single person alive—would be worse off today. Or dead. You would certainly not be reading this on an “unnecessary” electronic device connecting to the electricity-hungry Internet.

This is what is at stake with energy. And while President Obama wasn’t extreme enough in his recent rhetoric for many environmentalist groups, he is on record as calling for outlawing 80% of fossil fuel use, which would mean not only prohibiting future power plants, but shutting down existing ones.

Whether he knows it or not, every time he attacks fossil fuels, President Obama is working to make his grandchildren’s world much, much worse.

Alex Epstein, an energy philosopher, debater, and communications consultant, is Founder and President of the Center for Industrial Progress, head of the I Love Fossil Fuels Campaign, and author of Fossil Fuels Improve the Planet, “The Moral Case for Fossil Fuels: The Key to Winning Hearts and Minds,” and the forthcoming book The Case for Fossil Fuels (Penguin/Portfolio, 2014). Contact him here.

A scientific first: Team engineers photograph radiation beams in human body through Cherenkov effect


Long exposure image of Cherenkov emission and induced fluorescence from Fluorescein dissolving in water during irradiation from a therapeutic LINAC beam. Credit: Adam Glaser
Jan 23, 2014
 
A scientific breakthrough may give the field of radiation oncology new tools to increase the precision and safety of radiation treatment in cancer patients by helping doctors "see" the powerful beams of a linear accelerator as they enter or exit the body.

We don't have X-ray vision. When we have an X-ray or mammogram, we cannot see the beam that passes through our bone or soft tissue, and neither can our doctor. But what if we could see X-rays? When powerful X-rays are used for cancer treatment, we could watch how they hit the tumor. If we were off target, we could stop and make adjustments to improve accuracy. Pinpoint precision is important: the goal of radiation is to kill cancer cells without harming healthy tissue.

Safety in Radiation Oncology
As a way to make radiation safer and better, Dartmouth began to investigate a scientific phenomenon called the Cherenkov effect in 2011. Our scientists and engineers theorized that by using Cherenkov emissions, the beam of radiation could be made "visible" to the treatment team. The ability to capture an image of the beam would show:
  • how the radiation signals travel through the body
  • the dose of radiation to the skin
  • any errors in dosage.
TV viewers may have seen images of sunken fuel rods from a nuclear power plant emitting a blue-green glow. That is the Cherenkov effect. When a particle with an electric charge travels through a non-conducting medium, like the human body or water, faster than light itself travels through that medium, the medium glows: as the matter relaxes from the polarization the particle induces, it emits light. (Yes, for a brief period people glow during radiation.)
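The condition can be made quantitative: light moves through a medium of refractive index n at speed c/n, so a charged particle radiates once it outruns c/n, and for an electron that translates into a kinetic-energy threshold. A minimal sketch, assuming n ≈ 1.33 (water; soft tissue is broadly similar):

```python
import math

ELECTRON_REST_MEV = 0.511  # electron rest energy in MeV

def cherenkov_threshold_mev(n: float) -> float:
    """Kinetic energy above which an electron emits Cherenkov light in a medium of index n."""
    beta = 1.0 / n                            # threshold speed as a fraction of c
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)  # relativistic factor at that speed
    return ELECTRON_REST_MEV * (gamma - 1.0)

print(f"{cherenkov_threshold_mev(1.33):.2f} MeV")  # ~0.26 MeV
```

Therapeutic LINAC beams put electrons (and X-ray-generated secondary electrons) at several MeV, comfortably above that threshold, which is why the treatment field glows at all.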

The Cherenkov effect in the laboratory
As a first step, engineers at the Thayer School of Engineering at Dartmouth modified a regular camera with a night vision scope to take photos of radiation beams as they passed through water. What appeared on the photos is the Cherenkov effect, a luminescent blue glow. (An engineering student, Adam Glaser, explains how it works in this video.)

To refine the approach for use in radiation treatment, scientists used a mannequin of the human body. They measured and studied the results to refine their ability to capture the luminescence, experimenting with beam size, position, and wavelength.

Cherenkov imaging used for first time in treatment setting
A radiation beam treatment is visualized here in the first in-human use of the technique. The blue represents the treatment area. As the dose fades, the treatment area becomes a dark gray shadow.

With the clinical aspects refined, Geisel School of Medicine researchers photographed luminescence during the routine radiation treatment of a dog with an oral tumor.

This was the first time Cherenkov studies came out of the laboratory and into a treatment setting. The scientists coined the approach Cherenkoscopy. As they anticipated, during the session they were able to see detailed information about the treatment field and the dose. The results were published in the November 2013 issue of the Journal of Biomedical Optics.

"This first observation in the dog proved that we could image a during treatment in real time," said David Gladstone, ScD, chief of Clinical Physics at Norris Cotton Cancer Center. "The images verified the shape of the beam as well as intended motion of the treatment machine."

First image of Cherenkov emissions during treatment of human breast
Now ready to use the technology with a human patient, the team prepared to view radiation as it entered the body of a female breast cancer patient undergoing radiation in July 2013.

"Breast cancer is suited for this because the imaging visualizes the superficial dose of radiation to the skin," said Lesley A. Jarvis, MD, radiation oncologist, Norris Cotton Cancer Center. Skin reactions, similar to sunburn, are a common and bothersome side effect during breast radiation. "By imaging and quantitating the surface dose in a way that has never been done before," said Jarvis, "we hope to learn more about the physical factors contributing to this skin reaction."

By seeing the effect of radiation on the body, radiation oncologists and physicists can make adjustments to avoid side effects to the skin. Most radiation patients undergo somewhere between 8 and 20 sessions. The Cherenkov images of the breast cancer patient showed a hot spot in her underarm, which physicians and physicists could work to prevent in future sessions.

"The actual images show that we are treating the exact correct location, with the appropriate beam modifications and with the precise dose of radiation," said Jarvis.

Clinical use of Cherenkov emissions proves successful
This trial showed that the Cherenkov effect is feasible for real-time use during radiation. "We have learned the imaging is easy to incorporate into the patient's treatment, adding only minimal time to the treatments," said Jarvis.

"The time needed to acquire this information is negligible, even with our experimental, non-integrated system," said Gladstone. "Cherenkov images were found to contain much richer information than anticipated, specifically, we did not expect to visualize internal blood vessels."

Mapping blood vessels opens up opportunities to use a person's internal anatomy to confirm precise locations. Skin tattoos on patients determine a preliminary alignment that is verified with X-rays, which show bony anatomy or implanted markers. Cherenkov imaging allows technicians to visualize soft tissue and internal vasculature as well.

A possible safety net for radiation treatment
By integrating Cherenkov imaging into routine clinical care, Gladstone says the technology could be used to verify that the proper dose is being delivered to patients, helping to avoid misadministration of radiation therapy, a rare, but dangerous occurrence.

Twelve patients are participating in a pilot study, which is almost complete. The research team plans to publish the results in a peer-reviewed journal. The Cherenkov effect project team includes Lesley Jarvis, MD, assistant professor of Medicine, Geisel School of Medicine; Brian Pogue, PhD, professor of Engineering, Thayer School, professor of Physics & Astronomy, Dartmouth College, and professor of Surgery, Geisel School of Medicine; David J. Gladstone, ScD, DABMP, associate professor of Medicine, Geisel School of Medicine; Adam Glaser, engineering student; Rongxiao Zhang, physics student; and Whitney Hitchcock, medical student.

With each trial the team gathers more information on the utility of the approach. "Stay tuned, we are now planning more definitive studies to determine the impact of this new imaging technology to improve safety of radiation," said Jarvis.
Explore further: Balloon mis-positioning during prostate cancer treatment could affect success of radiation delivery

Journal reference: Journal of Biomedical Optics

New solar cell technology captures high-energy photons more efficiently

Jan 24, 2014 by Jared Sagoff
Most simple solar cells handle the bluish hues of the electromagnetic spectrum inefficiently. This is because blue photons — incoming particles of light that strike the solar cell — actually have excess energy that a conventional solar cell can’t capture.


Scientists at the U.S. Department of Energy's Argonne National Laboratory and the University of Texas at Austin have together developed a new, inexpensive material that has the potential to capture and convert solar energy—particularly from the bluer part of the spectrum—much more efficiently than ever before.

"Photons of different energies kick electrons up by different amounts," said University of Texas Professor Brian Korgel. "Some photons come in with more energy than the cell is optimized to handle, and so a lot of that energy is lost as heat."

Because of this limitation, scientists had originally believed that simple solar cells would never be able to convert more than about 34 percent of incoming sunlight to electricity. However, about a decade ago, researchers saw the potential for a single high-energy photon to stimulate multiple "excitons" (pairs of an electron and a positively-charged partner called a "hole") instead of just one. "This was a very exciting discovery, but we were still skeptical that we could get the electrons out of the material," Korgel said.
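A rough energy-budget illustration of why blue photons are the interesting case. The ~1.0 eV band gap below is an assumed, illustrative value for a copper-indium-selenide-like absorber, not a number from the paper:

```python
import math

EV_NM = 1239.84  # photon energy (eV) = 1239.84 / wavelength (nm)

def max_excitons(wavelength_nm: float, band_gap_ev: float) -> int:
    """Upper bound on excitons one photon could fund, at one band gap each."""
    return math.floor((EV_NM / wavelength_nm) / band_gap_ev)

print(max_excitons(450, 1.0))  # blue photon, ~2.8 eV -> energy for 2 excitons
print(max_excitons(900, 1.0))  # near-infrared photon -> only 1
```

A conventional cell turns that blue photon's extra ~1.8 eV into heat; multiexciton generation is the attempt to keep it as a second electron-hole pair instead.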

In their study, Korgel and his team used specialized spectroscopic equipment at Argonne's Center for Nanoscale Materials to look at multiexciton generation in copper indium selenide, a material closely related to another more commonly produced thin film that holds the record for the most efficient thin-film semiconductor. "This is one of the first studies done of multiple exciton generation in such a familiar and inexpensive material," said Argonne nanoscientist Richard Schaller.

"Argonne's spectroscopic techniques played a critical role in the detection of the multiexcitons," Korgel said. "These kinds of measurements can't be made many places."
In order to deposit thin films of the nanocrystalline material, the researchers used a process known as "photonic curing," which involves the split-second heating up and cooling down of the top layer of the material. This curing process not only prevents the melting of the glass that contains the nanocrystals, but also vaporizes organic molecules that inhibit multiple exciton extraction.

Although the study mostly proves that the efficiency boost provided by multiple exciton extraction is possible in mass-producible materials, the major hurdle will be to incorporate these materials into actual real-world devices.

"The holy grail of our research is not necessarily to boost efficiencies as high as they can theoretically go, but rather to combine increases in efficiency to the kind of large-scale roll-to-roll printing or processing technologies that will help us drive down costs," Korgel said.
Explore further: New study could lead to paradigm shift in organic solar cell research

Scientific Pride and Prejudice

Olimpia Zagnola
David Strumfels: Watch Jonathan Eisen wring his hands over this. What? Scientists are above cognitive biases, unlike ordinary mortals (who require them to function)? And they -- the NYT -- present no evidence! I have bad news for him. Cognitive bias is built into how the human brain functions. High intelligence and years of education and research do not make it just go away -- far from it: by creating the illusion of being above it, the Eisens and every other scientist in the world (including me) are that much more susceptible to it. The truth is, it should be Eisen presenting evidence that scientists really are that far superior. His very outrage suggests that he probably (albeit unconsciously) knows he hasn't a leg to stand on.

It is the cross-checking, competition, and self-correctiveness of science that allows it to make progress, not the perfect lack of biases and other human frailties among the people who do it. Eisen, and every scientist, should know this, know it by heart one might say, if at least not by mind.
End Strumfels

SCIENCE is in crisis, just when we need it most. Two years ago, C. Glenn Begley and Lee M. Ellis reported in Nature that they were able to replicate only six out of 53 “landmark” cancer studies. Scientists now worry that many published scientific results are simply not true. The natural sciences often offer themselves as a model to other disciplines. But this time science might look for help to the humanities, and to literary criticism in particular.

A major root of the crisis is selective use of data. Scientists, eager to make striking new claims, focus only on evidence that supports their preconceptions. Psychologists call this “confirmation bias”: We seek out information that confirms what we already believe. “We each begin probably with a little bias,” as Jane Austen writes in “Persuasion,” “and upon that bias build every circumstance in favor of it.”

Despite the popular belief that anything goes in literary criticism, the field has real standards of scholarly validity. In his 1967 book “Validity in Interpretation,” E. D. Hirsch writes that “an interpretive hypothesis” about a poem “is ultimately a probability judgment that is supported by evidence.” This is akin to the statistical approach used in the sciences; Mr. Hirsch was strongly influenced by John Maynard Keynes’s “A Treatise on Probability.”

However, Mr. Hirsch also finds that “every interpreter labors under the handicap of an inevitable circularity: All his internal evidence tends to support his hypothesis because much of it was constituted by his hypothesis.” This is essentially the problem faced by science today. According to Mr. Begley and Mr. Ellis’s report in Nature, some of the nonreproducible “landmark” studies inspired hundreds of new studies that tried to extend the original result without verifying if the original result was true. A claim is not likely to be disproved by an experiment that takes that claim as its starting point. Mr. Hirsch warns about falling “victim to the self-confirmability of interpretations.”

It’s a danger the humanities have long been aware of. In his 1960 book “Truth and Method,” the influential German philosopher Hans-Georg Gadamer argues that an interpreter of a text must first question “the validity — of the fore-meanings dwelling within him.” However, “this kind of sensitivity involves neither ‘neutrality’ with respect to content nor the extinction of one’s self.”
Rather, “the important thing is to be aware of one’s own bias.” To deal with the problem of selective use of data, the scientific community must become self-aware and realize that it has a problem. In literary criticism, the question of how one’s arguments are influenced by one’s prejudgments has been a central methodological issue for decades.

Sometimes prejudgments are hard to resist. In December 2010, for example, NASA-funded researchers, perhaps eager to generate public excitement for new forms of life, reported the existence of a bacterium that used arsenic instead of phosphorus in its DNA. Later, this study was found to have major errors. Even if such influences don’t affect one’s research results, we should at least be able to admit that they are possible.

Austen might say that researchers should emulate Mr. Darcy in “Pride and Prejudice,” who submits, “I will venture to say that my investigations and decisions are not usually influenced by my hopes and fears.” At least Mr. Darcy acknowledges the possibility that his personal feelings might influence his investigations.

But it would be wrong to say that the ideal scholar is somehow unbiased or dispassionate. In my freshman physics class at Caltech, David Goodstein, who later became vice provost of the university, showed us Robert Millikan’s lab notebooks for his famed 1909 oil drop experiment with Harvey Fletcher, which first established the electric charge of the electron.

The notebooks showed many fits and starts and many “results” that were obviously wrong, but as they progressed, the results got cleaner, and Millikan could not help but include comments such as “Best yet — Beauty — Publish.” In other words, Millikan excluded the data that seemed erroneous and included data that he liked, embracing his own confirmation bias.

Mr. Goodstein’s point was that the textbook “scientific method” of dispassionately testing a hypothesis is not how science really works. We often have a clear idea of what we want the results to be before we run an experiment. We freshman physics students found this a bit hard to take. What Mr. Goodstein was trying to teach us was that science as a lived, human process is different from our preconception of it. He was trying to give us a glimpse of self-understanding, a moment of self-doubt.
When I began to read the novels of Jane Austen, I became convinced that Austen, by placing sophisticated characters in challenging, complex situations, was trying to explicitly analyze how people acted strategically. There was no fancy name for this kind of analysis in Austen’s time, but today we call it game theory. I believe that Austen anticipated the main ideas of game theory by more than a century.

As a game theorist myself, how do I know I am not imposing my own way of thinking on Austen? I present lots of evidence to back up my claim, but I cannot deny my own preconceptions and training. As Mr. Gadamer writes, a researcher “cannot separate in advance the productive prejudices that enable understanding from the prejudices that hinder it.” We all bring different preconceptions to our inquiries, whether about Austen or the electron, and these preconceptions can spur as well as blind us.
Perhaps because of its self-awareness about what Austen would call the “whims and caprices” of human reasoning, the field of psychology has been most aggressive in dealing with doubts about the validity of its research. In an open email in September 2012 to fellow psychologists, the Nobel laureate Daniel Kahneman suggests that “to deal effectively with the doubts you should acknowledge their existence and confront them straight on, because a posture of defiant denial is self-defeating.”
Everyone, including natural scientists, social scientists and humanists, could use a little more self-awareness. Understanding science as fundamentally a human process might be necessary to save science itself.

Michael Suk-Young Chwe, a political scientist at U.C.L.A., is the author of “Jane Austen, Game Theorist.”

Princeton’s Nanomesh Triples Solar Cell Efficiency

December 26, 2012
Credit: Jumanji Solar

Researchers at Princeton University have used nanotechnology to develop a metallic mesh that increases the efficiency of organic solar cells nearly threefold.
The scientists published their findings in the journal Optics Express¹. The team was able to reduce reflectivity and capture more of the light that isn’t reflected. The resulting solar cell is thinner and less reflective, sandwiching plastic and metal with the nanomesh. The so-called “Plasmonic Cavity with Subwavelength Hole array,” or “PlaCSH,” reduces the potential for losing light and reflects only 4% of direct sunlight, leading to a 52% increase in efficiency compared to conventional organic solar cells.
Benefits of using the nanomesh

PlaCSH is capable of capturing large amounts of sunlight even when the sunlight is dispersed on cloudy days, which translates to an increase of 81% in efficiency in indirect lighting conditions. PlaCSH is up to 175% more efficient than traditional solar cells.

The mesh is only 30 nanometers thick and the holes in it are only 175 nm in diameter. This replaces the thicker, traditional top layer that is made out of indium-tin-oxide (ITO). Since this mesh is smaller than the wavelength of the light it’s trying to collect, it is able to exploit the bizarre way that light works in subwavelength structures, allowing the cell to capture light once it enters the holes in the mesh instead of letting much of it reflect away.
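The arithmetic behind the subwavelength claim is easy to check; the wavelengths below are just the standard visible band, not figures from the paper:

```python
# Every visible wavelength is larger than the 175 nm holes, so ordinary ray
# optics no longer describes how light meets the mesh (illustrative check only).
HOLE_NM = 175
for color, wavelength_nm in [("violet", 400), ("green", 550), ("red", 700)]:
    print(f"{color}: {wavelength_nm} nm is {wavelength_nm / HOLE_NM:.1f}x the hole diameter")
```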

The scientists believe the cells can be made cost-effectively and that the method should work for silicon and gallium arsenide solar cells as well.

References
  1. Chou, S. Y., et al., Optics Express, Vol. 21, Issue S1, pp. A60-A76 (2013), dx.doi.org/10.1364/OE.21.000A60

Researchers Discover a Simple Way to Increase Solar Cell Efficiency

January 3, 2014
By modifying the molecular structure of a polymer used in solar cells, an international team of researchers has increased solar cell efficiency by more than 30 percent.

Researchers from North Carolina State University and the Chinese Academy of Sciences have found an easy way to modify the molecular structure of a polymer commonly used in solar cells. Their modification can increase solar cell efficiency by more than 30 percent.

Polymer-based solar cells have two domains, consisting of an electron-acceptor and an electron-donor material. Excitons, the bound electron-hole pairs created when the cell absorbs light, carry that absorbed energy. To be harnessed effectively as an energy source, excitons must be able to travel quickly to the interface of the donor and acceptor domains and retain as much of the light’s energy as possible.

One way to increase solar cell efficiency is to adjust the difference between the highest occupied molecular orbital (HOMO) level of the polymer donor and the lowest unoccupied molecular orbital (LUMO) level of the acceptor, so that the exciton can be harvested with minimal loss. One of the most common ways to accomplish this is to add fluorine atoms to the polymer’s molecular backbone – a difficult, multi-step process that can increase the solar cell’s performance but carries considerable material fabrication costs.
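
For context, organic photovoltaics researchers often estimate the attainable open-circuit voltage from these two energy levels with an empirical rule of thumb – the widely cited Scharber relation, which is standard in the field but is not quoted in this paper:

$$ V_{oc} \approx \frac{1}{e}\left(\left|E_{\mathrm{HOMO}}^{\mathrm{donor}}\right| - \left|E_{\mathrm{LUMO}}^{\mathrm{acceptor}}\right|\right) - 0.3\ \mathrm{V} $$

Deepening the donor polymer’s HOMO widens this gap and raises the attainable voltage, which is precisely the effect reported for PBT-OP below.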

A team of chemists led by Jianhui Hou from the Chinese Academy of Sciences created a polymer known as PBT-OP from two commercially available monomers and one easily synthesized monomer. Wei Ma, a post-doctoral physics researcher from NC State and corresponding author on a paper describing the research, conducted the X-ray analysis of the polymer’s structure and the donor:acceptor morphology.

PBT-OP was not only easier to make than other commonly used polymers; a simple manipulation of its chemical structure also gave it a lower HOMO level than had been seen in other polymers with the same molecular backbone. PBT-OP showed an open-circuit voltage (the voltage available from a solar cell) of 0.78 volts, a 36 percent increase over the roughly 0.6-volt average of similar polymers.

According to NC State physicist and co-author Harald Ade, the team’s approach has several advantages. “The possible drawback in changing the molecular structure of these materials is that you may enhance one aspect of the solar cell but inadvertently create unintended consequences in devices that defeat the initial intent,” he says. “In this case, we have found a chemically easy way to change the electronic structure and enhance device efficiency by capturing a larger fraction of the light’s energy, without changing the material’s ability to absorb, create and transport energy.”

The researchers’ findings appear in Advanced Materials. The research was funded by the U.S. Department of Energy’s Office of Science (Basic Energy Sciences) and the Chinese Ministry of Science and Technology. Dr. Maojie Zhang synthesized the polymers; Xia Guo, Shaoqing Zhang and Lijun Huo from the Chinese Academy of Sciences also contributed to the work.

Publication: Maojie Zhang, et al., “An Easy and Effective Method to Modulate Molecular Energy Level of the Polymer Based on Benzodithiophene for the Application in Polymer Solar Cells,” Advanced Materials, 2013; doi: 10.1002/adma.201304631
Source: Tracey Peake, North Carolina State University
Image: Maojie Zhang, et al. doi: 10.1002/adma.201304631

To calculate long-term conservation pay off, factor in people

This is a village in Wolong, Sichuan Province, China, where many residents are paid for actions that help with conservation efforts. Credit: Jianguo
Paying people to protect their natural environment is a popular conservation tool around the world – but figuring out the return on that investment, for both people and nature, is a thorny problem, especially since such efforts typically stretch on for years.

"Short attention-span worlds with long attention-span problems" is how Xiaodong Chen, a former Michigan State University doctoral student now on faculty at the University of North Carolina-Chapel Hill sums it up.

Chen, with his adviser Jianguo "Jack" Liu, director of the MSU Center for Systems Integration and Sustainability (CSIS), and others, has developed a new way to evaluate and model the long-term effectiveness of conservation investments. Their achievement is to factor in not only ecological gains – such as more trees growing – but also the actions and reactions of people.

The paper, "Assessing the Effectiveness of Payments for Ecosystem Services: An Agent-Based Modeling Approach," appears in this week's online edition of Ecology and Society.

The paper examines payments for ecosystem services – the practice of paying people to perform tasks or engage in practices that aid conservation. The authors examined one of China's most sweeping examples – the National Forest Conservation Program, under which residents of Wolong Nature Reserve are paid to stop chopping down trees for timber and fuel wood.

Chen explained that the team tapped both social data and environmental information to create a computer model simulating how the policy would fare over many years under a variety of scenarios. Studies documenting land-cover change and panda habitat dynamics were merged with studies revealing how people were likely to behave if new households were formed or if incentives for conservation activities were varied.
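
To make the approach concrete, here is a minimal agent-based sketch of a payments-for-ecosystem-services simulation, written in Python. Every parameter and behavioral rule below (payment level, logging income, regrowth and loss rates) is an illustrative assumption, not a value or rule from the Wolong model:

    import random

    # Minimal agent-based sketch of a payments-for-ecosystem-services policy.
    # All parameters are illustrative assumptions, not values from the study.
    YEARS = 30
    PAYMENT = 1.0           # subsidy paid to a household that stops logging
    LOGGING_INCOME = 1.2    # income a household could earn from timber and fuel wood
    FOREST_REGROWTH = 0.02  # annual regrowth fraction when forest is left alone
    LOGGING_LOSS = 0.05     # annual forest loss attributable to logging households

    class Household:
        """One household deciding each year whether to enroll in the program."""
        def __init__(self):
            self.enrolled = False

        def decide(self):
            # Enroll when the subsidy beats expected logging income; the noise
            # term stands in for household-to-household heterogeneity.
            self.enrolled = PAYMENT + random.gauss(0, 0.3) > LOGGING_INCOME

    def simulate(n_households=100):
        forest = 1.0  # normalized forest cover
        households = [Household() for _ in range(n_households)]
        for _ in range(YEARS):
            for h in households:
                h.decide()
            loggers = sum(not h.enrolled for h in households)
            forest += FOREST_REGROWTH * forest
            forest -= LOGGING_LOSS * loggers / n_households
            forest = max(0.0, min(1.0, forest))
        return forest

    print(f"Forest cover after {YEARS} years: {simulate():.2f}")

The real model couples far richer demographic and habitat submodels, but the structure – individual decision rules feeding back into an environmental state over decades – is the same.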

"Usually studies are developed in either the social sciences or the natural sciences, and the importance of the other perspectives are not built into scientific exploration," Chen said. "We were able to develop this kind of simulation because of collaborative interdisciplinary research - by putting people with different backgrounds together."

He also said the model's ability to run scenarios about how a policy could work over decades is crucial, because many goals of such payment programs, like restoring wildlife habitat, can take decades to achieve. In the meantime, the actions of individuals living in the area can change.


Read more at: http://phys.org/news/2014-01-long-term-factor-people.html#jCp

Energy is the Key to Solving Income Inequality

Posted by David Holt in Electricity, featured, Jobs & Career Advice, People, Politics/Policy

What about the cost of energy?

According to the Bureau of Labor Statistics, in 2012 the average U.S. family spent over $4,600, or about 9 percent of its budget, to heat and power its home and fuel its vehicles. Families in the bottom fifth of income earners spent about $2,500 a year, or 12 percent of their annual budget – nearly 33 percent more of their budget share than the average family.
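
The arithmetic behind those shares is worth making explicit; a quick check in Python, using only the BLS figures quoted above:

    # Back-of-the-envelope check of the budget shares quoted above.
    avg_energy, avg_share = 4600, 0.09  # average family: ~$4,600, ~9% of budget
    low_energy, low_share = 2500, 0.12  # bottom fifth: ~$2,500, ~12% of budget

    print(f"implied average budget:    ${avg_energy / avg_share:,.0f}")  # ~$51,100
    print(f"implied low-income budget: ${low_energy / low_share:,.0f}")  # ~$20,800

    # "33 percent more of their budget" refers to the share, not the dollars:
    print(f"share gap: {low_share / avg_share - 1:.0%}")                 # ~33%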

Reference the chart to the left and you will find that low-income families spend two and a half times more on energy than on health services. Unlike food and housing, consumers cannot shop around for the lowest-cost energy. Bargains can be found in the supermarket, but prices at the pump scarcely vary from one station to the next. Conservation, similarly, is not an option when the choice is between driving to work and saving a gallon of gasoline.

A solution to remedying income inequality is tackling rising energy costs. The U.S. Energy Information Administration projects that the price of electricity will rise 13.6 percent and the price of gasoline 15.7 percent between now and 2040. Rising global demand, aging and insufficient energy infrastructure, and restrictive government policies all play a role in increasing costs. President Obama has the ability to reverse this trend and lessen the blow to all consumers.
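
Spread over the projection window, those cumulative increases correspond to modest annual rates; a quick calculation (assuming a 2014 base year, which the article does not state explicitly):

    # Annualize EIA's cumulative price projections to 2040.
    years = 2040 - 2014  # assumed 2014 base year
    for name, total in [("electricity", 0.136), ("gasoline", 0.157)]:
        annual = (1 + total) ** (1 / years) - 1
        print(f"{name}: {annual:.2%} per year")  # ~0.49% and ~0.56%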

Take the shale gas boom for example. Increasing access to private and state lands and sound state regulatory programs have boosted production of natural gas and led to a significant lowering of prices. IHS CERA predicted that the shale revolution lifted household income by more than $1,200 in 2012 through lower energy costs, more job opportunities and greater federal and state tax revenues.

Policy makers should promote responsible energy development with the knowledge that it will have a positive effect on even the most vulnerable. The president has the power to act. Permitting energy infrastructure, including the Keystone XL Pipeline; opening new offshore areas to oil and natural gas development; and finalizing the nuclear waste confidence rulemaking could transform the energy economy.

If policy makers want to take meaningful action to help our nation’s low income families, they must pursue actions that help lower – not raise – the cost of energy.

Cryogenics

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Cryogenics...