
Sunday, February 2, 2014

Breastfeeding Is Now Required By Law In The United Arab Emirates

Updated: 01/30/2014 11:59 pm EST
The Emirates' Federal National Council has passed a clause, part of their new Child Rights Law, requiring new moms to breastfeed their babies for two full years, The National reports. Now, men can sue their wives if they don't breastfeed.

According to The National, there was a "marathon debate" over the legislation, but it was ultimately decided that it is every child's right to be breastfed.
Research has found many benefits of breastfeeding for baby, from reducing the risk of obesity to better language and motor development.

However, not all new moms are able to nurse. In those instances, if a woman is prevented from breastfeeding for health reasons, the council will provide her with a wet nurse. It's unclear, though, exactly how a mother's ability to breastfeed will be determined. Carrie Murphy at Mommyish raises some additional questions about those exceptions:
Where do the wet nurses come from? Do they live with UAE women and their families? How and who determines if you need one? Who pays their salary? ... And what about formula? Will it be sold in the country? Will it be contraband? Will you need a prescription for it? Some babies actually need formula rather than breast milk and some babies can’t digest anything with milk at all, either formula OR breast milk.
Council members are trying to improve rights for working moms to make the legislation more practical. But, unsurprisingly, mothers' support groups have raised issues that go beyond logistics.

Because breastfeeding is widely regarded as the healthiest option for moms and babies, new mothers already face great pressure to nurse, explains Out of the Blues, a Dubai-based group whose purpose is to help women suffering from postnatal illness, many of whom have trouble breastfeeding.
In an article, also in The National, Out of the Blues writes:
New mothers are extremely vulnerable and need more support, encouragement and education. It is our opinion that, while encouraging women to breastfeed is a laudable aim, it is by supporting those who can and want to breastfeed, and not by punishing those who can’t, that we will reap the benefits we all want to see in our society.
                    
Here in the U.S., breastfeeding has never been legally required, but politicians have intervened to increase the number of babies being breastfed. In 2012, then-NYC Mayor Bloomberg introduced "Latch On NYC," a program that encouraged hospitals to stop handing new moms formula "goody bags." Instead of receiving traditional take-home bottles, mothers had to request formula as they would a medication and listen to a lecture from hospital staff discouraging formula feeding unless absolutely necessary.

At the time, the initiative faced its own backlash. Many argued that Bloomberg's tactics would make mothers feel guilty, and as blogger Lenore Skenazy put it, "suck the choice out of parenting."

Marie-Claire Bakker, a member of the U.S.-based breastfeeding support group La Leche League, has echoed those sentiments in response to the Emirates' new legislation. “At this vulnerable time, to think of criminalising a new mother who, for whatever reason, is struggling with breastfeeding is not helpful ... She needs informed support, not threats," she told The National.

We Need To Regulate Guns, Not Women's Bodies


This past week marked the forty-first anniversary of Roe v. Wade, the Supreme Court decision that expanded the right to privacy to include a woman’s right to an abortion. It also marked yet another tragic shooting, this one at a mall in Columbia, Md., that left three dead, including the gunman who committed suicide. On the surface, it may seem like abortion and gun violence don’t have anything in common but the way these issues have historically been framed — abortion as murder and the right to bear arms as essential — reflects how tightly we clutch our guns and Bibles in an effort to maintain founding principles, ones whose merit should be challenged based on our ever-evolving society.
If there’s anything that needs comprehensive reform, it’s current gun laws — not abortion rights.
Our nation is no stranger to gun violence — in fact, it’s often our bedfellow, and the facts are startling. In the first nine months of 2013 alone, there were six mass shootings in the United States, and at least 20 mass shootings have occurred since President Obama was first elected. What’s even more startling is that half of the deadliest shootings in our nation have occurred since 2007.

After 26 people — including 20 children — were slaughtered at Sandy Hook Elementary School in Newtown, Connecticut in December 2012, the response from the media, the public and, to a degree, politicians carried the same message: it’s time for comprehensive gun control laws to be enacted. But we hear the same message after every mass shooting — we heard it after the shootings at Columbine, Virginia Tech, Aurora, Tucson, the Washington Navy Yard, the Sikh Temple, the Seattle coffee shop, Fort Hood and more. While mass shootings are the form of gun violence that most frequently dominates the media, it’s important to note that a plethora of U.S. cities have higher rates of gun violence than entire nations.

Given the consistent gun violence in our country, it’s getting harder and harder to support the Second Amendment without recognizing that it badly needs to be reevaluated. As Harvey Wasserman wrote for The Huffington Post in 2011 after the attempted assassination of Rep. Gabrielle Giffords (D-AZ), the Second Amendment “is granted only in the context of a well-regulated militia and thus the security of a free state.” He uses the National Guard as an example of this kind of organization, not armed individuals seeking personal retribution. Wasserman argues that this kind of “murder and mayhem,” which has escalated, “has been made possible by the claim to a Constitutional right that is not there,” meaning that outside the ‘well-regulated militia,’ the Second Amendment holds little relevance, especially in our current society.

If you take all this into consideration, paired with the fact that nine in 10 Americans support expanded background checks on guns, it’s logical to assume something is being done to curb gun violence, right?

Wrong. Just over a month ago, The Huffington Post reported that despite several bills being proposed, including one that would have enacted comprehensive background checks, Congress has not passed any gun control laws since the Sandy Hook shooting in 2012.
 
In a stark paradox, the Guttmacher Institute reported that in the first half of 2013, 43 state provisions were enacted that restrict access to abortion. By the end of 2013, NARAL Pro-Choice America reported that a total of 53 anti-choice measures had been enacted across 24 states, despite a recent poll that found Americans support Roe’s landmark decision. You can’t debate the facts, and if these numbers are any indication of our legislators’ priorities, they suggest lawmakers are more afraid of women with reproductive rights than of a nation with lax gun control laws.

Since Roe was decided, the anti-choice movement has been carefully executing a plan to turn back the clock on women’s reproductive rights. This effort can best be described as an assault on women’s bodies but is often presented by the anti-choice community as a “safety concern.” It’s a known fact that abortion is one of the safest medical procedures, but opponents have created myths to drum up support for their restrictions, like abortion being linked to breast cancer (it isn’t) or a fetus being able to feel pain at 20 weeks (it can’t), in an all-out attempt to frame abortions as unsafe, murderous and morally negligent. Statistics prove that restricting abortion is what makes it unsafe, but facts, shmacts — who needs them when we’ve got all-male Congressional panels debating women’s health care (hint: not this guy)?

It seems pretty asinine that we live in a country whose state legislators are more likely to restrict abortion than enact comprehensive gun control laws (I’m looking at you, Rick Perry). Take Texas, for example. Last summer, the state imposed catastrophic abortion restrictions under the omnibus anti-choice bill HB 2, which have closed clinics and left patients to travel hundreds of miles to get an abortion. Texas is currently in a reproductive state of emergency, but gun-rights advocates need not worry, because it’s easier to get a gun there than an abortion.

How’s that for warped logic, especially when you consider that family planning and women’s reproductive services are good for the economy? Restricting access to reproductive health care can force women to carry unwanted or unintended pregnancies to term, limiting their ability to fully participate in the workforce and contribute to the economy. A woman’s right to bodily autonomy and agency surrounding her decisions about pregnancy isn’t just good economics; it’s necessary as we build a more gender-equal society. Viewing women as primarily baby-making receptacles does nothing for equality. If anything, it promotes the idea that a woman’s place is in the kitchen, not the workforce.

Efforts that restrict access to abortion and reproductive rights while upholding dangerous gun laws magnify the disconnect between public opinion and actual representation, and they reflect the desperation of status quo stakeholders to maintain the three Ps: power, privilege and the patriarchy.
There’s no bigger threat to these structures than disrupting their history of power, but our current social, political and economic environment calls for change. If we want to save more American lives, it’s not women’s bodies that need regulation — it’s gun laws.


by Nancy Owano
A woman wearing a face mask walks on an overpass in Beijing on January 16, 2014

(Phys.org) — A Friday report in Nature News addresses a well-publicized topic: air quality in Beijing. That may seem like old news, but the report contains new information on the city's troubling air. Scientists looking for information about the potential for pathogens and allergens in Beijing's air turned to "metagenomics" as their study tool. The research team described what they were seeking in their paper, "Inhalable Microorganisms in Beijing's PM2.5 and PM10 Pollutants during a Severe Smog Event," published in Environmental Science & Technology: "While the physical and chemical properties of PM pollutants have been extensively studied, much less is known about the inhalable microorganisms. Most existing data on airborne microbial communities using 16S or 18S rRNA gene sequencing to categorize bacteria or fungi into the family or genus levels do not provide information on their allergenic and pathogenic potentials. Here we employed metagenomic methods to analyze the microbial composition of Beijing's PM pollutants during a severe January smog event."

They took 14 air samples over seven consecutive days. Using genome sequencing, they found about 1,300 different microbial species in the heavy smog period of early last year, comparing their results against a large gene database. What about their findings? Most of the microbes they found were benign, but a few are responsible for allergies and respiratory disease. As Nature News reported, the most abundant species identified was Geodermatophilus obscurus, a common soil bacterium. Streptococcus pneumoniae, which can cause pneumonia, was also part of the brew, along with Aspergillus fumigatus, a fungal allergen, and other bacteria typically found in faeces. "Our results," wrote the researchers, "suggested that the majority of the inhalable microorganisms were soil-associated and nonpathogenic to humans. Nevertheless, the sequences of several respiratory microbial allergens and pathogens were identified and their relative abundance appeared to have increased with increased concentrations of PM pollution."

The authors suggested that their findings may provide an important reference for environmental scientists, health workers and city planners. The researchers also suggested, according to Nature News, that clinical studies explore signs of the same microbes in the sputum of patients with respiratory tract infections, to assess whether smoggier days lead to more infections.
Metagenomics, which can analyze the genetic material of a microbial community regardless of whether its member organisms can be cultured in the laboratory, is recognized as a powerful approach. Nature Reviews also describes metagenomics as "based on the genomic analysis of microbial DNA that is extracted directly from communities in environmental samples."
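
To make the abundance comparison above concrete, here is a minimal sketch, in Python, of how relative species abundance might be tallied from classified sequencing reads and compared between a clear day and a smog day. It is an illustration only, with made-up read counts; it is not the pipeline the Beijing team used.

# Illustrative sketch, not the study's pipeline: tally relative abundance of
# species from classified metagenomic reads and compare two samples.
from collections import Counter

def relative_abundance(read_classifications):
    """read_classifications: one species name per classified read."""
    counts = Counter(read_classifications)
    total = sum(counts.values())
    return {species: n / total for species, n in counts.items()}

# Hypothetical read classifications for two samples.
clear_day_reads = ["Geodermatophilus obscurus"] * 85 + ["Aspergillus fumigatus"] * 5
smog_day_reads = (["Geodermatophilus obscurus"] * 70
                  + ["Streptococcus pneumoniae"] * 10
                  + ["Aspergillus fumigatus"] * 20)

for label, reads in [("clear day", clear_day_reads), ("smog day", smog_day_reads)]:
    abundances = relative_abundance(reads)
    print(label, {s: round(a, 3) for s, a in abundances.items()})
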
More information: Environmental Science & Technology paper: pubs.acs.org/doi/abs/10.1021/es4048472

Nature News story: www.nature.com/news/beijing-smog-contains-witches-brew-of-microbes-1.14640
Journal reference: Environmental Science & Technology

Saturday, February 1, 2014

The State of Obama's Energy Thinking

“And when our children’s children look us in the eye,” said President Obama in his State of the Union address, “and ask if we did all we could to leave them a safer, more stable world, with new sources of energy, I want us to be able to say yes, we did.”

I do, too.

But the way to do this is the exact opposite of Obama’s prescribed policies of restricting fossil fuel use and giving energy welfare to producers of unreliable energy from solar and wind. Just consider the last three decades.

When I was born 33 years ago, politicians and leaders were telling my parents’ generation the same thing we’re being told today: that for the sake of future generations, we need to end our supposed addiction to fossil fuels.

Fossil fuels, they explained, were a fast-depleting resource that would inevitably run out. Fossil fuels, they explained, would inevitably make our environment dirtier and dirtier. And of course, fossil fuels would change the climate drastically and for the worse. It was the responsibility of my parents’ generation, they were told, to drastically restrict fossil fuels and live on solar and wind energy.
My generation should be eternally grateful they did the exact opposite.

Since 1980, when I was born, fossil fuel use here and around the world has only grown. In the U.S., between 1980 and 2012 the consumption of oil increased 3.9%, the consumption of natural gas increased 28.5%, and the consumption of coal increased 12.6%. (Had it not been for the economic downturn these numbers would be higher.)

Globally, oil consumption increased 38.6%, natural gas increased 130.5%, and coal increased 106.8%. (Source: BP Statistical Review of World Energy, June 2013)

My parents’ generation was told to expect disastrous consequences.

In 1980, the “Global 2000 Report to the President” wrote: “If present trends continue, . . . the world in 2000 will be more crowded, more polluted, less stable ecologically, and more vulnerable to disruption than the world we live in now. Serious stresses involving population, resources, and environment are clearly visible ahead.”

In 1989, the New Yorker’s star journalist Bill McKibben, claiming to represent a scientific consensus, prophesied:
We stand at the end of an era—the hundred years’ binge on oil, gas, and coal…The choice of doing nothing—of continuing to burn ever more oil and coal—is not a choice, in other words. It will lead us, if not straight to hell, then straight to a place with a similar temperature.
Al Gore, just before becoming Vice President, said the use of fossil fuels put us on the verge of an “ecological holocaust.”

What actually happened? Thanks in large part to our increasing use of fossil fuel energy and the technologies they power, life has gotten a lot better across the board for billions of people around the globe.

Life expectancy is way up. World life expectancy at birth was just 63 years in 1980. That number has increased to over 70. The US, already far above average at 73 in 1980, today enjoys an average life expectancy of 78. The infant mortality rate of mankind is less than half of what it used to be in 1980 (from 80 to 35 per 1,000 live births).

Malnutrition and undernourishment have plummeted. Access to electricity, a proxy for development and health, is constantly increasing. Access to improved water sources, a necessity for basic hygiene and human health, has reached ever-increasing portions of mankind, especially in poor countries (1990 = 76%, 2012 = 89%).

GDP per capita has constantly increased. The percentage of people who live on $2 a day in South Asia, a region with massive fossil fuel use, has been steadily on the decline from 87% in the early 1980s to 67% in 2010. (Source: World Bank Data.)

And then there is the statistic that the climate doomsayers will never acknowledge. Thanks to energy and technology, climate-related deaths have been plummeting for decades: you are 50X less likely to die from a climate-related cause (heat wave, cold snap, drought, flood, storm) than you would have been 80 years ago.

We are the longest-living, best-fed, healthiest, richest generation of humans on this planet and there is unending potential for further progress.

If, that is, we don’t listen to the anti-fossil-fuel “experts.”

Here’s a scary story. In the 1970s, arguably the most revered intellectuals on energy and environment were men named Amory Lovins and Paul Ehrlich. (They are still revered, which, for reasons that will become apparent within a paragraph, is a moral crime.)

In 1977, Lovins, considered an energy wunderkind for his supposedly innovative criticisms of fossil fuels and his support of solar power and reduced energy use, explained that we already used too much energy. And in particular, the kind of energy we least needed was . . . electricity: “[W]e don’t need any more big electric generating stations. We already have about twice as much electricity as we can use to advantage.”

Environmentalist legend Paul Ehrlich had famously declared “The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now.” Thanks in large part to a surge in energy use that led to a massive alleviation of global hunger, that prediction did not come to pass. But it might have, had we followed Ehrlich’s advice: “Except in special circumstances, all construction of power generating facilities should cease immediately, and power companies should be forbidden to encourage people to use more power. Power is much too cheap. It should certainly be made more expensive and perhaps rationed, in order to reduce its frivolous use.”

Had we listened to these two “experts,” billions of people—ultimately, every single person alive—would be worse off today. Or dead. You would certainly not be reading this on an “unnecessary” electronic device connecting to the electricity-hungry Internet.

This is what is at stake with energy. And while President Obama wasn’t extreme enough in his recent rhetoric for many environmentalist groups, he is on record as calling for outlawing 80% of fossil fuel use. Which would mean not only prohibiting future power plants, but shutting down existing ones.

Whether he knows it or not, every time he attacks fossil fuels, President Obama is working to make his grandchildren’s world much, much worse.

Alex Epstein, an energy philosopher, debater, and communications consultant, is Founder and President of the Center for Industrial Progress, head of the I Love Fossil Fuels Campaign, and author of Fossil Fuels Improve the Planet, “The Moral Case for Fossil Fuels: The Key to Winning Hearts and Minds,” and the forthcoming book The Case for Fossil Fuels (Penguin/Portfolio, 2014). Contact him here.

A scientific first: Team engineers photograph radiation beams in human body through Cherenkov effect


A scientific first: Team engineers photograph radiation beams in human body through Cherenkov effect. Long exposure image of Cherenkov emission and induced fluorescence from Fluorescein dissolving in water during irradiation from a therapeutic LINAC beam. Credit: Adam Glaser.

Jan 23, 2014
 
A scientific breakthrough may give the field of radiation oncology new tools to increase the precision and safety of radiation treatment in cancer patients by helping doctors "see" the powerful beams of a linear accelerator as they enter or exit the body.

We don't have X-ray vision. When we have an X-ray or a mammogram, we cannot detect the beam that passes through our bone or soft tissue, and neither can our doctors. But what if we could see X-rays? When powerful X-rays are used for cancer treatment, we could watch how they hit the tumor; if we were off target, we could stop and make adjustments to improve accuracy. Pinpoint precision is important: the goal of radiation is to kill cancer cells without harming healthy tissue.

Safety in Radiation Oncology
As a way to make radiation safer and better, Dartmouth began to investigate a scientific phenomenon called the Cherenkov effect in 2011. Our scientists and engineers theorized that by using Cherenkov emissions the beam of radiation could be made "visible" to the treatment team. The ability to capture an image of the beam would show:
  • how the radiation signals travel through the body
  • the dose of radiation to the skin
  • any errors in dosage.
TV viewers may have seen images of sunken fuel rods from a nuclear power plant emitting a blue-green glow. That is the Cherenkov effect. When an electrically charged particle travels through a non-conducting medium, such as water or the human body, faster than light travels in that medium, the medium becomes briefly polarized; as it relaxes, it emits light. (Yes, for a brief period people glow during radiation.)
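
For readers who want the underlying physics, the standard textbook relations for Cherenkov emission are sketched below; they are general results, not formulas taken from the Dartmouth paper.

\[
  v > \frac{c}{n} \quad\Longleftrightarrow\quad \beta n > 1, \qquad \beta = \frac{v}{c},
\]
\[
  \cos\theta_c = \frac{1}{\beta n},
\]
where $n$ is the refractive index of the medium (about 1.33 for water, slightly higher for soft tissue), $c$ is the speed of light in vacuum, and $\theta_c$ is the half-angle of the cone on which the light is emitted.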

The Cherenkov effect in the laboratory
As a first step, engineers at the Thayer School of Engineering at Dartmouth modified a regular camera with a night vision scope to take photos of radiation beams as they passed through water. What appeared on the photos is the Cherenkov effect, a luminescent blue glow. (An engineering student, Adam Glaser, explains how it works in this video.)

To refine the approach for use in radiation treatment, scientists used a mannequin of the human body. They measured and studied the results to refine their ability to capture the luminescence, experimenting with beam size, position, and wavelength.

Cherenkov imaging used for first time in treatment setting
   
A radiation beam treatment is visualized here in the first in-human use of the technique. The blue represents the treatment area. As the dose fades, the treatment area becomes a dark gray shadow. The lingering blue in the lower right hand area ...

With the clinical aspects refined, Geisel School of Medicine researchers photographed luminescence during the routine radiation treatment of a dog with an oral tumor.

This was the first time Cherenkov studies came out of the laboratory and into a treatment setting. The scientists coined the approach Cherenkoscopy. As they anticipated, during the session they were able to see detailed information about the treatment field and the dose. The results were published in the November 2013 issue of the Journal of Biomedical Optics.

"This first observation in the dog proved that we could image a during treatment in real time," said David Gladstone, ScD, chief of Clinical Physics at Norris Cotton Cancer Center. "The images verified the shape of the beam as well as intended motion of the treatment machine."

First image of Cherenkov emissions during treatment of human breast
Now ready to use the technology with a human patient, the team prepared to view radiation as it entered the body of a female breast cancer patient undergoing radiation in July 2013.

"Breast cancer is suited for this because the imaging visualizes the superficial dose of radiation to the skin," said Lesley A. Jarvis, MD, radiation oncologist, Norris Cotton Cancer Center. Skin reactions, similar to sunburn, are a common and bothersome side effect during breast radiation. "By imaging and quantitating the surface dose in a way that has never been done before," said Jarvis, "we hope to learn more about the physical factors contributing to this skin reaction."

By seeing the effect of radiation on the body, radiation oncologists and physicists can make adjustments to avoid side effects to the skin. Most radiation patients undergo somewhere between 8 and 20 sessions. The Cherenkov images of the breast cancer patient showed a hot spot in her underarm, which physicians and physicists could work to prevent in future sessions.

"The actual images show that we are treating the exact correct location, with the appropriate beam modifications and with the precise dose of radiation," said Jarvis.

Clinical use of Cherenkov emissions proves successful
This trial showed that the Cherenkov effect is feasible for real-time use during radiation treatment. "We have learned the imaging is easy to incorporate into the patient's treatment, adding only minimal time to the treatments," said Jarvis.

"The time needed to acquire this information is negligible, even with our experimental, non-integrated system," said Gladstone. "Cherenkov images were found to contain much richer information than anticipated, specifically, we did not expect to visualize internal blood vessels."

Mapping blood vessels opens up opportunities to use a person's internal anatomy to confirm precise treatment locations. Skin tattoos on patients determine a preliminary alignment that is verified with X-rays, which show bony anatomy or implanted markers. Cherenkov imaging allows technicians to visualize soft tissue and internal vasculature as well.

A possible safety net for radiation treatment
By integrating Cherenkov imaging into routine clinical care, Gladstone says the technology could be used to verify that the proper dose is being delivered to patients, helping to avoid misadministration of radiation therapy, a rare, but dangerous occurrence.

Twelve patients are participating in a pilot study, which is almost complete. The research team plans to publish the results in a peer reviewed journal. The Cherenkov effect project team includes Lesley Jarvis, MD, assistant professor of Medicine, Geisel School of Medicine; Brian Pogue, PhD, professor of Engineering, Thayer School, professor of Physics & Astronomy, Dartmouth College, professor of Surgery, Geisel School of Medicine; David J. Gladstone, ScD, DABMP associate professor of Medicine, Geisel School of Medicine; Adam Glaser, engineering student; Rongxiao Zhang, physics student; Whitney Hitchcock, medical school student.

With each trial the team gathers more information on the utility of the approach. "Stay tuned, we are now planning more definitive studies to determine the impact of this new imaging technology to improve safety of radiation," said Jarvis.

Journal reference: Journal of Biomedical Optics

New solar cell technology captures high-energy photons more efficiently

Jan 24, 2014 by Jared Sagoff
Most simple solar cells handle the bluish hues of the electromagnetic spectrum inefficiently. This is because blue photons — incoming particles of light that strike the solar cell — actually have excess energy that a conventional solar cell can’t capture.


Scientists at the U.S. Department of Energy's Argonne National Laboratory and the University of Texas at Austin have together developed a new, inexpensive material that has the potential to capture and convert solar energy—particularly from the bluer part of the spectrum—much more efficiently than ever before.

"Photons of different energies kick electrons up by different amounts," said University of Texas Professor Brian Korgel. "Some photons come in with more energy than the cell is optimized to handle, and so a lot of that energy is lost as heat."

Because of this limitation, scientists had originally believed that simple solar cells would never be able to convert more than about 34 percent of incoming sunlight to electricity. However, about a decade ago, researchers saw the potential for a single high-energy photon to stimulate multiple "excitons" (pairs of an electron and a positively-charged partner called a "hole") instead of just one. "This was a very exciting discovery, but we were still skeptical that we could get the electrons out of the material," Korgel said.

In their study, Korgel and his team used specialized spectroscopic equipment at Argonne's Center for Nanoscale Materials to look at multiexciton generation in copper indium selenide, a material closely related to another more commonly produced thin film that holds the record for the most efficient thin-film semiconductor. "This is one of the first studies done of multiple exciton generation in such a familiar and inexpensive material," said Argonne nanoscientist Richard Schaller.

"Argonne's spectroscopic techniques played a critical role in the detection of the multiexcitons," Korgel said. "These kinds of measurements can't be made many places."
In order to deposit thin films of the nanocrystalline material, the researchers used a process known as "photonic curing," which involves the split-second heating up and cooling down of the top layer of the material. This curing process not only prevents the melting of the glass that contains the nanocrystals, but also vaporizes organic molecules that inhibit multiple exciton extraction.

Although the study mostly proves that the efficiency boost provided by multiple exciton extraction is possible in mass-producible materials, the major hurdle will be to incorporate these materials into actual real-world devices.

"The holy grail of our research is not necessarily to boost efficiencies as high as they can theoretically go, but rather to combine increases in efficiency to the kind of large-scale roll-to-roll printing or processing technologies that will help us drive down costs," Korgel said.

Scientific Pride and Prejudice

David Strumfels:  Watch Jonathan Eisen wring his hands over this.  What? Scientists are above cognitive biases, unlike ordinary mortals (who require them to function)?  And they -- the NYT -- present no evidence!  I have bad news for him.  Cognitive bias is built into how the human brain functions.  High intelligence and years of education and research do not make it simply go away -- far from it: by creating the illusion of being above it, the Eisens and every other scientist in the world (including me) are that much more susceptible to it.  The truth is, it should be Eisen presenting evidence that scientists really are that far superior.  I point to his outrage as a sign that he probably (albeit unconsciously) knows he hasn't a leg to stand on.

It is the cross-checking, competition, and self-correctiveness of science that allow it to make progress, not the perfect lack of biases and other human frailties among the people who do it.  Eisen, and every scientist, should know this, know it by heart, one might say, if not by mind.
End Strumfels

SCIENCE is in crisis, just when we need it most. Two years ago, C. Glenn Begley and Lee M. Ellis reported in Nature that they were able to replicate only six out of 53 “landmark” cancer studies. Scientists now worry that many published scientific results are simply not true. The natural sciences often offer themselves as a model to other disciplines. But this time science might look for help to the humanities, and to literary criticism in particular.

A major root of the crisis is selective use of data. Scientists, eager to make striking new claims, focus only on evidence that supports their preconceptions. Psychologists call this “confirmation bias”: We seek out information that confirms what we already believe. “We each begin probably with a little bias,” as Jane Austen writes in “Persuasion,” “and upon that bias build every circumstance in favor of it.”

Despite the popular belief that anything goes in literary criticism, the field has real standards of scholarly validity. In his 1967 book “Validity in Interpretation,” E. D. Hirsch writes that “an interpretive hypothesis” about a poem “is ultimately a probability judgment that is supported by evidence.” This is akin to the statistical approach used in the sciences; Mr. Hirsch was strongly influenced by John Maynard Keynes’s “A Treatise on Probability.”

However, Mr. Hirsch also finds that “every interpreter labors under the handicap of an inevitable circularity: All his internal evidence tends to support his hypothesis because much of it was constituted by his hypothesis.” This is essentially the problem faced by science today. According to Mr. Begley and Mr. Ellis’s report in Nature, some of the nonreproducible “landmark” studies inspired hundreds of new studies that tried to extend the original result without verifying if the original result was true. A claim is not likely to be disproved by an experiment that takes that claim as its starting point. Mr. Hirsch warns about falling “victim to the self-confirmability of interpretations.”

It’s a danger the humanities have long been aware of. In his 1960 book “Truth and Method,” the influential German philosopher Hans-Georg Gadamer argues that an interpreter of a text must first question “the validity — of the fore-meanings dwelling within him.” However, “this kind of sensitivity involves neither ‘neutrality’ with respect to content nor the extinction of one’s self.”
Rather, “the important thing is to be aware of one’s own bias.” To deal with the problem of selective use of data, the scientific community must become self-aware and realize that it has a problem. In literary criticism, the question of how one’s arguments are influenced by one’s prejudgments has been a central methodological issue for decades.

Sometimes prejudgments are hard to resist. In December 2010, for example, NASA-funded researchers, perhaps eager to generate public excitement for new forms of life, reported the existence of a bacterium that used arsenic instead of phosphorus in its DNA. Later, this study was found to have major errors. Even if such influences don’t affect one’s research results, we should at least be able to admit that they are possible.

Austen might say that researchers should emulate Mr. Darcy in “Pride and Prejudice,” who submits, “I will venture to say that my investigations and decisions are not usually influenced by my hopes and fears.” At least Mr. Darcy acknowledges the possibility that his personal feelings might influence his investigations.

But it would be wrong to say that the ideal scholar is somehow unbiased or dispassionate. In my freshman physics class at Caltech, David Goodstein, who later became vice provost of the university, showed us Robert Millikan’s lab notebooks for his famed 1909 oil drop experiment with Harvey Fletcher, which first established the electric charge of the electron.

The notebooks showed many fits and starts and many “results” that were obviously wrong, but as they progressed, the results got cleaner, and Millikan could not help but include comments such as “Best yet — Beauty — Publish.” In other words, Millikan excluded the data that seemed erroneous and included data that he liked, embracing his own confirmation bias.

Mr. Goodstein’s point was that the textbook “scientific method” of dispassionately testing a hypothesis is not how science really works. We often have a clear idea of what we want the results to be before we run an experiment. We freshman physics students found this a bit hard to take. What Mr. Goodstein was trying to teach us was that science as a lived, human process is different from our preconception of it. He was trying to give us a glimpse of self-understanding, a moment of self-doubt.

When I began to read the novels of Jane Austen, I became convinced that Austen, by placing sophisticated characters in challenging, complex situations, was trying to explicitly analyze how people acted strategically. There was no fancy name for this kind of analysis in Austen’s time, but today we call it game theory. I believe that Austen anticipated the main ideas of game theory by more than a century.

As a game theorist myself, how do I know I am not imposing my own way of thinking on Austen? I present lots of evidence to back up my claim, but I cannot deny my own preconceptions and training. As Mr. Gadamer writes, a researcher “cannot separate in advance the productive prejudices that enable understanding from the prejudices that hinder it.” We all bring different preconceptions to our inquiries, whether about Austen or the electron, and these preconceptions can spur as well as blind us.

Perhaps because of its self-awareness about what Austen would call the “whims and caprices” of human reasoning, the field of psychology has been most aggressive in dealing with doubts about the validity of its research. In an open email in September 2012 to fellow psychologists, the Nobel laureate Daniel Kahneman suggests that “to deal effectively with the doubts you should acknowledge their existence and confront them straight on, because a posture of defiant denial is self-defeating.”

Everyone, including natural scientists, social scientists and humanists, could use a little more self-awareness. Understanding science as fundamentally a human process might be necessary to save science itself.

Michael Suk-Young Chwe, a political scientist at U.C.L.A., is the author of “Jane Austen, Game Theorist. ”

Princeton’s Nanomesh Triples Solar Cell Efficiency

December 26, 2012
 
Princeton’s Nanomesh Triples Solar Cell Efficiency
Credit: Jumanji Solar

Researchers at Princeton University have developed a nanoscale mesh that increases the efficiency of organic solar cells nearly threefold.
The scientists published their findings in the journal Optics Express¹. The team was able to reduce reflectivity and capture more of the light that isn’t reflected. The resulting solar cell is thinner and less reflective, sandwiching plastic and metal with the nanomesh. The so-called “Plasmonic Cavity with Subwavelength Hole array,” or “PlaCSH,” reduces the potential for losing light and reflects only 4% of direct sunlight, leading to a 52% increase in efficiency compared to conventional organic solar cells.
Benefits of using the nanomesh

PlaCSH is capable of capturing large amounts of sunlight even when the sunlight is dispersed on cloudy days, which translates to an increase of 81% in efficiency in indirect lighting conditions. PlaCSH is up to 175% more efficient than traditional solar cells.

The mesh is only 30 nanometers thick and the holes in it are only 175 nm in diameter. This replaces the thicker, traditional top layer that is made out of indium-tin-oxide (ITO). Since this mesh is smaller than the wavelength of the light it’s trying to collect, it is able to exploit the bizarre way that light works in subwavelength structures, allowing the cell to capture light once it enters the holes in the mesh instead of letting much of it reflect away.

The scientists believe that the cells can be made cost effectively and that this method should work for silicon and gallium arsenide solar cells as well.

References
  1. Chou, S. Y., et al., Optics Express, Vol. 21, Issue S1, pp. A60-A76 (2013), dx.doi.org/10.1364/OE.21.000A60

Researchers Discover a Simple Way to Increase Solar Cell Efficiency

January 3, 2014
By modifying the molecular structure of a polymer used in solar cells, an international team of researchers has increased solar efficiency by more than thirty percent.

Researchers from North Carolina State University and the Chinese Academy of Sciences have found an easy way to modify the molecular structure of a polymer commonly used in solar cells. Their modification can increase solar cell efficiency by more than 30 percent.

Polymer-based solar cells have two domains, consisting of an electron acceptor and an electron donor material. Excitons are the energy particles created by solar cells when light is absorbed. In order to be harnessed effectively as an energy source, excitons must be able to travel quickly to the interface of the donor and acceptor domains and retain as much of the light’s energy as possible.

One way to increase solar cell efficiency is to adjust the difference between the highest occupied molecular orbital (HOMO) of the acceptor and the lowest unoccupied molecular orbital (LUMO) levels of the polymer so that the exciton can be harvested with minimal loss. One of the most common ways to accomplish this is by adding a fluorine atom to the polymer’s molecular backbone, a difficult, multi-step process that can increase the solar cell’s performance but has considerable material fabrication costs.
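
As a rough guide to why shifting these orbital energy levels matters, a widely cited empirical rule for polymer:fullerene cells (due to Scharber and colleagues) relates the open-circuit voltage to the donor HOMO and acceptor LUMO energies. It is offered here only as background; it is not a formula given in the article, and the team's own analysis may differ.

\[
  eV_{\mathrm{oc}} \approx \left|E_{\mathrm{HOMO}}^{\mathrm{donor}}\right| - \left|E_{\mathrm{LUMO}}^{\mathrm{acceptor}}\right| - 0.3\ \mathrm{eV}
\]

So, all else being equal, deepening (lowering) the donor polymer's HOMO raises the open-circuit voltage, which is consistent with the voltage gain reported for PBT-OP below.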

A team of chemists led by Jianhui Hou from the Chinese Academy of Sciences created a polymer known as PBT-OP from two commercially available monomers and one easily synthesized monomer. Wei Ma, a post-doctoral physics researcher from NC State and corresponding author on a paper describing the research, conducted the X-ray analysis of the polymer’s structure and the donor:acceptor morphology.

PBT-OP was not only easier to make than other commonly used polymers, but a simple manipulation of its chemical structure gave it a lower HOMO level than had been seen in other polymers with the same molecular backbone. PBT-OP showed an open circuit voltage (the voltage available from a solar cell) value of 0.78 volts, a 36 percent increase over the ~ 0.6 volt average from similar polymers.

According to NC State physicist and co-author Harald Ade, the team’s approach has several advantages. “The possible drawback in changing the molecular structure of these materials is that you may enhance one aspect of the solar cell but inadvertently create unintended consequences in devices that defeat the initial intent,” he says. “In this case, we have found a chemically easy way to change the electronic structure and enhance device efficiency by capturing a larger fraction of the light’s energy, without changing the material’s ability to absorb, create and transport energy.”

The researchers’ findings appear in Advanced Materials. The research was funded by the U.S. Department of Energy, Office of Science, Basic Energy Science and the Chinese Ministry of Science and Technology. Dr. Maojie Zhang synthesized the polymers; Xia Guo, Shaoqing Zhang and Lijun Huo from the Chinese Academy of Sciences also contributed to the work.

Publication: Maojie Zhang, et al., “An Easy and Effective Method to Modulate Molecular Energy Level of the Polymer Based on Benzodithiophene for the Application in Polymer Solar Cells,” Advanced Materials, 2013; doi: 10.1002/adma.201304631
Source: Tracey Peake, North Carolina State University
Image: Maojie Zhang, et al. doi: 10.1002/adma.201304631

To calculate long-term conservation pay off, factor in people

This is a village in Wolong, Sichuan Province, China, where many residents are paid for their actions which help with conservation efforts. Credit: Jianguo
Paying people to protect their natural environment is a popular conservation tool around the world – but figuring out the return on that investment, for both people and nature, is a thorny problem, especially since such efforts typically stretch on for years.

"Short attention-span worlds with long attention-span problems" is how Xiaodong Chen, a former Michigan State University doctoral student now on faculty at the University of North Carolina-Chapel Hill sums it up.

Chen, with his adviser Jianguo "Jack" Liu, director of the MSU Center for Systems Integration and Sustainability (CSIS), and others, has developed a new way to evaluate and model the long-term effectiveness of conservation investments. Their achievement is not only factoring in ecological gains – like more trees growing – but also putting the actions and reactions of people into the equation.

The paper, Assessing the Effectiveness of Payments for Ecosystem Services: an Agent-Based Modeling Approach, appears in this week's online edition of Ecology and Society.

The paper examines payments for ecosystem services – the practice of paying people to perform tasks or engage in practices that aid conservation. The authors examined one of China's most sweeping programs – the National Forest Conservation Program, in which residents in Wolong Nature Reserve are paid to stop chopping down trees for timber and fuel wood.

Chen explained they tapped into both social data and environmental information to be able to create a computer model to simulate how the policy would fare over many years in a variety of scenarios.
Studies documenting results on land cover change and panda habitat dynamics were merged with studies revealing how people were likely to behave if new households were formed or incentives for conservation activities were varied.

"Usually studies are developed in either the social sciences or the natural sciences, and the importance of the other perspectives are not built into scientific exploration," Chen said. "We were able to develop this kind of simulation because of collaborative interdisciplinary research - by putting people with different backgrounds together."

He also said the model's ability to run scenarios about how policy could work over decades is crucial because many conservation goals, like restoring wildlife habitat, can take decades to achieve. In the meantime, the actions of individuals living in the area can change.



Energy is the Key to Solving Income Inequality

Posted on by David Holt in Electricity, featured, Jobs & Career Advice, People, Politics/Policy   

What about the cost of energy?

According to the Bureau of Labor Statistics, in 2012 the average U.S. family spent over $4,600, or about 9 percent of its budget, to heat and power their homes and fuel their vehicles. Families in the bottom fifth of income earners spent about $2,500 a year, or roughly 12 percent of their annual budget, on energy, nearly 33 percent more of their budget than the average family.

Reference the chart to the left and you will find that low-income families spend two and a half times more on energy than on health services. Unlike with food and housing, consumers cannot shop around for the lowest-cost energy. Bargains can be found in the supermarket, but prices at the pump vary little from one station to the next. Conservation, similarly, is not an option when it’s a choice between driving to work and saving a gallon of gasoline.
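
A quick back-of-envelope check of those budget shares, in Python, using only the figures quoted above (treat the numbers as the article's, not fresh BLS output):

# Back-of-envelope check using the figures quoted in the paragraph above.
avg_energy, avg_share = 4600, 0.09        # average family: ~$4,600, ~9% of spending
low_energy, low_share = 2500, 0.12        # bottom income quintile: ~$2,500, ~12%

avg_total = avg_energy / avg_share        # implied total annual spending, average family
low_total = low_energy / low_share        # implied total annual spending, bottom quintile
extra_share = low_share / avg_share - 1   # how much larger the energy share is

print(f"Implied average-family spending: ~${avg_total:,.0f}")
print(f"Implied bottom-quintile spending: ~${low_total:,.0f}")
print(f"Energy takes about {extra_share:.0%} more of the bottom quintile's budget")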

A solution to remedying income inequality is tackling rising energy costs. The U.S. Energy Information Administration projects the price of electricity will rise 13.6 percent and the price of gasoline by 15.7 percent from now until 2040. Rising global demand, aging and insufficient energy infrastructure and restrictive government policies all play a role in increasing costs. President Obama has the ability to reverse this trend and lessen the blow to all consumers.

Take the shale gas boom, for example. Increasing access to private and state lands and sound state regulatory programs have boosted production of natural gas and led to a significant lowering of prices. IHS CERA estimated that the shale revolution lifted household income by more than $1,200 in 2012 through lower energy costs, more job opportunities and greater federal and state tax revenues.

Policy makers should promote responsible energy development with the knowledge that it will have a positive effect on even the most vulnerable. The president has the power to act. Permitting energy infrastructure – including the Keystone XL pipeline – opening new offshore areas to oil and natural gas development, and finalizing the nuclear waste confidence rulemaking could transform the energy economy.

If policy makers want to take meaningful action to help our nation’s low income families, they must pursue actions that help lower – not raise – the cost of energy.

What Actually Happens While You Sleep and How It Affects Your Every Waking Moment

by

“We are living in an age when sleep is more comfortable than ever and yet more elusive.”
The Ancient Greeks believed that one fell asleep when the brain filled with blood and awakened once it drained back out. Nineteenth-century philosophers contended that sleep happened when the brain was emptied of ambitions and stimulating thoughts. “If sleep doesn’t serve an absolutely vital function, it is the greatest mistake evolution ever made,” biologist Allan Rechtschaffen once remarked. Even today, sleep remains one of the most poorly understood human biological functions, despite some recent strides in understanding the “social jetlag” of our internal clocks and the relationship between dreaming and depression. In Dreamland: Adventures in the Strange Science of Sleep (public library), journalist David K. Randall — who stumbled upon the idea after crashing violently into a wall while sleepwalking — explores “the largest overlooked part of your life and how it affects you even if you don’t have a sleep problem.” From gender differences to how come some people snore and others don’t to why we dream, he dives deep into this mysterious third of human existence to illuminate what happens when night falls and how it impacts every aspect of our days.

Most of us will spend a full third of our lives asleep, and yet we don’t have the faintest idea of what it does for our bodies and our brains. Research labs offer surprisingly few answers. Sleep is one of the dirty little secrets of science. My neurologist wasn’t kidding when he said there was a lot that we don’t know about sleep, starting with the most obvious question of all — why we, and every other animal, need to sleep in the first place.

But before we get too anthropocentrically arrogant in our assumptions, it turns out the quantitative requirement of sleep isn’t correlated with how high up the evolutionary chain an organism is:

Lions and gerbils sleep about thirteen hours a day. Tigers and squirrels nod off for about fifteen hours. At the other end of the spectrum, elephants typically sleep three and a half hours at a time, which seems lavish compared to the hour and a half of shut-eye that the average giraffe gets each night.

Humans need roughly one hour of sleep for every two hours they are awake, and the body innately knows when this ratio becomes out of whack. Each hour of missed sleep one night will result in deeper sleep the next, until the body’s sleep debt is wiped clean.
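
That one-to-two ratio lends itself to a tiny illustration. The sketch below, in Python, tracks a running sleep debt over a few days using the rule of thumb quoted above; the daily numbers are invented, and real sleep regulation is of course far more complicated.

# Toy illustration of the 1:2 sleep-to-wake rule of thumb quoted above;
# not a clinical model.
def update_sleep_debt(debt, hours_awake, hours_slept):
    """Need ~1 hour of sleep per 2 hours awake; any unmet need carries over."""
    needed = hours_awake / 2.0
    return max(0.0, debt + needed - hours_slept)

debt = 0.0
days = [(17, 6.5), (17, 7.0), (18, 5.5), (16, 9.0)]  # (hours awake, hours slept)
for awake, slept in days:
    debt = update_sleep_debt(debt, awake, slept)
    print(f"awake {awake}h, slept {slept}h -> running sleep debt {debt:.1f}h")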

What, then, happens as we doze off, exactly? Like all science, our understanding of sleep seems to be a constant “revision in progress”:

Despite taking up so much of life, sleep is one of the youngest fields of science. Until the middle of the twentieth century, scientists thought that sleep was an unchanging condition during which time the brain was quiet. The discovery of rapid eye movements in the 1950s upended that. Researchers then realized that sleep is made up of five distinct stages that the body cycles through over roughly ninety-minute periods. The first is so light that if you wake up from it, you might not realize that you have been sleeping. The second is marked by the appearance of sleep-specific brain waves that last only a few seconds at a time. If you reach this point in the cycle, you will know you have been sleeping when you wake up. This stage marks the last drop before your brain takes a long ride away from consciousness. Stages three and four are considered deep sleep. In three, the brain sends out long, rhythmic bursts called delta waves. Stage four is known as slow-wave sleep for the speed of its accompanying brain waves. The deepest form of sleep, this is the farthest that your brain travels from conscious thought. If you are woken up while in stage four, you will be disoriented, unable to answer basic questions, and want nothing more than to go back to sleep, a condition that researchers call sleep drunkenness. The final stage is REM sleep, so named because of the rapid movements of your eyes dancing against your eyelids. In this stage of sleep, the brain is as active as it is when it is awake. This is when most dreams occur.

(Recall the role of REM sleep in regulating negative emotions.)
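
To make the cycle described above easier to picture, here is a small sketch, in Python, that maps minutes after sleep onset to an approximate stage within a 90-minute cycle. The stage boundaries are illustrative guesses drawn from the passage, not clinical values, and real cycles shift toward more REM as the night goes on.

# Rough sketch of the ~90-minute sleep cycle described above; the minute
# boundaries are illustrative, not clinical values.
SLEEP_STAGES = [
    (10, "stage 1 (very light)"),
    (30, "stage 2 (sleep-specific brain waves)"),
    (50, "stage 3 (deep, delta waves)"),
    (70, "stage 4 (slow-wave sleep)"),
    (90, "REM (rapid eye movement)"),
]

def stage_at(minutes_asleep, cycle_length=90):
    """Approximate stage for a given number of minutes after sleep onset."""
    t = minutes_asleep % cycle_length
    for upper_bound, name in SLEEP_STAGES:
        if t < upper_bound:
            return name
    return SLEEP_STAGES[-1][1]

for minutes in (5, 25, 60, 85, 130):
    print(f"{minutes:>3} min asleep -> {stage_at(minutes)}")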

Randall’s most urgent point, however, echoes what we’ve already heard from German chronobiologist Till Roenneberg, who studies internal time — in our blind lust for the “luxuries” of modern life, with all its 24-hour news cycles, artificial lighting on demand, and expectations of round-the-clock telecommunications availability, we’ve thrown ourselves into a kind of circadian schizophrenia:

We are living in an age when sleep is more comfortable than ever and yet more elusive. Even the worst dorm-room mattress in America is luxurious compared to sleeping arrangements that were common not long ago. During the Victorian era, for instance, laborers living in workhouses slept sitting on benches, with their arms dangling over a taut rope in front of them. They paid for this privilege, implying that it was better than the alternatives. Families up to the time of the Industrial Revolution engaged in the nightly ritual of checking for rats and mites burrowing in the one shared bedroom. Modernity brought about a drastic improvement in living standards, but with it came electric lights, television, and other kinds of entertainment that have thrown our sleep patterns into chaos.

Work has morphed into a twenty-four-hour fact of life, bringing its own set of standards and expectations when it comes to sleep … Sleep is ingrained in our cultural ethos as something that can be put off, dosed with coffee, or ignored. And yet maintaining a healthy sleep schedule is now thought of as one of the best forms of preventative medicine.

Reflecting on his findings, Randall marvels:

As I spent more time investigating the science of sleep, I began to understand that these strange hours of the night underpin nearly every moment of our lives.

Indeed, Dreamland goes on to explore how sleep — its mechanisms, its absence, its cultural norms — affects everyone from police officers and truck drivers to artists and entrepreneurs, permeating everything from our decision-making to our emotional intelligence.

Friday, January 31, 2014

Final Keystone XL report: No major boost in greenhouse gases

January 31, 2014

AP photo
YOUR TURN: Now it’s up to Secretary of State John Kerry to make a recommendation to President Obama on the fate of the proposed Keystone XL pipeline.

By Deena Winter | Nebraska Watchdog
Updated 4:05 p.m.

LINCOLN, Neb. — The U.S. State Department’s final environmental review of the proposed Keystone XL oil pipeline mirrors earlier conclusions that the pipeline wouldn’t significantly contribute to greenhouse gas emissions.

The report reiterated last year’s draft report conclusion that the pipeline is unlikely to significantly impact the rate of extraction of oil sands or the continued demand for heavy crude oil in the U.S.
Now that the State Department’s environmental review of TransCanada’s application for a federal permit to build the pipeline is complete, a 90-day review by various federal agencies will commence to determine whether the pipeline is in the national interest, since it crosses a national border. The final decision is expected to be made by Secretary of State John Kerry and President Obama.

Canadian pipeline company TransCanada first applied for permission to build the pipeline in late 2008, but it ran into a wall of opposition in Nebraska. Nebraska pipeline fighters have taken part in and helped organize protests from the governor’s mansion to Washington, D.C., even as most of the Republican statewide public officials have pushed for approval.
Courtesy photo
REPORT: The U.S. State Department released its final environmental report Friday on the proposed Keystone XL oil pipeline that would cross America. The report found the pipeline would not significantly affect greenhouse gas emissions.

The Keystone XL pipeline would bisect Nebraska, with nearly 200 miles of pipe buried in a dozen counties. A grassroots group called Bold Nebraska has fought both the power of a foreign company to take land from landowners and the possibility that oil spills could contaminate the massive Ogallala Aquifer.

Pipeline opponents successfully lobbied Obama to reject TransCanada’s initial application in late 2011 and forced the company to reroute the pipeline around the ecologically fragile Sandhills. That’s the route reviewed in the latest State Department reports.

The new report noted that most pipeline spills are small: of the 1,692 incidents between 2002 and 2012, 79 percent were small (up to 2,100 gallons) and just 4 percent were large spills where the oil would migrate away from the release site. It also said modeling indicates “aquifer characteristics would inhibit the spread of released oil, and impacts from a release on water quality would be limited.”

Pipeline opponents in Nebraska have questioned why TransCanada didn’t build the pipeline parallel to its existing Keystone One pipeline that crosses eastern Nebraska, away from the Sandhills and aquifer. The report noted this, but concluded it wasn’t a reasonable alternative because it wouldn’t meet Keystone’s contractual obligations to transport 100,000 barrels per day of crude oil from the Bakken oil play in North Dakota. Also, the corridor would be longer, increasing the risk of spills.

The proposed pipeline has put Obama in a difficult position: he must decide whether to live up to his promises to combat climate change or appease the labor unions that generally support the pipeline and the jobs it would bring. Obama said last year the pipeline should only be built if it doesn’t increase carbon emissions.

Russ Girling, TransCanada president and chief executive officer, told reporters Friday that while opponents will continue to make noise, “The science continues to show that this pipeline can and will be built safely.”

“This pipeline certainly is in the national interest of the United States,” he said.

Bold Nebraska Executive Director Jane Kleeb saw victories in the fact that the report acknowledged the revised route still crosses the Sandhills, which she called a “big shift” from earlier reports.
Environmental groups vowed to keep the pressure on Obama to reject the project.

“Our side continues to gain ground because landowners and environmentalists are now working together,” Kleeb said Friday.

Regardless of the president’s final verdict, a Nebraska lawsuit could still throw another obstacle in the path of the proposed 1,179-mile pipeline. Landowners who oppose the pipeline sued the state, challenging the constitutionality of a law that changed the pipeline route approval process, giving the governor and state environmental regulators the authority to approve or deny the revised route through Nebraska, rather than the Public Service Commission.

If the route review process is deemed unconstitutional, TransCanada would have to go back to square one with siting. A district judge hasn’t yet made a ruling after a one-day trial in September.
Contact Deena Winter at deena@nebraskawatchdog.org.

Researchers report on new catalyst to convert greenhouse gases into chemicals

By Karen B. Roberts
(Phys.org) — A team of researchers at the University of Delaware has developed a highly selective catalyst capable of electrochemically converting carbon dioxide—a greenhouse gas—to carbon monoxide with 92 percent efficiency. The carbon monoxide then can be used to develop useful chemicals.
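
The article doesn't say which measure the 92 percent refers to; in electrochemical CO2 reduction, such figures usually mean Faradaic efficiency, that is, the fraction of the electrical charge passed through the cell that ends up in the desired product. Below is a minimal sketch of that calculation in Python, with made-up numbers; every value is an illustrative assumption, not data from the paper.

    # Faradaic efficiency for CO2 -> CO, a two-electron reduction.
    # All numeric values are illustrative assumptions, not data from the paper.
    F = 96485.0           # Faraday constant, coulombs per mole of electrons
    n_electrons = 2       # electrons consumed per CO molecule formed
    total_charge = 10.0   # hypothetical total charge passed through the cell, in coulombs
    moles_co = 4.77e-5    # hypothetical moles of CO detected in the product stream

    charge_into_co = n_electrons * F * moles_co         # charge accounted for by CO
    faradaic_efficiency = charge_into_co / total_charge
    print(f"Faradaic efficiency: {faradaic_efficiency:.0%}")  # ~92% with these numbers
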
The researchers recently reported their findings in Nature Communications.

"Converting to useful chemicals in a selective and efficient way remains a major challenge in renewable and sustainable energy research," according to Feng Jiao, assistant professor of chemical and biomolecular engineering and the project's lead researcher.

Co-authors on the paper include Qi Lu, a postdoctoral fellow, and Jonathan Rosen, a graduate student, working with Jiao.

The researchers found that when they used a nano-porous silver electrocatalyst, it was 3,000 times more active than polycrystalline silver, a catalyst commonly used in converting carbon dioxide to useful chemicals.

Silver is considered a promising material for a carbon dioxide reduction catalyst because it offers high selectivity—approximately 81 percent—and because it costs much less than other precious metal catalysts. Additionally, because it is inorganic, silver remains more stable under harsh catalytic environments.

The exceptionally high activity, Jiao said, is likely due to the UD-developed electrocatalyst's extremely large and highly curved internal surface, which is approximately 150 times larger and 20 times intrinsically more active than polycrystalline silver.
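
As a rough consistency check (this arithmetic is ours, not the paper's), the two factors quoted above multiply out to the overall activity gain reported earlier:

    # Back-of-the-envelope factorization of the ~3,000x activity gain over polycrystalline silver.
    surface_area_factor = 150   # internal surface roughly 150 times larger (quoted above)
    intrinsic_factor = 20       # sites roughly 20 times intrinsically more active (quoted above)
    print(surface_area_factor * intrinsic_factor)   # 3000, consistent with the reported ~3,000x figure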
   
A UD engineering research team led by Feng Jiao has developed a highly selective catalyst capable of electrochemically converting carbon dioxide to carbon monoxide with 92 percent efficiency.
Credit: Evan Krape        

Jiao explained that the active sites on the curved internal surface required a much smaller than expected voltage to overcome the activation energy barrier needed to drive the reaction.
The resulting carbon monoxide, he continued, can be used as an industry feedstock for producing synthetic fuels, while reducing industrial carbon dioxide emissions by as much as 40 percent.
To validate whether their findings were unique, the researchers compared the UD-developed nano-porous silver catalyst with other potential carbon dioxide electrocatalysts including polycrystalline silver and other silver nanostructures such as nanoparticles and nanowires.

Testing under identical conditions confirmed the nano-porous silver catalyst's significant advantages over other silver catalysts in water environments.
Reducing greenhouse gas emissions from fossil fuel use is considered critical for human society. Over the last 20 years, electrocatalytic carbon dioxide reduction has attracted attention because of the ability to use electricity from renewable energy sources such as wind, solar and wave.

Ideally, Jiao said, one would like to convert carbon dioxide produced in power plants, refineries and petrochemical plants to fuels or other chemicals through renewable energy use.
A 2007 Intergovernmental Panel on Climate Change report stated that 19 percent of greenhouse gas emissions resulted from industry in 2004, according to the Environmental Protection Agency's website.

"Selective conversion of carbon dioxide to carbon monoxide is a promising route for clean energy but it is a technically difficult process to accomplish," said Jiao. "We're hopeful that the catalyst we've developed can pave the way toward future advances in this area."

More information: “A selective and efficient electrocatalyst for carbon dioxide reduction,” Qi Lu, Jonathan Rosen, Yang Zhou, Gregory S. Hutchings, Yannick C. Kimmel, Jingguang G. Chen, Feng Jiao. Nature Communications 5, Article number: 3242. DOI: 10.1038/ncomms4242. Received 10 September 2013; accepted 10 January 2014; published 30 January 2014.

Journal reference: Nature Communications

Read more at: http://phys.org/news/2014-01-catalyst-greenhouse-gases-chemicals.html

Glass that bends but doesn’t break

Natural forms inspire McGill researchers to develop a technique to make glass less brittle     
Published: 29 Jan 2014
Bio-inspired glass
Normally when you drop a drinking glass on the floor it shatters. But, in future, thanks to a technique developed in McGill’s Department of Mechanical Engineering, when the same thing happens the glass is likely to simply bend and become slightly deformed. That’s because Prof. François Barthelat and his team have successfully taken inspiration from the mechanics of natural structures like seashells in order to significantly increase the toughness of glass.

“Mollusk shells are made up of about 95 per cent chalk, which is very brittle in its pure form,” says Barthelat. “But nacre, or mother-of-pearl, which coats the inner shells, is made up of microscopic tablets that are a bit like miniature Lego building blocks, and it is known to be extremely strong and tough, which is why people have been studying its structure for the past twenty years.”

Previous attempts to recreate the structures of nacre have proved to be challenging, according to Barthelat. “Imagine trying to build a Lego wall with microscopic building blocks. It’s not the easiest thing in the world.” Instead, what he and his team chose to do was to study the internal ‘weak’ boundaries or edges to be found in natural materials like nacre and then use lasers to engrave networks of 3D micro-cracks in glass slides in order to create similar weak boundaries. The results were dramatic.

The researchers were able to increase the toughness of glass slides (the kind of glass rectangles that get put under microscopes) 200-fold compared to non-engraved slides. By engraving networks of micro-cracks into the surface of borosilicate glass in wavy configurations, similar to the interlocking edges of jigsaw-puzzle pieces, they were able to stop the cracks from propagating and becoming larger. They then filled these micro-cracks with polyurethane, although according to Barthelat, this second process is not essential, since the patterns of micro-cracks in themselves are sufficient to stop the glass from shattering.

The researchers worked with glass slides simply because they were accessible, but Barthelat believes that the process will be very easy to scale up to any size of glass sheet, since people are already engraving logos and patterns on glass panels. He and his team are excited about the work that lies ahead for them.

“What we know now is that we can toughen glass, or other materials, by using patterns of micro-cracks to guide larger cracks, and in the process absorb the energy from an impact,” says Barthelat. “We chose to work with glass because we wanted to work with the archetypal brittle material. But we plan to go on to work with ceramics and polymers in future. Observing the natural world can clearly lead to improved man-made designs.”

To read the full paper: ‘Overcoming the brittleness of glass through bio-inspiration and micro-architecture’ by F. Barthelat et al in Nature Communications: http://www.nature.com/ncomms/2014/140128/ncomms4166/full/ncomms4166.html
The research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canada Foundation for Innovation (CFI), with partial support for one of the authors from a McGill Engineering Doctoral Award. The authors acknowledge useful technical advice by the company Vitro.
To contact the researcher directly: François Barthelat | Dept. Of Mechanical Engineering | McGill University | 514-398-6318

Contact Information

Contact: Katherine Gombay
Organization: Media Relations Office
Office Phone: 514-398-2189

Reproductive rights

From Wikipedia, the free encyclopedia