
Thursday, January 16, 2014

Renewable chemical ready for biofuels scale-up.

by Margaret Broeren

The flow-through reaction setup progressively dissolves biomass producing fractions that are rich in (from left to right) lignin monomers, hemicellulose and cellulose-derived sugars. Credit: Matthew Wisniewski/Wisconsin Energy Institute
 
Using a plant-derived chemical, University of Wisconsin-Madison researchers have developed a process for creating a concentrated stream of sugars that's ripe with possibility for biofuels.
"With the sugar platform, you have possibilities," says Jeremy Luterbacher, a UW-Madison postdoctoral researcher and the paper's lead author. "You've taken fewer forks down the conversion road, which leaves you with more end destinations, such as cellulosic ethanol and drop-in biofuels."

Funded by the National Science Foundation and the U.S. Department of Energy's Great Lakes Bioenergy Research Center (GLBRC), the research team has published its findings in the Jan. 17, 2014 issue of the journal Science, explaining how they use gamma valerolactone, or GVL, to deconstruct plants and produce sugars that can be chemically or biologically upgraded into biofuels. With support from the Wisconsin Alumni Research Foundation (WARF), the team will begin scaling up the process later this year.

Because GVL is created from plant material, it's both renewable and more affordable than conversion methods requiring expensive chemicals or enzymes. The process also converts 85 to 95 percent of the starting material to sugars that can be fed to yeast for fermentation into ethanol, or chemically upgraded into furans to create drop-in biofuels.

To demonstrate the economic viability of this advance, Luterbacher needed to concentrate the sugar, remove the GVL for reuse, and show that yeast could successfully generate ethanol from the sugar stream.

"Showing that removing and recycling GVL can be done easily, with a low-energy separation step, is a little more of an achievement," says Luterbacher. "By feeding the resulting sugar solution to microorganisms, we proved we weren't producing some weird chemical byproducts that would kill the yeast, and that we were taking out enough GVL to make it nontoxic."

"What's neat is that we can use additives to make the solution separate," says Luterbacher. "It becomes like oil and vinegar." Their additive of choice: liquid carbon dioxide.

"It's green, nontoxic and can be removed by simple depressurization once you want GVL and solutions of to mix again. It's the perfect additive," Luterbacher says.

An initial economic assessment of the process has indicated the technology could produce ethanol at a cost savings of roughly 10 percent when compared with current state-of-the-art technologies.
For the past several years, James Dumesic, Steenbock Professor and Michel Boudart Professor of Chemical and Biological Engineering at UW-Madison, and his research group have studied the production of GVL from biomass, and in more recent work they explored the use of GVL as a solvent for the conversion of biomass to furan chemicals.

"The knowledge gained in these previous studies was invaluable to us in the implementation of our new approach to convert real biomass to aqueous solutions of sugars that are suitable for biological conversion," says Dumesic.

This research has contributed new knowledge to the biofuels landscape, resulted in four patent applications, and gained recognition for GVL's commercial potential from WARF's Accelerator Program. The program helps license high potential technologies more rapidly by addressing specific technical hurdles with targeted funding and expert advice from seasoned business mentors in related fields.

Under the Accelerator Program effort, Dumesic will serve as principal investigator for an 18-month project involving construction of a high-efficiency biomass reactor. The reactor will use GVL to produce concentrated streams of high-value sugars and intact lignin solids.

Carbohydrates and lignin from the reactor will be delivered to scientific collaborators, including fellow GLBRC investigators, who will optimize strategies for converting the materials into valuable chemicals and fuels.

"We're excited by the team's scientific achievements and we look forward to supporting the project's next steps through the Accelerator Program," says Leigh Cagan, WARF's chief technology commercialization officer. "If the project successfully achieves the anticipated cost reductions for production of the sugars, lignin and ethanol, we anticipate significant commercial interest in this novel process."
 
More information: "Nonenzymatic Sugar Production from Biomass Using Biomass-Derived γ-Valerolactone," by J.S. Luterbacher et al. Science, 2014.

Lord Rayleigh Explains it All

http://en.wikipedia.org/wiki/Rayleigh_scattering

John William Strutt, 3rd Baron Rayleigh, OM, PRS (/ˈreɪli/; 12 November 1842 – 30 June 1919) was an English physicist who, with William Ramsay, discovered argon, an achievement for which he earned the Nobel Prize for Physics in 1904. He also discovered the phenomenon now called Rayleigh scattering, explaining why the sky is blue, and predicted the existence of the surface waves now known as Rayleigh waves. Rayleigh's textbook, The Theory of Sound, is still referred to by acoustic engineers today.


Rayleigh scattering causes the blue hue of the daytime sky and the reddening of the sun at sunset.

Rayleigh scattering is more evident after sunset. This picture was taken about one hour after sunset at 500m altitude, looking at the horizon where the sun had set.

Rayleigh scattering (pronounced /ˈreɪli/ RAY-lee), named after the British physicist Lord Rayleigh,[1] is the elastic scattering of light or other electromagnetic radiation by particles much smaller than the wavelength of the light. Because the state of the material is unchanged after the scattering, Rayleigh scattering is also said to be a parametric process. The particles may be individual atoms or molecules. It can occur when light travels through transparent solids and liquids, but is most prominently seen in gases. Rayleigh scattering results from the electric polarizability of the particles. The oscillating electric field of a light wave acts on the charges within a particle, causing them to move at the same frequency. The particle therefore becomes a small radiating dipole whose radiation we see as scattered light.
Rayleigh scattering of sunlight in the atmosphere causes diffuse sky radiation, which is the reason for the blue color of the sky and the yellow tone of the sun itself.
Scattering by particles similar to or larger than the wavelength of light is typically treated by the Mie theory, the discrete dipole approximation and other computational techniques. Rayleigh scattering applies to particles that are small with respect to wavelengths of light, and that are optically "soft" (i.e. with a refractive index close to 1). On the other hand, Anomalous Diffraction Theory applies to optically soft but larger particles.

The intensity I of light scattered by a single small particle from a beam of unpolarized light of wavelength λ and intensity I0 is given by:
I = I_0 \frac{1 + \cos^2\theta}{2 R^2} \left( \frac{2\pi}{\lambda} \right)^4 \left( \frac{n^2 - 1}{n^2 + 2} \right)^2 \left( \frac{d}{2} \right)^6 \qquad [3]
where R is the distance to the particle, θ is the scattering angle, n is the refractive index of the particle, and d is the diameter of the particle. The Rayleigh scattering cross-section is given by [4]
 \sigma_\text{s} = \frac{ 2 \pi^5}{3} \frac{d^6}{\lambda^4} \left( \frac{ n^2-1}{ n^2+2 } \right)^2
The Rayleigh scattering coefficient for a group of scattering particles is the number of particles per unit volume N times the cross-section. As with all wave effects, for incoherent scattering the scattered powers add arithmetically, while for coherent scattering, such as if the particles are very near each other, the fields add arithmetically and the sum must be squared to obtain the total scattered power.
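
To see what these formulas imply, here is a minimal Python sketch of the single-particle intensity equation above. The wavelengths, particle diameter, and refractive index are illustrative assumptions, not values from the text; the point is the 1/λ^4 scaling that dominates everything that follows.

    import math

    def rayleigh_intensity(wavelength, theta, R, n, d, I0=1.0):
        """Scattered intensity from one small particle (the formula above)."""
        return (I0 * (1 + math.cos(theta)**2) / (2 * R**2)
                * (2 * math.pi / wavelength)**4
                * ((n**2 - 1) / (n**2 + 2))**2
                * (d / 2)**6)

    # Illustrative values (assumptions): 450 nm blue vs 700 nm red light,
    # a ~0.3 nm scatterer, n close to 1, viewed at 90 degrees from 1 m away.
    blue = rayleigh_intensity(450e-9, math.pi / 2, 1.0, 1.0003, 0.3e-9)
    red = rayleigh_intensity(700e-9, math.pi / 2, 1.0, 1.0003, 0.3e-9)
    print(blue / red)  # ~5.9, i.e. (700/450)^4: blue scatters far more strongly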

Reason for the blue color of the sky


Scattered blue light is polarized. The picture on the right is shot through a polarizing filter which removes light that is linearly polarized in a specific direction.

A portion of the beam of light coming from the sun scatters off molecules of gas and other small particles in the atmosphere. Here, Rayleigh scattering primarily occurs through sunlight's interaction with randomly located air molecules. Exactly equivalently, but from a purely macroscopic point of view, the scattering comes from the microscopic density fluctuations that result from the random distribution of molecules in the air. A region of higher or lower density has a slightly different refractive index from the surrounding medium, and therefore acts like a short-lived scattering particle. It is this scattered light that gives the surrounding sky its brightness and its color.

As previously stated, Rayleigh scattering is inversely proportional to the fourth power of wavelength, so shorter-wavelength violet and blue light scatter more than the longer wavelengths (yellow and especially red light). However, the Sun, like any star, has its own spectrum, so I0 in the scattering formula above is not constant but falls away in the violet. In addition, the oxygen in the Earth's atmosphere absorbs wavelengths at the edge of the ultraviolet region of the spectrum. The resulting color, which appears like a pale blue, is actually a mixture of all the scattered colors, mainly blue and green.

Conversely, glancing toward the sun, the colors that were not scattered away — the longer wavelengths such as red and yellow light — are directly visible, giving the sun itself a slightly yellowish hue. Viewed from space, however, the sky is black and the sun is white.
The reddening of sunlight is intensified when the sun is near the horizon, because the volume of air through which sunlight must pass is significantly greater than when the sun is high in the sky. The Rayleigh scattering effect is thus increased, removing virtually all blue light from the direct path to the observer. The remaining unscattered light is mostly of a longer wavelength, and therefore appears to be orange.

Some of the scattering can also be from sulfate particles. For years after large Plinian eruptions, the blue cast of the sky is notably brightened by the persistent sulfate load of the stratospheric gases. Some works of the artist J. M. W. Turner may owe their vivid red colours to the eruption of Mount Tambora in his lifetime.

In locations with little light pollution, the moonlit night sky is also blue, because moonlight is reflected sunlight, with a slightly lower color temperature due to the brownish color of the moon. The moonlit sky is not perceived as blue, however, because at low light levels human vision comes mainly from rod cells that do not produce any color perception (Purkinje effect).

In layman's terms, by analogy

Have you ever sent waves at a log floating in a lake?  If so, you probably (I hope) noticed a curious phenomenon.  Waves shorter than the log's width, even if fairly high, tend to bounce off the log and scatter back and to the sides.  Waves longer than the log's width, however, simply make the log bob up and down as they pass right through, almost as though the log weren't there.

In the world of light, blue waves are shorter (by about half) than red waves, and instead of logs we have molecules of air (mostly N2 and O2).  Air molecules are far smaller than the wavelength of visible light, which is on the order of half a micron (0.0000005 meters).  As noted, blue light is roughly half the wavelength of red light.  And yes, for similar reasons, blue light bounces and scatters off the molecules more than red does, with a factor of 2*2*2*2 = 16 greater intensity (the lambda-to-the-fourth factor in the equations above).  When the sun is high in a clear sky, this is what you see: the enhanced scattering of the blue part of sunlight (the solar disk is yellowish-white due to the loss of its blue light).
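
That factor of 16 is just the lambda-to-the-fourth term worked out for the idealized case used here, where red light has exactly twice the wavelength of blue:

\frac{I_\text{blue}}{I_\text{red}} = \left( \frac{\lambda_\text{red}}{\lambda_\text{blue}} \right)^4 = 2^4 = 16

(With realistic wavelengths, about 450 nm for blue and 700 nm for red, the ratio is closer to (700/450)^4 ≈ 5.9, but the principle is the same.)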

We've all noticed the sun grow deeper yellow, then orange, then red as it sets or rises.  This is explained by the fact that, near the horizon, sunlight must pass through a considerably longer and denser column of air.  The more air it passes through, the more of its blue light is scattered completely out to space.  Only the longer wavelengths still get through (albeit attenuated), and you can see in the top picture that the upper part of the sun is deep yellow, followed by orange beneath, then red, and then the red fades to darkness.  Again, it is all a scattering phenomenon, just to different degrees; red, having the longest wavelength, lasts the longest.  The mistake we must not make, however, is to assume the light is somehow altered.  If you have ever watched a beam of white light pass through a triangular prism, you know the light is not changed, only separated into its constituent colors (via a different mechanism).

Jon Stewart On Iran Sanctions: 'Congress Is The Justin Bieber Of Our Government'

The Huffington Post  |  Updated: 01/16/2014 9:23 am
After seeing much of the news dominated by Justin Bieber's house raid, Jon Stewart was pleased that our government is responsible enough to help broker a deal that will bring peace between Iran and the U.S.
Except that it isn't.

"Congress is the Justin Bieber of our government, throwing away for no reason whatsoever a tremendous opportunity because of immaturity and a lack of self control," Stewart declared, pointing to many senate Republicans' hesitance to support the deal.

He was hardly surprised that Republicans wanted to put new sanctions in place, but the fact that 16 Democrats wanted to do it as well surprised him.

"I get Republicans," Stewart joked. "They would line up to oppose Obama on the Orgasms Cure Cancer Act."

http://www.huffingtonpost.com/2014/01/16/jon-stewart-iran-sanctions_n_4609526.html?ncid=txtlnkushpmg00000037&ir=Politics
Check out the full clip above to see Stewart's dismay with Congress' inaction, and the difficulties caused by various interests in the Middle East.
 

Climate change: The case of the missing heat

Sixteen years into the mysterious ‘global-warming hiatus’, scientists are piecing together (DJS-yet) an(other) explanation.
Tim Graham/Robert Harding Picture Library
 
The Pacific Ocean may hold the key to understanding why global warming has stalled.
The biggest mystery in climate science today may have begun, unbeknownst to anybody at the time, with a subtle weakening of the tropical trade winds blowing across the Pacific Ocean in late 1997. (Strumfels:  or was it '98, '05, '10?)  These winds normally push sun-baked water towards Indonesia. When they slackened, the warm water sloshed back towards South America, resulting in a spectacular example of a phenomenon known as El Niño. Average global temperatures hit a record high in 1998 — and then the warming stalled.  (Strumfels:  according to some climate skeptics, but not according to the non-skeptical.  And if true, why is Australia having one of its hottest summers on record?)

(The first theory of warming abatement was the dispersal of CFCs in the Antarctic.  Then came the multi-part denial of abatement/possible cooling -- the time period was too short, 2012 was one of the hottest years on record, etc. -- and the '97 "beginning of cooling" was treated as ludicrous.  Now we have Australia's/Indonesia's heat wave while the warmer Pacific trade winds have concentrated near South America (and how are they doing?).  What will it be next?  Furthermore, how accurate are global climate change models if new data and factors, both by skeptics and supporters, are being discovered all the time?)

For several years, scientists wrote off the stall as noise in the climate system: the natural variations in the atmosphere, oceans and biosphere that drive warm or cool spells around the globe. But the pause has persisted, sparking a minor crisis of confidence in the field. Although there have been jumps and dips, average atmospheric temperatures have risen little since 1998, in seeming defiance of projections of climate models and the ever-increasing emissions of greenhouse gases. Climate sceptics have seized on the temperature trends as evidence that global warming has ground to a halt.
Climate scientists, meanwhile, know that heat must still be building up somewhere in the climate system, but they have struggled to explain where it is going, if not into the atmosphere. Some have begun to wonder whether there is something amiss in their models.

Now, as the global-warming hiatus enters its sixteenth year, scientists are at last making headway in the case of the missing heat. Some have pointed to the Sun, volcanoes and even pollution from China as potential culprits, but recent studies suggest that the oceans are key to explaining the anomaly. The latest suspect is the El Niño of 1997–98, which pumped prodigious quantities of heat out of the oceans and into the atmosphere — perhaps enough to tip the equatorial Pacific into a prolonged cold state that has suppressed global temperatures ever since.

“The 1997 to ’98 El Niño event was a trigger for the changes in the Pacific, and I think that’s very probably the beginning of the hiatus,” says Kevin Trenberth, a climate scientist at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. According to this theory, the tropical Pacific should snap out of its prolonged cold spell in the coming years. “Eventually,” Trenberth says, “it will switch back in the other direction.”

Stark contrast

On a chart of global atmospheric temperatures, the hiatus stands in stark contrast to the rapid warming of the two decades that preceded it. Simulations conducted in advance of the 2013–14 assessment from the Intergovernmental Panel on Climate Change (IPCC) suggest that the warming should have continued at an average rate of 0.21 °C per decade from 1998 to 2012. Instead, the observed warming during that period was just 0.04 °C per decade, as measured by the UK Met Office in Exeter and the Climatic Research Unit at the University of East Anglia in Norwich, UK.
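
For readers who want to see how such decadal trend numbers are computed: they are least-squares slopes fitted to annual temperature anomalies. A minimal Python sketch, using made-up anomaly values purely for illustration (not Met Office data):

    import numpy as np

    # Hypothetical annual global temperature anomalies (deg C), 1998-2012.
    years = np.arange(1998, 2013)
    anomalies = np.array([0.54, 0.38, 0.42, 0.49, 0.52, 0.51, 0.48, 0.54,
                          0.50, 0.49, 0.44, 0.51, 0.56, 0.47, 0.52])

    # Least-squares slope in deg C per year; times 10 gives deg C per decade.
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    print(round(slope_per_year * 10, 3), "deg C per decade")  # a small, hiatus-like number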
The simplest explanation for both the hiatus and the discrepancy in the models is natural variability. Much like the swings between warm and cold in day-to-day weather, chaotic climate fluctuations can knock global temperatures up or down from year to year and decade to decade. Records of past climate show some long-lasting global heatwaves and cold snaps, and climate models suggest that either of these can occur as the world warms under the influence of greenhouse gases.
But none of the climate simulations carried out for the IPCC produced this particular hiatus at this particular time. That has led sceptics — and some scientists — to the controversial conclusion that the models might be overestimating the effect of greenhouse gases, and that future warming might not be as strong as is feared. Others say that this conclusion goes against the long-term temperature trends, as well as palaeoclimate data that are used to extend the temperature record far into the past. And many researchers caution against evaluating models on the basis of a relatively short-term blip in the climate. “If you are interested in global climate change, your main focus ought to be on timescales of 50 to 100 years,” says Susan Solomon, a climate scientist at the Massachusetts Institute of Technology in Cambridge.

But even those scientists who remain confident in the underlying models acknowledge that there is increasing pressure to work out just what is happening today. “A few years ago you saw the hiatus, but it could be dismissed because it was well within the noise,” says Gabriel Vecchi, a climate scientist at the US National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. “Now it’s something to explain.”

Researchers have followed various leads in recent years, focusing mainly on a trio of factors: the Sun [1], atmospheric aerosol particles [2] and the oceans [3]. The output of energy from the Sun tends to wax and wane on an 11-year cycle, but the Sun entered a prolonged lull around the turn of the millennium.
The natural 11-year cycle is currently approaching its peak, but thus far it has been the weakest solar maximum in a century. This could help to explain both the hiatus and the discrepancy in the model simulations, which include a higher solar output than Earth has experienced since 2000.

An unexpected increase in the number of stratospheric aerosol particles could be another factor keeping Earth cooler than predicted. These particles reflect sunlight back into space, and scientists suspect that small volcanoes — and perhaps even industrialization in China — could have pumped extra aerosols into the stratosphere during the past 16 years, depressing global temperatures.

Some have argued that these two factors could be primary drivers of the hiatus, but studies published in the past few years suggest that their effects are likely to be relatively small [4, 5]. Trenberth, for example, analysed their impacts on the basis of satellite measurements of energy entering and exiting the planet, and estimated that aerosols and solar activity account for just 20% of the hiatus. That leaves the bulk of the hiatus to the oceans, which serve as giant sponges for heat. And here, the spotlight falls on the equatorial Pacific.

Just before the hiatus took hold, that region had turned unusually warm during the El Niño of 1997–98, which fuelled extreme weather across the planet, from floods in Chile and California to droughts and wildfires in Mexico and Indonesia. But it ended just as quickly as it had begun, and by late 1998 cold waters — a mark of El Niño’s sister effect, La Niña — had returned to the eastern equatorial Pacific with a vengeance. More importantly, the entire eastern Pacific flipped into a cool state that has continued more or less to this day.

This variation in ocean temperature, known as the Pacific Decadal Oscillation (PDO), may be a crucial piece of the hiatus puzzle. The cycle reverses every 15–30 years, and in its positive phase, the oscillation favours El Niño, which tends to warm the atmosphere (see ‘The fickle ocean’). After a couple of decades of releasing heat from the eastern and central Pacific, the region cools and enters the negative phase of the PDO. This state tends towards La Niña, which brings cool waters up from the depths along the Equator and tends to cool the planet. Researchers identified the PDO pattern in 1997, but have only recently begun to understand how it fits in with broader ocean-circulation patterns and how it may help to explain the hiatus.

One important finding came in 2011, when a team of researchers at NCAR led by Gerald Meehl reported that inserting a PDO pattern into global climate models causes decade-scale breaks in global warming [3]. Ocean-temperature data from the recent hiatus reveal why: in a subsequent study, the NCAR researchers showed that more heat moved into the deep ocean after 1998, which helped to prevent the atmosphere from warming [6]. In a third paper, the group used computer models to document the flip side of the process: when the PDO switches to its positive phase, it heats up the surface ocean and atmosphere, helping to drive decades of rapid warming [7].

A key breakthrough came last year from Shang-Ping Xie and Yu Kosaka at the Scripps Institution of Oceanography in La Jolla, California. The duo took a different tack, by programming a model with actual sea surface temperatures from recent decades in the eastern equatorial Pacific, and then seeing what happened to the rest of the globe [8]. Their model not only recreated the hiatus in global temperatures, but also reproduced some of the seasonal and regional climate trends that have marked the hiatus, including warming in many areas and cooler northern winters.

“It was actually a revelation for me when I saw that paper,” says John Fyfe, a climate modeller at the Canadian Centre for Climate Modelling and Analysis in Victoria. But it did not, he adds, explain everything. “What it skirted was the question of what is driving the tropical cooling.”
That was investigated by Trenberth and John Fasullo, also at NCAR, who brought in winds and ocean data to explain how the pattern emerges [4]. Their study documents how tropical trade winds associated with La Niña conditions help to drive warm water westward and, ultimately, deep into the ocean, while promoting the upwelling of cool waters along the eastern equatorial region. In extreme cases, such as the La Niña of 1998, this may be able to push the ocean into a cool phase of the PDO.
An analysis of historical data buttressed these conclusions, showing that the cool phase of the PDO coincided with a few decades of cooler temperatures after the Second World War (see ‘The Pacific’s global reach’), and that the warm phase lined up with the sharp spike seen in global temperatures between 1976 and 1998 (ref. 4).

“I believe the evidence is pretty clear,” says Mark Cane, a climatologist at Columbia University in New York. “It’s not about aerosols or stratospheric water vapour; it’s about having had a decade of cooler temperatures in the eastern equatorial Pacific.”

Heated debate

Cane was the first to predict the current cooling in the Pacific, although the implications weren’t clear at the time. In 2004, he and his colleagues found that a simple regional climate model predicted a warm shift in the Pacific that began around 1976, when global temperatures began to rise sharply [9]. Almost as an afterthought, they concluded their paper with a simple forecast: “For what it is worth the model predicts that the 1998 El Niño ended the post-1976 tropical Pacific warm period.”
It is an eerily accurate result, but the work remains hotly contested, in part because it is based on a partial climate model that focuses on the equatorial Pacific alone. Cane further maintains that the trend over the past century has been towards warmer temperatures in the western Pacific relative to those in the east. That opens the door, he says, to the possibility that warming from greenhouse gases is driving La Niña-like conditions and could continue to do so in the future, helping to suppress global warming. “If all of that is true, it’s a negative feedback, and if we don’t capture it in our models they will overstate the warming,” he says.

There are two potential holes in his assessment. First, the historical ocean-temperature data are notoriously imprecise, leading many researchers to dispute Cane’s assertion that the equatorial Pacific shifted towards a more La Niña-like state during the past century [10]. Second, many researchers have found the opposite pattern in simulations with full climate models, which factor in the suite of atmospheric and oceanic interactions beyond the equatorial Pacific. These tend to reveal a trend towards more El Niño-like conditions as a result of global warming. The difference seems to lie, in part, in how warming influences evaporation in areas of the Pacific, according to Trenberth. He says the models suggest that global warming has a greater impact on temperatures in the relatively cool east, because the increase in evaporation adds water vapour to the atmosphere there and enhances atmospheric warming; this effect is weaker in the warmer western Pacific, where the air is already saturated with moisture.

Scientists may get to test their theories soon enough. At present, strong tropical trade winds are pushing ever more warm water westward towards Indonesia, fuelling storms such as November’s Typhoon Haiyan, and nudging up sea levels in the western Pacific; they are now roughly 20 centimetres higher than those in the eastern Pacific. Sooner or later, the trend will inevitably reverse. “You can’t keep piling up warm water in the western Pacific,” Trenberth says. “At some point, the water will get so high that it just sloshes back.” And when that happens, if scientists are on the right track, the missing heat will reappear and temperatures will spike once again.
Journal reference: Nature 505, 276–278 (2014). doi:10.1038/505276a

Texas Public Schools Are Teaching Creationism

An investigation into charter schools’ dishonest and unconstitutional science, history, and “values” lessons.  By Zack Kopplin in Slate Magazine, Jan. 16, 2014, 4:30 AM

A Texas charter school group has a secular veneer and is funded by public money, but it has been connected from its inception to the creationist movement.
Illustration by Lisa Larson-Walker
When public-school students enrolled in Texas’ largest charter program open their biology workbooks, they will read that the fossil record is “sketchy.” That evolution is “dogma” and an “unproved theory” with no experimental basis. They will be told that leading scientists dispute the mechanisms of evolution and the age of the Earth. These are all lies.
The more than 17,000 students in the Responsive Education Solutions charter system will learn in their history classes that some residents of the Philippines were “pagans in various levels of civilization.” They’ll read in a history textbook that feminism forced women to turn to the government as a “surrogate husband.”
 
Responsive Ed has a secular veneer and is funded by public money, but it has been connected from its inception to the creationist movement and to far-right fundamentalists who seek to undermine the separation of church and state.
Infiltrating and subverting the charter-school movement has allowed Responsive Ed to carry out its religious agenda—and it is succeeding. Operating more than 65 campuses in Texas, Arkansas, and Indiana, Responsive Ed receives more than $82 million in taxpayer money annually, and it is expanding, with 20 more Texas campuses opening in 2014.

Charter schools may be run independently, but they are still public schools, and through an open records request, I was able to obtain a set of Responsive Ed’s biology “Knowledge Units,” workbooks that Responsive Ed students must complete to pass biology. These workbooks both overtly and underhandedly discredit evidence-based science and allow creationism into public-school classrooms.

A favorite creationist claim is that there is “uncertainty” in the fossil record, and Responsive Ed does not disappoint. The workbook cites the “lack of a single source for all the rock layers” as an argument against evolution.

I asked Ken Miller, a co-author of the Miller-Levine Biology textbook published by Pearson and one of the most widely used science textbooks on the market today, to respond to claims about the fossil record and other inaccuracies in the Responsive Ed curriculum. (It’s worth noting that creationists on the Texas State Board of Education recently tried, and failed, to block the approval of Miller’s textbook because it teaches evolution.)
Charles Darwin, circa 1870.
Courtesy of the George Grantham Bain Collection/Library of Congress
“Of course there is no ‘single source’ for all rock layers,” Miller told me over email. “However, the pioneers of the geological sciences observed that the sequence of distinctive rock layers in one place (southern England, for example) could be correlated with identical layers in other places, and eventually merged into a single system of stratigraphy. All of this was established well before Darwin's work on evolution.”

The workbook also claims, “Some scientists even question the validity of the conclusions concerning the age of the Earth.” As Miller pointed out, “The statement that ‘some scientists question,’ is a typical way that students can be misled into thinking that there is serious scientific debate about the age of the Earth or the nature of the geological record. The evidence that the Earth was formed between 4 and 5 billion years ago is overwhelming.”

Another Responsive Ed section claims that evolution cannot be tested, something biologists have been doing for decades. It misinforms students by claiming, “How can scientists do experiments on something that takes millions of years to accomplish? It’s impossible.”

The curriculum tells students that a “lack of transitional fossils” is a “problem for evolutionists who hold a view of uninterrupted evolution over long periods of time.”

“The assertion that there are no ‘transitional fossils’ is false,” Miller responded. “We have excellent examples of transitional forms documenting the evolution of amphibians, mammals, and birds, to name some major groups. We also have well-studied transitional forms documenting the evolution of whales, elephants, horses, and humans.”
A replica of the Peking man skull presented at the Paleozoological Museum of China in February 2009.
Photo by Yan Li/Creative Commons
Evolution is not a scientific controversy, and there are no competing scientific theories. All of the evidence supports evolution, and the overwhelming majority of scientists accept the evidence for it. 

Another tactic creationists often use is to associate evolution with eugenics. One Responsive Ed quiz even asks students, “With regards to social Darwinism, do you think humans who are not capable should be left to die out, or should they be helped?”

“They imply that the control of human reproduction and the abandonment of people who might be ‘left to die’ are elements of evolutionary theory,” Miller said. “This is false, and the authors of these questions surely know that.”

Outright creationism appears in Responsive Ed’s section on the origins of life. It’s not subtle. The opening line of the workbook section, just as the opening line of the Bible, declares, “In the beginning, God created the Heavens and the Earth.”

Responsive Ed’s butchering of evolution isn’t the only part of its science curriculum that deserves an F; it also misinforms students about vaccines and mauls the scientific method.
A girl reacts to receiving the H1N1 flu vaccination in 2009 in San Francisco.
Photo by Justin Sullivan/Getty Images
The only study linking vaccines to autism was exposed as a fraud and has been retracted, and the relationship has been studied exhaustively and found to be nonexistent. But a Responsive Ed workbook teaches, “We do not know for sure whether vaccines increase a child’s chance of getting autism, but we can conclude that more research needs to be done.”

On the scientific method, Responsive Ed confuses scientific theories and laws. It argues that theories are weaker than laws and that there is a natural progression from theories into laws, all of which is incorrect.

The Responsive Ed curriculum undermines Texas schoolchildren’s future in any possible career in science.

Dan Quinn, the communications director for the Texas Freedom Network, a watchdog organization that monitors the religious right, said, “These materials should raise a big red flag for any parent or school administrator. It's bad enough that they promote the same discredited anti-evolution arguments that scientists debunked a long time ago. But the materials also veer into teaching religious beliefs that the courts have repeatedly ruled have no place in a public school science classroom.”
When it’s not directly quoting the Bible, Responsive Ed’s curriculum showcases the current creationist strategy to compromise science education, which the National Center for Science Education terms “stealth creationism.”

In 1987, the Supreme Court ruled in Edwards v. Aguillard that teaching creationism is unconstitutional. In the 2005 Kitzmiller v. Dover case, Judge John Jones III ruled in federal district court that intelligent design is still creationism and equally unconstitutional.

To get around court rulings, Responsive Ed and other creationists resort to rhetoric about teaching “all sides” of “competing theories” and claiming that this approach promotes “critical thinking.”

In response to a question about whether Responsive Ed teaches creationism, its vice president of academic affairs, Rosalinda Gonzalez, told me that the curriculum “teaches evolution, noting, but not exploring, the existence of competing theories.”

Bringing creationism into a classroom by undermining evolution and “noting … competing theories” is still unconstitutional. What’s more, contrary to Gonzalez’s statement, teaching about supernatural creation in the section on the origins of life is doing far more than noting competing theories.

In a previous Slate column on the Texas textbook wars, I explained that Texas’ current science standards were designed to compromise the teaching of evolution. The standards require teachers to “analyze, evaluate, and critique” evolution and teach “all sides” of evolution to encourage “critical thinking.” These requirements are a back-door way to enable teachers to attack evolution and inject creationism into the classroom. If teachers are questioned on their materials, they can shift the responsibility for what they’re teaching onto the state.

I asked Gonzalez if these science standards played a role in Responsive Ed’s curriculum on evolution, and her answer was yes.

Last month, science won the day in the battle over textbooks, and Texas adopted texts that teach evolution. But schools don’t necessarily have to adhere to this list of textbooks. They can choose, as Responsive Ed does, to use alternative textbooks, which may teach creationism.
Policymakers must understand that on a fundamental level teaching creationism is still unconstitutional. The Texas Legislature could take action to regulate these charter schools. The state Senate Education Committee is currently investigating another charter program due to its ties to the Pelican Educational Foundation, which has been under FBI investigation for alleged financial improprieties and alleged sexual misconduct. Sen. Dan Patrick, chair of the Texas Senate Education Committee and a Republican candidate for lieutenant governor, told the Austin American-Statesman that “legislative scrutiny is necessary to ensure quality in Texas charter schools.” 

It’s high time for Patrick to give some legislative scrutiny to Responsive Ed. But in reality, he is a big fan of the program, which he “lauded in particular” at the Responsive Ed Charter Conference. It’s no wonder; he’s also a creationist. In a recent debate, Patrick said that he would help pass a law to allow creationism to be taught in public schools because, "We need to stand for what this nation was founded upon, which is the word of God.”
Patrick, who has not responded to requests for comment on Responsive Ed, is not the only leading Republican in Texas ready to toss out evolution or the separation of church and state. In fact, every Republican in the race for lieutenant governor in Texas, the incumbent included, is putting a religious agenda ahead of public education. David Dewhurst, the current lieutenant governor, said he “happens ‘to believe in creationism.’”

Greg Abbott, the current attorney general and front-runner in the Texas governor’s race, seems to be of a similar mindset. One piece of his campaign literature shows a gun and a Bible and includes the phrase, “Two things that every American should know how to use … Neither of which are taught in schools.” Abbott’s campaign hasn’t responded to questions about Responsive Ed or creationism in schools.
* * *
Science isn’t the only target of the religious right. The movement also undermines the study of history. I received a set of Responsive Ed U.S. history “Knowledge Units” through my public records request and discovered problems there, too.

In the section on the causes of World War I, the study materials suggest that “anti-Christian bias” coming out of the Enlightenment helped create the foundations for the war. The workbook states,
“[T]he abandoning of religious standards of conduct and the breakdown in respect for governmental authority would lead to one of two options: either anarchy or dictatorship would prevail in the absence of a monarch.” Responsive Ed also asserts that a person’s values are based solely on his or her religious beliefs.

A section on World War II suggests that Japan’s military aggression was led by the samurai. They write, “Following World War I, Japan attempted to solve its economic and social problems by military means. The Samurai, a group promoting a military approach to create a vast Japanese empire in Asia, wanted to expand Japan’s influence along the Chinese mainland including many Pacific Islands.”

I asked one of my former professors about this. Rich Smith, an East Asia scholar at Rice University, said,  “There were no samurai in Japan after WWI; the samurai class was effectively abolished in 1876, after the Meiji Restoration in 1868.”

Responsive Ed continues to demonstrate its religious and cultural biases in a section on the Philippines, describing the population as made up of “Catholics, Moslems (Muslims), and pagans in various stages of civilization.”

When discussing stem cells, it claims President George W. Bush banned stem-cell research because it was done “primarily with the cells from aborted babies.” The California Institute for Regenerative Medicine debunks this on its website: “A common misconception is that the cells can come from aborted fetuses, which is in fact not possible.”
A colored scanning electron micrograph of human embryonic stem cells.
Courtesy of Miodrag Stojkovic/Science Photo Library
About LGBTQ rights, Responsive Ed says, “Laws against the homosexual lifestyle had been repealed in many states, but some states continued to ban the behavior.” The homosexual lifestyle?

About President Franklin Roosevelt, it teaches, “The New Deal had not helped the economy. However, it ushered in a new era of dependency on the Federal government.”
Perhaps the workbook’s best line comes when it explains that President Jimmy Carter pardoned Vietnam War draft dodgers out of “a misguided sense of compassion.”

One of Responsive Ed’s schools, Founders Classical Academy in Lewisville, Texas, where Responsive Ed is based, uses a curriculum even worse than the Responsive Ed Knowledge Units. The school teaches American history from A Patriot’s History of the United States. The book is “required reading,” according to Glenn Beck, and it opens with an interview between Rush Limbaugh and the author. It is a book that, as Dave Weigel says, “will make you stupider.”

This book teaches the superiority of the West, which in the 1400s and 1500s was apparently “quantum leaps” ahead of “native peoples,” including Ming Dynasty China, one of the most prosperous Chinese dynasties. It explains that the West was superior to “native populations” in battles because “Aztec chiefs and Moor sultans alike were completely vulnerable to massed firepower, yet without the legal framework of republicanism and civic virtue like Europe’s to replace its leadership cadre, a native army could be decapitated at its head by one volley.”

Instead of being taught that 16th-century Spain had a monarchy, students at Founders Classical Academy are incorrectly learning that it had a form of republican government that was superior to anything that “native peoples” had created.

On the feminist movement, Founders Classical Academy students are taught that feminism “created an entirely new class of females who lacked male financial support and who had to turn to the state as a surrogate husband.”

A Patriot’s History of the United States also addresses the “pinnacle” of the “western way of war” as demonstrated by the Iraq War and questions the legitimacy of Secretary of State John Kerry’s “suspect at best” Purple Hearts and Bronze Star.
* * *
Some of Responsive Ed’s lessons appear harmless at first, but their origin is troubling. Students also learn about “discernment,” which is defined as “understanding the deeper reasons why things happen.” In other sections, students learn other moral lessons such as “values” and “deference.”
These lessons were lifted directly from a company called Character First Education, which was founded by an Oklahoma businessman named Tom Hill. He is a follower of Bill Gothard, a minister who runs the Institute in Basic Life Principles, a Christian organization that teaches its members to incorporate biblical principles into daily life. IBLP is considered a cult by some of its former followers. Gothard developed character qualities associated with a list of “49 General Commands of Christ” that Hill adopted for his character curriculum. Hill then removed Gothard’s references to God and Bible verses and started marketing the curriculum to public schools and other public institutions.

The values taught by Responsive Ed can often be found word for word on Gothard’s website. The Responsive Ed unit on genetics includes “Thoroughness: Knowing what factors will diminish the effectiveness of my work or words if neglected.” The only difference is that Gothard’s website also adds “Proverbs 18:15” after the quote.

Many of Gothard’s teachings revolve around obedience to men, especially that of the wife and the children. Gothard has upset even other conservative Christians. In an interview for an article published by Religion Dispatches, Don Veinot, a conservative Christian and founder of the Midwest Christian Outreach, accused Gothard of “creating a culture of fear.” Gothard has been accused of emotional and sexual abuse by some of his former followers, “happening as far back as the mid- to late-1970’s and as recently as this year.”

Responsive Ed and Character First may have removed the references to God and Bible verses from the curriculum that is being used in public schools, but it is clear that the line between church and state is still being blurred. And nothing that Gothard has created should be allowed near children.

Responsive Ed has plenty of connections to other fundamentalist right-wing organizations as well. Its website’s “Helpful Information” section directs parents to Focus on the Family under the heading of “Family Support.” Under “Values” it steers students to the Traditional Values Coalition, whose website includes a header that says, “Say NO to Obama. Stop Sharia in America.”
* * *
In late October, I visited iSchool High, a Responsive Ed public charter in Houston, and asked the campus director, Michael Laird, about reports that the school was teaching creationism.

A few days before my visit, writing for Salon, Jonny Scaramanga, an activist who reports on Christian education, had exposed a section of iSchool’s curriculum that blamed Hitler’s atrocities on the theory of evolution. Scaramanga had been sent the curriculum by Joshua Bass, the parent of a former iSchool student. Bass rightly viewed this curriculum as an attempt to sneak religion into the classroom and teach creationism.

“Oh, you’re media,” Laird said, sounding strained as he scribbled a phone number on a pink sticky note for me. “You’ll have to talk to the main office.”

I was quickly shuffled out, but while I was not allowed to see any curriculum or talk to any teachers, I did get to look into a classroom from the outside and verified that the setup looked exactly like a picture of an Accelerated Christian Education classroom I had seen on Scaramanga’s website.
The ACE plant near the Responsive Ed headquarters near Lewisville, Texas.
Courtesy of Zack Kopplin
ACE is a popular Christian home-school curriculum that’s also used in many private schools and publicly funded voucher schools. It’s the most infamous Christian home-school curriculum and for years taught that the Loch Ness monster was real in its attempts to disprove evolution.
Bass discovered that Responsive Ed was founded by Donald Howard, who had also founded ACE.
But it wasn’t immediately clear exactly how interconnected these two organizations are.
 
ACE and Responsive Ed are both headquartered in Lewisville, just 4 miles apart, and staff members appear to rotate between the two organizations.

When I asked Responsive Ed’s Gonzalez about her charter network’s history with Howard and ACE, she said that none of the ACE founders, including Howard, had been associated with Responsive Ed for the past seven or eight years. But I found that five members of Responsive Ed’s current board and leadership group used to work for ACE (also known as School of Tomorrow). Responsive Ed's current CEO, Charles Cook, spent several years in charge of marketing at ACE before he joined Responsive Ed, and he designed the original curriculum that Responsive Ed used.

Raymond Moore, one of Responsive Ed’s earliest principals (at that time Responsive Ed was known as Eagle Charter Schools), explained that while Responsive Ed “took the Christian vernacular out” of ACE curriculum, they still “put in character traits that reflect our values.” He also noted that “almost everyone in the management has been in the ministry.”

Howard expressed this same sentiment about his charter schools in an interview with the Wall Street Journal in 1998, saying, “Take the Ten Commandments—you can rework those as ‘success principles’ by rewording them. We will call it truth, we will call it principles, we will call it values. We will not call it religion.” (Hat tip to the Texas Freedom Network and Scaramanga for locating this quote.)
One figure stands out when it comes to revealing the political and religious agenda behind the Responsive Ed charter schools. ACE’s former vice president, Ronald Johnson, founded a curriculum company, Paradigm Accelerated Curriculum, which also ran four public charter schools in Texas.
Paradigm’s curriculum teaches abstinence in English class. ChristianBook.com describes the science curriculum as teaching “evolution from a young-earth creationist perspective.” Paradigm’s website also says that the curriculum is “carefully designed to equip high school students to defend their faith” and is being used in public schools in 11 states.

Paradigm and Johnson are closely connected to Responsive Ed. In 2010, Responsive Ed absorbed Paradigm, taking over its schools and replacing its board. Paradigm noted in a press release that this allowed Responsive Ed to “incorporate the PACS system and curriculum across Texas and in other states.” The release described this as “a ‘win-win’ situation for both organizations” because Responsive Ed schools already use a “learning system based on a manual designed and written by Dr. Johnson while he was Vice President of [Accelerated Christian Education].” Before 2010, Responsive Ed and Paradigm operated on the same model, and now Paradigm and Responsive Ed are the same organization.

The release also added that Johnson would continue his role in marketing Paradigm curriculum (now for Responsive Ed) and would train Responsive Ed’s teachers and help design the curriculum used in their schools.

While Responsive Ed attempts to preserve a facade of secularism, on the Paradigm website, Johnson is far more explicit about his goal of subverting charter programs.

Johnson believes the public education system strips students “of access to the foundational Judeo-Christian moral and economic virtues.” Other problems with public education include “sex education classes and distribution of free condoms” and “tolerance of homosexual life-styles and Islam.”

5,900 natural gas leaks discovered under Washington, D.C.


This is a map of the District of Columbia showing where researchers found natural gas leaks under city streets, with colors indicating the concentration in parts per million of methane at each location. Credit: Duke University

A total of 5,893 leaks from aging natural gas pipelines have been found under the streets of Washington, D.C., by a research team from Duke University and Boston University.
 A dozen of the leaks could have posed explosion risks, the researchers said. Some manholes had concentrations as high as 500,000 parts per million of natural gas – about 10 times greater than the threshold at which explosions can occur.
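
For scale (an editorial note, assuming methane's standard lower explosive limit of roughly 5 percent by volume): 500,000 ppm is 50 percent methane, and

    500,000 ppm / 50,000 ppm (the ~5% explosive threshold) = 10

which is where the "10 times greater than the threshold" figure comes from.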

Four months after phoning in the leaks to city authorities, the research team returned and found that nine were still emitting dangerous levels of methane. "Finding the leaks a second time, four months after we first reported them, was really surprising," said Robert B. Jackson, a professor of environmental sciences at Duke who led the study.

The researchers published their findings this week in the peer-reviewed journal Environmental Science & Technology.

"Repairing these leaks will improve air quality, increase consumer health and safety, and save money," Jackson said. "Pipeline safety has been improving over the last two decades. Now is the time to make it even better."

Nationally, natural gas failures cause an average of 17 fatalities, 68 injuries, and $133 million in property damage annually, according to the U.S. Pipeline and Hazardous Materials Safety Administration.

In addition to the explosion hazard, natural gas leaks also pose another threat: Methane, the primary ingredient of natural gas, is a powerful greenhouse gas that also can catalyze ozone formation. Pipeline leaks are the largest human-caused source of methane in the United States and contribute to $3 billion of lost and unaccounted for natural gas each year.
This is a satellite image of the District of Columbia with bar charts showing where natural gas leaks were located under city streets and in what concentration methane was identified. Higher bars indicate higher concentrations in parts per million. Credit: Duke University

Jackson's team collaborated with researchers from Boston University and Gas Safety, Inc., on the new study. The team mapped gas leaks under all 1,500 road miles within Washington using a high-precision Picarro G2301 Cavity Ring-Down Spectrometer installed in a GPS-equipped car. Laboratory analyses then confirmed that the isotopic chemical signatures of the methane and ethane found in the survey closely matched that of pipeline gas.

The average methane concentration observed in the leaks was about 2.5 times higher than in background air samples collected in the city. Methane levels in some leaks were as high as 89 parts per million, about 45 times higher than normal background levels.

The team also measured how much methane was coming from four individual street-level leaks. "Methane emissions from these four leaks ranged from 9,200 to 38,200 liters per day for each leak—that's comparable to the amount of natural gas used by between 2 and 7 homes," said Duke Ph.D. student Adrian Down.
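
A quick sanity check of that homes comparison, in Python. The household-consumption figure is my assumption for illustration (roughly 5,400 liters of natural gas per day for a typical US home, about 190 cubic feet); it is not a number from the study:

    # Range of measured leak rates at four street-level leaks (liters/day).
    leak_rates = [9_200, 38_200]

    # Assumed typical US household natural gas use (illustrative): ~5,400 L/day.
    HOME_USE_L_PER_DAY = 5_400

    for rate in leak_rates:
        print(f"{rate:,} L/day ~ {rate / HOME_USE_L_PER_DAY:.1f} homes")
    # -> about 1.7 and 7.1 homes, consistent with "between 2 and 7 homes"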

Last year, the team mapped more than 3,300 leaks beneath 785 road miles in the city of Boston. "The average density of leaks we mapped in the two cities is comparable, but the average methane concentrations are higher in Washington," said Nathan G. Phillips, a professor at Boston University's Department of Earth and Environment.

Like Washington and Boston, many U.S. cities have aging pipeline infrastructure that may be prone to leaks. The researchers recommend coordinated gas-leak mapping campaigns in cities where the infrastructure is deemed to be at risk.

The new study comes at a time when the nation's aging pipeline infrastructure is generating increased legislative attention. Last November, Sen. Edward J. Markey (D-Mass.) introduced two new bills to speed up the replacement of pipelines in states with older infrastructures by offering new federal programs and incentives to help defray the costs associated with the repairs.

"We need to put the right financial incentives in place," said Jackson. "Companies and public utility commissions need help to fix leaks and replace old cast iron pipes more quickly."

More information: "Natural Gas Pipeline Leaks Across Washington, D.C.," Robert B. Jackson, Adrian Down, Nathan G. Phillips, Robert C. Ackley, Charles W. Cook, Desiree L. Plata and Kaiguang Zhao. Environmental Science & Technology, January 16, 2014. dx.doi.org/10.1021/es404474x

Read more at: http://phys.org/news/2014-01-natural-gas-leaks-washington-dc.html

Chasing the Dream of Half-Price Gasoline from Natural Gas

A startup called Siluria thinks it’s solved a mystery that has stymied huge oil companies for decades.  By Kevin Bullis on January 15, 2014

http://www.technologyreview.com/news/523146/chasing-the-dream-of-half-price-gasoline-from-natural-gas/
 
 
At a pilot plant in Menlo Park, California, a technician pours white pellets into a steel tube and then taps it with a wrench to make sure they settle together. He closes the tube, and oxygen and methane—the main ingredient of natural gas—flow in. Seconds later, water and ethylene, the world’s largest commodity chemical, flow out. Another simple step converts the ethylene into gasoline.
The white pellets are a catalyst developed by the Silicon Valley startup Siluria, which has raised $63.5 million in venture capital. If the catalysts work as well in a large, commercial-scale plant as they do in tests, Siluria says, the company could produce gasoline from natural gas at about half the cost of making it from crude oil—at least at today's cheap natural-gas prices.
If Siluria really can make cheap gasoline from natural gas, it will have achieved something that has eluded the world's top chemists and oil and gas companies for decades. Indeed, finding an inexpensive and direct way to upgrade natural gas into more valuable and useful chemicals and fuels could finally mean a cheap replacement for petroleum.

Natural gas burns much more cleanly than oil—power plants that burn oil emit 50 percent more carbon dioxide than natural gas ones. It also is between two and six times more abundant than oil, and its price has fallen dramatically now that technologies like fracking and horizontal drilling have led to a surge of production from unconventional sources like the Marcellus Shale. While oil costs around $100 a barrel, natural gas sells in the U.S. for the equivalent of $20 a barrel.
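The article doesn't show the conversion behind the "$20 a barrel" figure. Assuming the common convention that a barrel of oil holds about 5.8 million Btu, and an early-2014 U.S. spot price of roughly $3.50 per million Btu, the arithmetic works out:

MMBTU_PER_BARREL = 5.8        # assumed energy content of a barrel of oil
gas_price_per_mmbtu = 3.50    # assumed U.S. spot price, $/MMBtu, early 2014
print(f"~${gas_price_per_mmbtu * MMBTU_PER_BARREL:.0f} per barrel of oil equivalent")
# prints ~$20, matching the article's figure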

But until now oil has maintained a crucial advantage: natural gas is much more difficult to convert into chemicals such as those used to make plastics. And it is relatively expensive to convert natural gas into liquid fuels such as gasoline. It cost Shell $19 billion to build a massive gas-to-liquids plant in Qatar, where natural gas is almost free. The South African energy and chemicals company Sasol is considering a gas-to-liquids plant in Louisiana that it says will cost between $11 billion and $14 billion. Altogether, such plants produce only about 400,000 barrels of liquid fuels and chemicals a day, which is less than half of 1 percent of the 90 million barrels of oil produced daily around the world.

The costs are so high largely because the process is complex and consumes a lot of energy. First, high temperatures are required to break methane down into carbon monoxide and hydrogen, creating what is called syngas. The syngas is then subjected to catalytic reactions that turn it into a mixture of hydrocarbons that is costly to refine and separate into products.
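The article doesn't write out the chemistry, but the conventional route it sketches is steam methane reforming followed by Fischer-Tropsch synthesis, which in textbook form (not taken from the article) are:

CH4 + H2O → CO + 3 H2                   (steam reforming to syngas; strongly endothermic)
n CO + (2n+1) H2 → CnH2n+2 + n H2O      (Fischer-Tropsch synthesis of hydrocarbons)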
For years, chemists have been searching for catalysts that would simplify the process, skipping the syngas step and instead converting methane directly into a specific, desired chemical. Such a process wouldn’t require costly refining and separation steps, and it might consume less energy. But the chemistry is difficult—so much so that some of the world’s top petroleum companies gave up on the idea in the 1980s.

Siluria thinks it can succeed where others have failed not because it understands the chemistry better, but because it has developed new tools for making and screening potential catalysts. Traditionally, chemists have developed catalysts by analyzing how they work and calculating what combination of elements might improve them. Siluria’s basic philosophy is to try out a huge number of catalysts in the hope of getting lucky. The company built an automated system—it looks like a mess of steel and plastic tubes, mass spectrometers, small stainless steel furnaces, and data cables—that can quickly synthesize hundreds of different catalysts at a time and then test how well they convert methane into ethylene.

The system works by varying both what catalysts are made of—the combinations and ratios of various elements—and their microscopic structure. Siluria was founded based on the work of Angela Belcher, a professor of biological engineering at MIT who developed viruses that can assemble atoms of inorganic materials into precise shapes. Siluria uses this and other methods to form nanowires from the materials that make up its catalysts. Sometimes the shape of a nanowire changes the way the catalyst interacts with gases such as methane—and this can transform a useless combination of elements into an effective one. “How you build up the structure of the catalyst matters as much as its composition,” says Erik Scher, Siluria’s vice president of research and development.
The process of making and testing catalysts isn't completely random—Siluria has the work of earlier chemists to guide it, and it has developed software that sorts out the most efficient way to screen a wide variety of possibilities. The result is that what used to take chemists a year, Siluria can now do in a couple of days, Scher says. "We've made and screened over 50,000 catalysts at last count," he says. "And I haven't been counting in a while."
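The flavor of that combinatorial search can be conveyed in a few lines of Python; everything below, from the candidate elements to the stand-in measurement and the yield proxy, is hypothetical, not Siluria's actual system.

import itertools
import random

elements = ["La", "Sr", "Mg", "Na", "W", "Mn"]    # assumed candidate elements

def test_catalyst(composition):
    """Stand-in for a reactor run; a real system would synthesize the
    catalyst and measure methane conversion and ethylene selectivity."""
    rng = random.Random(hash(composition))        # deterministic placeholder
    return rng.uniform(0, 0.4), rng.uniform(0, 0.9)

# Enumerate two-element combinations, "measure" each, and rank the results
results = {}
for combo in itertools.combinations(elements, 2):
    conversion, selectivity = test_catalyst(combo)
    results[combo] = conversion * selectivity     # crude proxy for ethylene yield
best = sorted(results, key=results.get, reverse=True)[:3]
print("Top candidates by conversion x selectivity:", best)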

Nonetheless, some seasoned chemists are skeptical that Siluria can succeed. Siluria’s process is a version of one that chemists pursued in the 1970s and 1980s known as oxidative coupling, which involves reacting methane with oxygen. The problem with this approach is that it’s hard to get the reaction to stop at ethylene and not keep going to make carbon dioxide and water. “The reaction conditions you need to convert methane to ethylene do at least as good a job, if not better, of converting ethylene into carbon dioxide, which is useless,” says Jay Labinger, a chemist at the Beckman Institute at Caltech.
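In textbook stoichiometry (again, not from the article), the desired coupling reaction and the over-oxidation Labinger describes are:

2 CH4 + O2 → C2H4 + 2 H2O      (oxidative coupling to ethylene)
C2H4 + 3 O2 → 2 CO2 + 2 H2O    (competing combustion of the product)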

In the late 1980s, Labinger wrote a paper that warned researchers not to waste their time working on the process. And history seems to have borne him out. The process “hasn’t been, and doesn’t appear at all likely to be” an economically viable one, he says.

Yet in spite of the challenging chemistry, Siluria says the performance of its catalysts at its pilot plant has justified building two larger demonstration plants—one across San Francisco Bay in Hayward, California, that will make gasoline, and one in Houston that will only make ethylene. The plants are designed to prove to investors that the technology can work at a commercial scale, and that the process can be plugged into existing refineries and chemical plants, keeping down capital costs. The company hopes to open its first commercial plants within four years.
The hope for finding more valuable uses for natural gas—and making natural gas a large-scale alternative to oil—doesn’t rest on Siluria alone. The abundance of cheap natural gas has fueled a number of startups with other approaches. Given the challenges that such efforts have faced, there’s good reason to be skeptical that they will succeed, says David Victor, director of the Laboratory on International Law and Regulation at the University of California at San Diego. But should some of them break through, he says, “that would be seismic.”

Social privilege

From Wikipedia, the free encyclopedia https://en.wikipedi...