Sunday, December 29, 2013

Prebiotic Molecules May Form in Exoplanet Atmospheres

New research suggests that prebiotic molecules, the chemical building blocks of life, may form in the atmospheres of planets: dust grains there provide a stable surface on which the molecules can assemble, and reactions with the surrounding plasma supply the energy needed to drive the chemistry.

It's time to approve the full Keystone XL pipeline project.

Mark Green, Posted Dec 27, 2013 by Energy Tomorrow

Brainlike Computers, Learning From Experience

Erin Lubin/The New York Times
 
Kwabena Boahen holding a biologically inspired processor attached to a robotic arm in a laboratory at Stanford University.
 
PALO ALTO, Calif. — Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.       
 
The first commercial version of the new kind of computer chip is scheduled to be released in 2014.
Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete.       
 
The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals.
 
In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That could have enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming.
 
Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, though a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon.
 
“We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits.
 
Conventional computers are limited by what they have been programmed to do. Computer vision systems, for example, only “recognize” objects that can be identified by the statistics-oriented algorithms programmed into them. An algorithm is like a recipe, a set of step-by-step instructions to perform a calculation.
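
To make the recipe analogy concrete, here is a deliberately toy sketch of such a fixed, statistics-oriented rule. The feature and threshold are invented for illustration and correspond to no real vision system.

```python
# A toy illustration of a fixed, statistics-oriented "recipe":
# the program can only recognize what its hand-written rules describe.
# The feature and threshold here are invented for illustration.

def mean_brightness(pixels):
    """Step 1: compute a simple statistic over the image."""
    return sum(pixels) / len(pixels)

def classify(pixels, threshold=0.5):
    """Step 2: apply a fixed, pre-programmed decision rule."""
    if mean_brightness(pixels) > threshold:
        return "object A"
    return "object B"

# The rule never changes at run time; anything outside it is invisible
# to the program. That rigidity is what the new chips aim to escape.
print(classify([0.2, 0.9, 0.7, 0.6]))  # -> "object A"
```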
 
But last year, Google researchers were able to get a machine-learning algorithm, known as a neural network, to perform an identification task without supervision. The network scanned a database of 10 million images, and in doing so trained itself to recognize cats.
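
Google's network was vastly larger, but the underlying idea of unsupervised learning can be sketched in a few lines: an autoencoder that, given no labels at all, learns features simply by trying to reconstruct its input. Every size and constant below is a toy value, not a detail of the actual experiment.

```python
import numpy as np

# A minimal sketch of unsupervised feature learning with a one-layer
# autoencoder, in the spirit of (but far smaller than) the network
# Google trained on 10 million images.

rng = np.random.default_rng(0)
X = rng.random((200, 16))          # 200 fake "images" of 16 pixels each

n_hidden = 4
W1 = rng.normal(0, 0.1, (16, n_hidden))   # encoder weights
W2 = rng.normal(0, 0.1, (n_hidden, 16))   # decoder weights
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    H = sigmoid(X @ W1)            # encode: hidden features
    X_hat = H @ W2                 # decode: reconstruct the input
    err = X_hat - X                # no labels: the image is its own target
    # Backpropagate the reconstruction error.
    dW2 = H.T @ err / len(X)
    dH = err @ W2.T * H * (1 - H)
    dW1 = X.T @ dH / len(X)
    W1 -= lr * dW1
    W2 -= lr * dW2

# After training, the columns of W1 are features the network discovered
# on its own -- the analogue of the "cat neuron" in Google's experiment.
print("reconstruction error:", float((err ** 2).mean()))
```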
 
In June, the company said it had used those neural network techniques to develop a new search service to help customers find specific photos more accurately.
 
The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that is also its limitation, as scientists are far from fully understanding how brains function.
 
“We have no clue,” he said. “I’m an engineer, and I build things. There are these highfalutin theories, but give me one that will let me build something.”
 
Until now, the design of computers was dictated by ideas originated by the mathematician John von Neumann about 65 years ago. Microprocessors perform operations at lightning speed, following instructions programmed using long strings of 1s and 0s. They generally store that information separately in what is known, colloquially, as memory, either in the processor itself, in adjacent storage chips or in higher capacity magnetic disk drives.
 
The data — for instance, temperatures for a climate model or letters for word processing — are shuttled in and out of the processor’s short-term memory while the computer carries out the programmed action. The result is then moved to its main memory.
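
As a toy rendering of that shuttling pattern (the climate-model temperatures here are invented values):

```python
# Fetch data from memory, transform it in the processor, and move the
# result back to main memory -- the basic von Neumann cycle.

memory = {"temps_f": [68.0, 71.5, 73.2]}           # "main memory"

working_set = memory["temps_f"]                    # shuttle data in
celsius = [(t - 32) * 5 / 9 for t in working_set]  # programmed action
memory["temps_c"] = celsius                        # result moved back

print(memory["temps_c"])  # converted values, written back to memory
```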
 
The new processors consist of electronic components that can be connected by wires that mimic biological synapses. Because they are based on large groups of neuron-like elements, they are known as neuromorphic processors, a term credited to the California Institute of Technology physicist Carver Mead, who pioneered the concept in the late 1980s.
 
They are not “programmed.” Rather, the connections between the circuits are “weighted” according to correlations in data that the processor has already “learned.” Those weights are then altered as data flows into the chip, causing them to change their values and to “spike.” That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions.
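
A rough sketch of that spike-and-reweight cycle: a leaky integrate-and-fire neuron whose input weights are nudged whenever an input coincides with an output spike (a crude Hebbian rule). All constants are illustrative and come from no real neuromorphic chip.

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs = 8
weights = rng.random(n_inputs) * 0.2   # "weighted" connections
potential = 0.0
LEAK, THRESHOLD, LEARN_RATE = 0.9, 1.0, 0.05

for step in range(100):
    inputs = (rng.random(n_inputs) < 0.3).astype(float)  # incoming spikes
    potential = LEAK * potential + weights @ inputs       # integrate
    if potential >= THRESHOLD:                            # "spike"
        potential = 0.0                                   # reset
        # Strengthen the connections that contributed to this spike,
        # in essence programming the neuron's next reactions.
        weights += LEARN_RATE * inputs
        weights = np.clip(weights, 0.0, 1.0)

print("learned weights:", np.round(weights, 2))
```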
 
“Instead of bringing data to computation as we do today, we can now bring computation to data,” said Dharmendra Modha, an I.B.M. computer scientist who leads the company’s cognitive computing research effort. “Sensors become the computer, and it opens up a new way to use computer chips that can be everywhere.”
 
The new computers, which are still based on silicon chips, will not replace today’s computers, but will augment them, at least for now. Many computer designers see them as coprocessors, meaning they can work in tandem with other circuits that can be embedded in smartphones and in the giant centralized computers that make up the cloud. Modern computers already consist of a variety of coprocessors that perform specialized tasks, like producing graphics on your cellphone and converting visual, audio and other data for your laptop.
 
One great advantage of the new approach is its ability to tolerate glitches. Traditional computers are precise, but they cannot work around the failure of even a single transistor. With the biological designs, the algorithms are ever changing, allowing the system to continuously adapt and work around failures to complete tasks.
 
Traditional computers are also remarkably energy inefficient, especially when compared to actual brains, which the new chips’ neuron-like circuits are built to mimic.
 
I.B.M. announced last year that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons — more than 10 percent of a human brain. It ran about 1,500 times more slowly than an actual brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
 
Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Dr. Modha said.
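
As a back-of-the-envelope check on that claim, one can scale the simulation's power linearly with speed. The exact wattage was given only as “several megawatts,” so the 4 MW figure below is an assumption, and linear scaling is itself a simplification.

```python
# Rough version of Dr. Modha's point, using the figures quoted above
# plus one assumed value for the simulation's power draw.

SIM_POWER_WATTS = 4e6        # assumption: "several megawatts" ~ 4 MW
SLOWDOWN = 1500              # simulation ran ~1,500x slower than a brain
BRAIN_POWER_WATTS = 20       # biological brain, per the article

# Naive linear scaling: to run 1,500x faster, spend ~1,500x the power.
realtime_watts = SIM_POWER_WATTS * SLOWDOWN
print(f"~{realtime_watts / 1e9:.0f} GW for real-time simulation")  # ~6 GW
print(f"~{realtime_watts / BRAIN_POWER_WATTS:.1e}x a real brain")
```

Even with generous assumptions, the gap is roughly eight orders of magnitude, which is the motivation for the neuromorphic approach.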
 
I.B.M. and Qualcomm, as well as the Stanford research team, have already designed neuromorphic processors, and Qualcomm has said that it is coming out in 2014 with a commercial version, which is expected to be used largely for further development. Moreover, many universities are now focused on this new style of computing. This fall the National Science Foundation financed the Center for Brains, Minds and Machines, a new research center based at the Massachusetts Institute of Technology, with Harvard and Cornell.
 
The largest class on campus this fall at Stanford was a graduate-level machine-learning course covering both statistical and biological approaches, taught by the computer scientist Andrew Ng. More than 760 students enrolled. “That reflects the zeitgeist,” said Terry Sejnowski, a computational neuroscientist at the Salk Institute, who pioneered early biologically inspired algorithms. “Everyone knows there is something big happening, and they’re trying to find out what it is.”

The Act 13 decision unravels protections


Our industry has worked, and continues to work, closely with the communities in which we operate toward shared goals: to be good neighbors and stewards of our environment. And the outcomes are clear. The federal Environmental Protection Agency confirmed in October that U.S. carbon emissions are at their lowest since 1994, thanks to expanded natural gas use. The independent State Review of Oil & Natural Gas Environmental Regulations again gave Pennsylvania’s oil and gas regulatory framework high marks, noting that regulations were strengthened significantly through Act 13’s enhancements.

Consumers are winning, too. A recent Boston Consulting Group analysis projects average U.S. household savings tied to shale development to reach $1,200. And in addition to more than $1.8 billion in state taxes since 2008, our industry has paid $406 million in impact fees over the last two years to communities across the commonwealth. Unfortunately, with this ruling, these shared benefits may unnecessarily be limited rather than maximized.

If we are to remain competitive and if our focus is truly an improved environment, more job creation and economic prosperity, we must continue to work together toward common-sense proposals that encourage investment in the commonwealth.
DAVE SPIGELMYER
President
Marcellus Shale Coalition
North Fayette


Read more: http://www.post-gazette.com/opinion/letters/2013/12/29/The-Act-13-decision-unravels-protections/stories/201312290024

GMOs May Feed the World Using Fewer Pesticides (and Bring Back the Bees)

De Jong produced the plants in the same old, laborious way that his father did before him. He collected pollen from a plant that produces potatoes that fry as potato chips should and then sprinkled the pollen on the flower of a potato plant that resists viruses. If the resulting potatoes bear their parents’ finest features—and none of the bad ones—De Jong will bury them in the ground next year and test their mettle against a common potato virus. If they survive—and are good for frying and eating—he and his team will repeat this for 13 years to ensure that problematic genes did not creep in during the initial cross.

Walter De Jong, a second-generation potato breeder, walks the fields in upstate New York.

Each year, the chance of failure is high. Potatoes that resist viruses, for example, often have genes that make them taste bitter. Others turn an unappetizing shade of brown when fried. If anything like that happens, De Jong will have to start from scratch. Tedious as it is, he loves the work. Kicking up dirt in the furrows that cascade along the hillsides of upstate New York, he says, “I’m never stressed in the potato fields.”

De Jong has some serious cred in the agriculture world. Not only was his father a potato breeder; he is also descended from a long line of farmers. The potato farmers he works with appreciate this deeply, along with his commitment to the age-old craft of producing new potato varieties through selective breeding. They even advocated on his behalf during his hiring and when he was up for tenure at Cornell, a school with a long history of agriculture research. “All of our farmers like Walter,” says Melanie Wickham, the executive secretary of the Empire State Growers organization in Stanley, New York. Often, he’s in the fields in a big hat, she says. Other times “you’ll see him in the grocery store, looking over the potatoes.”

De Jong has been working with farmers long enough to know that our food supply is never more than a step ahead of devastating insect infestations and disease. Selective breeders like De Jong work hard to develop resistant crops, but farmers still have to turn to chemical pesticides, some of which are toxic to human health and the environment. De Jong enjoys dabbing pollen from plant to plant the old-fashioned way, but he knows that selective breeding can only do so much.

Seedlings from De Jong's selective breeding experiments poke their shoots above the soil in a greenhouse.
 
So while De Jong still devotes most of his time to honing his craft, he has recently begun to experiment in an entirely different way, with genetic engineering. To him, genetic engineering represents a far more exact way to produce new varieties, rather than simply scrambling the potato genome’s 39,000 genes the way traditional breeding does. By inserting a specific fungus-defeating gene into a tasty potato, for example, De Jong knows he could offer farmers a product that requires fewer pesticides.

“We want to make food production truly sustainable,” De Jong says, “and right now I cannot pretend that it is.”

The need to protect crops from ruin grows more vital every day. By 2050, farmers must produce 40% more food to feed an estimated 9 billion people on the planet. Either current yields will have to increase or farmland will expand farther into forests and jungles. In some cases, genetically modified organisms (GMOs) would offer an alternative way to boost yields without sacrificing more land or using more pesticides, De Jong says. But he fears this approach won’t blossom if the public rejects GMOs out of hand.

When It All Began

In the late 1990s, the agriculture corporation Monsanto began to sell corn engineered to include a protein from the bacterium Bacillus thuringiensis, better known as Bt. The bacterium wasn’t new to agriculture: organic farmers spray it on their crops to kill certain insects. Today more than 60% of the corn grown within the United States is Bt corn. Farmers have adopted it in droves because it saves them money that they would otherwise spend on insecticide and the fuel and labor needed to apply it. They also earn more money for an acre of Bt corn compared with a conventional variety because fewer kernels are damaged. Between 1996 and 2011, Bt corn reduced insecticide use in corn production by 45% worldwide (110 million pounds, or roughly the equivalent of 20,000 Olympic swimming pools).

Between 1996 and 2001, Monsanto also produced Bt potato plants. Farmers like Duane Grant of Grant 4D Farms in Rupert, Idaho, welcomed the new variety. Grant grew up on his family’s farm, and his distaste for insecticides started at a young age. As a teenager, he recalls feeling so nauseous and fatigued after spraying the fields that he could hardly move until the next day. Today, pesticides are safer than those used 40 years ago, and stiffer U.S. federal regulations require that employees take more precautionary measures when applying them, but Grant occasionally tells his workers to head home early when they feel dizzy after spraying the fields. He was relieved when GM potatoes were introduced because he didn’t have to spray them with insecticide. He was warned that pests might overcome the modification in 15 to 20 years, but that didn’t deter him—he says the same thing happens with chemicals, too.

Unlike Bt corn, you can’t find any fields planted today with Bt potatoes. Soon after the variety hit the market, protestors began to single out McDonald’s restaurants, which collectively are the biggest buyer of potatoes worldwide. In response, McDonald’s, Wendy’s, and Frito-Lay stopped purchasing GM potatoes. In 2001, Monsanto dropped the product and Grant returned to conventional potatoes and the handful of insecticides he sprays on them throughout the summer.

“There is not a single documented case of anyone being hurt by genetically modified food, and yet this is a bigger problem for people than pesticides, which we know have caused harm,” he says. “I just shake my head in bewilderment at the folks who take these stringent positions that biotech should be banned.”
In the decade after Monsanto pulled its GM potatoes from the market, dozens of long-term animal feeding studies concluded that various GM crops were as safe as traditional varieties. Statements from science policy bodies, such as those issued by the American Association for the Advancement of Science, the U.S. National Academy of Sciences, the World Health Organization, and the European Commission, uphold that conclusion. What’s more, techniques to tweak genomes have become remarkably precise. Specific genes can be switched off without lodging foreign material in a plant’s genome. Scientists don’t necessarily have to mix disparate organisms with one another, either. In cisgenic engineering, organisms are engineered by transferring genes between individuals that could breed naturally.

Even some organic farmers bristle when asked about the anti-GMO movement. Under the U.S. Organic Foods Production Act, they are not allowed to grow GMOs, despite their ability to reduce pesticide applications. Organic farmers still spray their crops, just with different chemicals, such as sulfur and copper. Amy Hepworth, an organic farmer at Hepworth Farms in Milton, New York, says that they, too, can take a toll on the environment.

Hepworth would like to continuously evaluate new avenues towards sustainable agriculture as technology advances. However, her views often clash with her customers’ in the affluent Brooklyn, New York, neighborhood of Park Slope. Many of them see no benefit in GMOs’ ability to reduce pesticides because they say farmers should rely strictly on traditional farming methods.

“What people don’t understand is that without pesticides there is not enough food for the masses,” Hepworth says. “The fact is that GM is a tool that can help us use less pesticide.”

Saturday, December 28, 2013

Why Do We Need the Space Launch System?

Why is NASA using its precious, and shrinking, funding to construct the Space Launch System (SLS), a massive rocket in the class of the Saturn V that took Apollo astronauts to the moon, when private companies such as SpaceX and Orbital Sciences Corporation already have, or soon will have, rockets that can accomplish all the same functions sooner and cheaper?
 
SpaceX's Falcon 9 rocket is already delivering cargo regularly to the International Space Station (ISS) and has even reached geosynchronous transfer orbit, on the way to roughly 22,000 miles up. It is being designed to man-rated safety standards and should be carrying astronauts into orbit within a couple of years. A more powerful version, the Falcon Heavy, is due for testing in 2014 and should be ready for missions within the same time span. While not as powerful as the SLS, the combination of a Falcon Heavy (to carry the needed equipment) and a Falcon 9 (to carry a crew) would have the capacity to send men to the moon or beyond, perhaps even to Mars, before the end of this decade, ahead of the SLS schedule. SpaceX is also working on recoverable rocket stages and other advanced technologies.
 
 
Antares, known during early development as Taurus II, is an expendable launch system developed by Orbital Sciences Corporation to carry payloads of up to 5,000 kg (11,000 lb) into low Earth orbit. It made its maiden flight on April 21, 2013. Designed to launch the Cygnus spacecraft to the International Space Station under NASA's COTS and CRS programs, Antares is the largest rocket operated by Orbital Sciences and is scheduled to start supplying the ISS in early 2014.

NASA awarded Orbital a Commercial Orbital Transportation Services (COTS) Space Act Agreement (SAA) in 2008 to demonstrate delivery of cargo to the International Space Station. For these COTS missions Orbital intends to use Antares to launch its Cygnus spacecraft. Antares will also compete for small-to-medium missions. On December 12, 2011, Orbital Sciences renamed the vehicle from its previous designation, Taurus II, to Antares, after the star of the same name.
 
I cannot see how SpaceX and Orbital Sciences, with these two rockets alone, don't invalidate the need for the SLS. If it were scrubbed (the program was mandated by Congress in 2010), large amounts of money could be freed up for more unmanned planetary missions, such as to Europa, Uranus, and Neptune, and to asteroids to assess mining possibilities.
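
For a sense of the economics behind this argument, here is a rough cost-per-kilogram comparison. The payload and price figures are approximate 2013-era published estimates (Falcon Heavy had not yet flown, and SLS costs were projections), so treat the numbers as illustrative only.

```python
# Rough cost-per-kilogram to low Earth orbit, 2013-era estimates.

vehicles = {
    #                 (payload to LEO, kg   rough price per launch, $)
    "Falcon 9 v1.1":  (13_150,              56_500_000),
    "Falcon Heavy":   (53_000,              85_000_000),    # projection
    "SLS Block 1":    (70_000,              500_000_000),   # low-end estimate
}

for name, (kg, price) in vehicles.items():
    print(f"{name:15s} ~${price / kg:>8,.0f} per kg to LEO")
```

Under these assumptions Falcon Heavy comes out several times cheaper per kilogram than the SLS, which is the heart of the case made above.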
 
 
Linus Torvalds

On this date in 1969, Linus Torvalds was born in Helsinki, Finland. He started using computers when he was about 10 years old and soon began writing simple programs. He studied computer science at the University of Helsinki, where he was introduced to the Unix operating system, and earned his M.S. in 1996. In 1991, Torvalds began creating Linux, an operating system similar to Unix, and later that year he released it for free as open source software, allowing anyone to study and modify its source code. Linux’s open source nature has contributed to its popularity and reliability, since it is regularly updated and improved by dedicated users. For his work on Linux, Torvalds received the 2008 Computer History Museum Fellow Award and the 2005 Vollum Award for Distinguished Accomplishment in Science and Technology. The asteroid 9793 Torvalds is named after him.

After developing Linux, Torvalds worked for Transmeta Corporation from 1997 to 2003. He appeared in the 2001 documentary “Revolution OS” and wrote an autobiography, Just for Fun: The Story of an Accidental Revolutionary (2001). He is married to Tove Torvalds, who also studied computer science at the University of Helsinki. They live in the U.S. and have three daughters: Patricia, born in 1996; Daniela, born in 1998; and Celeste, born in 2000.

In a Nov. 1, 1999 interview with Linux Journal, Torvalds described himself as “completely a-religious” and “atheist.” He explained his reasons for being an atheist: “I find it kind of distasteful having religions that tell you what you can do and what you can’t do.” He also believes in the separation of church and state, telling Linux Journal, “In practice, religion has absolutely nothing to do with everyday life.”
“I find that people seem to think religion brings morals and appreciation of nature. I actually think it detracts from both . . . I think we can have morals without getting religion into it, and a lot of bad things have come from organized religion in particular. I actually fear organized religion because it usually leads to misuses of power.”
—Linus Torvalds, Linux Journal, Nov. 1, 1999.
Compiled by Sabrina Gaylor
© Freedom From Religion Foundation. All rights reserved.
