
Sunday, December 29, 2013

Welcome to the 'Era of Behavior'

by Dov Seidman      
December 29, 2013, 8:00 AM
As the world has gone from connected to interconnected to interdependent, I believe we've entered a new era: what I call the era of behavior. I acknowledge that behavior has always mattered. What I'm saying is that behavior now matters more than ever, and in ways it never has before. And what I mean by behavior is not just doing the right thing, the principled, responsible thing.
 
Of course that's fundamentally what I mean by behavior. But every Tweet is a behavior. Every email is a behavior. We call these things communications, Tweeting, friending and unfriending, but every email we write is a behavior. Collaboration is a behavior. Innovation is a behavior. How we lead is a behavior. How we engender trust in our relationships, how we say we're sorry when we should – it's all behavior. And the more connected the world gets, the more every one of these behaviors matters, and the more that leaders can create cultures where these behaviors flourish, are scaled and are embedded into the DNA, the more those organizations will succeed and, more importantly, achieve significance in their endeavors and therefore lasting, enduring success. So I believe that we have deeply entered the era of behavior.
 
Now I want to make a distinction when it comes to behavior: carrots and sticks. The proverbial carrots and sticks – bonuses and compensation, threats of punishment or discipline – can shift behavior. You know, we went through an election where tiny slivers of swing-state voters were bombarded with ads trying to get them to shift from one camp to the other. If you put a product on sale, we're shifting behavior: buy now, not later.
 
So we've scaled up marketplaces and mechanics for shifting behavior – left, right, forward, back, now not later. But if you sit with corporate managers and ask, "What behaviors do you want from your colleagues, from your people?" they say, "I want creativity. I want collaboration, loyalty, passion." And they go on and on – responsible, principled conduct. These are not behaviors you can shift into existence. You can't say, "You two, go in the room and don't emerge until you have a brilliant idea." Or, "You two from different cultures, go in a room and don't come out, but I'll pay you double if you figure out how to truly collaborate and move us forward." The behaviors we want today are behaviors that we elevate. These are elevated behaviors.
 
And what's elevated people since the beginning of time is a mission worthy of their loyalty. A purpose worthy of their dedication. Core values that they share with others and that really animate them. Beliefs that are near and dear to them. And a kind of leadership rooted in moral authority that inspires them. So not only are we in the era of behavior, where competitive advantage has shifted to behavior; we are in the era of elevated behavior. And what elevates behavior are fundamental values that we share with others, and missions and purposes worthy of our commitment and dedication that we also share with others.
 
And twenty-first century leadership is about connecting with people from within. That's what I mean by the notion that there are only three ways to get another human being – a friend, a colleague, a worker – to do anything. You can coerce them: do this or else. You can motivate them with the carrot, with the bonus, with the stock option. Coercion – fireable offenses, for example – and paying people well continue to have their place. But the freest, cheapest, most enduring and cleanest form of human energy is inspiration.
 
And when you inspire somebody, you get in touch with the first two letters of the word inspire – IN. And what's in us are beliefs and values and missions and purposes worthy of our commitment. Leadership today is fast moving from command and control, with carrots and sticks as its mechanics, to inspirational leadership: leadership animated by moral authority that connects with people, in an inspired way, from within. And I think twenty-first century leadership is about becoming an inspirational leader.
 
In Their Own Words is recorded in Big Think's studio.

Why America is NOT the greatest country in the world, anymore.


More Good News for Solar Power

Solar cell performance improves with ion-conducting polymer

Sep 04, 2013
A dye-sensitized solar cell panel is tested in the laboratory at the School of Chemical Science and Engineering. Dye-sensitized solar photovoltaics can be greatly improved as a result of research done at KTH Royal Institute of Technology. Credit: David Callahan

Researchers at Stockholm's KTH Royal Institute of Technology have found a way to make dye-sensitized solar cells more energy-efficient and longer-lasting.

Drawing their inspiration from photosynthesis, dye-sensitized solar cells offer the promise of low-cost solar power and – when coupled with catalysts – even the possibility of generating hydrogen and oxygen, just like plants. A study published in August could lead to more efficient and longer-lasting dye-sensitized solar cells, says one of the researchers from KTH Royal Institute of Technology in Stockholm.

A research team that included James Gardner, Assistant Professor of Photoelectrochemistry at KTH, reported the success of a new quasi-liquid, polymer-based electrolyte that increases a dye-sensitized solar cell's voltage and current, and lowers resistance between its electrodes.

The study highlights the advantages of speeding up the movement of oxidized electrolytes in a dye-sensitized solar cell, or DSSC. Also on the team from KTH were Lars Kloo, Professor of Inorganic Chemistry and researcher Muthuraaman Bhagavathi Achari.

Their research was published in the Royal Society of Chemistry's journal, Physical Chemistry Chemical Physics on August 19.

"We now have clear evidence that by adding the ion- to the solar cell's cobalt redox electrolyte, the transport of oxidized electrolytes is greatly enhanced," Gardner says. "The fast transport increases by 20 percent."

A dye-sensitized solar cell absorbs photons and injects electrons into the conduction band of a transparent semiconductor. This anode is actually a plate with a highly porous, thin layer of titanium dioxide that is sensitized with dyes that absorb visible light. The electrons in the semiconductor diffuse through the anode, out into the external circuit.

In the electrolyte, a cobalt complex redox shuttle acts as a catalyst, providing the internal electrical continuity between the anode and cathode. When the dye releases electrons into the titanium dioxide and becomes oxidized, the electrolyte supplies electrons to replenish the deficiency. This "resets" the dye molecules, reducing them back to their original states. As a result, the electrolyte becomes oxidized and electron-deficient, and it migrates toward the cathode to recover its missing electrons. Electrons migrating through the external circuit recombine with the oxidized form of the cobalt complex when they reach the cathode.
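
That cycle can be summarized schematically as four steps (this is my own shorthand, not notation from the paper; D stands for a dye molecule and the cobalt couple is the Co3+/Co2+ shuttle mentioned below):

```latex
% Simplified DSSC cycle, paraphrasing the description above; D = dye molecule.
\begin{align*}
\mathrm{D} + h\nu &\rightarrow \mathrm{D}^{*} && \text{dye absorbs a photon}\\
\mathrm{D}^{*} &\rightarrow \mathrm{D}^{+} + e^{-}(\mathrm{TiO}_2) && \text{electron injected into the anode}\\
\mathrm{D}^{+} + \mathrm{Co}^{2+} &\rightarrow \mathrm{D} + \mathrm{Co}^{3+} && \text{electrolyte regenerates the dye and is oxidized}\\
\mathrm{Co}^{3+} + e^{-}\,(\text{at cathode}) &\rightarrow \mathrm{Co}^{2+} && \text{shuttle recovers its electron}
\end{align*}
```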

In the most efficient solar cells this transport of ions relies on acetonitrile, a low viscosity, volatile organic solvent. But in order to build a stable, commercially-viable solar cell, a low volatility solvent is used instead, usually methoxypropionitrile. The problem is that while methoxypropionitrile is more stable, it is also more viscous than acetonitrile, and it impedes the flow of ions.

But with the introduction of a new quasi-liquid, polymer-based electrolyte (containing the Co3+/Co2+ redox mediator in 3-methoxypropionitrile solvent), the research team has overcome the viscosity problem, Gardner says. At the same time, adding the ion-conducting polymer to the electrolyte maintains its low volatility. This makes it possible for the oxidized form of the cobalt complex to reach the cathode, and get reduced, faster.

Speeding up this transport is important because when slowed down, more of the cobalt complexes react with electrons in the semiconductor instead of with the electrons at the cathode, resulting in rapid recombination losses. Speeding up the cobalt lowers resistance and increases voltage and current in the solar cell, Gardner says.

Prebiotic Molecules May Form in Exoplanet Atmospheres

New research suggests that the building blocks of life — prebiotic molecules — may form in the atmospheres of planets, where dust provides a safe platform on which to form and reactions with the surrounding plasma provide the energy necessary to create life.

It's time to approve the full Keystone XL pipeline project.





Mark Green, Posted Dec 27, 2013 by Energy Tomorrow

Brainlike Computers, Learning From Experience

Erin Lubin/The New York Times
 
Kwabena Boahen holding a biologically inspired processor attached to a robotic arm in a laboratory at Stanford University.
 
PALO ALTO, Calif. — Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.       
 
The first commercial version of the new kind of computer chip is scheduled to be released in 2014.
Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete.       
 
The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals.
 
In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That can hold enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming.
 
Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, though a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon.
 
“We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits.
 
Conventional computers are limited by what they have been programmed to do. Computer vision systems, for example, only “recognize” objects that can be identified by the statistics-oriented algorithms programmed into them. An algorithm is like a recipe, a set of step-by-step instructions to perform a calculation.
 
But last year, Google researchers were able to get a machine-learning algorithm, known as a neural network, to perform an identification task without supervision. The network scanned a database of 10 million images, and in doing so trained itself to recognize cats.
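
As a toy illustration of what "training itself without supervision" means (this is my own minimal Python sketch, not Google's system; the data, layer sizes, and learning rate are invented), a tiny autoencoder learns features from unlabeled data simply by trying to reconstruct its inputs:

```python
# A minimal sketch of unsupervised learning: a tiny autoencoder that learns
# features from unlabeled data. All sizes and data here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "image patches": 1,000 unlabeled 8x8 patches, flattened to 64 values.
X = rng.random((1000, 64))

n_in, n_hidden = 64, 16
W1 = rng.normal(0, 0.1, (n_in, n_hidden))   # encoder weights
W2 = rng.normal(0, 0.1, (n_hidden, n_in))   # decoder weights
lr = 0.01

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(50):
    # Encode and decode every patch; no labels are involved anywhere.
    H = sigmoid(X @ W1)          # hidden features
    X_hat = H @ W2               # reconstruction
    err = X_hat - X              # reconstruction error

    # Gradients of the mean squared reconstruction error (backpropagation).
    grad_W2 = H.T @ err / len(X)
    grad_H = err @ W2.T * H * (1 - H)
    grad_W1 = X.T @ grad_H / len(X)

    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

# After training, the columns of W1 are features the network discovered on its own.
print("reconstruction error:", float(np.mean(err ** 2)))
```

The Google network was vastly larger and layered, but the principle is the same: no one tells the system what a cat is; structure emerges from the data.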
 
In June, the company said it had used those neural network techniques to develop a new search service to help customers find specific photos more accurately.
 
The new approach, used in both hardware and software, is being driven by the explosion of scientific knowledge about the brain. Kwabena Boahen, a computer scientist who leads Stanford’s Brains in Silicon research program, said that is also its limitation, as scientists are far from fully understanding how brains function.
 
“We have no clue,” he said. “I’m an engineer, and I build things. There are these highfalutin theories, but give me one that will let me build something.”
 
Until now, the design of computers was dictated by ideas originated by the mathematician John von Neumann about 65 years ago. Microprocessors perform operations at lightning speed, following instructions programmed using long strings of 1s and 0s. They generally store that information separately in what is known, colloquially, as memory, either in the processor itself, in adjacent storage chips or in higher capacity magnetic disk drives.
 
The data — for instance, temperatures for a climate model or letters for word processing — are shuttled in and out of the processor’s short-term memory while the computer carries out the programmed action. The result is then moved to its main memory.
 
The new processors consist of electronic components that can be connected by wires that mimic biological synapses. Because they are based on large groups of neuron-like elements, they are known as neuromorphic processors, a term credited to the California Institute of Technology physicist Carver Mead, who pioneered the concept in the late 1980s.
 
They are not “programmed.” Rather the connections between the circuits are “weighted” according to correlations in data that the processor has already “learned.” Those weights are then altered as data flows into the chip, causing the elements to change their values and to “spike.” That generates a signal that travels to other components and, in reaction, changes the neural network, in essence programming the next actions much the same way that information alters human thoughts and actions.
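
To make that mechanism concrete, here is a minimal sketch, assuming a single leaky integrate-and-fire style "neuron" with a Hebbian-style weight update; every parameter is invented for illustration, and real neuromorphic chips implement this in hardware rather than in Python:

```python
# Minimal sketch of the idea above: a "neuron" integrates weighted inputs,
# fires a spike when it crosses a threshold, and its synaptic weights are
# nudged whenever an input coincides with a spike (a Hebbian-style update).
import numpy as np

rng = np.random.default_rng(1)

n_inputs = 8
weights = rng.random(n_inputs) * 0.2   # "synaptic" weights, learned rather than programmed
potential = 0.0                        # membrane potential of the neuron
threshold = 1.0
leak = 0.9                             # potential decays between time steps
lr = 0.05

for t in range(200):
    inputs = (rng.random(n_inputs) < 0.3).astype(float)  # incoming spikes (0/1)
    potential = leak * potential + weights @ inputs

    if potential >= threshold:          # the neuron "spikes"
        potential = 0.0
        # Strengthen the synapses that were active just before the spike,
        # so the circuit reweights itself from the data it sees.
        weights += lr * inputs
        weights = np.clip(weights, 0.0, 1.0)

print("learned weights:", np.round(weights, 2))
```

The point of the sketch is only that nothing is programmed in advance: the weights end up wherever the statistics of the incoming data push them.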
 
“Instead of bringing data to computation as we do today, we can now bring computation to data,” said Dharmendra Modha, an I.B.M. computer scientist who leads the company’s cognitive computing research effort. “Sensors become the computer, and it opens up a new way to use computer chips that can be everywhere.”
 
The new computers, which are still based on silicon chips, will not replace today’s computers, but will augment them, at least for now. Many computer designers see them as coprocessors, meaning they can work in tandem with other circuits that can be embedded in smartphones and in the giant centralized computers that make up the cloud. Modern computers already consist of a variety of coprocessors that perform specialized tasks, like producing graphics on your cellphone and converting visual, audio and other data for your laptop.
 
One great advantage of the new approach is its ability to tolerate glitches. Traditional computers are precise, but they cannot work around the failure of even a single transistor. With the biological designs, the algorithms are ever changing, allowing the system to continuously adapt and work around failures to complete tasks.
 
Traditional computers are also remarkably energy inefficient, especially when compared to actual brains, which the new neurons are built to mimic.
 
I.B.M. announced last year that it had built a supercomputer simulation of the brain that encompassed roughly 10 billion neurons — more than 10 percent of a human brain. It ran about 1,500 times more slowly than an actual brain. Further, it required several megawatts of power, compared with just 20 watts of power used by the biological brain.
 
Running the program, known as Compass, which attempts to simulate a brain, at the speed of a human brain would require a flow of electricity in a conventional computer that is equivalent to what is needed to power both San Francisco and New York, Dr. Modha said.
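
As a rough back-of-envelope reading of those figures (the 4-megawatt value below is an illustrative stand-in for "several megawatts"; nothing here comes from IBM beyond the numbers quoted above):

```python
# Back-of-envelope sketch of the power comparison described above.
slowdown = 1500          # Compass ran ~1,500x slower than a real brain
sim_power_mw = 4.0       # "several megawatts" -- assume ~4 MW for illustration
brain_power_w = 20       # power drawn by a biological brain, per the article

naive_realtime_mw = slowdown * sim_power_mw
print(f"naive real-time estimate: {naive_realtime_mw / 1000:.0f} GW")
print(f"efficiency gap vs. a brain: {naive_realtime_mw * 1e6 / brain_power_w:.1e}x")
```

A few gigawatts is indeed city-scale electricity demand, which is the comparison Dr. Modha draws.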
 
I.B.M. and Qualcomm, as well as the Stanford research team, have already designed neuromorphic processors, and Qualcomm has said that it is coming out in 2014 with a commercial version, which is expected to be used largely for further development. Moreover, many universities are now focused on this new style of computing. This fall the National Science Foundation financed the Center for Brains, Minds and Machines, a new research center based at the Massachusetts Institute of Technology, with Harvard and Cornell.
 
The largest class on campus this fall at Stanford was a graduate-level machine-learning course covering both statistical and biological approaches, taught by the computer scientist Andrew Ng. More than 760 students enrolled. “That reflects the zeitgeist,” said Terry Sejnowski, a computational neuroscientist at the Salk Institute, who pioneered early biologically inspired algorithms. “Everyone knows there is something big happening, and they’re trying to find out what it is.”

The Act 13 decision unravels protections


Our industry has worked, and continues to work, closely toward shared goals with the communities in which we operate to be good neighbors and stewards of our environment. And the outcomes are clear. The federal Environmental Protection Agency confirmed in October that U.S. carbon emissions are at their lowest since 1994, thanks to expanded natural gas use. The independent State Review of Oil & Natural Gas Environmental Regulations again gave Pennsylvania’s oil and gas regulatory framework high marks — noting that regulations were strengthened significantly through Act 13’s enhancements.

Consumers are winning, too. A recent Boston Consulting Group analysis projects average U.S. household savings tied to shale development to reach $1,200. And in addition to more than $1.8 billion in state taxes since 2008, our industry has paid $406 million in impact fees over the last two years to communities across the commonwealth. Unfortunately, with this ruling, these shared benefits may unnecessarily be limited rather than maximized.

If we are to remain competitive and if our focus is truly an improved environment, more job creation and economic prosperity, we must continue to work together toward common-sense proposals that encourage investment in the commonwealth.
DAVE SPIGELMYER
President
Marcellus Shale Coalition
North Fayette


Read more: http://www.post-gazette.com/opinion/letters/2013/12/29/The-Act-13-decision-unravels-protections/stories/201312290024

GMOs May Feed the World Using Fewer Pesticides (and Bring Back the Bees)

De Jong produced the plants in the same old, laborious way that his father did before him. He collected pollen from a plant that produces potatoes that fry as potato chips should and then sprinkled the pollen on the flower of a potato plant that resists viruses. If the resulting potatoes bear their parents’ finest features—and none of the bad ones—De Jong will bury them in the ground next year and test their mettle against a common potato virus. If they survive—and are good for frying and eating—he and his team will repeat this for 13 years to ensure that problematic genes did not creep in during the initial cross.

Walter De Jong, a second-generation potato breeder, walks the fields in upstate New York.

Each year, the chance of failure is high. Potatoes that resist viruses, for example, often have genes that make them taste bitter. Others turn an unappetizing shade of brown when fried. If anything like that happens, De Jong will have to start from scratch. Tedious as it is, he loves the work. Kicking up dirt in the furrows that cascade along the hillsides of upstate New York, he says, “I’m never stressed in the potato fields.”

De Jong has some serious cred in the agriculture world. Not only was his father a potato breeder, he’s also descended from a long line of farmers. The potato farmers he works with appreciate this deeply, along with his commitment to the age-old craft of producing new potato varieties through selective breeding. They even advocated on his behalf during his hiring and when he was up for tenure at Cornell, a school with a long history of agriculture research. “All of our farmers like Walter,” says Melanie Wickham, the executive secretary of the Empire State Growers organization in Stanley, New York. Often, he’s in the fields in a big hat, she says. Other times “you’ll see him in the grocery store, looking over the potatoes.”

De Jong has been working with farmers long enough to know that our food supply is never more than a step ahead of devastating insect infestations and disease. Selective breeders like De Jong work hard to develop resistant crops, but farmers still have to turn to chemical pesticides, some of which are toxic to human health and the environment. De Jong enjoys dabbing pollen from plant-to-plant the old-fashioned way, but he knows that selective breeding can only do so much.

Seedlings from De Jong's selective breeding experiments poke their shoots above the soil in a greenhouse.
 
So while De Jong still devotes most of his time to honing his craft, he has recently begun to experiment in an entirely different way, with genetic engineering. To him, genetic engineering represents a far more exact way to produce new varieties, rather than simply scrambling the potato genome’s 39,000 genes the way traditional breeding does. By inserting a specific fungus-defeating gene into a tasty potato, for example, De Jong knows he could offer farmers a product that requires fewer pesticides.

“We want to make food production truly sustainable,” De Jong says, “and right now I cannot pretend that it is.”

The need to protect crops from ruin grows more vital every day. By 2050, farmers must produce 40% more food to feed an estimated 9 billion people on the planet. Either current yields will have to increase or farmland will expand farther into forests and jungles. In some cases, genetically modified organisms (GMOs) would offer an alternative way to boost yields without sacrificing more land or using more pesticides, De Jong says. But he fears this approach won’t blossom if the public rejects GMOs out of hand.

When It All Began

In the late 1990s, the agriculture corporation Monsanto began to sell corn engineered to include a protein from the bacterium Bacillus thuringiensis, better known as Bt. The bacterium wasn’t new to agriculture—organic farmers spray it on their crops to kill certain insects. Today more than 60% of the corn grown within the United States is Bt corn. Farmers have adopted it in droves because it saves them money that they would otherwise spend on insecticide and the fuel and labor needed to apply it.
They also earn more money for an acre of Bt corn compared with a conventional variety because fewer kernels are damaged. Between 1996 and 2011, Bt corn reduced insecticide use in corn production by 45% worldwide (110 million pounds, or roughly the equivalent of 20,000 Olympic swimming pools).

Between 1996 and 2001, Monsanto also produced Bt potato plants. Farmers like Duane Grant of Grant 4D Farms in Rupert, Idaho, welcomed the new variety. Grant grew up on his family’s farm, and his distaste for insecticides started at a young age. As a teenager, he recalls feeling so nauseous and fatigued after spraying the fields that he could hardly move until the next day. Today, pesticides are safer than those used 40 years ago, and stiffer U.S. federal regulations require that employees take more precautionary measures when applying them, but Grant occasionally tells his workers to head home early when they feel dizzy after spraying the fields. He was relieved when GM potatoes were introduced because he didn’t have to spray them with insecticide. He was warned that pests might overcome the modification in 15 to 20 years, but that didn’t deter him—he says the same thing happens with chemicals, too.

Unlike Bt corn, you can’t find any fields planted today with Bt potatoes. Soon after the breed hit the market, protestors began to single out McDonald’s restaurants, which collectively are the biggest buyer of potatoes worldwide. In response, McDonald’s, Wendy’s, and Frito-Lay stopped purchasing GM potatoes. In 2001, Monsanto dropped the product and Grant returned to conventional potatoes and the handful of insecticides he sprays on them throughout the summer.

“There is not a single documented case of anyone being hurt by genetically modified food, and yet this is a bigger problem for people than pesticides, which we know have caused harm,” he says. “I just shake my head in bewilderment at the folks who take these stringent positions that biotech should be banned.”

In the decade after Monsanto pulled its GM potatoes from the market, dozens of long-term animal feeding studies concluded that various GM crops were as safe as traditional varieties. Statements from science policy bodies, such as those issued by the American Association for the Advancement of Science, the U.S. National Academy of Sciences, the World Health Organization, and the European Commission, uphold that conclusion. In addition, techniques to tweak genomes have become remarkably precise. Specific genes can be switched off without lodging foreign material into a plant’s genome. Scientists don’t necessarily have to mix disparate organisms with one another, either. In cisgenic engineering, organisms are engineered by transferring genes between individuals that could breed naturally.

Even some organic farmers bristle when asked about the anti-GMO movement. Under the U.S. Organic Foods Production Act, they are not allowed to grow GMOs, despite their ability to reduce pesticide applications. Organic farmers still spray their crops, just with different chemicals, such as sulfur and copper. Amy Hepworth, an organic farmer at Hepworth Farms in Milton, New York, says that they, too, can take a toll on the environment.

Hepworth would like to continuously evaluate new avenues towards sustainable agriculture as technology advances. However, her views often clash with her customers’ in the affluent Brooklyn, New York, neighborhood of Park Slope. Many of them see no benefit in GMOs’ ability to reduce pesticides because they say farmers should rely strictly on traditional farming methods.

“What people don’t understand is that without pesticides there is not enough food for the masses,” Hepworth says. “The fact is that GM is a tool that can help us use less pesticide.”

Delayed-choice quantum eraser

From Wikipedia, the free encyclopedia: https://en.wikipedia.org/wiki/Delayed-choice_quantum_eraser
A delayed-cho...