
Sunday, December 8, 2013

Physicists Eye Quantum-Gravity Interface

 
Gravity curves space and time around massive objects. What happens when such objects are put in quantum superpositions, causing space-time to curve in two different ways?
Courtesy of Dirk Bouwmeester
It starts like a textbook physics experiment, with a ball attached to a spring. If a photon strikes the ball, the impact sets it oscillating very gently. But there’s a catch. Before reaching the ball, the photon encounters a half-silvered mirror, which reflects half of the light that strikes it and allows the other half to pass through.
What happens next depends on which of two extremely well-tested but conflicting theories is correct: quantum mechanics or Einstein’s theory of general relativity; these describe the small- and large-scale properties of the universe, respectively.
In a strange quantum mechanical effect called “superposition,” the photon simultaneously passes through and reflects backward off the mirror; it then both strikes and doesn’t strike the ball. If quantum mechanics works at the macroscopic level, then the ball will both begin oscillating and stay still, entering a superposition of the two states. Because the ball has mass, its gravitational field will also split into a superposition.
But according to general relativity, gravity warps space and time around the ball. The theory cannot tolerate space and time warping in two different ways, which could destabilize the superposition, forcing the ball to adopt one state or the other.
Knowing what happens to the ball could help physicists resolve the conflict between quantum mechanics and general relativity. But such experiments have long been considered infeasible: Only photon-size entities can be put in quantum superpositions, and only ball-size objects have detectable gravitational fields. Quantum mechanics and general relativity dominate in disparate domains, and they seem to converge only in enormously dense, quantum-size black holes. In the laboratory, as the physicist Freeman Dyson wrote in 2004, “any differences between their predictions are physically undetectable.”
In the past two years, that widely held view has begun to change. With the help of new precision instruments and clever approaches for indirectly probing imperceptible effects, experimentalists are now taking steps toward investigating the interface between quantum mechanics and general relativity in tests like the one with the photon and the ball. The new experimental possibilities are revitalizing the 80-year-old quest for a theory of quantum gravity.
“The biggest single problem of all of physics is how to reconcile gravity and quantum mechanics,” said Philip Stamp, a theoretical physicist at the University of British Columbia. “All of a sudden, it’s clear there is a target.”
Theorists are thinking through how the experiments might play out, and what each outcome would mean for a more complete theory merging quantum mechanics and general relativity. “Neither of them has ever failed,” Stamp said. “They’re incompatible. If experiments can get to grips with that conflict, that’s a big deal.”
Quantum Nature
At the quantum scale, rather than being “here” or “there” as balls tend to be, elementary particles have a certain probability of existing in each of the locations. These probabilities are like the peaks of a wave that often extends through space. When a photon encounters two adjacent slits on a screen, for example, it has a 50-50 chance of passing through either of them. The probability peaks associated with its two paths meet on the far side of the screen, creating interference fringes of light and dark. These fringes prove that the photon existed in a superposition of both trajectories.
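The two-path arithmetic is easy to make concrete. Below is a minimal numerical sketch (my own illustration, not from the article; the wavelength, slit separation, and screen distance are arbitrary choices) that adds the complex amplitudes for the two paths and squares the magnitude of the sum. The alternating peaks and near-zeros are the interference fringes described above.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the article)
wavelength = 500e-9      # photon wavelength in meters (green light)
slit_separation = 20e-6  # distance between the two slits, in meters
screen_distance = 1.0    # distance from slits to screen, in meters

# Positions along the detection screen
x = np.linspace(-0.05, 0.05, 2001)

# Path length from each slit to each point on the screen
r1 = np.sqrt(screen_distance**2 + (x - slit_separation / 2) ** 2)
r2 = np.sqrt(screen_distance**2 + (x + slit_separation / 2) ** 2)

# Superposition: the photon's total amplitude is the SUM over both paths
k = 2 * np.pi / wavelength
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)

# Detection probability = |amplitude|^2, giving bright and dark fringes
probability = np.abs(amplitude) ** 2
print(f"brightest fringe: {probability.max():.2f}, darkest: {probability.min():.2f}")
```

Removing either path (by detecting which slit the photon used) leaves a featureless single-slit pattern; the fringes exist only while both possibilities remain in superposition.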
But quantum superpositions are delicate. The moment a particle in a superposition interacts with the environment, it appears to collapse into a definite state of “here” or “there.” Modern theory and experiments suggest that this effect, called environmental decoherence, occurs because the superposition leaks out and envelops whatever the particle encountered. Once leaked, the superposition quickly expands to include the physicist trying to study it, or the engineer attempting to harness it to build a quantum computer. From the inside, only one of the many superposed versions of reality is perceptible.
A single photon is easy to keep in a superposition. Massive objects like a ball on a spring, however, “become exponentially sensitive to environmental disturbances,” explained Gerard Milburn, director of the Center for Engineered Quantum Systems at the University of Queensland in Australia. “The chances of any one of their particles getting disturbed by a random kick from the environment is extremely high.”
Because of environmental decoherence, the idea of probing quantum superpositions of massive objects in tabletop experiments seemed for decades to be dead in the water. “The problem is getting the isolation, making sure no disturbances come along other than gravity,” Milburn said. But the prospects have dramatically improved.
Dirk Bouwmeester, an experimental physicist who splits his time between the University of California, Santa Barbara, and Leiden University in the Netherlands, has developed a setup much like the photon-and-ball experiment, but replacing the ball on its spring with an object called an optomechanical oscillator — essentially a tiny mirror on a springboard. The goal is to put the oscillator in a quantum superposition of two vibration modes, and then see whether gravity destabilizes the superposition.
Ten years ago, the best optomechanical oscillators of the kind required for Bouwmeester’s experiment could wiggle back and forth 100,000 times without stopping. But that wasn’t long enough for the effects of gravity to kick in. Now, improved oscillators can wiggle one million times, which Bouwmeester calculates is close to what he needs in order to see, or rule out, decoherence caused by gravity. “Within three to five years, we will prove quantum superpositions of this mirror,” he said. After that, he and his team must reduce the environmental disturbances on the oscillator until it is sensitive to the impact of a single photon. “It’s going to work,” he insists.
Courtesy of Markus Aspelmeyer
Markus Aspelmeyer, a quantum physicist at the University of Vienna, is developing three experiments aimed at probing the interface between quantum mechanics and gravity.
Markus Aspelmeyer, a professor of physics at the University of Vienna, is equally optimistic. His group is developing three separate experiments at the quantum-gravity interface — two for the lab and one for an orbiting satellite. In the space-based experiment, a nanosphere will be cooled to its lowest energy state of motion, and a laser pulse will put the nanosphere in a quantum superposition of two locations, setting up a situation much like a double-slit experiment. The nanosphere will behave like a wave with two interfering peaks as it moves toward a detector. Each nanosphere can be detected in only a single location, but after multiple repetitions of the experiment, interference fringes will appear in the distribution of the nanospheres’ locations. If gravity destroys superpositions, the fringes won’t appear for nanospheres that are too massive.
The group is designing a similar experiment for Earth’s surface, but it will have to wait. At present, the nanospheres cannot be cooled enough, and they fall too quickly under Earth’s gravity, for the test to work. But “it turns out that optical platforms on satellites actually already meet the requirements that we need for our experiments,” said Aspelmeyer, who is collaborating with the European Aeronautic Defense and Space Company in Germany. His team recently demonstrated a key technical step required for the experiment. If it gets off the ground and goes as planned, it will reveal the relationship between the mass of the nanospheres and decoherence, pitting gravity against quantum mechanics.
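For a feel for what such data could reveal, here is a toy model, entirely my own construction: the fringe contrast is damped by a hypothetical decoherence factor that grows with the nanosphere's mass (the Gaussian form and the mass scale are placeholders, not physics from the proposal). If gravity collapses superpositions, heavier spheres would show progressively flatter detection patterns.

```python
import numpy as np

def fringe_pattern(x, mass, mass_scale=1.0):
    """Toy detection probability across the screen for a nanosphere of
    the given mass. The exp(-(m/m0)^2) damping stands in for a
    hypothetical mass-dependent (gravitational) decoherence factor."""
    visibility = np.exp(-((mass / mass_scale) ** 2))
    return 0.5 * (1.0 + visibility * np.cos(x))

x = np.linspace(0.0, 2.0 * np.pi, 5)   # a few points across one fringe
for mass in (0.1, 1.0, 3.0):           # in units of the toy mass scale
    p = fringe_pattern(x, mass)
    contrast = (p.max() - p.min()) / (p.max() + p.min())
    print(f"mass = {mass:3.1f} -> fringe contrast {contrast:.4f}")
```

In ordinary quantum mechanics the contrast stays near one for any mass, given perfect isolation; a mass threshold where it collapses would be the experiment's smoking gun.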
The researchers laid out another terrestrial experiment last spring in Nature Physics. Many proposed quantum gravity theories involve modifications to Heisenberg’s uncertainty principle, a cornerstone of quantum mechanics that says it isn’t possible to precisely measure both the position and momentum of an object at the same time. Any deviation from Heisenberg’s formula should show up in the position-momentum uncertainty of an optomechanical oscillator, because it is affected by gravity. The uncertainty itself is immeasurably small — a blurriness just a 100-million-trillionth the width of a proton — but Igor Pikovski, a theorist in Aspelmeyer’s group, has discovered a backdoor route to detecting it. Pikovski claims that when a light pulse strikes the oscillator, the pulse’s phase (the position of its peaks and troughs) will undergo a discernible shift that depends on the uncertainty. Deviations from the predictions of traditional quantum mechanics could be experimental evidence of quantum gravity.
Aspelmeyer’s group has started to realize the first experimental steps. Pikovski’s idea “provides us with a quite, I have to admit, unexpected improvement in performance,” Aspelmeyer said. “We are all a little surprised, actually.”
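To see why such a deviation is so hard to measure directly, consider the class of "generalized uncertainty principle" models often used in this context, which deform the canonical commutator to [x, p] = iħ(1 + βp²). The sketch below is my own illustration with placeholder numbers; Pikovski's actual scheme infers the deviation from the optical phase rather than computing it this way.

```python
hbar = 1.054571817e-34  # Planck's constant over 2*pi, in J*s

def min_position_uncertainty(delta_p, beta):
    """Lower bound on delta_x implied by the deformed commutator
    [x, p] = i*hbar*(1 + beta*p^2), a common quantum-gravity ansatz.
    Standard quantum mechanics is recovered at beta = 0."""
    return (hbar / (2.0 * delta_p)) * (1.0 + beta * delta_p**2)

delta_p = 1e-20   # kg*m/s, an illustrative momentum spread
beta = 1e7        # deformation parameter, a pure placeholder

standard = min_position_uncertainty(delta_p, beta=0.0)
deformed = min_position_uncertainty(delta_p, beta=beta)
print(f"standard bound  : {standard:.3e} m")
print(f"deformed bound  : {deformed:.3e} m")
print(f"fractional shift: {deformed / standard - 1.0:.1e}")
```

The fractional shift is fantastically small, which is exactly why an amplifying trick like Pikovski's phase measurement is needed.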
The Showdown
Many physicists expect quantum theory to prevail. They believe the ball on a spring should, in principle, be able to exist in two places at once, just as a photon can. The ball’s gravitational field should be able to interfere with itself in a quantum superposition, just as the photon’s electromagnetic field does. “I don’t see why these concepts of quantum theory that have proven to be right for the case of light should fail for the case of gravity,” Aspelmeyer said.
But the incompatibility of general relativity and quantum mechanics itself suggests that gravity might behave differently. One compelling idea is that gravity could act as a sort of inescapable background noise that collapses superpositions.
“While you can get rid of air molecules and electromagnetic radiation, you can’t screen out gravity,” said Miles Blencowe, a professor of physics at Dartmouth College. “My view is that gravity is sort of like the fundamental, unavoidable, last-resort environment.”
Rendering of an optomechanical oscillator.
Christopher Baker and Ivan Favero at Université Paris Diderot-CNRS
In an optomechanical oscillator, the light confined between two mirrors causes one of the mirrors to oscillate on a spring. Experimentalists plan to use such devices to pit quantum mechanics against general relativity.
The background-noise idea was conceived in the 1980s and 1990s by Lajos Diósi of the Wigner Research Center for Physics in Hungary and, separately, by Roger Penrose of Oxford University. According to Penrose’s model, a discrepancy in the curvature of space and time could accumulate during a superposition, eventually destroying it. The more massive or energetic the object involved and, thus, the larger its gravitational field, the more quickly “gravitational decoherence” would happen. The space-time discrepancy ultimately results in an irreducible level of noise in the position and momentum of particles, consistent with the uncertainty principle.
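Penrose's proposal comes with a concrete timescale: the superposition should survive for roughly τ ≈ ħ/E_G, where E_G is the gravitational self-energy of the difference between the two superposed mass distributions. The order-of-magnitude estimate below is my own sketch, crudely approximating E_G as Gm²/R for an object of mass m and size R displaced by about its own diameter, and ignoring geometry factors of order one.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J*s

def penrose_lifetime(mass_kg, radius_m):
    """Order-of-magnitude Penrose decoherence time tau ~ hbar / E_G,
    with E_G ~ G m^2 / R for two branches displaced by ~R."""
    e_grav = G * mass_kg**2 / radius_m
    return hbar / e_grav

# A ~100 nm, ~1 femtogram nanosphere (illustrative numbers)
print(f"nanosphere: {penrose_lifetime(1e-18, 1e-7):.1e} s")
# A 1 kg, 5 cm ball: gone almost instantly on this estimate
print(f"1 kg ball : {penrose_lifetime(1.0, 0.05):.1e} s")
```

On these crude numbers a nanosphere's superposition would persist for a day or more while a ball's would vanish in about 10⁻²⁵ seconds, which is why experiments aim at the mesoscopic middle ground.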
“That would be a wonderful result if the ultimate reason for the uncertainty principle and the puzzling features of quantum physics are due to some quantum effects of space and time,” Milburn said.
Inspired by the possibility of experimental tests, Milburn and other theorists are expanding on Diósi and Penrose’s basic idea. In a July paper in Physical Review Letters, Blencowe derived an equation for the rate of gravitational decoherence by modeling gravity as a kind of ambient radiation. His equation contains a quantity called the Planck energy, which corresponds to the mass-energy of the smallest possible black hole. “When we see the Planck energy we think quantum gravity,” he said. “So it may be that this calculation is touching on elements of this undiscovered theory of quantum gravity, and if we had one, it would show us that gravity is fundamentally different than other forms of decoherence.”
Stamp is developing what he calls a “correlated path theory” of quantum gravity that pinpoints a possible mathematical mechanism for gravitational decoherence. In traditional quantum mechanics, probabilities of future outcomes are calculated by independently summing the various paths a particle can take, such as its simultaneous trajectories through both slits on a screen. Stamp found that when gravity is included in the calculations, the paths connect. “Gravity basically is the interaction that allows communication between the different paths,” he said. The correlation between paths results once more in decoherence. “No adjustable parameters,” he said. “No wiggle room. These predictions are absolutely definite.”
At meetings and workshops, theorists and experimentalists are working closely to coordinate the various proposals and plans for testing them. They say it’s a mutually motivating situation.
“In the final showdown between quantum mechanics and gravity, our understanding of space and time will be completely changed,” Milburn said. “We’re hoping these experiments will lead the way.”
This article was reprinted on ScientificAmerican.com.

From Time One: Discover How the Universe Began

It starts like a textbook physics experiment, with a ball attached to a spring. If a photon strikes the ball, the impact sets it oscillating very gently. But there’s a catch. Before reaching the ball, the photon encounters a half-silvered mirror, which reflects half of the light that strikes it and allows the other half to pass through. http://ow.ly/ruwgN

Saturday, December 7, 2013

Billions and Billions



I wasn't really a Cosmos fan, but I found Sagan's mind and writings remarkable. If you were a fan, you'll probably enjoy this:

http://www.youtube.com/watch?v=HZmafy_v8g8&feature=youtu.be

Supernovae May Drive Evolution on Earth

Posted on December 7, 2013 at 6:00 am
Image credit: ESO

On Earth, we have an almost incomprehensible array of life. It comes in millions of different forms (the best estimate puts the figure at 8.7 million species, not counting bacteria). What’s more, these organisms are only an addition to the species that have long been extinct. What causes such diversity?

The answer seems rather simple — seemingly random genetic mutations drive evolution. These mutations are the raw materials of genetic variation; without them, evolution could not occur. But what actually drives these random mutations? Well, this is where things get a little complicated; however, new light has been shed on one possible factor – supernovae.


Cosmic rays are an assortment of sub-atomic particles that reach Earth travelling at great speeds (sometimes near the speed of light). These fast-moving particles continuously bombard the Earth, and they are thought to come primarily from supernova explosions.

As these cosmic rays reach our atmosphere they collide with other molecules, producing a shower of other particles that rain down on the surface of Earth. Most of these pass harmlessly through an organism, but some researchers think that a few of the particles may strike the genetic material inside biological cells and slightly alter its code. This may produce a direct mutation in the living organism, or a mutation in any descendants it may have. If this theory is true, then cosmic ray particles would be one of the biggest drivers of evolution, not just on Earth — but everywhere in the Universe!

However, this unusual relationship between distant stellar explosions and evolution on Earth doesn’t end here. In the words of Henrik Svensmark, who heads the research into the relationship between supernovae and evolution, “The biosphere seems to contain a reflection of the sky, in that the evolution of life mirrors the evolution of the Galaxy.” The findings – based on geological and astronomical data – suggest that nearby supernovae have strongly influenced the development of life over the last 500 million years.

Svensmark began by studying open star clusters, where there is intense star formation and supernova activity. He was able to map when supernovae occurred near the solar system over time, and when he compared this with the geological record, he found a remarkable correlation. It seems that when the Sun passed through the spiral arms of the Milky Way, where large stars are most common, life appeared to prosper. Combined with tectonic activity, these two factors appear to correlate with nearly all of the variations in the diversity of life over the past 500 million years.

Marine fossils (typically invertebrates such as trilobites, as well as plants and microbes) are a very good indicator of what conditions were like, and of the diversity of life that existed at a certain point in time. When the rate of nearby supernovae is high, the level of carbon dioxide is low, which suggests that plant life was abundant enough to draw down the carbon dioxide. Plants also 'dislike' carbon-13 and leave it behind; this isotope can be seen in the geological record, and changes in its level provide further quantitative support for the theory.

The boundaries of particular geological periods also match this pattern, starting and ending with an increase or decrease in the supernova rate. Supernovae are thought to cause sea levels to drop, as they appear to coincide with 'ice ages' or glacial periods. During these times a lot of water is stored on the land as snow and ice, so the sea level falls (we call these changes glacial-eustatic). As a result, the species that dominate a certain period (be it warmer or colder) change as each one passes.

Overall, the data support the idea that cosmic rays are linked to climate change in the long term, and it is these climatic alterations that lead to the biological effects. The link is actually even stronger than that between our climate and our own Sun's activity! It goes to show the extent to which the Universe is intertwined; objects situated many light-years apart can still affect one another in extremely significant ways.

Gotcha! Photons Seen Without Being Destroyed in a First


 
     
 
Scientists have used a single atom trapped in an optical resonator to detect the presence of a reflected photon without destroying that packet of light.
CREDIT: MPQ, Quantum Dynamics Division.
If you want to see a packet of light called a photon, you have to destroy it. Any device that picks up on the presence of light has to absorb its energy, and with it, the photons. At least, that was what scientists thought until now.
At the Max Planck Institute of Quantum Optics in Germany, researchers found a way to detect single, visible-light photons without "touching" them and losing the photons themselves.
The work, detailed in the Nov. 14 issue of the journal Science Express, has important implications for quantum computing devices and communications. In an ordinary computer the presence of electrons — current — encodes the bits in logic circuits. Being able to keep photons around while still detecting them means photons could be used in a similar way.
Others have detected photons without destroying them, most notably Serge Haroche at the Collège de France in Paris, who won a Nobel Prize in 2012 for the achievement. However, he detected photons at microwave wavelengths. The Max Planck team detected visible-light photons, which are more useful for quantum communications.
Seeing photons
To see the photons, Stephan Ritter and his colleagues, Andreas Reiserer and Gerhard Rempe, trapped a single atom of rubidium in a cavity, just a half-millimeter across, with mirrors on the sides. The atom could be in either of two states. In one, it is in resonance, or "coupled," with the cavity — one can think of them as vibrating in time with each other. In the other, the atom is "out of tune" with both the cavity and the incoming photon. Atoms and subatomic particles are governed by the rules of quantum mechanics, which allowed the rubidium atom to be in both states at once.
They then fired laser pulses that, on average, contained less than a single photon. When a photon reached the cavity, it would either enter and be reflected straight back out, or it would simply bounce off the cavity without entering — which happened if the atom was coupled to the cavity. The key is that the state of the atom differs after each outcome. An ordinary detector, catching the photon after its reflection, confirmed that the photon had indeed returned from the cavity.
The photon didn't interact with the atom directly, but it did alter the atom's phase — the timing of its resonance with the cavity. The scientists could use the difference between the superposition state — when the atom is in two states at once — and the atom's measured phase to calculate whether or not the photon entered the cavity. In that way they "saw" the photon without destroying it, without touching it.
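The essential logic fits in a few lines of linear algebra. The following schematic is my simplification, not the group's analysis (the phase convention is chosen for clarity): prepare the atom in an equal superposition, apply a conditional π phase if a photon reflects, rotate back, and read out.

```python
import numpy as np

# Atom qubit basis states
ket0 = np.array([1.0, 0.0])   # e.g. the "coupled" atomic state
hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

def readout_probabilities(photon_reflected: bool) -> np.ndarray:
    """Schematic nondestructive detection: a reflected photon imprints
    a pi phase shift on one branch of the atom's superposition."""
    atom = hadamard @ ket0                     # (|0> + |1>) / sqrt(2)
    if photon_reflected:
        atom = atom * np.array([1.0, -1.0])    # conditional pi phase
    atom = hadamard @ atom                     # rotate back for readout
    return np.abs(atom) ** 2                   # measurement statistics

print("no photon:", readout_probabilities(False))   # -> always state 0
print("photon   :", readout_probabilities(True))    # -> always state 1
```

The photon is never absorbed; only the atom is measured, so the photon flies on to the ordinary detector for the second, confirming measurement.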
Photon qubits
Not "touching" the photon also means that certain quantum properties are never observed, preserving them. An unobserved photon can be in a "superposition" state — any one of its quantum properties, called degrees of freedom, can have more than one value simultaneously. Observing the photon forces it to be one or the other. For example, if a photon is polarized either horizontally or vertically, it's impossible to know which one until the photon is observed. In quantum mechanics that means the photon can be in both states, until it is measured and takes on a definite value. [How Quantum Entanglement Works (Infographic)]
This ability matters for quantum computing devices. Quantum computers are powerful because the bits in them, called qubits, can be both 1 and 0 at the same time, whereas an ordinary computer has to have its bits set at 1 or 0, one configuration at a time. Essentially, a quantum computer can be in many states simultaneously, speeding up calculations such as factoring large numbers.
If a photon is encoding the qubit, observing that photon directly would spoil its superposition state, and, thus, its ability to function as a qubit. But one might need to detect that the photon reached a certain place in the network. "Let's say you encode the qubit into the polarization," Ritter said. "The detection of the presence of a photon tells you nothing about its polarization."
By measuring the photon's state indirectly, however, it's possible to see the photon without destroying the quantum state (or the photon), and use different quantum states — such as polarization — to store qubits.
Going forward, Ritter says his group plans to work on boosting the efficiency of the detection – so far they can detect about 74 percent of the photons released. Stringing several detectors together would improve that — and one would end up with a detector that could pick up single photons better than those currently available.
Original article on LiveScience.

Noam Chomsky on the Freedom of Expression

by Big Think Editors      
December 7, 2013, 5:00 AM
"If we don't believe in freedom of expression for people we despise, we don't believe in it at all."
-Noam Chomsky (born on this date in 1928)
The EPA needs a Scientific Integrity Advocate? Please, someone, tell me why. Is this a backhanded way of labeling the agency lacking in scientific integrity?
 
Worse, if it does need one, why on Earth would it hire a longtime employee of the so-called Union of "Concerned" "Scientists"?  This organization, like Greenpeace and "Friends of the Earth", has a multi-decade history of extremist environmental activism, particularly with regard to energy development in the US.  If you question my harsh judgment of the UCS and similar organizations, note the following quote from Kevin Mooney of the Capital Research Center:
 
"Then there’s the so-called Union of Concerned Scientists (UCS), which is often quoted by the media as if it were a scientific, rather than political, organization. For one thing, UCS is in no sense an organization of scientists (unlike the EPA). Anyone willing to charge $35 on a credit card can join. One intrepid researcher even signed up his dog to drive the point home. The dog, Kenji, received a welcome kit and a signed letter from UCS President Kevin Knobloch."
 
The UCS's position on hydraulic fracturing is clearly intended to fog over the mountainous scientific evidence of its safety and benefits by befuddling local residents and officials with open-ended questions that would make fracking appear unsafe whatever the evidence.  The quote below, taken from the UCS's web page, Science, Democracy, and Fracking: A Guide for Community Residents and Policy Makers Facing Decisions over Hydraulic Fracturing (currently http://www.ucsusa.org/center-for-science-and-democracy/events/fracking-forum-toolkit.html), is a textbook example of this:
 
"Recent advances in hydraulic fracturing ( or “fracking”) technology leading to a rapid expansion in domestic oil and gas production.  The pace of growth is driving many communities to make decisions without access to comprehensive and reliable scientific information about the potential impacts of hydraulic fracturing on their local air and water quality, com­munity health, safety, economy, environment, and overall quality of life.
If you are an active citizen in a community facing decisions about fracking, this toolkit is for you. It provides practi­cal advice and resources to help you identify the critical questions to ask and get the scientific information you need when weighing the prospects and risks of shale oil or shale gas development in your region.
This toolkit can improve decision making on fracking by helping you to:
  • Identify critical issues about the potential impacts of fracking in your area, and how to obtain answers to your questions
  • Distinguish reliable information from misinformation or spin—and help your neighbors and local decision makers do the same
  • Identify and communicate with scientists, journalists, policy makers, and community groups that should be part of the public discussion
  • Identify and engage with the key actors in your community to influence oil and gas policy at the local and state level"
I suggest this passage, innocent on the surface, is about as deceitful a political tract as anyone could devise.  The fact is, anyone interested in the science supporting fracking -- and it is much more overwhelming than that supporting anthropogenic global warming -- can easily find it using a search engine or Wikipedia.  Talking with local people, even college scientists who will almost entirely have other specialties, is just going to make a clear situation confused, leading to irrational opposition based on unfounded fears.  I suggest that an open-minded person can only come to one conclusion, that of Katie Brown's summary of the situation (http://amedleyofpotpourri.blogspot.com/2013/12/report-environmentalists-opposing-shale.html):

"A report released today puts the folly of anti-fracking activism squarely in the spotlight. The report, authored primarily by University of California-Berkeley physics professor Richard Muller, comes to a sobering conclusion: “Environmentalists who oppose the development of shale gas and fracking are making a tragic mistake.”"

Nevertheless, here it is:
Integral player. Francesca Grifo, here testifying before a congressional panel earlier this year, has been named to lead the U.S. Environmental Protection Agency’s efforts to implement policies designed to protect scientific integrity.
U.S. House of Representatives Committee on Science, Space and Technology/Democrats

For more than a decade, Francesca Grifo of the Union of Concerned Scientists (UCS) advocated for improving scientific integrity policies at government agencies. When she commented on a draft of the policy at the U.S. Environmental Protection Agency (EPA) in 2011, she wrote: “These are great principles but how will this happen? Who will monitor? Who will detect problems and enforce these strong words?”
Well, it turns out, she will. EPA announced today that it has hired Grifo to oversee its new policy on scientific integrity. “It’s great news,” says Rena Steinzor of the University of Maryland School of Law in Baltimore, who studies environmental regulation and the misuse of science in environmental policy.
Grifo is charged with overseeing the four main areas of EPA’s policy: creating and maintaining a culture of scientific integrity within the agency; communicating openly to the public; ensuring rigorous peer review; and encouraging the professional development of agency scientists.
It sounds like a gargantuan task, but Grifo won’t actually be checking the integrity of every committee, scientific document, and peer review. Instead, she will be focusing on improving the process, says Michael Halpern, her former colleague at UCS. Part of the job will be educating staff members. Last week, EPA launched an online training guide for its staff members to make them aware of the policy and its protections. “It’s a cultural change so that [EPA] scientists feel they can participate in public life and the scientific community,” Halpern says, and better prepare them to deal with political pressure.

If problems come to light, Grifo will help investigate. She will work with an internal Scientific Integrity Committee, as well as the inspector general. Her job is not a political appointment, so it comes with civil service protections. She will report to Glenn Paulson, EPA’s science adviser. Grifo will also issue an annual report about any incidents with scientific integrity at the agency.
UCS has ranked EPA’s policy, which was finalized about a year and a half ago, as one of the stronger ones in the U.S. government. Unlike most other agencies, EPA’s plan called for a full-time position. “While strong improvements have been made on paper, we recognize that the agency is challenged in fully realizing those improvements,” Halpern wrote in a blog post.
Jeff Ruch of Public Employees for Environmental Responsibility in Washington, D.C., says he is hopeful for progress. “She is coming from an organization that is probably responsible for the adoption of scientific integrity policies,” he says. “We think that these policies are potentially revolutionary. But progress has been slow and uneven.” It’s not clear, he says, what power she would have to bring relief in individual cases.

Hillary Clinton Touts Benefits of Oil, Natural Gas

    

Oftentimes in media reports, the subject of natural gas development may come off as a bit of a partisan issue. Here in Pennsylvania, Democrats like State Senator Jim Ferlo have even pushed for a moratorium. Nationally, however, the topic is far from partisan – Democratic governors like Pat Quinn in Illinois, John Hickenlooper in Colorado, and even Jerry Brown in California have rejected the claims of anti-fracking activists and openly discussed the benefits of development.
Add to that list former Secretary of State Hillary Clinton, who showed during a speech at Hamilton College in New York on Friday that responsible development is something we can and indeed should embrace.
From the Democrat and Chronicle:
Late into the lecture portion of Clinton’s Oneida County appearance, she referenced a report that the U.S. is on track to surpass Russia in domestic oil-and-gas production.
That’s good news, Clinton said.
“What that means for viable manufacturing and industrialization in this country is enormous,” she said to the crowd of 5,800 in Hamilton’s athletic field house.
As IHS highlighted last month, shale is helping to transform the U.S. economy, and bringing manufacturing back to America after a decade of decline.
EID has also been following the U.S. and Russian oil and gas production race for the past year, and the United States has likely surpassed Russia to become the largest oil and gas producing country in the world – thanks in large part to shale development.
Needless to say, the former U.S. Senator from New York was spot on with her comments on the implications this can have for U.S. economic growth, as well as in other markets.
As many of you will remember, the State Department under Mrs. Clinton’s leadership actually promoted Gasland as part of its Annual Film Showcase. It’s good to see that she has evolved in her view of responsible oil and gas development.

Report: Environmentalists Opposing Shale Gas Are Making a ‘Tragic Mistake’

    
A report released today puts the folly of anti-fracking activism squarely in the spotlight. The report, authored primarily by University of California-Berkeley physics professor Richard Muller, comes to a sobering conclusion: “Environmentalists who oppose the development of shale gas and fracking are making a tragic mistake.”
The reason is that natural gas provides a solution for two major worldwide environmental concerns: air pollution and greenhouse gas emissions. For its ability to provide an affordable energy source that can also address these problems, the authors conclude that “shale gas is a wonderful gift that has arrived just in time.”
The report focuses heavily on the local air quality benefits of shale gas, which could be especially effective in places like China that have rapidly growing economies, and by extension a great need for affordable and abundant energy.  As the report notes, shale gas “provides a solution to the pollution,” observing it’s “amazing” that local air quality benefits are “not more widely addressed by environmentalists.”
The report focuses mostly on shale gas and local air quality, but the reduction in greenhouse gas emissions made possible by shale gas is also addressed. The authors find that “both global warming and air pollution can be mitigated by the development and utilization of shale gas,” owing to the fuel’s ability to reduce carbon dioxide emissions, as well as low methane leakage rates (more on that later).
In addition to firmly establishing the environmental benefits of natural gas, the report also addresses a number of anti-fracking activists’ objections to responsible shale development, concluding that they are not credible: “These concerns are either largely false or can be addressed by appropriate regulation.”
While the authors express some concern about the volumes of water required for hydraulic fracturing, they explain that “viable alternatives exist,” including the fact that “most of the water that flows back from the well can be treated and reused.”  As an example, the report points to Apache, a company that has made news recently for its dramatic reduction in water use:
“[T]hey [Apache] eliminated fresh water use in fracking operations in Irion County, Texas; this year they have used only recycled produced water from fracking operations and oil fields together with brackish water obtained from the Santa Rosa formation at 800 to 900 feet depth [Reuters 2013]. In all of Apache’s hydraulic fracturing operations in the Permian Basin, more than half the water is sourced from non-fresh water sources, about 900 wells” (p. 6-7).
Of course, as we’ve pointed out many times, water recycling is quickly becoming standard operating procedure. In Pennsylvania alone, Marcellus producers are now recycling 90 percent of their flowback water, and that’s a trend we’re increasingly seeing elsewhere across the country.
Regarding anti-fracking activists’ claims on flaming faucets and the fraud of Gasland, the authors offer a scathing but entirely justified critique:
“The famous ‘flaming faucets’ shown in the movie Gasland (and on YouTube) were not due to fracking, despite what that movie suggests. The accounts were investigated by state environmental agencies, and in every case traced to methane-saturated ground water produced by shallow bacteria. Indeed, the movie FrackNation includes a clip in which the Gasland producer, writer, and star Josh Fox admits that flaming faucets were common long before fracking was ever tried” (p. 7).
The report states that any risk of leakage is “not particularly linked to shale gas wells.” It’s also clear that whatever risks there may be to drinking water supplies, they are manageable: “The solution lies in regulating shale at least as stringently as conventional oil and gas,” the report states.
As for activists’ claim that methane emissions during development cancel out the climate benefits of natural gas, the report says that’s simply wrong:
“The initial scare of the danger of ‘fugitive’ (leaked) methane came from mistaken use of the fact that its ‘greenhouse potential’ is 83 times that of CO2, kilogram per kilogram. That makes it seem that even 1% leakage would undo its advantage over coal. But if you take into account the fact that methane is rapidly destroyed in the atmosphere (with a much shorter lifetime than CO2), then the potency is reduced to about 34 times. And the fact that methane weighs less (molecule per molecule) than CO2 means that leaked methane is only 12 times more potent for the same energy produced.  Because natural gas power plants are more efficient than those of coal, even with leakage rate of up to 17% (far higher than even the most pessimistic estimates), natural gas still provides a greenhouse gas improvement over coal for the same electricity produced” (p. 8).
This is yet another rebuke of the research of Cornell anti-fracking activist Anthony Ingraffea, who has bizarrely claimed that natural gas is a “gangplank” to irreversible global warming.  Dr. Muller has offered harsh criticism for Ingraffea’s work before (in the New York Times, no less), so it’s unsurprising that he identifies a series of flaws in the infamous Howarth/Ingraffea methane paper:
“However, Howarth’s original work made assumptions for parameters that were not directly measured, and many of these were ‘conservative estimates’ – which means prejudicial against natural gas. It took two years, but finally a calibrated study of 190 wells showed that the leakage from shale gas production averaged about 0.4%. [Allen, 2013; Hausfather & Muller 2013]. If we add in leakage in pipelines and storage, the maximum is still only 1.4%, and the greenhouse advantage over coal is large. A recent report by Miller et al. [2013] suggests the rate could be twice that; but even if this new report is more accurate than the EPA value, fugitive methane is still a vast greenhouse gas improvement compared to coal” (p. 8).
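The leakage arithmetic quoted above is easy to reproduce in rough form. In the toy model below, which is mine rather than the report's, leaked methane is charged at the quoted 12-times-CO2 per-energy potency, and the plant emission factors are round numbers I have assumed; the break-even leak rate is quite sensitive to those assumptions, which is one reason published estimates disagree.

```python
CO2_COAL = 1.0        # kg CO2 per kWh for a typical coal plant (assumed)
CO2_GAS = 0.45        # kg CO2 per kWh for a typical gas plant (assumed)
METHANE_POTENCY = 12  # report's per-energy CO2 multiplier for leaked gas

def gas_co2_equivalent(leak_fraction: float) -> float:
    """CO2-equivalent per kWh of gas power when a fraction of the
    produced methane escapes unburned, charged at METHANE_POTENCY
    times the CO2 that the same gas would emit if burned."""
    leak_penalty = leak_fraction / (1.0 - leak_fraction)
    return CO2_GAS * (1.0 + METHANE_POTENCY * leak_penalty)

# Leak-rate scenarios from the passage: 0.4% at the wells, 1.4% with
# pipelines and storage, and ~2.8% (twice that, per Miller et al.)
for leak in (0.004, 0.014, 0.028):
    print(f"leakage {leak:5.1%}: {gas_co2_equivalent(leak):.2f} kg CO2e/kWh"
          f"  (coal ~ {CO2_COAL:.2f})")
```

Under these placeholder factors gas stays well below coal at all three quoted leak rates, consistent with the report's qualitative conclusion.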
On seismicity, the Muller report notes, “No large earthquakes have been associated with fracking but rather with ‘disposal wells’.” Further, such seismic activity from wastewater disposal is already being mitigated by the surge in water recycling: “We can prevent disposal earthquakes by recycling water to minimize injection volumes and by taking care in the choice of disposal well locations.”
Finally, the report offers yet another blow to anti-fracking activists who contend that natural gas will somehow ‘crowd out’ renewables:
“Yet cheap natural gas can also make it easier for solar and wind energy to further penetrate electricity markets by providing the rapid back-up that those intermittent sources require. In addition, natural gas is the only base load fuel that can be downscaled into microgrids and distributed generation networks to provide that same flexibility and reliability for solar energy on rooftops and in buildings, expanding the market for urban solar systems. Particularly for areas focusing on distributed generation, natural gas can be an enabler of wind and solar” (p. 9-10).
To sum up: the report finds that shale gas “can be the solution” for addressing air pollution and reducing greenhouse gas emissions, and the numerous objections to fracking put forth by activists are simply not credible.  As the report puts it, environmental groups should “recognize the shale gas revolution as beneficial to society – and lend their full support to helping it advance.”

This technology could reduce the travel time to Mars to just 39 days

From: 
Science Recorder | Kramer Phillips | Saturday, December 07, 2013

Scientists now say it may be possible to travel to Mars in just 39 days, reducing the typical six-month one-way journey to under six weeks.  The advancement could have major implications for startups aiming to travel to Mars in the coming years, part of an increasingly competitive bid to put the first humans on Mars.

“VASIMR stands for Variable Specific Impulse Magnetoplasma Rocket, which makes use of argon gas (one of the most stable gases known to man) and a renewable source of energy found in space, radio waves in the form of light.

The rocket will allow for a mission to Mars with a travel period of just 39 days, which is almost 6 times faster than current methods. With speeds estimated at 35 miles a second, the rocket system will make quick work of the distance between Earth and Mars. NASA rates a new system on a scale of one to ten based on its readiness to be deployed. The VASIMR system is at a six currently, which means that it is ready to be tested in space. It would seem that it won’t be much longer before the new rocket system is employed in all space missions.”
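The quoted figures can be sanity-checked with simple kinematics (my arithmetic, not the article's): at a constant 35 miles per second, travel time is just distance over speed. A real VASIMR trajectory would spend much of the trip accelerating and then braking, so this only brackets the answer.

```python
MILES_TO_KM = 1.609344
speed_km_s = 35.0 * MILES_TO_KM   # quoted ~35 miles/s, about 56 km/s

# Earth-Mars separation varies enormously over the synodic cycle
distances_km = {
    "closest approach (~56 million km)": 56e6,
    "average separation (~225 million km)": 225e6,
}

for label, distance in distances_km.items():
    days = distance / speed_km_s / 86400.0   # seconds -> days
    print(f"{label}: {days:5.1f} days at a constant {speed_km_s:.0f} km/s")
```

The result is roughly 11 to 46 days depending on the planets' positions, so the 39-day figure sits comfortably inside the constant-speed bracket.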

Read more: http://www.sciencerecorder.com/news/this-technology-could-reduce-the-travel-time-to-mars-to-just-39-days/#ixzz2moa6JM6S

Evolutionary biologist Alice Roberts was denounced by a very silly Christian as a "fanatical evolutionist" who peddles "vacuous subjective claims that our ape-like ancestors apparently miraculously (something in which perhaps only a minority at the BBC would believe) decided to walk upright in search of food".

This is from Atheism on Facebook, author currently unknown:
 

She has written an eloquent rebuttal, which to date (quelle surprise) has not met with a response:

I wanted to register my disquiet at the paragraph in which I am described as a "fanatical evolutionist". Like most biologists, I think that evolution through natural selection best explains the diversity of life on this planet; this is not a minority view and not necessarily incompatible with religious belief: many Christians accept evolution.

However, I felt moved to respond to the criticisms of the series Origins of Us, and set the record straight. Firstly, the criticisms do BBC Science an injustice. Even if I wanted to present my own opinions and speculation (in any other way than clearly flagging them as such) the BBC would not allow me to do this in a science programme. Secondly, the criticism levelled at me brings my own academic integrity into question. Every hypothesis and fact discussed or presented in the programme is already "out there", in peer reviewed scientific publications. BBC Science (and I myself) are very careful about the factual basis of such programmes, and extremely careful to differentiate between fact and opinion.

The "vacuous subjective claims" to which Mr Stephen Green alludes are facts based on peer-reviewed scientific research. I am also surprised that Mr Green suggests I presented the extremely outdated "savannah hypothesis" as current science - this is something that was critically appraised and research suggesting, instead, an arboreal origin for bipedalism was put forward. The idea that tool-using and tool-making may have influenced the shape of our hands is, again, not idle speculation but based on published research. Any change in anatomy which leads to a survival advantage (whether that's an adaptation helping survival in a particular natural environment or an adaptation which makes you better at making technology which helps you to survive) is likely to be selected for.

I realise that few readers of this website will read my response objectively, but I object strongly to the criticism that my programmes with the BBC have lacked objectivity and include "idle speculation". That can only be true if you believe that the numerous academic papers which form the backbone of such a series are also "idle speculation".

Regards, Professor Alice Roberts
John Hobbs
Shared publicly  -  Dec 5, 2013

Columbus discovers Hispaniola (12-5-1492)
Arawak men and women, naked, tawny, and full of wonder, emerged from their villages onto the island's beaches and swam out to get a closer look at the strange big boat. When Columbus and his sailors came ashore, carrying swords, speaking oddly, the Arawaks ran to greet them, brought them food, water, gifts.....
....The Indians had been given an impossible task. The only gold around was bits of dust garnered from the streams. So they fled, were hunted down with dogs, and were killed.
...From his base on Haiti, Columbus sent expedition after expedition into the interior. They found no gold fields, but had to fill up the ships returning to Spain with some kind of dividend. In the year 1495, they went on a great slave raid, rounded up fifteen hundred Arawak men, women, and children, put them in pens guarded by Spaniards and dogs, then picked the five hundred best specimens to load onto ships. Of those five hundred, two hundred died en route. The rest arrived alive in Spain and were put up for sale by the archdeacon of the town, who reported that, although the slaves were "naked as the day they were born," they showed "no more embarrassment than animals." Columbus later wrote: "Let us in the name of the Holy Trinity go on sending all the slaves that can be sold."
What Columbus did to the Arawaks of the Bahamas, Cortes did to the Aztecs of Mexico, Pizarro to the Incas of Peru, and the English settlers of Virginia and Massachusetts to the Powhatans and the Pequots.
* A People's History of the United States by Howard Zinn


A “Buddy Bench” Makes Recess More Inclusive

Posted by Rohmteen Mokhtari, December 06, 2013
 
For many elementary students, recess can be a highlight of the school day: a chance to run and play after hours of sitting still behind a desk.
But it can also be an isolating experience for students who feel left out.
At Roundtown Elementary School in York, PA, one second-grader is doing his part to make sure all students are included in the fun.
With his family considering a temporary move to Germany, Christian began researching German schools. That’s when he noticed that one German school had a “buddy bench” for students who felt lonely or excluded during recess.
With this idea in mind, he took action to support the students at his school who he noticed were being left out at recess. He went to his principal and got a “buddy bench” for his school.
Now when students feel alone or excluded they can go to the bench where they’ll be invited by other students to talk or play.
This “buddy bench” allows more students to share in the joys of recess.
But just as importantly, it helps create a school culture of caring and inclusion. It challenges students to support their fellow classmates and empowers them to be a part of the solution.
As Christian puts it, “we show we care about others when we ask others to play.”
Christian and the “buddy bench” teach us a lot about what it takes to make schools more safe and welcoming for all students.
Christian exemplifies the power of upstanders willing to take action when they see students being excluded or teased.
In order to become upstanders, students need to know there are many ways to constructively support a classmate who is being bullied or teased. Options include talking to an adult when they see a student being teased, speaking up in the moment, supporting a student who has been bullied, and causing a distraction that takes the attention away from a student who is being targeted.
Welcoming Schools’ new film, What Can We Do? Bias, Bullying and Bystanders, shows how two schools are using Welcoming Schools materials to help students be upstanders. Learn more about the film and find many more resources to stop name-calling and bullying.

The Problem of Antimatter

When in contact, matter and antimatter can annihilate one another to produce pure energy―that’s why there is extremely little naturally occurring antimatter in the world around us. — Brian Greene (1999)

Mass is made of certain kinds of particles. The Standard Model lists them all. In its scheme each particle is paired up with an antiparticle. One way to think about an antiparticle is that it is the particle, but travelling back in time. Another is that its charge is opposite. When a particle meets its antiparticle they annihilate immediately. They make two photons whose energies are equal to the masses in accordance with E = mc².

So the Problem of Antimatter’s not: Why is so little of it left? It’s: Why is there any matter left? Which is to say: It seems that the original proportion wasn’t half and half. How come?

Physics has a symmetry it calls CP. It says exactly 50-50 is the way it has to be. Physical cosmology contrives an answer to the Problem. It says CP Symmetry was violated when particles were born in the Big Bang. There are some suggestions as to how this happened, but they look like little more than stirring up the same old Problem, like the beat cop saying to the beach bum: Move along.

More recently it turns out that the weak force doesn’t follow CP Symmetry. At last, there is a way to have more particles than antis. The celebration is a short one. CP violation, as it’s called, covers just a trillionth of the matter that we see. Back to square one: Why were there more particles than antiparticles?

See more at: http://www.timeone.ca/clues/the-problem-of-antimatter/
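A standard worked example makes the E = mc² bookkeeping concrete (my addition, not part of the Time One passage): when an electron annihilates with a positron at rest, each of the two photons carries away one particle's rest energy, about 511 keV.

```python
m_e = 9.1093837015e-31   # electron (and positron) rest mass, kg
c = 2.99792458e8         # speed of light, m/s
eV = 1.602176634e-19     # joules per electron-volt

# Each annihilation photon carries one particle's rest energy: E = m c^2
energy_joules = m_e * c**2
energy_keV = energy_joules / eV / 1000.0
print(f"photon energy: {energy_joules:.3e} J = {energy_keV:.0f} keV")
```

Running this gives 511 keV per photon, the signature line that gamma-ray telescopes actually use to spot positron annihilation in space.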

Fossils Yield Oldest Known Human DNA

By Gemma Tarlach | December 4, 2013 12:00 pm
Researchers have successfully sequenced the oldest known human DNA, and it points to unexpected relationships between hominid populations scattered across the length of Eurasia.
The genetic material came from a 400,000-year-old femur of Homo heidelbergensis, an early hominid considered to be the ancestor of both Neanderthals and modern humans. The achievement pushes back the age of the oldest hominid DNA sequencing by 200,000 years.
The site of the fossil’s discovery, Sima de los Huesos (“pit of bones”) in northern Spain, has yielded remains of more than two dozen individuals dated to older than 300,000 years. The skeletons found at Sima de los Huesos exhibit Neanderthal-derived traits, leading researchers to anticipate a strong relation to Homo neanderthalensis.

Denisovan Connection

But after sequencing an almost complete mitochondrial DNA (mtDNA) genome from the femur, researchers discovered the individual was more closely related to Denisovans, eastern Eurasian hominids known only from a few fragments found at sites in Siberia. Although related to Neanderthals, Denisovans are considered genetically distinct, and are thought to have dispersed from Siberia to southeast Asia. By way of comparison, Neanderthals and modern humans are more closely related in their mitochondrial makeup than are Neanderthals and Denisovans.
Ancient DNA that can still be sequenced is usually found only in permafrost conditions; it typically degrades much faster in temperate and tropical zones, where early hominids lived. But the Sima de los Huesos cave site’s humidity and naturally controlled temperature created an environment conducive to mtDNA preservation of both early hominids and their contemporaries, including a cave bear, which researchers successfully sequenced earlier this year.

Mixing Populations

The team obtained about two grams of bone samples — less than a tenth of an ounce — from the femur and performed a number of tests to rule out contamination with modern genetic material. They then sequenced the mitochondrial DNA; retrieving usable mtDNA is easier than collecting nuclear DNA from such an old specimen because several hundred copies of mtDNA exist in each cell. This makes it possible to piece together the mitochondrial genome even if many of the copies are degraded. Mitochondrial DNA is passed down from the individual's mother, however, and does not provide as complete an evolutionary picture as nuclear DNA.
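The redundancy argument lends itself to a toy demonstration (a deliberate oversimplification of mine; real ancient-DNA pipelines also model the characteristic damage chemistry and screen for contamination): with hundreds of independently degraded copies, a per-site majority vote recovers the original sequence.

```python
import random

random.seed(42)
ORIGINAL = "ACGTTAGCCGTAACGT"   # stand-in for a stretch of mtDNA
N_COPIES = 300                  # roughly the mtDNA copy count per cell
DAMAGE_RATE = 0.3               # fraction of bases corrupted per copy

def degrade(sequence: str) -> str:
    """Randomly corrupt bases to mimic post-mortem DNA damage."""
    return "".join(
        random.choice("ACGT") if random.random() < DAMAGE_RATE else base
        for base in sequence
    )

reads = [degrade(ORIGINAL) for _ in range(N_COPIES)]

# Per-site majority vote across all surviving copies
consensus = "".join(
    max("ACGT", key=lambda b: column.count(b))
    for column in zip(*reads)
)
print("recovered:", consensus, "| matches original:", consensus == ORIGINAL)
```

Even with 30 percent of each copy scrambled, the vote across 300 copies reconstructs the sequence essentially every time; a single nuclear copy offers no such safety margin.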
In light of the individual’s unexpected relatedness to Denisovans, the team proposed a number of possible scenarios for how the genes from a population known only in Siberia ended up in Spain. One of the most plausible, researchers suggested, was gene flow from another, as-yet-unknown but Denisovan-like hominid into the Sima de los Huesos group.
To support this theory, the team noted in their paper published today in Nature that a number of early hominid fossils found from the same time period across Asia, Europe and Africa have been classified as H. heidelbergensis, in many ways a catch-all lacking precise definition. It’s possible that some or all of these individuals may be an early hominid population as yet unclassified by science.

CLIMATE CHANGE WEEKLY #112

Southern Hemisphere polar ice extent set new records this week, combining with fairly average Northern Hemisphere polar ice extent to close out a year marked by above-average global polar ice extent. Polar ice caps, apparently, are global warming deniers, attacking the science of alarmist global warming predictions.
Average Southern Hemisphere polar sea ice extent during November 2013 was nearly 1 million square kilometers above the long-term average.
When polar ice happens to be below average in a given year, global warming alarmists cite the annual departure from the long-term mean as proof of a human-induced global warming crisis. During years like 2013, when polar ice extent is above the long-term average, global warming alarmists are largely silent on the topic.
Importantly, even if the years with below-average polar ice extent began to form a meaningful trend, this in itself would not constitute a global warming crisis. Polar ice retreat would merely reflect warming temperatures, even if the warming is modest and benign. During recent years when global polar ice extent has been below normal, it has been Northern Hemisphere polar ice – floating in the Arctic Ocean – driving the global trend. When floating sea ice melts, it does nothing to raise global sea levels.
Southern Hemisphere polar ice, resting primarily on the Antarctic continent, has been consistently expanding during the 30-plus years since NASA/NOAA satellites first began precisely measuring polar ice extent.

Friday, December 6, 2013

If you really care about the environment you should love fracking. Here's why

A report released on Friday by the Centre for Policy Studies (CPS) has found that increasing use of shale gas can massively reduce some of the world's deadliest air pollution.
As well as slashing carbon emissions and providing enticing economic prospects, the findings of the report should present a compelling case for those who value the environment to embrace fracking.
Reduce deadly PM2.5
PM2.5 refers to microscopic dust particles created from burning fuel. These tiny particles can penetrate the lungs, where they are absorbed into the blood and lead to cardiorespiratory disease; they are one of the major contributors to air pollution.
The US Environmental Protection Agency (EPA) estimates that PM2.5 is responsible for about 75,000 premature deaths per year in the US. The use of coal for energy is a major source of rising levels of PM2.5.
In the US, shale gas production has grown by a factor of 17 over the past 13 years. Shale now supplies 35 per cent of US natural gas. Compared to coal, shale gas results in a 400-fold reduction of PM2.5, a 4,000-fold reduction in sulphur dioxide, a 70-fold reduction in nitrogen oxides, and more than a 30-fold reduction in mercury. Air pollution is still a major killer globally, with the Health Effects Institute estimating that air pollution led to 3.2m deaths in the year 2010.
Slash CO2 emissions
While shale gas is a fossil fuel, most of the increase in CO2 emissions is coming from growing coal use in developing countries. The CPS report estimates that if those countries' increased energy needs could be met from natural gas instead of coal, global warming could be slowed by a factor of two to three.
This would mean that instead of having 30 to 50 years before the world reaches twice the preindustrial carbon dioxide level, we may have 60 to 100 years. If developing countries continue to use coal, their PM2.5 and greenhouse gas emissions will also continue to grow.
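The arithmetic behind that claim is straightforward. The toy calculation below uses round illustrative numbers (not the report's own assumptions) to show how halving the rate at which CO2 accumulates stretches the time remaining before concentrations reach twice the preindustrial level:

```python
# Back-of-the-envelope sketch only: illustrative numbers, not the CPS report's.
# If switching from coal to gas roughly halves the rate at which atmospheric
# CO2 climbs, the time to reach 2x preindustrial CO2 stretches by the same factor.

def years_to_doubling(current_ppm, preindustrial_ppm=280.0, ppm_per_year=2.0):
    """Linear extrapolation: years until CO2 reaches twice preindustrial levels."""
    target = 2 * preindustrial_ppm          # 560 ppm
    return (target - current_ppm) / ppm_per_year

baseline = years_to_doubling(current_ppm=400.0, ppm_per_year=2.0)  # ~80 years
slowed   = years_to_doubling(current_ppm=400.0, ppm_per_year=1.0)  # ~160 years

print(f"business as usual:  ~{baseline:.0f} years to doubling")
print(f"growth rate halved: ~{slowed:.0f} years to doubling")
```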
The authors also highlight the need for energy conservation, especially in China. However, they emphasise that this will be far from sufficient to tackle the enormous environmental challenges facing the planet.
Affordable
Europe and China both pay a high price for imported natural gas, typically around $10 (£6) per million British thermal units. With prices that high, Europe and China are in a strong position to exploit vast deposits of shale gas at greatly reduced cost compared to imports.
The report suggests that Europe could be the testing and proving ground where innovative technology can be trialled and improved while remaining profitable. If the same technology and expertise are brought to developing countries, they too can enjoy a more environmentally friendly energy mix.
The report also addresses many of the objections to fracking such as the increased frequency of earthquakes and the dangers to water supply. It documents how these concerns have been wildly exaggerated and in some cases are wholly spurious.
The report was written by Richard Muller, a professor of physics at the University of California, Berkeley since 1980, and Elizabeth Muller, co-founder of Berkeley Earth, a non-profit working on environmental issues.
- See more at: http://www.cityam.com/blog/1386342437/if-you-really-care-about-environment-you-should-love-fracking-heres-why

Time warp: Researchers show possibility of cloning quantum information from the past

(Phys.org) — Popular television shows such as "Doctor Who" have brought the idea of time travel into the vernacular of popular culture. But the problem of time travel is even more complicated than one might think. LSU's Mark Wilde has shown that it would theoretically be possible for time travelers to copy quantum data from the past.

It all started when David Deutsch, a pioneer of quantum computing and a physicist at Oxford, came up with a simplified model of time travel to deal with the paradoxes that would occur if one could travel back in time. For example, would it be possible to travel back in time to kill one's grandfather? In the Grandfather paradox, a time traveler faces the problem that if he kills his grandfather back in time, then he himself is never born, and consequently is unable to travel through time to kill his grandfather, and so on. Some theorists have used this paradox to argue that it is actually impossible to change the past.

"The question is, how would you have existed in the first place to go back in time and kill your

grandfather?" said Mark Wilde, an LSU assistant professor with a joint appointment in the Department of

Physics and Astronomy and with the Center for Computation and Technology, or CCT.

Deutsch originally solved the Grandfather paradox using a slight change to quantum theory, proposing that you could change the past as long as you did so in a self-consistent manner.

"Meaning that, if you kill your grandfather, you do it with only probability one-half," Wilde said. "Then,

he's dead with probability one-half, and you are not born with probability one-half, but the opposite is a fair

chance. You could have existed with probability one-half to go back and kill your grandfather."
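
To see where those one-half probabilities come from, here is a minimal numerical sketch, assuming the standard toy version of the grandfather circuit in which the time-traveling qubit is simply negated inside the loop. It illustrates Deutsch's self-consistency condition, not the specific circuits in his paper:

```python
import numpy as np

# Grandfather-paradox toy model: inside the time loop the qubit is flipped
# (|alive> -> |dead>), represented here by the Pauli-X gate.
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])

def loop_once(rho):
    """One trip around the closed timelike curve: rho -> X rho X†."""
    return X @ rho @ X.conj().T

# Deutsch's condition: the state on the loop must satisfy rho = loop_once(rho).
# Solve by damped fixed-point iteration, starting from the "classical"
# assumption that the traveler definitely exists.
rho = np.array([[0.0, 0.0],
                [0.0, 1.0]])            # pure |alive><alive|

for _ in range(50):
    rho = 0.5 * (rho + loop_once(rho))  # average the old and looped states

print(np.round(rho, 3))
# [[0.5 0. ]
#  [0.  0.5]]  -> alive and dead each with probability one-half, the
# maximum-entropy fixed point Deutsch's model selects.
```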

But the Grandfather paradox is not the only complication with time travel. Another problem is the no-cloning theorem, or the no "subatomic Xerox-machine" theorem, known since 1982. This theorem, which is related to the fact that one cannot copy quantum data at will, is a consequence of Heisenberg's famous Uncertainty Principle, by which one can measure either the position of a particle or its momentum, but not both with unlimited accuracy. According to the Uncertainty Principle, it is thus impossible to have a subatomic Xerox-machine that would take one particle and spit out two particles with the same position and momentum – because then you would know too much about both particles at once.

"We can always look at a paper, and then copy the words on it. That's what we call copying classical data,"

Wilde said. "But you can't arbitrarily copy quantum data, unless it takes the special form of classical data.

This no-cloning theorem is a fundamental part of quantum mechanics – it helps us reason how to process

quantum data. If you can't copy data, then you have to think of everything in a very different way."
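
A standard textbook demonstration makes Wilde's point concrete (this example is illustrative and not from his paper): a CNOT gate copies classical bit values perfectly, but applied to a superposition it produces an entangled state rather than two independent copies:

```python
import numpy as np

# CNOT: flips the second (blank) qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

zero = np.array([1.0, 0.0])
one  = np.array([0.0, 1.0])
plus = (zero + one) / np.sqrt(2)         # a genuine quantum superposition

def try_to_copy(psi):
    """Feed (psi, |0>) into CNOT and return the two-qubit output."""
    return CNOT @ np.kron(psi, zero)

# Classical data copies fine: |1>|0> -> |1>|1>.
print(try_to_copy(one))                  # [0, 0, 0, 1] = |11>

# But the superposition does NOT come out as two copies of |+>:
print(np.round(try_to_copy(plus), 3))    # [0.707, 0, 0, 0.707] = entangled Bell state
print(np.round(np.kron(plus, plus), 3))  # [0.5, 0.5, 0.5, 0.5] = what a true copy would be
```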

But what if a Deutschian closed timelike curve did allow for copying of quantum data to many different points in space? According to Wilde, Deutsch suggested in his late 20th century paper that it should be possible to violate the fundamental no-cloning theorem of quantum mechanics. Now, Wilde and collaborators at the University of Southern California and the Autonomous University of Barcelona have advanced Deutsch's 1991 work with a recent paper in Physical Review Letters. The new approach allows for a particle, or a time traveler, to make multiple loops back in time – something like Bruce Willis' travels in the Hollywood film "Looper."

"That is, at certain locations in spacetime, there are wormholes such that, if you jump in, you'll emerge at

some point in the past," Wilde said. "To the best of our knowledge, these time loops are not ruled out by the

laws of physics. But there are strange consequences for quantum information processing if their behavior is

"Time warp: Researchers show possibility of cloning quantum information from the past." Phys.org. 6 Dec 2013.

http://phys.org/news/2013-12-warp-possibility-cloning-quantum.html


Page 1/3


dictated by Deutsch's model."

A single looping path back in time, a time spiral of sorts, behaving according to Deutsch's model, for example, would have to allow for a particle entering the loop to remain the same each time it passed through a particular point in time. In other words, the particle would need to maintain self-consistency as it looped back in time.

"In some sense, this already allows for copying of the particle's data at many different points in space,"

Wilde said, "because you are sending the particle back many times. It's like you have multiple versions of

the particle available at the same time. You can then attempt to read out more copies of the particle, but the

thing is, if you try to do so as the particle loops back in time, then you change the past."

To be consistent with Deutsch's model, which holds that you can only change the past as long as you can do it in a self-consistent manner, Wilde and colleagues had to come up with a solution that would allow for a looping curve back in time, and copying of quantum data based on a time traveling particle, without disturbing the past.

"That was the major breakthrough, to figure out what could happen at the beginning of this time loop to

enable us to effectively read out many copies of the data without disturbing the past," Wilde said. "It just

worked."

However, there is still some controversy over interpretations of the new approach, Wilde said. In one instance, the new approach may actually point to problems in Deutsch's original closed timelike curve model.

"If quantum mechanics gets modified in such a way that we've never observed should happen, it may be

evidence that we should question Deutsch's model," Wilde said. "We really believe that quantum mechanics

is true, at this point. And most people believe in a principle called Unitarity in quantum mechanics. But

with our new model, we've shown that you can essentially violate something that is a direct consequence of

Unitarity. To me, this is an indication that something weird is going on with Deutsch's model. However,

there might be some way of modifying the model in such a way that we don't violate the no-cloning

theorem."

Other researchers argue that Wilde's approach wouldn't actually allow for copying quantum data from an unknown particle state entering the time loop, because nature would already "know" what the particle looked like, as it had traveled back in time many times before.

But whether or not the no-cloning theorem can truly be violated as Wilde's new approach suggests, the consequences of being able to copy quantum data from the past are significant. Systems for secure Internet communications, for example, will likely soon rely on quantum security protocols that could be broken or "hacked" if Wilde's looping time travel methods were correct.

"If an adversary, if a malicious person, were to have access to these time loops, then they could break the

security of quantum key distribution," Wilde said. "That's one way of interpreting it. But it's a very strong

practical implication because the big push of quantum communication is this secure way of communicating.

We believe that this is the strongest form of encryption that is out there because it's based on physical

principles."
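
For readers unfamiliar with quantum key distribution, here is a minimal sketch of why it is considered secure in the ordinary, loop-free world. This is a standard textbook BB84 intercept-resend simulation, not anything from Wilde's paper: an eavesdropper who measures the photons in transit unavoidably corrupts about a quarter of the sifted key, revealing her presence:

```python
import random

def bb84(n_bits=2000, eavesdrop=False):
    """Minimal BB84 sketch: Alice sends bits in random bases; an
    intercept-resend attack shows up as ~25% errors in the sifted key."""
    random.seed(1)
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("ZX") for _ in range(n_bits)]
    bob_bases   = [random.choice("ZX") for _ in range(n_bits)]

    errors = total = 0
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        value, basis = bit, a_basis
        if eavesdrop:                        # Eve measures in a random basis...
            e_basis = random.choice("ZX")
            if e_basis != basis:             # wrong basis -> random outcome
                value = random.randint(0, 1)
            basis = e_basis                  # ...and resends in her basis
        if b_basis != basis:                 # Bob in the wrong basis -> random
            value = random.randint(0, 1)
        if b_basis == a_basis:               # sifting: keep matching-basis rounds
            total += 1
            errors += (value != bit)
    return errors / total

print(f"error rate without Eve: {bb84():.1%}")                # ~0%
print(f"error rate with Eve:    {bb84(eavesdrop=True):.1%}")  # ~25%
```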

Today, when you log into your Gmail or Facebook, your password and information encryption is not based on physical principles of quantum mechanical security, but rather on the computational assumption that it is very difficult for "hackers" to factor mathematical products of prime numbers, for example. But physicists and computer scientists are working on securing critical and sensitive communications using the principles of quantum mechanics. Such encryption is believed to be unbreakable – that is, as long as hackers don't have access to Wilde's looping closed timelike curves.
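
To make that computational assumption concrete, here is a toy RSA-style sketch (numbers deliberately tiny; real keys use primes hundreds of digits long, which is what makes factoring infeasible in practice). The encryption is only as strong as the difficulty of factoring the public modulus:

```python
# Toy RSA: tiny primes on purpose, so the "hard" problem breaks instantly.
p, q = 61, 53
n = p * q                      # public modulus: 3233
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

message = 42
cipher = pow(message, e, n)            # anyone can encrypt with (n, e)
assert pow(cipher, d, n) == message    # only the holder of d can decrypt

# A "hacker" who can factor n recovers the private key:
factor = next(k for k in range(2, n) if n % k == 0)   # naive trial division
p2, q2 = factor, n // factor
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == message   # security is gone once n is factored
```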

"This ability to copy quantum information freely would turn quantum theory into an effectively classical

theory in which, for example, classical data thought to be secured by quantum cryptography would no

longer be safe," Wilde said. "It seems like there should be a revision to Deutsch's model which would

simultaneously resolve the various time travel paradoxes but not lead to such striking consequences for

quantum information processing. However, no one yet has offered a model that meets these two

requirements. This is the subject of open research."

More information: DOI: 10.1103/PhysRevLett.111.190401



Provided by Louisiana State University. Source: Phys.org, 6 Dec 2013, http://phys.org/news/2013-12-warp-possibility-cloning-quantum.html

Cellular automaton

From Wikipedia, the free encyclopedia https://en.wikipedi...