
Wednesday, May 6, 2015

Quantum gravity


From Wikipedia, the free encyclopedia

Quantum gravity (QG) is a field of theoretical physics that seeks to describe the force of gravity according to the principles of quantum mechanics.

The current understanding of gravity is based on Albert Einstein's general theory of relativity, which is formulated within the framework of classical physics. On the other hand, the nongravitational forces are described within the framework of quantum mechanics, a radically different formalism for describing physical phenomena based on probability.[1] The necessity of a quantum mechanical description of gravity follows from the fact that one cannot consistently couple a classical system to a quantum one.[2]

Although a quantum theory of gravity is needed in order to reconcile general relativity with the principles of quantum mechanics, difficulties arise when one attempts to apply the usual prescriptions of quantum field theory to the force of gravity.[3] From a technical point of view, the problem is that the theory one gets in this way is not renormalizable and therefore cannot be used to make meaningful physical predictions. As a result, theorists have taken up more radical approaches to the problem of quantum gravity, the most popular approaches being string theory and loop quantum gravity.[4] A recent development is the theory of causal fermion systems which gives quantum mechanics, general relativity and quantum field theory as limiting cases.[5][6][7][8][9][10]

Strictly speaking, the aim of quantum gravity is only to describe the quantum behavior of the gravitational field and should not be confused with the objective of unifying all fundamental interactions into a single mathematical framework. Although some quantum gravity theories such as string theory try to unify gravity with the other fundamental forces, others such as loop quantum gravity make no such attempt; instead, they make an effort to quantize the gravitational field while it is kept separate from the other forces. A theory of quantum gravity that is also a grand unification of all known interactions is sometimes referred to as a theory of everything (TOE).

One of the difficulties of quantum gravity is that quantum gravitational effects are only expected to become apparent near the Planck scale, a scale far smaller in distance (equivalently, far larger in energy) than what is currently accessible at high energy particle accelerators. As a result, quantum gravity is a mainly theoretical enterprise, although there are speculations about how quantum gravity effects might be observed in existing experiments.[11]

Overview


Diagram showing where quantum gravity sits in the hierarchy of physics theories

Much of the difficulty in meshing these theories at all energy scales comes from the different assumptions that these theories make about how the universe works. Quantum field theory depends on particle fields embedded in the flat space-time of special relativity. General relativity models gravity as a curvature within space-time that changes as a gravitational mass moves. Historically, the most obvious way of combining the two (such as treating gravity as simply another particle field) ran quickly into what is known as the renormalization problem. In the old-fashioned understanding of renormalization, gravity particles would attract each other, and adding together all of the interactions results in many infinite values which cannot easily be cancelled out mathematically to yield sensible, finite results. This is in contrast with quantum electrodynamics where, although the perturbation series still do not converge, the interactions sometimes evaluate to infinite results, but those are few enough in number to be removable via renormalization.

Effective field theories

Quantum gravity can be treated as an effective field theory. Effective quantum field theories come with some high-energy cutoff, beyond which we do not expect the theory to provide a good description of nature. The "infinities" then become large but finite quantities depending on this finite cutoff scale, and correspond to processes that involve very high energies near the fundamental cutoff. These quantities can then be absorbed into an infinite collection of coupling constants, and, at energies well below the fundamental cutoff of the theory, only a finite number of these coupling constants need to be measured in order to make legitimate quantum-mechanical predictions to any desired precision. This same logic works just as well for the highly successful theory of low-energy pions as for quantum gravity. Indeed, the first quantum-mechanical corrections to graviton scattering and Newton's law of gravitation have been explicitly computed[12] (although they are so astronomically small that we may never be able to measure them). In fact, gravity is in many ways a much better quantum field theory than the Standard Model, since it appears to be valid all the way up to its cutoff at the Planck scale.

While this confirms that quantum mechanics and gravity are indeed consistent at reasonable energies, it is clear that near or above the fundamental cutoff of our effective quantum theory of gravity (generally assumed to be of the order of the Planck scale), a new model of nature will be needed. Specifically, the problem of combining quantum mechanics and gravity becomes an issue only at very high energies, and may well require a totally new kind of model.

Quantum gravity theory for the highest energy scales

The general approach to deriving a quantum gravity theory that is valid at even the highest energy scales is to assume that such a theory will be simple and elegant and, accordingly, to study symmetries and other clues offered by current theories that might suggest ways to combine them into a comprehensive, unified theory. One problem with this approach is that it is unknown whether quantum gravity will actually conform to a simple and elegant theory, since it must resolve the dual conundrums of special relativity with regard to the uniformity of acceleration and gravity, and of general relativity with regard to spacetime curvature.

Such a theory is required in order to understand problems involving the combination of very high energy and very small dimensions of space, such as the behavior of black holes, and the origin of the universe.

Quantum mechanics and general relativity


Gravity Probe B (GP-B) has measured spacetime curvature near Earth to test related models in application of Einstein's general theory of relativity.

The graviton

At present, one of the deepest problems in theoretical physics is harmonizing the theory of general relativity, which describes gravitation and applies to large-scale structures (stars, planets, galaxies), with quantum mechanics, which describes the other three fundamental forces acting on the atomic scale. This problem must be put in the proper context, however. In particular, contrary to the popular claim that quantum mechanics and general relativity are fundamentally incompatible, one can demonstrate that the structure of general relativity essentially follows inevitably from the quantum mechanics of interacting theoretical spin-2 massless particles (called gravitons).[13][14][15][16][17]
While there is no concrete proof of the existence of gravitons, quantized theories of matter may necessitate their existence.[citation needed] Supporting this theory is the observation that all fundamental forces except gravity have one or more known messenger particles, leading researchers to believe that at least one most likely does exist; they have dubbed this hypothetical particle the graviton. The predicted find would result in the classification of the graviton as a "force particle" similar to the photon of the electromagnetic field. Many of the accepted notions of a unified theory of physics since the 1970s assume, and to some degree depend upon, the existence of the graviton. These include string theory, superstring theory, M-theory, and loop quantum gravity. Detection of gravitons is thus vital to the validation of various lines of research to unify quantum mechanics and relativity theory.

The dilaton

The dilaton made its first appearance in Kaluza–Klein theory, a five-dimensional theory that combined gravitation and electromagnetism. Generally, it appears in string theory. More recently, it has appeared in the lower-dimensional many-body gravity problem[18] based on the field-theoretic approach of Roman Jackiw. The impetus arose from the fact that complete analytical solutions for the metric of a covariant N-body system have proven elusive in general relativity. To simplify the problem, the number of dimensions was lowered to (1+1), i.e., one spatial dimension and one temporal dimension. This model problem, known as R=T theory[19] (as opposed to the general G=T theory), was amenable to exact solutions in terms of a generalization of the Lambert W function. It was also found that the field equation governing the dilaton (derived from differential geometry) was the Schrödinger equation and consequently amenable to quantization.[20]
Thus, one had a theory which combined gravity, quantization and even the electromagnetic interaction, promising ingredients of a fundamental physical theory. It is worth noting that the outcome revealed a previously unrecognized natural link between general relativity and quantum mechanics. However, this theory needs to be generalized to (2+1) or (3+1) dimensions, although, in principle, the field equations are amenable to such generalization, as shown with the inclusion of a one-graviton process,[21] yielding the correct Newtonian limit in d dimensions if a dilaton is included. However, it is not yet clear what the fully generalized field equation governing the dilaton in (3+1) dimensions should be. This is further complicated by the fact that gravitons can propagate in (3+1) dimensions, which would imply that gravitons and dilatons exist in the real world. Moreover, detection of the dilaton is expected to be even more elusive than that of the graviton. However, since this approach allows for the combination of gravitational, electromagnetic and quantum effects, their coupling could potentially lead to a means of vindicating the theory, through cosmology and perhaps even experimentally.

Nonrenormalizability of gravity

General relativity, like electromagnetism, is a classical field theory. One might expect that, as with electromagnetism, there should be a corresponding quantum field theory.
However, gravity is perturbatively nonrenormalizable.[22][23] For a quantum field theory to be well-defined according to this understanding of the subject, it must be asymptotically free or asymptotically safe. The theory must be characterized by a choice of finitely many parameters, which could, in principle, be set by experiment. For example, in quantum electrodynamics, these parameters are the charge and mass of the electron, as measured at a particular energy scale.
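The power counting behind this statement can be stated in one line. As a minimal sketch, in natural units with ħ = c = 1 (a convention not used elsewhere in this article), Newton's constant carries negative mass dimension,

G_N = \frac{1}{M_{\mathrm{Pl}}^{2}}, \qquad M_{\mathrm{Pl}} \approx 10^{19}\ \mathrm{GeV},

so the dimensionless expansion parameter at energy E is G_N E^2 = (E/M_{\mathrm{Pl}})^2. Each additional graviton loop brings extra powers of this ratio and demands new counterterms with new coefficients, which is the origin of the infinitely many parameters discussed below.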

On the other hand, in quantizing gravity, there are infinitely many independent parameters (counterterm coefficients) needed to define the theory. For a given choice of those parameters, one could make sense of the theory, but since we can never do infinitely many experiments to fix the values of every parameter, we do not have a meaningful physical theory:
  • At low energies, the logic of the renormalization group tells us that, despite the unknown choices of these infinitely many parameters, quantum gravity will reduce to the usual Einstein theory of general relativity.
  • On the other hand, if we could probe very high energies where quantum effects take over, then every one of the infinitely many unknown parameters would begin to matter, and we could make no predictions at all.
As explained below, there is a way around this problem by treating QG as an effective field theory.

Any theory of quantum gravity that is predictive at all energy scales must have some deep principle that reduces the infinitely many unknown parameters to a finite number that can then be measured.
  • One possibility is that normal perturbation theory is not a reliable guide to the renormalizability of the theory, and that there really is a UV fixed point for gravity. Since this is a question of non-perturbative quantum field theory, it is difficult to find a reliable answer, but some people still pursue this option.
  • Another possibility is that there are new symmetry principles that constrain the parameters and reduce them to a finite set. This is the route taken by string theory, where all of the excitations of the string essentially manifest themselves as new symmetries.

QG as an effective field theory

In an effective field theory, all but the first few of the infinite set of parameters in a non-renormalizable theory are suppressed by huge energy scales and hence can be neglected when computing low-energy effects. Thus, at least in the low-energy regime, the model is indeed a predictive quantum field theory.[12] (A very similar situation occurs for the effective field theory of low-energy pions.) Furthermore, many theorists agree that even the Standard Model should really be regarded as an effective field theory as well, with "nonrenormalizable" interactions suppressed by large energy scales and whose effects have consequently not been observed experimentally.
Recent work[12] has shown that by treating general relativity as an effective field theory, one can actually make legitimate predictions for quantum gravity, at least for low-energy phenomena. An example is the well-known calculation of the tiny first-order quantum-mechanical correction to the classical Newtonian gravitational potential between two masses.
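For reference, that correction is often quoted in the following form, where the last term is the genuinely quantum contribution; the numerical coefficient has been recomputed several times in the literature, so the expression should be read as indicative rather than definitive:

V(r) = -\frac{G m_1 m_2}{r}\left[\, 1 + 3\,\frac{G (m_1 + m_2)}{r c^2} + \frac{41}{10\pi}\,\frac{G \hbar}{r^2 c^3} \,\right]

For laboratory masses and distances, the quantum term is of order (Planck length / r)^2, roughly 10^-70 at r = 1 m, which is why it is described as astronomically small.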

Spacetime background dependence

A fundamental lesson of general relativity is that there is no fixed spacetime background, as found in Newtonian mechanics and special relativity; the spacetime geometry is dynamic. While easy to grasp in principle, this is the hardest idea to understand about general relativity, and its consequences are profound and not fully explored, even at the classical level. To a certain extent, general relativity can be seen to be a relational theory,[24] in which the only physically relevant information is the relationship between different events in space-time.
On the other hand, quantum mechanics has depended since its inception on a fixed background (non-dynamic) structure. In the case of quantum mechanics, it is time that is given and not dynamic, just as in Newtonian classical mechanics. In relativistic quantum field theory, just as in classical field theory, Minkowski spacetime is the fixed background of the theory.

String theory


Interaction in the subatomic world: world lines of point-like particles in the Standard Model or a world sheet swept up by closed strings in string theory

String theory can be seen as a generalization of quantum field theory where, instead of point particles, string-like objects propagate in a fixed spacetime background, although the interactions among closed strings give rise to space-time in a dynamical way. Although string theory had its origins in the study of quark confinement and not of quantum gravity, it was soon discovered that the string spectrum contains the graviton, and that "condensation" of certain vibration modes of strings is equivalent to a modification of the original background. In this sense, string perturbation theory exhibits exactly the features one would expect of a perturbation theory that may have a strong dependence on asymptotics (as seen, for example, in the AdS/CFT correspondence), which is a weak form of background dependence.

Background independent theories

Loop quantum gravity is the fruit of an effort to formulate a background-independent quantum theory.

Topological quantum field theory provided an example of background-independent quantum theory, but with no local degrees of freedom, and only finitely many degrees of freedom globally. This is inadequate to describe gravity in 3+1 dimensions, which has local degrees of freedom according to general relativity. In 2+1 dimensions, however, gravity is a topological field theory, and it has been successfully quantized in several different ways, including spin networks.

Semi-classical quantum gravity

Quantum field theory on curved (non-Minkowskian) backgrounds, while not a full quantum theory of gravity, has shown many promising early results. In an analogous way to the development of quantum electrodynamics in the early part of the 20th century (when physicists considered quantum mechanics in classical electromagnetic fields), the consideration of quantum field theory on a curved background has led to predictions such as black hole radiation.

Phenomena such as the Unruh effect, in which particles exist in certain accelerating frames but not in stationary ones, do not pose any difficulty when considered on a curved background (the Unruh effect occurs even in flat Minkowskian backgrounds). The vacuum state is the state with the least energy (and may or may not contain particles). See Quantum field theory in curved spacetime for a more complete discussion.

Points of tension

There are other points of tension between quantum mechanics and general relativity.
  • First, classical general relativity breaks down at singularities, and quantum mechanics becomes inconsistent with general relativity in the neighborhood of singularities (however, no one is certain that classical general relativity applies near singularities in the first place).
  • Second, it is not clear how to determine the gravitational field of a particle, since under the Heisenberg uncertainty principle of quantum mechanics its location and velocity cannot be known with certainty. The resolution of these points may come from a better understanding of general relativity.[25]
  • Third, there is the problem of time in quantum gravity. Time has a different meaning in quantum mechanics and general relativity and hence there are subtle issues to resolve when trying to formulate a theory which combines the two.[26]

Candidate theories

There are a number of proposed quantum gravity theories.[27] Currently, there is still no complete and consistent quantum theory of gravity, and the candidate models still need to overcome major formal and conceptual problems. They also face the common problem that, as yet, there is no way to put quantum gravity predictions to experimental tests, although there is hope for this to change as future data from cosmological observations and particle physics experiments becomes available.[28][29]

String theory

Projection of a Calabi–Yau manifold, one of the ways of compactifying the extra dimensions posited by string theory

One suggested starting point is ordinary quantum field theories which, after all, are successful in describing the other three fundamental forces in the context of the standard model of elementary particle physics. However, while this leads to an acceptable effective (quantum) field theory of gravity at low energies,[30] gravity turns out to be much more problematic at higher energies. Whereas for ordinary field theories such as quantum electrodynamics a technique known as renormalization is an integral part of deriving predictions which take into account higher-energy contributions,[31] gravity turns out to be nonrenormalizable: at high energies, applying the recipes of ordinary quantum field theory yields models that are devoid of all predictive power.[32]

One attempt to overcome these limitations is to replace ordinary quantum field theory, which is based on the classical concept of a point particle, with a quantum theory of one-dimensional extended objects: string theory.[33] At the energies reached in current experiments, these strings are indistinguishable from point-like particles, but, crucially, different modes of oscillation of one and the same type of fundamental string appear as particles with different (electric and other) charges. In this way, string theory promises to be a unified description of all particles and interactions.[34] The theory is successful in that one mode will always correspond to a graviton, the messenger particle of gravity; however, the price to pay is unusual features such as six extra dimensions of space in addition to the usual three for space and one for time.[35]

In what is called the second superstring revolution, it was conjectured that both string theory and a unification of general relativity and supersymmetry known as supergravity[36] form part of a hypothesized eleven-dimensional model known as M-theory, which would constitute a uniquely defined and consistent theory of quantum gravity.[37][38] As presently understood, however, string theory admits a very large number (10^500 by some estimates) of consistent vacua, comprising the so-called "string landscape". Sorting through this large family of solutions remains a major challenge.

Loop quantum gravity

Simple spin network of the type used in loop quantum gravity

Loop quantum gravity is based first of all on the idea of taking seriously the insight of general relativity that spacetime is a dynamical field and is therefore a quantum object. The second idea is that the quantum discreteness that determines the particle-like behavior of other field theories (for instance, the photons of the electromagnetic field) also affects the structure of space.

The main result of loop quantum gravity is the derivation of a granular structure of space at the Planck length. This is derived as follows. In the case of electromagnetism, the quantum operator representing the energy of each frequency of the field has a discrete spectrum. Therefore the energy of each frequency is quantized, and the quanta are the photons. In the case of gravity, the operators representing the area and the volume of each surface or space region likewise have discrete spectra. Therefore the area and volume of any portion of space are quantized, and the quanta are elementary quanta of space. It follows that spacetime has an elementary quantum granular structure at the Planck scale, which cuts off the ultraviolet infinities of quantum field theory.
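A standard way of stating this discreteness, not written out in the text above but common in the loop-quantum-gravity literature, is the area spectrum

A = 8\pi \gamma\, \ell_{\mathrm{Pl}}^{2} \sum_i \sqrt{j_i (j_i + 1)}

where ℓ_Pl is the Planck length, γ is the Barbero–Immirzi parameter, and the j_i are half-integer spins labeling the links of a spin network that cross the surface; the smallest nonzero eigenvalue sets an elementary quantum of area of order ℓ_Pl^2.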

The quantum state of spacetime is described in the theory by means of a mathematical structure called spin networks. Spin networks were initially introduced by Roger Penrose in abstract form, and later shown by Carlo Rovelli and Lee Smolin to derive naturally from a nonperturbative quantization of general relativity. Spin networks do not represent quantum states of a field in spacetime: they represent directly quantum states of spacetime.

The theory is based on the reformulation of general relativity known as Ashtekar variables, which represent geometric gravity using mathematical analogues of electric and magnetic fields.[39][40] In the quantum theory space is represented by a network structure called a spin network, evolving over time in discrete steps.[41][42][43][44]

The dynamics of the theory is today constructed in several versions. One version starts with the canonical quantization of general relativity. The analogue of the Schrödinger equation is a Wheeler–DeWitt equation, which can be defined in the theory.[45] In the covariant, or spinfoam formulation of the theory, the quantum dynamics is obtained via a sum over discrete versions of spacetime, called spinfoams. These represent histories of spin networks.

Other approaches

There are a number of other approaches to quantum gravity. The approaches differ depending on which features of general relativity and quantum theory are accepted unchanged, and which features are modified.[46][47]

Weinberg–Witten theorem

In quantum field theory, the Weinberg–Witten theorem places some constraints on theories of composite gravity/emergent gravity. However, recent developments attempt to show that if locality is only approximate and the holographic principle is correct, the Weinberg–Witten theorem would not be valid[citation needed].

Experimental tests

As was emphasized above, quantum gravitational effects are extremely weak and therefore difficult to test. For this reason, the possibility of experimentally testing quantum gravity had not received much attention prior to the late 1990s. However, in the past decade, physicists have realized that evidence for quantum gravitational effects can guide the development of the theory. Since theoretical development has been slow, the field of phenomenological quantum gravity, which studies the possibility of experimental tests, has obtained increased attention.[54][55]

The most widely pursued possibilities for quantum gravity phenomenology include violations of Lorentz invariance, imprints of quantum gravitational effects in the cosmic microwave background (in particular its polarization), and decoherence induced by fluctuations in the space-time foam.

The BICEP2 experiment detected what was initially thought to be primordial B-mode polarization caused by gravitational waves in the early universe. If truly primordial, these waves were born as quantum fluctuations in gravity itself. Cosmologist Ken Olum (Tufts University) stated: "I think this is the only observational evidence that we have that actually shows that gravity is quantized....It's probably the only evidence of this that we will ever have."[56]

Documentarian Scott Hamilton Kennedy explores why activists block GMO solution to African banana wilt crisis


| May 5, 2015 |
 
Original link:  http://www.geneticliteracyproject.org/2015/05/05/documentarian-scott-hamilton-kennedy-explores-why-activists-block-gmo-solution-to-african-banana-wilt-crisis/
 
Burning infected banana plants.

Scott Hamilton Kennedy, a multi-talented producer and director of the Academy Award-nominated The Garden, a documentary that featured the inspiring story of how a community garden arose out of the ashes of the Los Angeles riots, is at work examining the plight of the poor and disenfranchised. But this time his focus is on Africa.

Working with Trace Sheehan of the Grace and Mercy documentary that recounts the relief and recovery efforts in Haiti after the 2010 earthquake, Kennedy is shooting Food Evolution, which looks at food challenges over the next 35 years and how what we eat may be transformed by science and nature. It too stresses the themes of liberty, justice and fairness for the vulnerable and how the poor can mobilize and organize in response to a state of helplessness and hopelessness.
Scott Hamilton Kennedy and Trace Sheehan in the company of farmers, a scientist and the writer

During the filming, the duo witnessed how merciless banana bacterial wilt is to African farmers and their families, who are now relying on God's grace to find solutions. Farmers have been told to uproot and burn diseased plants, which they did. They were also told to cut the male buds and instructed to always sterilize their farm implements. They complied, but the disease still persisted and the burden of managing it became too heavy for them to bear. Yet barely 30 kilometers away were fields of transgenic bananas, 100% resistant to the wilt, under cultivation in field trials by scientists. These farmers did not even know the GMO bananas existed. When told about them, they eagerly went to see the healthy bananas, and they immediately said they wanted this variety.

But they couldn't have them. The scientists told them they could not because the bananas had not yet been approved, even though all the safety testing had been done. Because the solution is transgenic, anti-GMO activists did their best to ensure the bananas would not make their way to desperate farmers.

Farmers like the ones Scott and Trace visited are losing the battle with banana bacterial wilt. They work throughout the day trying to manage the disease and they've been told there is no known solution, and yet less than an hour's ride away there is one. It's maddening to them. Rather than just using the technology, they see healthy GM banana plants guarded and locked away at all times, with the guards watching from outside. These farmers cannot access these bananas because they were bred using a gene from sweet pepper, a pepper the farmers use daily to spice their sauce, and activists call that a 'dangerous' Frankenfood.

Food in 2050: Fiction or reality?

Food Evolution, produced in conjunction with the nonprofit Institute of Food Technologists (IFT), aims to explore the global food security and sustainability issues that lie ahead, specifically, feeding 9+ billion people by 2050 and the interconnected challenges of population growth, limited environmental resources, inequality, shifting diets, climate change, and health and nutrition. Populations are increasing at a faster rate than food production, people are becoming more affluent and demanding more calories, and people are living longer, which puts an additional strain on limited food supplies.

Currently, Africa's major staples like cassava and banana are being attacked by virulent crop diseases that, if not brought under control, could lead to the abandonment of these beloved staples. Conventional crop breeders have not been able to come up with resistant varieties. The only viable solution appears to lie in tweaking the genes to inoculate these crops against deadly pathogens.

In the case of the banana, the wilt resistance trait was sourced from sweet pepper. It offers 100% resistance to the Xanthomonas bacteria that are devastating farmers' fields. In the case of cassava, scientists have used an RNA interference approach known as gene silencing to stop the virus from expressing itself. But tweaking the genes can do more than just prevent diseases. Nutritionists believe that Africa's reliance on these staples, which are generally low in important micro-nutrients like beta-carotene and iron, masks hunger, creating what they call "hidden hunger". There are therefore efforts to bio-fortify these staples using both conventional and genetic engineering approaches.

In 2050, there is a possibility that countries like Uganda that are experiencing the full wrath of these crop diseases may see their banana and cassava crops destroyed forever, assuming nothing is done to address the diseases.
Another possibility is that, if genetically engineered products are accepted, most cassava and banana varieties will be transgenic, and healthy, by 2050. There is a third possibility, scientists say: disease-resistant transgenic crops may be introduced that will break the disease cycle, and there will be a mix of genetically engineered and traditional bananas and cassava.

Dilemma: Anti-genetic engineering activists protest but offer no solutions

Despite the promising solutions to the disease scourge offered by biotech scientists, outside activists who claim to represent the interests of these very same farmers are doing everything they can to sabotage the introduction of these disease-preventing varieties. Most tragically, these activists are offering no alternative solutions for farmers.
The African farmers that the anti-GMO groups claim to speak for are only interested in solutions; they are not ideological and do not care whether the solution comes through genetic engineering or conventional breeding.

The plight of the poor and the farmers in Africa mirrors what Kennedy portrayed in his documentary filmed in South Central Los Angeles. In that case, the Los Angeles City Council told the urban farmers that their garden was wonderful, but that this was a city, after all, and it could not save the garden from developers and others who wanted to destroy what they had grown.

The African garden is also wonderful, with beautiful bananas wilting away under the pressure from banana bacterial wilt. The farmers keep the garden, weed it and mulch it to keep water from evaporating too quickly, and yet they end up with low yields and sometimes no yield at all. The solution is transgenic. The moviemakers wondered aloud why European activists opt for rehearsed communications strategies designed to kill this technology while farmers see their reliable source of food wilt and rot away, knowing that a solution sits fenced off just a few kilometers away, fully out of their reach.

Will we see African farmers stand up to oppose the activists who are preaching 'no' to GMO and yet offering no solutions? These are questions that puzzled Scott, and I hope he gets to witness such scenarios unfold.

Is there a non transgenic solution?

Crop breeders normally address the issue of diseases by crossing a susceptible variety with an available resistant variety, with the resulting crop acquiring resistance. This time around, scientists searched and researched all the available species of banana worldwide, both domesticated and wild, and not a single one of them showed resistance to this devastating bacterium. They could have given up, saying, "there is nothing more we can do." But scientists did something: they found the resistance properties they had been looking for in sweet pepper. Without a solution, more than 20 million Ugandans could lose their food and income. The only currently known way to transfer resistance across unrelated species is through genetic engineering, which they have done, and done successfully.

Will science prevail over ideology and fanaticism? In Africa, crop improvements that protect yields from diseases could go a long way toward ensuring the continent feeds its growing population, and activists who block solutions without providing alternative ways to address problems are an impediment to the global effort to feed the growing hordes of the hungry.

Isaac Ongu is an agriculturist, science writer and advocate for science-based interventions in solving agricultural challenges in developing countries. Follow Isaac on Twitter @onguisaac.

Heisenberg Uncertainty Principle Might be Wrong

Original post:  http://www.physics-astronomy.com/2014/07/heisenberg-uncertainty-principle-might.html#.VUoGrpNavDc

The Heisenberg Uncertainty Principle has been both an essential principle and an annoyance in quantum mechanics since Heisenberg wrote down that annoying formula in the early years of the 20th century. In brief, it states that you cannot precisely know both the position and the momentum of a particle: the more definite you are about one, the less definite you are about the other.
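For reference, the formula in question is usually written in terms of the standard deviations of position and momentum:

\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}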

Now, with researchers making leaps in quantum technology, knowing exactly how accurate a measurement can be is very significant. It looks like Heisenberg might have been wrong. It is important to note that the principle still applies, just not as strictly as Heisenberg originally stated.

Image Credit: Dylan Mahler, University of Toronto

Instead of taking one large measurement of a particle, which disturbs the system and introduces a great deal of uncertainty, a group of researchers from the University of Toronto took many weak measurements in an effort to interact with the system as little as possible. Their results were remarkable: when the measurements were combined, the researchers were able to obtain a more precise measurement of their test subject than the Heisenberg uncertainty principle permits. Their study indicates that a new mathematical measurement-disturbance relation derived by Dr. Masanao Ozawa in 2003 is more accurate. The results were published in the journal Physical Review Letters and also presented at the Optical Society's Annual Meeting in September 2013. Now, only time will tell whether Heisenberg's original formula stands the test of further study as researchers around the world attempt to repeat these results.
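The Ozawa relation referred to above is not written out in the article; it is usually quoted in the form

\varepsilon(x)\,\eta(p) + \varepsilon(x)\,\sigma(p) + \sigma(x)\,\eta(p) \;\ge\; \frac{\hbar}{2}

where ε(x) is the error of the position measurement, η(p) is the disturbance that measurement causes to the momentum, and σ denotes the intrinsic spread of the state before measurement. The naive Heisenberg-style error-disturbance bound ε(x) η(p) ≥ ħ/2 can be violated, while Ozawa's combination cannot.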


Tuesday, May 5, 2015

Quantum computing


From Wikipedia, the free encyclopedia

The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.

Quantum computing studies theoretical computation systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data.[1]

Quantum computers are different from digital computers based on transistors. Whereas digital computers require data to be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses qubits (quantum bits), which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer. Quantum computers share theoretical similarities with non-deterministic and probabilistic computers. The field of quantum computing was initiated by the work of Yuri Manin in 1980,[2] Richard Feynman in 1982,[3] and David Deutsch.[4] A quantum computer with spins as quantum bits was also formulated for use as a quantum space–time in 1968.[5]

As of 2015, the development of actual quantum computers is still in its infancy, but experiments have been carried out in which quantum computational operations were executed on a very small number of qubits.[6][citation needed]

Both practical and theoretical research continues, and many national governments and military agencies are funding quantum computing research in an effort to develop quantum computers for civilian, business, trade, and national security purposes, such as cryptanalysis.[7]

Large-scale quantum computers will be able to solve certain problems much more quickly than any classical computers that use even the best currently known algorithms, like integer factorization using Shor's algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon's algorithm, that run faster than any possible probabilistic classical algorithm.[8] Given sufficient computational resources, however, a classical computer could be made to simulate any quantum algorithm, as quantum computation does not violate the Church–Turing thesis. [9]

Basis

A classical computer has a memory made up of bits, where each bit represents either a one or a zero. A quantum computer maintains a sequence of qubits. A single qubit can represent a one, a zero, or any quantum superposition of those two qubit states; a pair of qubits can be in any quantum superposition of 4 states, and three qubits in any superposition of 8 states. In general, a quantum computer with n qubits can be in an arbitrary superposition of up to 2^n different states simultaneously (this compares to a normal computer that can only be in one of these 2^n states at any one time). A quantum computer operates by setting the qubits in a controlled initial state that represents the problem at hand and by manipulating those qubits with a fixed sequence of quantum logic gates. The sequence of gates to be applied is called a quantum algorithm. The calculation ends with a measurement, collapsing the system of qubits into one of the 2^n pure states, where each qubit is zero or one. The outcome can therefore be at most n classical bits of information. Quantum algorithms are often non-deterministic, in that they provide the correct solution only with a certain known probability.
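As a concrete illustration of this bookkeeping, here is a minimal sketch in Python with NumPy (the variable names are mine, not from the article): the joint state of n qubits is a vector of 2^n complex amplitudes, built from single-qubit states with Kronecker products.

import numpy as np

ket0 = np.array([1, 0], dtype=complex)            # the state |0>
ket1 = np.array([0, 1], dtype=complex)            # the state |1>
plus = (ket0 + ket1) / np.sqrt(2)                 # equal superposition of |0> and |1>

# Three qubits: the joint state lives in a 2**3 = 8 dimensional space.
state = np.kron(np.kron(plus, plus), plus)        # uniform superposition over all 8 basis states

print(state.size)                                 # 8 amplitudes for 3 qubits
print(np.isclose(np.sum(np.abs(state) ** 2), 1))  # squared magnitudes sum to 1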

An example of an implementation of qubits for a quantum computer could start with the use of particles with two spin states: "down" and "up" (typically written |↓⟩ and |↑⟩, or |0⟩ and |1⟩). But in fact any system possessing an observable quantity A, which is conserved under time evolution such that A has at least two discrete and sufficiently spaced consecutive eigenvalues, is a suitable candidate for implementing a qubit. This is true because any such system can be mapped onto an effective spin-1/2 system.

Mechanics

A quantum computer with a given number of qubits is fundamentally different from a classical computer composed of the same number of classical bits. For example, to represent the state of an n-qubit system on a classical computer would require the storage of 2^n complex coefficients. Although this fact may seem to indicate that qubits can hold exponentially more information than their classical counterparts, care must be taken not to overlook the fact that the qubits are only in a probabilistic superposition of all of their states. This means that when the final state of the qubits is measured, they will only be found in one of the possible configurations they were in before measurement. Moreover, it is incorrect to think of the qubits as only being in one particular state before measurement since the fact that they were in a superposition of states before the measurement was made directly affects the possible outcomes of the computation.

Qubits are made up of controlled particles and the means of control (e.g. devices that trap particles and switch them from one state to another).[10]

For example: Consider first a classical computer that operates on a three-bit register. The state of the computer at any time is a probability distribution over the 2^3=8 different three-bit strings 000, 001, 010, 011, 100, 101, 110, 111. If it is a deterministic computer, then it is in exactly one of these states with probability 1.
However, if it is a probabilistic computer, then there is a possibility of it being in any one of a number of different states. We can describe this probabilistic state by eight nonnegative numbers A, B, C, D, E, F, G, H (where A is the probability that the computer is in state 000, B is the probability that the computer is in state 001, etc.). There is a restriction that these probabilities sum to 1.

The state of a three-qubit quantum computer is similarly described by an eight-dimensional vector (a,b,c,d,e,f,g,h), called a ket. Here, however, the coefficients can have complex values, and it is the sum of the squares of the coefficients' magnitudes, |a|^2 + |b|^2 + ... + |h|^2, that must equal 1. These squared magnitudes represent the probability of each of the given states. However, because a complex number encodes not just a magnitude but also a direction in the complex plane, the phase difference between any two coefficients (states) represents a meaningful parameter. This is a fundamental difference between quantum computing and probabilistic classical computing.[11]

If you measure the three qubits, you will observe a three-bit string. The probability of measuring a given string is the squared magnitude of that string's coefficient (i.e., the probability of measuring 000 is |a|^2, the probability of measuring 001 is |b|^2, etc.). Thus, measuring a quantum state described by complex coefficients (a,b,...,h) gives the classical probability distribution (|a|^2, |b|^2, ..., |h|^2), and we say that the quantum state "collapses" to a classical state as a result of making the measurement.
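The measurement rule just described is easy to mirror numerically. A small sketch (Python with NumPy; the amplitudes below are arbitrary illustrative values, not taken from the text):

import numpy as np

# Eight complex amplitudes (a, b, ..., h) for the basis states 000 ... 111.
amps = np.array([0.5, 0.5j, 0.25, 0.25, 0.25, 0.25, 0.5, 0.0], dtype=complex)
amps = amps / np.linalg.norm(amps)                # enforce |a|^2 + ... + |h|^2 = 1

probs = np.abs(amps) ** 2                         # Born rule: squared magnitudes
labels = [format(i, "03b") for i in range(8)]     # "000", "001", ..., "111"

outcome = np.random.choice(labels, p=probs)       # the state "collapses" to one bit string
print(dict(zip(labels, np.round(probs, 3))))
print("measured:", outcome)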

Note that an eight-dimensional vector can be specified in many different ways depending on what basis is chosen for the space. The basis of bit strings (e.g., 000, 001, …, 111) is known as the computational basis. Other possible bases are unit-length, orthogonal vectors and the eigenvectors of the Pauli-x operator. Ket notation is often used to make the choice of basis explicit. For example, the state (a,b,c,d,e,f,g,h) in the computational basis can be written as:
a|000⟩ + b|001⟩ + c|010⟩ + d|011⟩ + e|100⟩ + f|101⟩ + g|110⟩ + h|111⟩
where, e.g., |010⟩ = (0,0,1,0,0,0,0,0).
The computational basis for a single qubit (two dimensions) is |0⟩ = (1,0) and |1⟩ = (0,1).
Using the eigenvectors of the Pauli-x operator, a single qubit is |+⟩ = (1,1)/√2 and |−⟩ = (1,−1)/√2.

Operation

While a classical three-bit state and a quantum three-qubit state are both eight-dimensional vectors, they are manipulated quite differently for classical or quantum computation. For computing in either case, the system must be initialized, for example into the all-zeros string, |000⟩, corresponding to the vector (1,0,0,0,0,0,0,0). In classical randomized computation, the system evolves according to the application of stochastic matrices, which preserve the property that the probabilities add up to one (i.e., they preserve the L1 norm). In quantum computation, on the other hand, the allowed operations are unitary matrices, which are effectively rotations (they preserve the property that the sum of the squares of the magnitudes adds up to one, the Euclidean or L2 norm). (Exactly which unitaries can be applied depends on the physics of the quantum device.) Consequently, since rotations can be undone by rotating backward, quantum computations are reversible. (Technically, quantum operations can be probabilistic combinations of unitaries, so quantum computation really does generalize classical computation. See quantum circuit for a more precise formulation.)
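The contrast between stochastic and unitary evolution can be checked directly. A minimal sketch (Python with NumPy; the particular matrices are my own illustrative choices):

import numpy as np

# Classical randomized step on one bit: a stochastic matrix (each column sums to 1).
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])
p = np.array([1.0, 0.0])                          # definitely in state 0
print(np.sum(S @ p))                              # 1.0 -> the L1 norm (total probability) is preserved

# Quantum step on one qubit: the Hadamard gate, a unitary matrix.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ np.array([1, 0], dtype=complex)         # apply H to |0>
print(np.linalg.norm(psi))                        # 1.0 -> the L2 norm is preserved
print(np.allclose(H.conj().T @ H, np.eye(2)))     # unitarity: the step can be undone
print(np.allclose(H @ psi, [1, 0]))               # applying H again recovers |0> (H is its own inverse)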

Finally, upon termination of the algorithm, the result needs to be read off. In the case of a classical computer, we sample from the probability distribution on the three-bit register to obtain one definite three-bit string, say 000.
Quantum mechanically, we measure the three-qubit state, which is equivalent to collapsing the quantum state down to a classical distribution (with the coefficients in the classical state being the squared magnitudes of the coefficients for the quantum state, as described above), followed by sampling from that distribution. Note that this destroys the original quantum state. Many algorithms will only give the correct answer with a certain probability. However, by repeatedly initializing, running and measuring the quantum computer's results, the probability of getting the correct answer can be increased.

For more details on the sequences of operations used for various quantum algorithms, see universal quantum computer, Shor's algorithm, Grover's algorithm, Deutsch–Jozsa algorithm, amplitude amplification, quantum Fourier transform, quantum gate, quantum adiabatic algorithm and quantum error correction.

Potential

Integer factorization is believed to be computationally infeasible with an ordinary computer for large integers if they are the product of few prime numbers (e.g., products of two 300-digit primes).[12] By comparison, a quantum computer could efficiently solve this problem using Shor's algorithm to find its factors. This ability would allow a quantum computer to decrypt many of the cryptographic systems in use today, in the sense that there would be a polynomial time (in the number of digits of the integer) algorithm for solving the problem. In particular, most of the popular public key ciphers are based on the difficulty of factoring integers or the discrete logarithm problem, both of which can be solved by Shor's algorithm. In particular the RSA, Diffie-Hellman, and Elliptic curve Diffie-Hellman algorithms could be broken. These are used to protect secure Web pages, encrypted email, and many other types of data. Breaking these would have significant ramifications for electronic privacy and security.
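The link between period finding and factoring mentioned here can be seen classically on a toy example. In the sketch below (Python), the brute-force order-finding loop is exactly the exponentially expensive step that Shor's quantum algorithm replaces; the specific numbers are illustrative only.

from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r = 1 (mod N), found here by brute force."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N = 15                  # toy semiprime (the number factored in the 2001 NMR demonstration)
a = 7                   # any base coprime to N will do
r = order(a, N)         # quantum period finding would return this r efficiently
assert r % 2 == 0 and pow(a, r // 2, N) != N - 1
print(gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))   # prints 3 5, and 15 = 3 * 5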

However, other cryptographic algorithms do not appear to be broken by those algorithms.[13][14] Some public-key algorithms are based on problems other than the integer factorization and discrete logarithm problems to which Shor's algorithm applies, like the McEliece cryptosystem based on a problem in coding theory.[13][15] Lattice-based cryptosystems are also not known to be broken by quantum computers, and finding a polynomial time algorithm for solving the dihedral hidden subgroup problem, which would break many lattice-based cryptosystems, is a well-studied open problem.[16] It has been proven that applying Grover's algorithm to break a symmetric (secret key) algorithm by brute force requires time equal to roughly 2^(n/2) invocations of the underlying cryptographic algorithm, compared with roughly 2^n in the classical case,[17] meaning that symmetric key lengths are effectively halved: AES-256 would have the same security against an attack using Grover's algorithm that AES-128 has against classical brute-force search (see Key size). Quantum cryptography could potentially fulfill some of the functions of public key cryptography.

Besides factorization and discrete logarithms, quantum algorithms offering a more than polynomial speedup over the best known classical algorithm have been found for several problems,[18] including the simulation of quantum physical processes from chemistry and solid state physics, the approximation of Jones polynomials, and solving Pell's equation. No mathematical proof has been found that shows that an equally fast classical algorithm cannot be discovered, although this is considered unlikely. For some problems, quantum computers offer a polynomial speedup. The most well-known example of this is quantum database search, which can be solved by Grover's algorithm using quadratically fewer queries to the database than are required by classical algorithms. In this case the advantage is provable. Several other examples of provable quantum speedups for query problems have subsequently been discovered, such as for finding collisions in two-to-one functions and evaluating NAND trees.

Consider a problem that has these four properties:
  1. The only way to solve it is to guess answers repeatedly and check them,
  2. The number of possible answers to check is the same as the number of inputs,
  3. Every possible answer takes the same amount of time to check, and
  4. There are no clues about which answers might be better: generating possibilities randomly is just as good as checking them in some special order.
An example of this is a password cracker that attempts to guess the password for an encrypted file (assuming that the password has a maximum possible length).

For problems with all four properties, the time for a quantum computer to solve this will be proportional to the square root of the number of inputs. It can be used to attack symmetric ciphers such as Triple DES and AES by attempting to guess the secret key.[19]
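The square-root behavior claimed here can be seen in a small state-vector simulation of Grover's algorithm (a sketch with NumPy; the register size and the marked item are arbitrary choices of mine):

import numpy as np

n = 3                                    # qubits
N = 2 ** n                               # 8 candidate answers
marked = 5                               # index of the single "correct" answer

psi = np.full(N, 1 / np.sqrt(N))         # start in the uniform superposition

def oracle(v):
    w = v.copy()
    w[marked] *= -1                      # flip the sign of the marked amplitude
    return w

def diffusion(v):
    return 2 * np.mean(v) - v            # reflect every amplitude about the mean

iterations = int(np.pi / 4 * np.sqrt(N))          # ~ (pi/4) * sqrt(N) = 2 for N = 8
for _ in range(iterations):
    psi = diffusion(oracle(psi))

print(iterations, np.abs(psi[marked]) ** 2)       # 2 iterations give probability ~0.945 of the right answer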

Grover's algorithm can also be used to obtain a quadratic speed-up over a brute-force search for a class of problems known as NP-complete.

Since chemistry and nanotechnology rely on understanding quantum systems, and such systems are impossible to simulate in an efficient manner classically, many believe quantum simulation will be one of the most important applications of quantum computing.[20] Quantum simulation could also be used to simulate the behavior of atoms and particles at unusual conditions such as the reactions inside a collider.[21]

There are a number of technical challenges in building a large-scale quantum computer, and thus far quantum computers have yet to solve a problem faster than a classical computer. David DiVincenzo, of IBM, listed the following requirements for a practical quantum computer:[22]
  • scalable physically to increase the number of qubits;
  • qubits that can be initialized to arbitrary values;
  • quantum gates that are faster than decoherence time;
  • universal gate set;
  • qubits that can be read easily.

Quantum decoherence

One of the greatest challenges is controlling or removing quantum decoherence. This usually means isolating the system from its environment as interactions with the external world cause the system to decohere. However, other sources of decoherence also exist. Examples include the quantum gates, and the lattice vibrations and background nuclear spin of the physical system used to implement the qubits. Decoherence is irreversible, as it is non-unitary, and is usually something that should be highly controlled, if not avoided. Decoherence times for candidate systems, in particular the transverse relaxation time T2 (for NMR and MRI technology, also called the dephasing time), typically range between nanoseconds and seconds at low temperature.[11] Currently, some quantum computers require their qubits to be cooled to 20 millikelvin in order to prevent significant decoherence.[23]

These issues are more difficult for optical approaches as the timescales are orders of magnitude shorter and an often-cited approach to overcoming them is optical pulse shaping. Error rates are typically proportional to the ratio of operating time to decoherence time, hence any operation must be completed much more quickly than the decoherence time.

If the error rate is small enough, it is thought to be possible to use quantum error correction, which corrects errors due to decoherence, thereby allowing the total calculation time to be longer than the decoherence time. An often cited figure for the required error rate in each gate is 10^-4. This implies that each gate must be able to perform its task in one 10,000th of the decoherence time of the system.

Meeting this scalability condition is possible for a wide range of systems. However, the use of error correction brings with it the cost of a greatly increased number of required qubits. The number required to factor integers using Shor's algorithm is still polynomial, and thought to be between L and L^2, where L is the number of bits in the number to be factored; error correction algorithms would inflate this figure by an additional factor of L. For a 1000-bit number, this implies a need for about 10^4 qubits without error correction.[24] With error correction, the figure would rise to about 10^7 qubits. Note that computation time is about L^2 or about 10^7 steps, and at 1 MHz, about 10 seconds.
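As a sanity check on the arithmetic in this paragraph, here is a Python sketch that simply restates the article's own estimates rather than deriving them:

L = 1000                           # bits in the number to be factored
qubits_no_ec = 10 ** 4             # article's figure without error correction
qubits_with_ec = qubits_no_ec * L  # error correction costs roughly an extra factor of L
steps = 10 ** 7                    # article's estimate of the number of gate operations
clock_hz = 10 ** 6                 # a 1 MHz gate rate
print(qubits_with_ec)              # 10000000, i.e. about 10**7 qubits
print(steps / clock_hz, "seconds") # 10.0 seconds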

A very different approach to the stability-decoherence problem is to create a topological quantum computer with anyons, quasi-particles used as threads and relying on braid theory to form stable logic gates.[25][26]

Developments

There are a number of quantum computing models, distinguished by the basic elements in which the computation is decomposed. The four main models of practical importance are the quantum gate array, the one-way quantum computer, the adiabatic quantum computer, and the topological quantum computer.
The quantum Turing machine is theoretically important, but direct implementation of this model is not pursued. All four models of computation have been shown to be equivalent; each can simulate the other with no more than polynomial overhead.

For physically implementing a quantum computer, many different candidates are being pursued, distinguished by the physical system used to realize the qubits (among them superconducting circuits, trapped ions, photons, quantum dots, and nuclear spins).
The large number of candidates demonstrates that the topic, in spite of rapid progress, is still in its infancy; there is also a vast amount of flexibility.

Timeline

In 2001, researchers demonstrated Shor's algorithm to factor 15 using a 7-qubit NMR computer.[41]

In 2005, researchers at the University of Michigan built a semiconductor chip ion trap. Such devices, made using standard lithography techniques, may point the way to scalable quantum computing.[42]

In 2009, researchers at Yale University created the first solid-state quantum processor. The two-qubit superconducting chip had artificial atom qubits made of a billion aluminum atoms that acted like a single atom that could occupy two states.[43][44]

A team at the University of Bristol also created a silicon chip based on quantum optics, able to run Shor's algorithm.[45] Further developments were made in 2010.[46] Springer publishes a journal (Quantum Information Processing) devoted to the subject.[47]

In April 2011, a team of scientists from Australia and Japan made a breakthrough in quantum teleportation. They successfully transferred a complex set of quantum data with full transmission integrity, without affecting the qubits' superpositions.[48][49]

Photograph of a chip constructed by D-Wave Systems Inc., mounted and wire-bonded in a sample holder. The D-Wave processor is designed to use 128 superconducting logic elements that exhibit controllable and tunable coupling to perform operations.

In 2011, D-Wave Systems announced the first commercial quantum annealer, the D-Wave One, claiming a 128-qubit processor.[50] On May 25, 2011, Lockheed Martin agreed to purchase a D-Wave One system.[51] Lockheed and the University of Southern California (USC) will house the D-Wave One at the newly formed USC Lockheed Martin Quantum Computing Center.[52] D-Wave's engineers designed the chips with an empirical approach, focusing on solving particular problems. Investors liked this more than academics, who said D-Wave had not demonstrated that they really had a quantum computer. Criticism softened after a D-Wave paper in Nature showed that the chips have some quantum properties.[53][54] Experts remain skeptical of D-Wave's claims. Two published papers have concluded that the D-Wave machine operates classically, not via quantum computing.[55][56]

During the same year, researchers at the University of Bristol created an all-bulk optics system that ran a version of Shor's algorithm to successfully factor 21.[57]

In September 2011 researchers proved quantum computers can be made with a Von Neumann architecture (separation of RAM).[58]

In November 2011 researchers factorized 143 using 4 qubits.[59]

In February 2012 IBM scientists said that they had made several breakthroughs in quantum computing with superconducting integrated circuits.[60]

In April 2012, a multinational team of researchers from the University of Southern California, Delft University of Technology, the Iowa State University of Science and Technology, and the University of California, Santa Barbara, constructed a two-qubit quantum computer on a doped diamond crystal that can easily be scaled up and is functional at room temperature. The two logical qubits were encoded in the directions of the electron spin and the nitrogen nuclear spin, manipulated with microwave pulses. This computer ran Grover's algorithm, generating the right answer on the first try in 95% of cases.[61]

In September 2012, Australian researchers at the University of New South Wales said the world's first quantum computer was just 5 to 10 years away, after announcing a global breakthrough enabling manufacture of its memory building blocks. A research team led by Australian engineers created the first working qubit based on a single atom in silicon, invoking the same technological platform that forms the building blocks of modern day computers.[62] [63]

In October 2012, Nobel Prizes were presented to David J. Wineland and Serge Haroche for their basic work on understanding the quantum world, which may help make quantum computing possible.[64][65]

In November 2012, the first quantum teleportation from one macroscopic object to another was reported.[66][67]

In December 2012, the first dedicated quantum computing software company, 1QBit was founded in Vancouver, BC.[68] 1QBit is the first company to focus exclusively on commercializing software applications for commercially available quantum computers, including the D-Wave Two. 1QBit's research demonstrated the ability of superconducting quantum annealing processors to solve real-world problems.[69]

In February 2013, a new technique, boson sampling, was reported by two groups using photons in an optical lattice. While not a universal quantum computer, such a device may be good enough for practical problems (Science, Feb 15, 2013).

In May 2013, Google announced that it was launching the Quantum Artificial Intelligence Lab, hosted by NASA's Ames Research Center, with a 512-qubit D-Wave quantum computer. The USRA (Universities Space Research Association) will invite researchers to share time on it with the goal of studying quantum computing for machine learning.[70]

In early 2014 it was reported, based on documents provided by former NSA contractor Edward Snowden, that the U.S. National Security Agency (NSA) is running a $79.7 million research program (titled "Penetrating Hard Targets") to develop a quantum computer capable of breaking vulnerable encryption.[71]

In 2014, a group of researchers from ETH Zürich, USC, Google and Microsoft reported a definition of quantum speedup, and were not able to measure quantum speedup with the D-Wave Two device, but did not explicitly rule it out.[72][73]

In 2014, researchers at the University of New South Wales used silicon as a protectant shell around qubits, making them more accurate, increasing the length of time they hold information, and possibly making quantum computers easier to build.[74]

In April 2015 IBM scientists claimed two critical advances towards the realization of a practical quantum computer. They claimed the ability to detect and measure both kinds of quantum errors simultaneously, as well as a new, square quantum bit circuit design that could scale to larger dimensions. [75]

Relation to computational complexity theory

The suspected relationship of BQP to other problem spaces.[76]

The class of problems that can be efficiently solved by quantum computers is called BQP, for "bounded error, quantum, polynomial time". Quantum computers only run probabilistic algorithms, so BQP on quantum computers is the counterpart of BPP ("bounded error, probabilistic, polynomial time") on classical computers. It is defined as the set of problems solvable with a polynomial-time algorithm, whose probability of error is bounded away from one half.[77] A quantum computer is said to "solve" a problem if, for every instance, its answer will be right with high probability. If that solution runs in polynomial time, then that problem is in BQP.

BQP is contained in the complexity class #P (or more precisely in the associated class of decision problems P^#P),[78] which is a subclass of PSPACE.

BQP is suspected to be disjoint from NP-complete and a strict superset of P, but that is not known. Both integer factorization and discrete log are in BQP. Both of these problems are NP problems suspected to be outside BPP, and hence outside P. Both are suspected to not be NP-complete. There is a common misconception that quantum computers can solve NP-complete problems in polynomial time. That is not known to be true, and is generally suspected to be false.[78]

The capacity of a quantum computer to accelerate classical algorithms has rigid limits—upper bounds of quantum computation's complexity. The overwhelming part of classical calculations cannot be accelerated on a quantum computer.[79] A similar fact takes place for particular computational tasks, like the search problem, for which Grover's algorithm is optimal.[80]

Although quantum computers may be faster than classical computers, those described above can't solve any problems that classical computers can't solve, given enough time and memory (however, those amounts might be practically infeasible). A Turing machine can simulate these quantum computers, so such a quantum computer could never solve an undecidable problem like the halting problem. The existence of "standard" quantum computers does not disprove the Church–Turing thesis.[81] It has been speculated that theories of quantum gravity, such as M-theory or loop quantum gravity, may allow even faster computers to be built. Currently, defining computation in such theories is an open problem due to the problem of time, i.e., there currently exists no obvious way to describe what it means for an observer to submit input to a computer and later receive output.[82]

Representation of a Lie group

From Wikipedia, the free encyclopedia https://en.wikipedia.org/wiki/Representation_of_a_Lie_group...