
Friday, July 27, 2018

Informatics

From Wikipedia, the free encyclopedia

Informatics is a branch of information engineering. It involves the practice of information processing and the engineering of information systems, and as an academic field it is an applied form of information science. The field considers the interaction between humans and information alongside the construction of interfaces, organisations, technologies and systems. As such, the field of informatics has great breadth and encompasses many subspecialties, including disciplines of computer science, information systems, information technology and statistics. Since the advent of computers, individuals and organizations increasingly process information digitally. This has led to the study of informatics with computational, mathematical, biological, cognitive and social aspects, including study of the social impact of information technologies.

Etymology

In 1956 the German computer scientist Karl Steinbuch coined the word Informatik by publishing a paper called Informatik: Automatische Informationsverarbeitung ("Informatics: Automatic Information Processing").[1] The English term Informatics is sometimes understood as meaning the same as computer science. The German word Informatik is usually translated to English as computer science.
The French term informatique was coined in 1962 by Philippe Dreyfus,[2] together with various translations: informatics (English), also proposed independently and simultaneously by Walter F. Bauer and associates, who co-founded Informatics Inc., and informatica (Italian, Spanish, Romanian, Portuguese, Dutch), all referring to the application of computers to store and process information.

The term was coined as a combination of "information" and "automatic" to describe the science of automating information interactions. The morphology—informat-ion + -ics—uses "the accepted form for names of sciences, as conics, linguistics, optics, or matters of practice, as economics, politics, tactics",[3] and so, linguistically, the meaning extends easily to encompass both the science of information and the practice of information processing.

History

The culture of library science, with its policies and procedures for managing information, fostered the relationship between library science and the development of information science, and in turn laid groundwork for health informatics, whose origins are traced to the 1950s with the beginning of computer use in healthcare (Nelson & Staggers p. 4). Early practitioners found that no formal education programs existed to train them in informatics until the late 1960s and early 1970s; the professional development programs that then emerged played a significant role in the growth of health informatics (Nelson & Staggers p. 7). According to Imhoff et al. (2001), healthcare informatics is not only the application of computer technology to problems in healthcare but covers all aspects of the generation, handling, communication, storage, retrieval, management, analysis, discovery, and synthesis of data, information, and knowledge in the entire scope of healthcare. Furthermore, they stated that the primary goals of health informatics can be distinguished as follows: to provide solutions for problems related to data, information, and knowledge processing, and to study the general principles of processing data, information, and knowledge in medicine and healthcare.
This new term was adopted across Western Europe, and, except in English, developed a meaning roughly translated by the English ‘computer science’, or ‘computing science’. Mikhailov advocated the Russian term informatika (1966), and the English informatics (1967), as names for the theory of scientific information, and argued for a broader meaning, including study of the use of information technology in various communities (for example, scientific) and of the interaction of technology and human organizational structures.
Informatics is the discipline of science which investigates the structure and properties (not specific content) of scientific information, as well as the regularities of scientific information activity, its theory, history, methodology and organization.[4]
Usage has since modified this definition in three ways. First, the restriction to scientific information is removed, as in business informatics or legal informatics. Second, since most information is now digitally stored, computation is now central to informatics. Third, the representation, processing and communication of information are added as objects of investigation, since they have been recognized as fundamental to any scientific account of information. Taking information as the central focus of study distinguishes informatics from computer science. Informatics includes the study of biological and social mechanisms of information processing whereas computer science focuses on the digital computation. Similarly, in the study of representation and communication, informatics is indifferent to the substrate that carries information. For example, it encompasses the study of communication using gesture, speech and language, as well as digital communications and networking.

In the English-speaking world the term informatics was first widely used in the compound medical informatics, taken to include "the cognitive, information processing, and communication tasks of medical practice, education, and research, including information science and the technology to support these tasks".[5] Many such compounds are now in use; they can be viewed as different areas of "applied informatics". Indeed, "In the U.S., however, informatics is linked with applied computing, or computing in the context of another domain."[6]

Informatics encompasses the study of systems that represent, process, and communicate information. However, the theory of computation in the specific discipline of theoretical computer science, which evolved from the work of Alan Turing, studies the notion of a complex system regardless of whether or not information actually exists. Since both fields process information, there is some disagreement among scientists over the field hierarchy; for example, Arizona State University attempted to adopt a broader definition of informatics, one encompassing even cognitive science, at the launch of its School of Computing and Informatics in September 2006.

A broad interpretation of informatics, as "the study of the structure, algorithms, behaviour, and interactions of natural and artificial computational systems," was introduced by the University of Edinburgh in 1994 when it formed the grouping that is now its School of Informatics. This meaning is now (2006) increasingly used in the United Kingdom.[7]

The 2008 Research Assessment Exercise, of the UK Funding Councils, includes a new, Computer Science and Informatics, unit of assessment (UoA),[8] whose scope is described as follows:
The UoA includes the study of methods for acquiring, storing, processing, communicating and reasoning about information, and the role of interactivity in natural and artificial systems, through the implementation, organisation and use of computer hardware, software and other resources. The subjects are characterised by the rigorous application of analysis, experimentation and design.

Academic schools and departments

Academic research in the informatics area can be found in a number of disciplines, such as computer science, information technology, information and computer science, information systems, business information management and health informatics.

In France, the first degree level qualifications in Informatics (computer science) appeared in the mid-1960s.[citation needed]

In English-speaking countries, the first example of a degree-level qualification in informatics occurred in 1982, when Plymouth Polytechnic (now the University of Plymouth) offered a four-year BSc (Honours) degree in Computing and Informatics, with an initial intake of only 35 students. The course still runs today,[9] making it the longest-running qualification in the subject.

At the Indiana University School of Informatics, Computing, and Engineering (Bloomington, Indianapolis and Southeast), informatics is defined as "the art, science and human dimensions of information technology" and "the study, application, and social consequences of technology." It is also defined in Informatics 101, Introduction to Informatics as "the application of information technology to the arts, sciences, and professions." These definitions are widely accepted in the United States, and differ from British usage in omitting the study of natural computation.

Texas Woman's University places its informatics degrees in its department of Mathematics and Computer Science within the College of Arts & Sciences, though it also offers interdisciplinary Health Informatics degrees.[10] Informatics is presented in a generalist framework, as evidenced by the university's definition of informatics ("Using technology and data analytics to derive meaningful information from data for data and decision driven practice in user centered systems"), and TWU is also known for its nursing and health informatics programs.

At the University of California, Irvine Department of Informatics, informatics is defined as "the interdisciplinary study of the design, application, use and impact of information technology. The discipline of informatics is based on the recognition that the design of this technology is not solely a technical matter, but must focus on the relationship between the technology and its use in real-world settings. That is, informatics designs solutions in context, and takes into account the social, cultural and organizational settings in which computing and information technology will be used."

At the University of Michigan, Ann Arbor Informatics interdisciplinary major, informatics is defined as "the study of information and the ways information is used by and affects human beings and social systems. The major involves coursework from the College of Literature, Science and the Arts, where the Informatics major is housed, as well as the School of Information and the College of Engineering. Key to this growing field is that it applies both technological and social perspectives to the study of information. Michigan's interdisciplinary approach to teaching Informatics gives a solid grounding in contemporary computer programming, mathematics, and statistics, combined with study of the ethical and social science aspects of complex information systems. Experts in the field help design new information technology tools for specific scientific, business, and cultural needs." Michigan offers four curricular tracks within the informatics degree to provide students with increased expertise. These four track topics include:[11]
  • Internet Informatics: An applied track in which students experiment with technologies behind Internet-based information systems and acquire skills to map problems to deployable Internet-based solutions. This track will replace Computational Informatics in Fall 2013.[12]
  • Data Mining & Information Analysis: Integrates the collection, analysis, and visualization of complex data and its critical role in research, business, and government to provide students with practical skills and a theoretical basis for approaching challenging data analysis problems.
  • Life Science Informatics: Examines artificial information systems, which have helped scientists make great progress in identifying core components of organisms and ecosystems.
  • Social Computing: Advances in computing have created opportunities for studying patterns of social interaction and developing systems that act as introducers, recommenders, coordinators, and record-keepers. Students in this track craft, evaluate, and refine social software applications for engaging technology in unique social contexts. This track will be phased out in Fall 2013 in favor of the new Bachelor of Science in Information, the first undergraduate degree offered by the School of Information since its founding in 1996. The School of Information already offers a master's program, a doctoral program, and a professional master's program in conjunction with the School of Public Health. The BS in Information at the University of Michigan will be the first curriculum program of its kind in the United States, with the first graduating class to emerge in 2015. Students will be able to apply for the degree in 2013 for the Fall 2014 semester; the new degree grows out of the most popular track, Social Computing, in the current Informatics interdisciplinary major in LSA. Applications will be open to upperclassmen (juniors and seniors), with a variety of information classes available for first- and second-year students to gauge interest in the field. The degree was approved by the University on June 11, 2012.[13] Along with the new degree in the School of Information, the first and only chapter of an informatics professional fraternity, Kappa Theta Pi, was chartered in Fall 2012.[14]
At the University of Washington, Seattle Informatics Undergraduate Program, Informatics is an undergraduate program offered by the Information School. Bachelor of Science in Informatics is described as "[a] program that focuses on computer systems from a user-centered perspective and studies the structure, behavior and interactions of natural and artificial systems that store, process and communicate information. Includes instruction in information sciences, human computer interaction, information system analysis and design, telecommunications structure and information architecture and management." Washington offers three degree options as well as a custom track.[15]
  • Data Science Option: Data Science is an emerging interdisciplinary field that works to extract knowledge or insight from data. It combines fields such as information science, computer science, statistics, design, and social science.
  • Human-Computer Interaction: The iSchool’s work in human-computer interaction (HCI) strives to make information and computing useful, usable, and accessible to all. The Informatics HCI option allows students to blend their technical skills and expertise with a broader perspective on how design and development work impacts users. Courses explore the design, construction, and evaluation of interactive technologies for use by individuals, groups, and organizations, as well as the social implications of these systems. This work encompasses user interfaces, accessibility concerns, new design techniques and methods for interactive systems, and collaboration. Coursework also examines the values implicit in the design and development of technology.
  • Information Architecture: Information architecture (IA) is a crucial component in the development of successful Web sites, software, intranets, and online communities. Architects structure the underlying information and its presentation in a logical and intuitive way so that people can put information to use. As an Informatics major with an IA option, one will master the skills needed to organize and label information for improved navigation and search. One will build frameworks to effectively collect, store and deliver information. One will also learn to design the databases and XML storehouses that drive complex and interactive websites, including the navigation, content layout, personalization, and transactional features of the site.
  • Information Assurance and Cybersecurity: Information Assurance and Cybersecurity (IAC) is the practice of creating and managing safe and secure systems. It is crucial for organizations public and private, large and small. In the IAC option, one will be equipped with the knowledge to create, deploy, use, and manage systems that preserve individual and organizational privacy and security. This tri-campus concentration leverages the strengths of the Information School, the Computing and Software Systems program at UW Bothell, and the Institute of Technology at UW Tacoma. After a course in the technical, policy, and management foundations of IAC, one may take electives at any campus to learn such specialties as information assurance policy, secure coding, or networking and systems administration.
  • Custom (Student-Designed Concentration): Students may choose to develop their own concentration, with approval from the academic adviser. Student-designed concentrations are created out of a list of approved courses and also result in the Bachelor of Science degree.

Applied disciplines

Organizational informatics

One of the most significant areas of application of informatics is that of organizational informatics. Organizational informatics is fundamentally interested in the application of information, information systems and ICT within organisations of various forms, including private sector, public sector and voluntary sector organisations.[16][17] As such, organisational informatics can be seen as a sub-category of social informatics and a super-category of business informatics. Organizational informatics is also present in the computer science and information technology industries.

Discovering new drugs and materials by ‘touching’ molecules in virtual reality

Scientists can now visualize and experiment with structures and dynamics of complex molecular structures (at atomic-level precision), with real-time multi-user collaboration via the cloud
July 6, 2018
Original link:  http://www.kurzweilai.net/discovering-new-drugs-and-materials-by-touching-molecules-in-virtual-reality
To figure out how to block a bacterium's attempt to develop multi-resistance to antibiotics, a researcher grabs a simulated ligand (binding molecule) — a type of penicillin called benzylpenicillin (red) — and interactively guides that molecule to dock within a larger enzyme molecule (blue-orange) called β-lactamase, which is produced by bacteria in an attempt to disable penicillin (making the bacteria resistant to a class of antibiotics called β-lactams). (credit: University of Bristol)

University of Bristol researchers have designed and tested a new cloud-based virtual reality (VR) system intended to allow researchers to reach out and “touch” molecules as they move: folding them, knotting them, plucking them, and changing their shape to test how the molecules interact. The system, which uses an HTC Vive virtual-reality headset, could aid the creation of new drugs and materials and improve the teaching of chemistry.

More broadly, the goal is to accelerate progress in nanoscale molecular engineering areas that include conformational mapping, drug development, synthetic biology, and catalyst design.

Real-time collaboration via the cloud

Two users passing a fullerene (C60) molecule back and forth in real time over a cloud-based network. The researchers are each wearing a VR head-mounted display (HMD) and holding two small wireless controllers that function as atomic “tweezers” to manipulate the real-time molecular dynamic of the C60 molecule. Each user’s position is determined using a real-time optical tracking system composed of synchronized infrared light sources, running locally on a GPU-accelerated computer. (credit: University of Bristol)

The multi-user system, developed by a team led by University of Bristol chemists and computer scientists, uses an “interactive molecular dynamics virtual reality” (iMD VR) app that allows users to visualize and sample (with atomic-level precision) the structures and dynamics of complex molecular structures “on the fly” and to interact with other users in the same virtual environment.

Because each VR client has access to global position data of all other users, any user can see through his/her headset a co-located visual representation of all other users at the same time. So far, the system has uniquely allowed for simultaneously co-locating six users in the same room within the same simulation.

Testing on challenging molecular tasks

The team designed a series of molecular tasks for testing, comparing traditional mouse, keyboard, and touchscreen interfaces with virtual reality. The tasks included threading a small molecule through a nanotube, changing the screw-sense of a small organic helix, and tying a small string-like protein into a simple knot, along with a variety of dynamic molecular problems, such as binding a drug to its target, protein folding, and chemical reactions. The researchers found that for complex 3D tasks, VR offers a significant advantage over current methods: for example, participants were ten times more likely to succeed in difficult tasks such as molecular knot tying.

Anyone can try out the tasks described in the open-access paper by downloading the software and launching their own cloud-hosted session.

David Glowacki | This video, made by University of Bristol PhD student Helen M. Deeks, shows the actions she took using a wireless set of “atomic tweezers” (using the HTC Vive) to interactively dock a single benzylpenicillin drug molecule into the active site of the β-lactamase enzyme. 

David Glowacki | The video shows the cloud-mounted virtual reality framework, with several different views overlaid to give a sense of how the interaction works. The video outlines the four different parts of the user studies: (1) manipulation of buckminsterfullerene, enabling users to familiarize themselves with the interactive controls; (2) threading a methane molecule through a nanotube; (3) changing the screw-sense of a helicene molecule; and (4) tying a trefoil knot in 17-Alanine.

Ref: Science Advances (open-access). Source: University of Bristol.

How to predict the side effects of millions of drug combinations

Stanford Univ. computer scientists have figured it out, using artificial intelligence.
July 17, 2018
Original link:  http://www.kurzweilai.net/how-to-predict-the-side-effects-of-millions-of-drug-combinations
An example graph of polypharmacy side effects derived from genomic and patient population data, protein–protein interactions, drug–protein targets, and drug–drug interactions encoded by 964 different polypharmacy side effects. The graph representation is used to develop Decagon. (credit: Marinka Zitnik et al./Bioinformatics)

Millions of people take up to five or more medications a day, but doctors have no idea what side effects might arise from adding another drug.*

Now, Stanford University computer scientists have developed a deep-learning system (a kind of AI modeled after the brain) called Decagon** that could help doctors make better decisions about which drugs to prescribe. It could also help researchers find better combinations of drugs to treat complex diseases.

The problem is that with so many drugs currently on the U.S. pharmaceutical market, “it’s practically impossible to test a new drug in combination with all other drugs, because just for one drug, that would be five thousand new experiments,” said Marinka Zitnik, a postdoctoral fellow in computer science and lead author of a paper presented July 10 at the 2018 meeting of the International Society for Computational Biology.

With some new drug combinations (“polypharmacy”), she said, “truly we don’t know what will happen.”

How proteins interact and how different drugs affect these proteins

So Zitnik and associates created a network describing how the more than 19,000 proteins in our bodies interact with each other and how different drugs affect these proteins. Using more than 4 million known associations between drugs and side effects, the team then designed a method to identify patterns in how side effects arise, based on how drugs target different proteins, and also to infer patterns about drug-interaction side effects.***

Based on that method, the system could predict the consequences of taking two drugs together.
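Decagon itself is a graph convolutional neural network trained on the full drug–protein–side-effect graph, as described in the paper. As a loose, hypothetical illustration of the underlying intuition (drugs that hit overlapping sets of protein targets are more likely to interact through a shared pathway), here is a toy shared-target score in Python. All drug and protein names below are invented for the example; this is not Decagon's actual method or data.

```python
# Toy drug-to-protein-target map; every name here is hypothetical,
# invented only to illustrate the shared-target idea.
drug_targets = {
    'drugA': {'P1', 'P2', 'P3'},
    'drugB': {'P2', 'P3', 'P4'},
    'drugC': {'P5'},
}

def shared_target_score(d1, d2):
    """Jaccard overlap of protein targets: a crude proxy for how likely
    two drugs are to interact through a common protein pathway."""
    t1, t2 = drug_targets[d1], drug_targets[d2]
    return len(t1 & t2) / len(t1 | t2)

print(shared_target_score('drugA', 'drugB'))  # 2 shared of 4 total -> 0.5
print(shared_target_score('drugA', 'drugC'))  # no overlap -> 0.0
```

Decagon goes far beyond this: it learns embeddings over the whole heterogeneous graph and predicts *which* of the 964 side effects a pair is likely to cause, not just whether the drugs interact.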

To evaluate the system, the group looked to see if its predictions came true. In many cases, they did. For example, there was no indication in the original data that the combination of atorvastatin (marketed under the trade name Lipitor, among others), a cholesterol drug, and amlodipine (Norvasc), a blood-pressure medication, could lead to muscle inflammation. Yet Decagon predicted that it would, and it was right.

In the future, the team members hope to extend their results to interactions among more than two drugs. They also hope to create a more user-friendly tool to give doctors guidance on whether it’s a good idea to prescribe a particular drug to a particular patient, and to help researchers develop drug regimens for complex diseases with fewer side effects.

Ref.: Bioinformatics (open access). Source: Stanford University.

* More than 23 percent of Americans took three or more prescription drugs in the past 30 days, according to a 2017 CDC estimate. Furthermore, 39 percent over age 65 take five or more, a number that’s increased three-fold in the last several decades. There are about 1,000 known side effects and 5,000 drugs on the market, making for nearly 125 billion possible side effects between all possible pairs of drugs. Most of these have never been prescribed together, let alone systematically studied, according to the Stanford researchers.

** In geometry, a decagon is a ten-sided polygon.

*** The research was supported by the National Science Foundation, the National Institutes of Health, the Defense Advanced Research Projects Agency, the Stanford Data Science Initiative, and the Chan Zuckerberg Biohub.

Quantum algorithm

From Wikipedia, the free encyclopedia
 
In quantum computing, a quantum algorithm is an algorithm which runs on a realistic model of quantum computation, the most commonly used model being the quantum circuit model of computation. A classical (or non-quantum) algorithm is a finite sequence of instructions, or a step-by-step procedure for solving a problem, where each step or instruction can be performed on a classical computer. Similarly, a quantum algorithm is a step-by-step procedure, where each of the steps can be performed on a quantum computer. Although all classical algorithms can also be performed on a quantum computer, the term quantum algorithm is usually used for those algorithms which seem inherently quantum, or use some essential feature of quantum computation such as quantum superposition or quantum entanglement.

Problems which are undecidable using classical computers remain undecidable using quantum computers.[4]:127 What makes quantum algorithms interesting is that they might be able to solve some problems faster than classical algorithms.

The best-known quantum algorithms are Shor's algorithm for factoring and Grover's algorithm for searching an unstructured database or an unordered list. Shor's algorithm runs exponentially faster than the best known classical algorithm for factoring, the general number field sieve[citation needed]. Grover's algorithm runs quadratically faster than the best possible classical algorithm for the same task[citation needed], a linear search.

Overview

Quantum algorithms are usually described, in the commonly used circuit model of quantum computation, by a quantum circuit which acts on some input qubits and terminates with a measurement. A quantum circuit consists of simple quantum gates, each of which acts on at most a fixed number of qubits, usually two or three. Quantum algorithms may also be stated in other models of quantum computation, such as the Hamiltonian oracle model.[5]

Quantum algorithms can be categorized by the main techniques used by the algorithm. Some commonly used techniques/ideas in quantum algorithms include phase kick-back, phase estimation, the quantum Fourier transform, quantum walks, amplitude amplification and topological quantum field theory. Quantum algorithms may also be grouped by the type of problem solved, for instance see the survey on quantum algorithms for algebraic problems.[6]

Algorithms based on the quantum Fourier transform

The quantum Fourier transform is the quantum analogue of the discrete Fourier transform and is used in several quantum algorithms. The Hadamard transform is an example of a quantum Fourier transform over an n-dimensional vector space over the field F2. The quantum Fourier transform can be implemented efficiently on a quantum computer using only a polynomial number of quantum gates[citation needed].
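Concretely, on n qubits the QFT is the N×N unitary matrix with entries ω^(jk)/√N, where N = 2^n and ω = e^(2πi/N). A minimal numpy sketch (a matrix-level illustration, not a gate-level implementation):

```python
import numpy as np

def qft_matrix(n_qubits):
    """Return the quantum Fourier transform as an N x N unitary matrix,
    N = 2**n_qubits, with entries omega**(j*k) / sqrt(N)."""
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)
# The QFT is unitary: F^dagger F = I
assert np.allclose(F.conj().T @ F, np.eye(8))
# On a single qubit the QFT reduces to the Hadamard gate
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
assert np.allclose(qft_matrix(1), H)
```

The polynomial gate count comes from decomposing this matrix into Hadamard and controlled-phase gates; the dense matrix above is only for checking the definition.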

Deutsch–Jozsa algorithm

The Deutsch–Jozsa algorithm solves a black-box problem which probably requires exponentially many queries to the black box for any deterministic classical computer, but can be done with exactly one query by a quantum computer. If we allow both bounded-error quantum and classical algorithms, then there is no speedup since a classical probabilistic algorithm can solve the problem with a constant number of queries with small probability of error. The algorithm determines whether a function f is either constant (0 on all inputs or 1 on all inputs) or balanced (returns 1 for half of the input domain and 0 for the other half).
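The single-query behaviour is easy to verify in a small state-vector simulation. The sketch below (illustrative, using a phase oracle) applies Hadamards to every qubit, makes one oracle query, applies Hadamards again, and reads off the probability of measuring |0...0>, which is 1 for a constant f and 0 for a balanced f:

```python
import numpy as np
from functools import reduce

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def deutsch_jozsa(f, n):
    """Decide whether f: {0, ..., 2^n - 1} -> {0, 1} is constant or balanced
    using a single query to a phase oracle (-1)**f(x)."""
    N = 2 ** n
    Hn = reduce(np.kron, [H] * n)            # Hadamard on every qubit
    state = np.zeros(N); state[0] = 1.0      # start in |0...0>
    state = Hn @ state                       # uniform superposition
    state *= np.array([(-1) ** f(x) for x in range(N)])  # one oracle query
    state = Hn @ state
    # Probability of measuring |0...0> is 1 for constant f, 0 for balanced f
    return 'constant' if abs(state[0]) ** 2 > 0.5 else 'balanced'

print(deutsch_jozsa(lambda x: 1, 3))      # constant
print(deutsch_jozsa(lambda x: x & 1, 3))  # balanced (last bit)
```

The simulation costs exponential classical memory, of course; the point of the algorithm is that a quantum computer gets the answer with one oracle call.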

Simon's algorithm

Simon's algorithm solves a black-box problem exponentially faster than any classical algorithm, including bounded-error probabilistic algorithms. This algorithm, which achieves an exponential speedup over all classical algorithms that we consider efficient, was the motivation for Shor's factoring algorithm.

Quantum phase estimation algorithm

The quantum phase estimation algorithm is used to determine the eigenphase of an eigenvector of a unitary gate given a quantum state proportional to the eigenvector and access to the gate. The algorithm is frequently used as a subroutine in other algorithms.
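For an eigenvalue e^(2πiθ), a t-qubit register ends up with amplitude (1/T) Σ_j e^(2πij(θ − k/T)) on measurement outcome k, where T = 2^t; the measured k/T is the best t-bit approximation of θ with high probability, and exactly θ when θ is a multiple of 1/T. A small analytic simulation of that outcome distribution (not a gate-level circuit):

```python
import numpy as np

def phase_estimation_probs(theta, t):
    """Outcome distribution of t-qubit phase estimation on an eigenstate
    with eigenvalue exp(2*pi*i*theta), computed analytically: the
    amplitude on outcome k is (1/T) * sum_j exp(2*pi*i*j*(theta - k/T))."""
    T = 2 ** t
    j = np.arange(T)
    probs = np.array([abs(np.sum(np.exp(2j * np.pi * j * (theta - k / T)))) ** 2
                      for k in range(T)]) / T ** 2
    return probs

probs = phase_estimation_probs(theta=0.375, t=4)  # 0.375 = 6/16 exactly
assert np.argmax(probs) == 6                      # estimate k/2^t = 6/16
assert np.isclose(probs[6], 1.0)                  # exact when theta = k/2^t
```

When θ is not an exact multiple of 1/T, the distribution concentrates on the two nearest values of k instead of a single spike.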

Shor's algorithm

Shor's algorithm solves the discrete logarithm problem and the integer factorization problem in polynomial time,[7] whereas the best known classical algorithms take super-polynomial time. These problems are not known to be in P or NP-complete. It is also one of the few quantum algorithms that solves a non–black-box problem in polynomial time where the best known classical algorithms run in super-polynomial time.

Hidden subgroup problem

The abelian hidden subgroup problem is a generalization of many problems that can be solved by a quantum computer, such as Simon's problem, solving Pell's equation, testing the principal ideal of a ring R, and factoring. Efficient quantum algorithms are known for the abelian hidden subgroup problem.[8] The more general hidden subgroup problem, where the group is not necessarily abelian, is a generalization of the previously mentioned problems, as well as graph isomorphism and certain lattice problems. Efficient quantum algorithms are known for certain non-abelian groups. However, no efficient algorithms are known for the symmetric group, which would give an efficient algorithm for graph isomorphism,[9] or for the dihedral group, which would solve certain lattice problems.[10]

Boson sampling problem

The boson sampling problem in an experimental configuration assumes[11] an input of a moderate number of bosons (e.g., photons) that are randomly scattered into a large number of output modes, constrained by a defined unitary transformation. The problem is then to produce a fair sample of the probability distribution of the output, which depends on the input arrangement of bosons and the unitary.[12] Solving this problem with a classical computer algorithm requires computing the permanent of the unitary transform matrix, which may be either impossible or take a prohibitively long time. In 2014, it was proposed[13] that existing technology and standard probabilistic methods of generating single-photon states could be used as input into a suitable quantum-computable linear optical network, and that sampling of the output probability distribution would be demonstrably superior using quantum algorithms. In 2015, an investigation predicted[14] that the sampling problem had similar complexity for inputs other than Fock-state photons, and identified a transition in computational complexity from classically simulatable to just as hard as the boson sampling problem, depending on the size of coherent amplitude inputs.
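The permanent mentioned above is defined like the determinant but without the alternating signs, and that missing sign structure is what makes it classically hard: the best exact classical methods (e.g., Ryser's formula) still take time exponential in n. A brute-force sketch for small matrices:

```python
import numpy as np
from itertools import permutations

def permanent(M):
    """Permanent by brute-force expansion over all permutations (O(n! * n)).
    Identical to the determinant's expansion except every term is added
    with a plus sign; no polynomial-time classical algorithm is known."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

M = np.array([[1, 2], [3, 4]])
print(permanent(M))  # 1*4 + 2*3 = 10 (the determinant would be -2)
```

In boson sampling, output probabilities are proportional to squared magnitudes of permanents of submatrices of the interferometer's unitary, which is why classical simulation is believed to be intractable at scale.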

Estimating Gauss sums

A Gauss sum is a type of exponential sum. The best known classical algorithm for estimating these sums takes exponential time. Since the discrete logarithm problem reduces to Gauss sum estimation, an efficient classical algorithm for estimating Gauss sums would imply an efficient classical algorithm for computing discrete logarithms, which is considered unlikely. However, quantum computers can estimate Gauss sums to polynomial precision in polynomial time.[15]
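For concreteness, the quadratic Gauss sum for an odd prime p can be evaluated by direct summation, as in the sketch below (my own illustration). Note the brute-force sum has p terms, so it is exponential in the bit length of p; it only illustrates the object being estimated:

```python
import cmath
import math

def quadratic_gauss_sum(p):
    """g(p) = sum over n of exp(2*pi*i*n^2/p), for n = 0..p-1."""
    return sum(cmath.exp(2j * cmath.pi * n * n / p) for n in range(p))

# Gauss showed |g(p)| = sqrt(p) for odd prime p:
# g(p) = sqrt(p) if p % 4 == 1, and i*sqrt(p) if p % 4 == 3.
for p in (5, 7, 13):
    print(p, abs(quadratic_gauss_sum(p)), math.sqrt(p))
```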

Fourier fishing and Fourier checking

We are given an oracle consisting of n random Boolean functions mapping n-bit strings to a Boolean value, and are required to find n n-bit strings z_1, ..., z_n such that, for the Hadamard-Fourier transform f̃, at least 3/4 of the strings satisfy |f̃(z_i)| ≥ 1 and at least 1/4 satisfy |f̃(z_i)| ≥ 2. This can be done in BQP.[16]

Algorithms based on amplitude amplification

Amplitude amplification is a technique that allows the amplification of a chosen subspace of a quantum state. Applications of amplitude amplification usually lead to quadratic speedups over the corresponding classical algorithms. It can be considered to be a generalization of Grover's algorithm.

Grover's algorithm

Grover's algorithm searches an unstructured database (or an unordered list) with N entries for a marked entry, using only O(√N) queries instead of the O(N) queries required classically, even when bounded-error probabilistic algorithms are allowed.[17]
Bohmian mechanics is a non-local hidden-variable interpretation of quantum mechanics. It has been shown that a non-local hidden-variable quantum computer could search an N-item database in at most O(∛N) steps, slightly faster than the O(√N) steps taken by Grover's algorithm. Neither search method would allow quantum computers to solve NP-complete problems in polynomial time.[18]
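Because Grover's algorithm acts on a 2ⁿ-dimensional state vector, its behavior on small instances can be reproduced exactly with a dense-vector simulation (a classical simulation of the algorithm, not a quantum implementation; this toy example is my own). Each iteration applies the oracle's phase flip followed by the "inversion about the mean" diffusion step:

```python
import numpy as np

def grover_search(n_qubits, marked, iterations):
    """Statevector simulation of Grover's algorithm on N = 2^n basis states."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    for _ in range(iterations):
        state[marked] *= -1                    # oracle: phase-flip the marked entry
        state = 2 * state.mean() - state       # diffusion: reflect about the mean
    return np.abs(state) ** 2                  # measurement probabilities

n, marked = 3, 5
k = int(np.floor(np.pi / 4 * np.sqrt(2 ** n)))   # ~ (pi/4)*sqrt(N) iterations
probs = grover_search(n, marked, k)
print(probs[marked])   # high (≈ 0.945), versus 1/8 for a single classical guess
```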

Quantum counting

Quantum counting solves a generalization of the search problem: counting the number of marked entries in an unordered list, instead of just detecting whether one exists. Specifically, it counts the number of marked entries in an N-element list, with error ε, making only Θ((1/ε)·√(N/k)) queries, where k is the number of marked elements in the list.[19][20] More precisely, the algorithm outputs an estimate k′ for k, the number of marked entries, with accuracy |k − k′| ≤ εk.

Algorithms based on quantum walks

A quantum walk is the quantum analogue of a classical random walk, which can be described by a probability distribution over some states. A quantum walk can be described by a quantum superposition over states. Quantum walks are known to give exponential speedups for some black-box problems.[21][22] They also provide polynomial speedups for many problems. A framework for the creation of quantum walk algorithms exists and is quite a versatile tool.[23]

Element distinctness problem

The element distinctness problem is the problem of determining whether all the elements of a list are distinct. Classically, Ω(N) queries are required for a list of size N, since this problem is harder than the search problem, which requires Ω(N) queries. However, it can be solved in Θ(N^(2/3)) queries on a quantum computer. The optimal algorithm is by Andris Ambainis.[24] Yaoyun Shi first proved a tight lower bound when the size of the range is sufficiently large.[25] Ambainis[26] and Kutin[27] independently (and via different proofs) extended his work to obtain the lower bound for all functions.
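For contrast, the straightforward classical solution is a single pass with a hash set, one oracle query per element, matching the classical Ω(N) bound; Ambainis's quantum walk algorithm beats this with Θ(N^(2/3)) queries. A small sketch (names are my own):

```python
def all_distinct(get_item, N):
    """Classical element distinctness via a hash set: up to N oracle queries,
    matching the classical Omega(N) lower bound. The quantum walk algorithm
    of Ambainis needs only Theta(N^(2/3)) queries."""
    seen = set()
    queries = 0
    for i in range(N):
        x = get_item(i)
        queries += 1
        if x in seen:
            return False, queries    # early exit on the first repeat
        seen.add(x)
    return True, queries

data = [4, 8, 15, 16, 23, 42]
print(all_distinct(lambda i: data[i], len(data)))   # (True, 6)
```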

Triangle-finding problem

The triangle-finding problem is the problem of determining whether a given graph contains a triangle (a clique of size 3). The best-known lower bound for quantum algorithms is Ω(N), but the best algorithm known requires O(N^1.297) queries,[28] an improvement over the previous best of O(N^1.3) queries.[23][29]

Formula evaluation

A formula is a tree with a gate at each internal node and an input bit at each leaf node. The problem is to evaluate the formula, which is the output of the root node, given oracle access to the input.

A well-studied formula is the balanced binary tree with only NAND gates.[30] This type of formula requires Θ(N^c) queries using randomness,[31] where c = log_2((1+√33)/4) ≈ 0.754. With a quantum algorithm, however, it can be solved in Θ(N^0.5) queries. No better quantum algorithm for this case was known until one was found for the unconventional Hamiltonian oracle model.[5] The same result for the standard setting soon followed.[32]
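The randomized algorithm achieving the Θ(N^0.754) bound is, in essence, depth-first evaluation with short-circuiting: evaluate the two children of a NAND gate in random order, and skip the second whenever the first already determines the gate's value. A sketch of that idea (my own illustration, not the exact analysis from the cited work):

```python
import random

def nand_full(tree):
    """Reference evaluation that reads every leaf."""
    if isinstance(tree, int):
        return tree
    left, right = tree
    return 1 - (nand_full(left) & nand_full(right))

def nand_lazy(tree, queries):
    """Randomized short-circuit evaluation; queries[0] counts leaf reads."""
    if isinstance(tree, int):
        queries[0] += 1               # one oracle query per leaf read
        return tree
    first, second = random.sample(tree, 2)
    if nand_lazy(first, queries) == 0:
        return 1                      # NAND(0, x) = 1: second subtree never read
    return 1 - nand_lazy(second, queries)

def random_tree(depth):
    if depth == 0:
        return random.randint(0, 1)
    return (random_tree(depth - 1), random_tree(depth - 1))

random.seed(1)
tree = random_tree(8)                 # balanced tree with N = 2^8 = 256 leaves
q = [0]
value = nand_lazy(tree, q)
print(value == nand_full(tree), q[0], "of 256 leaves read")
```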

Fast quantum algorithms for more complicated formulas are also known.[33]

Group commutativity

The problem is to determine whether a black-box group, given by k generators, is commutative. A black-box group is a group with an oracle function, which must be used to perform the group operations (multiplication, inversion, and comparison with identity). We are interested in the query complexity, which is the number of oracle calls needed to solve the problem. The deterministic and randomized query complexities are Θ(k²) and Θ(k), respectively.[34] A quantum algorithm requires Ω(k^(2/3)) queries, and the best known algorithm uses O(k^(2/3) log k) queries.[35]
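The deterministic Θ(k²) bound corresponds to the obvious strategy: a group is commutative exactly when every pair of generators commutes, so it suffices to compare gh with hg for all pairs. A sketch using permutations under composition as the black-box group (names and example are my own):

```python
def is_commutative(generators, multiply, equal):
    """Deterministic check: a group is commutative iff all pairs of its
    generators commute. Uses Theta(k^2) oracle multiplications."""
    calls = 0
    for i, g in enumerate(generators):
        for h in generators[i + 1:]:
            gh, hg = multiply(g, h), multiply(h, g)
            calls += 2
            if not equal(gh, hg):
                return False, calls
    return True, calls

# Black-box group: permutations of {0,1,2} under composition.
compose = lambda p, q: tuple(p[q[i]] for i in range(len(q)))
same = lambda a, b: a == b

gens = [(1, 0, 2), (0, 2, 1)]         # two transpositions generate S3 (non-abelian)
print(is_commutative(gens, compose, same))   # (False, 2)
```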

BQP-complete problems

Computing knot invariants

Witten showed that the Chern-Simons topological quantum field theory (TQFT) can be solved in terms of Jones polynomials. A quantum computer can simulate a TQFT and thereby approximate the Jones polynomial,[36] which, as far as is known, is hard to compute classically in the worst case.

Quantum simulation

The idea that quantum computers might be more powerful than classical computers originated in Richard Feynman's observation that classical computers seem to require exponential time to simulate many-particle quantum systems.[37] Since then, the idea that quantum computers can simulate quantum physical processes exponentially faster than classical computers has been greatly fleshed out and elaborated. Efficient (that is, polynomial-time) quantum algorithms have been developed for simulating both bosonic and fermionic systems,[38] and, in particular, the simulation of chemical reactions beyond the capabilities of current classical supercomputers is expected to require only a few hundred qubits.[39] Quantum computers can also efficiently simulate topological quantum field theories.[40] In addition to its intrinsic interest, this result has led to efficient quantum algorithms for estimating quantum topological invariants such as Jones[41] and HOMFLY[42] polynomials, and the Turaev-Viro invariant of three-dimensional manifolds.[43]
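A standard ingredient in such simulations is Trotterization: when H = H1 + H2 with non-commuting parts, exp(-iHt) is approximated by alternating short evolutions under H1 and H2 alone. The single-qubit toy example below (my own sketch, not from the cited works) shows the first-order Trotter error shrinking as the number of steps grows:

```python
import numpy as np

def expm_hermitian(H, t):
    """exp(-i*H*t) for a Hermitian matrix H, via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

X = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli X
Z = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli Z
H1, H2 = X, Z                                    # non-commuting pieces of H = X + Z
t = 1.0
exact = expm_hermitian(H1 + H2, t)

errs = []
for steps in (1, 10, 100):
    # first-order Trotter: (e^{-i H1 t/n} e^{-i H2 t/n})^n
    step = expm_hermitian(H1, t / steps) @ expm_hermitian(H2, t / steps)
    errs.append(np.linalg.norm(np.linalg.matrix_power(step, steps) - exact))
    print(steps, errs[-1])           # error falls roughly like 1/n
```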

Hybrid Quantum/Classical Algorithms

Hybrid quantum/classical algorithms combine quantum state preparation and measurement with classical optimization. These algorithms generally aim to determine the ground-state eigenvector and eigenvalue of a Hermitian operator.

QAOA

The quantum approximate optimization algorithm is a toy model of quantum annealing which can be used to solve problems in graph theory.[44] The algorithm makes use of classical optimization of quantum operations to maximize an objective function.
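As a concrete instance, the sketch below (my own toy example, simulated classically with a dense state vector) runs depth-1 QAOA for MaxCut on a triangle graph: prepare |+⟩^n, apply the diagonal cost phase exp(-iγC), apply the transverse-field mixer, and grid-search the angles γ and β classically to maximize the expected cut value:

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]      # MaxCut on a triangle (maximum cut = 2)
n = 3
N = 2 ** n

# Diagonal cost: C(z) = number of edges cut by bitstring z.
cost = np.array([sum((z >> i & 1) != (z >> j & 1) for i, j in edges)
                 for z in range(N)], dtype=float)

def rx_all(state, beta):
    """Apply the mixer exp(-i*beta*X) to every qubit of the statevector."""
    c, s = np.cos(beta), -1j * np.sin(beta)
    for q in range(n):
        state = state.reshape(-1, 2, 2 ** q)         # middle axis = qubit q
        state = np.stack([c * state[:, 0] + s * state[:, 1],
                          s * state[:, 0] + c * state[:, 1]], axis=1)
    return state.reshape(N)

def qaoa_expectation(gamma, beta):
    state = np.full(N, 1 / np.sqrt(N), dtype=complex)   # |+>^n
    state = np.exp(-1j * gamma * cost) * state          # cost unitary (diagonal)
    state = rx_all(state, beta)                          # mixer
    return float(np.real(np.abs(state) ** 2 @ cost))    # expected cut value

best = max(qaoa_expectation(g, b)
           for g in np.linspace(0, np.pi, 61)
           for b in np.linspace(0, np.pi, 61))
print(best)   # well above the random-assignment baseline of 1.5 cut edges
```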

Variational Quantum Eigensolver

The VQE algorithm applies classical optimization to minimize the energy expectation of an ansatz state in order to find the ground-state energy of a molecule.[45] It can also be extended to find excited-state energies of molecules.
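A minimal sketch of the idea (a toy 2x2 Hamiltonian and a one-parameter ansatz of my own choosing, with a grid scan standing in for the classical optimizer):

```python
import numpy as np

# Toy Hamiltonian (Hermitian) whose ground-state energy we want: H = Z + 0.5*X.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    """<psi(theta)| H |psi(theta)> for the ansatz |psi(theta)> = RY(theta)|0>."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi

# "Classical optimizer" step: here, a simple scan over the single parameter.
thetas = np.linspace(0, 2 * np.pi, 10001)
vqe_energy = min(energy(t) for t in thetas)

exact_ground = np.linalg.eigvalsh(H)[0]       # -sqrt(1.25), about -1.118
print(vqe_energy, exact_ground)
```

For this Hamiltonian the RY ansatz can represent the exact ground state, so the scan reaches the true minimum; for molecular Hamiltonians the ansatz quality and the optimizer both limit the result.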

Einstein's thought experiments

A hallmark of Albert Einstein's career was his use of visualized thought experiments (German: Gedankenexperiment) as a fundamental tool for understanding physical issues and for elucidating his concepts to others. Einstein's thought experiments took diverse forms. In his youth, he mentally chased beams of light. For special relativity, he employed moving trains and flashes of lightning to explain his most penetrating insights. For general relativity, he considered a person falling off a roof, accelerating elevators, blind beetles crawling on curved surfaces and the like. In his debates with Niels Bohr on the nature of reality, he proposed imaginary devices intended to show, at least in concept, how the Heisenberg uncertainty principle might be evaded. In a profound contribution to the literature on quantum mechanics, Einstein considered two particles briefly interacting and then flying apart so that their states are correlated, anticipating the phenomenon known as quantum entanglement.

Introduction

A thought experiment is a logical argument or mental model cast within the context of an imaginary (hypothetical or even counterfactual) scenario. A scientific thought experiment, in particular, may examine the implications of a theory, law, or set of principles with the aid of fictive and/or natural particulars (demons sorting molecules, cats whose lives hinge upon a radioactive disintegration, men in enclosed elevators) in an idealized environment (massless trapdoors, absence of friction). They describe experiments that, except for some specific and necessary idealizations, could conceivably be performed in the real world.[2]

As opposed to physical experiments, thought experiments do not report new empirical data. They can only provide conclusions based on deductive or inductive reasoning from their starting assumptions. Thought experiments invoke particulars that are irrelevant to the generality of their conclusions. It is the invocation of these particulars that give thought experiments their experiment-like appearance. A thought experiment can always be reconstructed as a straightforward argument, without the irrelevant particulars. John D. Norton, a well-known philosopher of science, has noted that "a good thought experiment is a good argument; a bad thought experiment is a bad argument."[3]

When effectively used, the irrelevant particulars that convert a straightforward argument into a thought experiment can act as "intuition pumps" that stimulate readers' ability to apply their intuitions to their understanding of a scenario.[4] Thought experiments have a long history. Perhaps the best known in the history of modern science is Galileo's demonstration that falling objects must fall at the same rate regardless of their masses. This has sometimes been taken to be an actual physical demonstration, involving his climbing up the Leaning Tower of Pisa and dropping two heavy weights off it. In fact, it was a logical demonstration described by Galileo in Discorsi e dimostrazioni matematiche (1638).[5]

Einstein had a highly visual understanding of physics. His work in the patent office "stimulated [him] to see the physical ramifications of theoretical concepts." These aspects of his thinking style inspired him to fill his papers with vivid practical detail making them quite different from, say, the papers of Lorentz or Maxwell. This included his use of thought experiments.[6]:26–27;121–127

Special relativity

Pursuing a beam of light

Late in life, Einstein recalled
...a paradox upon which I had already hit at the age of sixteen: If I pursue a beam of light with the velocity c (velocity of light in a vacuum), I should observe such a beam of light as an electromagnetic field at rest though spatially oscillating. There seems to be no such thing, however, neither on the basis of experience nor according to Maxwell's equations. From the very beginning it appeared to me intuitively clear that, judged from the standpoint of such an observer, everything would have to happen according to the same laws as for an observer who, relative to the earth, was at rest. For how should the first observer know or be able to determine, that he is in a state of fast uniform motion? One sees in this paradox the germ of the special relativity theory is already contained.[p 1]:52–53

Einstein's thought experiment as a 16-year-old student

Einstein's recollections of his youthful musings are widely cited because of the hints they provide of his later great discovery. However, Norton has noted that Einstein's reminiscences were probably colored by a half-century of hindsight. Norton lists several problems with Einstein's recounting, both historical and scientific:[7]
1. At 16 years old and a student at the Gymnasium in Aarau, Einstein would have had the thought experiment in late 1895 to early 1896. But various sources note that Einstein did not learn Maxwell's theory until 1898, in university.[7][8]

2. A 19th-century aether theorist would have had no difficulties with the thought experiment. Einstein's statement, "...there seems to be no such thing...on the basis of experience," would not have counted as an objection, but would have represented a mere statement of fact, since no one had ever traveled at such speeds.

3. An aether theorist would have regarded "...nor according to Maxwell's equations" as simply representing a misunderstanding on Einstein's part. Unfettered by any notion that the speed of light represents a cosmic limit, the aether theorist would simply have set velocity equal to c, noted that yes indeed, the light would appear to be frozen, and then thought no more of it.[7]
Rather than the thought experiment being at all incompatible with aether theories (which it is not), the youthful Einstein appears to have reacted to the scenario out of an intuitive sense of wrongness. He felt that the laws of optics should obey the principle of relativity. As he grew older, his early thought experiment acquired deeper levels of significance: Einstein felt that Maxwell's equations should be the same for all observers in inertial motion. From Maxwell's equations, one can deduce a single speed of light, and there is nothing in this computation that depends on an observer's speed. Einstein sensed a conflict between Newtonian mechanics and the constant speed of light determined by Maxwell's equations.[6]:114–115

Regardless of the historical and scientific issues described above, Einstein's early thought experiment was part of the repertoire of test cases that he used to check on the viability of physical theories. Norton suggests that the real importance of the thought experiment was that it provided a powerful objection to emission theories of light, which Einstein had worked on for several years prior to 1905.[7][8][9]

Magnet and conductor

In the very first paragraph of Einstein's seminal 1905 work introducing special relativity, he writes:
It is known that the application of Maxwell's electrodynamics, as ordinarily conceived at the present time, to moving bodies, leads to asymmetries which don't seem to be connected with the phenomena. Let us, for example, think of the mutual action between a magnet and a conductor. The observed phenomenon in this case depends only on the relative motion of the conductor and the magnet, while according to the usual conception, a strict distinction must be made between the cases where the one or the other of the bodies is in motion. If, for example, the magnet moves and the conductor is at rest, then an electric field of certain energy-value is produced in the neighbourhood of the magnet, which excites a current in those parts of the field where a conductor exists. But if the magnet be at rest and the conductor be set in motion, no electric field is produced in the neighbourhood of the magnet, but an electromotive force is produced in the conductor which corresponds to no energy per se; however, this causes – equality of the relative motion in both considered cases is assumed – an electric current of the same magnitude and the same course, as the electric force in the first case.[p 2]
Magnet and conductor thought experiment

This opening paragraph recounts well-known experimental results obtained by Michael Faraday in 1831. The experiments describe what appeared to be two different phenomena: the motional EMF generated when a wire moves through a magnetic field (see Lorentz force), and the transformer EMF generated by a changing magnetic field (due to the Maxwell–Faraday equation).[9][10][11]:135–157 James Clerk Maxwell himself drew attention to this fact in his 1861 paper On Physical Lines of Force. In the latter half of Part II of that paper, Maxwell gave a separate physical explanation for each of the two phenomena.[p 3]

Although Einstein calls the asymmetry "well-known", there is no evidence that any of Einstein's contemporaries considered the distinction between motional EMF and transformer EMF to be in any way odd or pointing to a lack of understanding of the underlying physics. Maxwell, for instance, had repeatedly discussed Faraday's laws of induction, stressing that the magnitude and direction of the induced current was a function only of the relative motion of the magnet and the conductor, without being bothered by the clear distinction between conductor-in-motion and magnet-in-motion in the underlying theoretical treatment.[11]:135–138

Yet Einstein's reflection on this experiment represented the decisive moment in his long and tortuous path to special relativity. Although the equations describing the two scenarios are entirely different, there is no measurement that can distinguish whether the magnet is moving, the conductor is moving, or both.[10]

In a 1920 review on the Fundamental Ideas and Methods of the Theory of Relativity (unpublished), Einstein related how disturbing he found this asymmetry:
The idea that these two cases should essentially be different was unbearable to me. According to my conviction, the difference between the two could only lie in the choice of the point of view, but not in a real difference.[p 4]:20
Einstein needed to extend the relativity of motion that he perceived between magnet and conductor in the above thought experiment to a full theory. For years, however, he did not know how this might be done. The exact path that Einstein took to resolve this issue is unknown. We do know, however, that Einstein spent several years pursuing an emission theory of light, encountering difficulties that eventually led him to give up the attempt.[10]
Gradually I despaired of the possibility of discovering the true laws by means of constructive efforts based on known facts. The longer and more desperately I tried, the more I came to the conviction that only the discovery of a universal formal principle could lead us to assured results.[p 1]:49
That decision ultimately led to his development of special relativity as a theory founded on two postulates of which he could be sure.[10] Expressed in contemporary physics vocabulary, his postulates were as follows:[note 1]
1. The laws of physics take the same form in all inertial frames.
2. In any given inertial frame, the velocity of light c is the same whether the light be emitted by a body at rest or by a body in uniform motion. [Emphasis added by editor][12]:140–141
Einstein's wording of the second postulate was one with which nearly all theorists of his day could agree. His wording is a far more intuitive form of the second postulate than the stronger version frequently encountered in popular writings and college textbooks.[13][note 2]

Trains, embankments, and lightning flashes

The topic of how Einstein arrived at special relativity has been a fascinating one to many scholars, and it is not hard to understand why: a lowly, twenty-six-year-old patent officer (third class), largely self-taught in physics and completely divorced from mainstream research, nevertheless in his miracle year of 1905 produced four extraordinary works, only one of which (his paper on Brownian motion) appeared related to anything that he had ever published before.[8]

Einstein's paper, On the Electrodynamics of Moving Bodies, is a polished work that bears few traces of its gestation. Documentary evidence concerning the development of the ideas that went into it consists of, quite literally, only two sentences in a handful of preserved early letters, and various later historical remarks by Einstein himself, some of them known only second-hand and at times contradictory.[8]
 
Train and embankment thought experiment

In regards to the relativity of simultaneity, Einstein's 1905 paper develops the concept vividly by carefully considering the basics of how time may be disseminated through the exchange of signals between clocks.[15] In his popular work, Relativity: The Special and General Theory, Einstein translates the formal presentation of his paper into a thought experiment using a train, a railway embankment, and lightning flashes. The essence of the thought experiment is as follows:
  • Observer M stands on an embankment, while observer M' rides on a rapidly traveling train. At the precise moment that M and M' coincide in their positions, lightning strikes points A and B equidistant from M and M'.
  • Light from these two flashes reaches M at the same time, from which M concludes that the bolts were synchronous.
  • The combination of Einstein's first and second postulates implies that, despite the rapid motion of the train relative to the embankment, M' measures exactly the same speed of light as does M. However, the train carries M' toward B and away from A, so M' receives the light from B before the light from A. Since M' was equidistant from A and B when the lightning struck, M' concludes that the bolts were not synchronous: the bolt at B struck first.[p 5]:29–31 [note 3]
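The train-frame conclusion follows directly from the Lorentz transformation t′ = γ(t − vx/c²): two strikes simultaneous on the embankment (t = 0) at x = ±L acquire opposite-sign times in the train frame. A small numerical check (units and the particular speed are my own choices):

```python
import math

c = 1.0                    # work in units where the speed of light is 1
v = 0.6 * c                # train speed relative to the embankment
L = 1.0                    # embankment-frame distance from M to each strike
gamma = 1 / math.sqrt(1 - v**2 / c**2)

def to_train_frame(t, x):
    """Lorentz transformation into the frame of M' (moving at +v, toward B)."""
    return gamma * (t - v * x / c**2), gamma * (x - v * t)

# Embankment frame: both strikes at t = 0, at x = -L (A) and x = +L (B).
tA, _ = to_train_frame(0.0, -L)
tB, _ = to_train_frame(0.0, +L)
print(tA, tB)              # tB < 0 < tA: in the train frame, B struck first
```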
A routine supposition among historians of science is that, in accordance with the analysis given in his 1905 special relativity paper and in his popular writings, Einstein discovered the relativity of simultaneity by thinking about how clocks could be synchronized by light signals.[15] The Einstein synchronization convention was originally developed by telegraphers in the middle 19th century. The dissemination of precise time was an increasingly important topic during this period. Trains needed accurate time to schedule use of track, cartographers needed accurate time to determine longitude, while astronomers and surveyors dared to consider the worldwide dissemination of time to accuracies of thousandths of a second.[16]:132–144;183–187 Following this line of argument, Einstein's position in the patent office, where he specialized in evaluating electromagnetic and electromechanical patents, would have exposed him to the latest developments in time technology, which would have guided him in his thoughts towards understanding the relativity of simultaneity.[16]:243–263

However, all of the above is supposition. In later recollections, when Einstein was asked about what inspired him to develop special relativity, he would mention his riding a light beam and his magnet and conductor thought experiments. He would also mention the importance of the Fizeau experiment and the observation of stellar aberration. "They were enough", he said.[17] He never mentioned thought experiments about clocks and their synchronization.[15]

The routine analyses of the Fizeau experiment and of stellar aberration, that treat light as Newtonian corpuscles, do not require relativity. But problems arise if one considers light as waves traveling through an aether, which are resolved by applying the relativity of simultaneity. It is entirely possible, therefore, that Einstein arrived at special relativity through a different path than that commonly assumed, through Einstein's examination of Fizeau's experiment and stellar aberration.[15]

We therefore do not know just how important clock synchronization and the train and embankment thought experiment were to Einstein's development of the concept of the relativity of simultaneity. We do know, however, that the train and embankment thought experiment was the preferred means whereby he chose to teach this concept to the general public.[p 5]:29–31

General relativity

Falling painters and accelerating elevators

In his unpublished 1920 review, Einstein related the genesis of his thoughts on the equivalence principle:
When I was busy (in 1907) writing a summary of my work on the theory of special relativity for the Jahrbuch für Radioaktivität und Elektronik [Yearbook for Radioactivity and Electronics], I also had to try to modify the Newtonian theory of gravitation such as to fit its laws into the theory. While attempts in this direction showed the practicability of this enterprise, they did not satisfy me because they would have had to be based upon unfounded physical hypotheses. At that moment I got the happiest thought of my life in the following form: In an example worth considering, the gravitational field has a relative existence only in a manner similar to the electric field generated by magneto-electric induction. Because for an observer in free-fall from the roof of a house there is during the fall—at least in his immediate vicinity—no gravitational field. Namely, if the observer lets go of any bodies, they remain relative to him, in a state of rest or uniform motion, independent of their special chemical or physical nature. The observer, therefore, is justified in interpreting his state as being "at rest."[p 4]:20–21
The realization "startled" Einstein, and inspired him to begin an eight-year quest that led to what is considered to be his greatest work, the theory of general relativity. Over the years, the story of the falling man has become an iconic one, much embellished by other writers. In most retellings of Einstein's story, the falling man is identified as a painter. In some accounts, Einstein was inspired after he witnessed a painter falling from the roof of a building adjacent to the patent office where he worked. This version of the story leaves unanswered the question of why Einstein might consider his observation of such an unfortunate accident to represent the happiest thought in his life.[6]:145
 
A thought experiment used by Einstein to illustrate the equivalence principle

Einstein later refined his thought experiment to consider a man inside a large enclosed chest or elevator falling freely in space. While in free fall, the man would consider himself weightless, and any loose objects that he emptied from his pockets would float alongside him. Then Einstein imagined a rope attached to the roof of the chamber. A powerful "being" of some sort begins pulling on the rope with constant force. The chamber begins to move "upwards" with a uniformly accelerated motion. Within the chamber, all of the man's perceptions are consistent with his being in a uniform gravitational field. Einstein asked, "Ought we to smile at the man and say that he errs in his conclusion?" Einstein answered no. Rather, the thought experiment provided "good grounds for extending the principle of relativity to include bodies of reference which are accelerated with respect to each other, and as a result we have gained a powerful argument for a generalised postulate of relativity."[p 5]:75–79 [6]:145–147

Through this thought experiment, Einstein addressed an issue that was so well-known, scientists rarely worried about it or considered it puzzling: Objects have "gravitational mass," which determines the force with which they are attracted to other objects. Objects also have "inertial mass," which determines the relationship between the force applied to an object and how much it accelerates. Newton had pointed out that, even though they are defined differently, gravitational mass and inertial mass always seem to be equal. But until Einstein, no one had conceived a good explanation as to why this should be so. From the correspondence revealed by his thought experiment, Einstein concluded that "it is impossible to discover by experiment whether a given system of coordinates is accelerated, or whether...the observed effects are due to a gravitational field." This correspondence between gravitational mass and inertial mass is the equivalence principle.[6]:147

An extension to his accelerating observer thought experiment allowed Einstein to deduce that "rays of light are propagated curvilinearly in gravitational fields."[p 5]:83–84 [6]:190

Quantum mechanics

Background: Einstein and the quantum

Many myths have grown up about Einstein's relationship with quantum mechanics. Freshman physics students are aware that Einstein explained the photoelectric effect and introduced the concept of the photon. But students who have grown up with the photon may not be aware of how revolutionary the concept was for his time. The best-known factoids about Einstein's relationship with quantum mechanics are his statement, "God does not play dice" and the indisputable fact that he just didn't like the theory in its final form. This has led to the general impression that, despite his initial contributions, Einstein was out of touch with quantum research and played at best a secondary role in its development.[18]:1–4 Concerning Einstein's estrangement from the general direction of physics research after 1925, his well-known scientific biographer, Abraham Pais, wrote:
Einstein is the only scientist to be justly held equal to Newton. That comparison is based exclusively on what he did before 1925. In the remaining 30 years of his life he remained active in research but his fame would be undiminished, if not enhanced, had he gone fishing instead.[19]:43
In hindsight, we know that Pais was incorrect in his assessment.

Einstein was arguably the greatest single contributor to the "old" quantum theory.[18][note 4]
  • In his 1905 paper on light quanta,[p 6] Einstein created the quantum theory of light. His proposal that light exists as tiny packets (photons) was so revolutionary, that even such major pioneers of quantum theory as Planck and Bohr refused to believe that it could be true.[18]:70–79;282–284 [note 5] Bohr, in particular, was a passionate disbeliever in light quanta, and repeatedly argued against them until 1925, when he yielded in the face of overwhelming evidence for their existence.[21]
  • In his 1906 theory of specific heats, Einstein was the first to realize that quantized energy levels explained the specific heat of solids.[p 7] In this manner, he found a rational justification for the third law of thermodynamics (i.e. the entropy of any system approaches zero as the temperature approaches absolute zero[note 6]): at very cold temperatures, atoms in a solid don't have enough thermal energy to reach even the first excited quantum level, and so cannot vibrate.[18]:141–148 [note 7]
  • Einstein proposed the wave-particle duality of light. In 1909, using a rigorous fluctuation argument based on a thought experiment and drawing on his previous work on Brownian motion, he predicted the emergence of a "fusion theory" that would combine the two views.[18]:136–140 [p 8] [p 9] Basically, he demonstrated that the Brownian motion experienced by a mirror in thermal equilibrium with black body radiation would be the sum of two terms, one due to the wave properties of radiation, the other due to its particulate properties.[3]
  • Although Planck is justly hailed as the father of quantum mechanics, his derivation of the law of black-body radiation rested on fragile ground, since it required ad hoc assumptions of an unreasonable character.[note 8] Furthermore, Planck's derivation represented an analysis of classical harmonic oscillators merged with quantum assumptions in an improvised fashion.[18]:184 In his 1916 theory of radiation, Einstein was the first to create a purely quantum explanation.[p 10] This paper, well-known for broaching the possibility of stimulated emission (the basis of the laser), changed the nature of the evolving quantum theory by introducing the fundamental role of random chance.[18]:181–192
  • In 1924, Einstein received a short manuscript by an unknown Indian professor, Satyendra Nath Bose, outlining a new method of deriving the law of blackbody radiation.[note 9] Einstein was intrigued by Bose's peculiar method of counting the number of distinct ways of putting photons into the available states, a method of counting that Bose apparently did not realize was unusual.[note 10] Einstein, however, understood that Bose's counting method implied that photons are, in a deep sense, indistinguishable. He translated the paper into German and had it published. Einstein then followed Bose's paper with an extension to Bose's work which predicted Bose-Einstein condensation, one of the fundamental research topics of condensed matter physics.[18]:215–240
  • While trying to develop a mathematical theory of light which would fully encompass its wavelike and particle-like aspects, Einstein developed the concept of "ghost fields". A guiding wave obeying Maxwell's classical laws would propagate following the normal laws of optics, but would not transmit any energy. This guiding wave, however, would govern the appearance of quanta of energy h\nu on a statistical basis, so that the appearance of these quanta would be proportional to the intensity of the interfering radiation. These ideas became widely known in the physics community, and through Born's work in 1926, later became a key concept in the modern quantum theory of radiation and matter.[18]:193–203 [note 11]
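The 1909 fluctuation argument mentioned in the first point above is conventionally summarized by Einstein's formula for the mean-square energy fluctuation of black-body radiation in a volume v and frequency interval d\nu (a standard textbook form, not quoted in this article; \rho is the spectral energy density):

```latex
\left\langle \epsilon^{2} \right\rangle
  = \left( h\nu\,\rho + \frac{c^{3}}{8\pi\nu^{2}}\,\rho^{2} \right) v\,d\nu
```

The first term is what Wien's law alone would give and behaves like shot noise from independent particles; the second is what the Rayleigh–Jeans law alone would give and arises from wave interference, which is why Einstein read the sum as demanding a "fusion" of the two pictures.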
Therefore, Einstein before 1925 originated most of the key concepts of quantum theory: light quanta, wave-particle duality, the fundamental randomness of physical processes, the concept of indistinguishability, and the probability density interpretation of the wave equation. In addition, Einstein can arguably be considered the father of solid state physics and condensed matter physics.[24] He provided a correct derivation of the blackbody radiation law and sparked the notion of the laser.

What of the years after 1925? In 1935, working with two younger colleagues, Einstein issued a final challenge to quantum mechanics, attempting to show that it could not represent a final solution.[p 12] Despite the questions raised by this paper, it made little or no difference to how physicists employed quantum mechanics in their work. Of this paper, Pais was to write:
The only part of this article that will ultimately survive, I believe, is this last phrase [i.e. "No reasonable definition of reality could be expected to permit this" where "this" refers to the instantaneous transmission of information over a distance], which so poignantly summarizes Einstein's views on quantum mechanics in his later years.... This conclusion has not affected subsequent developments in physics, and it is doubtful that it ever will.[12]:454–457
In contrast to Pais' negative assessment, this paper, outlining the EPR paradox, is currently among the top ten papers published in Physical Review, and is the centerpiece of the development of quantum information theory,[25] which has been termed the "third quantum revolution."[26] [note 12]

Einstein's light box

Einstein did not like the direction in which quantum mechanics had turned after 1925. Although excited by Heisenberg's matrix mechanics, Schroedinger's wave mechanics, and Born's clarification of the meaning of the Schroedinger wave equation (i.e. that the absolute square of the wave function is to be interpreted as a probability density), his instincts told him that something was missing.[6]:326–335 In a letter to Born, he wrote:
Quantum mechanics is very impressive. But an inner voice tells me that it is not yet the real thing. The theory produces a good deal but hardly brings us closer to the secret of the Old One.[12]:440–443
The Solvay Debates between Bohr and Einstein began in dining-room discussions at the Fifth Solvay International Conference on Electrons and Photons in 1927. Einstein's issue with the new quantum mechanics was not just that, with the probability interpretation, it rendered invalid the notion of rigorous causality. After all, as noted above, Einstein himself had introduced random processes in his 1916 theory of radiation. Rather, by defining and delimiting the maximum amount of information obtainable in a given experimental arrangement, the Heisenberg uncertainty principle denied the existence of any knowable reality in terms of a complete specification of the momenta and positions of individual particles, an objective reality that would exist whether or not we could ever observe it.[6]:325–326 [12]:443–446

Over dinner, during after-dinner discussions, and at breakfast, Einstein debated with Bohr and his followers on the question whether quantum mechanics in its present form could be called complete. Einstein illustrated his points with increasingly clever thought experiments intended to prove that position and momentum could in principle be simultaneously known to arbitrary precision. For example, one of his thought experiments involved sending a beam of electrons through a shuttered screen, recording the positions of the electrons as they struck a photographic screen. Bohr and his allies would always be able to counter Einstein's proposal, usually by the end of the same day.[6]:344–347

On the final day of the conference, Einstein revealed that the uncertainty principle was not the only aspect of the new quantum mechanics that bothered him. Quantum mechanics, at least in the Copenhagen interpretation, appeared to allow action at a distance, the ability for two separated objects to communicate at speeds greater than light. By 1928, the consensus was that Einstein had lost the debate, and even his closest allies during the Fifth Solvay Conference, for example Louis de Broglie, conceded that quantum mechanics appeared to be complete.[6]:346–347

At the Sixth Solvay International Conference on Magnetism (1930), Einstein came armed with a new thought experiment. This involved a box with a shutter that operated so quickly that it would allow only one photon to escape at a time. The box would first be weighed exactly. Then, at a precise moment, the shutter would open, allowing a photon to escape. The box would then be re-weighed. The well-known relationship between mass and energy E=mc^{2} would allow the energy of the particle to be precisely determined. With this gadget, Einstein believed that he had demonstrated a means to obtain, simultaneously, a precise determination of the energy of the photon as well as its exact time of departure from the system.[6]:346–347 [12]:446–448

Bohr was shaken by this thought experiment. Unable to think of a refutation, he went from one conference participant to another, trying to convince them that Einstein's thought experiment couldn't be true, that if it were true, it would literally mean the end of physics. After a sleepless night, he finally worked out a response which, ironically, depended on Einstein's general relativity.[6]:348–349 Consider the illustration of Einstein's light box:[12]:446–448
1. After emitting a photon, the loss of weight causes the box to rise in the gravitational field.
2. The observer returns the box to its original height by adding weights until the pointer points to its initial position. It takes a certain amount of time t for the observer to perform this procedure. How long it takes depends on the strength of the spring and on how well-damped the system is. If undamped, the box will bounce up and down forever. If over-damped, the box will return to its original position sluggishly (See Damped spring-mass system).[note 13]
3. The longer that the observer allows the damped spring-mass system to settle, the closer the pointer will reach its equilibrium position. At some point, the observer will conclude that his setting of the pointer to its initial position is within an allowable tolerance. There will be some residual error \Delta q in returning the pointer to its initial position. Correspondingly, there will be some residual error \Delta m in the weight measurement.
4. Adding the weights imparts a momentum p to the box which can be measured with an accuracy \Delta p delimited by \Delta p\,\Delta q \approx h. It is clear that \Delta p < gt\,\Delta m, where g is the gravitational acceleration. Plugging in yields gt\,\Delta m\,\Delta q > h.
5. General relativity informs us that while the box has been at a height different from its original height, the clock it contains has been ticking at a rate different from its original rate. The red shift formula informs us that there will be an uncertainty \Delta t = c^{-2}gt\,\Delta q in the determination of t_{0}, the emission time of the photon.
6. Hence, c^{2}\,\Delta m\,\Delta t = \Delta E\,\Delta t > h. The accuracy with which the energy of the photon is measured restricts the precision with which its moment of emission can be measured, following the Heisenberg uncertainty principle.
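Chaining steps 4 through 6 gives Bohr's refutation in compact form (same symbols as above):

```latex
\Delta p\,\Delta q \approx h
  \quad\text{and}\quad
\Delta p < g t\,\Delta m
  \;\Longrightarrow\;
g t\,\Delta m\,\Delta q > h;
\qquad
\Delta t = \frac{g t\,\Delta q}{c^{2}}
  \;\Longrightarrow\;
c^{2}\,\Delta m\,\Delta t = \Delta E\,\Delta t > h.
```

The energy uncertainty \Delta E = c^{2}\Delta m follows from mass–energy equivalence, so the gravitational time dilation of the clock converts the residual weighing error directly into an emission-time uncertainty obeying the energy–time uncertainty relation.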
After this last attempt at finding a loophole around the uncertainty principle was refuted, Einstein quit searching for inconsistencies in quantum mechanics. Instead, he turned to the other aspects of quantum mechanics with which he was uncomfortable, concentrating on his critique of action at a distance. His next paper on quantum mechanics foreshadowed his later paper on the EPR paradox.[12]:448

Einstein was gracious in his defeat. The following September, Einstein nominated Heisenberg and Schroedinger for the Nobel Prize, stating, "I am convinced that this theory undoubtedly contains a part of the ultimate truth."[12]:448

EPR Paradox

Both Bohr and Einstein were subtle men. Einstein tried very hard to show that quantum mechanics was inconsistent; Bohr, however, was always able to counter his arguments. But in his final attack Einstein pointed to something so deep, so counterintuitive, so troubling, and yet so exciting, that at the beginning of the twenty-first century it has returned to fascinate theoretical physicists. Bohr’s only answer to Einstein’s last great discovery—the discovery of entanglement—was to ignore it.
Einstein's fundamental dispute with quantum mechanics wasn't about whether God rolled dice, whether the uncertainty principle allowed simultaneous measurement of position and momentum, or even whether quantum mechanics was complete. It was about reality. Does a physical reality exist independent of our ability to observe it? To Bohr and his followers, such questions were meaningless. All that we can know are the results of measurements and observations. It makes no sense to speculate about an ultimate reality that exists beyond our perceptions.[6]:460–461

Einstein's beliefs had evolved over the years from those that he had held when he was young, when, as a logical positivist heavily influenced by his reading of David Hume and Ernst Mach, he had rejected such unobservable concepts as absolute time and space. Einstein believed:[6]:460–461
1. A reality exists independent of our ability to observe it.
2. Objects are located at distinct points in spacetime and have their own independent, real existence. In other words, he believed in separability and locality.
3. Although at a superficial level, quantum events may appear random, at some ultimate level, strict causality underlies all processes in nature.
EPR paradox thought experiment. (top) The total wave function of a particle pair spreads from the collision point. (bottom) Observation of one particle collapses the wave function.

Einstein considered that realism and localism were fundamental underpinnings of physics. After leaving Nazi Germany and settling in Princeton at the Institute for Advanced Study, Einstein began writing up a thought experiment that he had been mulling over since attending a lecture by Léon Rosenfeld in 1933. Since the paper was to be in English, Einstein enlisted the help of the 46-year-old Boris Podolsky, a fellow who had moved to the Institute from Caltech; he also enlisted the help of the 26-year-old Nathan Rosen, also at the Institute, who did much of the math.[note 14] The result of their collaboration was the four-page EPR paper, which in its title asked the question Can Quantum-Mechanical Description of Physical Reality be Considered Complete?[6]:448–450 [p 12]

After seeing the paper in print, Einstein found himself unhappy with the result. His clear conceptual visualization had been buried under layers of mathematical formalism.[6]:448–450

Einstein's thought experiment involved two particles that have collided or which have been created in such a way that they have properties which are correlated. The total wave function for the pair links the positions of the particles as well as their linear momenta.[6]:450–453 [25] The figure depicts the spreading of the wave function from the collision point. However, observation of the position of the first particle allows us to determine precisely the position of the second particle no matter how far the pair have separated. Likewise, measuring the momentum of the first particle allows us to determine precisely the momentum of the second particle. "In accordance with our criterion for reality, in the first case we must consider the quantity P as being an element of reality, in the second case the quantity Q is an element of reality."[p 12]
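For concreteness, the correlated pair state written down in the 1935 paper (with x_{0} a constant offset) is the unnormalizable wave function

```latex
\Psi(x_{1}, x_{2})
  = \int_{-\infty}^{\infty} e^{(2\pi i/h)\,(x_{1} - x_{2} + x_{0})\,p}\; dp ,
```

which is a simultaneous eigenstate of the relative position x_{1} - x_{2} (with value -x_{0}) and of the total momentum p_{1} + p_{2} (with value 0). Measuring either the position or the momentum of particle 1 therefore fixes the corresponding quantity for particle 2 exactly, which is what licenses the "element of reality" argument quoted above.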

Einstein concluded that the second particle, which we have never directly observed, must have at any moment a position that is real and a momentum that is real. Quantum mechanics does not account for these features of reality. Therefore, quantum mechanics is not complete.[6]:451 It is known, from the uncertainty principle, that position and momentum cannot be measured at the same time. But even though their values can only be determined in distinct contexts of measurement, can they both be definite at the same time? Einstein concluded that the answer must be yes.[25]

The only alternative, claimed Einstein, would be to assert that measuring the first particle instantaneously affected the reality of the position and momentum of the second particle.[6]:451 "No reasonable definition of reality could be expected to permit this."[p 12]

Bohr was stunned when he read Einstein's paper and spent more than six weeks framing his response, which he gave exactly the same title as the EPR paper.[p 16] The EPR paper forced Bohr to make a major revision in his understanding of complementarity in the Copenhagen interpretation of quantum mechanics.[25]

Prior to EPR, Bohr had maintained that disturbance caused by the act of observation was the physical explanation for quantum uncertainty. In the EPR thought experiment, however, Bohr had to admit that "there is no question of a mechanical disturbance of the system under investigation." On the other hand, he noted that the two particles were one system described by one quantum function. Furthermore, the EPR paper did nothing to dispel the uncertainty principle.[12]:454–457 [note 15]

Later commentators have questioned the strength and coherence of Bohr's response. As a practical matter, however, physicists for the most part did not pay much attention to the debate between Bohr and Einstein, since the opposing views did not affect one's ability to apply quantum mechanics to practical problems, but only affected one's interpretation of the quantum formalism. If they thought about the problem at all, most working physicists tended to follow Bohr's leadership.[25][30][31]

So stood the situation for nearly 30 years. Then, in 1964, John Stewart Bell made the groundbreaking discovery that Einstein's local realist world view made experimentally verifiable predictions that would be in conflict with those of quantum mechanics. Bell's discovery shifted the Einstein–Bohr debate from philosophy to the realm of experimental physics. Bell's theorem showed that, for any local realist formalism, there exist limits on the predicted correlations between pairs of particles in an experimental realization of the EPR thought experiment. In 1972, the first experimental tests were carried out. Successive experiments improved the accuracy of observation and closed loopholes. Taken together, these results make it virtually certain that local realist theories have been falsified.[32]
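The character of Bell's limit can be sketched numerically. The snippet below is an illustration using standard textbook angles, not a calculation from this article: it evaluates the CHSH combination of singlet-state correlations E(a, b) = −cos(a − b), which quantum mechanics predicts can reach 2√2 even though any local realist model is bounded by |S| ≤ 2:

```python
import math

def corr(a, b):
    # Quantum-mechanical prediction for the spin correlation of a
    # singlet pair measured along analyzer angles a and b (radians).
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    # CHSH combination of four correlations; any local realist
    # theory requires |S| <= 2.
    return corr(a, b) - corr(a, b2) + corr(a2, b) + corr(a2, b2)

# Analyzer angles that maximize the quantum violation.
s = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(s))  # 2*sqrt(2) ≈ 2.828, beyond the classical bound of 2
```

Experiments of the Aspect type measure the four correlations on real photon pairs; values of |S| significantly above 2 are what rules out the local realist alternatives.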

So Einstein was wrong. But it has several times been the case that Einstein's "mistakes" have foreshadowed and provoked major shifts in scientific research. Such, for instance, has been the case with his proposal of the cosmological constant, which Einstein considered his greatest blunder, but which currently is being actively investigated for its possible role in the accelerating expansion of the universe. In his Princeton years, Einstein was virtually shunned as he pursued the unified field theory. Nowadays, innumerable physicists pursue Einstein's dream for a "theory of everything."[33]

The EPR paper did not prove quantum mechanics to be incorrect. What it did prove was that quantum mechanics, with its "spooky action at a distance," is completely incompatible with commonsense understanding.[34] Furthermore, the effect predicted by the EPR paper, quantum entanglement, has inspired approaches to quantum mechanics different from the Copenhagen interpretation, and has been at the forefront of major technological advances in quantum computing, quantum encryption, and quantum information theory.
