
Wednesday, April 10, 2019

Holographic data storage

From Wikipedia, the free encyclopedia

Holographic data storage is a potential technology in the area of high-capacity data storage currently dominated by magnetic data storage and conventional optical data storage. Magnetic and optical data storage devices rely on individual bits being stored as distinct magnetic or optical changes on the surface of the recording medium. Holographic data storage records information throughout the volume of the medium and is capable of recording multiple images in the same area utilizing light at different angles.

Additionally, whereas magnetic and optical data storage records information a bit at a time in a linear fashion, holographic storage is capable of recording and reading millions of bits in parallel, enabling data transfer rates greater than those attained by traditional optical storage.

Recording data

Holographic data storage records information as an optical interference pattern within a thick, photosensitive optical material. Light from a single laser beam is divided into two (or more) separate optical patterns of dark and light pixels. By adjusting the reference beam angle, wavelength, or media position, a multitude of holograms (theoretically, several thousand) can be stored in a single volume.

Reading data

The stored data is read through the reproduction of the same reference beam used to create the hologram. The reference beam's light is focused on the photosensitive material, illuminating the appropriate interference pattern; the light diffracts off the interference pattern and projects it onto a detector. The detector is capable of reading the data in parallel, over one million bits at once, resulting in a fast data transfer rate. Files on a holographic drive can be accessed in less than 0.2 seconds.

Longevity

Holographic data storage can provide companies with a method to preserve and archive information. The write-once, read-many (WORM) approach to data storage would ensure content security, preventing the information from being overwritten or modified. Manufacturers believe this technology can provide safe storage for content without degradation for more than 50 years, far exceeding current data storage options. A counterpoint to this claim is that data-reader technology has, over the last couple of decades, changed roughly every ten years. If this trend continues, the ability to store data for 50–100 years on one format matters little, because the data would be migrated to a new format after only ten years. However, claimed longevity of storage has, in the past, proven to be a key indicator of shorter-term reliability of storage media. Current optical formats, such as CD, have largely lived up to the original longevity claims (where reputable media makers are used) and have proved to be more reliable shorter-term data carriers than the floppy disk and DAT media they displaced.

Terms used

Sensitivity refers to the extent of refractive index modulation produced per unit of exposure. Diffraction efficiency is proportional to the square of the product of the index modulation and the effective thickness.
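That proportionality falls out of the small-modulation limit of Kogelnik's coupled-wave result for a lossless thick transmission grating, which is only one common model; the wavelength, thickness, angle, and index modulations in the sketch below are illustrative assumptions rather than values from this article.

```python
import numpy as np

# Kogelnik's coupled-wave estimate for a lossless thick transmission hologram:
#   eta = sin^2( pi * dn * d / (lambda * cos(theta)) )
# For small arguments eta ~ (pi * dn * d / (lambda * cos(theta)))^2, i.e.
# proportional to the square of (index modulation x effective thickness).

wavelength = 532e-9          # writing/reading wavelength in metres (illustrative)
thickness  = 1.0e-3          # effective grating thickness in metres (illustrative)
theta      = np.deg2rad(15)  # internal incidence angle (illustrative)

for dn in (1e-6, 2e-6, 4e-6):  # small refractive-index modulations
    nu  = np.pi * dn * thickness / (wavelength * np.cos(theta))
    eta = np.sin(nu) ** 2
    print(f"dn = {dn:.0e}  ->  diffraction efficiency ~ {eta:.2e}")
# Doubling dn (or the thickness) roughly quadruples eta while nu stays small.
```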

The dynamic range determines how many holograms may be multiplexed in a single data volume.

Spatial light modulators (SLM) are pixelated input devices (liquid crystal panels), used to imprint the data to be stored on the object beam.

Technical aspects

Like other media, holographic media is divided into write-once media (where the storage medium undergoes some irreversible change) and rewritable media (where the change is reversible). Rewritable holographic storage can be achieved via the photorefractive effect in crystals:

Diagram: making a hologram
  • Mutually coherent light from two sources creates an interference pattern in the media. These two sources are called the reference beam and the signal beam.
  • Where there is constructive interference the light is bright and electrons can be promoted from the valence band to the conduction band of the material (since the light has given the electrons energy to jump the energy gap). The positively charged vacancies they leave are called holes and they must be immobile in rewritable holographic materials. Where there is destructive interference, there is less light and few electrons are promoted.
  • Electrons in the conduction band are free to move in the material. They will experience two opposing forces that determine how they move. The first force is the coulomb force between the electrons and the positive holes that they have been promoted from. This force encourages the electrons to stay put or move back to where they came from. The second is the pseudo-force of diffusion that encourages them to move to areas where electrons are less dense. If the coulomb forces are not too strong, the electrons will move into the dark areas.
  • Beginning immediately after being promoted, there is a chance that a given electron will recombine with a hole and move back into the valence band. The faster the rate of recombination, the fewer the number of electrons that will have the chance to move into the dark areas. This rate will affect the strength of the hologram.
  • After some electrons have moved into the dark areas and recombined with holes there, there is a permanent space charge field between the electrons that moved to the dark spots and the holes in the bright spots. This leads to a change in the index of refraction due to the electro-optic effect.
Diagram: reading a hologram

When the information is to be retrieved or read out from the hologram, only the reference beam is necessary. The beam is sent into the material in exactly the same way as when the hologram was written. As a result of the index changes in the material that were created during writing, the beam splits into two parts. One of these parts recreates the signal beam where the information is stored. Something like a CCD camera can be used to convert this information into a more usable form.

Holograms can theoretically store one bit per cubic block with sides the length of the wavelength of the light used for writing. For example, light from a helium–neon laser is red, with a wavelength of 632.8 nm. Using light of this wavelength, perfect holographic storage could store about 500 megabytes per cubic millimeter. At the extreme end of the laser spectrum, a fluorine excimer laser at 157 nm could store around 30 gigabytes per cubic millimeter. In practice, the data density would be much lower, for at least four reasons:
  1. The need to add error-correction
  2. The need to accommodate imperfections or limitations in the optical system
  3. Economic payoff (higher densities may cost disproportionately more to achieve)
  4. Design technique limitations: a problem currently faced in magnetic hard drives, wherein magnetic domain configuration prevents manufacture of disks that fully utilize the theoretical limits of the technology.
Despite those limitations, it is possible to optimize the storage capacity using all-optical signal processing techniques.
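A back-of-the-envelope check of the densities quoted above, assuming only the idealized one-bit-per-wavelength-cube bound (no coding overhead or optical limitations):

```python
# Idealized bound: one bit per cube whose edge equals the writing wavelength.
def ideal_density_bytes_per_mm3(wavelength_nm: float) -> float:
    cubes_per_mm3 = (1e6 / wavelength_nm) ** 3   # 1 mm = 1e6 nm
    return cubes_per_mm3 / 8                     # 8 bits per byte

for name, wavelength_nm in [("He-Ne (red)", 632.8), ("F2 excimer (UV)", 157.0)]:
    megabytes = ideal_density_bytes_per_mm3(wavelength_nm) / 1e6
    print(f"{name:15s} {wavelength_nm:6.1f} nm -> about {megabytes:,.0f} MB per mm^3")
# Prints roughly 493 MB/mm^3 for 632.8 nm and about 32,000 MB (~30 GB) for 157 nm,
# matching the figures quoted in the text.
```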

Unlike current storage technologies that record and read one data bit at a time, holographic memory writes and reads data in parallel in a single flash of light.

Two-color recording

Set up for holographic recording
 
For two-color holographic recording, the reference and signal beams are fixed to a particular wavelength (green, red, or IR), while the sensitizing/gating beam has a separate, shorter wavelength (blue or UV). The sensitizing/gating beam is used to sensitize the material before and during the recording process, while the information is recorded in the crystal via the reference and signal beams. It is shone intermittently on the crystal during the recording process for measuring the diffracted beam intensity. Readout is achieved by illumination with the reference beam alone. Hence the readout beam, with its longer wavelength, is unable to excite the recombined electrons from the deep trap centers during readout, as they need the shorter-wavelength sensitizing light to erase them.
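The wavelength argument can be made concrete with photon energies, E = hc/λ, or roughly 1239.84 eV·nm divided by the wavelength in nanometres; the specific wavelengths below are illustrative assumptions, not values given in the text:

```python
# Photon energy in electron-volts: E [eV] ~ 1239.84 / wavelength [nm]
def photon_energy_ev(wavelength_nm: float) -> float:
    return 1239.84 / wavelength_nm

beams = {
    "sensitizing/gating beam (blue, illustrative)": 405.0,
    "recording/readout beam (near-IR, illustrative)": 780.0,
}
for label, wavelength_nm in beams.items():
    print(f"{label:48s} {wavelength_nm:6.1f} nm -> {photon_energy_ev(wavelength_nm):.2f} eV")
# The shorter-wavelength gating photons carry roughly twice the energy of the
# readout photons, which is why readout alone does not empty the deep traps.
```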

Usually, for two-color holographic recording, two different dopants are required to promote trap centers, which belong to transition metal and rare earth elements and are sensitive to certain wavelengths. By using two dopants, more trap centers would be created in the lithium niobate crystal. Namely a shallow and a deep trap would be created. The concept now is to use the sensitizing light to excite electrons from the deep trap farther from the valence band to the conduction band and then to recombine at the shallow traps nearer to the conduction band. The reference and signal beam would then be used to excite the electrons from the shallow traps back to the deep traps. The information would hence be stored in the deep traps. Reading would be done with the reference beam since the electrons can no longer be excited out of the deep traps by the long wavelength beam.

Effect of annealing

For a doubly doped lithium niobate (LiNbO3) crystal there exists an optimum oxidation/reduction state for the desired performance. This optimum depends on the doping levels of shallow and deep traps as well as the annealing conditions for the crystal samples. This optimum state generally occurs when 95–98% of the deep traps are filled. In a strongly oxidized sample, holograms cannot be easily recorded and the diffraction efficiency is very low, because the shallow trap is completely empty and the deep trap is also almost devoid of electrons. In a highly reduced sample, on the other hand, the deep traps are completely filled and the shallow traps are also partially filled. This results in very good sensitivity (fast recording) and high diffraction efficiency, due to the availability of electrons in the shallow traps. However, during readout, all the deep traps get filled quickly and the resulting holograms reside in the shallow traps, where they are totally erased by further readout. Hence, after extensive readout, the diffraction efficiency drops to zero and the stored hologram cannot be fixed.

Development and marketing

In 1975, Hitachi introduced a video disc system in which chrominance, luminance, and sound information were encoded holographically. Each frame was recorded as a 1 mm diameter hologram on a 305 mm disc, while a laser beam read out the hologram from three angles.

Developed from the pioneering work on holography in photorefractive media and holographic data storage by Gerard A. Alphonse, InPhase conducted public demonstrations of a prototype commercial storage device at the 2005 National Association of Broadcasters (NAB) convention in Las Vegas, at the Maxell Corporation of America booth.

The three main companies involved in developing holographic memory, as of 2002, were InPhase and Polaroid spinoff Aprilis in the United States, and Optware in Japan. Although holographic memory has been discussed since the 1960s, and has been touted for near-term commercial application at least since 2001, it has yet to convince critics that it can find a viable market. As of 2002, planned holographic products did not aim to compete head to head with hard drives, but instead to find a market niche based on virtues such as speed of access.

InPhase Technologies, after several announcements and subsequent delays in 2006 and 2007, announced that it would soon be introducing a flagship product. InPhase went out of business in February 2010 and had its assets seized by the state of Colorado for back taxes. The company had reportedly gone through $100 million but the lead investor was unable to raise more capital.

In April 2009, GE Global Research demonstrated their own holographic storage material that could allow for discs that utilize similar read mechanisms as those found on Blu-ray Disc players.

Video game market

Nintendo filed a Joint Research Agreement with InPhase for holographic storage in 2008.

Nintendo is also mentioned in the patent as a joint applicant: "... disclosure is herein made that the claimed invention was made pursuant to a Joint Research Agreement as defined in 35 U.S.C. 103 (c)(3), that was in effect on or before the date the claimed invention was made, and as a result of activities undertaken within the scope of the Joint Research Agreement, by or on the behalf of Nintendo Co., and InPhase Technologies, Inc.".

In fiction

In Lego Star Wars: The Yoda Chronicles, the Jedi use the holocrons, holographic crystals, to store data about their history. 

In 2010: The Year We Make Contact, a tapeworm had to be employed to erase HAL’s holographic memory as “chronological erasures would not work.” 

In Robot and Frank, Robot has a holographic memory that can be half erased, but at the cost of half the resolution.

Want Sustainable Farming? Look To High-Tech Farms


Image: an Aberdeen Angus cow stands with her six-day-old twin calves, one feeding, on a bed of fresh straw in an animal pen. (Getty)

Beef seems to have had a rough year. A growing body of research suggests that consumers can help mitigate the impacts of climate change by eating less ruminant meat, such as beef and lamb. Despite the raging debate about beef’s impacts, the data shows red meat isn’t going the way of the congealed salad anytime soon. Plenty of consumers in the West and growing numbers of people in the developing world want to eat beef, so it’s critical to make beef production as sustainable as possible.

It may surprise consumers to learn that sustainable and high-tech farming can go hand in hand, as more growers and producers rely on data-gathering technology like sensors and robotic milking machines.

Barbara Jones is the director of the Southwest Regional Dairy Center at Tarleton State University in Stephenville, Texas. She studies cow comfort on dairy farms and technology in dairy systems. “Precision technology is a really rapidly growing [area of technology] we can use to prevent disease and make really timely and informed on-farm decisions.”


While it may seem like technology can create a disconnect between farmer and animal, Jones says the opposite is often true. “It’s not really our fault, but humans are just terrible at detecting disease within cattle,” she explains, because cows are prey animals that tend to hide the symptoms of their disease.

Brad Heins, who is an animal sciences researcher at University of Minnesota, says that technology can often improve animal welfare on farms. “Sometimes you’ll notice that the computer will indicate something may be wrong before you see clinical signs in a cow or calf [but] you still need a person to actually go treat the cow...it can take a long time, especially if you have a larger farm, to look through all of your cows,” so this helps ensure that a sick animal gets treatment faster.
 
More farms across different sectors are turning to data analytic software platforms to help farmers take those huge swaths of data and turn them into action. Land O’Lakes, for example, has developed a program for wheat and other commodity crop farms. Farmers Business Network also works to fill this niche, by providing farmers with a website for sharing information and recommendations within the farming community.

“We have all of this health information, all of this milk information, all of this business information,” she says of new technologies on farms like “FitBits for cows.” “We’re not really doing a great job of combining all of those,” says Jones of dairy farms, which she says tend to be walled off in their own data silos. “Having an analytical platform that could bring all that together would be so helpful.”

Performance Livestock Analytics, a company based in Ames, Iowa, has created a cloud-based platform to help beef producers improve decision-making about nutrition and animal health. “It’s challenging,” says Dane Kuper, CEO of the company, describing the business of beef. “You have a whole industry that’s being challenged more so than it’s ever been challenged to provide that traceability and transparency and become far more effective in reducing our carbon footprint.”

Kuper says the software provides the farmer with a recipe for what to feed each head of cattle, but also that the data is just the starting point. “It’s so much more than just feeding the animal,” he says. “It’s the buying and selling decisions, and how they look at benchmarking themselves...as compared to other producers on our platform.”

Large-scale farms often have an efficiency advantage over smaller operations, because they’re able to look at larger animal population numbers and make decisions based on a larger pool of data. Often smaller operations don’t have the bandwidth, whether literal or metaphorical, to connect with other farmers and get a sense of best practices for the industry. “For a small or midsize family farm, you may have an advisor or a consultant, but a lot of times they don’t know how well they’re doing compared to what they should be doing,” says Kuper. That’s the niche that PLA is aiming to fill. 

“We’re able to provide that intelligence better but also provide them an experience to where they can see their operation...and give them recommendations on where they’re performing high, where they’re performing low and...where they can [improve],” says Kuper. “This morning we had over a thousand cattle operations that woke up and looked at their cattle in real time.”

Quantum Bayesianism

From Wikipedia, the free encyclopedia

Each point in the Bloch ball is a possible quantum state for a qubit. In QBism, all quantum states are representations of personal probabilities.
 
In physics and the philosophy of physics, quantum Bayesianism (abbreviated QBism, pronounced "cubism") is an interpretation of quantum mechanics that takes an agent's actions and experiences as the central concerns of the theory. This interpretation is distinguished by its use of a subjective Bayesian account of probabilities to understand the quantum mechanical Born rule as a normative addition to good decision-making. Rooted in the prior work of Carlton Caves, Christopher Fuchs, and Rüdiger Schack during the early 2000s, QBism itself is primarily associated with Fuchs and Schack and has more recently been adopted by David Mermin. QBism draws from the fields of quantum information and Bayesian probability and aims to eliminate the interpretational conundrums that have beset quantum theory. The QBist interpretation is historically derivative of the views of the various physicists that are often grouped together as "the" Copenhagen interpretation, but is itself distinct from them. Theodor Hänsch has characterized QBism as sharpening those older views and making them more consistent.

More generally, any work that uses a Bayesian or personalist (aka "subjective") treatment of the probabilities that appear in quantum theory is also sometimes called quantum Bayesian. QBism, in particular, has been referred to as "the radical Bayesian interpretation".

QBism deals with common questions in the interpretation of quantum theory about the nature of wavefunction superposition, quantum measurement, and entanglement. According to QBism, many, but not all, aspects of the quantum formalism are subjective in nature. For example, in this interpretation, a quantum state is not an element of reality—instead it represents the degrees of belief an agent has about the possible outcomes of measurements. For this reason, some philosophers of science have deemed QBism a form of anti-realism. The originators of the interpretation disagree with this characterization, proposing instead that the theory more properly aligns with a kind of realism they call "participatory realism", wherein reality consists of more than can be captured by any putative third-person account of it.

In addition to presenting an interpretation of the existing mathematical structure of quantum theory, some QBists have advocated a research program of reconstructing quantum theory from basic physical principles whose QBist character is manifest. The ultimate goal of this research is to identify what aspects of the ontology of the physical world make quantum theory a good tool for agents to use. However, the QBist interpretation itself, as described in the Core positions section, does not depend on any particular reconstruction.

History and development

British philosopher, mathematician, and economist Frank Ramsey, whose interpretation of probability theory closely matches the one adopted by QBism.
 
E. T. Jaynes, a promoter of the use of Bayesian probability in statistical physics, once suggested that quantum theory is "[a] peculiar mixture describing in part realities of Nature, in part incomplete human information about Nature—all scrambled up by Heisenberg and Bohr into an omelette that nobody has seen how to unscramble." QBism developed out of efforts to separate these parts using the tools of quantum information theory and personalist Bayesian probability theory.

There are many interpretations of probability theory. Broadly speaking, these interpretations fall into one of two categories: those which assert that a probability is an objective property of reality and those which assert that a probability is a subjective, mental construct which an agent may use to quantify their ignorance or degree of belief in a proposition. QBism begins by asserting that all probabilities, even those appearing in quantum theory, are most properly viewed as members of the latter category. Specifically, QBism adopts a personalist Bayesian interpretation along the lines of Italian mathematician Bruno de Finetti and English philosopher Frank Ramsey.

According to QBists, the advantages of adopting this view of probability are twofold. First, for QBists the role of quantum states, such as the wavefunctions of particles, is to efficiently encode probabilities; so quantum states are ultimately degrees of belief themselves. (If one considers any single measurement that is a minimal, informationally complete POVM, this is especially clear: A quantum state is mathematically equivalent to a single probability distribution, the distribution over the possible outcomes of that measurement.) Regarding quantum states as degrees of belief implies that the event of a quantum state changing when a measurement occurs—the "collapse of the wave function"—is simply the agent updating her beliefs in response to a new experience. Second, it suggests that quantum mechanics can be thought of as a local theory, because the Einstein–Podolsky–Rosen (EPR) criterion of reality can be rejected. The EPR criterion states, "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity) the value of a physical quantity, then there exists an element of reality corresponding to that quantity." Arguments that quantum mechanics should be considered a nonlocal theory depend upon this principle, but to a QBist, it is invalid, because a personalist Bayesian considers all probabilities, even those equal to unity, to be degrees of belief. Therefore, while many interpretations of quantum theory conclude that quantum mechanics is a nonlocal theory, QBists do not.

Fuchs introduced the term "QBism" and outlined the interpretation in more or less its present form in 2010, carrying further and demanding consistency of ideas broached earlier, notably in publications from 2002. Several subsequent papers have expanded and elaborated upon these foundations, notably a Reviews of Modern Physics article by Fuchs and Schack; an American Journal of Physics article by Fuchs, Mermin, and Schack; and Enrico Fermi Summer School lecture notes by Fuchs and Stacey.

Prior to the 2010 paper, the term "quantum Bayesianism" was used to describe the developments which have since led to QBism in its present form. However, as noted above, QBism subscribes to a particular kind of Bayesianism which does not suit everyone who might apply Bayesian reasoning to quantum theory (see, for example, the Other uses of Bayesian probability in quantum physics section below). Consequently, Fuchs chose to call the interpretation "QBism," pronounced "cubism," preserving the Bayesian spirit via the CamelCase in the first two letters, but distancing it from Bayesianism more broadly. As this neologism is a homonym of Cubism the art movement, it has motivated conceptual comparisons between the two, and media coverage of QBism has been illustrated with art by Picasso and Gris. However, QBism itself was not influenced or motivated by Cubism and has no lineage to a potential connection between Cubist art and Bohr's views on quantum theory.

Core positions

According to QBism, quantum theory is a tool which an agent may use to help manage his or her expectations, more like probability theory than a conventional physical theory. Quantum theory, QBism claims, is fundamentally a guide for decision making which has been shaped by some aspects of physical reality. Chief among the tenets of QBism are the following:
  • All probabilities, including those equal to zero or one, are valuations that an agent ascribes to his or her degrees of belief in possible outcomes. As they define and update probabilities, quantum states (density operators), channels (completely positive trace-preserving maps), and measurements (positive operator-valued measures) are also the personal judgements of an agent.
  • The Born rule is normative, not descriptive. It is a relation to which an agent should strive to adhere in his or her probability and quantum state assignments.
  • Quantum measurement outcomes are personal experiences for the agent gambling on them. Different agents may confer and agree upon the consequences of a measurement, but the outcome is the experience each of them individually has.
  • A measurement apparatus is conceptually an extension of the agent. It should be considered analogous to a sense organ or prosthetic limb—simultaneously a tool and a part of the individual.

Reception and criticism

Jean Metzinger, 1912, Danseuse au café. One advocate of QBism, physicist David Mermin, describes his rationale for choosing that term over the older and more general "quantum Bayesianism": "I prefer [the] term 'QBist' because [this] view of quantum mechanics differs from others as radically as cubism differs from renaissance painting ..."
 
Reactions to the QBist interpretation have ranged from enthusiastic to strongly negative. Some who have criticized QBism claim that it fails to meet the goal of resolving paradoxes in quantum theory. Bacciagaluppi argues that QBism's treatment of measurement outcomes does not ultimately resolve the issue of nonlocality, and Jaeger finds QBism's supposition that the interpretation of probability is key for the resolution to be unnatural and unconvincing. Norsen has accused QBism of solipsism, and Wallace identifies QBism as an instance of instrumentalism; QBists have argued insistently that these characterizations are misunderstandings, and that QBism is neither solipsist nor instrumentalist. A critical article by Nauenberg in the American Journal of Physics prompted a reply by Fuchs, Mermin, and Schack. Some assert that there may be inconsistencies; for example, Stairs argues that when a probability assignment equals one, it cannot be a degree of belief as QBists say. Further, while also raising concerns about the treatment of probability-one assignments, Timpson suggests that QBism may result in a reduction of explanatory power as compared to other interpretations. Fuchs and Schack replied to these concerns in a later article. Mermin advocated QBism in a 2012 Physics Today article, which prompted considerable discussion. Several further critiques of QBism which arose in response to Mermin's article, and Mermin's replies to these comments, may be found in the Physics Today readers' forum. Section 2 of the Stanford Encyclopedia of Philosophy entry on QBism also contains a summary of objections to the interpretation, and some replies. Others are opposed to QBism on more general philosophical grounds; for example, Mohrhoff criticizes QBism from the standpoint of Kantian philosophy.

Certain authors find QBism internally self-consistent, but do not subscribe to the interpretation. For example, Marchildon finds QBism well-defined in a way that, to him, many-worlds interpretations are not, but he ultimately prefers a Bohmian interpretation. Similarly, Schlosshauer and Claringbold state that QBism is a consistent interpretation of quantum mechanics, but do not offer a verdict on whether it should be preferred. In addition, some agree with most, but perhaps not all, of the core tenets of QBism; Barnum's position, as well as Appleby's, are examples.

Popularized or semi-popularized media coverage of QBism has appeared in New Scientist, Scientific American, Nature, Science News, the FQXi Community, the Frankfurter Allgemeine Zeitung, Quanta Magazine, Aeon, and Discover. In 2018, two popular-science books about the interpretation of quantum mechanics, Ball's Beyond Weird and Ananthaswamy's Through Two Doors at Once, devote sections to QBism. Furthermore, Harvard University Press published a popularized treatment of the subject, QBism: The Future of Quantum Physics, in 2016.

Relation to other interpretations

Group photo from the 2005 University of Konstanz conference Being Bayesian in a Quantum World.

Copenhagen interpretations

The views of many physicists (Bohr, Heisenberg, Rosenfeld, von Weizsäcker, Peres, etc.) are often grouped together as the "Copenhagen interpretation" of quantum mechanics. Several authors have deprecated this terminology, claiming that it is historically misleading and obscures differences between physicists that are as important as their similarities. QBism shares many characteristics in common with the ideas often labeled as "the Copenhagen interpretation", but the differences are important; to conflate them or to regard QBism as a minor modification of the points of view of Bohr or Heisenberg, for instance, would be a substantial misrepresentation.

QBism takes probabilities to be personal judgments of the individual agent who is using quantum mechanics. This contrasts with older Copenhagen-type views, which hold that probabilities are given by quantum states that are in turn fixed by objective facts about preparation procedures. QBism considers a measurement to be any action that an agent takes to elicit a response from the world and the outcome of that measurement to be the experience the world's response induces back on that agent. As a consequence, communication between agents is the only means by which different agents can attempt to compare their internal experiences. Most variants of the Copenhagen interpretation, however, hold that the outcomes of experiments are agent-independent pieces of reality for anyone to access. QBism claims that these points on which it differs from previous Copenhagen-type interpretations resolve the obscurities that many critics have found in the latter, by changing the role that quantum theory plays (even though QBism does not yet provide a specific underlying ontology). Specifically, QBism posits that quantum theory is a normative tool which an agent may use to better navigate reality, rather than a mechanics of reality.

Other epistemic interpretations

Approaches to quantum theory, like QBism, which treat quantum states as expressions of information, knowledge, belief, or expectation are called "epistemic" interpretations. These approaches differ from each other in what they consider quantum states to be information or expectations "about", as well as in the technical features of the mathematics they employ. Furthermore, not all authors who advocate views of this type propose an answer to the question of what the information represented in quantum states concerns. In the words of the paper that introduced the Spekkens Toy Model,
...if a quantum state is a state of knowledge, and it is not knowledge of local and noncontextual hidden variables, then what is it knowledge about? We do not at present have a good answer to this question. We shall therefore remain completely agnostic about the nature of the reality to which the knowledge represented by quantum states pertains. This is not to say that the question is not important. Rather, we see the epistemic approach as an unfinished project, and this question as the central obstacle to its completion. Nonetheless, we argue that even in the absence of an answer to this question, a case can be made for the epistemic view. The key is that one can hope to identify phenomena that are characteristic of states of incomplete knowledge regardless of what this knowledge is about.
Leifer and Spekkens propose a way of treating quantum probabilities as Bayesian probabilities, thereby considering quantum states as epistemic, which they state is "closely aligned in its philosophical starting point" with QBism. However, they remain deliberately agnostic about what physical properties or entities quantum states are information (or beliefs) about, as opposed to QBism, which offers an answer to that question. Another approach, advocated by Bub and Pitowsky, argues that quantum states are information about propositions within event spaces that form non-Boolean lattices. On occasion, the proposals of Bub and Pitowsky are also called "quantum Bayesianism".

Zeilinger and Brukner have also proposed an interpretation of quantum mechanics in which "information" is a fundamental concept, and in which quantum states are epistemic quantities. Unlike QBism, the Brukner–Zeilinger interpretation treats some probabilities as objectively fixed. In the Brukner–Zeilinger interpretation, a quantum state represents the information that a hypothetical observer in possession of all possible data would have. Put another way, a quantum state belongs in their interpretation to an optimally-informed agent, whereas in QBism, any agent can formulate a state to encode her own expectations. Despite this difference, in Cabello's classification, the proposals of Zeilinger and Brukner are also designated as "participatory realism," as QBism and the Copenhagen-type interpretations are.

Bayesian, or epistemic, interpretations of quantum probabilities were proposed in the early 1990s by Baez and Youssef.

Von Neumann's views

R. F. Streater argued that "[t]he first quantum Bayesian was von Neumann," basing that claim on von Neumann's textbook The Mathematical Foundations of Quantum Mechanics. Blake Stacey disagrees, arguing that the views expressed in that book on the nature of quantum states and the interpretation of probability are not compatible with QBism, or indeed, with any position that might be called quantum Bayesianism.

Relational quantum mechanics

Comparisons have also been made between QBism and the relational quantum mechanics (RQM) espoused by Carlo Rovelli and others. In both QBism and RQM, quantum states are not intrinsic properties of physical systems. Both QBism and RQM deny the existence of an absolute, universal wavefunction. Furthermore, both QBism and RQM insist that quantum mechanics is a fundamentally local theory. In addition, Rovelli, like several QBist authors, advocates reconstructing quantum theory from physical principles in order to bring clarity to the subject of quantum foundations. One important distinction between the two interpretations is their philosophy of probability: RQM does not adopt the Ramsey–de Finetti school of personalist Bayesianism. Moreover, RQM does not insist that a measurement outcome is necessarily an agent's experience.

Other uses of Bayesian probability in quantum physics

QBism should be distinguished from other applications of Bayesian inference in quantum physics, and from quantum analogues of Bayesian inference. For example, some in the field of computer science have introduced a kind of quantum Bayesian network, which they argue could have applications in "medical diagnosis, monitoring of processes, and genetics". Bayesian inference has also been applied in quantum theory for updating probability densities over quantum states, and MaxEnt methods have been used in similar ways. Bayesian methods for quantum state and process tomography are an active area of research.
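As a toy illustration of the sort of Bayesian updating over quantum states mentioned above (a minimal sketch, not any specific published scheme), one can put a prior density over a one-parameter family of qubit states and update it with Born-rule likelihoods; the parameterization and the simulated measurement record below are assumptions made for illustration:

```python
import numpy as np

# One-parameter family of qubit states: rho(theta) diagonal in the Z basis,
# with Born-rule likelihood P(outcome "0" | theta) = (1 + cos(theta)) / 2.
thetas = np.linspace(0.0, np.pi, 501)
prior = np.ones_like(thetas)
prior /= np.trapz(prior, thetas)          # flat prior density over theta

p0 = (1.0 + np.cos(thetas)) / 2.0         # likelihood of outcome "0" for each theta

# Simulated Z-measurement record (1 = outcome "0", 0 = outcome "1"); illustrative.
data = [1, 1, 0, 1, 1, 1, 0, 1]

posterior = prior.copy()
for outcome in data:
    likelihood = p0 if outcome == 1 else (1.0 - p0)
    posterior *= likelihood
    posterior /= np.trapz(posterior, thetas)   # renormalize after each update

theta_mean = np.trapz(thetas * posterior, thetas)
print(f"posterior mean of theta: {theta_mean:.3f} rad "
      f"(implies <Z> ~ {np.cos(theta_mean):.3f})")
```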

Technical developments and reconstructing quantum theory

Conceptual concerns about the interpretation of quantum mechanics and the meaning of probability have motivated technical work. A quantum version of the de Finetti theorem, introduced by Caves, Fuchs, and Schack (independently reproving a result found using different means by Størmer) to provide a Bayesian understanding of the idea of an "unknown quantum state", has found application elsewhere, in topics like quantum key distribution and entanglement detection.

Adherents of several interpretations of quantum mechanics, QBism included, have been motivated to reconstruct quantum theory. The goal of these research efforts has been to identify a new set of axioms or postulates from which the mathematical structure of quantum theory can be derived, in the hope that with such a reformulation, the features of nature which made quantum theory the way it is might be more easily identified. Although the core tenets of QBism do not demand such a reconstruction, some QBists—Fuchs, in particular—have argued that the task should be pursued.

One topic prominent in the reconstruction effort is the set of mathematical structures known as symmetric, informationally-complete, positive operator-valued measures (SIC-POVMs). QBist foundational research stimulated interest in these structures, which now have applications in quantum theory outside of foundational studies and in pure mathematics.

The most extensively explored QBist reformulation of quantum theory involves the use of SIC-POVMs to rewrite quantum states (either pure or mixed) as a set of probabilities defined over the outcomes of a "Bureau of Standards" measurement. That is, if one expresses a density matrix as a probability distribution over the outcomes of a SIC-POVM experiment, one can reproduce all the statistical predictions implied by the density matrix from the SIC-POVM probabilities instead. The Born rule then takes the role of relating one valid probability distribution to another, rather than of deriving probabilities from something apparently more fundamental. Fuchs, Schack, and others have taken to calling this restatement of the Born rule the urgleichung, from the German for "primal equation" (see Ur- prefix), because of the central role it plays in their reconstruction of quantum theory.

The following discussion presumes some familiarity with the mathematics of quantum information theory, and in particular, the modeling of measurement procedures by POVMs. Consider a quantum system to which is associated a $d$-dimensional Hilbert space. If a set of $d^2$ rank-1 projectors $\hat{\Pi}_i$ satisfying

$$\operatorname{Tr}\,\hat{\Pi}_i\hat{\Pi}_j = \frac{d\,\delta_{ij}+1}{d+1}$$

exists, then one may form a SIC-POVM $\{\hat{H}_i = \tfrac{1}{d}\hat{\Pi}_i\}$. An arbitrary quantum state $\hat{\rho}$ may be written as a linear combination of the SIC projectors,

$$\hat{\rho} = \sum_{i=1}^{d^2}\left[(d+1)P(H_i) - \frac{1}{d}\right]\hat{\Pi}_i,$$

where $P(H_i) = \operatorname{Tr}\,\hat{\rho}\hat{H}_i$ is the Born rule probability for obtaining SIC measurement outcome $H_i$ implied by the state assignment $\hat{\rho}$. We follow the convention that operators have hats while experiences (that is, measurement outcomes) do not. Now consider an arbitrary quantum measurement, denoted by the POVM $\{\hat{D}_j\}$. The urgleichung is the expression obtained from forming the Born rule probabilities, $Q(D_j)$, for the outcomes of this quantum measurement,

$$Q(D_j) = \sum_{i=1}^{d^2}\left[(d+1)P(H_i) - \frac{1}{d}\right]P(D_j \mid H_i),$$

where $P(D_j \mid H_i)$ is the Born rule probability for obtaining outcome $D_j$ implied by the state assignment $\hat{\Pi}_i$. The term $P(D_j \mid H_i)$ may be understood as a conditional probability in a cascaded measurement scenario: imagine that an agent plans to perform two measurements, first a SIC measurement and then the $\{\hat{D}_j\}$ measurement. After obtaining an outcome from the SIC measurement, the agent will update her state assignment to a new quantum state $\hat{\rho}'$ before performing the second measurement. If she uses the Lüders rule for state update and obtains outcome $H_i$ from the SIC measurement, then $\hat{\rho}' = \hat{\Pi}_i$. Thus the probability for obtaining outcome $D_j$ for the second measurement conditioned on obtaining outcome $H_i$ for the SIC measurement is

$$P(D_j \mid H_i) = \operatorname{Tr}\,\hat{\Pi}_i\hat{D}_j.$$

Note that the urgleichung is structurally very similar to the law of total probability, which is the expression

$$P(D_j) = \sum_{i=1}^{d^2} P(H_i)\,P(D_j \mid H_i).$$

They functionally differ only by a dimension-dependent affine transformation of the SIC probability vector. As QBism says that quantum theory is an empirically motivated normative addition to probability theory, Fuchs and others find the appearance of a structure in quantum theory analogous to one in probability theory to be an indication that a reformulation featuring the urgleichung prominently may help to reveal the properties of nature which made quantum theory so successful.

It is important to recognize that the urgleichung does not replace the law of total probability. Rather, the urgleichung and the law of total probability apply in different scenarios, because $P(D_j)$ and $Q(D_j)$ refer to different situations. $P(D_j)$ is the probability that an agent assigns for obtaining outcome $D_j$ on her second of two planned measurements, that is, for obtaining outcome $D_j$ after first making the SIC measurement and obtaining one of the $H_i$ outcomes. $Q(D_j)$, on the other hand, is the probability an agent assigns for obtaining outcome $D_j$ when she does not plan to first make the SIC measurement. The law of total probability is a consequence of coherence within the operational context of performing the two measurements as described. The urgleichung, in contrast, is a relation between different contexts which finds its justification in the predictive success of quantum physics.

The SIC representation of quantum states also provides a reformulation of quantum dynamics. Consider a quantum state $\hat{\rho}$ with SIC representation $P(H_i)$. The time evolution of this state is found by applying a unitary operator $\hat{U}$ to form the new state $\hat{U}\hat{\rho}\hat{U}^{\dagger}$, which has the SIC representation

$$P'(H_i) = \operatorname{Tr}\left[(\hat{U}\hat{\rho}\hat{U}^{\dagger})\hat{H}_i\right] = \operatorname{Tr}\left[\hat{\rho}\,(\hat{U}^{\dagger}\hat{H}_i\hat{U})\right].$$

The second equality is written in the Heisenberg picture of quantum dynamics, with respect to which the time evolution of a quantum system is captured by the probabilities associated with a rotated SIC measurement $\{\hat{U}^{\dagger}\hat{H}_i\hat{U}\}$ of the original quantum state $\hat{\rho}$. Then the Schrödinger equation is completely captured in the urgleichung for this measurement:

$$P'(H_j) = \sum_{i=1}^{d^2}\left[(d+1)P(H_i) - \frac{1}{d}\right]\operatorname{Tr}\left[\hat{\Pi}_i\,\hat{U}^{\dagger}\hat{H}_j\hat{U}\right].$$

In these terms, the Schrödinger equation is an instance of the Born rule applied to the passing of time; an agent uses it to relate how she will gamble on informationally complete measurements potentially performed at different times.
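A numerical sanity check of the urgleichung for a single qubit ($d = 2$), a minimal sketch using the standard tetrahedral SIC; the random state and the choice of test measurement below are arbitrary illustrative assumptions:

```python
import numpy as np

# Pauli matrices and identity
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

d = 2  # qubit Hilbert-space dimension

# Tetrahedral Bloch vectors give rank-1 SIC projectors Pi_i = (I + r.sigma)/2
bloch = np.array([[ 1,  1,  1],
                  [ 1, -1, -1],
                  [-1,  1, -1],
                  [-1, -1,  1]]) / np.sqrt(3)
Pi = [(I + r[0] * X + r[1] * Y + r[2] * Z) / 2 for r in bloch]
H = [P / d for P in Pi]  # SIC-POVM elements H_i = Pi_i / d

# Check the defining overlap condition Tr(Pi_i Pi_j) = (d*delta_ij + 1)/(d + 1)
for i in range(d * d):
    for j in range(d * d):
        assert abs(np.trace(Pi[i] @ Pi[j]).real - (d * (i == j) + 1) / (d + 1)) < 1e-12

# An arbitrary (here random) density matrix rho
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = A @ A.conj().T
rho /= np.trace(rho)

# A test POVM {D_0, D_1}: projectors onto the X eigenbasis
plus = np.array([[1], [1]]) / np.sqrt(2)
D = [plus @ plus.conj().T, I - plus @ plus.conj().T]

p_sic = np.array([np.trace(rho @ h).real for h in H])  # P(H_i)
for Dj in D:
    cond = np.array([np.trace(P @ Dj).real for P in Pi])  # P(D_j | H_i)
    q_urgleichung = np.sum(((d + 1) * p_sic - 1 / d) * cond)
    q_born = np.trace(rho @ Dj).real
    print(f"urgleichung: {q_urgleichung:.6f}   Born rule: {q_born:.6f}")  # they agree
```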
Those QBists who find this approach promising are pursuing a complete reconstruction of quantum theory featuring the urgleichung as the key postulate. (The urgleichung has also been discussed in the context of category theory.) Comparisons between this approach and others not associated with QBism (or indeed with any particular interpretation) can be found in a book chapter by Fuchs and Stacey and an article by Appleby et al. As of 2017, alternative QBist reconstruction efforts are in the beginning stages.

Computer-aided software engineering

From Wikipedia, the free encyclopedia ...