
Saturday, January 19, 2019

Quantum Theory Rebuilt From Simple Physical Principles

Physicists are trying to rewrite the axioms of quantum theory from scratch in an effort to understand what it all means. The problem? They’ve been almost too successful.
 
Quantum reconstruction
Ulises Farinas for Quanta Magazine


Scientists have been using quantum theory for almost a century now, but embarrassingly they still don’t know what it means. An informal poll taken at a 2011 conference on Quantum Physics and the Nature of Reality showed that there’s still no consensus on what quantum theory says about reality — the participants remained deeply divided about how the theory should be interpreted.

Some physicists just shrug and say we have to live with the fact that quantum mechanics is weird. So particles can be in two places at once, or communicate instantaneously over vast distances? Get over it. After all, the theory works fine. If you want to calculate what experiments will reveal about subatomic particles, atoms, molecules and light, then quantum mechanics succeeds brilliantly.

But some researchers want to dig deeper. They want to know why quantum mechanics has the form it does, and they are engaged in an ambitious program to find out. It is called quantum reconstruction, and it amounts to trying to rebuild the theory from scratch based on a few simple principles.

If these efforts succeed, it’s possible that all the apparent oddness and confusion of quantum mechanics will melt away, and we will finally grasp what the theory has been trying to tell us. “For me, the ultimate goal is to prove that quantum theory is the only theory where our imperfect experiences allow us to build an ideal picture of the world,” said Giulio Chiribella, a theoretical physicist at the University of Hong Kong.

There’s no guarantee of success — no assurance that quantum mechanics really does have something plain and simple at its heart, rather than the abstruse collection of mathematical concepts used today. But even if quantum reconstruction efforts don’t pan out, they might point the way to an equally tantalizing goal: getting beyond quantum mechanics itself to a still deeper theory. “I think it might help us move towards a theory of quantum gravity,” said Lucien Hardy, a theoretical physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

The Flimsy Foundations of Quantum Mechanics

The basic premise of the quantum reconstruction game is summed up by the joke about the driver who, lost in rural Ireland, asks a passer-by how to get to Dublin. “I wouldn’t start from here,” comes the reply.

Where, in quantum mechanics, is “here”? The theory arose out of attempts to understand how atoms and molecules interact with light and other radiation, phenomena that classical physics couldn’t explain. Quantum theory was empirically motivated, and its rules were simply ones that seemed to fit what was observed. It uses mathematical formulas that, while tried and trusted, were essentially pulled out of a hat by the pioneers of the theory in the early 20th century.

Take Erwin Schrödinger’s equation for calculating the probabilistic properties of quantum particles. The particle is described by a “wave function” that encodes all we can know about it. It’s basically a wavelike mathematical expression, reflecting the well-known fact that quantum particles can sometimes seem to behave like waves. Want to know the probability that the particle will be observed in a particular place? Just calculate the square of the wave function (or, to be exact, the square of its modulus, since the wave function is in general a complex quantity), and from that you can deduce how likely you are to detect the particle there. The probability of measuring some of its other observable properties can be found by, crudely speaking, applying a mathematical function called an operator to the wave function.
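To make the recipe concrete, here is a minimal numerical sketch of that rule (the Born rule). The Gaussian wave packet and the grid spacing are purely illustrative choices, not anything prescribed by the theory itself:

```python
import math

# Toy one-dimensional wave function: a Gaussian wave packet sampled on a
# grid (the shape is illustrative -- any normalizable function would do).
dx = 0.01
xs = [i * dx for i in range(-500, 501)]
psi = [math.exp(-x ** 2) for x in xs]               # unnormalized amplitudes

# Normalize so the total detection probability is 1.
norm = math.sqrt(sum(abs(p) ** 2 * dx for p in psi))
psi = [p / norm for p in psi]

def prob_in_region(lo, hi):
    """Born rule: integrate |psi|^2 over [lo, hi] to get a probability."""
    return sum(abs(p) ** 2 * dx for x, p in zip(xs, psi) if lo <= x <= hi)

total = prob_in_region(-10, 10)       # the particle is detected somewhere
near_origin = prob_in_region(-1, 1)   # chance of finding it near x = 0
```

Squaring the modulus is the whole trick: the wave function itself is not a probability, but its squared modulus, summed over a region, is.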

Compare this with the ground rules, or axioms, of Einstein’s theory of special relativity, which was as revolutionary in its way as quantum mechanics. (Einstein launched them both, rather miraculously, in 1905.) Before Einstein, there was an untidy collection of equations to describe how light behaves from the point of view of a moving observer. Einstein dispelled the mathematical fog with two simple and intuitive principles: that the speed of light is constant, and that the laws of physics are the same for two observers moving at constant speed relative to one another. Grant these basic principles, and the rest of the theory follows. Not only are the axioms simple, but we can see at once what they mean in physical terms.

What are the analogous statements for quantum mechanics? The eminent physicist John Wheeler once asserted that if we really understood the central point of quantum theory, we would be able to state it in one simple sentence that anyone could understand. If such a statement exists, some quantum reconstructionists suspect that we’ll find it only by rebuilding quantum theory from scratch: by tearing up the work of Bohr, Heisenberg and Schrödinger and starting again.

Quantum Roulette

One of the first efforts at quantum reconstruction was made in 2001 by Hardy, then at the University of Oxford. He ignored everything that we typically associate with quantum mechanics, such as quantum jumps, wave-particle duality and uncertainty. Instead, Hardy focused on probability: specifically, the probabilities that relate the possible states of a system with the chance of observing each state in a measurement. Hardy found that these bare bones were enough to get all that familiar quantum stuff back again.

Hardy assumed that any system can be described by some list of properties and their possible values. For example, in the case of a tossed coin, the salient values might be whether it comes up heads or tails. Then he considered the possibilities for measuring those values definitively in a single observation. You might think any distinct state of any system can always be reliably distinguished (at least in principle) by a measurement or observation. And that’s true for objects in classical physics.

Lucien Hardy
Lucien Hardy, a physicist at the Perimeter Institute, was one of the first to derive the rules of quantum mechanics from simple principles.
Gabriela Secara – Perimeter Institute

In quantum mechanics, however, a particle can exist not just in distinct states, like the heads and tails of a coin, but in a so-called superposition — roughly speaking, a combination of those states. In other words, a quantum bit, or qubit, can be not just in the binary state of 0 or 1, but in a superposition of the two.

But if you make a measurement of that qubit, you’ll only ever get a result of 1 or 0. That is the mystery of quantum mechanics, often referred to as the collapse of the wave function: Measurements elicit only one of the possible outcomes. To put it another way, a quantum object commonly has more options for measurements encoded in the wave function than can be seen in practice.
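A few lines of simulation capture this asymmetry between the state and what measurement reveals. Assuming the standard Born rule (outcome probabilities are the squared moduli of the amplitudes), every simulated measurement returns a definite 0 or 1, even though the state itself carries both amplitudes:

```python
import random

# A qubit as a pair of complex amplitudes (alpha for |0>, beta for |1>),
# with |alpha|^2 + |beta|^2 = 1.  Here: an equal superposition.
alpha = beta = complex(1 / 2 ** 0.5)

def measure(alpha, beta):
    """One measurement: collapse to 0 or 1 with Born-rule probabilities."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

random.seed(0)
outcomes = [measure(alpha, beta) for _ in range(10_000)]
freq_of_1 = sum(outcomes) / len(outcomes)   # ~0.5 for this state
```

No single run ever shows “both at once”; the superposition is visible only in the statistics over many runs.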

Hardy’s rules governing possible states and their relationship to measurement outcomes acknowledged this property of quantum bits. In essence the rules were (probabilistic) ones about how systems can carry information and how they can be combined and interconverted.

Hardy then showed that the simplest possible theory to describe such systems is quantum mechanics, with all its characteristic phenomena such as wavelike interference and entanglement, in which the properties of different objects become interdependent. “Hardy’s 2001 paper was the ‘Yes, we can!’ moment of the reconstruction program,” Chiribella said. “It told us that in some way or another we can get to a reconstruction of quantum theory.”

More specifically, it implied that the core trait of quantum theory is that it is inherently probabilistic. “Quantum theory can be seen as a generalized probability theory, an abstract thing that can be studied detached from its application to physics,” Chiribella said. This approach doesn’t address any underlying physics at all, but just considers how outputs are related to inputs: what we can measure given how a state is prepared (a so-called operational perspective). “What the physical system is is not specified and plays no role in the results,” Chiribella said. These generalized probability theories are “pure syntax,” he added — they relate states and measurements, just as linguistic syntax relates categories of words, without regard to what the words mean. In other words, Chiribella explained, generalized probability theories “are the syntax of physical theories, once we strip them of the semantics.”

To distinguish quantum theory from a generalized probability theory, you need specific kinds of constraints on the probabilities and possible outcomes of measurement. But those constraints aren’t unique. So lots of possible theories of probability look quantum-like. How then do you pick out the right one?

“We can look for probabilistic theories that are similar to quantum theory but differ in specific aspects,” said Matthias Kleinmann, a theoretical physicist at the University of the Basque Country in Bilbao, Spain. If you can then find postulates that select quantum mechanics specifically, he explained, you can “drop or weaken some of them and work out mathematically what other theories appear as solutions.” Such exploration of what lies beyond quantum mechanics is not just academic doodling, for it’s possible — indeed, likely — that quantum mechanics is itself just an approximation of a deeper theory. That theory might emerge, as quantum theory did from classical physics, from violations in quantum theory that appear if we push it hard enough.

Bits and Pieces

Some researchers suspect that ultimately the axioms of a quantum reconstruction will be about information: what can and can’t be done with it. One such derivation of quantum theory based on axioms about information was proposed in 2010 by Chiribella, then working at the Perimeter Institute, and his collaborators Giacomo Mauro D’Ariano and Paolo Perinotti of the University of Pavia in Italy. “Loosely speaking,” explained Jacques Pienaar, a theoretical physicist at the University of Vienna, “their principles state that information should be localized in space and time, that systems should be able to encode information about each other, and that every process should in principle be reversible, so that information is conserved.” (In irreversible processes, by contrast, information is typically lost — just as it is when you erase a file on your hard drive.)

What’s more, said Pienaar, these axioms can all be explained using ordinary language. “They all pertain directly to the elements of human experience, namely, what real experimenters ought to be able to do with the systems in their laboratories,” he said. “And they all seem quite reasonable, so that it is easy to accept their truth.” Chiribella and his colleagues showed that a system governed by these rules shows all the familiar quantum behaviors, such as superposition and entanglement.

Giulio Chiribella
Giulio Chiribella, a physicist at the University of Hong Kong, reconstructed quantum theory from ideas in information theory.
Courtesy of CIFAR

One challenge is to decide what should be designated an axiom and what physicists should try to derive from the axioms. Take the quantum no-cloning rule, which is another of the principles that naturally arises from Chiribella’s reconstruction. One of the deep findings of modern quantum theory, this principle states that it is impossible to make a duplicate of an arbitrary, unknown quantum state.

It sounds like a technicality (albeit a highly inconvenient one for scientists and mathematicians seeking to design quantum computers). But in an effort in 2002 to derive quantum mechanics from rules about what is permitted with quantum information, Jeffrey Bub of the University of Maryland and his colleagues Rob Clifton of the University of Pittsburgh and Hans Halvorson of Princeton University made no-cloning one of three fundamental axioms. One of the others was a straightforward consequence of special relativity: You can’t transmit information between two objects more quickly than the speed of light by making a measurement on one of the objects. The third axiom was harder to state, but it also crops up as a constraint on quantum information technology. In essence, it limits how securely a bit of information can be exchanged without being tampered with: The rule is a prohibition on what is called “unconditionally secure bit commitment.”

These axioms seem to relate to the practicalities of managing quantum information. But if we consider them instead to be fundamental, and if we additionally assume that the algebra of quantum theory has a property called non-commutation, meaning that the order in which you do calculations matters (in contrast to the multiplication of two numbers, which can be done in any order), Clifton, Bub and Halvorson have shown that these rules too give rise to superposition, entanglement, uncertainty, nonlocality and so on: the core phenomena of quantum theory.
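The no-cloning axiom in that list can be illustrated with a standard textbook calculation (generic linear algebra, not the specific construction of Clifton, Bub and Halvorson): a CNOT gate copies the basis states into a blank qubit perfectly, but applied to a superposition it produces entanglement rather than two independent copies.

```python
import math

def kron2(a, b):
    """Tensor product of two single-qubit state vectors -> 4-dim vector."""
    return [x * y for x in a for y in b]

def cnot(state):
    """CNOT on a two-qubit vector: swaps the |10> and |11> amplitudes."""
    return [state[0], state[1], state[3], state[2]]

zero, one = [1.0, 0.0], [0.0, 1.0]
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]   # superposition (|0> + |1>)/sqrt(2)

# Basis states are copied into a blank target qubit perfectly...
assert cnot(kron2(zero, zero)) == kron2(zero, zero)   # |00> -> |00>
assert cnot(kron2(one, zero)) == kron2(one, one)      # |10> -> |11>

# ...but the superposition is not: CNOT yields the entangled Bell state
# (|00> + |11>)/sqrt(2), while a true clone would be plus (x) plus.
entangled = cnot(kron2(plus, zero))
true_clone = kron2(plus, plus)
overlap = sum(a * b for a, b in zip(entangled, true_clone))  # < 1: not a copy
```

Because the overlap falls short of 1, the circuit has not duplicated the unknown state — and the no-cloning theorem says no circuit can.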

Another information-focused reconstruction was suggested in 2009 by Borivoje Dakić and Časlav Brukner, physicists at the University of Vienna. They proposed three “reasonable axioms” having to do with information capacity: that the most elementary component of all systems can carry no more than one bit of information, that the state of a composite system made up of subsystems is completely determined by measurements on its subsystems, and that you can convert any “pure” state to another and back again (like flipping a coin between heads and tails).

Dakić and Brukner showed that these assumptions lead inevitably to classical and quantum-style probability, and to no other kinds. What’s more, if you modify axiom three to say that states get converted continuously — little by little, rather than in one big jump — you get only quantum theory, not classical. (Yes, it really is that way round, contrary to what the “quantum jump” idea would have you expect — you can interconvert states of quantum spins by rotating their orientation smoothly, but you can’t gradually convert a classical heads to a tails.) “If we don’t have continuity, then we don’t have quantum theory,” said Alexei Grinbaum, a philosopher of physics at the CEA Saclay research center in France.
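The continuity point is easy to see in miniature (a generic illustration, not Dakić and Brukner’s own formalism): the state cos θ|0> + sin θ|1> is a valid quantum state for every intermediate angle, so |0> can be steered smoothly into |1>, whereas a classical bit has no halfway configurations.

```python
import math

def rotated_state(theta):
    """Amplitudes of cos(theta)|0> + sin(theta)|1>."""
    return (math.cos(theta), math.sin(theta))

# Eleven evenly spaced stops on a continuous path from |0> to |1>.
path = [rotated_state(math.pi / 2 * k / 10) for k in range(11)]

start, end = path[0], path[-1]   # (1, 0) is |0>; (0, 1) is |1>
# Every intermediate state is still normalized -- a legal quantum state.
all_valid = all(abs(a * a + b * b - 1) < 1e-12 for a, b in path)
```
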

Christopher Fuchs
Christopher Fuchs, a physicist at the University of Massachusetts, Boston, argues that quantum theory describes rules for updating an observer’s personal beliefs.
Katherine Taylor for Quanta Magazine

A further approach in the spirit of quantum reconstruction is called quantum Bayesianism, or QBism. Devised by Carlton Caves, Christopher Fuchs and Rüdiger Schack in the early 2000s, it takes the provocative position that the mathematical machinery of quantum mechanics has nothing to do with the way the world really is; rather, it is just the appropriate framework that lets us develop expectations and beliefs about the outcomes of our interventions. It takes its cue from the Bayesian approach to classical probability developed in the 18th century, in which probabilities stem from personal beliefs rather than observed frequencies. In QBism, quantum probabilities calculated by the Born rule don’t tell us what we’ll measure, but only what we should rationally expect to measure.

In this view, the world isn’t bound by rules — or at least, not by quantum rules. Indeed, there may be no fundamental laws governing the way particles interact; instead, laws emerge at the scale of our observations. This possibility was considered by John Wheeler, who dubbed the scenario Law Without Law. It would mean that “quantum theory is merely a tool to make comprehensible a lawless slicing-up of nature,” said Adán Cabello, a physicist at the University of Seville. Can we derive quantum theory from these premises alone?

“At first sight, it seems impossible,” Cabello admitted — the ingredients seem far too thin, not to mention arbitrary and alien to the usual assumptions of science. “But what if we manage to do it?” he asked. “Shouldn’t this shock anyone who thinks of quantum theory as an expression of properties of nature?”

Making Space for Gravity

In Hardy’s view, quantum reconstructions have been almost too successful, in one sense: Various sets of axioms all give rise to the basic structure of quantum mechanics. “We have these different sets of axioms, but when you look at them, you can see the connections between them,” he said. “They all seem reasonably good and are in a formal sense equivalent because they all give you quantum theory.” And that’s not quite what he’d hoped for. “When I started on this, what I wanted to see was two or so obvious, compelling axioms that would give you quantum theory and which no one would argue with.”

So how do we choose between the options available? “My suspicion now is that there is still a deeper level to go to in understanding quantum theory,” Hardy said. And he hopes that this deeper level will point beyond quantum theory, to the elusive goal of a quantum theory of gravity. “That’s the next step,” he said. Several researchers working on reconstructions now hope that the axiomatic approach will help us see how to pose quantum theory in a way that forges a connection with the modern theory of gravitation — Einstein’s general relativity.

Hardy first suggested that quantum-gravitational systems might show indefinite causal structure in 2007. And in fact only quantum mechanics can display that. While working on quantum reconstructions, Chiribella was inspired to propose an experiment to create causal superpositions of quantum systems, in which there is no definite series of cause-and-effect events. This experiment has now been carried out by Philip Walther’s lab at the University of Vienna — and it might incidentally point to a way of making quantum computing more efficient.

“I find this a striking illustration of the usefulness of the reconstruction approach,” Chiribella said. “Capturing quantum theory with axioms is not just an intellectual exercise. We want the axioms to do something useful for us — to help us reason about quantum theory, invent new communication protocols and new algorithms for quantum computers, and to be a guide for the formulation of new physics.”

But can quantum reconstructions also help us understand the “meaning” of quantum mechanics? Hardy doubts that these efforts can resolve arguments about interpretation — whether we need many worlds or just one, for example. After all, precisely because the reconstructionist program is inherently “operational,” meaning that it focuses on the “user experience” — probabilities about what we measure — it may never speak about the “underlying reality” that creates those probabilities.

“When I went into this approach, I hoped it would help to resolve these interpretational problems,” Hardy admitted. “But I would say it hasn’t.” Cabello agrees. “One can argue that previous reconstructions failed to make quantum theory less puzzling or to explain where quantum theory comes from,” he said. “All of them seem to miss the mark for an ultimate understanding of the theory.” But he remains optimistic: “I still think that the right approach will dissolve the problems and we will understand the theory.”

Maybe, Hardy said, these challenges stem from the fact that the more fundamental description of reality is rooted in that still undiscovered theory of quantum gravity. “Perhaps when we finally get our hands on quantum gravity, the interpretation will suggest itself,” he said. “Or it might be worse!”

Right now, quantum reconstruction has few adherents — which pleases Hardy, as it means that it’s still a relatively tranquil field. But if it makes serious inroads into quantum gravity, that will surely change. In the 2011 poll, about a quarter of the respondents felt that quantum reconstructions will lead to a new, deeper theory. A one-in-four chance certainly seems worth a shot.

Grinbaum thinks that the task of building the whole of quantum theory from scratch with a handful of axioms may ultimately be unsuccessful. “I’m now very pessimistic about complete reconstructions,” he said. But, he suggested, why not try to do it piece by piece instead — to just reconstruct particular aspects, such as nonlocality or causality? “Why would one try to reconstruct the entire edifice of quantum theory if we know that it’s made of different bricks?” he asked. “Reconstruct the bricks first. Maybe remove some and look at what kind of new theory may emerge.”

“I think quantum theory as we know it will not stand,” Grinbaum said. “Which of its feet of clay will break first is what reconstructions are trying to explore.” He thinks that, as this daunting task proceeds, some of the most vexing and vague issues in standard quantum theory — such as the process of measurement and the role of the observer — will disappear, and we’ll see that the real challenges are elsewhere. “What is needed is new mathematics that will render these notions scientific,” he said. Then, perhaps, we’ll understand what we’ve been arguing about for so long.

Fermentation

From Wikipedia, the free encyclopedia

Fermentation in progress: Bubbles of CO2 form a froth on top of the fermentation mixture.
 
Fermentation is a metabolic process that produces chemical changes in organic substrates through the action of enzymes. In biochemistry, it is narrowly defined as the extraction of energy from carbohydrates in the absence of oxygen. In the context of food production, it may more broadly refer to any process in which the activity of microorganisms brings about a desirable change to a foodstuff or beverage. The science of fermentation is known as zymology.

In microorganisms, fermentation is the primary means of producing ATP by the degradation of organic nutrients anaerobically. Humans have used fermentation to produce foodstuffs and beverages since the Neolithic age. For example, fermentation is used for preservation in a process that produces lactic acid found in such sour foods as pickled cucumbers, kimchi, and yogurt, as well as for producing alcoholic beverages such as wine and beer. Fermentation occurs within the gastrointestinal tracts of all animals, including humans.

Definitions

Below are some definitions of fermentation. They range from informal, general usages to more scientific definitions:
  • Preservation methods for food via microorganisms (general use);
  • Any process that produces alcoholic beverages or acidic dairy products (general use);
  • Any large-scale microbial process occurring with or without air (common definition used in industry);
  • Any energy-releasing metabolic process that takes place only under anaerobic conditions (becoming more scientific);
  • Any metabolic process that releases energy from a sugar or other organic molecule, does not require oxygen or an electron transport system, and uses an organic molecule as the final electron acceptor (most scientific).

Biological role

Along with photosynthesis and aerobic respiration, fermentation is a way of extracting energy from molecules, but it is the only one common to all bacteria and eukaryotes. It is therefore considered the oldest metabolic pathway, suited to an environment that did not yet have oxygen. Yeast, a form of fungus, occurs in almost any environment capable of supporting microbes, from the skins of fruits to the guts of insects and mammals to the deep ocean, and it harvests sugar-rich materials to produce ethanol and carbon dioxide.

The basic mechanism for fermentation remains present in all cells of higher organisms. Mammalian muscle carries out fermentation during periods of intense exercise, when the oxygen supply becomes limited, resulting in the creation of lactic acid. In invertebrates, fermentation also produces succinate and alanine.

Fermenting bacteria play an essential role in the production of methane in habitats ranging from the rumens of cattle to sewage digesters and freshwater sediments. They produce hydrogen, carbon dioxide, formate, acetate, and other carboxylic acids; consortia of microbes then convert the carbon dioxide and acetate to methane. Acetogenic bacteria oxidize the acids, obtaining more acetate and either hydrogen or formate. Finally, methanogens (which are in the domain Archaea) convert acetate to methane.

Biochemical overview

Comparison of aerobic respiration and the most common known fermentation types in a eukaryotic cell. Numbers in circles indicate the count of carbon atoms in each molecule: C6 is glucose (C6H12O6), C1 is carbon dioxide (CO2). The mitochondrial outer membrane is omitted.
 
Fermentation reacts NADH with an endogenous, organic electron acceptor. Usually this is pyruvate formed from sugar through glycolysis. The reaction produces NAD+ and an organic product, typical examples being ethanol, lactic acid, carbon dioxide, and hydrogen gas (H2). However, more exotic compounds can be produced by fermentation, such as butyric acid and acetone. Fermentation products contain chemical energy (they are not fully oxidized), but are considered waste products, since they cannot be metabolized further without the use of oxygen. 

Fermentation normally occurs in an anaerobic environment. In the presence of O2, NADH and pyruvate are used to generate ATP in respiration. This is called oxidative phosphorylation, and it generates much more ATP than glycolysis alone. For that reason, fermentation is rarely utilized when oxygen is available. However, even in the presence of abundant oxygen, some strains of yeast such as Saccharomyces cerevisiae prefer fermentation to aerobic respiration as long as there is an adequate supply of sugars (a phenomenon known as the Crabtree effect). Some fermentation processes involve obligate anaerobes, which cannot tolerate oxygen.

Although yeast carries out the fermentation in the production of ethanol in beers, wines, and other alcoholic drinks, this is not the only possible agent: bacteria carry out the fermentation in the production of xanthan gum.

Products

Ethanol

Overview of ethanol fermentation.

In ethanol fermentation, one glucose molecule is converted into two ethanol molecules and two carbon dioxide molecules. It is used to make bread dough rise: the carbon dioxide forms bubbles, expanding the dough into a foam. The ethanol is the intoxicating agent in alcoholic beverages such as wine, beer and liquor. Fermentation of feedstocks, including sugarcane, corn, and sugar beets, produces ethanol that is added to gasoline. In some species of fish, including goldfish and carp, it provides energy when oxygen is scarce (along with lactic acid fermentation).

The figure illustrates the process. Before fermentation, a glucose molecule breaks down into two pyruvate molecules. The energy from this exothermic reaction is used to bind inorganic phosphate to ADP, forming ATP, and to convert NAD+ to NADH. The pyruvates break down into two acetaldehyde molecules and give off two carbon dioxide molecules as a waste product. The acetaldehyde is reduced into ethanol using the energy and hydrogen from NADH, and the NADH is oxidized into NAD+ so that the cycle may repeat. The reaction is catalysed by the enzymes pyruvate decarboxylase and alcohol dehydrogenase.
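The overall stoichiometry (one glucose into two ethanol and two carbon dioxide molecules) fixes the theoretical yields, which a few lines of arithmetic confirm. Standard atomic weights are assumed:

```python
# Theoretical mass balance for ethanol fermentation:
#   C6H12O6 -> 2 C2H5OH + 2 CO2
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999}   # standard atomic weights

def molar_mass(counts):
    return sum(ATOMIC_MASS[el] * n for el, n in counts.items())

glucose = molar_mass({"C": 6, "H": 12, "O": 6})   # ~180.16 g/mol
ethanol = molar_mass({"C": 2, "H": 6, "O": 1})    # ~46.07 g/mol
co2     = molar_mass({"C": 1, "O": 2})            # ~44.01 g/mol

# Per 100 g of glucose fermented:
ethanol_g = 100 * 2 * ethanol / glucose   # ~51 g of ethanol
co2_g     = 100 * 2 * co2 / glucose       # ~49 g of CO2 (what raises the dough)
# The two products account for the full 100 g: mass is conserved.
```
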

Lactic acid

Homolactic fermentation (producing only lactic acid) is the simplest type of fermentation. The pyruvate from glycolysis undergoes a simple redox reaction, forming lactic acid. It is distinctive in being one of the only respiration processes that does not produce a gas as a byproduct. Overall, one molecule of glucose (or any six-carbon sugar) is converted to two molecules of lactic acid:
C6H12O6 → 2 CH3CHOHCOOH
It occurs in the muscles of animals when they need energy faster than the blood can supply oxygen. It also occurs in some kinds of bacteria (such as lactobacilli) and some fungi. Bacteria of this kind convert the lactose in yogurt into lactic acid, giving it its sour taste. These lactic acid bacteria can carry out either homolactic fermentation, where the end-product is mostly lactic acid, or heterolactic fermentation, where some lactate is further metabolized and results in ethanol and carbon dioxide (via the phosphoketolase pathway), acetate, or other metabolic products, e.g.:
C6H12O6 → CH3CHOHCOOH + C2H5OH + CO2
If lactose is fermented (as in yogurts and cheeses), it is first converted into glucose and galactose (both six-carbon sugars with the same molecular formula):
C12H22O11 + H2O → 2 C6H12O6
Heterolactic fermentation is in a sense intermediate between lactic acid fermentation and other types, e.g. alcoholic fermentation. The reasons to go further and convert lactic acid into anything else are:
  • The acidity of lactic acid impedes biological processes. This can be beneficial to the fermenting organism as it drives out competitors not adapted to the acidity. As a result, the food will have a longer shelf life (part of the reason foods are purposely fermented in the first place); however, beyond a certain point, the acidity starts affecting the organism that produces it.
  • The high concentration of lactic acid (the final product of fermentation) drives the equilibrium backwards (Le Chatelier's principle), decreasing the rate at which fermentation can occur and slowing down growth.
  • Ethanol, into which lactic acid can be easily converted, is volatile and will readily escape, allowing the reaction to proceed easily. CO2 is also produced, but it is only weakly acidic and even more volatile than ethanol.
  • Acetic acid (another conversion product) is acidic and not as volatile as ethanol; however, in the presence of limited oxygen, its creation from lactic acid releases additional energy. It is a lighter molecule than lactic acid and forms fewer hydrogen bonds with its surroundings (it has fewer groups that can form such bonds), so it is more volatile and also allows the reaction to move forward more quickly.
  • If propionic acid, butyric acid, and longer monocarboxylic acids are produced, the amount of acidity produced per glucose consumed will decrease, as with ethanol, allowing faster growth.
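The equations in this section are easy to sanity-check mechanically. A small helper (illustrative code, handling only flat formulas like C6H12O6 with each element listed once, not parenthesized ones) counts atoms on each side:

```python
import re
from collections import Counter

def parse(formula):
    """Atom counts for a flat formula such as 'C6H12O6' (no parentheses,
    each element appearing at most once)."""
    return Counter({el: int(n) if n else 1
                    for el, n in re.findall(r"([A-Z][a-z]?)(\d*)", formula)})

def balanced(reactants, products):
    total = lambda side: sum((parse(f) for f in side), Counter())
    return total(reactants) == total(products)

# Homolactic: glucose -> 2 lactic acid (C3H6O3)
assert balanced(["C6H12O6"], ["C3H6O3", "C3H6O3"])
# Heterolactic: glucose -> lactic acid + ethanol (C2H6O) + CO2
assert balanced(["C6H12O6"], ["C3H6O3", "C2H6O", "CO2"])
# Lactose hydrolysis: lactose + water -> glucose + galactose
assert balanced(["C12H22O11", "H2O"], ["C6H12O6", "C6H12O6"])
```

All three reactions check out: every carbon, hydrogen and oxygen atom on the left reappears on the right.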

Hydrogen gas

Hydrogen gas is produced in many types of fermentation (mixed acid fermentation, butyric acid fermentation, caproate fermentation, butanol fermentation, glyoxylate fermentation) as a way to regenerate NAD+ from NADH. Electrons are transferred to ferredoxin, which in turn is oxidized by hydrogenase, producing H2. Hydrogen gas is a substrate for methanogens and sulfate reducers, which keep the concentration of hydrogen low and favor the production of such an energy-rich compound, but hydrogen gas at a fairly high concentration can nevertheless be formed, as in flatus.

As an example of mixed acid fermentation, bacteria such as Clostridium pasteurianum ferment glucose, producing butyrate, acetate, carbon dioxide, and hydrogen gas. The reaction leading to acetate is:
C6H12O6 + 4 H2O → 2 CH3COO− + 2 HCO3− + 4 H+ + 4 H2
Glucose could theoretically be converted into just CO2 and H2, but the global reaction releases little energy.

Modes of operation

Most industrial fermentation uses batch or fed-batch procedures, although continuous fermentation can be more economical if various challenges, particularly the difficulty of maintaining sterility, can be met.

Batch

In a batch process, all the ingredients are combined and the reactions proceed without any further input. Batch fermentation has been used for millennia to make bread and alcoholic beverages, and it is still a common method, especially when the process is not well understood. However, it can be expensive because the fermentor must be sterilized using high-pressure steam between batches. Strictly speaking, small quantities of chemicals are often added during the run to control the pH or suppress foaming.

Batch fermentation goes through a series of phases. There is a lag phase in which cells adjust to their environment; then a phase in which exponential growth occurs. Once many of the nutrients have been consumed, the growth slows and becomes non-exponential, but production of secondary metabolites (including commercially important antibiotics and enzymes) accelerates. This continues through a stationary phase after most of the nutrients have been consumed, and then the cells die.
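The growth phases described above can be sketched with a toy model. This assumes simple Monod kinetics and invented parameter values (none of this is from the article, and the lag and death phases are not modeled); it shows exponential growth giving way to a stationary phase as the nutrient runs out:

```python
MU_MAX, KS, YIELD = 0.5, 0.2, 0.5   # 1/h, g/L, g biomass per g substrate
DT = 0.01                           # time step, hours

def simulate_batch(hours, x0=0.01, s0=10.0):
    """Euler integration of Monod growth in a closed (batch) fermentor."""
    x, s = x0, s0                   # biomass and substrate concentrations
    trajectory = []
    for _ in range(int(hours / DT)):
        mu = MU_MAX * s / (KS + s)  # growth slows as nutrients deplete
        x += mu * x * DT
        s = max(0.0, s - mu * x * DT / YIELD)
        trajectory.append((x, s))
    return trajectory

traj = simulate_batch(40)
final_x, final_s = traj[-1]   # substrate exhausted: the stationary phase
```

By the end of the run the substrate is gone and biomass has plateaued near x0 + YIELD × s0, which is the point at which secondary-metabolite production matters most industrially.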

Fed-batch

Fed-batch fermentation is a variation of batch fermentation where some of the ingredients are added during the fermentation. This allows greater control over the stages of the process. In particular, production of secondary metabolites can be increased by adding a limited quantity of nutrients during the non-exponential growth phase. Fed-batch operations are often sandwiched between batch operations.

Open

The high cost of sterilizing the fermentor between batches can be avoided using various open fermentation approaches that are able to resist contamination. One is to use a naturally evolved mixed culture. This is particularly favored in wastewater treatment, since mixed populations can adapt to a wide variety of wastes. Thermophilic bacteria can produce lactic acid at temperatures of around 50 °C, sufficient to discourage microbial contamination; and ethanol has been produced at a temperature of 70 °C. This is just below its boiling point (78 °C), making it easy to extract. Halophilic bacteria can produce bioplastics in hypersaline conditions. Solid-state fermentation adds a small amount of water to a solid substrate; it is widely used in the food industry to produce flavors, enzymes and organic acids.

Continuous

In continuous fermentation, substrates are added and final products removed continuously. There are three varieties: chemostats, which hold nutrient levels constant; turbidostats, which keep cell mass constant; and plug flow reactors, in which the culture medium flows steadily through a tube while the cells are recycled from the outlet to the inlet. If the process works well, there is a steady flow of feed and effluent, and the costs of repeatedly setting up a batch are avoided. Continuous operation can also prolong the exponential growth phase and, by continuously removing byproducts that inhibit the reactions, prevent their buildup. However, it is difficult to maintain a steady state and avoid contamination, and the design tends to be complex. Typically the fermentor must run for over 500 hours to be more economical than batch processing.
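The chemostat's steady state follows from requiring the dilution rate D to equal the cells' specific growth rate. A minimal sketch under assumed Monod kinetics; the parameter values (mu_max, Ks, yield Y, feed concentration S_in) are hypothetical:

```python
# Chemostat steady state under Monod kinetics: mu(S) = mu_max * S / (Ks + S).
# At steady state D == mu(S), which fixes the residual substrate level S*.
def chemostat_steady_state(D, mu_max=0.8, Ks=0.5, Y=0.4, S_in=20.0):
    """Return (residual substrate S*, biomass X*) in g/L for dilution rate D (1/h)."""
    if D >= mu_max:
        raise ValueError("washout: D must be below mu_max")
    S = Ks * D / (mu_max - D)   # substrate level at which mu(S) equals D
    X = Y * (S_in - S)          # biomass from yield on the consumed substrate
    return S, X

S, X = chemostat_steady_state(0.4)
print(f"S* = {S:.3f} g/L, X* = {X:.3f} g/L")  # S* = 0.500 g/L, X* = 7.800 g/L
```

The washout check mirrors the stability problem mentioned above: if the dilution rate exceeds the maximum growth rate, cells are flushed out faster than they can divide and no steady state exists.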

History of human use

The use of fermentation, particularly for beverages, has existed since the Neolithic and has been documented dating from 7000–6600 BCE in Jiahu, China; 5000 BCE in India (the Ayurveda mentions many medicated wines); 6000 BCE in Georgia; 3150 BCE in ancient Egypt; 3000 BCE in Babylon; 2000 BCE in pre-Hispanic Mexico; and 1500 BCE in Sudan. Fermented foods have a religious significance in Judaism and Christianity. The Baltic god Rugutis was worshiped as the agent of fermentation.

Louis Pasteur in his laboratory

In 1837, Charles Cagniard de la Tour, Theodor Schwann and Friedrich Traugott Kützing independently published papers concluding, as a result of microscopic investigations, that yeast is a living organism that reproduces by budding. Schwann boiled grape juice to kill the yeast and found that no fermentation would occur until new yeast was added. However, many chemists, including Antoine Lavoisier, continued to view fermentation as a simple chemical reaction and rejected the notion that living organisms could be involved. This was seen as a reversion to vitalism and was lampooned in an anonymous publication by Justus von Liebig and Friedrich Wöhler.

The turning point came when Louis Pasteur (1822–1895), during the 1850s and 1860s, repeated Schwann's experiments and showed in a series of investigations that fermentation is initiated by living organisms. In 1857, Pasteur showed that lactic acid fermentation is caused by living organisms. In 1860, he demonstrated that bacteria cause souring in milk, a process formerly thought to be merely a chemical change, and his work in identifying the role of microorganisms in food spoilage led to the process of pasteurization. In 1877, working to improve the French brewing industry, Pasteur published his famous paper on fermentation, "Études sur la Bière", which was translated into English in 1879 as "Studies on Fermentation". He defined fermentation (incorrectly) as "life without air", but correctly showed that specific types of microorganisms cause specific types of fermentations and specific end-products.

Although showing fermentation to be the result of the action of living microorganisms was a breakthrough, it did not explain the basic nature of the fermentation process, or prove that it is caused by the microorganisms that appear to be always present. Many scientists, including Pasteur, had unsuccessfully attempted to extract the fermentation enzyme from yeast. Success came in 1897 when the German chemist Eduard Buchner ground up yeast, extracted a juice from it, and found to his amazement that this "dead" liquid would ferment a sugar solution, forming carbon dioxide and alcohol much like living yeasts. Buchner's results are considered to mark the birth of biochemistry. The "unorganized ferments" behaved just like the organized ones. From that time on, the term enzyme came to be applied to all ferments. It was then understood that fermentation is caused by enzymes produced by microorganisms. In 1907, Buchner won the Nobel Prize in Chemistry for his work.

Advances in microbiology and fermentation technology have continued steadily up until the present. For example, in the 1930s, it was discovered that microorganisms could be mutated with physical and chemical treatments to be higher-yielding, faster-growing, tolerant of less oxygen, and able to use a more concentrated medium. Strain selection and hybridization developed as well, affecting most modern food fermentations.

Etymology

The word "ferment" is derived from the Latin verb fervere, which means to boil. It is thought to have been first used in the late 14th century in alchemy, but only in a broad sense. It was not used in the modern scientific sense until around 1600.

Microbial fuel cell

From Wikipedia, the free encyclopedia

A microbial fuel cell (MFC), or biological fuel cell, is a bio-electrochemical system that drives an electric current by using bacteria and mimicking bacterial interactions found in nature. MFCs can be grouped into two general categories: mediated and unmediated. The first MFCs, demonstrated in the early 20th century, used a mediator: a chemical that transfers electrons from the bacteria in the cell to the anode. Unmediated MFCs emerged in the 1970s; in this type of MFC the bacteria typically have electrochemically active redox proteins such as cytochromes on their outer membrane that can transfer electrons directly to the anode. In the 21st century MFCs started to find a commercial use in wastewater treatment.

History

The idea of using microbes to produce electricity was conceived in the early twentieth century. M. C. Potter initiated the subject in 1911. Potter managed to generate electricity from Saccharomyces cerevisiae, but the work received little coverage. In 1931, Barnett Cohen created microbial half fuel cells that, when connected in series, were capable of producing over 35 volts, though only at a current of 2 milliamps.

A study by DelDuca et al. used hydrogen produced by the fermentation of glucose by Clostridium butyricum as the reactant at the anode of a hydrogen and air fuel cell. Though the cell functioned, it was unreliable owing to the unstable nature of hydrogen production by the micro-organisms. This issue was resolved by Suzuki et al. in 1976, who produced a successful MFC design a year later.

In the late 1970s little was understood about how microbial fuel cells functioned. The idea was studied by Robin M. Allen and later by H. Peter Bennetto. People saw the fuel cell as a possible method for the generation of electricity for developing countries. Bennetto's work, starting in the early 1980s, helped build an understanding of how fuel cells operate and he was seen by many as the topic's foremost authority.

In May 2007, the University of Queensland, Australia completed a prototype MFC as a cooperative effort with Foster's Brewing. The prototype, a 10 L design, converted brewery wastewater into carbon dioxide, clean water and electricity. The group had plans to create a pilot-scale model for an upcoming international bio-energy conference.

Definition

A microbial fuel cell (MFC) is a device that converts chemical energy to electrical energy through the action of microorganisms. These electrochemical cells are constructed using a bioanode, a biocathode, or both. Most MFCs contain a membrane to separate the compartments of the anode (where oxidation takes place) and the cathode (where reduction takes place). The electrons produced during oxidation are transferred either directly to an electrode or to a redox mediator species. The electron flux is moved to the cathode. The charge balance of the system is maintained by ionic movement inside the cell, usually across an ionic membrane. Most MFCs use an organic electron donor that is oxidized to produce CO2, protons and electrons. Other electron donors have been reported, such as sulfur compounds or hydrogen. The cathode reaction uses a variety of electron acceptors; the reduction of oxygen is the most studied process, but others have been investigated, including metal recovery by reduction, reduction of water to hydrogen, nitrate reduction and sulfate reduction.

Applications

Power generation

MFCs are attractive for power generation applications that require only low power, but where replacing batteries may be impractical, such as wireless sensor networks.

Virtually any organic material could be used to feed the fuel cell, including coupling cells to wastewater treatment plants. Chemical process wastewater and synthetic wastewater have been used to produce bioelectricity in dual- and single-chamber mediatorless MFCs (uncoated graphite electrodes).

Higher power production was observed with a biofilm-covered graphite anode. Fuel cell emissions are well under regulatory limits. MFCs use energy more efficiently than standard internal combustion engines, which are limited by the Carnot cycle. In theory, an MFC is capable of energy efficiency far beyond 50%. Rozendal obtained energy conversion to hydrogen eight times that of conventional hydrogen production technologies.
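The Carnot comparison can be made concrete with a small sketch. A heat engine's efficiency is capped at 1 − Tc/Th, whereas an MFC converts chemical energy electrochemically and is not bound by that thermal limit; the temperatures below are illustrative, not from the text:

```python
# Carnot limit for a heat engine operating between two temperatures (kelvin).
# An MFC, being electrochemical rather than thermal, is not subject to this cap.
def carnot_efficiency(T_hot_K, T_cold_K):
    """Maximum fraction of heat convertible to work between T_hot and T_cold."""
    return 1.0 - T_cold_K / T_hot_K

# Illustrative engine operating between 600 K and ambient 300 K:
eta = carnot_efficiency(600.0, 300.0)
print(f"Carnot limit: {eta:.1%}")  # Carnot limit: 50.0%
```

Real combustion engines fall well below their Carnot ceiling, which is why the text's claim of MFC efficiencies "far beyond 50%" is notable.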

However, MFCs can also work at a smaller scale. In some cases, electrodes need be only 7 μm thick by 2 cm long, such that an MFC can replace a battery. It provides a renewable form of energy and does not need to be recharged.

MFCs operate well in mild conditions, 20 °C to 40 °C and at a pH of around 7. However, they lack the stability required for long-term medical applications such as in pacemakers.

Power stations can be based on aquatic plants such as algae. If sited adjacent to an existing power system, the MFC system can share its electricity lines.

Education

Soil-based microbial fuel cells serve as educational tools, as they encompass multiple scientific disciplines (microbiology, geochemistry, electrical engineering, etc.) and can be made using commonly available materials, such as soils and items from the refrigerator. Kits for home science projects and classrooms are available. One example of microbial fuel cells being used in the classroom is in the IBET (Integrated Biology, English, and Technology) curriculum for Thomas Jefferson High School for Science and Technology. Several educational videos and articles are also available on the International Society for Microbial Electrochemistry and Technology (ISMET Society).

Biosensor

The current generated from a microbial fuel cell is directly proportional to the energy content of wastewater used as the fuel. MFCs can measure the solute concentration of wastewater (i.e., as a biosensor).

Wastewater is commonly assessed for its biochemical oxygen demand (BOD). BOD values are determined by incubating samples for five days with a proper source of microbes, usually activated sludge collected from wastewater treatment plants.

An MFC-type BOD sensor can provide real-time BOD values. Oxygen and nitrate are preferred electron acceptors over the electrode, reducing current generation from an MFC. MFC BOD sensors underestimate BOD values in the presence of these electron acceptors. This can be avoided by inhibiting aerobic and nitrate respiration in the MFC using terminal oxidase inhibitors such as cyanide and azide. Such BOD sensors are commercially available.
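Since the generated current is proportional to the organic load, an MFC-based BOD sensor reduces to a calibration problem. A minimal sketch using hypothetical calibration points and a proportionality constant fitted through the origin:

```python
# Sketch of an MFC BOD sensor calibration. The text states current is
# proportional to the wastewater's energy content, so we fit BOD = k * current.
# Calibration points are hypothetical (current in mA, BOD in mg/L).
calibration = [(0.05, 20.0), (0.10, 40.0), (0.20, 80.0)]

# Least-squares slope through the origin.
k = sum(i * b for i, b in calibration) / sum(i * i for i, _ in calibration)

def bod_estimate(current_mA):
    """Real-time BOD estimate (mg/L) from the measured MFC current."""
    return k * current_mA

print(f"k = {k:.1f} mg/L per mA")                          # k = 400.0 mg/L per mA
print(f"BOD at 0.15 mA: {bod_estimate(0.15):.1f} mg/L")    # BOD at 0.15 mA: 60.0 mg/L
```

In practice, as the text notes, competing electron acceptors such as oxygen and nitrate divert current and make this linear estimate an underestimate unless their respiration pathways are inhibited.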

The United States Navy is considering microbial fuel cells for environmental sensors. The use of microbial fuel cells to power environmental sensors would be able to provide power for longer periods and enable the collection and retrieval of undersea data without a wired infrastructure. The energy created by these fuel cells is enough to sustain the sensors after an initial startup time. Due to undersea conditions (high salt concentrations, fluctuating temperatures and limited nutrient supply), the Navy may deploy MFCs with a mixture of salt-tolerant microorganisms. A mixture would allow for a more complete utilization of available nutrients. Shewanella oneidensis is their primary candidate, but may include other heat- and cold-tolerant Shewanella spp.

A first self-powered and autonomous BOD/COD biosensor has been developed that can detect organic contaminants in freshwater. The sensor relies only on power produced by MFCs and operates continuously without maintenance. The biosensor sounds an alarm to indicate the contamination level: an increased signal frequency indicates a higher contamination level, while a low frequency indicates a low one.

Biorecovery

In 2010, A. ter Heijne et al. constructed a device capable of producing electricity and reducing Cu(II) ions to copper metal.

Microbial electrolysis cells have been demonstrated to produce hydrogen.

Wastewater treatment

MFCs are used in water treatment to harvest energy via anaerobic digestion. The process can also reduce pathogens. However, it requires temperatures upwards of 30 °C and an extra step to convert biogas to electricity. Spiral spacers may be used to increase electricity generation by creating a helical flow in the MFC. Scaling up MFCs is a challenge because of the power output limitations of a larger surface area.

Types

Mediated

Most microbial cells are electrochemically inactive. Electron transfer from microbial cells to the electrode is facilitated by mediators such as thionine, methyl viologen, methyl blue, humic acid and neutral red. Most available mediators are expensive and toxic.

Mediator-free

A plant microbial fuel cell (PMFC)

Mediator-free microbial fuel cells use electrochemically active bacteria to transfer electrons to the electrode (electrons are carried directly from the bacterial respiratory enzyme to the electrode). Among the electrochemically active bacteria are Shewanella putrefaciens, Aeromonas hydrophila and others. Some bacteria are able to transfer their electron production via the pili on their external membrane. Mediator-free MFCs are less well characterized in aspects such as the strain of bacteria used in the system, the type of ion-exchange membrane, and the system conditions (temperature, pH, etc.).

Mediator-free microbial fuel cells can run on wastewater and derive energy directly from certain plants. This configuration is known as a plant microbial fuel cell. Possible plants include reed sweetgrass, cordgrass, rice, tomatoes, lupines and algae. Given that the power is derived from living plants (in situ-energy production), this variant can provide ecological advantages.

Microbial electrolysis

One variation of the mediator-less MFC is the microbial electrolysis cell (MEC). While MFCs produce electric current by the bacterial decomposition of organic compounds in water, MECs partially reverse the process to generate hydrogen or methane by applying a voltage to bacteria. This supplements the voltage generated by the microbial decomposition of organics, leading to the electrolysis of water or methane production. A complete reversal of the MFC principle is found in microbial electrosynthesis, in which carbon dioxide is reduced by bacteria using an external electric current to form multi-carbon organic compounds.

Soil-based

A soil-based MFC

Soil-based microbial fuel cells adhere to the basic MFC principles, whereby soil acts as the nutrient-rich anodic medium, the inoculum and the proton exchange membrane (PEM). The anode is placed at a particular depth within the soil, while the cathode rests on top of the soil and is exposed to air.

Soils naturally teem with diverse microbes, including the electrogenic bacteria needed for MFCs, and are full of complex sugars and other nutrients that have accumulated from the decay of plant and animal material. Moreover, the aerobic (oxygen-consuming) microbes present in the soil act as an oxygen filter, much like the expensive PEM materials used in laboratory MFC systems, causing the redox potential of the soil to decrease with greater depth. Soil-based MFCs are becoming popular educational tools for science classrooms.

Sediment microbial fuel cells (SMFCs) have been applied for wastewater treatment. Simple SMFCs can generate energy while decontaminating wastewater. Most such SMFCs contain plants to mimic constructed wetlands. By 2015, SMFC tests had reached more than 150 L.

In 2015, researchers announced an SMFC application that extracts energy and charges a battery. Salts dissociate into positively and negatively charged ions in water and move toward and adhere to the respective negative and positive electrodes, charging the battery and making it possible to remove the salt (microbial capacitive desalination). The microbes produce more energy than the desalination process requires.

Phototrophic biofilm

Phototrophic biofilm MFCs use a phototrophic biofilm anode containing photosynthetic microorganisms such as chlorophyta and cyanophyta. They carry out photosynthesis and thus produce organic metabolites and donate electrons.

One study found that PBMFCs display a power density sufficient for practical applications.

The sub-category of phototrophic MFCs that use purely oxygenic photosynthetic material at the anode are sometimes called biological photovoltaic systems.

Nanoporous membrane

The United States Naval Research Laboratory developed nanoporous membrane microbial fuel cells that use a non-PEM to generate passive diffusion within the cell. The membrane is a nanoporous polymer filter (nylon, cellulose, or polycarbonate). It offers power densities comparable to Nafion (a well-known PEM) with greater durability. Porous membranes allow passive diffusion, thereby reducing the power that must be supplied to the MFC to keep the PEM active and increasing the total energy output.

MFCs that do not use a membrane can deploy anaerobic bacteria in aerobic environments. However, membrane-less MFCs experience cathode contamination by the indigenous bacteria and the power-supplying microbe. The novel passive diffusion of nanoporous membranes can achieve the benefits of a membrane-less MFC without worry of cathode contamination.

Nanoporous membranes are also eleven times cheaper than Nafion (Nafion-117, $0.22/cm² vs. polycarbonate, <$0.02/cm²).

Ceramic membrane

PEM membranes can be replaced with ceramic materials. Ceramic membrane costs can be as low as $5.66/m². The macroporous structure of ceramic membranes allows good transport of ionic species.

The materials that have been successfully employed in ceramic MFCs are earthenware, alumina, mullite, pyrophyllite and terracotta.

Generation process

When microorganisms consume a substance such as sugar in aerobic conditions, they produce carbon dioxide and water. However, when oxygen is not present, they produce carbon dioxide, protons/hydrogen ions and electrons, as described below:
C12H22O11 + 13 H2O → 12 CO2 + 48 H+ + 48 e−
(Eq. 1)
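Eq. 1 fixes the theoretical charge an MFC can extract from sucrose: 48 moles of electrons per mole of sucrose, converted to coulombs via the Faraday constant:

```python
# Theoretical charge available from Eq. 1: 48 electrons per sucrose molecule.
F = 96485.0               # Faraday constant, coulombs per mole of electrons
electrons_per_sucrose = 48
M_sucrose = 342.3         # molar mass of C12H22O11, g/mol

charge_per_gram = electrons_per_sucrose * F / M_sucrose  # C per gram of sucrose
print(f"{charge_per_gram:.0f} C/g of sucrose (theoretical maximum)")  # 13530 C/g
print(f"= {charge_per_gram / 3600:.2f} Ah/g")                         # = 3.76 Ah/g
```

This is an upper bound: real cells lose charge to incomplete oxidation, biomass growth, and competing electron acceptors such as dissolved oxygen.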
Microbial fuel cells use inorganic mediators to tap into the electron transport chain of cells and channel electrons produced. The mediator crosses the outer cell lipid membranes and bacterial outer membrane; then, it begins to liberate electrons from the electron transport chain that normally would be taken up by oxygen or other intermediates. 

The now-reduced mediator exits the cell laden with electrons, which it transfers to an electrode; this electrode becomes the anode. The release of the electrons recycles the mediator to its original oxidised state, ready to repeat the process. This can happen only under anaerobic conditions; if oxygen is present, it will collect the electrons, as it has greater electronegativity.

In MFC operation, the anode is the terminal electron acceptor recognized by bacteria in the anodic chamber. Therefore, the microbial activity is strongly dependent on the anode's redox potential. A Michaelis-Menten curve was obtained between the anodic potential and the power output of an acetate-driven MFC. A critical anodic potential seems to provide maximum power output.
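The Michaelis-Menten-type dependence of power on anodic potential described above can be sketched as a saturating curve; P_max and K here are illustrative constants, not measured values:

```python
# Sketch of a Michaelis-Menten-type relation between anodic potential and
# power output, as described for an acetate-driven MFC. P_max is the
# saturation power and K the potential giving half-maximal power (illustrative).
def power_output(E_anode_mV, P_max=50.0, K=100.0):
    """Power (mW/m^2) as a saturating function of anodic potential (mV)."""
    return P_max * E_anode_mV / (K + E_anode_mV)

for E in [25, 50, 100, 200, 400]:
    print(f"E = {E:3d} mV -> P = {power_output(E):5.1f} mW/m^2")
```

The half-saturation point (E = K) is one way to read the "critical anodic potential" the text mentions: below it, power rises steeply with potential; above it, gains diminish.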

Potential mediators include neutral red, methylene blue, thionine and resorufin.

Organisms capable of producing an electric current are termed exoelectrogens. In order to turn this current into usable electricity, exoelectrogens have to be accommodated in a fuel cell. 

The mediator and a micro-organism such as yeast, are mixed together in a solution to which is added a substrate such as glucose. This mixture is placed in a sealed chamber to stop oxygen entering, thus forcing the micro-organism to undertake anaerobic respiration. An electrode is placed in the solution to act as the anode.

In the second chamber of the MFC is another solution and the positively charged cathode. It is the equivalent of the oxygen sink at the end of the electron transport chain, external to the biological cell. The solution is an oxidizing agent that picks up the electrons at the cathode. As with the electron chain in the yeast cell, this could be a variety of molecules such as oxygen, although a more convenient option is a solid oxidizing agent, which requires less volume. 

Connecting the two electrodes is a wire (or other electrically conductive path). Completing the circuit and connecting the two chambers is a salt bridge or ion-exchange membrane. This last feature allows the protons produced, as described in Eq. 1, to pass from the anode chamber to the cathode chamber.

The reduced mediator carries electrons from the cell to the electrode, where it is oxidized as it deposits the electrons. These then flow across the wire to the second electrode, which acts as an electron sink, and from there they pass to an oxidizing material. Meanwhile, the hydrogen ions/protons move from the anode to the cathode via a proton exchange membrane such as Nafion. They move down the concentration gradient and combine with oxygen, which requires an electron; this electron flow forms the current, and the consumption of hydrogen ions sustains the concentration gradient.

Algal biomass has been observed to give high energy yields when used as a substrate in microbial fuel cells.
