Research concerning the relationship between the thermodynamic quantity entropy and the evolution of life began around the turn of the 20th century. In 1910, American historian Henry Adams printed and distributed to university libraries and history professors the small volume A Letter to American Teachers of History, proposing a theory of history based on the second law of thermodynamics and on the principle of entropy.[1][2] The 1944 book What is Life? by Nobel-laureate physicist Erwin Schrödinger stimulated research in the field. In his book, Schrödinger originally stated that life feeds on negative entropy, or negentropy as it is sometimes called, but in a later edition corrected himself in response to complaints and stated that the true source is free energy. More recent work has restricted the discussion to Gibbs free energy, because biological processes on Earth normally occur at roughly constant temperature and pressure, such as in the atmosphere or at the bottom of the ocean, though an individual organism does not span both environments over short periods of time.
Origin
In 1863, Rudolf Clausius published his noted memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", wherein he outlined a preliminary relationship, based on his own work and that of William Thomson (Lord Kelvin), between his newly developed concept of entropy and life.[citation needed] Building on this, one of the first to speculate on a possible thermodynamic perspective of evolution was the Austrian physicist Ludwig Boltzmann. In 1875, drawing on the works of Clausius and Kelvin, Boltzmann reasoned:

The general struggle for existence of animate beings is not a struggle for raw materials – these, for organisms, are air, water and soil, all abundantly available – nor for energy which exists in plenty in any body in the form of heat, but a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth.[3]
Early views
In 1876, American civil engineer Richard Sears McCulloh, in his Treatise on the Mechanical Theory of Heat and its Application to the Steam-Engine, an early thermodynamics textbook, states, after speaking about the laws of the physical world, that "there are none that are established on a firmer basis than the two general propositions of Joule and Carnot; which constitute the fundamental laws of our subject." McCulloh then goes on to show that these two laws may be combined in a single expression as follows:

\[ S = \int \frac{\delta Q}{T} \]

where
- S – entropy
- δQ – a differential amount of heat passed into a thermodynamic system
- T – absolute temperature
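For illustration (the numbers here are chosen arbitrarily and do not appear in McCulloh's treatise), transferring 100 J of heat into a body held at a constant 310 K, roughly mammalian body temperature, changes its entropy by

\[ \Delta S = \frac{Q}{T} = \frac{100\ \text{J}}{310\ \text{K}} \approx 0.32\ \text{J/K}. \]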
When we reflect how generally physical phenomena are connected with thermal changes and relations, it at once becomes obvious that there are few, if any, branches of natural science which are not more or less dependent upon the great truths under consideration. Nor should it, therefore, be a matter of surprise that already, in the short space of time, not yet one generation, elapsed since the mechanical theory of heat has been freely adopted, whole branches of physical science have been revolutionized by it.[4]:p. 267

McCulloh then gives a few of what he calls the “more interesting examples” of the application of these laws in extent and utility. The first example he gives is physiology, wherein he states that “the body of an animal, not less than a steamer, or a locomotive, is truly a heat engine, and the consumption of food in the one is precisely analogous to the burning of fuel in the other; in both, the chemical process is the same: that called combustion.” He then incorporates a discussion of Lavoisier’s theory of respiration, with its cycles of digestion, excretion, and perspiration, but contradicts Lavoisier with more recent findings, such as internal heat generated by friction, according to the new theory of heat, which, in McCulloh's words, holds that the “heat of the body generally and uniformly is diffused instead of being concentrated in the chest”. McCulloh then gives an example of the second law, stating that friction, especially in the smaller blood-vessels, must develop heat, and that, without doubt, animal heat is thus in part produced. He then asks: “but whence the expenditure of energy causing that friction, and which must be itself accounted for?"
To answer this question he turns to the mechanical theory of heat and goes on to loosely outline how the heart is what he calls a “force-pump”, which receives blood and sends it to every part of the body, as discovered by William Harvey, and which “acts like the piston of an engine and is dependent upon and consequently due to the cycle of nutrition and excretion which sustains physical or organic life.” It is likely that McCulloh was modeling parts of this argument on the famous Carnot cycle. In conclusion, he summarizes his first- and second-law argument as follows:
Everything physical being subject to the law of conservation of energy, it follows that no physiological action can take place except with expenditure of energy derived from food; also, that an animal performing mechanical work must from the same quantity of food generate less heat than one abstaining from exertion, the difference being precisely the heat equivalent of that of work.[4]:p. 270
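McCulloh's conclusion amounts to first-law energy bookkeeping. As a hypothetical illustration (the figures are not from the treatise, and all food energy is idealized as ending up either as work or as heat): if a day's food supplies 1000 kJ and the animal performs 200 kJ of mechanical work, the heat it releases is

\[ Q_{\text{working}} = E_{\text{food}} - W = 1000\ \text{kJ} - 200\ \text{kJ} = 800\ \text{kJ}, \]

whereas a resting animal on the same diet would release the full 1000 kJ as heat; the 200 kJ difference is the heat equivalent of the work performed.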
Negative entropy
Later, building on this premise, in the famous 1944 book What is Life?, Nobel-laureate physicist Erwin Schrödinger theorizes that life, contrary to the general tendency dictated by the second law of thermodynamics, decreases or maintains its entropy by feeding on negative entropy.[5] In his note to Chapter 6 of What is Life?, however, Schrödinger remarks on his usage of the term negative entropy:

Let me say first, that if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.

This is what is argued to differentiate life from other forms of matter organization. Although life's dynamics may be argued to go against the tendency of the second law, which states that the entropy of an isolated system tends to increase, life does not in any way conflict with or invalidate this law, because the principle that entropy can only increase or remain constant applies only to a closed system which is adiabatically isolated, meaning no heat can enter or leave. Whenever a system can exchange either heat or matter with its environment, an entropy decrease of that system is entirely compatible with the second law.[6] The problem of organization in living systems increasing despite the second law is known as the Schrödinger paradox.[7]
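In standard textbook terms (the notation here is not Schrödinger's), the bookkeeping behind this statement is

\[ \Delta S_{\text{total}} = \Delta S_{\text{organism}} + \Delta S_{\text{surroundings}} \geq 0, \]

so the organism's entropy may decrease (ΔS_organism < 0) provided it exports at least as much entropy to its surroundings, for example as heat and low free-energy waste products, so that ΔS_surroundings ≥ −ΔS_organism.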
In 1964, James Lovelock was among a group of scientists who were requested by NASA to make a theoretical life detection system to look for life on Mars during the upcoming space mission. When thinking about this problem, Lovelock wondered “how can we be sure that Martian life, if any, will reveal itself to tests based on Earth’s lifestyle?”[8] To Lovelock, the basic question was “What is life, and how should it be recognized?” When speaking about this issue with some of his colleagues at the Jet Propulsion Laboratory, he was asked what he would do to look for life on Mars. To this, Lovelock replied "I’d look for an entropy reduction, since this must be a general characteristic of life."[8]
Gibbs free energy and biological evolution
In recent years, the thermodynamic interpretation of evolution in relation to entropy has begun to utilize the concept of the Gibbs free energy, rather than entropy.[9] This is because biological processes on Earth take place at roughly constant temperature and pressure, a situation in which the Gibbs free energy is an especially useful way to express the second law of thermodynamics. The Gibbs free energy is given by

\[ \Delta G \equiv \Delta H - T\,\Delta S, \]

where ΔG is the change in Gibbs free energy, ΔH the change in enthalpy, T the absolute temperature, and ΔS the change in entropy.

In his 2003 book Information Theory and Evolution, the chemist John Avery presents the phenomenon of life, including its origin and evolution, as well as human cultural evolution, against the background of thermodynamics, statistical mechanics, and information theory. The (apparent) paradox between the second law of thermodynamics and the high degree of order and complexity produced by living systems, according to Avery, has its resolution "in the information content of the Gibbs free energy that enters the biosphere from outside sources."[11] The process of natural selection responsible for such local increases in order may be mathematically derived directly from the expression of the second law equation for connected non-equilibrium open systems.[12]
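Why ΔG expresses the second law at constant temperature and pressure can be made explicit with a standard textbook derivation (not specific to the cited sources): the heat released by the system, −ΔH, raises the entropy of the surroundings by −ΔH/T, so

\[ \Delta S_{\text{total}} = \Delta S_{\text{system}} - \frac{\Delta H}{T} = -\frac{\Delta G}{T}, \]

and the requirement ΔS_total ≥ 0 becomes ΔG ≤ 0. A local ordering step with ΔS_system < 0 can therefore still proceed if it is sufficiently exothermic or is coupled to another source of free energy.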
Entropy and the origin of life
The second law of thermodynamics applied to the origin of life is a far more complicated issue than the further development of life, since there is no "standard model" of how the first biological lifeforms emerged; only a number of competing hypotheses. The problem is discussed within the area of abiogenesis, implying gradual pre-Darwinian chemical evolution. In 1924, Alexander Oparin suggested that sufficient energy was provided in a primordial soup. The Belgian scientist Ilya Prigogine was awarded a Nobel Prize in 1977 for an analysis in this area. A related topic is the probability that life would emerge, which has been discussed in several studies, for example by Russell Doolittle.[13]
Entropy and the search for life elsewhere in the Universe
In 2013, Azua-Bustos and Vega argued that, regardless of the type of lifeform that could be envisioned both on Earth and elsewhere in the Universe, all should share the attribute of being entities that decrease their internal entropy at the expense of free energy obtained from their surroundings. As entropy allows the quantification of the degree of disorder in a system, any envisioned lifeform must have a higher degree of order than its supporting environment. These authors showed that, by using fractal analysis alone, they could readily quantify the degree of structural complexity difference (and thus entropy) of living processes as distinct entities separate from their similar abiotic surroundings. This approach may allow the future detection of unknown forms of life both in the Solar System and on recently discovered exoplanets based on nothing more than entropy differentials of complementary datasets (morphology, coloration, temperature, pH, isotopic composition, etc.), an approach the authors described as detecting ‘life as we don't know it’ by fractal analysis.
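A minimal sketch of the entropy-differential idea, in Python, using two hypothetical one-dimensional datasets (for instance, pixel intensities sampled from a candidate structure and from its abiotic surroundings); it uses a simple histogram-based Shannon entropy rather than the fractal measures actually employed by Azua-Bustos and Vega, and all names and values below are illustrative:

```python
import numpy as np

def shannon_entropy(values, bins=32):
    """Histogram-based Shannon entropy (in bits) of a 1-D sample."""
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# Hypothetical data standing in for measurements of a candidate "biotic"
# patch and of its abiotic surroundings (values are synthetic).
rng = np.random.default_rng(0)
biotic_patch = rng.normal(loc=120, scale=5, size=10_000)        # narrow, structured
abiotic_background = rng.uniform(low=0, high=255, size=10_000)  # broad, disordered

delta = shannon_entropy(abiotic_background) - shannon_entropy(biotic_patch)
print(f"Entropy differential: {delta:.2f} bits")  # positive: patch is more ordered
```

A positive differential flags the patch as more ordered than its background, in line with the authors' criterion; an actual analysis would use richer datasets and fractal complexity measures.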
Other terms
For nearly a century and a half, beginning with Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy, or negentropy, was asserted by physicist Erwin Schrödinger in his 1944 book What is Life?. He posed the question "How does the living organism avoid decay?" and gave the obvious answer: "By eating, drinking, breathing and (in the case of plants) assimilating." Recent writings have used the concept of Gibbs free energy to elaborate on this issue.[14] While energy from nutrients is necessary to sustain an organism's order, Schrödinger also presciently observed: "An organism's astonishing gift of concentrating a stream of order on itself and thus escaping the decay into atomic chaos – of drinking orderliness from a suitable environment – seems to be connected with the presence of the aperiodic solids..." We now know that the 'aperiodic' crystal is DNA and that its irregular arrangement is a form of information. "The DNA in the cell nucleus contains the master copy of the software, in duplicate." This software seems to exert its control by "specifying an algorithm, or set of instructions, for creating and maintaining the entire organism containing the cell."[15]

DNA and other macromolecules determine an organism's life cycle: birth, growth, maturity, decline, and death. Nutrition is necessary but not sufficient to account for growth in size, as genetics is the governing factor. At some point, organisms normally decline and die even while remaining in environments that contain sufficient nutrients to sustain life. The controlling factor must therefore be internal, and not nutrients or sunlight acting as causal exogenous variables. Organisms inherit the ability to create unique and complex biological structures; it is unlikely that those capabilities would be reinvented or taught anew each generation, so DNA must be operative as the prime cause in this characteristic as well. Applying Boltzmann's perspective of the second law, the change of state from a more probable, less ordered, high-entropy arrangement to one that is less probable, more ordered, and lower in entropy, as seen in biological ordering, calls for a function like that known for DNA. DNA's apparent information-processing function provides a resolution of the paradox posed by life and the entropy requirement of the second law.[16]

In 1982, American biochemist Albert Lehninger argued that the "order" produced within cells as they grow and divide is more than compensated for by the "disorder" they create in their surroundings in the course of growth and division. "Living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy."[17]
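The Boltzmann relation invoked above can be stated explicitly (standard notation, not drawn from the cited references):

\[ S = k_B \ln W, \]

where W is the number of microscopic arrangements compatible with the macroscopic state; a transition to a less probable arrangement (smaller W) lowers the local entropy and, by the second law, must be compensated by a larger entropy increase elsewhere.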
Evolution-related concepts:
- Negentropy – a shorthand colloquial phrase for negative entropy.[18]
- Ectropy – a measure of the tendency of a dynamical system to do useful work and grow more organized.[19]
- Extropy – a metaphorical term defining the extent of a living or organizational system's intelligence, functional order, vitality, energy, life, experience, and capacity and drive for improvement and growth.
- Ecological entropy – a measure of biodiversity in the study of biological ecology.
Objections
Because entropy is defined for equilibrium systems,[21] objections have been raised to the extension of the second law and of entropy to biological systems, especially as it pertains to their use to support or discredit the theory of evolution.[22] Living systems, and indeed many of the systems and processes in the universe, operate far from equilibrium, whereas the second law succinctly states that isolated systems evolve toward thermodynamic equilibrium, the state of maximum entropy.

However, entropy is well defined much more broadly, based on the probabilities of a system's states, whether or not the system is a dynamical one (for which equilibrium could be relevant). Even in those physical systems where equilibrium could be relevant, (1) living systems cannot persist in isolation, and (2) the second principle of thermodynamics does not require that free energy be transformed into entropy along the shortest path: living organisms absorb energy from sunlight or from energy-rich chemical compounds and finally return part of such energy to the environment as entropy (heat and low free-energy compounds such as water and CO2).
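The broader, probability-based definition referred to here is the Gibbs entropy (standard notation, independent of the cited sources):

\[ S = -k_B \sum_i p_i \ln p_i, \]

where p_i is the probability of microstate i; it reduces to the Boltzmann form S = k_B ln W when all W accessible microstates are equally probable, and it does not require the system to be at equilibrium.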