In classical thermodynamics, entropy (from Greek τροπή (tropḗ) 'transformation') is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to describe the relationship between the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy. The definition of entropy is central to the establishment of the second law of thermodynamics, which states that the entropy of isolated systems cannot decrease with time, as they always tend to arrive at a state of thermodynamic equilibrium, where the entropy is highest. Entropy is therefore also considered to be a measure of disorder in the system.
Ludwig Boltzmann explained the entropy as a measure of the number of possible microscopic configurations Ω
of the individual atoms and molecules of the system (microstates) which
correspond to the macroscopic state (macrostate) of the system. He
showed that the thermodynamic entropy is k ln Ω, where the factor k has since been known as the Boltzmann constant.
Concept
Differences in pressure, density, and temperature of a thermodynamic system
tend to equalize over time. For example, in a room containing a glass
of melting ice, the difference in temperature between the warm room and
the cold glass of ice and water is equalized by energy flowing as heat
from the room to the cooler ice and water mixture. Over time, the
temperature of the glass and its contents and the temperature of the
room achieve a balance. The entropy of the room has decreased. However,
the entropy of the glass of ice and water has increased more than the
entropy of the room has decreased. In an isolated system,
such as the room and ice water taken together, the dispersal of energy
from warmer to cooler regions always results in a net increase in
entropy. Thus, when the system of the room and ice water has
reached thermal equilibrium, the entropy change from the initial state
is at its maximum. The entropy of the thermodynamic system is a measure of the progress of the equalization.
Many irreversible processes result in an increase of entropy. One
of them is mixing of two or more different substances, occasioned by
bringing them together by removing a wall that separates them, keeping
the temperature and pressure constant. The mixing is accompanied by an increase in entropy, known as the entropy of mixing.
In the important case of mixing of ideal gases, the combined system
does not change its internal energy by work or heat transfer; the
entropy increase is then entirely due to the spreading of the different
substances into their new common volume.
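As a sketch of the ideal-gas case, the following code (Python, with illustrative function names and amounts not taken from the text) evaluates the standard entropy-of-mixing expression ΔSmix = −R Σ ni ln(ni/ntotal) for gases brought together at constant temperature and pressure:

```python
from math import log

R = 8.314  # molar gas constant, J/(mol·K)

def entropy_of_mixing(moles):
    """Entropy increase when ideal gases at the same T and P mix after the
    separating wall is removed: dS = -R * sum(n_i * ln(n_i / n_total))."""
    n_total = sum(moles)
    return -R * sum(n * log(n / n_total) for n in moles if n > 0)

# One mole of each of two different ideal gases: dS = 2*R*ln(2) ≈ 11.5 J/K
print(entropy_of_mixing([1.0, 1.0]))
```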
From a macroscopic perspective, in classical thermodynamics, the entropy is a state function of a thermodynamic system:
that is, a property depending only on the current state of the system,
independent of how that state came to be achieved. Entropy is a key
ingredient of the second law of thermodynamics, which has important consequences, e.g., for the performance of heat engines, refrigerators, and heat pumps.
Definition
According to the Clausius equality, for a closed homogeneous system, in which only reversible processes take place,
$$\oint \frac{\delta Q}{T} = 0,$$
with T being the uniform temperature of the closed system and δQ the incremental reversible transfer of heat energy into that system.
That means the line integral $\int_L \frac{\delta Q}{T}$ is path-independent.
A state function S, called entropy, may be defined which satisfies
$$dS = \frac{\delta Q}{T}.$$
Entropy measurement
The thermodynamic state of a uniform closed system is determined by its temperature T and pressure P. A change in entropy can be written as
$$dS = \left(\frac{\partial S}{\partial T}\right)_P dT + \left(\frac{\partial S}{\partial P}\right)_T dP.$$
The first contribution depends on the heat capacity at constant pressure CP through
$$\left(\frac{\partial S}{\partial T}\right)_P = \frac{C_P}{T}.$$
This is the result of the definition of the heat capacity by δQ = CP dT and T dS = δQ. The second term may be rewritten with one of the Maxwell relations
$$\left(\frac{\partial S}{\partial P}\right)_T = -\left(\frac{\partial V}{\partial T}\right)_P$$
and the definition of the volumetric thermal-expansion coefficient
$$\alpha_V = \frac{1}{V}\left(\frac{\partial V}{\partial T}\right)_P,$$
so that
$$dS = \frac{C_P}{T}\,dT - \alpha_V V\,dP.$$
With this expression the entropy S at arbitrary P and T can be related to the entropy S0 at some reference state at P0 and T0 according to
$$S(P,T) = S(P_0,T_0) + \int_{T_0}^{T} \frac{C_P}{T'}\,dT' - \int_{P_0}^{P} \alpha_V V\,dP'.$$
In classical thermodynamics, the entropy of the reference state can
be put equal to zero at any convenient temperature and pressure. For
example, for pure substances, one can take the entropy of the solid at
the melting point at 1 bar equal to zero. From a more fundamental point
of view, the third law of thermodynamics suggests that there is a preference to take S = 0 at T = 0 (absolute zero) for perfectly ordered materials such as crystals.
S(P, T) is determined by following a specific path in the P-T diagram: in the first integral one integrates over T at constant pressure P0, so that dP = 0, and in the second integral one integrates over P at constant temperature T, so that dT = 0. As the entropy is a function of state, the result is independent of the path.
The above relation shows that the determination of the entropy
requires knowledge of the heat capacity and the equation of state (which
is the relation between P,V, and T of the
substance involved). Normally these are complicated functions and
numerical integration is needed. In simple cases it is possible to get
analytical expressions for the entropy. In the case of an ideal gas, the heat capacity is constant and the ideal gas law PV = nRT gives that αVV = V/T = nR/P, with n the number of moles and R the molar ideal-gas constant. So, the molar entropy of an ideal gas is given by
$$S_m(P,T) = S_m(P_0,T_0) + C_P \ln\frac{T}{T_0} - R \ln\frac{P}{P_0}.$$
In this expression CP now is the molar heat capacity.
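As a sketch under the assumptions just stated (constant molar CP, ideal-gas behaviour), the following code evaluates this molar-entropy expression; the reference state and the choice CP = 7/2 R are illustrative only, not values from the text:

```python
from math import log

R = 8.314   # molar ideal-gas constant, J/(mol·K)

def ideal_gas_molar_entropy(T, P, T0, P0, S0, Cp):
    """Molar entropy of an ideal gas relative to a reference state (T0, P0):
    S(T, P) = S0 + Cp*ln(T/T0) - R*ln(P/P0), with constant molar Cp."""
    return S0 + Cp * log(T / T0) - R * log(P / P0)

# Illustrative numbers only: Cp = 7/2 R and S0 = 0 at 300 K and 1 bar.
Cp = 3.5 * R
print(ideal_gas_molar_entropy(T=600.0, P=2.0e5, T0=300.0, P0=1.0e5, S0=0.0, Cp=Cp))
```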
The entropy of inhomogeneous systems is the sum of the entropies
of the various subsystems. The laws of thermodynamics hold rigorously
for inhomogeneous systems even though they may be far from internal
equilibrium. The only condition is that the thermodynamic parameters of
the composing subsystems are (reasonably) well-defined.
Temperature-entropy diagrams
Entropy values of important substances may be obtained from reference
works or with commercial software in tabular form or as diagrams. One
of the most common diagrams is the temperature-entropy diagram
(TS-diagram). For example, Fig.2 shows the TS-diagram of nitrogen, depicting the melting curve and saturated liquid and vapor values with isobars and isenthalps.
We now consider inhomogeneous systems in which internal transformations (processes) can take place. If we calculate the entropy S1 before and S2 after such an internal process, the Second Law of Thermodynamics demands that S2 ≥ S1, where the equality sign holds if the process is reversible. The difference Si = S2 − S1
is the entropy production due to the irreversible process. The Second
law demands that the entropy of an isolated system cannot decrease.
Suppose a system is thermally and mechanically isolated from the
environment (isolated system). For example, consider an insulating rigid
box divided by a movable partition into two volumes, each filled with
gas. If the pressure of one gas is higher, it will expand by moving the
partition, thus performing work on the other gas. Also, if the gases are
at different temperatures, heat can flow from one gas to the other
provided the partition allows heat conduction. Our above result
indicates that the entropy of the system as a whole will increase
during these processes. There exists a maximum amount of entropy the
system may possess under the circumstances. This entropy corresponds to a
state of stable equilibrium, since a transformation to any other
equilibrium state would cause the entropy to decrease, which is
forbidden. Once the system reaches this maximum-entropy state, no part
of the system can perform work on any other part. It is in this sense
that entropy is a measure of the energy in a system that cannot be used
to do work.
An irreversible process degrades the performance of a thermodynamic system, designed to do work or produce cooling, and results in entropy production. The entropy generation during a reversible process
is zero. Thus entropy production is a measure of the irreversibility
and may be used to compare engineering processes and machines.
Thermal machines
Clausius' identification of S as a significant quantity was motivated by the study of reversible and irreversible thermodynamic transformations. A heat engine
is a thermodynamic system that can undergo a sequence of
transformations which ultimately return it to its original state. Such a
sequence is called a cyclic process, or simply a cycle. During some transformations, the engine may exchange energy with its environment. The net result of a cycle is
heat transferred from one part of the environment to another. In the steady state, by the conservation of energy, the net energy lost by the environment is equal to the work done by the engine.
If every transformation in the cycle is reversible, the cycle is
reversible, and it can be run in reverse, so that the heat transfers
occur in the opposite directions and the amount of work done switches
sign.
Heat engines
Consider a heat engine working between two temperatures TH and Ta. With Ta
we have ambient temperature in mind, but, in principle it may also be
some other low temperature. The heat engine is in thermal contact with
two heat reservoirs which are supposed to have a very large heat
capacity so that their temperatures do not change significantly if heat QH is removed from the hot reservoir and Qa is added to the lower reservoir. Under normal operation TH > Ta and QH, Qa, and W are all positive.
As our thermodynamical system we take a big system which includes
the engine and the two reservoirs. It is indicated in Fig.3 by the
dotted rectangle. It is inhomogeneous, closed (no exchange of matter
with its surroundings), and adiabatic (no exchange of heat with its surroundings). It is not isolated since per cycle a certain amount of work W is produced by the system, given by the first law of thermodynamics
$$W = Q_H - Q_a.$$
We used the fact that the engine itself is periodic, so its internal
energy has not changed after one cycle. The same is true for its
entropy, so the entropy increase S2 − S1
of our system after one cycle is given by the reduction of entropy of
the hot source and the increase of the cold sink. The entropy increase
of the total system S2 − S1 is equal to the entropy production Si due to irreversible processes in the engine, so
$$S_i = \frac{Q_a}{T_a} - \frac{Q_H}{T_H}.$$
The Second law demands that Si ≥ 0. Eliminating Qa from the two relations gives
$$W = \left(1 - \frac{T_a}{T_H}\right) Q_H - T_a S_i.$$
The first term is the maximum possible work for a heat engine, given by a reversible engine, such as one operating along a Carnot cycle:
$$W_\text{max} = \left(1 - \frac{T_a}{T_H}\right) Q_H.$$
Finally
$$W = W_\text{max} - T_a S_i.$$
This equation tells us that the production of work is reduced by the generation of entropy. The term TaSi gives the lost work, or dissipated energy, by the machine.
Correspondingly, the amount of heat discarded to the cold sink is increased by the entropy generation:
$$Q_a = \frac{T_a}{T_H} Q_H + T_a S_i.$$
These important relations can also be obtained without the inclusion of the heat reservoirs. See the article on entropy production.
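The relations above can be illustrated numerically; the reservoir temperatures, heat input, and entropy production in this sketch are assumed values chosen only for illustration:

```python
def heat_engine(Q_H, T_H, T_a, S_i):
    """Work and rejected heat per cycle for an engine drawing Q_H from a hot
    reservoir at T_H, rejecting heat at ambient T_a, with entropy production S_i:
    W = (1 - T_a/T_H) * Q_H - T_a * S_i  (Carnot work minus the lost work)."""
    W_max = (1.0 - T_a / T_H) * Q_H
    W = W_max - T_a * S_i
    Q_a = Q_H - W                 # first law: heat rejected to the cold sink
    return W, W_max, Q_a

# Assumed numbers: 1000 J from a 600 K source, 300 K ambient, S_i = 0.5 J/K.
W, W_max, Q_a = heat_engine(Q_H=1000.0, T_H=600.0, T_a=300.0, S_i=0.5)
print(W, W_max, Q_a)   # 350 J of work, 500 J Carnot limit, 650 J rejected
```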
Refrigerators
The same principle can be applied to a refrigerator working between a low temperature TL and ambient temperature. The schematic drawing is exactly the same as Fig.3 with TH replaced by TL, QH by QL, and the sign of W reversed. In this case the entropy production is
$$S_i = \frac{Q_a}{T_a} - \frac{Q_L}{T_L}$$
and the work needed to extract heat QL from the cold source is
$$W = \left(\frac{T_a}{T_L} - 1\right) Q_L + T_a S_i.$$
The first term is the minimum required work, which corresponds to a reversible refrigerator, so we have
$$W \ge \left(\frac{T_a}{T_L} - 1\right) Q_L,$$
i.e., the refrigerator compressor has to perform extra work to
compensate for the dissipated energy due to irreversible processes which
lead to entropy production.
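A corresponding sketch for the refrigerator relations, again with assumed illustrative numbers:

```python
def refrigerator_work(Q_L, T_L, T_a, S_i):
    """Work needed to pump heat Q_L out of a cold space at T_L to ambient T_a:
    W = (T_a/T_L - 1) * Q_L + T_a * S_i, the reversible minimum plus dissipation."""
    W_min = (T_a / T_L - 1.0) * Q_L
    return W_min + T_a * S_i, W_min

# Assumed numbers: extract 1000 J at 250 K, 300 K ambient, S_i = 0.2 J/K.
W, W_min = refrigerator_work(Q_L=1000.0, T_L=250.0, T_a=300.0, S_i=0.2)
print(W, W_min)   # 260 J needed vs. 200 J reversible minimum
```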
In thermodynamics,
entropy is a numerical quantity that shows that many physical processes
can go in only one direction in time. For example, cream and coffee can
be mixed together, but cannot be "unmixed"; a piece of wood can be
burned, but cannot be "unburned". The word 'entropy' has entered popular
usage to refer to a lack of order or predictability, or a gradual
decline into disorder.[1]
A more physical interpretation of thermodynamic entropy refers to
spread of energy or matter, or to extent and diversity of microscopic
motion.
If a movie that shows coffee being mixed or wood being burned is
played in reverse, it would depict processes highly improbable in
reality. Mixing coffee and burning wood are "irreversible".
Irreversibility is described by a law of nature known as the second law of thermodynamics,
which states that in an isolated system (a system not connected to any
other system) which is undergoing change, entropy increases over time.
Entropy does not increase indefinitely. A body of matter and
radiation eventually will reach an unchanging state, with no detectable
flows, and is then said to be in a state of thermodynamic equilibrium.
Thermodynamic entropy has a definite value for such a body and is at
its maximum value. When bodies of matter or radiation, initially in
their own states of internal thermodynamic equilibrium, are brought
together so as to intimately interact and reach a new joint equilibrium,
then their total entropy increases. For example, a glass of warm water
with an ice cube in it will have a lower entropy than that same system
some time later when the ice has melted leaving a glass of cool water.
Such processes are irreversible: A glass of cool water will not spontaneously
turn into a glass of warm water with an ice cube in it. Some processes
in nature are almost reversible. For example, the orbiting of the
planets around the Sun may be thought of as practically reversible: A
movie of the planets orbiting the Sun which is run in reverse would not
appear to be impossible.
While the second law, and thermodynamics in general, accurately
predicts the intimate interactions of complex physical systems,
scientists are not content with simply knowing how a system behaves;
they also want to know why it behaves the way it does. The
question of why entropy increases until equilibrium is reached was
answered in 1877 by the physicist Ludwig Boltzmann. The theory developed by Boltzmann and others is known as statistical mechanics.
Statistical mechanics explains thermodynamics in terms of the
statistical behavior of the atoms and molecules which make up the
system. The theory not only explains thermodynamics, but also a host of
other phenomena which are outside the scope of thermodynamics.
Explanation
Thermodynamic entropy
The concept of thermodynamic entropy arises from the second law of thermodynamics. This law of entropy increase quantifies the reduction in the capacity of an isolated compound thermodynamic system to do thermodynamic work
on its surroundings, or indicates whether a thermodynamic process may
occur. For example, whenever there is a suitable pathway, heat
spontaneously flows from a hotter body to a colder one.
Thermodynamic entropy is measured as a change in entropy (ΔS)
to a system containing a sub-system which undergoes heat transfer to
its surroundings (inside the system of interest). It is based on the macroscopic relationship between heat flow into the sub-system and the temperature at which it occurs summed over the boundary of that sub-system.
Following the formalism of Clausius, the basic calculation can be mathematically stated as:
$$\Delta S = \frac{Q}{T},$$
where ΔS is the increase or decrease in entropy, Q is the heat added to the system or subtracted from it, and T is the temperature. The 'equals' sign and the symbol Δ imply that the heat transfer should be so small and slow that it scarcely changes the temperature T.
If the temperature is allowed to vary, the equation must be integrated
over the temperature path. This calculation of entropy change does not
allow the determination of absolute value, only differences. In this
context, the Second Law of Thermodynamics may be stated as follows: for heat
transferred over any valid process for any system, whether isolated or
not,
$$\Delta S \ge \frac{Q}{T}.$$
According to the first law of thermodynamics, which deals with the conservation of energy, the loss of heat will result in a decrease in the internal energy of the thermodynamic system.
Thermodynamic entropy provides a comparative measure of the amount of
decrease in internal energy and the corresponding increase in internal
energy of the surroundings at a given temperature. In many cases, a
visualization of the second law is that energy of all types changes from
being localized to becoming dispersed or spread out, if it is not
hindered from doing so. When applicable, entropy increase is the
quantitative measure of that kind of a spontaneous process: how much
energy has been effectively lost or become unavailable, by dispersing
itself, or spreading itself out, as assessed at a specific temperature.
For this assessment, when the temperature is higher, the amount of
energy dispersed is assessed as 'costing' proportionately less. This is
because a hotter body is generally more able to do thermodynamic work,
other factors, such as internal energy, being equal. This is why a steam
engine has a hot firebox.
The second law of thermodynamics deals only with changes of entropy (ΔS). The absolute entropy (S) of a system may be determined using the third law of thermodynamics, which specifies that the entropy of all perfectly crystalline substances is zero at the absolute zero of temperature.
The entropy at another temperature is then equal to the increase in
entropy on heating the system reversibly from absolute zero to the
temperature of interest.
Statistical mechanics and information entropy
Thermodynamic entropy bears a close relationship to the concept of information entropy (H).
Information entropy is a measure of the "spread" of a probability
density or probability mass function. Thermodynamics makes no
assumptions about the atomistic nature of matter, but when matter is
viewed in this way, as a collection of particles constantly moving and
exchanging energy with each other, and which may be described in a
probabilistic manner, information theory may be successfully applied to
explain the results of thermodynamics. The resulting theory is known as statistical mechanics.
An important concept in statistical mechanics is the idea of the microstate and the macrostate
of a system. If we have a container of gas, for example, and we know
the position and velocity of every molecule in that system, then we know
the microstate of that system. If we only know the thermodynamic
description of that system, the pressure, volume, temperature, and/or
the entropy, then we know the macrostate of that system. Boltzmann
realized that there are many different microstates that can yield the
same macrostate, and, because the particles are colliding with each
other and changing their velocities and positions, the microstate of the
gas is always changing. But if the gas is in equilibrium, there seems
to be no change in its macroscopic behavior: No changes in pressure,
temperature, etc. Statistical mechanics relates the thermodynamic
entropy of a macrostate to the number of microstates that could yield
that macrostate. In statistical mechanics, the entropy of the system is
given by Ludwig Boltzmann's equation:
$$S = k_B \ln W,$$
where S is the thermodynamic entropy, W is the number of microstates that may yield the macrostate, and kB is the Boltzmann constant. The natural logarithm of the number of microstates (ln W) is known as the information entropy of the system. This can be illustrated by a simple example:
If you flip two coins, you can have four different results. If H is heads and T is tails, we can have (H,H), (H,T), (T,H), and (T,T).
We can call each of these a "microstate" for which we know exactly the
results of the process. But what if we have less information? Suppose we
only know the total number of heads. This can be either 0, 1, or 2. We
can call these "macrostates". Only microstate (T,T) will give macrostate 0, (H,T) and (T,H) will give macrostate 1, and only (H,H)
will give macrostate 2. So we can say that the information entropy of
macrostates 0 and 2 are ln(1) which is zero, but the information entropy
of macrostate 1 is ln(2) which is about 0.69. Of all the microstates,
macrostate 1 accounts for half of them.
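The counting in this two-coin example can be reproduced with a short sketch (Python, purely illustrative), which enumerates the microstates and groups them by macrostate:

```python
from itertools import product
from math import log
from collections import Counter

# Enumerate the microstates of two coin flips and group them by macrostate
# (the total number of heads), as in the text.
microstates = list(product("HT", repeat=2))            # (H,H), (H,T), (T,H), (T,T)
by_macrostate = Counter(state.count("H") for state in microstates)

for heads, count in sorted(by_macrostate.items()):
    print(f"macrostate {heads}: {count} microstate(s), "
          f"information entropy ln({count}) = {log(count):.2f}")
# macrostate 0: 1 microstate(s), information entropy ln(1) = 0.00
# macrostate 1: 2 microstate(s), information entropy ln(2) = 0.69
# macrostate 2: 1 microstate(s), information entropy ln(1) = 0.00
```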
It turns out that if you flip a large number of coins, the
macrostates at or near half heads and half tails account for almost all
of the microstates. In other words, for a million coins, you can be
fairly sure that about half will be heads and half tails. The
macrostates around a 50–50 ratio of heads to tails will be the
"equilibrium" macrostate. A real physical system in equilibrium has a
huge number of possible microstates, and almost all of them belong to the
equilibrium macrostate, which is the macrostate you will almost
certainly see if you wait long enough. In the coin example, if you start
out with a very unlikely macrostate (like all heads, for example with
zero entropy) and begin flipping one coin at a time, the entropy of the
macrostate will start increasing, just as thermodynamic entropy does,
and after a while, the coins will most likely be at or near that 50–50
macrostate, which has the greatest information entropy – the equilibrium
entropy.
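A similar sketch (with coin counts chosen arbitrarily for illustration) shows how the macrostates between 49% and 51% heads come to account for nearly all microstates as the number of coins grows:

```python
from math import comb

# For N coins there are 2**N equally likely microstates. The fraction whose
# head count lies between 49% and 51% of N grows toward one as N increases:
# the near-50/50 macrostates dominate, which is the "equilibrium" behaviour
# described above.
for N in (100, 1000, 10000):
    lo, hi = round(0.49 * N), round(0.51 * N)
    near_half = sum(comb(N, k) for k in range(lo, hi + 1))
    print(N, near_half / 2**N)   # roughly 0.24, 0.49, 0.96
```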
The macrostate of a system is what we know about the system, for example the temperature, pressure, and volume
of a gas in a box. For each set of values of temperature, pressure,
and volume there are many arrangements of molecules which result in
those values. The number of arrangements of molecules which could result
in the same values for temperature, pressure and volume is the number
of microstates.
The concept of information entropy has been developed to describe
any of several phenomena, depending on the field and the context in
which it is being used. When it is applied to the problem of a large
number of interacting particles, along with some other constraints, like
the conservation of energy, and the assumption that all microstates are
equally likely, the resultant theory of statistical mechanics is
extremely successful in explaining the laws of thermodynamics.
Ice melting provides an example in which entropy increases in a small
system: a thermodynamic system consisting of the surroundings (the warm
room) and the glass container of ice and water, which has been
allowed to reach thermodynamic equilibrium at the melting temperature of ice. In this system, some heat (δQ)
from the warmer surroundings at 298 K (25 °C; 77 °F) transfers to the
cooler system of ice and water at its constant temperature (T) of 273 K (0 °C; 32 °F), the melting temperature of ice. The entropy of the system, which is δQ/T, increases by δQ/273 K. The heat δQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. ΔH for ice fusion.
The entropy of the surrounding room decreases less than the
entropy of the ice and water increases: the room temperature of 298 K is
larger than 273 K and therefore the ratio (entropy change) of δQ/298 K for the surroundings is smaller than the ratio (entropy change) of δQ/273 K
for the ice and water system. This is always true in spontaneous events
in a thermodynamic system and it shows the predictive importance of
entropy: the final net entropy after such an event is always greater
than was the initial entropy.
As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the δQ/T
over the continuous range, "at many increments", in the initially cool
to finally warm water can be found by calculus. The entire miniature
'universe', i.e. this thermodynamic system, has increased in entropy.
Energy has spontaneously become more dispersed and spread out in that
'universe' than when the glass of ice and water was introduced and
became a 'system' within it.
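A minimal sketch of this bookkeeping, with the amount of transferred heat chosen arbitrarily for illustration, reproduces the net entropy increase described above:

```python
# Entropy bookkeeping for the melting-ice example: heat dQ leaves the 298 K
# room and enters the ice-and-water mixture at 273 K. The value of dQ is an
# assumption for illustration; any positive dQ gives a net increase.
dQ = 1000.0                     # J of heat transferred (assumed)
T_room, T_ice = 298.0, 273.0    # K

dS_ice_water = dQ / T_ice       # entropy gained by the ice and water
dS_room = -dQ / T_room          # entropy lost by the warm room
print(dS_ice_water, dS_room, dS_ice_water + dS_room)   # net change is positive
```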
Origins and uses
Originally,
entropy was named to describe the "waste heat", or more accurately,
energy loss, from heat engines and other mechanical devices which could
never run with 100% efficiency in converting energy into work. Later,
the term came to acquire several additional descriptions, as more was
understood about the behavior of molecules on the microscopic level. In
the late 19th century, the word "disorder" was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory
to describe the increased molecular movement on the microscopic level.
That was before quantum behavior came to be better understood by Werner Heisenberg
and those who followed. Descriptions of thermodynamic (heat) entropy on
the microscopic level are found in statistical thermodynamics and statistical mechanics.
For most of the 20th century, textbooks tended to describe
entropy as "disorder", following Boltzmann's early conceptualisation of
the "motional" (i.e. kinetic) energy of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe entropy as energy dispersal.
Entropy can also involve the dispersal of particles, which are
themselves energetic. Thus there are instances where both particles and
energy disperse at different rates when substances are mixed together.
The mathematics developed in statistical thermodynamics were
found to be applicable in other disciplines. In particular, information
sciences developed the concept of information entropy, which lacks the Boltzmann constant inherent in thermodynamic entropy.
Classical calculation of entropy
When
the word 'entropy' was first defined and used in 1865, the very
existence of atoms was still controversial, though it had long been
speculated that temperature was due to the motion of microscopic
constituents and that "heat" was the transferring of that motion from
one place to another. Entropy change, ΔS,
was described in macroscopic terms that could be directly measured,
such as volume, temperature, or pressure. However, today the classical
equation of entropy, ΔS = qrev/T, can be explained, part by part, in modern terms describing how molecules are responsible for what is happening:
ΔS is the change in entropy of a system (some physical substance of
interest) after some motional energy ("heat") has been transferred to it
by fast-moving molecules. So, ΔS = Sfinal − Sinitial.
Then, ΔS = Sfinal − Sinitial = qrev/T,
the quotient of the motional energy ("heat") q that is transferred
"reversibly" (rev) to the system from the surroundings (or from another
system in contact with the first system) divided by T, the absolute
temperature at which the transfer occurs.
"Reversible" or "reversibly" (rev) simply means that T, the
temperature of the system, has to stay (almost) exactly the same while
any energy is being transferred to or from it. That is easy in the case
of phase changes, where the system absolutely must stay in the solid or
liquid form until enough energy is given to it to break bonds between
the molecules before it can change to a liquid or a gas. For example, in
the melting of ice at 273.15 K, no matter what temperature the
surroundings are – from 273.20 K to 500 K or even higher, the
temperature of the ice will stay at 273.15 K until the last molecules in
the ice are changed to liquid water, i.e., until all the hydrogen bonds
between the water molecules in ice are broken and new, less-exactly
fixed hydrogen bonds between liquid water molecules are formed. This
amount of energy necessary for ice melting per mole has been found to be
6008 joules at 273 K. Therefore, the entropy change per mole is 6008 J / 273 K, or 22 J/K.
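The arithmetic of this fusion example can be checked directly with the values quoted above:

```python
# Entropy of fusion of ice from the values quoted above:
# about 6008 J per mole absorbed reversibly at the melting temperature.
enthalpy_of_fusion = 6008.0     # J/mol
T_melt = 273.15                 # K
print(round(enthalpy_of_fusion / T_melt, 1))   # ≈ 22.0 J/(mol·K)
```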
When the temperature is not at the melting or boiling point of a
substance no intermolecular bond-breaking is possible, and so any
motional molecular energy ("heat") from the surroundings transferred to a
system raises its temperature, making its molecules move faster and
faster. As the temperature is constantly rising, there is no longer a
particular value of "T" at which energy is transferred. However, a
"reversible" energy transfer can be measured at a very small temperature
increase, and a cumulative total can be found by adding each of many
small temperature intervals or increments. For example, to find the
entropy change
from 300 K to 310 K, measure the amount of energy transferred at dozens
or hundreds of temperature increments, say from 300.00 K to 300.01 K
and then 300.01 to 300.02 and so on, dividing the q by each T, and
finally adding them all.
Calculus can be used to make this calculation easier if the effect
of energy input to the system is linearly dependent on the temperature
change, as in simple heating of a system at moderate to relatively high
temperatures. Thus, the energy being transferred "per incremental change
in temperature" (the heat capacity, ), multiplied by the integral of from to , is directly given by .
Alternate explanations of entropy
Thermodynamic entropy
A measure of energy unavailable for work:
This is an often-repeated phrase which, although it is true, requires
considerable clarification to be understood. It is only true for cyclic
reversible processes, and is in this sense misleading. By "work" is
meant moving an object, for example, lifting a weight, or bringing a
flywheel up to speed, or carrying a load up a hill. To convert heat into
work, using a coal-burning steam engine, for example, one must have two
systems at different temperatures, and the amount of work you can
extract depends on how large the temperature difference is, and how
large the systems are. If one of the systems is at room temperature, and
the other system is much larger, and near absolute zero temperature,
then almost ALL of the energy of the room temperature system can be
converted to work. If they are both at the same room temperature, then
NONE of the energy of the room temperature system can be converted to
work. Entropy is then a measure of how much energy cannot be converted
to work, given these conditions. More precisely, for an isolated system
comprising two closed systems at different temperatures, in the process
of reaching equilibrium the amount of entropy lost by the hot system,
multiplied by the temperature of the hot system, is the amount of energy
that cannot be converted to work.
An indicator of irreversibility: fitting closely with the
'unavailability of energy' interpretation is the 'irreversibility'
interpretation. Spontaneous thermodynamic processes are irreversible, in
the sense that they do not spontaneously undo themselves. Thermodynamic
processes artificially imposed by agents in the surroundings of a body
also have irreversible effects on the body. For example, when James Prescott Joule
used a device that delivered a measured amount of mechanical work from
the surroundings through a paddle that stirred a body of water, the
energy transferred was received by the water as heat. There was scarce
expansion of the water doing thermodynamic work back on the
surroundings. The body of water showed no sign of returning the energy
by stirring the paddle in reverse. The work transfer appeared as heat,
and was not recoverable without a suitably cold reservoir in the
surroundings. Entropy gives a precise account of such irreversibility.
Dispersal: Edward A. Guggenheim
proposed an ordinary language interpretation of entropy that may be
rendered as "dispersal of modes of microscopic motion throughout their
accessible range".Later, along with a criticism of the idea of entropy as 'disorder', the dispersal interpretation was advocated by Frank L. Lambert, and is used in some student textbooks.
The interpretation properly refers to dispersal in abstract
microstate spaces, but it may be loosely visualised in some simple
examples of spatial spread of matter or energy. If a partition is
removed from between two different gases, the molecules of each gas
spontaneously disperse as widely as possible into their respectively
newly accessible volumes; this may be thought of as mixing. If a
partition, that blocks heat transfer between two bodies of different
temperatures, is removed so that heat can pass between the bodies, then
energy spontaneously disperses or spreads as heat from the hotter to the
colder.
Beyond such loose visualizations, in a general thermodynamic
process, considered microscopically, spontaneous dispersal occurs in
abstract microscopic phase space.
According to Newton's and other laws of motion, phase space provides a
systematic scheme for the description of the diversity of microscopic
motion that occurs in bodies of matter and radiation. The second law of
thermodynamics may be regarded as quantitatively accounting for the
intimate interactions, dispersal, or mingling of such microscopic
motions. In other words, entropy may be regarded as measuring the extent
of diversity of motions of microscopic constituents of bodies of matter
and radiation in their own states of internal thermodynamic
equilibrium.
Information entropy and statistical mechanics
As a measure of disorder: Traditionally, 20th century textbooks have introduced entropy as order and disorder
so that it provides "a measurement of the disorder or randomness of a
system". It has been argued that ambiguities in, and arbitrary
interpretations of, the terms used (such as "disorder" and "chaos")
contribute to widespread confusion and can hinder comprehension of
entropy for most students. On the other hand, in a convenient though
arbitrary interpretation, "disorder" may be sharply defined as the Shannon entropy of the probability distribution of microstates given a particular macrostate, in which case the connection of "disorder" to thermodynamic entropy is straightforward, but arbitrary and not immediately obvious to anyone unfamiliar with information theory.
Missing information: The idea that information entropy is a measure of how much one does not know about a system is quite useful.
If, instead of using the natural logarithm to define information
entropy, we instead use the base 2 logarithm, then the information
entropy is roughly equal to the average number of (carefully chosen)
yes/no questions that would have to be asked to get complete
information about the system under study. In the introductory example of
two flipped coins, for the macrostate which
contains one head and one tail, one would only need one question to
determine its exact state (e.g., "Is the first one heads?"), and instead
of expressing the entropy as ln(2) one could say, equivalently, that it
is log2(2), which equals the number of binary questions we
would need to ask: one. When measuring entropy using the natural
logarithm (ln), the unit of information entropy is called a "nat", but
when it is measured using the base-2 logarithm, the unit of information
entropy is called a "shannon" (alternatively, "bit"). This is just a
difference in units, much like the difference between inches and
centimeters. (1 nat = log2e shannons). Thermodynamic
entropy is equal to the Boltzmann constant times the information entropy
expressed in nats. The information entropy expressed with the unit shannon
(Sh) is equal to the number of yes–no questions that need to be
answered in order to determine the microstate from the macrostate.
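The relationship between the units, and the conversion to thermodynamic entropy, can be summarized in a short sketch; the helper function names are illustrative only:

```python
from math import log

K_B = 1.380649e-23   # Boltzmann constant, J/K

def entropy_nats(W):
    """Information entropy ln(W) for W equally likely microstates, in nats."""
    return log(W)

def entropy_shannons(W):
    """The same quantity in base-2 units: the number of yes/no questions."""
    return log(W, 2)

def thermodynamic_entropy(W):
    """S = kB * ln(W): Boltzmann constant times the information entropy in nats."""
    return K_B * log(W)

W = 2                              # e.g. the one-head-one-tail macrostate above
print(entropy_nats(W))             # 0.693... nat
print(entropy_shannons(W))         # 1.0 shannon: one yes/no question
print(thermodynamic_entropy(W))    # ≈ 9.57e-24 J/K
```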
The concepts of "disorder" and "spreading" can be analyzed with this
information entropy concept in mind. For example, if we take a new deck
of cards out of the box, it is arranged in "perfect order" (spades,
hearts, diamonds, clubs, each suit beginning with the ace and ending
with the king), we may say that we then have an "ordered" deck with an
information entropy of zero. If we thoroughly shuffle the deck, the
information entropy will be about 225.6 shannons: We will need to ask
about 225.6 questions, on average, to determine the exact order of the
shuffled deck. We can also say that the shuffled deck has become
completely "disordered" or that the ordered cards have been "spread"
throughout the deck. But information entropy does not say that the deck
needs to be ordered in any particular way. If we take our shuffled deck
and write down the names of the cards, in order, then the information
entropy becomes zero. If we again shuffle the deck, the information
entropy would again be about 225.6 shannons, even if by some miracle it
reshuffled to the same order as when it came out of the box, because
even if it did, we would not know that. So the concept of "disorder" is
useful if, by order, we mean maximal knowledge and by disorder we mean
maximal lack of knowledge. The "spreading" concept is useful because it
gives a feeling to what happens to the cards when they are shuffled. The
probability of a card being in a particular place in an ordered deck is
either 0 or 1, in a shuffled deck it is 1/52. The probability has
"spread out" over the entire deck. Analogously, in a physical system,
entropy is generally associated with a "spreading out" of mass or
energy.
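The figure of about 225.6 shannons can be checked directly, since a well-shuffled deck has 52! equally likely orderings:

```python
from math import factorial, log2

# A well-shuffled 52-card deck has 52! equally likely orderings, so its
# information entropy is log2(52!) shannons.
print(log2(factorial(52)))   # ≈ 225.58, the "about 225.6 shannons" quoted above
```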
The connection between thermodynamic entropy and information entropy is given by Boltzmann's equation, which says that S = kB ln W. If we take the base-2 logarithm of W,
it will yield the average number of questions we must ask about the
physical system in order to determine its microstate, given its macrostate.