
Friday, July 28, 2017

Maxwell–Boltzmann distribution

From Wikipedia, the free encyclopedia
Maxwell–Boltzmann
Parameters: a > 0 (scale)
Support: x \in (0, \infty)
PDF: \sqrt{\frac{2}{\pi}}\, \frac{x^2 e^{-x^2/(2a^2)}}{a^3}
CDF: \operatorname{erf}\left(\frac{x}{\sqrt{2}\,a}\right) - \sqrt{\frac{2}{\pi}}\, \frac{x e^{-x^2/(2a^2)}}{a}, where erf is the error function
Mean: \mu = 2a\sqrt{\frac{2}{\pi}}
Mode: \sqrt{2}\,a
Variance: \sigma^2 = \frac{a^2(3\pi - 8)}{\pi}
Skewness: \gamma_1 = \frac{2\sqrt{2}(16 - 5\pi)}{(3\pi - 8)^{3/2}}
Ex. kurtosis: \gamma_2 = \frac{4\left(-96 + 40\pi - 3\pi^2\right)}{(3\pi - 8)^2}
Entropy: \ln\left(a\sqrt{2\pi}\right) + \gamma - \frac{1}{2}

In statistics the Maxwell–Boltzmann distribution is a particular probability distribution named after James Clerk Maxwell and Ludwig Boltzmann. It was first defined and used in physics (in particular in statistical mechanics) for describing particle speeds in idealized gases where the particles move freely inside a stationary container without interacting with one another, except for very brief collisions in which they exchange energy and momentum with each other or with their thermal environment. Particle in this context refers to gaseous particles (atoms or molecules), and the system of particles is assumed to have reached thermodynamic equilibrium.[1] While the distribution was first derived by Maxwell in 1860 on heuristic grounds,[2] Boltzmann later carried out significant investigations into the physical origins of this distribution.

A particle speed probability distribution indicates which speeds are more likely: a particle will have a speed selected randomly from the distribution, and is more likely to be within one range of speeds than another. The distribution depends on the temperature of the system and the mass of the particle.[3] The Maxwell–Boltzmann distribution applies to the classical ideal gas, which is an idealization of real gases. In real gases, there are various effects (e.g., van der Waals interactions, vortical flow, relativistic speed limits, and quantum exchange interactions) that can make their speed distribution different from the Maxwell–Boltzmann form. However, rarefied gases at ordinary temperatures behave very nearly like an ideal gas and the Maxwell speed distribution is an excellent approximation for such gases. Thus, it forms the basis of the Kinetic theory of gases, which provides a simplified explanation of many fundamental gaseous properties, including pressure and diffusion.[4]

Distribution function

The probability density functions of the speeds of a few noble gases at a temperature of 298.15 K (25 °C). The y-axis is in s/m so that the area under any section of the curve (which represents the probability of the speed being in that range) is dimensionless.

The Maxwell–Boltzmann distribution is the function[5]
 f(v) = \sqrt{\left(\frac{m}{2 \pi kT}\right)^3}\, 4\pi v^2 e^{- \frac{mv^2}{2kT}},
where m is the particle mass and kT is the product of Boltzmann's constant and thermodynamic temperature. Note that the distribution depends on m and T only through the ratio m/T: the speed distribution is unchanged if the mass and the absolute temperature are varied with m/T held fixed. This probability density function gives the probability, per unit speed, of finding the particle with a speed near v. This equation is simply the Maxwell distribution (given in the infobox) with distribution parameter a=\sqrt{kT/m}. In probability theory the Maxwell–Boltzmann distribution is a chi distribution with three degrees of freedom and scale parameter a=\sqrt{kT/m}.
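As a quick numerical check, the speed distribution above can be evaluated directly and its normalization verified by integration. A minimal sketch; the helper name `maxwell_pdf` and the N2 figures are illustrative, not from the article:

```python
import math

def maxwell_pdf(v, m, T, k=1.380649e-23):
    # Speed PDF: sqrt(2/pi) * v^2 * exp(-v^2/(2 a^2)) / a^3, with a^2 = kT/m
    a2 = k * T / m
    return math.sqrt(2.0 / math.pi) * v * v * math.exp(-v * v / (2.0 * a2)) / a2 ** 1.5

m_N2 = 4.652e-26   # approximate mass of one N2 molecule, kg (illustrative)
T = 300.0          # kelvin
dv = 0.5           # integration step, m/s
vs = [i * dv for i in range(8001)]              # speeds 0 .. 4000 m/s
fs = [maxwell_pdf(v, m_N2, T) for v in vs]
area = sum(0.5 * (fs[i] + fs[i + 1]) * dv for i in range(len(fs) - 1))
v_peak = vs[fs.index(max(fs))]                  # should sit near v_p = sqrt(2kT/m)
```

The trapezoidal integral of the PDF over all speeds comes out very close to 1, and the grid maximum lands near the most probable speed discussed below.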

The simplest ordinary differential equation satisfied by the distribution is:
kTvf'(v) + f(v)\left(mv^2 - 2kT\right) = 0,
f(1) = \sqrt{\frac{2}{\pi}}\, e^{-\frac{m}{2kT}} \left(\frac{m}{kT}\right)^{3/2}
or in unitless presentation:
a^2 x f'(x) + \left(x^2 - 2a^2\right) f(x) = 0,
f(1) = \frac{\sqrt{\frac{2}{\pi}}\, e^{-\frac{1}{2a^2}}}{a^3}.
Note that a distribution (function) is not the same as the probability. The distribution (function) stands for an average number, as in all three kinds of statistics (Maxwell–Boltzmann, Bose–Einstein, Fermi–Dirac). With the Darwin–Fowler method of mean values the Maxwell–Boltzmann distribution is obtained as an exact result.

Typical speeds

The mean speed, most probable speed (mode), and root-mean-square speed can be obtained from properties of the Maxwell distribution.
  • The most probable speed, vp, is the speed most likely to be possessed by any molecule (of the same mass m) in the system and corresponds to the maximum value or mode of f(v). To find it, we calculate the derivative df/dv, set it to zero and solve for v:
    \frac{df(v)}{dv} =  0
    which yields:
    v_p = \sqrt { \frac{2kT}{m} } = \sqrt { \frac{2RT}{M} }
    where R is the gas constant and M = NA m is the molar mass of the substance.
    For diatomic nitrogen (N2, the primary component of air) at room temperature (300 K), this gives v_p = 422 m/s
  • The mean speed is the expected value of the speed distribution
     \langle v \rangle = \int_0^{\infty} v \, f(v) \, dv= \sqrt { \frac{8kT}{\pi m}}= \sqrt { \frac{8RT}{\pi M}} = \frac{2}{\sqrt{\pi}} v_p
  • The root mean square speed is the second-order moment of speed:
     \sqrt{\langle v^2 \rangle} = \left(\int_0^{\infty} v^2 \, f(v) \, dv  \right)^{1/2}= \sqrt { \frac{3kT}{m}}= \sqrt { \frac{3RT}{M} } = \sqrt{ \frac{3}{2} } v_p
The typical speeds are related as follows:
 0.886 \langle v \rangle = v_p < \langle v \rangle < \sqrt{\langle v^2 \rangle} = 1.085 \langle v \rangle.
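The three typical speeds and the relations between them can be checked numerically for the N2 example above. A sketch; `v_p`, `v_avg`, and `v_rms` are illustrative variable names:

```python
import math

R = 8.314462618      # molar gas constant, J/(mol K)
M_N2 = 0.0280134     # molar mass of N2, kg/mol (illustrative)
T = 300.0            # kelvin

v_p = math.sqrt(2.0 * R * T / M_N2)        # most probable speed, sqrt(2RT/M)
v_avg = 2.0 / math.sqrt(math.pi) * v_p     # mean speed, sqrt(8RT/(pi M))
v_rms = math.sqrt(1.5) * v_p               # root-mean-square speed, sqrt(3RT/M)
```

This reproduces v_p ≈ 422 m/s and the ordering v_p < ⟨v⟩ < √⟨v²⟩ with the 0.886 and 1.085 ratios quoted above.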

Derivation and related distributions

The original derivation in 1860 by James Clerk Maxwell was an argument based on molecular collisions of the Kinetic theory of gases as well as certain symmetries in the speed distribution function; Maxwell also gave an early argument that these molecular collisions entail a tendency towards equilibrium.[2][6] After Maxwell, Ludwig Boltzmann in 1872[7] also derived the distribution on mechanical grounds and argued that gases should over time tend toward this distribution, due to collisions (see H-theorem). He later (1877)[8] derived the distribution again under the framework of statistical thermodynamics. The derivations in this section are along the lines of Boltzmann's 1877 derivation, starting with a result known as Maxwell–Boltzmann statistics (from statistical thermodynamics). Maxwell–Boltzmann statistics gives the average number of particles found in a given single-particle microstate, under certain assumptions:[1][9]
\frac{N_i}{N} = \frac{\exp(-E_i/kT)}{\sum_j \exp(-E_j/kT)}   (1)
where:
  • i and j are indices (or labels) of the single-particle micro states,
  • Ni is the average number of particles in the single-particle microstate i,
  • N is the total number of particles in the system,
  • Ei is the energy of microstate i,
  • T is the equilibrium temperature of the system,
  • k is the Boltzmann constant.
The assumptions of this equation are that the particles do not interact, and that they are classical; this means that each particle's state can be considered independently from the other particles' states. Additionally, the particles are assumed to be in thermal equilibrium. The denominator in Equation (1) is simply a normalizing factor so that the Ni/N add up to 1 — in other words it is a kind of partition function (for the single-particle system, not the usual partition function of the entire system).
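A minimal sketch of Equation (1) for a hypothetical discrete three-level system; the function name `boltzmann_fractions` is illustrative:

```python
import math

def boltzmann_fractions(energies, kT):
    # Average fraction N_i/N for each single-particle microstate (Eq. 1)
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)   # the normalizing denominator (single-particle partition function)
    return [w / Z for w in weights]

# Hypothetical three-level system with level spacing equal to kT
fracs = boltzmann_fractions([0.0, 1.0, 2.0], kT=1.0)
```

The fractions sum to 1 by construction, and lower-energy microstates are more heavily occupied, as expected.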

Because velocity and speed are related to energy, Equation (1) can be used to derive relationships between temperature and the speeds of gas particles. All that is needed is to discover the density of microstates in energy, which is determined by dividing up momentum space into equal sized regions.

Distribution for the momentum vector

The potential energy is taken to be zero, so that all energy is in the form of kinetic energy. The relationship between kinetic energy and momentum for massive non-relativistic particles is
E = \frac{p^2}{2m}   (2)
where p2 is the square of the momentum vector p = [px, py, pz]. We may therefore rewrite Equation (1) as:

\frac{N_i}{N} = \frac{1}{Z} \exp\left[-\frac{p_{i,x}^2 + p_{i,y}^2 + p_{i,z}^2}{2mkT}\right]   (3)
where Z is the partition function, corresponding to the denominator in Equation (1). Here m is the molecular mass of the gas, T is the thermodynamic temperature and k is the Boltzmann constant. This distribution of Ni/N is proportional to the probability density function fp for finding a molecule with these values of momentum components, so:

f_\mathbf{p}(p_x, p_y, p_z) = \frac{c}{Z} \exp\left[-\frac{p_x^2 + p_y^2 + p_z^2}{2mkT}\right]   (4)
The normalizing constant c can be determined by recognizing that the probability of a molecule having some momentum must be 1. Therefore the integral of Equation (4) over all px, py, and pz must be 1.

It can be shown that:

c = \frac{Z}{(2\pi mkT)^{3/2}}   (5)
Substituting Equation (5) into Equation (4) gives:

f_\mathbf{p}(p_x, p_y, p_z) = \left(2\pi mkT\right)^{-3/2} \exp\left[-\frac{p_x^2 + p_y^2 + p_z^2}{2mkT}\right]   (6)
The distribution is seen to be the product of three independent normally distributed variables p_{x}, p_{y}, and p_{z}, with variance mkT. Additionally, it can be seen that the magnitude of momentum will be distributed as a Maxwell–Boltzmann distribution, with a=\sqrt{mkT}. The Maxwell–Boltzmann distribution for the momentum (or equally for the velocities) can be obtained more fundamentally using the H-theorem at equilibrium within the Kinetic theory of gases framework.
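The claim that the momentum magnitude is Maxwell-distributed with a = \sqrt{mkT} can be spot-checked by Monte Carlo: draw each component from a normal distribution with variance mkT, per Equation (6), and compare the sample mean of |p| with the Maxwell mean 2a\sqrt{2/\pi}. A sketch with illustrative N2-like parameters:

```python
import math
import random

random.seed(0)
m, k, T = 4.652e-26, 1.380649e-23, 300.0   # illustrative N2-like mass, SI units
a = math.sqrt(m * k * T)                   # Maxwell scale parameter for momentum

# Each momentum component ~ N(0, sqrt(mkT)); |p| should then follow a
# Maxwell distribution whose mean is 2 a sqrt(2/pi)
n = 200_000
mags = [math.sqrt(random.gauss(0.0, a) ** 2 +
                  random.gauss(0.0, a) ** 2 +
                  random.gauss(0.0, a) ** 2) for _ in range(n)]
mean_mag = sum(mags) / n
mean_theory = 2.0 * a * math.sqrt(2.0 / math.pi)
```

With 200,000 samples the sample mean agrees with the Maxwell mean to well under a percent.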

Distribution for the energy

The energy distribution is found imposing
f_E(E)\,dE = f_\mathbf{p}(\mathbf{p})\,d^3\mathbf{p}   (7)
where d^3\mathbf{p} is the infinitesimal phase-space volume of momenta corresponding to the energy interval dE. Making use of the spherical symmetry of the energy-momentum dispersion relation E = |\mathbf{p}|^2/2m, this can be expressed in terms of dE as
d^3\mathbf{p} = 4\pi |\mathbf{p}|^2 \, d|\mathbf{p}| = 4\pi m\sqrt{2mE}\, dE.   (8)
Using then (8) in (7), and expressing everything in terms of the energy E, we get
f_E(E)\,dE = \frac{1}{(2\pi mkT)^{3/2}}\, e^{-E/kT}\, 4\pi m\sqrt{2mE}\, dE = 2\sqrt{\frac{E}{\pi}}\left(\frac{1}{kT}\right)^{3/2}\exp\left(-\frac{E}{kT}\right) dE
and finally
f_E(E) = 2\sqrt{\frac{E}{\pi}}\left(\frac{1}{kT}\right)^{3/2}\exp\left(-\frac{E}{kT}\right)   (9)
Since the energy is proportional to the sum of the squares of the three normally distributed momentum components, this distribution is a gamma distribution with shape parameter 3/2 and scale parameter kT; equivalently, 2E/kT follows a chi-squared distribution with three degrees of freedom.

By the equipartition theorem, this energy is evenly distributed among all three degrees of freedom, so that the energy per degree of freedom is distributed as a chi-squared distribution with one degree of freedom:[10]
f_\epsilon(\epsilon)\,d\epsilon = \sqrt{\frac{1}{\pi \epsilon kT}}\,\exp\left[-\frac{\epsilon}{kT}\right]d\epsilon
where \epsilon is the energy per degree of freedom. At equilibrium, this distribution will hold true for any number of degrees of freedom. For example, if the particles are rigid mass dipoles of fixed dipole moment, they will have three translational degrees of freedom and two additional rotational degrees of freedom. The energy in each degree of freedom will be described according to the above chi-squared distribution with one degree of freedom, and the total energy will be distributed according to a chi-squared distribution with five degrees of freedom. This has implications in the theory of the specific heat of a gas.
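As a consistency check, the mean of the one-degree-of-freedom energy distribution above should equal kT/2, per equipartition. A sketch in units where kT = 1, using the substitution \epsilon = x^2 to remove the integrable singularity at \epsilon = 0:

```python
import math

kT = 1.0   # work in units where kT = 1

# Mean of f(eps) = sqrt(1/(pi*eps*kT)) * exp(-eps/kT): the substitution eps = x^2
# turns eps * f(eps) d(eps) into 2 x^2 / sqrt(pi*kT) * exp(-x^2/kT) dx,
# removing the 1/sqrt(eps) singularity at eps = 0
def integrand(x):
    return 2.0 * x * x / math.sqrt(math.pi * kT) * math.exp(-x * x / kT)

dx = 1e-3
xs = [i * dx for i in range(10001)]   # x from 0 to 10 (many thermal widths)
mean_energy = sum(0.5 * (integrand(xs[i]) + integrand(xs[i + 1])) * dx
                  for i in range(len(xs) - 1))
```

The trapezoidal integral returns kT/2 = 0.5 to four decimal places.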

The Maxwell–Boltzmann distribution can also be obtained by considering the gas to be a type of quantum gas for which the approximation ε >> k T may be made.

Distribution for the velocity vector

Recognizing that the velocity probability density fv is proportional to the momentum probability density function by

f_\mathbf{v} d^3v = f_\mathbf{p} \left(\frac{dp}{dv}\right)^3 d^3v
and using p = mv we get
f_\mathbf{v}(v_x, v_y, v_z) = \left(\frac{m}{2\pi kT}\right)^{3/2} \exp\left[-\frac{m(v_x^2 + v_y^2 + v_z^2)}{2kT}\right]
which is the Maxwell–Boltzmann velocity distribution. The probability of finding a particle with velocity in the infinitesimal element [dvx, dvy, dvz] about velocity v = [vx, vy, vz] is

f_\mathbf{v} \left(v_x, v_y, v_z\right)\, dv_x\, dv_y\, dv_z.
Like the momentum, this distribution is seen to be the product of three independent normally distributed variables v_x, v_y, and v_z, but with variance \frac{kT}{m}. It can also be seen that the Maxwell–Boltzmann velocity distribution for the vector velocity [vx, vy, vz] is the product of the distributions for each of the three directions:

f_v \left(v_x, v_y, v_z\right) = f_v (v_x)f_v (v_y)f_v (v_z)
where the distribution for a single direction is

f_v (v_i) =
\sqrt{\frac{m}{2 \pi kT}}
\exp \left[
\frac{-mv_i^2}{2kT}
\right].
Each component of the velocity vector has a normal distribution with mean \mu_{v_x} = \mu_{v_y} = \mu_{v_z} = 0 and standard deviation \sigma_{v_x} = \sigma_{v_y} = \sigma_{v_z} = \sqrt{\frac{kT}{m}}, so the vector has a 3-dimensional normal distribution, a particular kind of multivariate normal distribution, with mean  \mu_{\mathbf{v}} = {\mathbf{0}} and standard deviation \sigma_{\mathbf{v}} = \sqrt{\frac{3kT}{m}}.

The Maxwell–Boltzmann distribution for the speed follows immediately from the distribution of the velocity vector, above. Note that the speed is
v = \sqrt{v_x^2 + v_y^2 + v_z^2}
and the volume element in spherical coordinates is
 dv_x\, dv_y\, dv_z = v^2 \sin \theta\, dv\, d\theta\, d\phi
where \phi and \theta are the "course" (azimuth of the velocity vector) and "path angle" (elevation angle of the velocity vector). Integration of the normal probability density function of the velocity, above, over the course (from 0 to 2\pi ) and path angle (from 0 to \pi ), with substitution of the speed for the sum of the squares of the vector components, yields the speed distribution.

Thursday, July 27, 2017

Ideal gas

From Wikipedia, the free encyclopedia

An ideal gas is a theoretical gas composed of many randomly moving point particles whose only interactions are perfectly elastic collisions. The ideal gas concept is useful because it obeys the ideal gas law, a simplified equation of state, and is amenable to analysis under statistical mechanics.

One mole of an ideal gas has a volume of 22.710947(13) litres[1] at STP (a temperature of 273.15 K and an absolute pressure of exactly 10^5 Pa) as defined by IUPAC since 1982. (Until 1982, STP was defined as a temperature of 273.15 K and an absolute pressure of exactly 1 atm. The volume of one mole of an ideal gas at this temperature and pressure is 22.413962(13) litres.[2] IUPAC recommends that the former use of this definition should be discontinued;[3] however, some textbooks still use these old values.)

At normal conditions such as standard temperature and pressure, most real gases behave qualitatively like an ideal gas. Many gases such as nitrogen, oxygen, hydrogen, noble gases, and some heavier gases like carbon dioxide can be treated like ideal gases within reasonable tolerances.[4] Generally, a gas behaves more like an ideal gas at higher temperature and lower pressure,[4] as the potential energy due to intermolecular forces becomes less significant compared with the particles' kinetic energy, and the size of the molecules becomes less significant compared to the empty space between them.

The ideal gas model tends to fail at lower temperatures or higher pressures, when intermolecular forces and molecular size become important. It also fails for most heavy gases, such as many refrigerants,[4] and for gases with strong intermolecular forces, notably water vapor. At high pressures, the volume of a real gas is often considerably greater than that of an ideal gas. At low temperatures, the pressure of a real gas is often considerably less than that of an ideal gas. At some point of low temperature and high pressure, real gases undergo a phase transition, such as to a liquid or a solid. The model of an ideal gas, however, does not describe or allow phase transitions. These must be modeled by more complex equations of state. The deviation from the ideal gas behaviour can be described by a dimensionless quantity, the compressibility factor, Z.

The ideal gas model has been explored in both the Newtonian dynamics (as in "kinetic theory") and in quantum mechanics (as a "gas in a box"). The ideal gas model has also been used to model the behavior of electrons in a metal (in the Drude model and the free electron model), and it is one of the most important models in statistical mechanics.

Types of ideal gas

There are three basic classes of ideal gas[citation needed]:
  • the classical or Maxwell–Boltzmann ideal gas,
  • the ideal quantum Bose gas, composed of bosons, and
  • the ideal quantum Fermi gas, composed of fermions.
The classical ideal gas can be separated into two types: the classical thermodynamic ideal gas and the ideal quantum Boltzmann gas. Both are essentially the same, except that the classical thermodynamic ideal gas is based on classical statistical mechanics, and certain thermodynamic parameters such as the entropy are only specified to within an undetermined additive constant. The ideal quantum Boltzmann gas overcomes this limitation by taking the limit of the quantum Bose gas and quantum Fermi gas in the limit of high temperature to specify these additive constants. The behavior of a quantum Boltzmann gas is the same as that of a classical ideal gas except for the specification of these constants. The results of the quantum Boltzmann gas are used in a number of cases including the Sackur–Tetrode equation for the entropy of an ideal gas and the Saha ionization equation for a weakly ionized plasma.

Classical thermodynamic ideal gas

Macroscopic account

The ideal gas law is an extension of experimentally discovered gas laws. Real fluids at low density and high temperature approximate the behavior of a classical ideal gas. However, at lower temperatures or a higher density, a real fluid deviates strongly from the behavior of an ideal gas, particularly as it condenses from a gas into a liquid or as it deposits from a gas into a solid. This deviation is expressed as a compressibility factor.

The classical thermodynamic properties of an ideal gas can be described by two equations of state:[5][6]

One of them is the well known ideal gas law
PV=nRT\,
where P is the pressure, V is the volume, n is the amount of substance of the gas (in moles), R is the gas constant, and T is the absolute temperature.
This equation is derived from Boyle's law: V = k/P (at constant T and n); Charles's law: V = bT (at constant P and n); and Avogadro's law: V = an (at constant T and P); where
  • k is a constant used in Boyle's law
  • b is a proportionality constant; equal to V/T
  • a is a proportionality constant; equal to V/n.
Combining the three laws (each proportionality holding with the remaining variables fixed) shows that V is proportional to Tn/P:
{\displaystyle V\propto {\frac {Tn}{P}}}.
Writing the proportionality constant as R, the gas constant, gives
V=R\left({\frac {Tn}{P}}\right) ;
that is,
PV=nRT.
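The molar volumes quoted in the introduction follow directly from PV = nRT. A minimal sketch; the helper name `ideal_gas_volume` is illustrative:

```python
R = 8.314462618   # molar gas constant, J/(mol K)

def ideal_gas_volume(n, T, P):
    # Solve PV = nRT for V (SI units: mol, K, Pa -> m^3)
    return n * R * T / P

# Molar volume at IUPAC STP (273.15 K, 1e5 Pa), converted to litres
V_stp = ideal_gas_volume(1.0, 273.15, 1.0e5) * 1000.0
# Molar volume at the pre-1982 STP (273.15 K, 1 atm = 101325 Pa)
V_old = ideal_gas_volume(1.0, 273.15, 101325.0) * 1000.0
```

This reproduces 22.711 L and 22.414 L respectively, matching the values given earlier.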
The other equation of state of an ideal gas must express Joule's law, that the internal energy of a fixed mass of ideal gas is a function only of its temperature. For the present purposes it is convenient to postulate an exemplary version of this law by writing:
U={\hat {c}}_{V}nRT
where {\hat {c}}_{V} is the dimensionless specific heat capacity at constant volume, approximately 3/2 for a monatomic gas and 5/2 for a diatomic gas (see the heat capacity section below).

Microscopic model

In order to switch from macroscopic quantities (left hand side of the following equation) to microscopic ones (right hand side), we use
{\displaystyle nR=Nk_{\mathrm {B} }\ }
where
  • N is the number of gas particles
  • kB is the Boltzmann constant (1.381×10−23 J·K−1).
The probability distribution of particles by velocity or energy is given by the Maxwell speed distribution.

The ideal gas model depends on the following assumptions:
  • The molecules of the gas are indistinguishable, small, hard spheres
  • All collisions are elastic and all motion is frictionless (no energy loss in motion or collision)
  • Newton's laws apply
  • The average distance between molecules is much larger than the size of the molecules
  • The molecules are constantly moving in random directions with a distribution of speeds
  • There are no attractive or repulsive forces between the molecules apart from those that determine their point-like collisions
  • The only forces between the gas molecules and the surroundings are those that determine the point-like collisions of the molecules with the walls
  • In the simplest case, there are no long-range forces between the molecules of the gas and the surroundings.
The assumption of spherical particles is necessary so that there are no rotational modes allowed, unlike in a diatomic gas. The following three assumptions are very related: molecules are hard, collisions are elastic, and there are no inter-molecular forces. The assumption that the space between particles is much larger than the particles themselves is of paramount importance, and explains why the ideal gas approximation fails at high pressures.

Heat capacity

The heat capacity at constant volume of any gas, including an ideal gas, is:
{\hat {c}}_{V}={\frac {1}{nR}}T\left({\frac {\partial S}{\partial T}}\right)_{V}={\frac {1}{nR}}\left({\frac {\partial U}{\partial T}}\right)_{V}
where S is the entropy. This is the dimensionless heat capacity at constant volume, which is generally a function of temperature due to intermolecular forces. For moderate temperatures, the constant for a monatomic gas is ĉV = 3/2 while for a diatomic gas it is ĉV = 5/2. It is seen that macroscopic measurements on heat capacity provide information on the microscopic structure of the molecules.

The corresponding dimensionless heat capacity at constant pressure of an ideal gas is:
{\displaystyle {\hat {c}}_{P}={\frac {1}{nR}}T\left({\frac {\partial S}{\partial T}}\right)_{P}={\frac {1}{nR}}\left({\frac {\partial H}{\partial T}}\right)_{P}={\hat {c}}_{V}+1}
where H = U + PV is the enthalpy of the gas.

Sometimes, a distinction is made between an ideal gas, where ĉV and ĉP could vary with temperature, and a perfect gas, for which this is not the case.

The ratio of the constant-pressure to constant-volume heat capacity is
\gamma ={\frac {c_{P}}{c_{V}}}
For air, which is a mixture of gases, this ratio is 1.4.
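Using ĉP = ĉV + 1, the ratio can be computed from the dimensionless constant-volume heat capacity alone. A sketch; the helper name is illustrative:

```python
def gamma_ratio(c_v_hat):
    # Adiabatic index from the dimensionless constant-volume heat capacity,
    # using c_P_hat = c_V_hat + 1 for an ideal gas
    return (c_v_hat + 1.0) / c_v_hat

gamma_mono = gamma_ratio(3.0 / 2.0)   # monatomic: 5/3
gamma_di = gamma_ratio(5.0 / 2.0)     # diatomic: 7/5 = 1.4, matching the value quoted for air
```

The diatomic value 1.4 is why air, dominated by N2 and O2, has this ratio.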

Entropy

Using the results of thermodynamics only, we can go a long way in determining the expression for the entropy of an ideal gas. This is an important step since, according to the theory of thermodynamic potentials, if we can express the entropy as a function of U (U is a thermodynamic potential), volume V and the number of particles N, then we will have a complete statement of the thermodynamic behavior of the ideal gas. We will be able to derive both the ideal gas law and the expression for internal energy from it.

Since the entropy is an exact differential, using the chain rule, the change in entropy when going from a reference state 0 to some other state with entropy S may be written as ΔS where:
\Delta S=\int _{S_{0}}^{S}dS=\int _{T_{0}}^{T}\left({\frac {\partial S}{\partial T}}\right)_{V}\!dT+\int _{V_{0}}^{V}\left({\frac {\partial S}{\partial V}}\right)_{T}\!dV
where the reference variables may be functions of the number of particles N. Using the definition of the heat capacity at constant volume for the first differential and the appropriate Maxwell relation for the second we have:
{\displaystyle \Delta S=\int _{T_{0}}^{T}{\frac {C_{V}}{T}}\,dT+\int _{V_{0}}^{V}\left({\frac {\partial P}{\partial T}}\right)_{V}dV.}
Expressing CV in terms of ĉV as developed in the above section, differentiating the ideal gas equation of state, and integrating yields:
\Delta S={\hat {c}}_{V}Nk\ln \left({\frac {T}{T_{0}}}\right)+Nk\ln \left({\frac {V}{V_{0}}}\right)
which implies that the entropy may be expressed as:
{\displaystyle S=Nk\ln \left({\frac {VT^{{\hat {c}}_{V}}}{f(N)}}\right)}
where all constants have been incorporated into the logarithm as f(N) which is some function of the particle number N having the same dimensions as VTĉV in order that the argument of the logarithm be dimensionless. We now impose the constraint that the entropy be extensive. This will mean that when the extensive parameters (V and N) are multiplied by a constant, the entropy will be multiplied by the same constant. Mathematically:
S(T,aV,aN)=aS(T,V,N).\,
From this we find an equation for the function f(N)
af(N)=f(aN).\,
Differentiating this with respect to a, setting a equal to 1, and then solving the differential equation yields f(N):
f(N)=\Phi N\,
where Φ may vary for different gases, but will be independent of the thermodynamic state of the gas. It will have the dimensions of VTĉV/N. Substituting into the equation for the entropy:
{\displaystyle {\frac {S}{Nk}}=\ln \left({\frac {VT^{{\hat {c}}_{V}}}{N\Phi }}\right).\,}
and using the expression for the internal energy of an ideal gas, the entropy may be written:
{\displaystyle {\frac {S}{Nk}}=\ln \left[{\frac {V}{N}}\,\left({\frac {U}{{\hat {c}}_{V}kN}}\right)^{{\hat {c}}_{V}}\,{\frac {1}{\Phi }}\right]}
Since this is an expression for entropy in terms of U, V, and N, it is a fundamental equation from which all other properties of the ideal gas may be derived.

This is about as far as we can go using thermodynamics alone. Note that the above equation is flawed — as the temperature approaches zero, the entropy approaches negative infinity, in contradiction to the third law of thermodynamics. In the above "ideal" development, there is a critical point, not at absolute zero, at which the argument of the logarithm becomes unity, and the entropy becomes zero. This is unphysical. The above equation is a good approximation only when the argument of the logarithm is much larger than unity — the concept of an ideal gas breaks down at low values of V/N. Nevertheless, there will be a "best" value of the constant in the sense that the predicted entropy is as close as possible to the actual entropy, given the flawed assumption of ideality. A quantum-mechanical derivation of this constant is developed in the derivation of the Sackur–Tetrode equation which expresses the entropy of a monatomic (ĉV = 3/2) ideal gas. In the Sackur–Tetrode theory the constant depends only upon the mass of the gas particle. The Sackur–Tetrode equation also suffers from a divergent entropy at absolute zero, but is a good approximation for the entropy of a monatomic ideal gas for high enough temperatures.
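The entropy-change formula ΔS = ĉV Nk ln(T/T0) + Nk ln(V/V0) derived above can be exercised on two familiar special cases. A sketch in units of Nk; the helper name is illustrative:

```python
import math

def delta_S_over_Nk(c_v_hat, T0, T, V0, V):
    # Entropy change between states (T0, V0) and (T, V), in units of Nk
    return c_v_hat * math.log(T / T0) + math.log(V / V0)

# Isothermal doubling of the volume: Delta S = Nk ln 2, independent of c_v_hat
dS_iso = delta_S_over_Nk(1.5, 300.0, 300.0, 1.0, 2.0)
# Reversible adiabat of a monatomic gas (T V^(2/3) = const): Delta S = 0
dS_adiabat = delta_S_over_Nk(1.5, 300.0, 300.0 / 2.0 ** (2.0 / 3.0), 1.0, 2.0)
```

The adiabatic case returning exactly zero is a useful sanity check: the temperature drop on expansion cancels the volume term.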

Thermodynamic potentials

Expressing the entropy as a function of T, V, and N:
{\frac {S}{kN}}=\ln \left({\frac {VT^{{\hat {c}}_{V}}}{N\Phi }}\right)
The chemical potential of the ideal gas is calculated from the corresponding equation of state (see thermodynamic potential):
\mu =\left({\frac {\partial G}{\partial N}}\right)_{T,P}
where G is the Gibbs free energy and is equal to U + PV − TS so that:
\mu (T,V,N)=kT\left({\hat {c}}_{P}-\ln \left({\frac {VT^{{\hat {c}}_{V}}}{N\Phi }}\right)\right)
The thermodynamic potentials for an ideal gas can now be written as functions of T, V, and N as:
U={\hat {c}}_{V}NkT\,
A=U-TS\,=\mu N-NkT\,
H=U+PV\,={\hat {c}}_{P}NkT\,
G=U+PV-TS\,=\mu N\,
where, as before,
{\hat {c}}_{P}={\hat {c}}_{V}+1.
The most informative way of writing the potentials is in terms of their natural variables, since each of these equations can be used to derive all of the other thermodynamic variables of the system. In terms of their natural variables, the thermodynamic potentials of a single-species ideal gas are:
U(S,V,N)={\hat {c}}_{V}Nk\left({\frac {N\Phi }{V}}\,e^{S/Nk}\right)^{1/{\hat {c}}_{V}}
A(T,V,N)=NkT\left({\hat {c}}_{V}-\ln \left({\frac {VT^{{\hat {c}}_{V}}}{N\Phi }}\right)\right)
H(S,P,N)={\hat {c}}_{P}Nk\left({\frac {P\Phi }{k}}\,e^{S/Nk}\right)^{1/{\hat {c}}_{P}}
G(T,P,N)=NkT\left({\hat {c}}_{P}-\ln \left({\frac {kT^{{\hat {c}}_{P}}}{P\Phi }}\right)\right)

In statistical mechanics, the relationship between the Helmholtz free energy and the partition function is fundamental, and is used to calculate the thermodynamic properties of matter; see configuration integral for more details.

Speed of sound

The speed of sound in an ideal gas is given by
{\displaystyle c_{\text{sound}}={\sqrt {\left({\frac {\partial P}{\partial \rho }}\right)_{s}}}={\sqrt {\frac {\gamma P}{\rho }}}={\sqrt {\frac {\gamma RT}{M}}}}
where
  • γ is the adiabatic index (ĉP/ĉV),
  • s is the entropy per particle of the gas,
  • ρ is the mass density of the gas,
  • P is the pressure of the gas,
  • R is the universal gas constant,
  • T is the temperature,
  • M is the molar mass of the gas.
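Plugging in illustrative values for dry air (treated as a diatomic ideal gas) reproduces the familiar speed of sound near room temperature. A sketch:

```python
import math

def sound_speed(gamma, T, M, R=8.314462618):
    # c = sqrt(gamma * R * T / M), SI units (K, kg/mol -> m/s)
    return math.sqrt(gamma * R * T / M)

# Dry air treated as a diatomic ideal gas: gamma = 1.4, M ~ 0.0289647 kg/mol (illustrative)
c_air = sound_speed(1.4, 293.15, 0.0289647)
```

This gives roughly 343 m/s at 20 °C, in line with the measured speed of sound in air.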

Ideal quantum gases

In the above-mentioned Sackur–Tetrode equation, the best choice of the entropy constant was found to be proportional to the quantum thermal wavelength of a particle, and the point at which the argument of the logarithm becomes unity is roughly equal to the point at which the average distance between particles becomes equal to the thermal wavelength. In fact, quantum theory itself predicts the same thing. Any gas behaves as an ideal gas at high enough temperature and low enough density, but at the point where the Sackur–Tetrode equation begins to break down, the gas will begin to behave as a quantum gas, composed of either bosons or fermions. (See the gas in a box article for a derivation of the ideal quantum gases, including the ideal Boltzmann gas.)

Gases tend to behave as an ideal gas over a wider range of pressures when the temperature reaches the Boyle temperature.

Ideal Boltzmann gas

The ideal Boltzmann gas yields the same results as the classical thermodynamic gas, but makes the following identification for the undetermined constant Φ:
{\displaystyle \Phi ={\frac {T^{\frac {3}{2}}\Lambda ^{3}}{g}}}
where Λ is the thermal de Broglie wavelength of the gas and g is the degeneracy of states.

Ideal Bose and Fermi gases

An ideal gas of bosons (e.g. a photon gas) will be governed by Bose–Einstein statistics and the distribution of energy will be in the form of a Bose–Einstein distribution. An ideal gas of fermions will be governed by Fermi–Dirac statistics and the distribution of energy will be in the form of a Fermi–Dirac distribution.
