In physics, an entropic force acting in a system is a force resulting from the entire system's thermodynamic tendency to increase its entropy, rather than from a particular underlying microscopic force.[1]
For instance, the internal energy of an ideal gas depends only on its temperature, not on the volume of its containing box, so no energy effect tends to expand the box the way gas pressure does. This implies that the pressure of an ideal gas has an entropic origin.[2]
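The entropic origin of ideal-gas pressure can be made concrete with a short numerical check (a sketch with illustrative values, not taken from the cited sources): writing the volume-dependent part of the entropy as S(V) = Nk ln V, the pressure recovered as T times the entropy's volume derivative reproduces the ideal gas law P = NkT/V.

```python
# Sketch: pressure of an ideal gas recovered from the volume dependence of
# its entropy, S(V) = N*k*ln(V) + terms independent of V.
# P = T * dS/dV then reproduces the ideal gas law P = N*k*T/V.

import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy_volume_part(V, N):
    """Volume-dependent part of the ideal-gas entropy (additive constants dropped)."""
    return N * k * math.log(V)

def entropic_pressure(V, N, T, dV=1e-9):
    """P = T * dS/dV, evaluated with a central finite difference."""
    dS = entropy_volume_part(V + dV, N) - entropy_volume_part(V - dV, N)
    return T * dS / (2 * dV)

# Illustrative values: one mole at room temperature in ~24.8 L
N, T, V = 6.022e23, 300.0, 0.0248
print(entropic_pressure(V, N, T))  # ≈ 1.006e5 Pa, about 1 atm
```

No microscopic force law enters the calculation; the pressure emerges purely from how the number of available microstates grows with volume.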
What is the origin of such an entropic force? The most general answer is that the effect of thermal fluctuations tends to bring a thermodynamic system toward a macroscopic state that corresponds to a maximum in the number of microscopic states (or micro-states) that are compatible with this macroscopic state. In other words, thermal fluctuations tend to bring a system toward its macroscopic state of maximum entropy.[2]
Mathematical formulation
In the canonical ensemble, the entropic force associated with a macrostate partition is given by:[3][4]

$$\mathbf{F}(\mathbf{X}_0) = T \, \nabla_{\mathbf{X}} S(\mathbf{X}) \big|_{\mathbf{X}_0}$$

where $T$ is the temperature, $S(\mathbf{X})$ is the entropy associated with the macrostate $\mathbf{X}$, and $\mathbf{X}_0$ is the present macrostate.
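As an illustration of the canonical-ensemble expression (our sketch, with invented parameter values), the Gaussian approximation for a freely-jointed chain has end-to-end entropy S(r) = S0 − 3kr²/(2Nb²), so the entropic force F = T dS/dr comes out as a linear restoring force:

```python
# Sketch: the entropic force F = T * dS/dr evaluated for a freely-jointed
# (Gaussian) chain with entropy S(r) = S0 - 3*k*r**2 / (2*N*b**2).
# The segment count N and segment length b below are illustrative.

k = 1.380649e-23  # Boltzmann constant, J/K

def chain_entropy(r, N, b):
    """Entropy of an ideal chain versus end-to-end distance r
    (additive constant S0 dropped; valid for r much smaller than N*b)."""
    return -3 * k * r**2 / (2 * N * b**2)

def dS_dr(S, r, *args, dr=1e-12):
    """Central finite-difference derivative of the entropy S at r."""
    return (S(r + dr, *args) - S(r - dr, *args)) / (2 * dr)

T, N, b = 300.0, 100, 1e-9   # room temperature, 100 segments of 1 nm
r = 10e-9                    # end-to-end distance of 10 nm
F = T * dS_dr(chain_entropy, r, N, b)
# Analytic result: F = -3*k*T*r / (N*b**2), a linear restoring force
print(F)  # ≈ -1.24e-12 N, pulling the two ends together
```

The negative sign indicates the force pulls the ends together, in line with the polymer example in the next section.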
Examples
Brownian motion
The entropic approach to Brownian motion was initially proposed by R. M. Neumann.[3][5] Neumann derived the entropic force for a particle undergoing three-dimensional Brownian motion using the Boltzmann equation, denoting this force as a diffusional driving force or radial force. In the paper, three example systems are shown to exhibit such a force.
Polymers
A standard example of an entropic force is the elasticity of a freely-jointed polymer molecule.[5] For an ideal chain, maximizing its entropy means reducing the distance between its two free ends. Consequently, the ideal chain exerts a force between its two free ends that tends to pull them together. This entropic force is proportional to the distance between the two ends.[2][6] The entropic force exerted by a freely-jointed chain has a clear mechanical origin and can be computed using constrained Lagrangian dynamics.[7]
Hydrophobic force
Another example of an entropic force is the hydrophobic force. At room temperature, it partly originates from the loss of entropy by the 3D network of water molecules when they interact with molecules of a dissolved substance. Each water molecule is capable of:
- donating two hydrogen bonds through the two protons
- accepting two more hydrogen bonds through the two sp3-hybridized lone pairs
If the introduced surface has an ionic or polar nature, water molecules can stand upright on one (along the axis of an orbital, for an ionic bond) or two (along a resultant polarity axis) of the four sp3 orbitals.[8] These orientations permit easy movement, i.e. they retain degrees of freedom, and thus lower the entropy only minimally. A non-hydrogen-bonding surface with moderate curvature, however, forces the water molecules to sit tightly against the surface, spreading three hydrogen bonds tangential to it, which then become locked in a clathrate-like basket shape. Water molecules involved in this clathrate-like basket around the non-hydrogen-bonding surface are constrained in their orientation. Thus, any event that minimizes such a surface is entropically favored. For example, when two such hydrophobic particles come very close, the clathrate-like baskets surrounding them merge. This releases some of the water molecules into the bulk of the water, leading to an increase in entropy.
Another related and counter-intuitive example of an entropic force is protein folding, a spontaneous process in which the hydrophobic effect also plays a role.[9] Structures of water-soluble proteins typically have a core in which hydrophobic side chains are buried from water, which stabilizes the folded state.[10] Charged and polar side chains are situated on the solvent-exposed surface, where they interact with surrounding water molecules. Minimizing the number of hydrophobic side chains exposed to water is the principal driving force behind the folding process,[10][11][12] although formation of hydrogen bonds within the protein also stabilizes protein structure.[13][14]
Colloids
Entropic forces are important and widespread in the physics of colloids,[15] where they are responsible for the depletion force and the ordering of hard particles, such as the crystallization of hard spheres, the isotropic-nematic transition in liquid crystal phases of hard rods, and the ordering of hard polyhedra.[15][16] Entropic forces arise in colloidal systems due to the osmotic pressure that comes from particle crowding. This was first discovered in, and is most intuitive for, colloid-polymer mixtures described by the Asakura-Oosawa model. In this model, polymers are approximated as finite-sized spheres that can penetrate one another but cannot penetrate the colloidal particles. The inability of the polymers to penetrate the colloids leads to a region around each colloid in which the polymer density is reduced. If the regions of reduced polymer density around two colloids overlap, because the colloids have approached one another, the polymers in the system gain additional free volume equal to the volume of the intersection of the reduced-density regions. This additional free volume increases the entropy of the polymers and drives the colloids to form locally dense-packed aggregates. A similar effect occurs in sufficiently dense colloidal systems without polymers, where osmotic pressure also drives the local dense packing[15] of colloids into a diverse array of structures[16] that can be rationally designed by modifying the shape of the particles.[17]
Controversial examples
Some forces that are generally regarded as conventional forces have been argued to be actually entropic in nature. These theories remain controversial and are the subject of ongoing work. Matt Visser, professor of mathematics at Victoria University of Wellington, NZ, criticizes selected approaches in "Conservative Entropic Forces"[18] but generally concludes:

There is no reasonable doubt concerning the physical reality of entropic forces, and no reasonable doubt that classical (and semi-classical) general relativity is closely related to thermodynamics. Based on the work of Jacobson, Thanu Padmanabhan, and others, there are also good reasons to suspect a thermodynamic interpretation of the fully relativistic Einstein equations might be possible.
Gravity
In 2009, Erik Verlinde argued that gravity can be explained as an entropic force.[19] He claimed (similarly to Jacobson's result) that gravity is a consequence of the "information associated with the positions of material bodies". This model combines the thermodynamic approach to gravity with Gerard 't Hooft's holographic principle. It implies that gravity is not a fundamental interaction but an emergent phenomenon.[19]
Other forces
In the wake of the discussion started by Erik Verlinde, entropic explanations for other fundamental forces have been suggested,[18] including Coulomb's law[20][21][22] and the electroweak and strong forces.[23] The same approach was argued to explain dark matter, dark energy, and the Pioneer effect.[24]
Links to adaptive behavior
It has been argued that causal entropic forces lead to the spontaneous emergence of tool use and social cooperation.[25][26][27] Causal entropic forces by definition maximize entropy production between the present and a future time horizon, rather than greedily maximizing instantaneous entropy production like typical entropic forces.

A formal connection between the mathematical structure of the discovered laws of nature, intelligence, and entropy-like measures of complexity was noted in 2000 by Andrei Soklakov[28][29] in the context of Occam's razor principle.
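The distinction between greedy and causal entropic forces can be illustrated with a toy model (entirely our sketch; the box size, horizon, and dynamics are invented for illustration, and are far simpler than the models in the cited papers): a random walker confined between two walls is pushed toward the position from which the largest number of distinct future paths remains reachable, so the resulting drift points away from the walls rather than toward any instantaneous entropy gain.

```python
# Toy sketch of a "causal entropic force": a random walker on positions
# 0..L drifts toward the neighbor from which more distinct tau-step
# futures remain reachable, i.e. it keeps its future options open.

from functools import lru_cache

L = 10  # walls at positions 0 and L

@lru_cache(maxsize=None)
def num_paths(x, tau):
    """Number of distinct tau-step random-walk paths starting at x
    that never leave the box [0, L]."""
    if tau == 0:
        return 1
    total = 0
    for step in (-1, +1):
        nxt = x + step
        if 0 <= nxt <= L:
            total += num_paths(nxt, tau - 1)
    return total

def causal_drift(x, tau=8):
    """Sign of the causal drift at x: move toward the neighbor with
    more reachable futures (a tie gives zero drift)."""
    left = num_paths(x - 1, tau) if x > 0 else 0
    right = num_paths(x + 1, tau) if x < L else 0
    return (right > left) - (right < left)

# The drift points away from both walls, toward the middle of the box,
# where the number of open futures is largest.
print([causal_drift(x) for x in range(L + 1)])
# → [1, 1, 1, 1, 1, 0, -1, -1, -1, -1, -1]
```

A greedy rule that only compared the immediate number of allowed moves would see no difference between most interior positions; counting paths over the whole horizon is what produces the wall-avoiding drift.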