Entropy is a Slippery Concept; It doesn’t have to be
Entropy is associated with the Second Law of Thermodynamics, which says that the entropy of an isolated system never decreases. But what is entropy? Few physical properties are as confusing as entropy. The reason is simple: physics cannot define entropy as a fundamental property of state. That failure leaves entropy open to confusion and to competing interpretations.
It wasn’t always so.
Before physics subsumed thermodynamics as a statistical approximation of mechanics, there was a simple interpretation of the Second Law: “useful” energy (meaning available for useful work) is irreversibly dissipated to ambient heat, which has no availability for work. Simply stated, real processes irreversibly dissipate energy. The Second Law is also commonly expressed as saying heat flows irreversibly from hot to cold. But since hotter heat can do more work than cooler heat, such as by a steam engine, this is just another example of dissipation.
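The claim that hotter heat can do more work than cooler heat can be made concrete with the Carnot limit, W = q(T − Tₐ)/T, which bounds the work extractable from heat q at temperature T against an ambient at Tₐ. The numbers below are my own illustration, not the essay’s:

```python
# Carnot limit: the maximum work extractable from heat q at temperature T,
# rejecting waste heat to ambient surroundings at temperature Ta, is
#   W_max = q * (T - Ta) / T   (temperatures in kelvins).

def max_work(q, T, Ta):
    """Maximum (Carnot) work from heat q at temperature T against ambient Ta."""
    return q * (T - Ta) / T

Ta = 300.0   # ambient surroundings, K
q = 1000.0   # joules of heat

hot = max_work(q, 600.0, Ta)    # heat held at 600 K
cool = max_work(q, 350.0, Ta)   # the same heat after flowing to a 350 K body

print(hot)   # 500.0 J available from the hot heat
print(cool)  # ~142.9 J available from the cooler heat
```

Letting the heat flow from 600 K to 350 K destroys most of its availability for work, even though no energy is lost: dissipation without disappearance.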
Thermodynamics describes a system’s heat (q) as the energy that is not available for work at the system’s temperature. The thermal exergy (Xₜ) of heat is its potential work on the ambient surroundings. The remaining thermal energy is ambient heat (Q), which has no potential for work on the ambient surroundings. These quantities are related by:
Xₜ = q’(T - Tₐ)/T;
Q = q’(Tₐ/T); and
q’ = Xₜ + Q,
where q’ is the thermal energy of heat at temperature T relative to heat at the ambient temperature Tₐ, less than or equal to T.
The equations show that a gas in equilibrium with its ambient surroundings (T=Tₐ) has no thermal exergy and its heat can do no work. The same gas at the same temperature, but as it exists with respect to surroundings at absolute zero (Tₐ=0), has zero ambient heat and its thermal energy is all exergy.
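The split of thermal energy into exergy and ambient heat, and the two limiting cases just described, can be checked directly from the relations above (illustrative numbers, mine):

```python
# The exergy / ambient-heat split of thermal energy, per the relations above:
#   Xt = q' (T - Ta) / T      thermal exergy (available for work)
#   Q  = q' (Ta / T)          ambient heat (unavailable for work)
#   q' = Xt + Q

def split_heat(q_prime, T, Ta):
    """Split q' at system temperature T into (exergy, ambient heat) w.r.t. Ta."""
    Xt = q_prime * (T - Ta) / T
    Q = q_prime * Ta / T
    return Xt, Q

q_prime, T = 1000.0, 400.0   # joules, kelvins

# Equilibrium with the surroundings (T = Ta): no exergy, all ambient heat.
Xt, Q = split_heat(q_prime, T, Ta=T)
assert Xt == 0.0 and Q == q_prime

# Surroundings at absolute zero (Ta = 0): no ambient heat, all exergy.
Xt, Q = split_heat(q_prime, T, Ta=0.0)
assert Q == 0.0 and Xt == q_prime

# In between, the two parts always sum to q'.
Xt, Q = split_heat(q_prime, T, Ta=300.0)
assert abs((Xt + Q) - q_prime) < 1e-9
```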
Thermal exergy, ambient heat, and q’ are contextual properties, meaning they depend on the system’s ambient surroundings from which they are reversibly measured. Given the ambient temperature of the system’s surroundings, exergy and ambient heat are well defined, and thermodynamics’ interpretation of the Second Law as the dissipation of exergy is straightforward. Exergy is dissipated, and the increase in entropy (ΔS) is simply a measure of dissipation, given by ΔS = −ΔX/Tₐ.
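A one-line numerical sketch of the dissipation measure ΔS = −ΔX/Tₐ, with illustrative numbers of my own choosing:

```python
# Entropy increase from dissipation at a fixed ambient temperature:
#   dS = -dX / Ta
# where dX is the (negative) change in exergy as it dissipates to ambient heat.

def entropy_from_dissipation(dX, Ta):
    """Entropy increase from an exergy change dX against an ambient at Ta kelvins."""
    return -dX / Ta

dS = entropy_from_dissipation(dX=-200.0, Ta=300.0)  # 200 J dissipated at 300 K
print(dS)  # 0.666... J/K; the dissipated exergy reappears as ambient heat Ta * dS
```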
The ultimate isolated system is the universe itself. The thermodynamic interpretation of the Second Law states that the universe’s exergy is continuously and inexorably dissipated. High-exergy hydrogen is fused to lower-exergy elements; matter collapses into black holes; black holes eventually evaporate; and mass is eventually dispersed as photons. Ultimately, the universe is fated to end in heat death, when useful energy is fully dissipated and there is no potential for directed activity.
When mechanics subsumed thermodynamics as statistical mechanics, it ran into a problem. Mechanics defines a system’s actual physical state non-contextually, as it exists in isolation. Mechanics defines a system’s microstate by perfect observation in the absence of thermal noise, which is to say at absolute zero. For measurement at absolute zero, there is no ambient heat, no dissipation of exergy, and no fundamental irreversibility. Mechanics describes friction as the dispersion of mechanical energy to particles too small to resolve. Mechanical energy is dispersed, but mechanical energy and exergy are conserved. The contextuality of exergy and ambient heat is incompatible with the non-contextuality of mechanics, and they are not fundamental properties of the mechanical microstate.
Mechanics attempted to resolve this problem by replacing the contextual properties of exergy and ambient heat with entropy, which it defined by:
S = ∫ dq/T.
The integral sums the increments of dq/T as the system is incrementally heated from absolute zero to the system temperature Tₛ.
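As a sketch of the integral definition, assume (purely for illustration) a constant heat capacity C over the integration range; the sum must then start just above absolute zero, since a constant C would make the integral diverge at T = 0. The analytic result is C·ln(Tₛ/T₀), and a simple midpoint sum agrees:

```python
import math

# Entropy from the Clausius integral S = ∫ dq/T with dq = C dT, assuming
# (for illustration only) a constant heat capacity C from T0 up to Ts.

def entropy_increase(C, T0, Ts, steps=100_000):
    """Midpoint Riemann sum of C dT / T over the range T0..Ts (kelvins)."""
    dT = (Ts - T0) / steps
    S = 0.0
    for i in range(steps):
        T = T0 + (i + 0.5) * dT   # midpoint of each temperature increment
        S += C * dT / T           # increment dq/T
    return S

C, T0, Ts = 10.0, 1.0, 300.0      # J/K heat capacity; kelvins
numeric = entropy_increase(C, T0, Ts)
analytic = C * math.log(Ts / T0)

print(numeric, analytic)          # both ≈ 57.04 J/K
```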
Heat and entropy are defined relative to absolute zero, but they are still defined by measurement at the system’s temperature, and they are still contextual properties of the thermodynamic macrostate. The thermodynamic macrostate is defined by properties, such as temperature, pressure, and volume, that are measurable at the system’s temperature. Entropy is therefore not a property of the microstate. Physics’ failure to formally define entropy as a fundamental property of the underlying physical state has led to contextual interpretations of entropy in terms of an observer’s ignorance of the actual state, or missing information. These interpretations are incompatible with physics’ assumption of a non-contextual reality, definable in isolation and independent of its surroundings or observation.
Boltzmann formalized the contextual interpretation of entropy with his statistical mechanical definition of entropy, given by:
S=k ln(W),
where W is the number of microstates consistent with the thermodynamic macrostate description. W is commonly described as a measure of the system’s disorder. Boltzmann’s entropy is equal to the thermodynamic entropy. The Equal a priori Probability Assumption states that all microstates are equally probable. It then follows that a higher-entropy macrostate, comprising more microstates, is more probable than a lower-entropy macrostate comprising fewer. This has led to the interpretation of the Second Law as stating that a system evolves toward states of higher probability and disorder.
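A toy illustration of S = k ln(W): the particle-counting example below is mine, not the essay’s. Take N distinguishable particles in a box and let the macrostate be "n particles in the left half"; then W is the binomial count of microstates consistent with that description:

```python
import math

# Boltzmann entropy S = k ln(W) for a toy macrostate: N distinguishable
# particles in a box, macrostate = "n particles in the left half".
# W = C(N, n) is the number of microstates consistent with that macrostate.

k = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(N, n):
    W = math.comb(N, n)       # microstates consistent with the macrostate
    return k * math.log(W)

N = 100
S_corner = boltzmann_entropy(N, 0)     # all particles on one side: W = 1
S_uniform = boltzmann_entropy(N, 50)   # evenly spread: W is maximal

print(S_corner)               # 0.0 (ln 1 = 0): one microstate, zero entropy
print(S_uniform > S_corner)   # True: the uniform macrostate has higher entropy
```

The uniform macrostate comprises vastly more microstates than the compressed one, which is exactly why expansion is overwhelmingly probable under the Equal a priori Probability Assumption.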
The mechanical interpretation of the Second Law is statistical. If a gas is initially compressed in a small corner of its container, it will, with overwhelming probability, expand uniformly throughout the container, maximizing its disorder. The underlying physical state, however, is governed by the deterministic and time-symmetric laws of physics. Given sufficient time, those laws predict that the gas would eventually and momentarily recompress itself into a small corner of its container. The mechanical interpretation of physics therefore cannot accommodate fundamental irreversibility.
Mechanics’ assumption of perfect measurement from absolute zero is an unattainable idealization. In the essay Reinventing Time, I describe the Dissipative Conceptual Model (DCM) of a system’s physical state as it contextually exists with respect to the system’s actual ambient surroundings. Even the universe as a whole has a positive ambient temperature, defined by its Cosmic Microwave Background (CMB) at 2.7 kelvins. The DCM defines entropy as a fundamental contextual property of a system’s physical state, measured with respect to its actual ambient surroundings rather than from absolute zero.
The DCM entropy is contextual, but it is independent of observation or the existence of observers, and it is objectively defined as a fundamental property of state.
The contextual definition of entropy leads to two distinct paths of increasing entropy. The first path describes the dissipation of exergy. For a fixed ambient temperature, the increase in DCM entropy is measured by the dissipation of exergy. This is the original interpretation of the Second Law of thermodynamics as the path of dissipation.
The second path of increasing entropy occurs when the ambient temperature declines. As the ambient temperature declines, the spread between ambient and system temperatures increases, and the DCM entropy therefore increases. A decline in ambient temperature leads to the fine-graining of a contextually defined physical state. The increase in DCM entropy measures the increase in the number of potential fine-grained states available to a system as it re-equilibrates at its cooler ambient temperature. This defines the path of refinement.
The DCM provides a very different interpretation of the universe’s evolution compared to thermodynamic heat death. As the universe expands and its CMB cools, the declining ambient temperature creates new fine-grained potentialities. Random selection of potentialities actualizes the present from an indeterminate future. At the same time, as the ambient temperature cools, thermal energy is converted to exergy. Exergy provides the drive for dissipative processes and the creation of dissipative structures. As long as the universe expands, exergy will be created and the universe will continue to evolve.
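The conversion of thermal energy to exergy as the ambient cools follows directly from the exergy relation given earlier: the exergy fraction of heat is (T − Tₐ)/T. A small sketch with illustrative numbers of my own (holding the system temperature fixed while Tₐ falls):

```python
# Exergy fraction of heat at system temperature T against ambient Ta,
# per the relation Xt = q' (T - Ta) / T given earlier in the essay.
# As Ta declines, ambient heat is recast as exergy.

def exergy_fraction(T, Ta):
    """Fraction of q' that is exergy, for system temperature T and ambient Ta."""
    return (T - Ta) / T

T = 300.0   # system temperature, K (illustrative)
for Ta in (300.0, 150.0, 30.0, 2.7):   # 2.7 K is roughly today's CMB temperature
    print(Ta, exergy_fraction(T, Ta))
# The fraction rises from 0 at equilibrium toward 0.991 at a 2.7 K ambient.
```

So long as the ambient keeps cooling, a fixed stock of thermal energy keeps yielding new exergy, which is the essay’s point about continued cosmic evolution.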