Thermodynamic Entropy
Thermodynamic entropy is the macroscopic quantity S that measures how widely a physical system is spread across its accessible microstates. It was formally defined by Clausius (1865) as the ratio of reversibly exchanged heat to absolute temperature (dS = δQ_rev / T) and reinterpreted by Boltzmann (1877) as the logarithm of the number of microstates consistent with a given macrostate: S = k_B ln W. The two definitions agree for systems in equilibrium but illuminate different things: Clausius entropy is operational (it tells you what to measure), while Boltzmann entropy is explanatory (it tells you what entropy is).
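Boltzmann's counting definition can be made concrete with a toy model. The following is a minimal sketch, assuming a system of N independent two-state spins whose macrostate is the number of up-spins, so that W is a binomial coefficient; the function name and example values are illustrative, not taken from the text above.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def boltzmann_entropy(n_spins: int, n_up: int) -> float:
    """S = k_B ln W for a macrostate of n_spins two-state spins
    with n_up pointing up; W is the binomial coefficient counting
    the microstates consistent with that macrostate."""
    w = math.comb(n_spins, n_up)  # number of compatible microstates
    return K_B * math.log(w)

# The half-up macrostate is compatible with vastly more microstates
# than the all-up macrostate, so it carries far more entropy.
print(boltzmann_entropy(100, 50))   # ≈ 9.2e-22 J/K  (W ≈ 1.0e29)
print(boltzmann_entropy(100, 100))  # 0.0  (W = 1: a unique microstate)
```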
Thermodynamic entropy is not to be confused with Shannon entropy, though the mathematical forms are identical up to a constant. Shannon entropy measures uncertainty about the outcome of a random variable; thermodynamic entropy measures the information that would be required to specify a physical system's microstate given its macrostate. The connection is not merely formal: Landauer's principle establishes that erasing one bit of information must increase the thermodynamic entropy of the environment by at least k_B ln 2, creating a hard bridge between the informational and physical quantities.
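Landauer's bound is easy to evaluate numerically. A minimal sketch, assuming erasure at a fixed temperature T so that the minimum dissipated heat is Q ≥ (bits) × k_B T ln 2; the helper name and the example figures are illustrative.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def landauer_limit(temperature_k: float, bits: float = 1.0) -> float:
    """Minimum heat (in joules) dissipated into the environment by
    erasing `bits` bits at temperature `temperature_k`:
    Q >= bits * k_B * T * ln 2."""
    return bits * K_B * temperature_k * math.log(2)

# Erasing one bit at room temperature (300 K) costs at least ~2.9e-21 J,
# many orders of magnitude below what present-day hardware dissipates.
print(landauer_limit(300.0))        # ≈ 2.87e-21 J per bit
print(landauer_limit(300.0, 8e9))   # erasing a gigabyte: still only ≈ 2.3e-11 J
```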
The second law of thermodynamics asserts that the entropy of an isolated system never decreases, a statement Boltzmann showed is statistical rather than absolute: an entropy decrease is overwhelmingly improbable, not forbidden. This statistical character is the source of the deepest puzzle in statistical mechanics: why did the universe begin in an anomalously low-entropy state, and what exactly is the connection between entropy increase and the arrow of time?
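The improbability, rather than impossibility, of entropy decrease can be quantified with a standard back-of-the-envelope estimate: for N molecules placed independently and uniformly in a box, the chance that all of them happen to sit in the left half (a large spontaneous entropy drop) is (1/2)^N. A minimal sketch of that arithmetic, with illustrative names and inputs.

```python
import math

def log10_prob_all_left(n_molecules: float) -> float:
    """log10 of the probability that n independent molecules, each equally
    likely to be in either half of a box, are all in the left half:
    P = (1/2)^n, so log10(P) = -n * log10(2). Returning the logarithm
    avoids floating-point underflow for realistic particle counts."""
    return -n_molecules * math.log10(2)

# Even for 100 molecules the fluctuation is effectively impossible;
# for a mole of gas the exponent is astronomically negative.
print(log10_prob_all_left(100))      # ≈ -30.1, i.e. P ≈ 10^-30
print(log10_prob_all_left(6.02e23))  # ≈ -1.8e23, i.e. P ≈ 10^(-1.8e23)
```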