Thermodynamics of Information
The thermodynamics of information is the study of the physical relationship between information and thermodynamic quantities such as entropy, heat, and work. The central finding is that information is not a purely abstract entity: it is always encoded in physical states, and manipulating information has thermodynamic consequences that cannot be escaped by better engineering, only deferred or redistributed.
The field's key results include Landauer's Principle (erasing one bit of information in an environment at temperature T dissipates at least kT ln 2 of heat, where k is Boltzmann's constant), the resolution of Maxwell's Demon (the demon pays its thermodynamic cost when its memory is erased, not when it performs a measurement), and Charles Bennett's demonstration that reversible computation can in principle approach zero heat generation. Together these results establish a direct quantitative link between Shannon's information entropy and Boltzmann's thermodynamic entropy: not a metaphor, but the same quantity expressed in different units, related by the conversion factor k ln 2.
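A short numerical sketch makes the bound concrete (plain Python, standard library only; the choice of 300 K as the operating temperature is an illustrative assumption, and the function names are ours, not from any library):

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

    def landauer_bound(temperature_k):
        """Minimum heat, in joules, dissipated by erasing one bit at temperature_k."""
        return K_B * temperature_k * math.log(2)

    def shannon_bits_to_thermo_entropy(h_bits):
        """Convert Shannon entropy in bits to thermodynamic entropy in J/K.

        S = k * ln(2) * H_bits is the conversion behind the identity above.
        """
        return K_B * math.log(2) * h_bits

    T = 300.0  # room temperature in kelvin; an illustrative choice
    print(f"Erasing one bit at {T:.0f} K costs at least {landauer_bound(T):.3e} J")
    # prints roughly 2.871e-21 J
    print(f"One bit of Shannon entropy = {shannon_bits_to_thermo_entropy(1.0):.3e} J/K")
    # prints roughly 9.570e-24 J/K

At room temperature the bound is about 2.9e-21 J per bit; practical logic gates dissipate many orders of magnitude more, so the bound functions today as a fundamental floor rather than a binding engineering constraint.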
The practical implications extend to any physical system that stores and processes information: computers, biological neurons, and molecular machines all operate under the same thermodynamic constraints. A brain that learns is erasing old patterns and writing new ones; it pays thermodynamic rent at every update. The question of why biological neural computation is so much more energy-efficient than silicon computation for comparable cognitive output remains open, and the thermodynamics of information provides the framework within which any answer must be stated; the back-of-envelope sketch below fixes the scales involved. See also Physics of Computation, Reversible Computation, Quantum Computing, Maxwell's Demon.
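As promised above, a rough sketch comparing the brain's actual power draw with the Landauer floor implied by its activity. The 20 W whole-brain power figure is a standard estimate; the 10^14 synaptic events per second and the one-bit-erased-per-event accounting are deliberately crude assumptions, chosen only to fix orders of magnitude:

    import math

    K_B = 1.380649e-23              # Boltzmann constant, J/K
    T_BODY = 310.0                  # body temperature (~37 C) in kelvin
    BIT_COST_J = K_B * T_BODY * math.log(2)  # Landauer bound per bit at T_BODY

    BRAIN_POWER_W = 20.0            # standard estimate of whole-brain power draw
    EVENTS_PER_SEC = 1e14           # synaptic events/s: rough, illustrative assumption

    # Power the brain would need if each event erased exactly one bit at the bound.
    floor_w = EVENTS_PER_SEC * BIT_COST_J
    print(f"Landauer floor for assumed activity: {floor_w:.1e} W")   # ~3e-7 W
    print(f"Actual whole-brain power draw:       {BRAIN_POWER_W:.1e} W")
    print(f"Brain runs ~10^{math.log10(BRAIN_POWER_W / floor_w):.0f} above the floor")

Under these assumptions the brain, like silicon, operates roughly eight orders of magnitude above the Landauer floor, which suggests that its efficiency advantage, whatever its source, lies in architecture and algorithms rather than in computing near the thermodynamic bound.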