Thermodynamics
Thermodynamics is the branch of physics concerned with heat, energy, work, and the statistical behaviour of large ensembles of particles. Its four laws describe the most universal constraints known to science — constraints that apply to every physical process from stellar fusion to neural computation.
The second law — that the entropy of an isolated system never decreases — is arguably the most consequential statement in all of physics. It defines the arrow of time, sets limits on the efficiency of engines, and through Landauer's principle connects directly to Information Theory: erasing one bit of information dissipates at least kT ln 2 of heat, an irreducible thermodynamic cost. This means that computation, cognition, and every form of information processing are subject to physical constraints that no amount of cleverness can circumvent.
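The Landauer bound is easy to evaluate numerically. A minimal sketch (the constants are standard physical values, and the function name `landauer_limit` is introduced here for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, SI 2019)
T_ROOM = 300.0      # assumed room temperature in kelvin

def landauer_limit(temperature_k: float, bits: float = 1.0) -> float:
    """Minimum heat (joules) dissipated by erasing `bits` bits at the
    given temperature, per Landauer's principle: E >= bits * k_B * T * ln 2."""
    return bits * K_B * temperature_k * math.log(2)

if __name__ == "__main__":
    print(f"Erasing 1 bit at {T_ROOM:.0f} K costs >= {landauer_limit(T_ROOM):.3e} J")
    # Even erasing a gigabyte (8e9 bits) costs only ~2.3e-11 J at 300 K --
    # real hardware dissipates many orders of magnitude more per bit.
    print(f"Erasing 1 GB at {T_ROOM:.0f} K costs >= {landauer_limit(T_ROOM, 8e9):.3e} J")
```

At 300 K the bound is about 2.9 × 10⁻²¹ J per bit, which illustrates why Landauer's limit constrains principle rather than current practice.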
The formal identity between thermodynamic entropy (Boltzmann's S = k log W) and Shannon Entropy is either the deepest coincidence in science or evidence that physics and information are two descriptions of the same reality. If the latter, then Mathematics is not merely applied to the physical world — it is the structure of the physical world, and the philosophy of mathematics becomes inseparable from the foundations of physics.
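The formal identity can be checked directly: for a uniform distribution over W microstates, Shannon entropy reduces to ln W, so Boltzmann's S = k log W is just the Shannon entropy scaled by k. A small sketch (the helper `shannon_entropy` is introduced here for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum(p * ln p)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 1_000_000                  # number of equally likely microstates
uniform = [1.0 / W] * W

H = shannon_entropy(uniform)   # equals ln W for a uniform distribution
S = K_B * H                    # Boltzmann: S = k ln W

assert math.isclose(H, math.log(W))
print(f"H = ln W = {H:.6f} nats")
print(f"S = k_B * H = {S:.3e} J/K")
```

The scaling factor k is the only difference between the two quantities: thermodynamic entropy is Shannon entropy measured in joules per kelvin rather than in nats.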