Cramér-Rao bound
The Cramér-Rao bound (or Cramér-Rao lower bound, CRLB) is a fundamental theorem in estimation theory that places a lower limit on the variance of any unbiased estimator of a parameter. Under mild regularity conditions, it states that the variance of an unbiased estimator θ̂ is bounded below by the reciprocal of the Fisher information: Var(θ̂) ≥ 1/I(θ). No estimator, regardless of its sophistication, can beat this bound without introducing bias.

The bound is not merely a statistical curiosity. It is a statement about the geometry of the statistical manifold, the space of probability distributions, where the Fisher information metric determines how sharply the likelihood function curves around the true parameter. In regions of high Fisher information, parameters are tightly constrained and estimation is precise. In regions of low information, the bound loosens and uncertainty is irreducible.

The Cramér-Rao bound generalizes to biased estimators, to multiparameter settings (where it becomes a matrix inequality), and to quantum systems (where it becomes the quantum Cramér-Rao bound governed by quantum Fisher information). In each case, the same structure appears: information constrains uncertainty, and the constraint is geometric, not merely methodological.
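The scalar bound Var(θ̂) ≥ 1/I(θ) can be checked numerically in a case where it is attained. For n i.i.d. samples from N(θ, σ²) with σ known, the Fisher information is I(θ) = n/σ², so the CRLB is σ²/n, and the sample mean is an unbiased estimator that achieves it exactly. The sketch below uses NumPy; the specific parameter values are illustrative choices, not taken from the text.

```python
import numpy as np

# Illustrative setup (values chosen for the demo): n samples from
# N(theta, sigma^2) with known sigma; the Fisher information is
# I(theta) = n / sigma^2, so the CRLB is sigma^2 / n.
rng = np.random.default_rng(0)
theta, sigma, n, trials = 2.0, 1.5, 50, 20_000

crlb = sigma**2 / n  # 1 / I(theta)

# Empirical variance of the sample-mean estimator over many trials.
# The sample mean is efficient here, so its variance should match the CRLB.
samples = rng.normal(theta, sigma, size=(trials, n))
estimates = samples.mean(axis=1)
empirical_var = estimates.var()

print(f"CRLB:          {crlb:.5f}")
print(f"Empirical var: {empirical_var:.5f}")
```

With enough trials the empirical variance of the sample mean sits right at the bound; any other unbiased estimator of θ (e.g. the sample median) would show a strictly larger variance.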
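The multiparameter matrix inequality mentioned above can be stated explicitly. Writing I(θ) for the Fisher information matrix of a model p(x; θ), the covariance matrix of any unbiased estimator dominates its inverse in the positive-semidefinite (Loewner) order:

```latex
% Multiparameter Cramér-Rao bound: Cov of any unbiased estimator
% dominates the inverse Fisher information matrix.
\operatorname{Cov}(\hat{\theta}) \succeq I(\theta)^{-1},
\qquad
I(\theta)_{ij} = \mathbb{E}\!\left[
  \frac{\partial \log p(X;\theta)}{\partial \theta_i}\,
  \frac{\partial \log p(X;\theta)}{\partial \theta_j}
\right].
```

In particular, each diagonal entry gives a scalar bound, Var(θ̂ᵢ) ≥ [I(θ)⁻¹]ᵢᵢ, which reduces to Var(θ̂) ≥ 1/I(θ) in the one-parameter case.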