Quantum Error Correction
Quantum error correction (QEC) is a set of techniques for protecting quantum information against decoherence and other errors that arise from unwanted interactions between a quantum system and its environment. Classical error correction works by redundantly encoding information and checking for discrepancies; quantum error correction must accomplish this without violating the no-cloning theorem, which forbids copying an unknown quantum state.
The key insight, due independently to Peter Shor and Andrew Steane in 1995–1996, is that one can detect errors by measuring relationships between qubits (syndrome measurements) without measuring, and therefore disturbing, the encoded quantum information itself. By encoding one logical qubit in an entangled state of multiple physical qubits, errors on individual physical qubits can be identified and corrected.
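The idea of extracting an error syndrome without reading out the encoded value can be illustrated with the simplest case, the three-qubit bit-flip repetition code. The sketch below is a classical simulation (bit-flip errors commute with the parity checks, so the classical analogue captures the syndrome logic); the function names are illustrative, not from any standard library. The two parity checks play the role of the stabilizer measurements Z1Z2 and Z2Z3: they reveal which qubit differs from its neighbours, but not whether the logical value is 0 or 1.

```python
def encode(bit):
    # Logical 0 -> 000, logical 1 -> 111 (three-qubit repetition code).
    return [bit, bit, bit]

def syndrome(qubits):
    # Parity checks (analogues of the stabilizers Z1Z2 and Z2Z3):
    # each compares two qubits without revealing the encoded bit.
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

def correct(qubits):
    # Map each nonzero syndrome to the unique single qubit it implicates.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    s = syndrome(qubits)
    if s in flip:
        qubits[flip[s]] ^= 1
    return qubits

def decode(qubits):
    # Majority vote recovers the logical bit.
    return 1 if sum(qubits) >= 2 else 0

# Any single bit-flip error is detected and corrected.
for errored in range(3):
    q = encode(1)
    q[errored] ^= 1  # inject one X (bit-flip) error
    assert decode(correct(q)) == 1
```

Note that the syndrome (1, 0), for example, says only "qubit 0 disagrees with qubit 1" and is the same whether the codeword was 000 or 111; this is the sense in which the measurement avoids disturbing the encoded information.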
The threshold theorem establishes that if physical error rates fall below a certain threshold (roughly 1% for common codes, depending on architecture), arbitrarily long quantum computations become possible with only polylogarithmic overhead in the size of the computation. This is the theoretical foundation for fault-tolerant quantum computation. In practice, the overhead is enormous: thousands to millions of physical qubits may be required per logical qubit, and the gap between current noisy devices and the fault-tolerant regime is the central engineering challenge of the field.

The leading codes in use are surface codes, which have favorable thresholds and local stabilizer measurements amenable to 2D hardware layouts. The connection between QEC and holographic duality in physics, where quantum error correction appears in the structure of quantum gravity theories, is an unexpected and still-developing area of research.
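The scale of the overhead can be made concrete with a common back-of-envelope heuristic for the surface code. The model below is illustrative, not a rigorous bound: it assumes a logical error rate of roughly 0.1 · (p/p_th)^((d+1)/2) for code distance d, a threshold p_th of about 1%, and about 2d² physical qubits per logical qubit. All three constants are assumptions chosen for illustration; real numbers depend heavily on the hardware and decoder.

```python
# Illustrative surface-code overhead estimate under the stated
# assumptions (heuristic scaling law, p_th ~ 1%, ~2*d**2 qubits).
P_TH = 1e-2  # assumed threshold physical error rate

def logical_error_rate(p, d):
    # Heuristic: suppression improves exponentially in the distance d
    # once the physical rate p is below threshold.
    return 0.1 * (p / P_TH) ** ((d + 1) / 2)

def distance_for(p, target):
    # Smallest odd distance whose estimated logical rate meets the target.
    d = 3
    while logical_error_rate(p, d) > target:
        d += 2  # surface-code distances are odd
    return d

p = 1e-3                      # physical rate 10x below the assumed threshold
d = distance_for(p, 1e-12)    # target suitable for a long computation
physical_qubits = 2 * d ** 2  # rough count per logical qubit
print(d, physical_qubits)     # distance and per-logical-qubit overhead
```

Even with a physical error rate a full order of magnitude below threshold, this estimate lands in the range of hundreds to about a thousand physical qubits per logical qubit, which is where the "thousands to millions" figures for full machines come from once many logical qubits and magic-state distillation are included.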