Algorithmic Depth
Algorithmic depth (also called logical depth, as formulated by Charles Bennett in 1988) is a measure of the computational work required to produce an object from its shortest description: a measure of how much history is packed into a structure. Where Kolmogorov complexity measures how compressible an object is, logical depth measures how long the optimal compression takes to decompress. A truly random string has high Kolmogorov complexity but low logical depth, since decompressing it requires no computation beyond copying. A crystal has low Kolmogorov complexity and is also shallow: its short description executes almost immediately. A life form, by contrast, has lower Kolmogorov complexity than a random string of the same length but enormous logical depth; its shortest description is a set of physical laws and initial conditions whose execution requires billions of years.
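Bennett's notion can be stated formally. One standard formulation (the exact phrasing varies across presentations) fixes a universal prefix machine U, writes K(x) for the Kolmogorov complexity of x and T(p) for the running time of program p, and defines the depth of x at significance level s as the minimum time over near-minimal programs:

```latex
% Logical depth of x at significance level s:
% the fastest runtime achievable by any program within s bits of the shortest.
D_s(x) = \min\{\, T(p) \;:\; U(p) = x,\ |p| \le K(x) + s \,\}
```

The slack parameter s matters because the absolute shortest program for x might happen to be pathologically slow; depth asks how quickly x can be produced by any description that is nearly as short as the shortest one.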
Logical depth operationalizes an intuition that complexity researchers have circled: the interesting things in the universe are neither maximally random nor maximally ordered, but deep — they are the products of long causal histories that have compressed a great deal of selective pressure, evolutionary drift, or physical process into a compact structure. A genome is deep because it encodes the results of billions of years of natural selection; a snowflake is less deep because its crystalline symmetry emerges from physical law applied to a brief cooling process; a random bit string is shallow despite its complexity because it has no history to speak of.
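The deep-versus-shallow contrast can be made concrete with a toy experiment. The sketch below is an illustration, not Bennett's actual measure: it uses a Rule 30 cellular automaton as a stand-in for an object with a tiny description but a long generative history, random bytes as a stand-in for a shallow-but-complex object, and zlib-compressed length as a crude upper bound on description length. The function `rule30_row` and all parameter choices are illustrative assumptions.

```python
import os
import time
import zlib

def rule30_row(width: int, steps: int) -> bytes:
    # The "short description" of the deep object: a rule number, a
    # one-cell seed, and a step count. Reproducing the row requires
    # actually executing every step -- a toy analogue of logical depth.
    row = [0] * width
    row[width // 2] = 1
    for _ in range(steps):
        # Rule 30: new cell = left XOR (center OR right), wrapping at edges.
        row = [row[i - 1] ^ (row[i] | row[(i + 1) % width])
               for i in range(width)]
    return bytes(row)

width, steps = 256, 1000

t0 = time.perf_counter()
deep = rule30_row(width, steps)   # tiny description, long computation
deep_time = time.perf_counter() - t0

shallow = os.urandom(width)       # no shorter description, no history
t0 = time.perf_counter()
copy = bytes(shallow)             # "decompressing" it is just copying
shallow_time = time.perf_counter() - t0

# zlib length is only a rough proxy for description length, but it
# separates the two regimes: the automaton row compresses far below
# its literal length, while the random bytes barely compress at all.
print("deep:    compressed", len(zlib.compress(deep, 9)),
      "bytes, regenerated in", round(deep_time, 4), "s")
print("shallow: compressed", len(zlib.compress(shallow, 9)),
      "bytes, copied in", round(shallow_time, 6), "s")
```

The asymmetry is the point: the random string's shortest description is essentially itself and "runs" instantly, while the automaton row's short description must be executed step by step to recover the object.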
The concept bears directly on emergence and on what it means for a system to have a past. Two objects may have the same Kolmogorov complexity and the same surface structure while differing dramatically in depth — in how much computation was required to bring them into existence. Depth is causal history, made precise.