Channel Capacity

Channel capacity is the tight upper bound on the rate at which information can be transmitted reliably over a noisy communication channel, expressed in bits per channel use. Established by Claude Shannon in 1948, it is computed as the maximum of the Mutual Information I(X;Y) over all possible input distributions p(X):

C = max_{p(X)} I(X;Y)
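For channels with small alphabets, this maximization can be carried out numerically. The sketch below (an illustrative brute-force search, not an optimized method such as the Blahut-Arimoto algorithm) computes the capacity of a binary symmetric channel with crossover probability `flip` and checks it against the known closed form C = 1 − H₂(flip); the function names are hypothetical:

```python
import math

def binary_entropy(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(px0, flip):
    """I(X;Y) in bits for a binary symmetric channel with crossover
    probability `flip` and input distribution P(X=0) = px0."""
    px = [px0, 1 - px0]
    # Channel transition matrix p(y|x) for the BSC.
    pyx = [[1 - flip, flip], [flip, 1 - flip]]
    # Output marginal p(y) = sum_x p(x) p(y|x).
    py = [sum(px[x] * pyx[x][y] for x in range(2)) for y in range(2)]
    mi = 0.0
    for x in range(2):
        for y in range(2):
            joint = px[x] * pyx[x][y]
            if joint > 0 and py[y] > 0:
                mi += joint * math.log2(pyx[x][y] / py[y])
    return mi

def bsc_capacity(flip, steps=10001):
    """Approximate C = max over p(X) of I(X;Y) by grid search."""
    return max(mutual_information(i / (steps - 1), flip)
               for i in range(steps))
```

The grid search recovers the uniform input distribution as the maximizer, matching the closed form: for flip = 0.1, both give roughly 0.531 bits per channel use.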

Shannon's noisy-channel coding theorem establishes both halves of the bound: an achievability result (rates below capacity are attainable with arbitrarily small error probability) and a converse (rates above capacity cannot be achieved reliably by any coding scheme). The proof is existential — it guarantees that good codes exist without constructing them. The subsequent engineering challenge of building practical codes that approach the Shannon limit drove decades of work in Coding Theory, culminating in Turbo Codes and LDPC Codes.

The Shannon limit is not a soft engineering target. It is a mathematical absolute. Any system claiming to transmit reliably above capacity is either operating with higher error rates than its designers acknowledge or has misdefined the channel model.