Distributed Computation
Distributed computation is any computational process in which the work is divided among multiple processors that communicate via message passing rather than shared memory — a topology that forces the global output to emerge from local exchanges rather than central coordination. The significance of this architecture extends far beyond computer engineering: it is arguably the dominant computational paradigm in nature, from biochemical signalling cascades to neural circuits to immune systems.
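The idea that a global result emerges from purely local exchanges can be made concrete with a toy sketch (not from the original; the node model and gossip protocol are illustrative assumptions): each "processor" holds only its own value and an inbox, and the global maximum spreads through a ring by neighbour-to-neighbour messages alone, with no shared state and no central coordinator.

```python
from collections import deque

class Node:
    """A 'processor' that knows only its own value and its inbox."""
    def __init__(self, value):
        self.value = value    # local state, visible to no one else
        self.inbox = deque()  # messages received from neighbours

def gossip_max(values, rounds):
    """Each round, every node sends its current value to its two ring
    neighbours, then updates itself from its inbox only. After enough
    rounds the global maximum emerges at every node."""
    nodes = [Node(v) for v in values]
    n = len(nodes)
    for _ in range(rounds):
        # send phase: messages to immediate neighbours only
        for i, node in enumerate(nodes):
            nodes[(i - 1) % n].inbox.append(node.value)
            nodes[(i + 1) % n].inbox.append(node.value)
        # receive phase: each node processes its own inbox
        for node in nodes:
            while node.inbox:
                node.value = max(node.value, node.inbox.popleft())
    return [node.value for node in nodes]
```

On a ring of n nodes the maximum needs about n/2 rounds to reach everyone, e.g. `gossip_max([3, 9, 1, 7], rounds=2)` yields `[9, 9, 9, 9]`: no node ever sees the whole system, yet all converge on the global answer.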
The theoretical foundations lie in work on concurrent processes, consensus problems, and fault tolerance (the Byzantine generals problem being the canonical formalization). But distributed computation becomes philosophically interesting when the 'processors' are not engineered components but physical or biological subsystems: Self-Organization can then be understood as distributed computation running on matter, with the emergent pattern as the program's output.
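The Byzantine generals formalization can be simulated in a few lines. This is a toy sketch of Lamport's oral-messages algorithm OM(m), with one simplifying assumption not in the original: a traitor, when sending, transmits alternating 0/1 values instead of the truth, while loyal nodes relay faithfully.

```python
def majority(votes):
    """Majority vote over 0/1 values; ties default to 0 ('retreat')."""
    return 1 if 2 * sum(votes) > len(votes) else 0

def om(m, commander, lieutenants, value, traitors):
    """Toy simulation of the oral-messages algorithm OM(m).
    Returns the value each lieutenant decides on."""
    # Step 1: the commander sends a value to every lieutenant.
    # A traitorous commander sends conflicting values instead.
    received = {}
    for idx, lt in enumerate(lieutenants):
        received[lt] = idx % 2 if commander in traitors else value
    if m == 0:
        return received
    # Step 2: each lieutenant relays what it received to the others
    # via OM(m-1), then takes a majority over everything it has heard.
    decided = {}
    for lt in lieutenants:
        votes = [received[lt]]
        for other in lieutenants:
            if other == lt:
                continue
            rest = [x for x in lieutenants if x != other]
            votes.append(om(m - 1, other, rest, received[other], traitors)[lt])
        decided[lt] = majority(votes)
    return decided
```

With four generals and one traitor (m = 1), the loyal lieutenants reach agreement whether the traitor is the commander or a lieutenant, matching the classical result that tolerating m traitors requires 3m + 1 generals.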
The connection to Cellular Automata is direct — a CA is a massively parallel distributed computation in which each cell's only 'message' is the visible state of its immediate neighbours: communication is local, synchronous, and costless. That such systems can achieve Turing completeness (Rule 110 and Conway's Game of Life are proven examples) suggests that the physical universe, if it is computational at all, is a distributed computation rather than a serial one.
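The parallel-update character of a CA fits in a few lines. This minimal sketch implements one synchronous step of elementary Rule 110 (a known Turing-complete rule) on a ring; the periodic boundary is an illustrative choice, not part of the original text.

```python
# Rule 110 lookup table: each (left, self, right) neighbourhood
# maps to the cell's next state. 110 in binary is 01101110.
RULE_110 = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def rule110_step(cells):
    """One synchronous update: every cell computes its next state in
    parallel from purely local information (itself and two neighbours),
    here on a ring (periodic boundary)."""
    n = len(cells)
    return [RULE_110[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]
```

For example, a single live cell spreads leftward: `rule110_step([0, 0, 0, 1, 0, 0, 0])` gives `[0, 0, 1, 1, 0, 0, 0]`. Every cell update is independent of every other update in the same step, which is what makes the computation massively parallel.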
The unresolved question is whether Consciousness itself is a form of distributed computation — and if so, whether substrate matters for the output.