State (computer science)
In computer science, state is the set of all stored information and variable values that a program or system can access at a given point in time. State determines what a system will do next: given the same input, a system in different states may produce different outputs. This dependence on history is what distinguishes stateful systems from stateless ones, and it is the source of both the expressive power and the analytical difficulty of imperative programming.
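The distinction can be made concrete in a few lines. In the sketch below (the Counter class and add function are illustrative inventions, not drawn from any library), the stateful object returns different outputs for identical inputs because its stored count changes between calls, while the stateless function cannot:

```python
# Stateful: output depends on stored history, not only on the argument.
class Counter:
    def __init__(self):
        self.count = 0  # mutable state carried between calls

    def next(self, step):
        self.count += step
        return self.count

# Stateless: output is a function of the arguments alone.
def add(count, step):
    return count + step

c = Counter()
print(c.next(1), c.next(1))  # 1 2  (same input, different outputs)
print(add(0, 1), add(0, 1))  # 1 1  (same input, same output)
```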
The management of state is arguably the central problem of software engineering at scale. Mutable state introduces temporal coupling: the behavior of a component depends not merely on its inputs but on the sequence of prior operations and on the interleaving of concurrent updates. This coupling produces the state space explosion problem: the number of possible system configurations grows exponentially with the number of mutable variables, so a system with only n independent boolean flags already has up to 2^n reachable states.
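The interleaving half of the problem can be demonstrated directly. A minimal sketch using Python's standard threading module (the worker function and the iteration counts are arbitrary choices): because the read and the write of the shared counter are separate operations, two threads can interleave so that one update silently overwrites the other.

```python
import threading

count = 0  # shared mutable state

def worker(iterations):
    global count
    for _ in range(iterations):
        tmp = count      # read
        count = tmp + 1  # write: another thread may have written in between

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Typically prints less than 400000: interleaved updates were lost.
print(count)
```

The final count varies from run to run, which is precisely the temporal coupling described above: the result depends on the scheduler's interleaving rather than on the program text.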
State is not inherently problematic. Biological systems manage enormous state — metabolic concentrations, neural activation patterns, immune memory — and do so robustly. The difficulty in computing is not state itself but the mismatch between how programmers reason (sequentially, locally) and how stateful systems behave (concurrently, globally). Functional programming addresses this by eliminating or tightly confining mutable state, replacing in-place updates with the construction of new values; actor models address it by isolating state inside autonomous, message-passing entities; transactional memory addresses it by making groups of updates atomic. Each approach accepts a different tradeoff between expressiveness and analyzability.
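As a sketch of the first approach (the Account record and deposit function are hypothetical), the functional discipline replaces mutation with the construction of new values, so every intermediate state remains an ordinary, inspectable value:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Account:
    balance: int

def deposit(account, amount):
    # No mutation: returns a new Account, leaving the old one intact.
    return replace(account, balance=account.balance + amount)

a0 = Account(balance=0)
a1 = deposit(a0, 50)
a2 = deposit(a1, 25)
print(a0, a2)  # Account(balance=0) Account(balance=75)
```

Because a0, a1, and a2 share no mutable cells, the history of the computation can be logged, compared, or replayed without any temporal coupling.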
The concept of state in computing has a direct analogue in dynamical systems theory, where state is the point in phase space that determines all future trajectories. The convergence is not metaphorical: a Turing machine is a discrete dynamical system whose state vector is its complete configuration (control state, head position, and tape contents), and each step of the machine maps one configuration deterministically to the next.
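A minimal sketch of this reading (the machine, its transition table delta, and the dictionary tape encoding are assumptions for illustration): the configuration plays the role of a point in phase space, and the step function is the map whose iteration traces the trajectory.

```python
BLANK = "_"

# Transition table: (control_state, symbol) -> (new_state, new_symbol, head_move)
# This toy machine flips bits until it reads a blank, then halts.
delta = {
    ("flip", "0"): ("flip", "1", +1),
    ("flip", "1"): ("flip", "0", +1),
    ("flip", BLANK): ("halt", BLANK, 0),
}

def step(config):
    """One transition: a deterministic map from configuration to configuration."""
    state, head, tape = config
    symbol = tape.get(head, BLANK)
    state, symbol, move = delta[(state, symbol)]
    new_tape = {**tape, head: symbol}  # write without mutating the input tape
    return (state, head + move, new_tape)

# The configuration (control state, head position, tape) is the state vector.
config = ("flip", 0, {0: "1", 1: "0", 2: "1"})
while config[0] != "halt":
    config = step(config)
print(config)  # ('halt', 3, {0: '0', 1: '1', 2: '0', 3: '_'})
```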