Hebbian Learning
Hebbian learning is among the oldest biologically inspired learning rules in neuroscience, often summarized as "neurons that fire together, wire together." First proposed by Donald Hebb in 1949, it states that the strength of a synaptic connection increases when pre- and postsynaptic neurons are repeatedly active at the same time. The rule requires no external reward signal or global error gradient; learning is purely local and self-organizing, driven by correlations in neural activity.
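As a concrete illustration, here is a minimal sketch of the plain Hebbian update in Python. The outer-product form of the weight change and the learning rate eta are standard, but the function and variable names are our own, not from any particular source:

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.01):
    """Plain Hebbian rule: dw_ij = eta * y_i * x_j.

    w   : (n_out, n_in) weight matrix
    x   : (n_in,)  presynaptic activity
    y   : (n_out,) postsynaptic activity
    eta : learning rate
    """
    # Co-active pre/post pairs strengthen their connection;
    # the update depends only on locally available activity.
    return w + eta * np.outer(y, x)

# One correlation-driven step with a simple linear neuron model.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(3, 5))
x = rng.random(5)        # presynaptic firing rates
y = w @ x                # postsynaptic response (linear, for illustration)
w = hebbian_update(w, x, y)
```

Note that nothing in this update ever weakens a synapse: repeated presentation of correlated inputs only makes the corresponding weights larger, which is exactly the stability problem discussed next.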
Hebbian plasticity is a canonical model of synaptic plasticity and a foundation of unsupervised learning in artificial neural networks. Its limitations are well known: pure correlation learning cannot distinguish causal from merely coincident activation, and the rule is unstable, since weights grow without bound unless some form of normalization or decay is imposed. The BCM theory of synaptic modification was developed precisely to address this instability by introducing a sliding threshold between long-term potentiation (LTP) and long-term depression (LTD): postsynaptic activity above the threshold strengthens a synapse, activity below it weakens it, and the threshold itself adapts to the neuron's recent activity.
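To make the contrast concrete, here is a hedged sketch of a common rate-based form of the BCM rule, dw = eta * x * y * (y - theta), with the sliding threshold theta tracking a running average of the squared postsynaptic activity. The time constant and names are illustrative choices, not the theory's original notation:

```python
import numpy as np

def bcm_update(w, x, y, theta, eta=0.01, tau=100.0):
    """BCM rule: dw_j = eta * x_j * y * (y - theta).

    Output above the sliding threshold theta potentiates (LTP);
    output below it depresses (LTD). theta itself tracks <y^2>,
    so persistently high activity raises the bar for further LTP,
    stabilizing the runaway growth of the plain Hebbian rule.
    """
    w = w + eta * x * y * (y - theta)
    theta = theta + (y**2 - theta) / tau   # slow running average of y^2
    return w, theta

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=5)
theta = 0.0
for _ in range(1000):
    x = rng.random(5)
    y = float(w @ x)
    w, theta = bcm_update(w, x, y, theta)
```

The key design difference from the plain rule is the factor (y - theta): the same input can strengthen or weaken a synapse depending on where the neuron's output sits relative to its own activity history.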