Recurrent neural networks that store and retrieve patterns as energy-minimizing attractors.
Hopfield networks are fully connected recurrent neural networks with symmetric weights whose dynamics are governed by a Lyapunov energy function. Rather than learning a mapping from input to output, they store a set of target patterns as stable fixed points — local minima of the energy landscape. When presented with a partial or noisy version of a stored pattern, the network iteratively updates its units according to a simple threshold rule, descending the energy surface until it settles into the nearest attractor. This makes Hopfield networks a form of content-addressable or associative memory: retrieval is driven by similarity to stored content rather than by an explicit address.
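Concretely, in standard notation (a sketch; the bias terms b_i are often set to zero), the energy of a state of binary units s_i ∈ {−1, +1} and the threshold update applied to a single unit i are:

```latex
E(\mathbf{s}) = -\frac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j - \sum_i b_i s_i,
\qquad
s_i \leftarrow \operatorname{sign}\!\Big(\sum_{j} w_{ij}\, s_j + b_i\Big).
```

Because each such single-unit update can only lower the energy or leave it unchanged when the weights are symmetric, the dynamics must eventually stop at a local minimum, i.e. an attractor.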
The classical binary model uses N binary units with weights set via a Hebbian learning rule — essentially the outer product sum of the stored patterns. Under symmetric weights with zero self-connections, asynchronous updates are guaranteed to converge to a stable fixed point; fully synchronous updates may instead settle into a cycle of length two. The network's storage capacity is approximately 0.138N uncorrelated random binary patterns before retrieval errors become frequent, a result derived through statistical physics analogies to Ising spin glasses. Extensions to continuous-valued neurons, stochastic update rules (Glauber dynamics), and sparse coding schemes address some of the original model's limitations, including spurious attractors and sensitivity to correlated patterns.
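A minimal NumPy sketch of this classical scheme, illustrating Hebbian (outer-product) storage and asynchronous threshold retrieval; function names and parameters here are illustrative rather than canonical.

```python
import numpy as np

def store(patterns):
    """Hebbian rule: W = (1/N) * sum over patterns of x x^T, with zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def recall(W, state, n_sweeps=10, rng=None):
    """Asynchronous updates: revisit units one at a time, descending the energy."""
    rng = np.random.default_rng() if rng is None else rng
    s = state.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Usage: store three random +/-1 patterns, corrupt one, and retrieve it.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))    # 3 patterns, N = 100 units
W = store(patterns)
noisy = patterns[0].copy()
noisy[:10] *= -1                                  # flip 10 of the 100 bits
print(np.array_equal(recall(W, noisy, rng=rng), patterns[0]))  # usually True
```

With only 3 patterns stored in 100 units the load is far below the ~0.138N capacity limit, so retrieval from the corrupted cue typically succeeds.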
Hopfield networks gained renewed relevance in the late 2010s through the development of modern or dense associative memory variants. By replacing the quadratic energy function with a higher-order polynomial or exponential interaction term, these models achieve storage capacities that grow polynomially or even exponentially with the number of neurons, rather than linearly, along with broader attractor basins. Crucially, researchers showed that the update rule of modern Hopfield networks is mathematically equivalent to the scaled dot-product attention mechanism central to Transformer architectures, forging a direct conceptual link between classical associative memory and contemporary deep learning.
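A sketch of the continuous modern Hopfield update in the form that exposes this correspondence: with the stored patterns stacked as rows of a matrix X, one retrieval step replaces the query state with a softmax-weighted average of the patterns, which is exactly dot-product attention where the patterns act as both keys and values. The names and the inverse-temperature value beta below are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max()                  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def modern_hopfield_update(X, xi, beta=8.0):
    """One retrieval step: xi_new = X^T softmax(beta * X @ xi).

    X has shape (num_patterns, d); xi is a query of dimension d. The update
    is scaled dot-product attention with the stored patterns serving as
    both keys and values; larger beta gives sharper, more pattern-like output.
    """
    return X.T @ softmax(beta * (X @ xi))

# Usage: retrieve a stored pattern from a noisy query, typically in one step.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 64))          # 5 stored patterns of dimension 64
xi = X[2] + 0.3 * rng.standard_normal(64) # noisy version of pattern 2
out = modern_hopfield_update(X, xi)
print(np.argmax(X @ out))                 # closest stored pattern: index 2
```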
Beyond memory retrieval, Hopfield networks have been applied to combinatorial optimization problems — where the energy minimum corresponds to an approximate solution — and serve as a foundational example of energy-based models in machine learning. Their analysis established core ideas about attractor dynamics, memory capacity, and the relationship between learning rules and generalization that continue to inform modern neural network theory.