
Envisioning is an emerging technology research institute and advisory.




EBM (Energy-Based Model)

A model class that assigns lower energy scores to more probable data configurations.

Year: 2006 · Generality: 694

Energy-Based Models (EBMs) are a broad family of machine learning models that frame learning as the process of shaping a scalar energy function over the space of inputs and outputs. Rather than directly modeling probabilities, an EBM assigns a real-valued energy score to each configuration of variables, with lower energies corresponding to more compatible or likely configurations. Inference then involves finding the variable assignments that minimize this energy, and learning involves adjusting the energy function so that correct or observed configurations receive lower scores than incorrect or unobserved ones.
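The inference-as-minimization idea above can be sketched in a few lines. This is a minimal illustration, not from the source: it uses a hand-built quadratic energy function whose low-energy point is known in advance, and recovers that configuration by gradient descent on the energy.

```python
import numpy as np

# Hypothetical energy function E(x) = ||x - mu||^2: configurations near
# mu are "compatible" (low energy), everything else is high energy.
mu = np.array([1.0, -2.0])

def energy(x):
    return float(np.sum((x - mu) ** 2))

def grad_energy(x):
    return 2.0 * (x - mu)  # analytic gradient dE/dx

# Inference: start from an arbitrary configuration and descend the energy
# surface to find the most compatible assignment.
x = np.zeros(2)
for _ in range(200):
    x -= 0.1 * grad_energy(x)

print(np.round(x, 3))  # converges toward mu, the minimum-energy point
```

In a real EBM the energy would be a parameterized neural network and the gradient would come from automatic differentiation, but the inference loop has exactly this shape.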

The energy function itself is typically a parameterized neural network, giving EBMs the expressive power to capture complex, high-dimensional dependencies. Training requires contrasting low-energy regions (where real data lives) against high-energy regions (where data does not). This is non-trivial because computing the partition function — the normalizing constant needed to convert energies into probabilities — is generally intractable for high-dimensional data. As a result, EBM training often relies on techniques such as contrastive divergence, noise-contrastive estimation, or Markov Chain Monte Carlo (MCMC) sampling to approximate the necessary gradients.
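The contrastive training loop described above can be sketched on a toy problem. This is an assumption-laden illustration, not the source's method: a hypothetical one-dimensional energy E(x) = (x − θ)² is fit to synthetic data by pushing energy down on real samples and up on negatives drawn from the model via short-run Langevin MCMC, in the spirit of contrastive divergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D EBM: E_theta(x) = (x - theta)^2. After normalization this
# is a Gaussian centered at theta, so the model should learn theta near the
# data mean (synthetic data centered at 3.0).
theta = 0.0
data = rng.normal(3.0, 0.5, size=256)

def dE_dtheta(x, theta):
    return -2.0 * (x - theta)  # gradient of the energy w.r.t. the parameter

def langevin_negatives(theta, n=256, steps=30, step=0.05):
    # Short-run Langevin MCMC: approximate model samples via noisy gradient
    # descent on the energy, sidestepping the intractable partition function.
    x = rng.normal(0.0, 1.0, size=n)
    for _ in range(steps):
        grad_x = 2.0 * (x - theta)  # dE/dx
        x = x - step * grad_x + np.sqrt(2 * step) * rng.normal(size=n)
    return x

lr = 0.1
for _ in range(100):
    neg = langevin_negatives(theta)
    # Contrastive gradient: lower energy on data, raise it on model samples.
    grad = dE_dtheta(data, theta).mean() - dE_dtheta(neg, theta).mean()
    theta -= lr * grad

print(round(theta, 2))  # theta drifts toward the data mean (around 3.0)
```

The key point is that no normalizing constant is ever computed: the MCMC negatives stand in for the partition-function term in the gradient of the log-likelihood.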

EBMs gained renewed attention in the deep learning era as researchers recognized their flexibility: unlike directed generative models such as VAEs or GANs, EBMs impose no constraints on the architecture of the energy function and can naturally handle structured outputs, missing data, and multimodal distributions. They have been applied to image generation, anomaly detection, structured prediction, and reinforcement learning, among other domains. Notable instantiations include Restricted Boltzmann Machines, Deep Boltzmann Machines, and more recent score-based and diffusion models, which can be interpreted through an energy-based lens.

The appeal of EBMs lies in their conceptual unification: virtually any model that scores configurations can be viewed as an EBM, making the framework a powerful lens for understanding and designing learning systems. Their main practical challenge remains efficient training in high dimensions, an active area of research that continues to yield new algorithms and architectural innovations.

Related

Energy-Based Models

A framework that scores variable configurations with a scalar energy instead of an explicit probability.

Generality: 694
Boltzmann Machine

A stochastic recurrent network that learns probability distributions over binary variables.

Generality: 694
Restricted Boltzmann Machines (RBMs)

Generative neural networks that learn probability distributions over input data using two layers.

Generality: 692
Variational Free Energy

A bound on model evidence used to approximate intractable posterior distributions efficiently.

Generality: 650
Thermodynamic Bayesian Inference

A framework unifying thermodynamic principles with Bayesian inference through energy minimization.

Generality: 450
DBN (Deep Belief Network)

A generative neural network built from stacked Restricted Boltzmann Machines trained layer by layer.

Generality: 694