Normalizing Flows

Generative models that learn complex distributions via composed invertible transformations with exact likelihoods.

Year: 2015 · Generality: 694

Normalizing flows are a family of generative models that transform a simple, tractable base distribution—typically a standard Gaussian—into a complex target distribution by composing a sequence of invertible, differentiable mappings. Because each transformation is bijective, the model can compute exact log-likelihoods using the change-of-variables formula: the log-probability of a data point equals the log-probability of its latent encoding plus the sum of log absolute Jacobian determinants across all transformations. This stands in contrast to variational autoencoders, which optimize a lower bound on likelihood, and GANs, which offer no likelihood estimate at all.
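
Written out, that change-of-variables identity takes the following form, where f = f_K ∘ ⋯ ∘ f_1 is the composed transformation mapping a data point x to its latent encoding (the symbol names here are ours, chosen for illustration):

```latex
\log p_X(x) \;=\; \log p_Z\big(f(x)\big) \;+\; \sum_{k=1}^{K} \log \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|,
\qquad z_0 = x, \quad z_k = f_k(z_{k-1}).
```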

The central engineering challenge in normalizing flows is designing transformations that are simultaneously expressive, efficiently invertible, and cheap to differentiate. Coupling-layer architectures such as NICE and RealNVP achieve this by splitting dimensions and applying conditionally affine maps, keeping Jacobians triangular and thus O(d) to compute. Autoregressive flows like MAF and IAF exploit autoregressive structure for the same triangular-Jacobian benefit, but trade off density evaluation speed against sampling speed depending on the direction of conditioning. Invertible 1×1 convolutions, introduced in Glow, extend these ideas to image generation with competitive visual quality. Continuous normalizing flows (FFJORD) replace discrete compositions with neural ODEs, allowing architecturally unconstrained transformations at the cost of numerical ODE integration during both training and inference.
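
To make the coupling-layer idea concrete, here is a minimal NumPy sketch of a RealNVP-style affine coupling layer. The conditioner functions `scale_net` and `shift_net` are placeholders standing in for small neural networks and are not part of any particular library; the point of the sketch is that the Jacobian is triangular, so its log-determinant is just the sum of the predicted log-scales and costs O(d) to compute.

```python
import numpy as np

def affine_coupling_forward(x, scale_net, shift_net):
    """RealNVP-style affine coupling layer (illustrative sketch).

    Splits the input in half; the first half passes through unchanged
    and conditions an elementwise affine transform of the second half.
    The Jacobian is triangular, so its log-determinant is the sum of
    the predicted log-scales -- O(d) rather than O(d^3).
    """
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]

    log_s = scale_net(x1)          # conditioner: log-scales for the second half
    t = shift_net(x1)              # conditioner: shifts for the second half

    z1 = x1                        # identity on the first half
    z2 = x2 * np.exp(log_s) + t    # affine map on the second half

    z = np.concatenate([z1, z2], axis=-1)
    log_det_jacobian = np.sum(log_s, axis=-1)
    return z, log_det_jacobian

def affine_coupling_inverse(z, scale_net, shift_net):
    """Exact inverse of the coupling layer, reusing the same conditioners."""
    d = z.shape[-1] // 2
    z1, z2 = z[..., :d], z[..., d:]
    log_s = scale_net(z1)
    t = shift_net(z1)
    x2 = (z2 - t) * np.exp(-log_s)
    return np.concatenate([z1, x2], axis=-1)
```

Stacking several such layers while alternating which half is transformed, then adding the per-layer log-determinants to the base-distribution log-density, yields the exact log-likelihood described above.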

Normalizing flows matter because exact likelihood is a powerful property: it enables principled model comparison, anomaly detection, and use as expressive variational posteriors in Bayesian inference without the approximation gap of ELBO-based methods. They have been applied to image synthesis, speech generation, molecular conformation modeling, and density estimation in scientific domains where calibrated uncertainty is critical. Their theoretical cleanliness also makes them a natural testbed for studying generative model expressivity and the geometry of learned representations.

Despite their elegance, normalizing flows face practical limitations: architectural constraints required for tractable Jacobians can limit expressivity relative to diffusion models or GANs, and scaling to very high-dimensional data remains computationally demanding. Research into more flexible flow families and hybrid architectures continues to be an active area, with flows increasingly used as components within larger probabilistic pipelines rather than as standalone generative models.

Related
GFlowNet (Generative Flow Network)

A generative framework that learns to sample compositional objects proportional to a reward.

Generality: 339
PFGM (Poisson Flow Generative Model)

A generative model that maps data distributions using electric field dynamics in augmented space.

Generality: 101
DDN (Discrete Distribution Networks)

Neural architectures that model and transform discrete probability distributions over categorical data.

Generality: 337
RGM (Renormalizing Generative Model)

A generative model framework borrowing renormalization principles from physics to handle high-dimensional data.

Generality: 104
Diffusion Models

Generative models that learn to reverse a noise-addition process to synthesize new data.

Generality: 796
Generative Model

A model that learns data distributions to synthesize realistic new samples.

Generality: 896