Envisioning is an emerging technology research institute and advisory.


GFlowNet (Generative Flow Network)

A generative framework that learns to sample compositional objects proportional to a reward.

Year: 2021 · Generality: 339

A Generative Flow Network (GFlowNet) is a probabilistic generative framework designed to learn policies that sample compositional objects—such as molecular graphs, sequences, or causal structures—with probability proportional to a given reward function. Unlike standard generative models that maximize likelihood or variational objectives, GFlowNets treat generation as a sequential decision-making process: an agent constructs an object step by step by taking actions in a directed acyclic graph of states, and training encourages the resulting flow of probability mass to satisfy a consistency condition known as the flow-matching or detailed balance constraint. This makes GFlowNets closely related to reinforcement learning, but with a fundamentally different goal—diversity of high-reward samples rather than maximization of a single reward.
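The flow-consistency idea above can be sketched on a toy state graph. This is a minimal illustrative example, not a trained model: the state names, edge flows, and rewards are hand-picked assumptions chosen so that the flow-matching constraints hold exactly, and the forward policy is simply each state's outgoing flows normalized to probabilities.

```python
# Toy DAG: s0 -> {s1, s2} -> terminal objects x1, x2, x3.
# Flow matching: at each interior state, inflow == outflow;
# at each terminal state, inflow equals the reward R(x).
# (All values below are hypothetical, chosen to satisfy the constraints.)
edges = {
    ("s0", "s1"): 3.0,
    ("s0", "s2"): 2.0,
    ("s1", "x1"): 1.0,
    ("s1", "x2"): 2.0,
    ("s2", "x2"): 1.0,  # two paths can reach the same object
    ("s2", "x3"): 1.0,
}
rewards = {"x1": 1.0, "x2": 3.0, "x3": 1.0}

def inflow(s):
    return sum(f for (u, v), f in edges.items() if v == s)

def outflow(s):
    return sum(f for (u, v), f in edges.items() if u == s)

# Interior states conserve flow, like junctions in a physical network.
for s in ("s1", "s2"):
    assert inflow(s) == outflow(s)

# Terminal inflow equals reward, so forward sampling gives P(x) ∝ R(x).
for x, r in rewards.items():
    assert inflow(x) == r

def policy(s):
    """Forward policy: choose a child proportionally to its edge flow."""
    out = {v: f for (u, v), f in edges.items() if u == s}
    total = sum(out.values())
    return {v: f / total for v, f in out.items()}

print(policy("s0"))  # {'s1': 0.6, 's2': 0.4}
```

Following this policy, x2 is reached with probability 0.6 · (2/3) + 0.4 · (1/2) = 0.6, which is 3/5: exactly its reward's share of the total reward mass, even though it has two parents.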

The core training objective ensures that the total flow into any intermediate state equals the total flow out, analogous to conservation laws in physical flow networks. By satisfying these constraints across all states, the learned policy generates terminal objects with frequencies proportional to their rewards. This property is especially valuable when the reward landscape is multimodal: where a greedy or maximum-likelihood approach would collapse onto a single high-reward mode, a GFlowNet naturally explores and represents the full distribution of good solutions. Training can be performed using variants such as trajectory balance, which provides more stable and efficient gradient estimates than earlier flow-matching formulations.
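The trajectory balance objective mentioned above scores an entire trajectory with one scalar loss. The sketch below, with hypothetical per-step log-probabilities and a made-up reward, shows the shape of that loss; a real implementation would parameterize the forward policy, backward policy, and log-partition estimate with neural networks and minimize the loss by gradient descent.

```python
import math

def trajectory_balance_loss(log_Z, log_pf, log_pb, reward):
    """Trajectory balance loss for one trajectory τ = (s0 -> ... -> x):
    L(τ) = (log Z + Σ log P_F(s_{t+1}|s_t)
            - log R(x) - Σ log P_B(s_t|s_{t+1}))²
    At the optimum, Z matches the partition function and the sampler
    draws terminal objects with probability R(x) / Z.
    """
    lhs = log_Z + sum(log_pf)
    rhs = math.log(reward) + sum(log_pb)
    return (lhs - rhs) ** 2

# Hypothetical two-step trajectory (illustrative numbers only):
log_pf = [math.log(0.6), math.log(0.5)]  # forward policy choices
log_pb = [math.log(1.0), math.log(1.0)]  # tree-shaped DAG: P_B is trivial
loss = trajectory_balance_loss(math.log(5.0), log_pf, log_pb, reward=1.5)
print(round(loss, 6))  # 0.0, since 5.0 * 0.6 * 0.5 == 1.5
```

Because the loss is computed over whole trajectories rather than per-state flows, its gradient signal does not have to propagate through every intermediate flow estimate, which is the source of the stability advantage the paragraph above refers to.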

GFlowNets are particularly well-suited to scientific discovery tasks where diversity matters as much as quality. In drug discovery, for example, a model that proposes many structurally distinct high-affinity molecules is far more useful than one that repeatedly suggests the same compound. They have also been applied to Bayesian structure learning, combinatorial optimization, and active learning, where the ability to maintain uncertainty and explore broadly is critical. Their connection to amortized variational inference and energy-based models gives them a principled probabilistic interpretation, allowing them to serve as flexible approximate samplers for intractable posteriors.

Introduced by Yoshua Bengio and collaborators in 2021, GFlowNets have rapidly attracted research interest as a unifying framework bridging reinforcement learning, probabilistic inference, and deep generative modeling. Their ability to turn reward signals into calibrated generative distributions positions them as a promising tool wherever exploration and diversity are essential.

Related

  • Normalizing Flows: Generative models that learn complex distributions via composed invertible transformations with exact likelihoods. (Generality: 694)
  • PFGM (Poisson Flow Generative Model): A generative model that maps data distributions using electric field dynamics in augmented space. (Generality: 101)
  • Generative Workflow: An end-to-end AI pipeline that produces original content by learning from data. (Generality: 694)
  • Flow Engineering: A structured, iterative methodology for guiding AI models through multi-phase problem-solving workflows. (Generality: 339)
  • GAN (Generative Adversarial Network): A framework where two neural networks compete to generate realistic synthetic data. (Generality: 838)
  • GQN (Generative Query Network): A neural architecture that infers and renders 3D scenes from limited viewpoint observations. (Generality: 292)