
Envisioning is an emerging technology research institute and advisory.



Bayesian Network

A probabilistic graphical model encoding conditional dependencies among variables via directed acyclic graphs.

Year: 1985 · Generality: 794

A Bayesian Network is a probabilistic graphical model that represents a set of variables and their conditional dependencies using a directed acyclic graph (DAG). Each node in the graph corresponds to a random variable — which may be observable, latent, or hypothetical — while each directed edge encodes a conditional dependency between a parent and child node. The strength of these relationships is quantified through conditional probability tables (CPTs) attached to each node, specifying the probability of each variable's states given the states of its parents. Together, the graph structure and CPTs define a compact factorization of the full joint probability distribution over all variables.
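The factorization described above can be made concrete with a minimal sketch in plain Python, using the classic rain/sprinkler/wet-grass network. The structure (Rain → Sprinkler, Rain → GrassWet, Sprinkler → GrassWet) and all probabilities below are illustrative assumptions, not values from the source:

```python
# CPTs: each entry gives P(variable = True | parent states).
# All numbers are illustrative assumptions.
P_rain = 0.2                              # P(Rain=True)
P_sprinkler = {True: 0.01, False: 0.40}   # P(Sprinkler=True | Rain)
P_grass = {                               # P(GrassWet=True | Sprinkler, Rain)
    (True, True): 0.99, (True, False): 0.90,
    (False, True): 0.80, (False, False): 0.00,
}

def joint(rain: bool, sprinkler: bool, grass: bool) -> float:
    """Factorized joint: P(R, S, G) = P(R) * P(S | R) * P(G | S, R)."""
    p = P_rain if rain else 1 - P_rain
    p *= P_sprinkler[rain] if sprinkler else 1 - P_sprinkler[rain]
    pg = P_grass[(sprinkler, rain)]
    return p * (pg if grass else 1 - pg)

# The eight joint entries sum to 1, confirming a valid distribution
# over 2^3 = 8 states stored in only 1 + 2 + 4 = 7 CPT entries.
total = sum(joint(r, s, g) for r in (True, False)
            for s in (True, False) for g in (True, False))
print(total)
```

The compactness claim is visible here: the full joint over three binary variables has eight entries, but the CPTs need only seven numbers, and the gap widens dramatically as sparsely connected networks grow.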

Inference in Bayesian Networks involves computing the posterior probability of unobserved variables given evidence about observed ones. Exact inference algorithms such as variable elimination and belief propagation exploit the graph's conditional independence structure to perform these computations efficiently. For large or densely connected networks where exact inference becomes intractable, approximate methods like Markov Chain Monte Carlo (MCMC) sampling or variational inference are employed. Learning a Bayesian Network from data involves two tasks: structure learning, which identifies the DAG topology, and parameter learning, which estimates the CPTs — both of which can be approached from frequentist or Bayesian perspectives.
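A posterior query of the kind described above can be sketched via inference by enumeration, the brute-force baseline that variable elimination refines by caching intermediate factors. The toy rain/sprinkler network and its probabilities are illustrative assumptions; the query asks P(Rain | GrassWet=True), summing out the unobserved Sprinkler variable:

```python
# Illustrative CPTs (assumed, not from the source).
P_rain = 0.2
P_sprinkler = {True: 0.01, False: 0.40}   # keyed by Rain
P_grass = {(True, True): 0.99, (True, False): 0.90,
           (False, True): 0.80, (False, False): 0.00}  # keyed by (Sprinkler, Rain)

def joint(rain, sprinkler, grass):
    """P(R, S, G) = P(R) * P(S | R) * P(G | S, R)."""
    p = P_rain if rain else 1 - P_rain
    p *= P_sprinkler[rain] if sprinkler else 1 - P_sprinkler[rain]
    pg = P_grass[(sprinkler, rain)]
    return p * (pg if grass else 1 - pg)

# P(Rain=r | GrassWet=True) is proportional to the joint with the
# hidden Sprinkler variable summed out (marginalized).
unnorm = {r: sum(joint(r, s, True) for s in (True, False))
          for r in (True, False)}
z = sum(unnorm.values())                  # P(GrassWet=True)
posterior = {r: unnorm[r] / z for r in unnorm}
print(posterior[True])
```

Enumeration is exponential in the number of hidden variables; variable elimination and belief propagation get their efficiency by exploiting the conditional independence structure so that each factor is summed out only once.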

Bayesian Networks are particularly powerful because they make conditional independence assumptions explicit and interpretable, allowing domain experts to incorporate prior knowledge directly into the model structure. This transparency distinguishes them from many black-box machine learning approaches and makes them well-suited for applications where explainability matters, such as medical diagnosis, fault detection, and risk assessment. They also handle missing data naturally through marginalization, a significant practical advantage.

The framework became foundational to modern probabilistic machine learning and influenced the development of more expressive models, including dynamic Bayesian Networks for temporal data and hierarchical Bayesian models. Concepts central to Bayesian Networks — such as d-separation, Markov blankets, and belief propagation — remain core tools in probabilistic reasoning and underpin many contemporary approaches in causal inference and generative modeling.

Related


Bayesian Neural Network

A neural network that represents uncertainty by placing probability distributions over its weights.

Generality: 707
Bayesian Inference

A statistical method that updates probability estimates as new evidence arrives.

Generality: 871
DBN (Deep Belief Network)

A generative neural network built from stacked Restricted Boltzmann Machines trained layer by layer.

Generality: 694
Probabilistic Inference

Drawing conclusions from uncertain or incomplete data using probability theory.

Generality: 875
DAG (Directed Acyclic Graph)

A directed graph with no cycles, used to represent dependencies and computation flows.

Generality: 796
Markov Blanket

The minimal set of variables that renders a node conditionally independent of all others.

Generality: 709