Envisioning is an emerging technology research institute and advisory.

2011 — 2026

Conditional Probability

The probability of an event occurring given that another event has already occurred.

Year: 1950 · Generality: 961

Conditional probability is a foundational concept in probability theory that quantifies how likely an event A is to occur given that event B has already taken place. Formally written as P(A|B), it is computed as P(A ∩ B) / P(B), provided P(B) > 0. This ratio captures how knowledge of one event reshapes the probability landscape for another, making it a precise mathematical tool for reasoning under uncertainty. When A and B are independent, knowing B provides no information about A and P(A|B) = P(A); when they are dependent, the conditional probability diverges meaningfully from the marginal.
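As a minimal sketch, the definition P(A|B) = P(A ∩ B) / P(B) can be checked by exact enumeration over a small sample space. Two fair dice are an illustrative choice (not from the source): knowing the sum is at least 10 doubles the probability of rolling doubles, showing the events are dependent.

```python
from fractions import Fraction

# Sample space: two fair six-sided dice (36 equally likely outcomes).
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    """P(event), where event is a predicate over outcomes."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

def cond_prob(a, b):
    """P(A|B) = P(A and B) / P(B), assuming P(B) > 0."""
    return prob(lambda o: a(o) and b(o)) / prob(b)

def doubles(o):        # event A: both dice show the same face
    return o[0] == o[1]

def sum_ge_10(o):      # event B: the sum is at least 10
    return o[0] + o[1] >= 10

p_a = prob(doubles)                        # marginal: 6/36 = 1/6
p_a_given_b = cond_prob(doubles, sum_ge_10)  # conditional: 2/6 = 1/3
```

Since P(A|B) = 1/3 differs from P(A) = 1/6, conditioning on B meaningfully reshapes the probability of A, exactly the dependence the paragraph above describes.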

In machine learning, conditional probability is everywhere. Generative classifiers like Naive Bayes model P(class | features) directly. Language models learn P(next token | previous tokens) to generate coherent text. Probabilistic graphical models — Bayesian networks and Markov random fields — are built entirely from conditional probability relationships between variables. Even discriminative models like logistic regression can be interpreted as estimating conditional distributions. The chain rule of probability, which decomposes joint distributions into products of conditionals, underpins how complex probabilistic models are constructed and trained.
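The language-model case above can be illustrated with a toy bigram model, where P(next token | previous token) is estimated from co-occurrence counts. The corpus here is a made-up example, and the maximum-likelihood estimator is one simple choice among many:

```python
from collections import Counter

# Hypothetical toy corpus; real language models condition on far longer contexts.
corpus = "the cat sat on the mat the cat ran".split()

bigrams = Counter(zip(corpus, corpus[1:]))   # counts of (prev, next) pairs
unigrams = Counter(corpus[:-1])              # counts of prev tokens

def p_next_given_prev(nxt, prev):
    """MLE of P(next | prev) = count(prev, next) / count(prev).
    Assumes prev occurs in the corpus (no smoothing)."""
    return bigrams[(prev, nxt)] / unigrams[prev]

# "the" is followed by "cat" twice and "mat" once, so:
p_cat = p_next_given_prev("cat", "the")  # 2/3
p_mat = p_next_given_prev("mat", "the")  # 1/3
```

Multiplying such conditionals along a sequence is exactly the chain-rule decomposition of the joint distribution mentioned above, P(w1, …, wn) = P(w1) · P(w2|w1) · … · P(wn|w1, …, wn−1), here truncated to a one-token context.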

The concept also anchors Bayesian inference, where a prior belief P(hypothesis) is updated with observed evidence via Bayes' theorem to yield a posterior P(hypothesis | evidence). This framework is central to probabilistic machine learning, enabling models to quantify uncertainty, incorporate domain knowledge, and update beliefs as new data arrives. Understanding conditional probability is essentially a prerequisite for any serious engagement with probabilistic reasoning, statistical modeling, or modern deep learning theory.
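A minimal sketch of that prior-to-posterior update, using Bayes' theorem for a binary hypothesis. The diagnostic-test numbers (base rate, sensitivity, false-positive rate) are illustrative assumptions, not from the source:

```python
def posterior(prior, likelihood, false_positive_rate):
    """Bayes' theorem for a binary hypothesis H given evidence E:
    P(H|E) = P(E|H) P(H) / P(E),
    where P(E) = P(E|H) P(H) + P(E|not H) P(not H)."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical scenario: 1% base rate, 95% sensitivity, 5% false positives.
post = posterior(prior=0.01, likelihood=0.95, false_positive_rate=0.05)
# A positive test raises P(H) from 1% to roughly 16%, not to 95% --
# the low prior dominates, a classic base-rate effect.
```

The same update can be chained: today's posterior becomes tomorrow's prior as new evidence arrives, which is the iterative belief-updating the paragraph above refers to.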

Related

Probabilistic Inference

Drawing conclusions from uncertain or incomplete data using probability theory.

Generality: 875
Conditional Generation

Generative models producing outputs constrained or guided by specified input conditions.

Generality: 713
Naive Bayes Classifier

A probabilistic classifier assuming all input features are mutually independent given the class.

Generality: 694
Probability Density Function

A function describing the relative likelihood of a continuous random variable's values.

Generality: 875
Probabilistic Programming

A programming paradigm that encodes uncertainty and statistical reasoning directly in code.

Generality: 756
Bayesian Network

A probabilistic graphical model encoding conditional dependencies among variables via directed acyclic graphs.

Generality: 794