
Envisioning is an emerging technology research institute and advisory.



Log Odds

The logarithm of the odds ratio, linking probabilities to linear model outputs.

Year: 1944 · Generality: 694

Log odds express the logarithm of the ratio between the probability of an event occurring and the probability of it not occurring: log(p / (1 − p)). While raw probabilities are bounded between 0 and 1, this transformation maps them onto the entire real number line, from negative infinity to positive infinity. This unbounded, continuous range makes log odds far more amenable to linear modeling than probabilities themselves, which is why they serve as the foundational link function in logistic regression.
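A minimal sketch of this mapping, using only Python's standard library:

```python
import math

def log_odds(p: float) -> float:
    """Map a probability in (0, 1) onto the real line via log(p / (1 - p))."""
    return math.log(p / (1 - p))

# p = 0.5 maps to exactly 0; probabilities above 0.5 map to positive
# log odds and those below to negative, growing without bound as p
# approaches 1 or 0.
print(log_odds(0.5))   # 0.0
print(log_odds(0.99))  # large positive
print(log_odds(0.01))  # large negative
```

Note the symmetry: `log_odds(p)` and `log_odds(1 - p)` are exact negatives of each other, which is part of what makes the scale convenient for modeling.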

In logistic regression — one of the most widely used classification algorithms in machine learning — the model directly predicts log odds as a linear combination of input features. Each coefficient in the model represents the change in log odds associated with a one-unit increase in the corresponding predictor, holding all other variables constant. To recover an interpretable probability from these predictions, practitioners apply the sigmoid (logistic) function, which is simply the inverse of the log-odds transformation. This tight mathematical relationship between log odds, the sigmoid function, and probability is central to how logistic regression operates.
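The relationship can be sketched for a hypothetical one-feature model (the coefficients `b0` and `b1` below are illustrative values, not fitted ones):

```python
import math

def sigmoid(z: float) -> float:
    """Inverse of the log-odds transform: maps the real line back to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients: intercept b0 and feature weight b1.
b0, b1 = -1.0, 0.8
x = 2.0
z = b0 + b1 * x    # the model's prediction, in log-odds space
p = sigmoid(z)     # recovered probability in (0, 1)

# A one-unit increase in x shifts the log odds by exactly b1,
# regardless of where x starts:
delta = (b0 + b1 * (x + 1)) - z
print(delta)  # equals b1
```

The same shift in probability space is *not* constant: how much `p` moves for a one-unit change in `x` depends on where on the sigmoid curve you start, which is exactly why coefficients are read in log-odds space.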

Beyond logistic regression, log odds appear throughout machine learning and probabilistic reasoning. In Naive Bayes classifiers, classification decisions can be framed as comparing log-odds ratios derived from class-conditional likelihoods, making inference efficient and numerically stable. In information theory and model calibration, log odds provide a natural way to update beliefs — adding log-odds contributions from independent pieces of evidence corresponds to multiplying probabilities, a property that simplifies sequential Bayesian updating.
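The additive-update property can be illustrated with a toy Bayesian example (the prior and evidence strengths below are made up for illustration):

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1 - p))

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

prior = 0.5  # prior probability of the hypothesis
# Log-likelihood ratios from two independent pieces of evidence:
# the first triples the odds, the second halves them.
evidence = [math.log(3.0), math.log(0.5)]

# Adding log-odds contributions is equivalent to multiplying odds.
posterior = sigmoid(logit(prior) + sum(evidence))
print(round(posterior, 3))  # odds go 1 -> 1.5, so p = 1.5 / 2.5 = 0.6
```

Summing in log-odds space also sidesteps the numerical underflow that multiplying many small probabilities would cause, which is why Naive Bayes implementations work in log space.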

The practical importance of log odds extends to model interpretability and diagnostics. Because coefficients in logistic regression operate in log-odds space, practitioners can reason about feature importance and direction of effect without needing to evaluate the full nonlinear probability curve. Converting log-odds coefficients to odds ratios (by exponentiating them) yields another widely used interpretive quantity in clinical and social science applications. As machine learning increasingly intersects with fields demanding transparent, interpretable models, fluency with log odds remains an essential skill for practitioners.
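The coefficient-to-odds-ratio conversion is a one-liner; the coefficient value here is hypothetical:

```python
import math

# A hypothetical logistic-regression coefficient in log-odds space:
coef = 0.8
odds_ratio = math.exp(coef)
# Each one-unit increase in the predictor multiplies the odds of the
# outcome by this factor, holding other variables constant.
print(round(odds_ratio, 2))  # 2.23
```

A coefficient of 0 corresponds to an odds ratio of 1 (no effect), and negative coefficients give odds ratios below 1, so direction of effect is read directly off the sign.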

Related

Log Likelihood

The logarithm of a likelihood function, simplifying probabilistic model optimization and parameter estimation.

Generality: 838
Logits

Raw, unnormalized scores output by a neural network before probability conversion.

Generality: 700
Logistic Regression

A classification algorithm that models the probability of a binary outcome.

Generality: 838
OOMs (Orders of Magnitude)

A scale-based framework for comparing quantities using powers of ten.

Generality: 650
Regression

A supervised learning approach that predicts continuous numerical outcomes from input variables.

Generality: 909
Softmax Function

Converts a vector of real numbers into a normalized probability distribution over classes.

Generality: 796