Envisioning is an emerging technology research institute and advisory.


Regression

A supervised learning approach that predicts continuous numerical outcomes from input variables.

Year: 1950 · Generality: 909

Regression is a class of supervised machine learning methods that model the relationship between one or more input features and a continuous target variable. Unlike classification, which assigns inputs to discrete categories, regression produces real-valued outputs — making it the natural choice for tasks like predicting house prices, estimating energy consumption, or forecasting demand. The learned model captures how changes in input variables correspond to changes in the output, enabling predictions on new, unseen data.

The most foundational form is linear regression, which fits a weighted sum of input features to minimize prediction error — typically measured as mean squared error. The optimal weights are found analytically via the normal equations or iteratively through gradient descent. More expressive variants include polynomial regression, ridge and lasso regression (which add regularization to prevent overfitting), and logistic regression (which, despite its name, is used for classification by modeling class probabilities). Nonlinear regression methods, including regression trees, support vector regression, and neural networks, can capture complex, high-dimensional relationships that linear models cannot.
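The closed-form solution described above can be sketched in a few lines of NumPy. This is a minimal illustration on synthetic data (the coefficients 3.0 and 2.0 and the noise level are arbitrary, not from the text): a bias column is appended to the feature matrix and the normal equations are solved directly.

```python
import numpy as np

# Synthetic data from y = 3x + 2 plus Gaussian noise (illustrative values only)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.5, size=100)

# Prepend a bias column, then solve the normal equations: w = (XᵀX)⁻¹ Xᵀy
Xb = np.hstack([np.ones((X.shape[0], 1)), X])
w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

print(w)  # intercept and slope, close to the true values [2.0, 3.0]
```

Solving the linear system with `np.linalg.solve` is preferred to explicitly inverting XᵀX, since it is cheaper and numerically more stable; for larger or ill-conditioned problems, gradient descent or a QR/SVD-based solver would be used instead.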

Regression sits at the core of machine learning practice. Nearly every neural network trained on a continuous output — whether predicting protein structure energies, stock returns, or sensor readings — is solving a regression problem. Evaluation metrics such as mean absolute error (MAE), root mean squared error (RMSE), and R² score provide standardized ways to assess model quality and compare approaches.
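The three evaluation metrics mentioned above are straightforward to compute from residuals. A short sketch with hypothetical true values and predictions (the numbers are illustrative, not from any real model):

```python
import numpy as np

# Hypothetical ground-truth targets and model predictions (illustrative only)
y_true = np.array([3.0, 5.0, 7.5, 10.0])
y_pred = np.array([2.5, 5.5, 7.0, 11.0])

residuals = y_true - y_pred
mae = np.mean(np.abs(residuals))            # mean absolute error
rmse = np.sqrt(np.mean(residuals ** 2))     # root mean squared error

ss_res = np.sum(residuals ** 2)             # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                  # R² (coefficient of determination)

print(mae, rmse, r2)
```

Note the different sensitivities: RMSE squares residuals before averaging, so the single 1.0 error dominates it more than it does MAE, while R² expresses error relative to the variance of the targets (1.0 is a perfect fit, 0.0 matches predicting the mean).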

The technique's power lies in its interpretability and versatility. Simple linear models offer transparent, auditable predictions that are critical in regulated industries like healthcare and finance. More complex regression models, meanwhile, achieve state-of-the-art accuracy on challenging benchmarks. Understanding regression — its assumptions, failure modes, and regularization strategies — remains one of the most essential competencies in applied machine learning, forming the conceptual backbone from which more advanced methods are built.
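The regularization strategies mentioned above can be made concrete with ridge regression, which has the closed form w = (XᵀX + λI)⁻¹ Xᵀy. The sketch below uses synthetic data with known true weights (4.0, −2.0, 0.0 — an assumption for illustration) to show how increasing λ shrinks all coefficients toward zero:

```python
import numpy as np

# Synthetic data with known true weights [4, -2, 0] (illustrative assumption)
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([4.0, -2.0, 0.0]) + rng.normal(0, 0.1, size=50)

def ridge(X, y, lam):
    """Ridge regression via its closed form: w = (XᵀX + λI)⁻¹ Xᵀy."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

for lam in (0.0, 1.0, 100.0):
    w = ridge(X, y, lam)
    print(lam, np.round(w, 3))  # larger λ pulls every weight toward zero
```

At λ = 0 this reduces to ordinary least squares and recovers the true weights almost exactly; at λ = 100 the coefficients are visibly shrunk. This trade-off — accepting a little bias in exchange for lower variance — is what prevents overfitting when features are correlated or scarce.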

Related

Least Squares Regression

A method that fits models to data by minimizing squared prediction errors.

Generality: 875
Logistic Regression

A classification algorithm that models the probability of a binary outcome.

Generality: 838
Supervised Learning

Training models on labeled input-output pairs to predict or classify new data.

Generality: 900
Meta-Regressor

An ensemble model that learns from base regressors' predictions to produce a final output.

Generality: 451
Prediction

Using learned patterns from data to estimate unknown or future outcomes.

Generality: 964
RMSE (Root Mean Squared Error)

A regression metric that penalizes large prediction errors by squaring residuals before averaging.

Generality: 796