Envisioning is an emerging technology research institute and advisory.

Support Vector Machine (SVM)

A supervised learning model that classifies data by finding the optimal separating hyperplane.

Year: 1995 · Generality: 720

A Support Vector Machine is a supervised learning algorithm that solves classification and regression problems by identifying the hyperplane that maximally separates data points belonging to different classes. The core insight is margin maximization: rather than finding any boundary that separates classes, an SVM finds the one with the greatest distance to the nearest data points on either side — those boundary-defining points are called support vectors. This maximum-margin approach gives SVMs strong generalization properties, reducing the risk of overfitting even in high-dimensional feature spaces.
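The margin-maximization idea can be made concrete with a minimal sketch. The snippet below trains a linear classifier by sub-gradient descent on the soft-margin hinge-loss objective and then reads off the support vectors as the points lying on or inside the margin. The toy data, learning rate, and iteration count are illustrative assumptions; production SVM solvers use dedicated quadratic-programming routines rather than this plain NumPy loop.

```python
import numpy as np

# Toy 2-D data: two well-separated clusters, labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),
               rng.normal(+2.0, 0.5, (20, 2))])
y = np.array([-1] * 20 + [+1] * 20)

# Sub-gradient descent on the soft-margin objective:
#   min  0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (w . x_i + b))
w = np.zeros(2)
b = 0.0
C, lr = 1.0, 0.01
for _ in range(2000):
    margins = y * (X @ w + b)
    viol = margins < 1                       # points inside the margin
    grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
    grad_b = -C * y[viol].sum()
    w -= lr * grad_w
    b -= lr * grad_b

# Support vectors: the boundary-defining points with margin <= 1.
support = np.where(y * (X @ w + b) <= 1 + 1e-3)[0]
print("weights:", w, "bias:", b)
print("support vector indices:", support)
```

Note that only the points in `support` determine the final decision boundary; moving any other training point (without crossing the margin) leaves the solution unchanged, which is the sparsity property described above.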

A critical innovation that expanded SVMs from linear to nonlinear problems is the kernel trick. By implicitly mapping input data into a higher-dimensional feature space using kernel functions — such as polynomial, radial basis function (RBF), or sigmoid kernels — SVMs can find linear separating hyperplanes in that transformed space, which correspond to complex nonlinear boundaries in the original input space. This allows a single algorithmic framework to handle a wide variety of data geometries without explicitly computing expensive high-dimensional transformations. The soft-margin extension further improved practical applicability by allowing some misclassifications, controlled by a regularization parameter, making SVMs robust to noisy or overlapping class distributions.
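The kernel trick can be verified directly for a small case. The sketch below uses the degree-2 polynomial kernel on XOR-style data (which has no linear separator in the original 2-D space): the kernel matrix, computed with no explicit mapping, exactly matches the inner products under the corresponding feature map, and a linear hyperplane in that mapped space separates the classes. The weight vector is a hand-picked illustration, not a fitted SVM solution.

```python
import numpy as np

# XOR-style labels: not linearly separable in the original 2-D space.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, +1, +1, -1])

# Degree-2 polynomial kernel, computed without any explicit mapping.
K = (X @ X.T + 1.0) ** 2

# The equivalent explicit feature map for this kernel:
#   phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2, sqrt(2)*x1, sqrt(2)*x2, 1)
def phi(X):
    s = np.sqrt(2.0)
    x1, x2 = X[:, 0], X[:, 1]
    return np.stack([x1**2, x2**2, s * x1 * x2, s * x1, s * x2,
                     np.ones(len(X))], axis=1)

# Inner products in the mapped space reproduce the kernel exactly,
# which is what lets an SVM operate in that space implicitly.
assert np.allclose(K, phi(X) @ phi(X).T)

# In the mapped space a linear hyperplane separates XOR; this weight
# vector leans on the x1*x2 cross term (hand-picked for illustration).
w = np.array([0., 0., -4 / np.sqrt(2), 1 / np.sqrt(2),
              1 / np.sqrt(2), -0.5])
print(np.sign(phi(X) @ w))   # matches y
```

The same pattern holds for the RBF kernel, whose implicit feature space is infinite-dimensional; the kernel matrix remains cheap to compute even though the mapping could never be materialized.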

SVMs became a dominant machine learning method through the 1990s and 2000s, particularly excelling in text classification, image recognition, and bioinformatics tasks where data is high-dimensional but training sets are relatively small. They offer strong theoretical guarantees rooted in statistical learning theory and produce sparse, interpretable models defined entirely by their support vectors. While deep learning has displaced SVMs in many large-scale applications, they remain highly competitive in low-to-medium data regimes and are valued for their mathematical rigor, predictable behavior, and effectiveness without extensive hyperparameter tuning.

Related

Kernel Method

Algorithms that implicitly map data into high-dimensional spaces using kernel functions.

Generality: 796
Margin

The distance between a decision boundary and the nearest data points of each class.

Generality: 774
Supervised Classifier

A model trained on labeled data to predict categories for new, unseen inputs.

Generality: 750
Hyperplane

A flat subspace of one fewer dimension than its ambient space, used to separate data classes.

Generality: 792
Supervised Learning

Training models on labeled input-output pairs to predict or classify new data.

Generality: 900
Discriminative AI

Models that learn decision boundaries between classes rather than modeling data distributions.

Generality: 781