Envisioning is an emerging technology research institute and advisory.

2011 — 2026


Active Learning

A training strategy where a model selectively queries the most informative unlabeled examples to learn efficiently.

Year: 1992 · Generality: 731

Active learning is a machine learning paradigm in which a model actively participates in its own training by selecting which data points it wants labeled, rather than passively consuming a pre-labeled dataset. The core motivation is practical: in many real-world domains — medical imaging, legal document analysis, scientific literature — acquiring raw data is cheap but obtaining expert annotations is expensive and slow. By strategically choosing which examples to present to a human annotator, an active learner aims to achieve high accuracy with far fewer labeled samples than standard supervised learning would require.

The selection process is driven by query strategies that estimate which unlabeled examples would be most informative if labeled. Uncertainty sampling picks examples the model is least confident about — those near a decision boundary, for instance. Query-by-committee maintains an ensemble of models and selects examples where the ensemble disagrees most. Expected model change and expected error reduction strategies choose examples that would most alter the model's parameters or most reduce generalization error, respectively. In practice, uncertainty sampling is the most widely used due to its simplicity and computational efficiency, while more sophisticated strategies are applied when the labeling budget is extremely tight.
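The most common of these strategies, least-confidence uncertainty sampling, can be sketched in a few lines. The following is a minimal illustration, assuming scikit-learn's `LogisticRegression` on synthetic data; the seed size and dataset are arbitrary choices for the example.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Start with a small labeled seed set; everything else is the unlabeled pool.
labeled = list(rng.choice(len(X), size=10, replace=False))
pool = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])

# Least-confidence uncertainty sampling: query the pool example whose
# top predicted class probability is lowest (i.e. closest to the boundary).
probs = model.predict_proba(X[pool])
uncertainty = 1.0 - probs.max(axis=1)
query = pool[int(np.argmax(uncertainty))]
print(query)  # index of the example to send to the annotator
```

Query-by-committee follows the same pattern but replaces the uncertainty score with a disagreement measure (such as vote entropy) computed across an ensemble of models.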

Active learning has become increasingly relevant as deep learning models demand massive labeled datasets that are costly to produce. It integrates naturally with semi-supervised learning and self-supervised pretraining, where a small actively-selected labeled set fine-tunes a model pretrained on abundant unlabeled data. Applications span drug discovery, autonomous driving data curation, and low-resource NLP. A persistent challenge is the cold-start problem — the model needs some initial labeled data to make meaningful queries — and the risk of introducing sampling bias if the query strategy systematically avoids certain regions of the input space.
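Putting the pieces together, the standard pool-based loop alternates between training on the labeled set, querying the most uncertain pool example, and adding the oracle's label. A minimal sketch, again assuming scikit-learn on synthetic data with an illustrative budget of 20 queries (here the true labels stand in for a human annotator):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_pool, X_test, y_pool, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1
)

# A small seed set addresses the cold-start problem described above.
rng = np.random.default_rng(1)
labeled = set(rng.choice(len(X_pool), size=10, replace=False).tolist())

for _ in range(20):  # labeling budget: 20 oracle queries
    idx = sorted(labeled)
    model = LogisticRegression(max_iter=1000).fit(X_pool[idx], y_pool[idx])
    pool = [i for i in range(len(X_pool)) if i not in labeled]
    probs = model.predict_proba(X_pool[pool])
    # Query the least-confident pool example and "ask the oracle" for its label.
    query = pool[int(np.argmin(probs.max(axis=1)))]
    labeled.add(query)

acc = model.score(X_test, y_test)
print(f"accuracy after {len(labeled)} labels: {acc:.2f}")
```

Comparing this curve against a random-sampling baseline with the same budget is the usual way to check that the query strategy is earning its keep — and to detect the sampling bias the paragraph above warns about.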

Related

Semi-Supervised Learning

Training models using both small labeled datasets and large unlabeled datasets together.

Generality: 796
Data-Efficient Learning

Machine learning approaches that achieve strong performance with minimal training data.

Generality: 752
EDL (Experimentation Driven Learning)

A learning paradigm where AI agents improve by actively experimenting within their environment.

Generality: 322
Active Inference

A framework where agents minimize prediction errors through both perception and action.

Generality: 590
Autonomous Learning

AI systems that independently adapt and improve through environmental interaction without human intervention.

Generality: 792
Eager Learning

A learning approach that builds a complete global model before any predictions are made.

Generality: 694