Envisioning is an emerging technology research institute and advisory.

Few-Shot Learning

Training ML models to generalize accurately from only a handful of labeled examples.

Year: 2016 · Generality: 759

Few-shot learning is a machine learning paradigm that enables models to recognize patterns and make accurate predictions from only a small number of labeled training examples — typically between one and five per class. This stands in sharp contrast to conventional supervised learning, which demands thousands or millions of examples to achieve reliable performance. The core challenge is bridging the gap between the richness of human learning, where a child can identify a new animal from a single picture, and the data hunger of standard deep learning systems.
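
Few-shot problems are conventionally framed as "N-way K-shot" episodes: a small labeled support set of K examples for each of N classes, plus a query set to classify. As a minimal sketch (all names here are illustrative, not from any specific library), episode sampling might look like:

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, q_queries=5):
    """Sample one N-way K-shot episode from a {class: [examples]} dict.
    Hypothetical helper for illustration only."""
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = random.sample(dataset[cls], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

# Toy dataset: 10 classes with 20 examples each.
data = {f"class{i}": [f"c{i}_ex{j}" for j in range(20)] for i in range(10)}
support, query = sample_episode(data, n_way=5, k_shot=1, q_queries=5)
```

A 5-way 1-shot episode thus gives the model just five labeled examples (one per class) to work from, which is the regime the paragraph above describes.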

The dominant approaches to few-shot learning fall into three broad families. Meta-learning (or "learning to learn") trains a model across many related tasks so it develops an inductive bias that allows rapid adaptation to new tasks with minimal data — MAML (Model-Agnostic Meta-Learning) is a canonical example. Metric-based methods such as Siamese Networks, Matching Networks, and Prototypical Networks learn an embedding space where examples from the same class cluster together, enabling classification by nearest-neighbor comparison. Transfer learning approaches fine-tune large pretrained models on small target datasets, exploiting representations already learned from massive corpora.
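
The metric-based family can be illustrated with the core of a Prototypical Network: each class prototype is the mean of its support embeddings, and queries are assigned to the nearest prototype. This sketch assumes embeddings have already been computed by some encoder; the toy 2-D vectors below stand in for real embeddings.

```python
import numpy as np

def prototypes(support_emb, support_labels, n_way):
    # Class prototype = mean embedding of that class's support examples.
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_way)])

def classify(query_emb, protos):
    # Assign each query to the nearest prototype (squared Euclidean distance).
    dists = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return dists.argmin(axis=1)

# Toy 2-way 2-shot example with hand-made 2-D "embeddings".
sup = np.array([[0.0, 0.0], [0.2, 0.0], [5.0, 5.0], [5.2, 5.0]])
lab = np.array([0, 0, 1, 1])
protos = prototypes(sup, lab, n_way=2)
preds = classify(np.array([[0.1, 0.1], [5.1, 4.9]]), protos)
# preds → [0, 1]
```

In the full method the encoder is trained end-to-end over many episodes so that this nearest-prototype rule generalizes to classes never seen in training.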

Few-shot learning gained significant traction in the mid-2010s alongside advances in meta-learning and the proliferation of large pretrained models. The introduction of benchmark datasets like Omniglot and miniImageNet gave researchers standardized evaluation grounds, accelerating progress. More recently, large language models such as GPT-3 demonstrated remarkable few-shot capabilities through in-context learning, adapting to new tasks from just a few prompt examples without any weight updates, a result that has reshaped how the community thinks about the concept.
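
In-context learning reduces few-shot adaptation to prompt construction: the "training examples" are simply placed in the model's input. A minimal sketch of such a prompt builder (the template format is illustrative; real templates vary by model and task):

```python
def few_shot_prompt(examples, query):
    """Assemble a few-shot prompt for an LLM: labeled demonstrations
    followed by the unlabeled query. No weight updates are involved."""
    lines = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

demos = [("I loved this film", "positive"),
         ("Terrible pacing and a weak script", "negative")]
prompt = few_shot_prompt(demos, "A delightful surprise from start to finish")
```

The resulting string would be sent to the model as-is; the model infers the task (here, sentiment labeling) from the two demonstrations alone.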

The practical importance of few-shot learning is substantial. In medicine, rare disease classification may yield only dozens of confirmed cases. In linguistics, thousands of languages lack sufficient digital text for standard training. In personalization, individual user data is inherently sparse. By enabling models to generalize from limited signal, few-shot learning extends the reach of AI into domains where large-scale data collection is economically, ethically, or logistically infeasible.

Related

FSL (Few-Shot Learning)
Training models to generalize accurately from only a handful of labeled examples.
Generality: 710

One-Shot Learning
A learning paradigm where models generalize effectively from a single training example per class.
Generality: 694

Meta-Learning
A paradigm enabling models to learn how to learn across tasks efficiently.
Generality: 756

Data-Efficient Learning
Machine learning approaches that achieve strong performance with minimal training data.
Generality: 752

Zero-Shot Learning (ZSL)
A technique enabling models to recognize concepts never encountered during training.
Generality: 620

Zero-shot Capability
An AI model's ability to perform unseen tasks without task-specific training examples.
Generality: 650