Envisioning is an emerging technology research institute and advisory.

2011 — 2026


FSL (Few-Shot Learning)

Training models to generalize accurately from only a handful of labeled examples.

Year: 2016 · Generality: 710

Few-Shot Learning (FSL) is a machine learning paradigm in which models are trained to recognize patterns and make accurate predictions from an extremely small number of labeled examples — typically one to five per class. This stands in sharp contrast to conventional deep learning, which often requires thousands or millions of labeled samples to achieve strong performance. FSL is especially valuable in domains where data collection is expensive, time-consuming, or ethically constrained, such as rare disease diagnosis, drug discovery, and low-resource language processing.

The core challenge of FSL is overcoming the severe risk of overfitting when so little training data is available. Researchers have addressed this through several complementary strategies. Meta-learning, or "learning to learn," trains a model across many related tasks so it can rapidly adapt to new ones with minimal examples. Metric-based approaches, such as Prototypical Networks and Siamese Networks, learn embedding spaces where examples from the same class cluster tightly together, enabling classification by proximity. Transfer learning is another common strategy, where a model pre-trained on a large dataset is fine-tuned on the few available target examples, leveraging previously acquired representations.
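The metric-based idea above can be illustrated with a minimal prototypical-classification sketch: each class prototype is the mean of its support embeddings, and a query is assigned to the nearest prototype. The embeddings and episode here are toy values standing in for the output of a pre-trained encoder; function names are illustrative, not from any particular library.

```python
import numpy as np

def prototypes(support_embeddings, support_labels, n_classes):
    """Compute one prototype per class: the mean of its support embeddings."""
    return np.stack([
        support_embeddings[support_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])

def classify(query_embedding, protos):
    """Assign the query to the class with the nearest (Euclidean) prototype."""
    dists = np.linalg.norm(protos - query_embedding, axis=1)
    return int(np.argmin(dists))

# Toy 3-way 2-shot episode with 2-D embeddings (hard-coded for
# illustration; in practice these come from a learned encoder).
support = np.array([[0.0, 0.1], [0.1, 0.0],   # class 0
                    [1.0, 1.1], [1.1, 1.0],   # class 1
                    [2.0, 0.0], [2.1, 0.1]])  # class 2
labels = np.array([0, 0, 1, 1, 2, 2])
protos = prototypes(support, labels, n_classes=3)
print(classify(np.array([1.05, 1.05]), protos))  # nearest prototype: class 1
```

Because classification reduces to nearest-prototype lookup in the embedding space, no per-class parameters need to be fit from the few examples, which is precisely what limits overfitting.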

A standard evaluation framework for FSL is the N-way K-shot task, where a model must distinguish between N classes given only K labeled examples per class. Benchmark datasets like Omniglot and miniImageNet have driven progress by providing standardized test beds for comparing approaches. More recently, large pre-trained models — including vision transformers and large language models — have demonstrated strong few-shot capabilities simply through in-context learning, where examples are provided directly in the input prompt without any gradient updates.
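The N-way K-shot setup can be made concrete with a small episode sampler: draw N classes, give the model K labeled support examples per class, and hold out further examples as queries. The dictionary-of-lists dataset structure and the function name below are assumptions for illustration, not the format of any specific benchmark.

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, n_query=5, rng=None):
    """Sample one N-way K-shot episode from {class_name: [examples]}.

    Returns a support set of K labeled examples per class and a query
    set the model must classify using only that support set.
    """
    rng = rng or random.Random()
    classes = rng.sample(sorted(dataset), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        examples = rng.sample(dataset[cls], k_shot + n_query)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query

# Toy dataset: 6 classes, 10 examples each (placeholder strings).
data = {f"class_{i}": [f"img_{i}_{j}" for j in range(10)] for i in range(6)}
support, query = sample_episode(data, n_way=5, k_shot=1, n_query=5,
                                rng=random.Random(0))
print(len(support), len(query))  # 5 support examples, 25 queries
```

Meta-learning methods train by repeating this sampling over many episodes, so that adapting to a genuinely new set of classes at test time looks just like one more episode.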

FSL matters because real-world machine learning deployments frequently face data scarcity. The ability to generalize from limited supervision brings AI systems closer to human-like learning efficiency and dramatically expands the range of problems that can be tackled without massive annotation efforts. As data labeling costs remain high and privacy constraints tighten, few-shot learning continues to grow in practical and research importance.

Related

Few-Shot Learning
Training ML models to generalize accurately from only a handful of labeled examples.
Generality: 759

One-Shot Learning
A learning paradigm where models generalize effectively from a single training example per class.
Generality: 694

Zero-Shot Learning (ZSL)
A technique enabling models to recognize concepts never encountered during training.
Generality: 620

Data-Efficient Learning
Machine learning approaches that achieve strong performance with minimal training data.
Generality: 752

Sample Efficiency
How well a model learns from limited training data to achieve strong performance.
Generality: 710

Meta-Learning
A paradigm enabling models to learn how to learn across tasks efficiently.
Generality: 756