
Envisioning is an emerging technology research institute and advisory.




NAS (Neural Architecture Search)

Automated method that discovers optimal neural network architectures without manual design.

Year: 2017
Generality: 694

Neural Architecture Search (NAS) is a subfield of automated machine learning (AutoML) that uses computational methods to discover high-performing neural network architectures automatically, replacing the traditionally manual and expertise-intensive process of network design. Rather than relying on human intuition to determine the number of layers, connection patterns, activation functions, and other structural choices, NAS treats architecture design as an optimization problem and searches systematically through a defined space of possible configurations.

NAS methods generally consist of three components: a search space defining which architectural choices are possible, a search strategy for exploring that space, and a performance estimation strategy for evaluating candidate architectures. Early approaches used reinforcement learning, training a controller network to generate architecture descriptions and rewarding it based on validation accuracy. Evolutionary algorithms offered an alternative by mutating and selecting architectures across generations. More recent gradient-based methods, such as DARTS (Differentiable Architecture Search), made the search dramatically more efficient by relaxing the discrete architecture choices into continuous parameters that can be optimized with standard backpropagation, reducing search time from thousands of GPU-hours to a handful.
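The three components above can be sketched in miniature. The following is an illustrative sketch only, using a toy search space, random sampling as the search strategy, and a mocked performance estimator (all names and the scoring formula are hypothetical); a real NAS system would train each candidate, or a cheap proxy for it, to obtain validation accuracy.

```python
import random

# Search space: which architectural choices are possible.
# Real spaces also cover connectivity, kernel sizes, and more.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 64, 256],
    "activation": ["relu", "gelu", "swish"],
}

def estimate_performance(arch):
    """Performance estimation strategy (mocked): in practice this
    trains the candidate and returns validation accuracy."""
    # Hypothetical score preferring moderate depth and larger width.
    return arch["width"] / 256 - abs(arch["depth"] - 4) / 8

def random_search(n_trials=20, seed=0):
    """Search strategy: sample architectures at random, keep the best.
    RL controllers, evolution, or gradient methods replace this loop."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        score = estimate_performance(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
```

Swapping out `random_search` for a smarter strategy, while keeping the search space and estimator fixed, is exactly the axis along which RL-based, evolutionary, and gradient-based NAS methods differ.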

NAS has produced architectures that rival or surpass hand-crafted designs on benchmarks in image classification, object detection, and natural language processing. Models like NASNet, EfficientNet, and MobileNetV3 emerged from or were informed by NAS pipelines and became widely adopted in production systems. The technique also enables hardware-aware search, where the optimization explicitly accounts for latency, memory footprint, or energy consumption on target devices such as mobile phones or edge hardware.
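Hardware-aware search folds device constraints into the objective itself. A minimal sketch of one common approach, penalizing candidates that exceed a latency budget (the weighting, budget, and candidate figures are illustrative; production systems often use learned latency predictors or per-operator lookup tables for the target device):

```python
def hardware_aware_score(accuracy, latency_ms, budget_ms=30.0, penalty=0.05):
    """Trade off predicted accuracy against latency on a target device:
    subtract a penalty proportional to any overshoot past the budget."""
    overshoot = max(0.0, latency_ms - budget_ms)
    return accuracy - penalty * overshoot

# Two hypothetical candidates from a search over a mobile target.
candidates = [
    {"name": "wide", "accuracy": 0.81, "latency_ms": 55.0},
    {"name": "slim", "accuracy": 0.78, "latency_ms": 18.0},
]
best = max(
    candidates,
    key=lambda c: hardware_aware_score(c["accuracy"], c["latency_ms"]),
)
# The slimmer model wins despite lower raw accuracy:
# wide scores 0.81 - 0.05 * 25 = -0.44, slim scores 0.78.
```

Under a plain accuracy objective the wider model would be selected; adding the latency term flips the ranking, which is how NAS steers toward architectures deployable on phones or edge hardware.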

Despite its power, NAS carries significant computational costs and risks of overfitting to benchmark datasets, prompting ongoing research into more efficient and generalizable search methods. It represents a broader trend in machine learning toward automating the design decisions that once required deep specialist knowledge, lowering barriers to building state-of-the-art models across diverse application domains.

Related

Neuroevolution
Using evolutionary algorithms to optimize neural network architectures and weights.
Generality: 581

Search Optimization
Techniques for efficiently finding optimal solutions within large, complex solution spaces.
Generality: 794

Search
Systematic exploration of a problem space to find goal-achieving solutions or action sequences.
Generality: 871

Directed Evolution
Iteratively improving models or algorithms by mimicking biological natural selection.
Generality: 571

Neural Network
A layered system of interconnected nodes that learns patterns from data.
Generality: 947

NeuMeta (Neural Metamorphosis)
A framework enabling neural networks to structurally and functionally transform across tasks without retraining.
Generality: 102