Envisioning is an emerging technology research institute and advisory.


Neuroevolution

Using evolutionary algorithms to optimize neural network architectures and weights.

Year: 1990
Generality: 581

Neuroevolution is a family of techniques that applies evolutionary algorithms to the design and training of artificial neural networks, treating network architectures and connection weights as genomes subject to selection, mutation, and recombination. Rather than relying on gradient-based optimization like backpropagation, neuroevolution maintains a population of candidate networks, evaluates each on a target task, and iteratively selects the best performers to produce successive generations of improved solutions. This population-based search can simultaneously optimize both the structural topology of a network and its numerical parameters, a capability that distinguishes it from conventional training methods.

The mechanics of neuroevolution vary across algorithms, but the core loop involves fitness evaluation, selection pressure, and genetic operators. Crossover combines structural or weight information from two parent networks, while mutation introduces random perturbations to weights, connections, or layer configurations. A landmark advance came with Kenneth Stanley and Risto Miikkulainen's NEAT algorithm (2002), which solved the competing conventions problem in network crossover by tracking gene history and allowing topological complexity to grow incrementally from minimal starting structures. Later approaches such as HyperNEAT, novelty search, and quality-diversity algorithms have extended these ideas considerably.
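
The core loop described above can be sketched in a few lines of Python. Everything here is illustrative rather than canonical: the fixed 2-2-1 network, the XOR task, truncation selection, and the hyperparameters are arbitrary choices for a minimal weight-evolution example (NEAT-style topology evolution is deliberately omitted).

```python
import math
import random

random.seed(0)

# XOR with targets in {-1, 1}, matched to the tanh output unit below.
DATA = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]

N_WEIGHTS = 9  # 2-2-1 topology: 2 hidden neurons (3 params each) + 1 output neuron

def forward(genome, x):
    """Run a fixed-topology 2-2-1 tanh network encoded as a flat weight genome."""
    h = [math.tanh(genome[3 * i] * x[0] + genome[3 * i + 1] * x[1] + genome[3 * i + 2])
         for i in range(2)]
    return math.tanh(genome[6] * h[0] + genome[7] * h[1] + genome[8])

def fitness(genome):
    """Higher is better: negative squared error over the task."""
    return -sum((forward(genome, x) - y) ** 2 for x, y in DATA)

def crossover(a, b):
    """Uniform crossover: each gene is copied from one parent at random."""
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genome, sigma=0.4):
    """Gaussian perturbation of every weight."""
    return [w + random.gauss(0, sigma) for w in genome]

# Core generational loop: evaluate, select the fittest, recombine and mutate.
pop = [[random.gauss(0, 1) for _ in range(N_WEIGHTS)] for _ in range(60)]
for generation in range(500):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:15]  # truncation selection; elites survive unchanged
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(45)]

best = max(pop, key=fitness)  # approaches 0.0 as the network learns XOR
```

Because the fittest parents are carried over unchanged (elitism), the best fitness in the population never decreases across generations, even though every offspring is mutated.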

Neuroevolution is especially valuable in reinforcement learning settings where reward signals are sparse, delayed, or non-differentiable, making gradient-based methods unreliable or inapplicable. It also offers a natural mechanism for architecture search, discovering unconventional network topologies that human designers or gradient-based neural architecture search might overlook. OpenAI and other research groups demonstrated in the late 2010s that simple evolution strategies could rival gradient-based deep reinforcement learning on benchmarks such as Atari games and MuJoCo continuous control, renewing broad interest in the field.
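
The evolution strategies mentioned above can be sketched as follows: instead of maintaining a diverse population, a single parameter vector is nudged along a reward-weighted average of random perturbations, in the spirit of OpenAI's 2017 ES work. The toy black-box objective, dimensionality, and hyperparameters below are invented for illustration; note the loop never differentiates the reward, it only queries values.

```python
import random

random.seed(1)

# Black-box reward: negative L1 distance to a hidden target vector.
# Non-differentiable at the optimum, but ES only ever evaluates it.
TARGET = [1.0, -2.0, 0.5, 3.0]

def reward(theta):
    return -sum(abs(t - g) for t, g in zip(theta, TARGET))

dim, sigma, alpha, n = 4, 0.1, 0.05, 50  # illustrative hyperparameters
theta = [0.0] * dim

for step in range(300):
    # Sample Gaussian perturbations and score each perturbed parameter vector.
    eps = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    rewards = [reward([t + sigma * e for t, e in zip(theta, ei)]) for ei in eps]
    # Standardize rewards, then step along the reward-weighted noise directions.
    mean = sum(rewards) / n
    std = (sum((r - mean) ** 2 for r in rewards) / n) ** 0.5 or 1.0
    adv = [(r - mean) / std for r in rewards]
    for j in range(dim):
        theta[j] += alpha / (n * sigma) * sum(a * e[j] for a, e in zip(adv, eps))
```

After a few hundred iterations theta drifts close to the hidden target even though no gradient of the reward was ever computed, which is what makes this family of methods attractive for sparse or non-differentiable objectives.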

Beyond performance, neuroevolution offers practical advantages in parallelization, since fitness evaluations across a population are independent and can be distributed across many processors. It also avoids vanishing gradient problems and does not require differentiable loss functions, broadening the range of tasks it can address. As hardware and algorithmic efficiency improve, neuroevolution continues to serve as both a competitive optimization strategy and a tool for studying how complex neural structures can emerge through open-ended search processes.
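
Since each fitness evaluation depends only on its own genome, a population can be scored concurrently, as the paragraph above notes. A minimal sketch using Python's standard thread pool, with a stand-in fitness function; real deployments would typically use process pools or a distributed cluster, since threads share one interpreter:

```python
from concurrent.futures import ThreadPoolExecutor

def fitness(genome):
    # Stand-in objective; in practice this is the expensive simulation step.
    return -sum(w * w for w in genome)

population = [[i * 0.1, 1.0 - i * 0.1] for i in range(8)]

# Each evaluation reads only its own genome, so the map can run concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(fitness, population))
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` (or a cluster scheduler) changes nothing in the calling code, which is precisely the "embarrassingly parallel" property the paragraph describes.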

Related

Evolutionary Algorithm

Optimization methods that evolve populations of candidate solutions through selection, crossover, and mutation.

Generality: 796
Directed Evolution

Iteratively improving models or algorithms by mimicking biological natural selection.

Generality: 571
Neurogenesis

The biological formation of new neurons, inspiring adaptive neural network architectures.

Generality: 322
NAS (Neural Architecture Search)

Automated method that discovers optimal neural network architectures without manual design.

Generality: 694
NeuMeta (Neural Metamorphosis)

A framework enabling neural networks to structurally and functionally transform across tasks without retraining.

Generality: 102
Neurode

A simplified computational unit modeling a biological neuron within artificial neural networks.

Generality: 694