
Envisioning is an emerging technology research institute and advisory.


2011 — 2026


Symbolic Descent

An optimization method that searches over symbolic programs instead of tuning neural network weights.

Year: 2025 · Generality: 264

Symbolic descent is a proposed optimization technique that replaces gradient descent's continuous parameter tuning with discrete search through the space of symbolic programs. Where gradient descent fits a parametric curve to data by iteratively adjusting weights, symbolic descent seeks the simplest possible symbolic expression — a compact program — that explains the observed input-output relationship. The term was introduced by François Chollet's lab Ndea in the context of building alternatives to deep learning from first principles.
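The gradient-descent side of this contrast can be made concrete. The sketch below fits a fixed parametric form y = w*x + b by iteratively adjusting its two weights; the toy data, learning rate, and iteration count are illustrative assumptions, not from the source:

```python
# Toy gradient descent: the model's *form* (w*x + b) is fixed in advance,
# and learning only adjusts the continuous parameters w and b.
# Data generated by the hidden rule y = 3x + 1.
data = [(0.0, 1.0), (1.0, 4.0), (2.0, 7.0), (3.0, 10.0)]

w, b = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    # Gradients of mean squared error with respect to w and b.
    gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * gw
    b -= lr * gb

print(w, b)  # converges toward w = 3, b = 1
```

The point of the contrast: the output here is a pair of numbers inside a pre-chosen template, whereas symbolic descent would return the expression itself.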

The core motivation is a fundamental limitation of parametric models: they approximate functions by fitting enormous numbers of parameters, producing representations that are large, opaque, and brittle outside their training distribution. Symbolic descent instead aims to find minimal-length programs that capture the underlying structure of data. Because these programs are symbolic rather than numerical, traditional gradient-based optimization cannot be applied directly. The method requires new search algorithms that navigate discrete program spaces efficiently — a challenge closer to program synthesis than to conventional machine learning.
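To illustrate what discrete search over a program space looks like, the following toy sketch enumerates small expression trees and keeps the shortest one that exactly reproduces the data. The grammar, data, and brute-force enumeration strategy are invented for illustration and are not any published algorithm:

```python
# Illustrative only: a toy enumerative search over symbolic expressions,
# preferring the shortest program that exactly fits the data.
import itertools

# Data generated by the hidden rule y = x*x + 1.
data = [(0, 1), (1, 2), (2, 5), (3, 10)]

# Grammar: an expression is the variable "x", an integer constant,
# or a tuple (op, left, right) over the operators + and *.
LEAVES = ["x", 0, 1, 2]
OPS = ["+", "*"]

def evaluate(expr, x):
    if expr == "x":
        return x
    if isinstance(expr, int):
        return expr
    op, a, b = expr
    va, vb = evaluate(a, x), evaluate(b, x)
    return va + vb if op == "+" else va * vb

def size(expr):
    """Program length: node count of the expression tree."""
    if not isinstance(expr, tuple):
        return 1
    return 1 + size(expr[1]) + size(expr[2])

def expressions(depth):
    """Enumerate all expressions up to the given tree depth."""
    if depth == 0:
        yield from LEAVES
        return
    yield from expressions(depth - 1)
    subs = list(expressions(depth - 1))
    for op, a, b in itertools.product(OPS, subs, subs):
        yield (op, a, b)

def symbolic_descent(data, max_depth=2):
    """Return the smallest expression consistent with every (x, y) pair."""
    best = None
    for expr in expressions(max_depth):
        if all(evaluate(expr, x) == y for x, y in data):
            if best is None or size(expr) < size(best):
                best = expr
    return best

print(symbolic_descent(data))  # a 5-node tree equivalent to x*x + 1
```

Exhaustive enumeration collapses combinatorially as the grammar grows, which is precisely why the paragraph above frames the open challenge as designing search algorithms that navigate discrete program spaces efficiently.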

The practical implications are significant if the approach scales. Symbolic models that are orders of magnitude smaller than their neural counterparts would require far less data to learn, run far more efficiently at inference time, and generalize more robustly to novel inputs because they encode structural understanding rather than statistical correlation. They would also compose more naturally, since small symbolic modules can be chained and recombined in ways that massive parameter tensors cannot. The approach represents a direct response to evidence — most dramatically from benchmarks like ARC-AGI — that scaling parametric models alone does not produce the adaptive, exploratory reasoning characteristic of general intelligence.
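The composability claim can be illustrated: symbolic programs represented as plain data structures can be chained by substitution, and the result is itself a compact, inspectable program. The tuple encoding below is an illustrative assumption:

```python
# Illustrative only: symbolic modules represented as data can be composed
# by substitution, yielding a new program that is itself inspectable.
square = ("*", "x", "x")  # a small symbolic module: x -> x*x
inc = ("+", "x", 1)       # another module: x -> x + 1

def substitute(expr, replacement):
    """Replace every occurrence of the variable "x" with another expression."""
    if expr == "x":
        return replacement
    if isinstance(expr, tuple):
        op, a, b = expr
        return (op, substitute(a, replacement), substitute(b, replacement))
    return expr

def evaluate(expr, x):
    if expr == "x":
        return x
    if isinstance(expr, int):
        return expr
    op, a, b = expr
    va, vb = evaluate(a, x), evaluate(b, x)
    return va + vb if op == "+" else va * vb

# Chaining modules: inc(square(x)) = x*x + 1, still a compact symbolic object.
composed = substitute(inc, square)
print(composed)               # ('+', ('*', 'x', 'x'), 1)
print(evaluate(composed, 3))  # 10
```

Composing two neural networks, by contrast, yields an even larger parameter tensor with no comparable human-readable structure.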

Related

Symbolic Regression

An algorithm-driven search for mathematical expressions that best fit observed data.

Generality: 550
Gradient Descent

An iterative optimization algorithm that minimizes a function by following its steepest downhill direction.

Generality: 909
Double Descent

Test error drops, rises, then drops again as model complexity increases.

Generality: 599
Symbolic Computing

An AI paradigm that manipulates human-readable symbols and logic to represent knowledge and reason.

Generality: 650
Symbolic AI

An AI paradigm that represents knowledge as explicit symbols manipulated through logical rules.

Generality: 720
Neurosymbolic AI

AI systems combining neural network learning with symbolic reasoning for human-like cognition.

Generality: 694