
Hyperdimensional Computing

A computing paradigm using high-dimensional random vectors to represent and process information robustly.

Year: 2009 · Generality: 339

Hyperdimensional computing (HDC) is a computational framework that represents data as dense, high-dimensional binary or bipolar vectors — typically with 10,000 or more dimensions — called hypervectors. These hypervectors are generated randomly but exploit the mathematical properties of high-dimensional spaces, where random vectors are nearly orthogonal to one another with overwhelming probability. This near-orthogonality provides a natural mechanism for distinguishing between distinct concepts while still allowing meaningful similarity comparisons, forming the foundation for encoding, storing, and retrieving information in a noise-tolerant way.
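A quick numerical check makes the near-orthogonality claim concrete. The sketch below (assuming NumPy; the 10,000-dimension size matches the figure above, and the 10% corruption rate is an illustrative choice) draws two independent random bipolar hypervectors, confirms their cosine similarity is near zero, and shows that a noisy copy of a vector remains easily recognizable:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

# Two independently drawn random bipolar (+1/-1) hypervectors
a = rng.choice([-1, 1], size=D)
b = rng.choice([-1, 1], size=D)

def cosine(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

print(cosine(a, b))  # ~0.0: unrelated hypervectors are nearly orthogonal
print(cosine(a, a))  # 1.0: identical concepts are maximally similar

# Corrupt 10% of a's entries; the result is still unmistakably "a"
noisy = a.copy()
flip = rng.choice(D, size=D // 10, replace=False)
noisy[flip] *= -1
print(cosine(a, noisy))  # ~0.8: noise-tolerant recognition
```

At D = 10,000 the cosine similarity of two independent hypervectors concentrates around zero with standard deviation roughly 1/√D ≈ 0.01, which is why accidental collisions between unrelated concepts are vanishingly rare.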

The core operations in HDC are simple and hardware-friendly: binding (element-wise multiplication or XOR) combines two hypervectors into a new one representing their association, bundling (element-wise addition or majority vote) creates a superposition of multiple hypervectors, and permutation (cyclic shifting) encodes sequential or positional information. These operations are closed over the hypervector space, meaning the results remain valid hypervectors. By composing these primitives, HDC systems can encode complex structured data — sequences, graphs, sets — into single fixed-width vectors that support fast associative lookup via similarity search against a stored item memory.
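As a rough illustration of how these primitives compose, the following sketch (again assuming NumPy; the record fields, symbol names, and item-memory contents are hypothetical) encodes a small key-value record into a single hypervector and recovers one field by unbinding followed by a similarity search:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

def hv():
    # A fresh random bipolar hypervector for a new symbol
    return rng.choice([-1, 1], size=D)

def bind(x, y):
    return x * y                         # element-wise multiply: association

def bundle(*vs):
    return np.sign(np.sum(vs, axis=0))   # majority vote: superposition

def permute(x, k=1):
    return np.roll(x, k)                 # cyclic shift: sequence position

# Encode {color: red, shape: square, size: big} as one fixed-width vector.
# (Three bound pairs keep the majority vote free of ties.)
color, red, shape, square, size, big = (hv() for _ in range(6))
record = bundle(bind(color, red), bind(shape, square), bind(size, big))

# Query "what is the color?". For bipolar vectors binding is its own
# inverse (x * x is all ones), so record * color is a noisy copy of red.
guess = bind(record, color)
item_memory = {"red": red, "square": square, "big": big}
print(max(item_memory, key=lambda name: item_memory[name] @ guess))  # "red"

# Sequences: permute each element by its position before bundling,
# so the same items in a different order yield a dissimilar hypervector.
a, b = hv(), hv()
seq_ab = bundle(permute(a, 1), permute(b, 0))
seq_ba = bundle(permute(b, 1), permute(a, 0))
print(seq_ab @ seq_ba / D)  # ~0: order matters
```

Because bundling is a majority vote, unbinding returns only an approximation of `red`, but in 10,000 dimensions that approximation is far more similar to `red` than to anything else in the item memory.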

In machine learning contexts, HDC has attracted interest as a lightweight alternative to deep neural networks for classification and pattern recognition tasks, particularly in resource-constrained environments like edge devices and neuromorphic hardware. Because training often reduces to a single pass of vector accumulation rather than iterative gradient descent, HDC models can learn incrementally and adapt quickly to new classes without catastrophic forgetting. This makes them appealing for continual learning and few-shot learning scenarios where conventional deep learning struggles or is computationally prohibitive.
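A minimal sketch of such a one-pass learner, under toy assumptions (NumPy, a simple random-projection encoder, and synthetic Gaussian blobs standing in for real sensor features; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
D, F = 10_000, 64  # hypervector dimensionality, number of input features

# One fixed random bipolar hypervector per input feature
feature_hvs = rng.choice([-1, 1], size=(F, D))

def encode(x):
    # Project real-valued features into hypervector space and binarize
    return np.sign(x @ feature_hvs)

# Synthetic stand-in data: two Gaussian blobs, one per class
n = 200
X0 = rng.normal(-1.0, 1.0, size=(n, F))
X1 = rng.normal(+1.0, 1.0, size=(n, F))

# "Training" is a single pass of accumulation: bundle each class's encoded
# examples into a prototype hypervector. No gradients, no epochs.
prototypes = np.stack([encode(X0).sum(axis=0), encode(X1).sum(axis=0)])

def classify(x):
    # Predict the class whose prototype is most similar to the encoding of x
    scores = prototypes @ encode(x) / np.linalg.norm(prototypes, axis=1)
    return int(np.argmax(scores))

print(classify(rng.normal(+1.0, 1.0, size=F)))  # expected: 1
print(classify(rng.normal(-1.0, 1.0, size=F)))  # expected: 0
```

Adding a new class later only requires accumulating its encoded examples into a fresh prototype; existing prototypes are untouched, which is the mechanism behind the incremental-learning claim above.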

The framework draws theoretical inspiration from Pentti Kanerva's work on sparse distributed memory in the late 1980s and was formalized as a computing paradigm in Kanerva's 2009 paper on computing with high-dimensional random vectors. In the following decade it gained significant traction in the ML community as researchers demonstrated competitive accuracy on real-world benchmarks at dramatically lower energy and latency costs. Today HDC is actively explored for biosignal classification, natural language processing, robotics, and brain-inspired AI architectures, positioning it as a compelling bridge between cognitive science and practical machine learning.

Related

Dimension

The number of independent axes defining a vector space used to represent data.

Generality: 895
Hyperspherical Representation Learning

Learning data representations constrained to a hypersphere to exploit its geometric properties.

Generality: 314
HPC (High Performance Computing)

Aggregated computing infrastructure delivering processing power far beyond standard workstations.

Generality: 792
Hyperobject

Massively distributed entities that transcend localization, challenging AI systems tasked with managing vast complexity.

Generality: 293
AIMC (Analog In-Memory Computing)

A hardware paradigm that computes matrix operations directly inside analog memory arrays.

Generality: 293
DNC (Differentiable Neural Computer)

A neural network augmented with external, differentiable memory for complex reasoning tasks.

Generality: 485