Envisioning is an emerging technology research institute and advisory.



Analog In-Memory Compute Chips

Chips that compute directly in memory arrays, bypassing data transfer bottlenecks for AI workloads
From Envisioning's Wintermute report.

Analog in-memory compute chips perform calculations directly within the memory arrays where data is stored, eliminating the von Neumann bottleneck in which processors must constantly fetch data from separate memory. These chips use analog circuits to perform matrix operations in place, exploiting the physical properties of memory cells (such as resistance or capacitance) to compute results, which dramatically reduces energy consumption and latency compared to traditional digital processors.
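The in-place matrix operation described above can be illustrated with a toy model of a resistive crossbar: weights are stored as cell conductances, inputs are applied as voltages, and by Ohm's and Kirchhoff's laws each output line's current is physically the dot product of conductances and voltages. This is an illustrative sketch under assumed conductance ranges, not any vendor's actual design; the function and parameter names (`crossbar_mvm`, `g_min`, `g_max`) are hypothetical.

```python
import numpy as np

def crossbar_mvm(weights, inputs, g_min=1e-6, g_max=1e-4):
    """Toy analog crossbar matrix-vector multiply.

    Each weight (assumed in [0, 1]) is mapped to a cell conductance in
    [g_min, g_max] siemens. Inputs are applied as voltages; by Ohm's law
    each cell passes I = G * V, and Kirchhoff's current law sums the
    currents on each output line, so the physics itself computes the
    dot products. Real designs typically use pairs of cells to represent
    signed weights; that detail is omitted here.
    """
    w = np.asarray(weights, dtype=float)
    # Map weights linearly onto the available conductance range.
    g = g_min + w * (g_max - g_min)
    # Summed current per output line = matrix-vector product.
    return g @ np.asarray(inputs, dtype=float)

weights = np.array([[0.2, 0.8], [0.5, 0.5]])
voltages = np.array([1.0, 0.5])
print(crossbar_mvm(weights, voltages))  # currents in amperes, one per output line
```

The key point the sketch captures is that no data ever moves to a separate arithmetic unit: the multiply-accumulate happens wherever the weights already sit.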

This innovation addresses the massive energy consumption of AI systems, where data movement between processors and memory accounts for a significant portion of power usage. By computing where data resides, analog in-memory chips can achieve orders-of-magnitude improvements in energy efficiency for AI workloads, particularly transformer models and continual learning tasks. Companies such as Mythic and Syntiant, along with various research institutions, are developing these technologies, with some chips already deployed in edge devices.

The technology is particularly valuable for edge AI applications where power constraints are critical, such as mobile devices, IoT sensors, and autonomous systems. As AI becomes more pervasive and energy efficiency becomes a competitive advantage, analog in-memory computing offers a pathway to deploying sophisticated AI capabilities in power-constrained environments. However, the technology faces challenges including precision limitations, manufacturing variability, and the need for specialized design tools and algorithms optimized for analog computation.
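The precision limitations and manufacturing variability mentioned above can be sketched numerically: quantize weights to the few discrete conductance levels a cell can reliably hold, perturb them with device-to-device variation, and measure the error against an ideal digital multiply. The specific figures here (4-bit levels, 5% multiplicative variation) are illustrative assumptions, not measured device data.

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_mvm(weights, inputs, bits=4, variability=0.05):
    """Ideal matrix-vector multiply degraded by toy analog non-idealities."""
    levels = 2 ** bits - 1
    # Quantization: a cell can only hold a limited number of conductance levels.
    q = np.round(weights * levels) / levels
    # Device-to-device variation: each cell's conductance is off by a
    # random multiplicative factor.
    noisy = q * rng.normal(1.0, variability, size=weights.shape)
    return noisy @ inputs

w = rng.uniform(0, 1, size=(64, 64))
x = rng.uniform(0, 1, size=64)
ideal = w @ x
approx = analog_mvm(w, x)
rel_err = np.linalg.norm(approx - ideal) / np.linalg.norm(ideal)
print(f"relative error: {rel_err:.3%}")
```

Per-cell errors partially cancel when many cells sum onto one output line, which is one reason analog accumulation tolerates noisy devices better than naive intuition suggests; it is also why specialized design tools and noise-aware training algorithms are needed.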

TRL: 5/9 (Validated)
Impact: 4/5
Investment: 4/5
Category: Hardware

Related Organizations

IBM Research (United States · Company) · 95% · Developer
Develops analog AI chips that perform in-memory computation using phase-change memory devices.

Mythic (United States · Startup) · 95% · Developer
Develops Analog Matrix Processors (AMP) that perform compute-in-memory using flash memory cells.

Axelera AI (Netherlands · Startup) · 90% · Developer
Designs the Metis AI platform based on in-memory computing for computer vision at the edge.

EnCharge AI (United States · Startup) · 90% · Developer
Develops in-memory computing technology for AI based on charge-based analog computation.

Rain AI (United States · Startup) · 90% · Developer
Builds analog neuromorphic hardware using memristive nanowire networks for training and inference.

TetraMem (United States · Startup) · 90% · Developer
Develops analog in-memory computing accelerators using proprietary memristor crossbar technology for edge AI.

Aspinity (United States · Startup) · 85% · Developer
Develops analog machine-learning chips for always-on sensing applications.

Blumind (Canada · Startup) · 85% · Developer
Designs analog semiconductor architectures for edge AI using standard CMOS manufacturing.

Samsung Electronics (South Korea · Company) · 85% · Developer
Global electronics leader that has demonstrated in-memory computing with MRAM arrays.

Taiwan Semiconductor Manufacturing Company (TSMC) (Taiwan · Company) · 80% · Developer
Global semiconductor foundry providing the advanced manufacturing and packaging processes that analog compute chip designers rely on.

Supporting Evidence

Evidence data is not available for this technology yet.

Connections

In-Memory Computing Chips (Hardware) · TRL 6/9 · Impact 5/5 · Investment 5/5
Chips that compute directly in memory arrays, eliminating data transfer overhead.

Analog AI Accelerators (Hardware) · TRL 5/9 · Impact 4/5 · Investment 4/5
Hardware that uses continuous physical signals to run neural networks with far less power than digital chips.

Edge Neuromorphic Processors (Hardware) · TRL 4/9 · Impact 4/5 · Investment 4/5
Brain-inspired chips running spiking neural networks at milliwatt power for always-on edge AI.

Memristor Crossbar Arrays (Hardware) · TRL 5/9 · Impact 4/5 · Investment 4/5
Programmable resistive grids that compute neural network operations directly in memory.

Cryogenic AI Processors (Hardware) · TRL 4/9 · Impact 4/5 · Investment 4/5
AI chips cooled to near absolute zero for ultra-fast, extremely low-power computation.

3D-Stacked Neuromorphic Architectures (Hardware) · TRL 3/9 · Impact 5/5 · Investment 3/5
Vertically stacked chips mimicking brain connectivity for spiking neural networks.
