Envisioning is an emerging technology research institute and advisory.


Analog In-Memory Compute Chips | Wintermute | Envisioning

Analog In-Memory Compute Chips

Chips optimized for transformer attention paths and continual learning.

Related Organizations

IBM Research

US · Company

95%

Long-standing leader in analog in-memory computing research, demonstrating phase-change-memory-based AI accelerator prototypes for deep learning inference.

Developer
Mythic

US · Startup

95%

Develops Analog Matrix Processors (AMP) that perform compute-in-memory using flash memory cells.

Developer
Axelera AI

NL · Startup

90%

Designs the Metis AI platform based on in-memory computing for computer vision at the edge.

Developer

Supporting Evidence

Evidence data is not available for this technology yet.

Connections

EnCharge AI

US · Startup

90%

Developing in-memory computing technology for AI based on charge-based analog computation.

Developer
Rain AI

US · Startup

90%

Building analog neuromorphic hardware using memristive nanowire networks for training and inference.

Developer
TetraMem

US · Startup

90%

Develops analog in-memory computing accelerators using proprietary memristor crossbar technology for edge AI.

Developer
Aspinity

US · Startup

85%

Develops analog machine learning chips for always-on sensing applications.

Developer
Blumind

CA · Startup

85%

Designing analog semiconductor architecture for edge AI, utilizing standard CMOS manufacturing.

Developer
Samsung Electronics

KR · Company

85%

Global electronics leader that has demonstrated in-memory computing with MRAM-based crossbar arrays.

Developer

Taiwan Semiconductor Manufacturing Company (TSMC)

TW · Company

80%

Global semiconductor foundry leader providing the advanced manufacturing and packaging processes required for wafer-scale integration.

Developer
Hardware
In-Memory Computing Chips

Logic and memory co-located to remove von Neumann bottlenecks.

TRL
6/9
Impact
5/5
Investment
5/5
Hardware
Analog AI Accelerators

Continuous-value compute blocks delivering high TOPS per watt.

TRL
5/9
Impact
4/5
Investment
4/5
Hardware
Edge Neuromorphic Processors

Ultra-low-power spiking chips for always-on embodied agents.

TRL
4/9
Impact
4/5
Investment
4/5
Hardware
Memristor Crossbar Arrays

Resistive memory arrays executing MAC operations natively.

TRL
5/9
Impact
4/5
Investment
4/5
Hardware
Cryogenic AI Processors

Superconducting logic enabling ultra-low-power AI acceleration.

TRL
4/9
Impact
4/5
Investment
4/5
Hardware
3D-Stacked Neuromorphic Architectures

Hardware supporting sparse, recurrent, and spiking behavior.

TRL
3/9
Impact
5/5
Investment
3/5

Analog in-memory compute chips perform calculations directly within the memory arrays where data is stored, eliminating the von Neumann bottleneck in which processors must constantly fetch data from separate memory. These chips use analog circuits that exploit the physical properties of memory cells (such as resistance or capacitance) to perform matrix operations in place, dramatically reducing energy consumption and latency compared with traditional digital processors.
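The core idea can be sketched in a few lines of numpy. In this minimal, hypothetical model (the conductance range and voltage encoding are assumptions for illustration), weights are stored as cell conductances, inputs arrive as row voltages, and Ohm's law plus Kirchhoff's current law yield the matrix-vector product as summed column currents:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: a crossbar computes y = W^T x in one analog step.
# Weights are stored as cell conductances (siemens); inputs arrive as row voltages.
weights = rng.uniform(0.0, 1.0, size=(4, 3))   # 4 rows x 3 columns
g_max = 100e-6                                 # assumed max programmable conductance, 100 uS
conductances = weights * g_max                 # map weights onto the conductance range

voltages = np.array([0.2, 0.1, 0.3, 0.05])     # input activations encoded as volts

# Ohm's law per cell (I = G * V) and Kirchhoff's current law per column
# (currents on a shared bitline sum) give the matrix-vector product "for free".
column_currents = conductances.T @ voltages    # amps flowing out of each column

# An ADC at each column digitizes the current; rescaling recovers the MAC result.
y_analog = column_currents / g_max
y_digital = weights.T @ voltages               # reference digital computation

print(np.allclose(y_analog, y_digital))        # → True
```

The `@` here stands in for physics: in hardware the multiply-accumulate happens in the electrical domain in O(1) time per array, rather than as sequential digital operations.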

This innovation addresses the massive energy consumption of AI systems, where data movement between processors and memory accounts for a significant portion of power usage. By computing where data resides, analog in-memory chips can achieve orders-of-magnitude improvements in energy efficiency for AI workloads, particularly transformer models and continual learning tasks. Companies like Mythic, Syntiant, and various research institutions are developing these technologies, with some chips already deployed in edge devices.

The technology is particularly valuable for edge AI applications where power constraints are critical, such as mobile devices, IoT sensors, and autonomous systems. As AI becomes more pervasive and energy efficiency becomes a competitive advantage, analog in-memory computing offers a pathway to deploying sophisticated AI capabilities in power-constrained environments. However, the technology faces challenges including precision limitations, manufacturing variability, and the need for specialized design tools and algorithms optimized for analog computation.
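The precision challenge mentioned above can be illustrated with a small simulation. In this sketch (the error levels and matrix sizes are assumed, not measured device data), programmed conductances deviate from their targets by a relative Gaussian error, modeling device-to-device variability and drift, and the output error grows accordingly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sketch of why analog precision is limited: programmed cell
# conductances deviate from their targets (process variability, drift, noise).
weights = rng.uniform(0.0, 1.0, size=(64, 16))  # ideal weight matrix
x = rng.uniform(0.0, 1.0, size=64)              # input activations
y_ideal = weights.T @ x                         # exact digital result

sigmas = (0.01, 0.05, 0.10)                     # assumed relative conductance error levels
errs = []
for sigma in sigmas:
    noisy = weights * (1 + rng.normal(0.0, sigma, size=weights.shape))
    y_noisy = noisy.T @ x
    errs.append(np.linalg.norm(y_noisy - y_ideal) / np.linalg.norm(y_ideal))

for sigma, e in zip(sigmas, errs):
    print(f"sigma={sigma:.2f}: relative output error {e:.4f}")
```

This is why analog designs pair the arrays with careful calibration, redundancy, and noise-aware training: neural network inference tolerates some of this error, but accumulation across layers bounds the usable precision.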

TRL
5/9 · Validated
Impact
4/5
Investment
4/5
Category
Hardware
