Envisioning is an emerging technology research institute and advisory.



High Bandwidth Memory (HBM)

SK Hynix and Samsung control over 90% of the global HBM market, with HBM4 chips shipping to every major AI accelerator vendor including Nvidia and AMD.

SK Hynix pioneered mass production of HBM3E in 2024 and began HBM4 volume shipments in late 2025, stacking 16 DRAM layers using through-silicon via (TSV) technology to deliver 1.65 TB/s bandwidth per stack. Samsung followed with its own HBM4 production line at Pyeongtaek. The two companies together supply virtually every AI GPU and accelerator on the market.
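The per-stack figure above follows from simple interface arithmetic: peak bandwidth is the interface width in bits times the per-pin data rate, divided by eight bits per byte. A minimal sketch, assuming HBM4's 2048-bit interface and an illustrative ~6.4 Gb/s per-pin rate (the rate is an assumption chosen to match the cited figure, not a vendor spec):

```python
def stack_bandwidth_tbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in TB/s (decimal terabytes).

    bandwidth = interface width (bits) x per-pin rate (Gb/s) / 8 bits-per-byte.
    """
    return bus_width_bits * pin_rate_gbps / 8 / 1000

# HBM4 widens the interface to 2048 bits; at an assumed ~6.4 Gb/s per pin
# this lands near the 1.65 TB/s per-stack figure cited above.
print(round(stack_bandwidth_tbps(2048, 6.4), 2))  # ~1.64
```

The same formula shows why HBM generations gain bandwidth from a wider bus rather than only from faster pins: doubling the interface width doubles throughput at the same signaling rate.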

HBM is the memory bottleneck-breaker for AI training and inference. As model sizes grew past the trillion-parameter mark, conventional DRAM could not feed data to GPUs fast enough. HBM solved this by moving memory physically closer to compute and widening the data bus. But fabricating these 12- to 16-layer stacks with acceptable yield requires manufacturing precision that only SK Hynix and Samsung have mastered at scale.
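The bottleneck argument can be made concrete with a back-of-envelope bound: during autoregressive decode, every weight must be streamed from memory once per generated token, so memory bandwidth, not compute, caps throughput. A sketch under illustrative assumptions (parameter count, bytes per parameter, and stack count are examples, not measurements of any real system):

```python
def decode_tokens_per_s(params: float, bytes_per_param: float,
                        bandwidth_tbps: float, num_stacks: int) -> float:
    """Upper bound on decode throughput if weight streaming is the only cost."""
    model_bytes = params * bytes_per_param          # total weight footprint
    aggregate_bw = bandwidth_tbps * 1e12 * num_stacks  # bytes/s across stacks
    return aggregate_bw / model_bytes

# A 1-trillion-parameter model at 1 byte/param, fed by 8 stacks at 1.65 TB/s:
print(round(decode_tokens_per_s(1e12, 1.0, 1.65, 8), 1))  # ~13.2
```

Even with multiple HBM stacks per accelerator, the bound is tens of tokens per second for a single pass over trillion-scale weights, which is why conventional DRAM at a fraction of this bandwidth cannot keep such models fed.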

This duopoly gives South Korea extraordinary leverage in the AI hardware supply chain. Every Nvidia H200 and B200 GPU ships with Korean HBM. The strategic significance became apparent when the US restricted China's access to advanced AI chips: the memory inside those chips was already Korean-controlled. HBM revenue for SK Hynix alone exceeded $12B in 2025.

TRL: 9/9 (Established)
Impact: 5/5
Investment: 5/5
Category: Hardware

Book a research session

Bring this signal into a focused decision sprint with analyst-led framing and synthesis.
Research Sessions