
Envisioning is an emerging technology research institute and advisory.


2011 — 2026

Home › Research › Wintermute › Sovereign AI Language Models

Sovereign AI Language Models

Korea has five independent large language model programs — Naver HyperCLOVA X, LG Exaone, SK Telecom A.X, Upstage Solar Pro, and Kakao Kanana — more sovereign AI efforts per capita than any other country.

Geography: Asia Pacific · East Asia · South Korea


Naver's HyperCLOVA X is the most advanced Korean LLM, trained on Korean-language data at a scale that allows it to outperform GPT-4 on Korean-language benchmarks. LG's Exaone is a bilingual model designed for enterprise and scientific applications. SK Telecom's A.X powers its customer service and telecom operations. Upstage's Solar Pro is an open-weight model competitive with Llama on multilingual tasks. Kakao's Kanana targets Korean conversational AI.

Korea's LLM diversity is remarkable for a country of 52 million people — five major independent efforts, each backed by a different chaebol or tech company. This reflects a national conviction that linguistic and cultural sovereignty requires domestic AI models, not dependence on OpenAI or Google. The Korean government's $349M AI investment in 2025 includes compute subsidies specifically for domestic model training.

The practical question is whether five separate Korean LLMs can each achieve sufficient scale to compete globally, or whether consolidation is inevitable. Naver has the strongest position with 60,000+ GPUs and the most comprehensive Korean training data, but smaller players like Upstage have found niches in open-source and enterprise deployment. Korea's LLM ecosystem is a microcosm of its broader innovation pattern: intense domestic competition driving rapid improvement.

TRL: 7/9 (Operational)
Impact: 2/5
Investment: 4/5
Category: Software

Book a research session

Bring this signal into a focused decision sprint with analyst-led framing and synthesis.