Envisioning is an emerging technology research institute and advisory.



Neural Texture Compression

AI-driven codecs that compress game textures by up to 90% while preserving visual quality

Neural texture compression (NTC) pipelines train lightweight decoder networks that regenerate material details on the GPU just before rasterization. Artists export high-resolution textures, NTC encoders quantize them into latent tensors, and runtime shaders decode only the texels needed per frame. Because the latent is much smaller than raw PBR maps, studios slash VRAM consumption, patch sizes, and streaming bandwidth without visibly degrading quality.
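The encode-to-latent, decode-on-demand flow above can be sketched in a few lines. This is an illustrative toy, not a real NTC codec: where production pipelines use a trained encoder and a small decoder network, the sketch below quantizes block averages into a low-resolution latent grid and stands in for the learned decoder with bilinear sampling, just to show how individual texels are reconstructed on demand from a much smaller representation.

```python
# Toy NTC-style pipeline (illustrative only): compress a grayscale
# texture into a low-resolution quantized latent grid, then decode
# single texels on demand. A real NTC decoder is a trained network;
# bilinear upsampling stands in for it here.

def encode(texture, block=4):
    """Average each block x block tile into one quantized latent value."""
    h, w = len(texture), len(texture[0])
    latent = []
    for y in range(0, h, block):
        row = []
        for x in range(0, w, block):
            tile = [texture[y + dy][x + dx]
                    for dy in range(block) for dx in range(block)]
            row.append(round(sum(tile) / len(tile)))  # quantize to int
        latent.append(row)
    return latent

def decode_texel(latent, u, v, block=4):
    """Reconstruct one texel by bilinearly sampling the latent grid."""
    gy, gx = v / block, u / block
    y0, x0 = int(gy), int(gx)
    y1 = min(y0 + 1, len(latent) - 1)
    x1 = min(x0 + 1, len(latent[0]) - 1)
    fy, fx = gy - y0, gx - x0
    top = latent[y0][x0] * (1 - fx) + latent[y0][x1] * fx
    bot = latent[y1][x0] * (1 - fx) + latent[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# A flat 16x16 texture compresses to a 4x4 latent: 16x fewer values.
tex = [[128] * 16 for _ in range(16)]
lat = encode(tex)
assert len(lat) == 4 and len(lat[0]) == 4
assert decode_texel(lat, 5, 5) == 128
```

The key property the sketch preserves is that decode is random-access: a shader can reconstruct only the texels a frame actually samples, rather than decompressing whole mip levels up front.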

Mobile titles and cloud-streamed games rely on NTC to ship cinematic assets without multi-gigabyte downloads, while mod marketplaces use it to distribute sprawling user worlds that still fit on consumer SSDs. AAA studios leverage NTC to keep multiple texture variants resident, such as mud, snow, and decals, so dynamic weather feels richer. Generative workflows even pair NTC with procedural materials, compressing neural outputs into latents that older GPUs can decode.
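The VRAM argument for keeping several variants resident is simple arithmetic. The sketch below uses assumed numbers (a 4K material with three RGBA8 maps and the headline ~90% reduction) purely to illustrate the budget math; actual savings depend on the material and encoder.

```python
# Back-of-envelope VRAM math with assumed sizes (illustrative only):
# a 4K material stored as three uncompressed RGBA8 maps vs. an NTC
# latent at ~10% of that size (the headline "up to 90%" reduction).
TEXEL_BYTES = 4                  # RGBA8
RES = 4096                       # 4K texture
MAPS = 3                         # e.g. albedo, normal, packed ORM
raw_mb = RES * RES * TEXEL_BYTES * MAPS / 2**20
ntc_mb = raw_mb * 0.10           # assumed 90% reduction
variants = 3                     # e.g. mud, snow, decals

print(f"raw: {raw_mb:.0f} MB per material")      # prints 192 MB
print(f"NTC: {ntc_mb:.1f} MB per material")      # prints 19.2 MB
print(f"{variants} NTC variants: {variants * ntc_mb:.1f} MB "
      f"vs one raw set: {raw_mb:.0f} MB")
```

Under these assumptions, three compressed variants together occupy well under a third of the memory of a single uncompressed set, which is what makes weather-dependent material swaps practical.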

TRL 5 implementations exist in Unreal, Frostbite, Godot, and custom engines, yet tooling is still maturing. Artists need previewers to inspect decode artifacts, and QA teams must validate that fallbacks exist for older hardware lacking tensor cores. Khronos and Microsoft are discussing standard container formats so GPU vendors can accelerate decoders in hardware. As those standards land and encoders integrate into DCC pipelines, NTC will become as routine as BCn or ASTC texture baking—just far more bandwidth-friendly.
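The fallback requirement above implies a small piece of runtime selection logic. The sketch below is hypothetical (engine capability APIs and codec names vary); it only illustrates the shape of the decision: use NTC where fast inference hardware exists, otherwise ship a conventional block-compressed format.

```python
# Hypothetical codec selection at load time (illustrative logic only;
# real engines query GPU capabilities through their own APIs). GPUs
# without tensor/matrix units fall back to standard block formats.

def pick_texture_codec(has_tensor_cores: bool, platform: str) -> str:
    if has_tensor_cores:
        return "ntc"     # decode latents in shader at sample time
    if platform == "mobile":
        return "astc"    # standard mobile block-compression format
    return "bc7"         # desktop BCn fallback

assert pick_texture_codec(True, "desktop") == "ntc"
assert pick_texture_codec(False, "mobile") == "astc"
assert pick_texture_codec(False, "desktop") == "bc7"
```

In practice this choice forces studios to ship both the latent and a baked BCn/ASTC copy of each asset until decode hardware is ubiquitous, which is part of why standard container formats matter.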

TRL: 5/9 (Validated)
Impact: 5/5
Investment: 4/5
Category: Software

Related Organizations

  • NVIDIA · United States · Company · 100% · Researcher
    Developing foundation models for robotics (Project GR00T) and vision-language models like VILA.
  • Meta Reality Labs · United States · Company · 95% · Researcher
    Develops the Quest Pro and research prototypes (Butterscotch, Starburst) focusing on foveated systems.
  • Intel · United States · Company · 90% · Researcher
    Develops silicon spin qubits using advanced 300mm wafer manufacturing processes.
  • Apple · United States · Company · 85% · Researcher
    Developing 'Apple Intelligence', a personal intelligence system integrated into iOS/macOS that uses on-device context to mediate tasks and information.
  • ETH Zurich · Switzerland · University · 85% · Researcher
    Conducts advanced research in bioelectronics and the interface between biological systems and electronic circuits.
  • Max Planck Institute for Intelligent Systems · Germany · Research Lab · 85% · Researcher
    A leading research institute investigating the principles of perception, action, and learning in autonomous systems.
  • Ubisoft La Forge · Canada · Research Lab · 85% · Researcher
    The R&D branch of Ubisoft bridging academic research and game production.
  • Unity Technologies · United States · Company · 85% · Developer
    Provides the High Definition Render Pipeline (HDRP) which supports real-time ray tracing for gaming and industrial visualization.
  • AMD · United States · Company · 80% · Researcher
    Develops the RDNA architecture with Ray Accelerators, powering ray tracing on PC and current-gen consoles (PS5, Xbox Series X).

Supporting Evidence

Evidence data is not available for this technology yet.

Connections

  • Neural Radiance Fields (NeRF) Streaming (Software) · TRL 4/9 · Impact 4/5 · Investment 4/5
    Streams photorealistic 3D environments by sending compact neural models instead of heavy meshes
  • Automatic LOD Generation (Software) · TRL 6/9 · Impact 4/5 · Investment 4/5
    ML-driven mesh simplification that generates optimized level-of-detail assets for game engines
  • Real-Time Path Tracing Engines (Software) · TRL 6/9 · Impact 5/5 · Investment 5/5
    GPU-accelerated rendering that traces light paths for photorealistic game visuals at playable frame rates
  • Edge AI Accelerator Consoles (Hardware) · TRL 8/9 · Impact 5/5 · Investment 5/5
    Gaming hardware with built-in neural processors for local AI-driven NPCs, graphics, and adaptive gameplay
  • AI-Native Game Engines (Software) · TRL 4/9 · Impact 5/5 · Investment 5/5
    Game engines that procedurally generate worlds, characters, and stories from player actions in real time
  • Foveated Rendering Accelerators (Hardware) · TRL 6/9 · Impact 4/5 · Investment 4/5
    Hardware that tracks eye movement to render high detail only where players look
