A generative model framework borrowing renormalization principles from physics to handle high-dimensional data.
Renormalizing Generative Models (RGMs) are a class of probabilistic generative models that draw on renormalization group (RG) theory, a mathematical framework originally developed in statistical mechanics and quantum field theory, to learn and generate complex, high-dimensional data distributions. The core idea is to exploit the hierarchical, multi-scale structure of the data, with renormalization as the organizing principle: rather than modeling data at a single resolution, RGMs progressively coarse-grain representations, capturing structure at multiple scales before reconstructing fine-grained outputs during generation.
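One standard way to make this multi-scale picture precise (a generic hierarchical latent-variable factorization, used here for illustration; the symbols z_l are an assumption, not notation fixed by any particular RGM formulation) is to write the data as the finest scale, z_0 = x, introduce progressively coarser latents z_1, ..., z_L, and factor the model as

$$p(x) = \int p(z_L) \prod_{l=1}^{L} p(z_{l-1} \mid z_l)\, dz_{1:L}, \qquad z_0 = x,$$

where p(z_L) is a simple distribution over the coarsest representation and each conditional p(z_{l-1} | z_l) restores the fine-scale detail removed by one coarse-graining step.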
In practice, RGMs work by defining a sequence of transformations that systematically integrate out fine-scale degrees of freedom, analogous to how renormalization group flows operate in physics. This process yields a hierarchy of latent representations, each encoding information at a different level of abstraction. Training involves learning both the coarse-graining transformations and the conditional distributions needed to reverse them; the model then generates new samples by drawing from a tractable distribution over the coarsest scale and progressively refining the sample back to full resolution. This architecture shares conceptual ground with hierarchical VAEs and diffusion models, but is explicitly motivated by the physics of scale invariance and universality.
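To make the train-then-generate loop concrete, here is a minimal NumPy sketch. It is an illustration under simplifying assumptions, not an implementation of any published RGM: the coarse-graining map is fixed average pooling rather than a learned transformation, and the reverse conditionals are linear-Gaussian models fit by least squares where a real model would learn expressive neural densities. The function names (sample_data, coarse_grain, fit_refinement, generate) are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 1-D "signals" of length 16 with long-range structure.
def sample_data(n, length=16):
    t = np.linspace(0, 1, length)
    freqs = rng.uniform(1, 3, size=(n, 1))
    phases = rng.uniform(0, 2 * np.pi, size=(n, 1))
    return np.sin(2 * np.pi * freqs * t + phases) + 0.05 * rng.normal(size=(n, length))

def coarse_grain(z):
    """One RG-style step: average adjacent pairs, halving the resolution."""
    return 0.5 * (z[:, ::2] + z[:, 1::2])

def fit_refinement(z_coarse, z_fine):
    """Fit a linear-Gaussian conditional p(z_fine | z_coarse) by least squares."""
    X = np.hstack([z_coarse, np.ones((len(z_coarse), 1))])  # add bias column
    W, *_ = np.linalg.lstsq(X, z_fine, rcond=None)
    sigma = (z_fine - X @ W).std(axis=0) + 1e-6  # per-dimension noise scale
    return W, sigma

# "Training": build the hierarchy by repeated coarse-graining, then fit
# one conditional per level to reverse each coarse-graining step.
data = sample_data(2000)
levels = [data]
while levels[-1].shape[1] > 2:
    levels.append(coarse_grain(levels[-1]))

conditionals = [fit_refinement(z_coarse, z_fine)
                for z_fine, z_coarse in zip(levels[:-1], levels[1:])]

# Tractable base distribution: a Gaussian over the coarsest level.
top = levels[-1]
top_mean, top_cov = top.mean(axis=0), np.cov(top.T)

# Generation: sample the coarsest scale, then refine back to full resolution.
def generate(n):
    z = rng.multivariate_normal(top_mean, top_cov, size=n)
    for W, sigma in reversed(conditionals):
        X = np.hstack([z, np.ones((n, 1))])
        z = X @ W + sigma * rng.normal(size=(n, W.shape[1]))
    return z

samples = generate(5)
print(samples.shape)  # (5, 16): five full-resolution samples
```

The structure mirrors the description above: training builds the hierarchy top-down by repeated coarse-graining and fits one conditional per level, while generation runs the hierarchy in reverse, starting from a tractable Gaussian over the coarsest scale.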
The appeal of RGMs lies in their principled approach to scalability and expressiveness. By structuring the generative process around multi-scale decompositions, they can more efficiently represent data with long-range correlations, which many standard generative architectures struggle to capture. Applications include image synthesis, where natural images exhibit strong multi-scale structure, as well as scientific domains like molecular simulation and cosmological data generation, where physical symmetries and scale dependencies are intrinsic to the problem.
RGMs sit at the intersection of machine learning and theoretical physics, an interdisciplinary space that gained significant momentum in the early 2020s. While the underlying mathematics of renormalization group theory is decades old, its deliberate application as an architectural principle in modern deep generative modeling is relatively recent, driven by growing interest in physics-informed machine learning and the search for more interpretable, structured generative frameworks.