Gaussian processes extended to curved or structured non-Euclidean domains via geometry-aware kernels.
Geometric Gaussian Processes (GGPs) are a Bayesian nonparametric framework that extends classical Gaussian processes to data living on curved or structured spaces — such as manifolds, graphs, meshes, and point clouds — by encoding the domain's intrinsic geometry directly into covariance kernels and spectral priors. Whereas standard GPs assume the data live in flat Euclidean space, GGPs replace Euclidean distance with geodesic or graph-theoretic notions of proximity, ensuring that the statistical model respects the true shape of the domain rather than imposing an inappropriate flat-space approximation.
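The gap between flat-space and intrinsic notions of proximity is easy to see on the sphere. The minimal sketch below (plain NumPy; the function names are illustrative, not from any particular library) compares the Euclidean (chordal) distance between two points on the unit sphere with the great-circle geodesic distance that a geometry-aware kernel would use instead:

```python
import numpy as np

def euclidean_distance(x, y):
    """Straight-line (chordal) distance through the ambient space."""
    return np.linalg.norm(x - y)

def great_circle_distance(x, y):
    """Geodesic distance between unit vectors on the sphere:
    the arc length along the surface, arccos of the dot product."""
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

north = np.array([0.0, 0.0, 1.0])
south = np.array([0.0, 0.0, -1.0])

print(euclidean_distance(north, south))     # 2.0 (tunnels through the sphere)
print(great_circle_distance(north, south))  # pi  (travels along the surface)
```

A caveat worth noting: naively substituting geodesic distance into a squared-exponential kernel does not in general yield a positive-definite covariance, which is one reason GGPs rely on the spectral and SPDE constructions described next rather than simple distance substitution.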
The technical machinery behind GGPs draws on several mathematical tools. Heat kernels and diffusion kernels capture how information spreads across a curved surface over time, while Laplace–Beltrami eigenfunctions provide a spectral decomposition of the manifold analogous to Fourier analysis on flat space. A particularly influential construction links GGPs to stochastic partial differential equations (SPDEs): by expressing Matérn-class covariance functions as solutions to SPDEs and then discretizing those equations on meshes or graphs, practitioners obtain Gaussian Markov random field approximations that are computationally tractable at scale. Sparse inducing-point methods and reduced-rank eigenbasis approximations further enable GGPs to handle large datasets on complex geometries without sacrificing principled uncertainty quantification.
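To make the spectral construction concrete, here is a minimal sketch of a Matérn-type kernel on a graph, built from the eigendecomposition of the graph Laplacian in the spirit of the SPDE connection described above. The function name, the small path graph, and the parameter values are all illustrative assumptions, not a reference implementation:

```python
import numpy as np

def graph_matern_kernel(L, nu=3.0, kappa=1.0):
    """Matérn-type covariance on a graph via the spectral decomposition
    of its Laplacian L. Each Laplacian eigenvalue lam is mapped through
    the spectral density (2*nu/kappa**2 + lam)**(-nu), the graph analogue
    of the Matérn power spectrum, so smoothness (nu) and length scale
    (kappa) are controlled exactly as in the Euclidean case."""
    lam, phi = np.linalg.eigh(L)                  # eigenpairs of the Laplacian
    spec = (2.0 * nu / kappa**2 + lam) ** (-nu)   # filter the spectrum
    K = (phi * spec) @ phi.T                      # K = Phi diag(spec) Phi^T
    return K / K.diagonal().mean()                # normalize average variance to 1

# Laplacian of a 5-node path graph: L = D - A
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(A.sum(axis=1)) - A
K = graph_matern_kernel(L)
```

Because the kernel is a positive function of the Laplacian's eigenvalues, `K` is symmetric positive semidefinite by construction, and covariance decays with graph distance rather than Euclidean distance. For large graphs one would truncate to the leading eigenpairs or use the sparse GMRF discretization mentioned above instead of a dense eigendecomposition.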
GGPs matter because many real-world datasets are inherently non-Euclidean. Brain activity recorded on cortical surfaces, climate variables measured on the globe, signals propagating through sensor networks, and robot state spaces all possess geometric structure that flat-space models distort or ignore. By building geometry into the prior, GGPs improve interpolation accuracy, produce better-calibrated uncertainty estimates, and respect symmetries and invariances that domain knowledge demands. This makes them valuable in spatial statistics, medical neuroimaging, physics-informed modeling on curved domains, and geometric machine learning pipelines.
Interest in combining Gaussian processes with explicit manifold and graph geometry grew steadily through the early 2010s, accelerating as SPDE-based methods, spectral graph theory, and geometric deep learning matured into practical toolkits. Today GGPs sit at the intersection of probabilistic machine learning and geometric ML, offering a principled probabilistic complement to deterministic graph neural networks and manifold-learning methods.