Toward a Unified Geometry Understanding: Riemannian Diffusion Framework for Graph Generation and Prediction

📅 2025-10-06
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Existing graph diffusion models embed features of varying curvature into a shared Euclidean latent space, ignoring the intrinsic non-Euclidean manifold structure of graph data—leading to untapped geometric potential and severe representation entanglement. To address this, we propose GeoMancer, the first unified Riemannian diffusion framework. It (i) replaces the conventional exponential map with an isometry-invariant Riemannian gyrokernel to mitigate numerical instability on manifolds; (ii) introduces a manifold-constrained diffusion process coupled with a self-guided unconditional generation mechanism, ensuring generated samples strictly reside on the target Riemannian manifold; and (iii) jointly models generation and prediction tasks via manifold-decoupled representations that explicitly encode multi-level geometric properties. Extensive experiments on multiple graph generation and prediction benchmarks demonstrate significant improvements over state-of-the-art methods, validating both the effectiveness and necessity of Riemannian geometric modeling for complex graph structures.

πŸ“ Abstract
Graph diffusion models have made significant progress in learning structured graph data and have demonstrated strong potential for predictive tasks. Existing approaches typically embed node, edge, and graph-level features into a unified latent space, modeling prediction tasks including classification and regression as a form of conditional generation. However, due to the non-Euclidean nature of graph data, features of different curvatures are entangled in the same latent space without releasing their geometric potential. To address this issue, we aim to construct an ideal Riemannian diffusion model to capture distinct manifold signatures of complex graph data and learn their distribution. This goal faces two challenges: numerical instability caused by exponential mapping during the encoding process and manifold deviation during diffusion generation. To address these challenges, we propose GeoMancer: a novel Riemannian graph diffusion framework for both generation and prediction tasks. To mitigate numerical instability, we replace exponential mapping with an isometry-invariant Riemannian gyrokernel approach and decouple multi-level features onto their respective task-specific manifolds to learn optimal representations. To address manifold deviation, we introduce a manifold-constrained diffusion method and a self-guided strategy for unconditional generation, ensuring that the generated data remains aligned with the manifold signature. Extensive experiments validate the effectiveness of our approach, demonstrating superior performance across a variety of tasks.
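
The numerical instability the abstract attributes to exponential mapping is easy to reproduce in the Lorentz (hyperboloid) model of hyperbolic space, where the map uses cosh/sinh terms that overflow float64 once the tangent norm exceeds roughly 710. A minimal sketch (the function name and toy 3-D vectors are illustrative, not from the paper):

```python
import numpy as np

def lorentz_expmap0(v):
    """Exponential map at the origin x0 = (1, 0, ..., 0) of the Lorentz model.

    v is a tangent vector at x0 (so v[0] == 0). The cosh/sinh terms overflow
    float64 for large ||v|| -- the instability that motivates replacing the
    exponential map with a kernel-based encoding.
    """
    norm = np.linalg.norm(v[1:])
    x0 = np.zeros_like(v)
    x0[0] = 1.0
    if norm == 0:
        return x0
    unit = v / norm
    with np.errstate(over="ignore", invalid="ignore"):
        return np.cosh(norm) * x0 + np.sinh(norm) * unit

small = np.array([0.0, 0.5, 0.5])
large = np.array([0.0, 600.0, 600.0])  # ||large|| ~ 849 > overflow threshold

print(np.isfinite(lorentz_expmap0(small)).all())  # True: well-behaved
print(np.isfinite(lorentz_expmap0(large)).all())  # False: overflow to inf/nan
```

The overflow is not an exotic corner case: any encoder that pushes embeddings far from the base point in tangent space hits it, which is why a distance-based kernel that never leaves the manifold is attractive.
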
Problem

Research questions and friction points this paper is trying to address.

Modeling graph data with distinct geometric curvatures in latent space
Addressing numerical instability from exponential mapping in encoding
Preventing manifold deviation during diffusion-based graph generation
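
The manifold-deviation problem arises because a Euclidean denoising update can push an intermediate sample off the manifold; a common remedy is to retract each sample back after every step. A hedged sketch on the unit hypersphere as a simple stand-in manifold (function names, step size, and toy score are illustrative assumptions, not the paper's method):

```python
import numpy as np

def project_to_sphere(x, eps=1e-12):
    """Retract a point onto the unit hypersphere (one simple manifold
    constraint; GeoMancer targets general Riemannian manifolds)."""
    return x / max(np.linalg.norm(x), eps)

def constrained_reverse_step(x_t, score, step=0.01, rng=None):
    """One toy Langevin-style reverse-diffusion update followed by projection,
    so the intermediate sample never drifts into ambient Euclidean space."""
    if rng is None:
        rng = np.random.default_rng(0)
    noise = rng.standard_normal(x_t.shape)
    x_next = x_t + step * score + np.sqrt(2 * step) * noise  # Euclidean update
    return project_to_sphere(x_next)                          # manifold constraint

x = project_to_sphere(np.ones(4))
for _ in range(5):
    x = constrained_reverse_step(x, score=-x)  # toy score toward the origin
print(round(float(np.linalg.norm(x)), 6))  # 1.0 -- sample stays on the sphere
```

Without the projection, the norm of `x` would drift with every noisy update; the constraint keeps every intermediate sample a valid point of the manifold, which is the property the paper enforces during generation.
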
Innovation

Methods, ideas, or system contributions that make the work stand out.

Replaces exponential mapping with isometry-invariant gyrokernel approach
Decouples multi-level features onto task-specific manifolds
Introduces manifold-constrained diffusion with self-guided strategy
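
On the first bullet: any kernel that depends only on geodesic distance is automatically invariant under manifold isometries, so representations do not hinge on an arbitrary base point or frame. A toy check on the hyperboloid with a Gaussian distance kernel (an illustrative stand-in; the paper's gyrokernel construction is not reproduced here):

```python
import numpy as np

def lorentz_inner(x, y):
    # Minkowski inner product used by the hyperboloid model
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def geodesic_dist(x, y):
    # clamp for numerical safety: -inner can dip slightly below 1
    return np.arccosh(max(-lorentz_inner(x, y), 1.0))

def dist_kernel(x, y, sigma=1.0):
    """Gaussian kernel of geodesic distance: a function of d(x, y) only,
    hence unchanged by any isometry of the manifold."""
    d = geodesic_dist(x, y)
    return np.exp(-d ** 2 / (2 * sigma ** 2))

def lift(u):
    # embed a nonzero 2-D tangent vector onto the hyperboloid
    n = np.linalg.norm(u)
    return np.array([np.cosh(n), *(np.sinh(n) * u / n)])

x, y = lift(np.array([0.3, 0.1])), lift(np.array([-0.2, 0.4]))

theta = 0.7  # rotating the spatial coordinates is a hyperbolic isometry
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rot = lambda p: np.array([p[0], *(R @ p[1:])])

print(abs(dist_kernel(x, y) - dist_kernel(rot(x), rot(y))) < 1e-12)  # True
```

The check passes because the rotation preserves the Minkowski inner product, hence the geodesic distance, hence the kernel value; and unlike the exponential map, the kernel involves no unbounded cosh/sinh of embedding norms.
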