🤖 AI Summary
Modeling physical and geometric symmetries as priors is critical for regression, conditional probability estimation, and uncertainty quantification. Method: We propose a representation learning framework that jointly enforces equivariance and disentanglement. We establish the first non-asymptotic statistical learning guarantees for symmetry-aware inference; leverage the spectral decomposition of the conditional expectation operator to construct representations that are equivariant and disentangled across independent subgroups; and integrate operator theory, group representation theory, and geometric deep learning to design spectral-approximation and subgroup-disentangling architectures. Contributions/Results: Experiments on synthetic data and real-world robotic tasks show that our method matches or surpasses state-of-the-art equivariant baselines in regression accuracy while simultaneously producing well-calibrated parametric uncertainty estimates.
📝 Abstract
In many real-world applications of regression, conditional probability estimation, and uncertainty quantification, exploiting symmetries rooted in physics or geometry can dramatically improve generalization and sample efficiency. While geometric deep learning has made significant empirical advances by incorporating group-theoretic structure, less attention has been given to statistical learning guarantees. In this paper, we introduce an equivariant representation learning framework that simultaneously addresses regression, conditional probability estimation, and uncertainty quantification while providing first-of-its-kind non-asymptotic statistical learning guarantees. Grounded in operator and group representation theory, our framework approximates the spectral decomposition of the conditional expectation operator, building representations that are both equivariant and disentangled along independent symmetry subgroups. Empirical evaluations on synthetic datasets and real-world robotics applications confirm the potential of our approach, matching or outperforming existing equivariant baselines in regression while additionally providing well-calibrated parametric uncertainty estimates.
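To make the core idea concrete, the sketch below illustrates one standard way to approximate the spectral decomposition of a conditional expectation operator from samples: form the whitened cross-covariance between feature maps of the input and target, then take its SVD (a CCA-style estimator). This is a minimal, hypothetical stand-in, not the paper's actual architecture; the polynomial feature map, regularization, and toy data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(z, degree=3):
    # Hypothetical polynomial feature map standing in for the
    # learned feature maps the paper would use.
    return np.column_stack([z ** k for k in range(1, degree + 1)])

# Toy 1D regression data: y is a noisy function of x.
n = 2000
x = rng.uniform(-1.0, 1.0, n)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=n)

Phi, Psi = features(x), features(y)
Phi -= Phi.mean(axis=0)
Psi -= Psi.mean(axis=0)

# Empirical (regularized) covariances and cross-covariance.
reg = 1e-3
Cx = Phi.T @ Phi / n + reg * np.eye(Phi.shape[1])
Cy = Psi.T @ Psi / n + reg * np.eye(Psi.shape[1])
Cxy = Phi.T @ Psi / n

def inv_sqrt(M):
    # Inverse square root of a symmetric positive-definite matrix.
    w, V = np.linalg.eigh(M)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

# Whitened cross-covariance T ~ Cx^{-1/2} Cxy Cy^{-1/2}; its singular
# triplets approximate the spectral decomposition of the conditional
# expectation operator (singular values are canonical correlations).
T = inv_sqrt(Cx) @ Cxy @ inv_sqrt(Cy)
U, s, Vt = np.linalg.svd(T)

# Spectral representation of x: project onto the top singular functions.
rep = Phi @ inv_sqrt(Cx) @ U[:, :2]
print(s)  # descending singular values, at most ~1 up to regularization
```

In an equivariant version of this construction, the feature maps would additionally be constrained to transform consistently under the group action, so that the recovered singular subspaces decompose along the independent symmetry subgroups.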