Equivariant Representation Learning for Symmetry-Aware Inference with Guarantees

📅 2025-05-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Encoding physical and geometric symmetries as priors is critical for regression, conditional probability estimation, and uncertainty quantification. Method: We propose a representation learning framework that jointly enforces equivariance and disentanglement. We establish the first non-asymptotic statistical learning guarantee for symmetry-aware inference; leverage spectral decomposition of the conditional expectation operator to construct equivariant and disentangled representations across independent subgroups; and integrate operator theory, group representation theory, and geometric deep learning to design spectral approximation and subgroup-disentangling architectures. Contributions/Results: Experiments on synthetic data and real-world robotic tasks demonstrate that our method matches or surpasses state-of-the-art equivariant baselines in regression accuracy, while simultaneously producing well-calibrated, parametric uncertainty estimates.

📝 Abstract
In many real-world applications of regression, conditional probability estimation, and uncertainty quantification, exploiting symmetries rooted in physics or geometry can dramatically improve generalization and sample efficiency. While geometric deep learning has made significant empirical advances by incorporating group-theoretic structure, less attention has been given to statistical learning guarantees. In this paper, we introduce an equivariant representation learning framework that simultaneously addresses regression, conditional probability estimation, and uncertainty quantification while providing first-of-its-kind non-asymptotic statistical learning guarantees. Grounded in operator and group representation theory, our framework approximates the spectral decomposition of the conditional expectation operator, building representations that are both equivariant and disentangled along independent symmetry subgroups. Empirical evaluations on synthetic datasets and real-world robotics applications confirm the potential of our approach, matching or outperforming existing equivariant baselines in regression while additionally providing well-calibrated parametric uncertainty estimates.
Problem

Research questions and friction points this paper is trying to address.

Exploiting symmetries to improve generalization and sample efficiency
Providing statistical learning guarantees for equivariant representation learning
Addressing regression, probability estimation, and uncertainty quantification simultaneously
Innovation

Methods, ideas, or system contributions that make the work stand out.

Equivariant representation learning for symmetry-aware inference
Spectral decomposition of conditional expectation operator
Disentangled representations along independent symmetry subgroups
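The equivariance constraint underlying these contributions can be illustrated with a minimal sketch (not the paper's architecture): a weight-shared circulant linear map commutes with the cyclic group C_4 acting on R^4 by coordinate shifts, so it satisfies f(g·x) = g·f(x). The matrix construction and group here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch: a circulant (weight-shared) linear map is equivariant
# under the cyclic group C_4 acting on R^4 by coordinate shifts.
rng = np.random.default_rng(0)
w = rng.normal(size=4)

# Circulant matrix: row k is the k-step cyclic shift of w, so W
# commutes with the shift operator -- the defining equivariance property.
W = np.stack([np.roll(w, k) for k in range(4)])

x = rng.normal(size=4)
g_x = np.roll(x, 1)          # group action g . x (shift by one)
lhs = W @ g_x                # f(g . x)
rhs = np.roll(W @ x, 1)      # g . f(x)
assert np.allclose(lhs, rhs)
print("equivariant:", np.allclose(lhs, rhs))
```

The same commutation requirement, generalized from cyclic shifts to arbitrary group representations, is what equivariant architectures enforce by construction.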
Daniel Ordonez-Apraez
CSML & DLS, Italian Institute of Technology, Università di Genova
Alek Frohlich
CSML, Italian Institute of Technology, Università di Genova
Vivien Brandt
CMAP, École Polytechnique
Karim Lounici
Statistics Professor, École Polytechnique
High-Dimensional Statistics, Uncertainty Quantification, Machine Learning for Science
Massimiliano Pontil
CSML, Italian Institute of Technology, University College London