Statistical Learning Guarantees for Group-Invariant Barron Functions

📅 2025-09-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work investigates the generalization error of group-invariant neural networks within the Barron function framework, focusing on how symmetry structures enhance statistical efficiency in learning target functions with intrinsic group symmetries. We introduce a group-dependent factor δ_{G,Γ,σ} to quantify the impact of symmetry on approximation capacity. Our theoretical analysis shows that group-invariance reduces generalization error significantly—without increasing Rademacher complexity—and achieves an |G|⁻¹ improvement in approximation error when δ_{G,Γ,σ} is small. This constitutes the first quantitative characterization, under Barron-norm constraints, of the generalization benefit conferred by group invariance. The result provides a rigorous statistical learning-theoretic foundation for symmetry-aware neural network design.

Technology Category

Application Category

📝 Abstract
We investigate the generalization error of group-invariant neural networks within the Barron framework. Our analysis shows that incorporating group-invariant structures introduces a group-dependent factor $δ_{G,Γ,σ} le 1$ into the approximation rate. When this factor is small, group invariance yields substantial improvements in approximation accuracy. On the estimation side, we establish that the Rademacher complexity of the group-invariant class is no larger than that of the non-invariant counterpart, implying that the estimation error remains unaffected by the incorporation of symmetry. Consequently, the generalization error can improve significantly when learning functions with inherent group symmetries. We further provide illustrative examples demonstrating both favorable cases, where $δ_{G,Γ,σ}approx |G|^{-1}$, and unfavorable ones, where $δ_{G,Γ,σ}approx 1$. Overall, our results offer a rigorous theoretical foundation showing that encoding group-invariant structures in neural networks leads to clear statistical advantages for symmetric target functions.
Problem

Research questions and friction points this paper is trying to address.

Analyzing generalization error of group-invariant neural networks
Studying how group symmetry improves approximation accuracy
Establishing theoretical advantages of invariant structures for symmetric functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporating group-invariant structures in neural networks
Rademacher complexity unaffected by symmetry inclusion
Encoding group-invariant structures yields statistical advantages
🔎 Similar Papers
No similar papers found.
Yahong Yang
Yahong Yang
Georgia Institute of Technology
Deep Learning TheoryMathematical Modeling and Simulation in Materials Science
W
Wei Zhu
School of Mathematics, Georgia Institute of Technology, Atlanta, GA, USA