🤖 AI Summary
This work addresses the challenge of systematically modeling tensor symmetries in geometric deep learning. We propose a unified framework based on Symmetric Tensor Networks (STNs), enabling equivariant and invariant operations over Cartesian tensors of arbitrary rank and spherical tensors of arbitrary type. Methodologically, we integrate invariant and covariant theory from group representation theory and introduce an interpretable graphical network notation that adapts flexibly to different input/output tensor types and orders. Our key contribution lies in embedding symmetry constraints directly into a modular network design, which drastically simplifies the development of equivariant layers. We validate the framework on two tasks: message passing in geometric graph neural networks and constitutive-law modeling for materials. Results demonstrate that our models achieve high predictive accuracy while strictly preserving physical symmetries. The approach establishes a scalable, interpretable, and general-purpose paradigm for equivariant machine learning.
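To make the equivariance constraint concrete, the sketch below (our own illustration, not code from the paper) numerically checks the defining property for a simple rank-1 to rank-2 Cartesian map, f(x) = x xᵀ, which must satisfy f(Rx) = R f(x) Rᵀ for every rotation R:

```python
import numpy as np

def rotation_matrix(axis, angle):
    # Rodrigues' formula: rotation by `angle` about the unit vector `axis`.
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def rank2_feature(x):
    # A rank-1 -> rank-2 Cartesian map: the outer product f(x) = x x^T.
    return np.outer(x, x)

rng = np.random.default_rng(0)
x = rng.normal(size=3)
R = rotation_matrix(rng.normal(size=3), 1.3)

# Equivariance for a rank-2 output: f(R x) must equal R f(x) R^T.
assert np.allclose(rank2_feature(R @ x), R @ rank2_feature(x) @ R.T)
```

The general pattern is the same for any tensor rank: inputs and outputs each carry a representation of the rotation group, and the operation must commute with both.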
📝 Abstract
Designing neural networks that incorporate symmetry is crucial for geometric deep learning. Central to this effort is the development of invariant and equivariant operations. This work presents a systematic method for constructing valid invariant and equivariant operations. It can handle inputs and outputs in the form of Cartesian tensors of different ranks, as well as spherical tensors of different types. In addition, our method features a graphical representation based on symmetric tensor networks, which simplifies both the proofs and the constructions of invariant and equivariant functions. We also apply this approach to design equivariant interaction messages for geometric graph neural networks, and an equivariant machine learning model that learns the constitutive laws of materials.
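As one concrete instance of an equivariant interaction message, the sketch below uses a standard scalar-gated relative-vector construction; this is an assumption for illustration only, not the paper's STN-based construction, and the weights `w` are a hypothetical learnable parameter. Any gate built purely from rotation-invariant quantities (node features, squared distance) may scale the relative position, and the resulting rank-1 message rotates with the frame:

```python
import numpy as np

def equivariant_message(h_i, h_j, x_i, x_j, w):
    # Hypothetical learnable weights `w`; the gate depends only on
    # rotation-invariant quantities (node features, squared distance).
    r = x_i - x_j
    gate = np.tanh(w @ np.concatenate([h_i, h_j, [r @ r]]))
    # Scaling the relative vector by an invariant scalar yields a
    # rotation-equivariant (rank-1 Cartesian) message.
    return gate * r

rng = np.random.default_rng(1)
d = 4
h_i, h_j = rng.normal(size=d), rng.normal(size=d)
x_i, x_j = rng.normal(size=3), rng.normal(size=3)
w = rng.normal(size=2 * d + 1)

theta = 0.7  # rotation about the z-axis
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# Rotating the positions rotates the message: m(R x_i, R x_j) = R m(x_i, x_j).
m = equivariant_message(h_i, h_j, x_i, x_j, w)
assert np.allclose(equivariant_message(h_i, h_j, R @ x_i, R @ x_j, w), R @ m)
```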