Why is topology hard to learn?

📅 2025-09-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional neural networks struggle to learn topological invariants because of a fundamental expressivity gap between their architecture and the underlying real-space topological structure. Method: We propose a hybrid tensor network–neural network architecture that exactly embeds real-space topological invariants, such as the Chern number, into the model design, complemented by a rigorous analysis of trainability and generalization. Contribution/Results: This work achieves the first exact, differentiable parametrization of a topological invariant within a learnable model and systematically uncovers a decoupling between model expressivity and trainability, a previously underexplored phenomenon. Empirical evaluation demonstrates that the hybrid architecture significantly outperforms mainstream deep learning models (e.g., CNNs and ResNets) on topological phase classification, achieving over 15% higher accuracy while generalizing well across lattice geometries and system sizes. Moreover, the explicit topological encoding keeps the model physically interpretable, establishing a new paradigm for physics-informed, interpretable machine learning.
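For concreteness, the Chern number mentioned above can be computed numerically for a simple two-band model. The sketch below uses the standard Fukui–Hatsugai lattice formula on the Qi–Wu–Zhang (QWZ) model; both the choice of model and the momentum-space formulation are illustrative assumptions here, not the real-space construction used in the paper.

```python
import numpy as np

def qwz_hamiltonian(kx, ky, m):
    # Qi-Wu-Zhang two-band Chern insulator, H(k) = d(k) . sigma
    # (a textbook model, assumed for illustration; the paper's lattice
    # models may differ)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    return (np.sin(kx) * sx + np.sin(ky) * sy
            + (m + np.cos(kx) + np.cos(ky)) * sz)

def chern_number(m, n=24):
    # Fukui-Hatsugai lattice Chern number of the lower band on an
    # n x n discretization of the Brillouin zone
    ks = np.linspace(0, 2 * np.pi, n, endpoint=False)
    u = np.empty((n, n, 2), dtype=complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            _, vecs = np.linalg.eigh(qwz_hamiltonian(kx, ky, m))
            u[i, j] = vecs[:, 0]  # eigh sorts ascending: lower band
    c = 0.0
    for i in range(n):
        for j in range(n):
            u00 = u[i, j]
            u10 = u[(i + 1) % n, j]
            u11 = u[(i + 1) % n, (j + 1) % n]
            u01 = u[i, (j + 1) % n]
            # gauge-invariant product of link variables around a plaquette
            w = (np.vdot(u00, u10) * np.vdot(u10, u11)
                 * np.vdot(u11, u01) * np.vdot(u01, u00))
            c += np.angle(w)
    return round(c / (2 * np.pi))
```

For the QWZ model this returns ±1 in the topological regime (0 < |m| < 2) and 0 in the trivial one (|m| > 2). A learnable model that classifies topological phases must effectively approximate exactly this kind of globally defined, gauge-invariant quantity, which is the expressivity challenge the summary refers to.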

📝 Abstract
Much attention has been devoted to the use of machine learning to approximate physical concepts. Yet, due to challenges in the interpretability of machine learning techniques, the question of what physics machine learning models are able to learn remains open. Here we bridge the concept of a physical quantity and its machine learning approximation in the context of the original application of neural networks in physics: topological phase classification. We construct a hybrid tensor-neural network object that exactly expresses the real-space topological invariant and rigorously assess its trainability and generalization. Specifically, we benchmark the accuracy and trainability of the tensor-neural network against multiple types of neural networks, thus exemplifying the differences in trainability and representational power. Our work highlights the challenges in learning topological invariants and constitutes a stepping stone towards more accurate and better-generalizable machine learning representations in condensed matter physics.
Problem

Research questions and friction points this paper is trying to address.

Understanding why topology is difficult for machine learning to learn
Bridging physical concepts with machine learning approximations in topological phases
Assessing trainability and generalization of neural networks for topological invariants
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid tensor-neural network that exactly expresses the real-space topological invariant
Rigorous assessment of trainability and generalization capabilities
Benchmarking against multiple neural network architectures
D. O. Oriekhov
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Stan Bergkamp
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Guliuxin Jin
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Juan Daniel Torres Luna
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Badr Zouggari
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Sibren van der Meer
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Naoual El Yazidi
QuTech and Kavli Institute of Nanoscience, Delft University of Technology, 2628 CJ Delft, the Netherlands
Eliska Greplova
Delft University of Technology
quantum devices · condensed matter physics · machine learning