🤖 AI Summary
This study addresses the theoretical understanding of generalization and expressiveness in deep learning, proposing "rank" as a unifying analytical framework. Methodologically, it applies a notion of rank used for quantifying entanglement in quantum physics to characterize how graph neural networks model interactions; it shows how gradient-based training implicitly prefers low-rank structures; and it grounds both results in a connection between neural networks and tensor factorizations. Key contributions include: (1) establishing, theoretically and empirically, an implicit regularization toward low rank induced by gradient-based training in several architectures; (2) evidence that this rank-lowering dynamic may help explain efficient generalization over natural data, such as images, audio, and text; and (3) practical implications in the form of explicit low-rank regularization schemes and data preprocessing algorithms, thereby bridging theoretical insight with performance improvement.
📝 Abstract
Despite the extreme popularity of deep learning in science and industry, its formal understanding is limited. This thesis puts forth notions of rank as key for developing a theory of deep learning, focusing on the fundamental aspects of generalization and expressiveness. In particular, we establish that gradient-based training can induce an implicit regularization towards low rank for several neural network architectures, and demonstrate empirically that this phenomenon may facilitate an explanation of generalization over natural data (e.g., audio, images, and text). Then, we characterize the ability of graph neural networks to model interactions via a notion of rank, which is commonly used for quantifying entanglement in quantum physics. A central tool underlying these results is a connection between neural networks and tensor factorizations. Practical implications of our theory for designing explicit regularization schemes and data preprocessing algorithms are presented.
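The implicit low-rank regularization described above can be illustrated with a toy experiment (not one of the thesis's experiments; all sizes, targets, and hyperparameters below are made up for the sketch): running gradient descent on a depth-2 matrix factorization X = A @ B, fit only to a subset of entries of a rank-1 target, tends to recover a low-rank solution when initialized near zero.

```python
import numpy as np

# Rank-1 target with top singular value 1 (arbitrary illustrative vectors).
n = 5
u = np.array([1., 2., 3., 4., 5.]); u /= np.linalg.norm(u)
v = np.array([2., 1., 2., 1., 2.]); v /= np.linalg.norm(v)
M = np.outer(u, v)

# Matrix completion setup: hide a few entries of M.
mask = np.ones((n, n), dtype=bool)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 3), (2, 0)]:
    mask[i, j] = False

rng = np.random.default_rng(0)
A = 1e-3 * rng.standard_normal((n, n))  # near-zero init: the regime in which
B = 1e-3 * rng.standard_normal((n, n))  # the low-rank bias is most pronounced

lr = 0.2
for _ in range(5000):
    X = A @ B
    G = np.where(mask, X - M, 0.0)      # grad of 0.5 * ||mask * (X - M)||^2 wrt X
    A, B = A - lr * G @ B.T, B - lr * A.T @ G

X = A @ B
s = np.linalg.svd(X, compute_uv=False)  # singular values in descending order
print("singular values:", np.round(s, 4))
print("ratio s2/s1:", s[1] / s[0])      # near zero: learned X is ~rank-1
```

Although the loss constrains only the observed entries, gradient descent from small initialization settles on an (approximately) minimal-rank fit rather than an arbitrary interpolant, which is the kind of rank-based dynamic the thesis analyzes formally.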