🤖 AI Summary
This work addresses the reliance on energy functions in dynamical analyses of self-attention, proposing the first energy-agnostic dynamical modeling framework. Methodologically, we conduct spectral analysis of the state's Jacobian matrix and find that normalization layers push its complex eigenvalues toward the unit circle, driving the system toward a critical dynamical regime. Leveraging this insight, we design a pseudo-energy metric for monitoring inference and a novel regularization technique for training. Experiments demonstrate substantial improvements in multi-task inference performance and generalization; the framework provides interpretable, real-time monitoring of inference dynamics and, critically, offers the first empirical validation of a strong correlation between criticality and generalization. Key contributions include: (i) eliminating reliance on energy-based assumptions; (ii) revealing how normalization constrains the magnitudes of the Jacobian's complex eigenvalues; and (iii) establishing a generalizable paradigm for dynamical analysis of transformer-based architectures.
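The central claim above can be illustrated numerically. The sketch below is a toy reconstruction, not the paper's implementation: it builds a single residual self-attention update (with and without a LayerNorm, both hypothetical stand-ins for the architectures studied), computes the Jacobian of the update by central finite differences, and inspects the magnitudes of its complex eigenvalues. Under the paper's claim, normalization should keep these magnitudes close to 1.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def sa_update(X, Wq, Wk, Wv, normalize=True):
    """One residual self-attention state update X -> X + SA(X),
    optionally followed by a normalization layer."""
    scores = (X @ Wq) @ (X @ Wk).T / np.sqrt(Wq.shape[1])
    out = X + softmax(scores) @ (X @ Wv)
    return layer_norm(out) if normalize else out

def numerical_jacobian(f, X, eps=1e-5):
    """Central-difference Jacobian of the flattened update map."""
    x0 = X.ravel()
    n = x0.size
    J = np.zeros((n, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (f((x0 + dx).reshape(X.shape)).ravel()
                   - f((x0 - dx).reshape(X.shape)).ravel()) / (2 * eps)
    return J

rng = np.random.default_rng(0)
T, d = 4, 8  # toy sequence length and width
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

for norm in (False, True):
    J = numerical_jacobian(lambda Z: sa_update(Z, Wq, Wk, Wv, normalize=norm), X)
    mags = np.abs(np.linalg.eigvals(J))  # |λ| of the (non-symmetric) Jacobian
    print(f"normalize={norm}: max|lambda|={mags.max():.3f}, mean|lambda|={mags.mean():.3f}")
```

For a real model one would replace the finite-difference loop with automatic differentiation (e.g. a framework's Jacobian routine), but the spectral quantity being inspected is the same.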
📝 Abstract
The theoretical understanding of self-attention (SA) has been steadily progressing. A prominent line of work studies a class of SA layers that admit an energy function decreased by state updates. While this line of work provides valuable insights into inherent biases in signal propagation, it often relies on idealized assumptions or additional constraints not necessarily present in standard SA. Thus, to broaden our understanding, this work aims to relax these energy constraints and provide an energy-agnostic characterization of inference dynamics through dynamical-systems analysis. In more detail, we first relax the symmetry and single-head constraints traditionally required in energy-based formulations. Next, to investigate more general SA architectures capable of oscillatory dynamics without necessarily admitting an energy function, we analyze the Jacobian matrix of the state. We reveal that normalization layers effectively normalize the Jacobian's complex eigenvalues, forcing the dynamics close to a critical state. This significantly enhances inference performance. Furthermore, we utilize the Jacobian perspective to develop regularization methods for training and a pseudo-energy for monitoring inference dynamics.
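The abstract's two Jacobian-derived tools (a training regularizer and a pseudo-energy monitor) can be sketched in a hedged, toy form. Nothing below comes from the paper: `criticality_penalty` is a hypothetical regularizer that penalizes deviation of the Jacobian's leading singular value from 1 (estimated by power iteration), and `pseudo_energy` is a hypothetical monitor that tracks the log magnitude of successive state changes, which stays roughly flat near criticality and decays under contractive dynamics. A linear map stands in for the true state-update Jacobian.

```python
import numpy as np

def leading_singular_value(J, iters=50, seed=0):
    """Estimate the largest singular value of J by power iteration on J^T J."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(J.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = J.T @ (J @ v)
        v /= np.linalg.norm(v)
    return np.linalg.norm(J @ v)

def criticality_penalty(J):
    """Hypothetical regularizer: penalize deviation of the leading
    singular value from 1, nudging the dynamics toward criticality."""
    return (leading_singular_value(J) - 1.0) ** 2

def pseudo_energy(x_prev, x_next):
    """Hypothetical pseudo-energy monitor: log magnitude of the state update."""
    return np.log(np.linalg.norm(x_next - x_prev) + 1e-12)

# Toy linear dynamics x -> W x standing in for the state-update Jacobian.
rng = np.random.default_rng(1)
W = rng.standard_normal((16, 16)) / np.sqrt(16)
x = rng.standard_normal(16)
energies = []
for _ in range(20):
    x_new = W @ x
    energies.append(pseudo_energy(x, x_new))
    x = x_new
print("criticality penalty:", criticality_penalty(W))
print("pseudo-energy trace:", [round(e, 2) for e in energies])
```

In training, such a penalty would be added to the loss (with automatic differentiation supplying the Jacobian-vector products), while the pseudo-energy trace would be logged per inference step as a cheap diagnostic.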