🤖 AI Summary
This study investigates the transition from fixed-point (quiescent) to chaotic dynamics in recurrent neural networks (RNNs) with heavy-tailed (Lévy-stable) synaptic connectivity, and its implications for dynamical robustness and the richness of high-dimensional neural activity. Because conventional Gaussian-connectivity mean-field theory fails in finite-size networks, we integrate finite-size perturbation analysis, Lyapunov-spectrum computation, Lévy-stable modeling, and large-scale numerical simulations. We theoretically predict and empirically verify, for the first time, the critical gain for chaos onset in heavy-tailed RNNs; identify a novel "slow chaos transition" phenomenon; and uncover a biologically plausible trade-off between robustness and high-dimensional activity. We analytically pinpoint the transition point, demonstrating that heavy-tailed networks exhibit a broader edge-of-chaos parameter regime, yet yield attractors with reduced Lyapunov dimension, indicating effective dimensional compression.
📝 Abstract
Growing evidence suggests that synaptic weights in the brain follow heavy-tailed distributions, yet most theoretical analyses of recurrent neural networks (RNNs) assume Gaussian connectivity. We systematically study the activity of RNNs with random weights drawn from biologically plausible Lévy α-stable distributions. While mean-field theory for the infinite system predicts that the quiescent state is always unstable -- implying ubiquitous chaos -- our finite-size analysis reveals a sharp transition between quiescent and chaotic dynamics. We theoretically predict the gain at which the system transitions from quiescent to chaotic dynamics, and validate it through simulations. Compared to Gaussian networks, heavy-tailed RNNs exhibit a broader parameter regime near the edge of chaos, namely a slow transition to chaos. However, this robustness comes with a tradeoff: heavier tails reduce the Lyapunov dimension of the attractor, indicating lower effective dimensionality. Our results reveal a biologically aligned tradeoff between the robustness of dynamics near the edge of chaos and the richness of high-dimensional neural activity. By analytically characterizing the transition point in finite-size networks -- where mean-field theory breaks down -- we provide a tractable framework for understanding dynamics in realistically sized, heavy-tailed neural circuits.
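The kind of system described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact model: the rate dynamics dx/dt = -x + g·J·tanh(x), the network size, the 1/N^(1/α) weight scaling, and the specific gain values are all assumptions chosen for demonstration; the paper's predicted critical gain depends on α and N.

```python
import numpy as np
from scipy.stats import levy_stable

def simulate_rnn(N=200, alpha=1.5, gain=1.0, steps=500, dt=0.1, seed=0):
    """Euler-integrate the rate RNN dx/dt = -x + gain * J @ tanh(x).

    N, the 1/N**(1/alpha) weight scaling, and the tanh nonlinearity are
    illustrative assumptions, not necessarily the paper's exact setup.
    """
    rng = np.random.default_rng(seed)
    # Heavy-tailed connectivity: i.i.d. symmetric Levy alpha-stable entries.
    J = levy_stable.rvs(alpha, beta=0.0, size=(N, N), random_state=rng)
    np.fill_diagonal(J, 0.0)           # no self-coupling
    J /= N ** (1.0 / alpha)            # stable-law analogue of 1/sqrt(N) scaling
    x = 0.1 * rng.standard_normal(N)   # small perturbation of the quiescent state
    for _ in range(steps):
        x = x + dt * (-x + gain * J @ np.tanh(x))
    return x

# Illustrative gains on either side of the finite-size transition;
# the actual critical gain is the quantity the paper predicts analytically.
x_quiet = simulate_rnn(gain=0.1)   # low gain: activity decays toward quiescence
x_chaos = simulate_rnn(gain=3.0)   # high gain: activity stays finite and irregular
```

Sweeping `gain` and `alpha` in such a simulation, and estimating Lyapunov exponents along the trajectory, is one way to reproduce the kind of transition diagram and dimensionality measurements the abstract describes.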