🤖 AI Summary
A theoretical gap persists between the fine-scale structure of microcircuit neural dynamics and the mechanisms underlying high-level cognitive functions.
Method: We propose a brain-inspired computational framework based on state space models (SSMs), leveraging their neuron-like dynamical properties to model flexible learning behaviors—including temporal perception and event counting. Crucially, we employ a diagonalized S5 architecture coupled with reinforcement learning for training and analysis on temporally resolved tasks.
Contribution/Results: We discover that rotational dynamics of hidden states in the complex plane unify the emergence of time cells, ramping activity, and traveling waves—phenomena widely observed in neurophysiological experiments. Moreover, this mechanism naturally generalizes to abstract cognitive tasks such as event counting. The model recapitulates diverse neural dynamics reported in biological experiments while demonstrating robust generalization beyond basic timing tasks. These results substantiate SSMs as a unifying computational paradigm, theoretically grounded and empirically validated, that bridges neural mechanisms and cognitive function.
📝 Abstract
A grand challenge in modern neuroscience is to bridge the gap between the detailed mapping of microscale neural circuits and a mechanistic understanding of cognitive functions. While extensive knowledge exists about neuronal connectivity and biophysics, a significant gap remains in how these elements combine to produce flexible, learned behaviors. Here, we propose that a framework based on State-Space Models (SSMs), an emerging class of deep learning architectures, can bridge this gap. We argue that the differential equations governing elements in an SSM are conceptually consistent with the biophysical dynamics of neurons, while the combined dynamics of the model lead to emergent behaviors observed in experimental neuroscience. We test this framework by training an S5 model (a specific SSM variant employing a diagonal state-transition matrix) on temporal discrimination tasks with reinforcement learning (RL). We demonstrate that the model spontaneously develops neural representations that strikingly mimic biological 'time cells'. We reveal that these cells emerge from a simple generative principle: learned rotational dynamics of hidden state vectors in the complex plane. This single mechanism unifies the emergence of time cells, ramping activity, and the oscillations and traveling waves observed in numerous experiments. Furthermore, we show that this rotational mechanism generalizes beyond interval discrimination to abstract event-counting tasks, a capacity considered foundational for complex cognition. Our findings position SSMs as a compelling framework that connects single-neuron dynamics to cognitive phenomena, offering a unifying and computationally tractable theoretical grounding for temporal learning in the brain.
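The core mechanism, hidden states rotating in the complex plane under a diagonal state-transition matrix, can be illustrated in a few lines. The sketch below is ours, not the authors' implementation: the eigenvalue magnitudes, rotation rates, and readout are assumed for illustration. Each unit's state follows h_{t+1} = λ·h_t with λ = r·e^{iθ} just inside the unit circle, so a linear readout of the state peaks at a unit-specific delay after an impulse, which is the qualitative signature of time cells.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a diagonal linear SSM with
# complex eigenvalues. Each hidden unit rotates in the complex plane at
# its own angular frequency while slowly decaying; reading out one
# component of the state yields units that fire at different delays
# after a t = 0 impulse, like biological "time cells".

n_units, T = 8, 100
decay = 0.99                                 # shared mild decay (assumed value)
thetas = np.linspace(0.02, 0.3, n_units)     # per-unit rotation rate, rad/step (assumed)
lam = decay * np.exp(1j * thetas)            # diagonal of the state-transition matrix

h = np.ones(n_units, dtype=complex)          # impulse input delivered at t = 0
traces = np.empty((T, n_units))
for t in range(T):
    traces[t] = h.imag                       # simple linear readout of the state
    h = lam * h                              # rotate-and-decay update: h <- lambda * h

# Faster-rotating units peak earlier, so the population tiles the interval.
peak_times = traces.argmax(axis=0)
```

Under these assumed parameters the peak times form an ordered sequence spanning the interval; uniformly slowing the rotation rates stretches that sequence, consistent with the temporal-rescaling behavior the abstract attributes to this single mechanism.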