Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems

📅 2024-07-19
🏛️ arXiv.org
📈 Citations: 0
✨ Influential: 0
๐Ÿ“„ PDF
🤖 AI Summary
Problem: Existing recurrent switching linear dynamical systems (rSLDS) suffer from spurious oscillations at regime boundaries and lack principled quantification of posterior uncertainty when modeling the latent dynamics of high-dimensional neural time series.

Method: The Gaussian process switching linear dynamical system (gpSLDS) replaces abrupt regime transitions with a novel smooth interpolation kernel, yielding continuous dynamics that are locally linear yet globally nonlinear. The framework combines Gaussian processes, stochastic differential equations, and variational inference to support rigorous posterior uncertainty quantification over latent dynamics.

Contribution/Results: Evaluated on synthetic data and two real neural recording datasets, gpSLDS achieves significantly better dynamical fit accuracy and interpretability than rSLDS while providing well-calibrated uncertainty estimates, thereby advancing scalable, interpretable, and probabilistically grounded modeling of population neural activity.

๐Ÿ“ Abstract
Understanding how the collective activity of neural populations relates to computation and ultimately behavior is a key goal in neuroscience. To this end, statistical methods which describe high-dimensional neural time series in terms of low-dimensional latent dynamics have played a fundamental role in characterizing neural systems. Yet, what constitutes a successful method involves two opposing criteria: (1) methods should be expressive enough to capture complex nonlinear dynamics, and (2) they should maintain a notion of interpretability often only warranted by simpler linear models. In this paper, we develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS). Our method builds on previous work modeling the latent state evolution via a stochastic differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs). We propose a novel kernel function which enforces smoothly interpolated locally linear dynamics, and therefore expresses flexible -- yet interpretable -- dynamics akin to those of recurrent switching linear dynamical systems (rSLDS). Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics. To fit our models, we leverage a modified learning objective which improves the estimation accuracy of kernel hyperparameters compared to previous GP-SDE fitting approaches. We apply our method to synthetic data and data recorded in two neuroscience experiments and demonstrate favorable performance in comparison to the rSLDS.
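To make the core idea concrete: the abstract describes dynamics that smoothly interpolate between locally linear regimes, in contrast to the rSLDS, whose hard regime switches produce discontinuous vector fields and boundary artifacts. The sketch below illustrates that contrast with a simple parametric softmax gating; this is an illustration of the smooth-interpolation idea only, not the paper's actual construction, which uses a Gaussian process kernel rather than this gating, and all names and parameters here are hypothetical.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def hard_switch_dynamics(x, As, bs, W, c):
    # rSLDS-style: commit to the single most likely regime,
    # so f(x) jumps discontinuously at regime boundaries
    k = int(np.argmax(W @ x + c))
    return As[k] @ x + bs[k]

def smooth_dynamics(x, As, bs, W, c, tau=0.1):
    # Smoothly interpolated locally linear dynamics (illustrative):
    # blend the per-regime linear systems with continuous weights,
    # so f(x) stays continuous across regime boundaries
    pi = softmax((W @ x + c) / tau)
    return sum(p * (A @ x + b) for p, A, b in zip(pi, As, bs))
```

Deep inside a regime the blended field agrees with the dominant linear system, while near a boundary it transitions continuously instead of jumping, which is the behavior that removes the artifactual boundary oscillations mentioned above.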
Problem

Research questions and friction points this paper is trying to address.

Neural Activity
Statistical Method
Cognitive Influence
Innovation

Methods, ideas, or system contributions that make the work stand out.

gpSLDS
Smooth State Transitions
Optimized Learning Rules
Amber Hu
PhD Student, Statistics, Stanford University
statistics · machine learning · computational neuroscience
D. Zoltowski
Stanford University
Aditya Nair
Caltech & Howard Hughes Medical Institute
David Anderson
Caltech & Howard Hughes Medical Institute
Lea Duncker
Columbia University
Scott W. Linderman
Stanford University