🤖 AI Summary
Existing machine learning approaches for dynamical systems are limited by discrete-time modeling and local analysis, and fail to capture coexisting local and global bifurcations. To address this, we propose a continuous-time modeling paradigm based on Neural Ordinary Differential Equations (Neural ODEs). Our method learns parameter-dependent vector fields directly from noisy, sparse time-series data, enabling differentiable modeling and extrapolation across bifurcations. This work is the first to apply Neural ODEs to bifurcation structure prediction, extrapolating beyond the training parameter domain, and it accurately reconstructs complex bifurcation diagrams in predator–prey systems while generalizing robustly under data scarcity and noise corruption. The core contribution is a data-driven, continuous-time framework for cross-bifurcation dynamics that combines physical interpretability with global predictive capability.
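In symbols (notation assumed here for illustration, not taken verbatim from the paper), the learned object is a parameter-dependent vector field

$$
\dot{\mathbf{x}} = f_\theta(\mathbf{x}, p),
$$

where $\mathbf{x}$ is the system state, $p$ is the bifurcation parameter, and $f_\theta$ is a neural network whose integrated trajectories are fit to time series observed at several values of $p$. A bifurcation diagram is then read off by integrating the trained field across a sweep of $p$, including values outside the training range.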
📝 Abstract
Forecasting system behaviour near and across bifurcations is crucial for identifying potential shifts in dynamical systems. While machine learning has recently been used to learn critical transitions and bifurcation structures from data, most studies remain limited because they focus exclusively on discrete-time methods and local bifurcations. To address these limitations, we use Neural Ordinary Differential Equations, which provide a continuous, data-driven framework for learning system dynamics. We apply our approach to a predator–prey system that features both local and global bifurcations, presenting a challenging test case. Our results show that Neural Ordinary Differential Equations can recover underlying bifurcation structures directly from time-series data by learning parameter-dependent vector fields. Notably, we demonstrate that Neural Ordinary Differential Equations can forecast bifurcations even beyond the parameter regions represented in the training data. We also assess the method's performance under limited and noisy data conditions, finding that model accuracy depends more on the quality of information that can be inferred from the training data than on the amount of data available.
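As a concrete sketch of how such a parameter-dependent Neural ODE can be trained, the snippet below uses PyTorch with the `torchdiffeq` package. This is an assumed tooling choice; the architecture, the `ParamODE` and `train_step` names, and all hyperparameters are illustrative, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # assumed dependency: pip install torchdiffeq


class ParamODE(nn.Module):
    """Neural vector field f_theta(x, p) conditioned on a bifurcation parameter p."""

    def __init__(self, state_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim),
        )
        self.p = None  # current parameter value, set before integration

    def forward(self, t, x):
        # Concatenate the state with the (broadcast) parameter value.
        p = self.p.expand(x.shape[0], 1)
        return self.net(torch.cat([x, p], dim=-1))


# Hypothetical training loop over trajectories observed at several parameter values.
model = ParamODE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)


def train_step(x0, t, x_obs, p):
    """x0: (batch, 2) initial states; t: (T,) times; x_obs: (T, batch, 2); p: scalar."""
    model.p = torch.tensor([p])
    x_pred = odeint(model, x0, t)          # integrate the learned vector field
    loss = ((x_pred - x_obs) ** 2).mean()  # trajectory-matching loss
    optimizer.zero_grad()
    loss.backward()                        # gradients flow through the ODE solver
    optimizer.step()
    return loss.item()
```

Once trained, sweeping `p` (including values outside the training range) and integrating to long times yields the reconstructed bifurcation diagram described above.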