🤖 AI Summary
This work addresses zero-shot inference of ordinary differential equations (ODEs) from sparse, noisy observations. We propose FIM-ODE, the first framework to bring a foundation inference model to the in-context learning paradigm for ODEs, enabling generalization to unseen dynamical systems without fine-tuning. The method employs a joint architecture combining a pre-trained neural network and a neural operator, trained exclusively on synthetic data, to robustly identify vector field structure and infer the form of the governing equations. Experiments demonstrate that FIM-ODE matches state-of-the-art neural ODE methods in parameter estimation accuracy while faithfully recovering qualitative dynamical behaviors, including fixed points, limit cycles, and other topological features. By bridging foundation-model scalability with interpretable, continuous-time modeling, FIM-ODE advances both generalization and physical interpretability in data-driven ODE discovery.
📝 Abstract
Ordinary differential equations (ODEs) describe dynamical systems evolving deterministically in continuous time. Accurate data-driven modeling of systems as ODEs, a central problem across the natural sciences, remains challenging, especially if the data is sparse or noisy. We introduce FIM-ODE (Foundation Inference Model for ODEs), a pretrained neural model designed to estimate ODEs zero-shot (i.e., in context) from sparse and noisy observations. Trained on synthetic data, the model utilizes a flexible neural operator for robust ODE inference, even from corrupted data. We empirically verify that FIM-ODE provides accurate estimates, on par with a state-of-the-art neural method, and qualitatively compare the structure of the vector fields estimated by the two methods.
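To make the problem setting concrete, the following sketch generates the kind of input the abstract describes: a trajectory of a known ODE observed only at sparse, irregular time points and corrupted by noise. The Van der Pol oscillator, the subsampling rate, and the noise level are illustrative choices, not details taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

# Van der Pol oscillator: an example 2-D dynamical system with a limit cycle.
def vdp(t, x, mu=1.0):
    return [x[1], mu * (1.0 - x[0] ** 2) * x[1] - x[0]]

# Dense ground-truth trajectory from the true vector field.
t_dense = np.linspace(0.0, 10.0, 500)
sol = solve_ivp(vdp, (0.0, 10.0), [2.0, 0.0], t_eval=t_dense)

# Sparse, noisy observations: irregular subsample plus Gaussian noise.
# A zero-shot inference model would receive only (t_obs, x_obs) and
# estimate the vector field without any fine-tuning.
idx = np.sort(rng.choice(t_dense.size, size=25, replace=False))
t_obs = t_dense[idx]                                        # shape (25,)
x_obs = sol.y[:, idx] + rng.normal(scale=0.1, size=(2, 25))  # shape (2, 25)
```

The pair `(t_obs, x_obs)` is the entire input available at inference time; no equations or parameters of the underlying system are revealed to the model.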