🤖 AI Summary
Modeling dynamic systems with long-range memory and stochasticity remains challenging due to the inherent limitations of Markovian assumptions in conventional stochastic differential equations (SDEs).
Method: This paper introduces Neural Stochastic Volterra Equations (NSVEs), the first framework to embed Volterra-type integral operators into neural stochastic modeling, thereby explicitly capturing path dependence and historical cumulative effects while relaxing the Markov constraint. We establish theoretical guarantees on existence, uniqueness, and approximation capacity of NSVE solutions.
Results: Empirically, NSVEs significantly outperform neural SDEs and DeepONets in long-term forecasting accuracy and generalization across benchmark systems with memory, including the disturbed pendulum equation, the generalized Ornstein–Uhlenbeck process, and the rough Heston model. By unifying non-Markovian dynamics with data-driven learning, NSVEs establish a novel paradigm for modeling stochastic dynamical systems with long-range temporal dependencies.
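To make the Volterra-type integral operators concrete: the class of equations the summary refers to has the generic form (notation here is the standard one for SVEs, not copied from the paper)

$$
X_t = X_0 + \int_0^t K(t,s)\, b(s, X_s)\, \mathrm{d}s + \int_0^t K(t,s)\, \sigma(s, X_s)\, \mathrm{d}W_s ,
$$

where $W$ is a Brownian motion and $K$ is a (possibly singular) Volterra kernel. Because $K$ depends on both the current time $t$ and the integration time $s$, the increment of $X$ at time $t$ reweights the entire past trajectory, which is what breaks the Markov property; the SDE case is recovered when $K \equiv 1$. In a neural SVE, the coefficients $b$ and $\sigma$ (and possibly $K$) are parameterized by neural networks.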
📝 Abstract
Stochastic Volterra equations (SVEs) serve as mathematical models for the time evolution of random systems with memory effects and irregular behaviour. We introduce neural stochastic Volterra equations as a physics-inspired architecture, generalizing the class of neural stochastic differential equations, and provide some theoretical foundation. Numerical experiments on various SVEs, such as the disturbed pendulum equation, the generalized Ornstein–Uhlenbeck process and the rough Heston model, are presented, comparing the performance of neural SVEs, neural SDEs and Deep Operator Networks (DeepONets).
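The experiments above rest on discretizing an SVE on a time grid. As a minimal sketch (not the paper's implementation), the following Euler scheme simulates a scalar SVE with the fractional kernel $K(u) = u^{H-1/2}/\Gamma(H+1/2)$ used in rough-volatility models; the function name `simulate_sve` and the plain callables `b` and `sigma`, which stand in for trained neural networks, are illustrative assumptions.

```python
import numpy as np
from math import gamma


def simulate_sve(b, sigma, x0, H=0.3, T=1.0, n=200, rng=None):
    """Euler scheme for the scalar stochastic Volterra equation

        X_t = x0 + int_0^t K(t-s) b(X_s) ds + int_0^t K(t-s) sigma(X_s) dW_s

    with the fractional kernel K(u) = u^(H-1/2) / Gamma(H+1/2).
    In a neural SVE, b and sigma would be neural networks; here they are
    arbitrary callables mapping arrays to arrays.
    """
    rng = np.random.default_rng(rng)
    dt = T / n
    kernel = lambda u: u ** (H - 0.5) / gamma(H + 0.5)
    dW = rng.normal(0.0, np.sqrt(dt), size=n)  # Brownian increments
    X = np.empty(n + 1)
    X[0] = x0
    for i in range(1, n + 1):
        j = np.arange(i)                 # past grid indices 0..i-1
        w = kernel((i - j) * dt)         # kernel weights K(t_i - t_j), lag >= dt
        # the whole past path is reweighted at every step: non-Markovian update
        X[i] = x0 + np.sum(w * b(X[j])) * dt + np.sum(w * sigma(X[j]) * dW[j])
    return X


# Example: mean-reverting drift with constant noise, a toy stand-in for a
# generalized Ornstein-Uhlenbeck-type dynamic.
path = simulate_sve(lambda x: -x, lambda x: 0.3 + 0.0 * x, x0=1.0, rng=0)
```

Note the quadratic cost in the number of steps: because $K$ couples every time $t_i$ to all earlier $t_j$, each step sums over the full history, unlike the $O(1)$-per-step update of a Markovian SDE solver.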