🤖 AI Summary
To address the weak long-range dependency capture and low training efficiency of Neural Controlled Differential Equations (NCDEs) when modelling irregularly sampled multivariate time series, this paper proposes Log-NCDEs, a framework that applies the Log-ODE method during NCDE training. Grounded in rough path theory, Log-NCDEs pair high-order log-signatures of the control path with iterated Lie brackets of the neural vector field, explicitly capturing the vector field's non-commutative structure. This improves the accuracy of the solution approximation over each integration step and, in turn, the modelling of long-horizon temporal dependencies. Experiments on real-world and synthetic datasets with up to 50,000 irregularly sampled observations show that Log-NCDEs outperform strong baselines, including NCDEs, NRDEs, the linear recurrent unit (LRU), S5, and MAMBA, in predictive accuracy while remaining efficient to train. Log-NCDEs thus combine the theoretical rigour of rough path analysis with practical efficiency for irregular time series modelling.
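To make the Lie-bracket idea concrete, here is a minimal numerical sketch of a depth-2 Log-ODE step, not the paper's implementation: the vector fields are taken to be linear maps `A[i]` (a simplifying assumption; in Log-NCDEs they come from a neural network), the path segment is summarised by its increments and Lévy area (its depth-2 log-signature), and the bracket term corrects for the fields failing to commute. All sizes and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
d, c = 3, 2  # hidden dimension, path channels (hypothetical sizes)

# Linear vector fields f_i(h) = A[i] @ h, one per path channel.
A = rng.normal(scale=0.4, size=(c, d, d))

def lie_bracket(Ai, Aj):
    # For linear fields, [f_i, f_j](h) = (Aj @ Ai - Ai @ Aj) @ h.
    return Aj @ Ai - Ai @ Aj

# A short segment of a 2-channel path (toy data).
X = np.cumsum(rng.normal(size=(20, c)), axis=0)

# Depth-2 log-signature of the segment: channel increments and Levy area.
inc = X[-1] - X[0]
dX = np.diff(X, axis=0)
dev = X[:-1] - X[0]
area = 0.5 * (dev[:, 0] @ dX[:, 1] - dev[:, 1] @ dX[:, 0])  # A^{12}

# Log-ODE vector field: increments weight the fields, the area weights
# the Lie bracket, which encodes the fields' non-commutativity.
M = sum(inc[i] * A[i] for i in range(c)) + area * lie_bracket(A[0], A[1])

# Solve dh/du = M @ h over u in [0, 1] by Euler steps
# (for linear fields the exact answer would be expm(M) @ h0).
h = np.ones(d)
steps = 200
for _ in range(steps):
    h = h + (1.0 / steps) * (M @ h)

print(h.shape)
```

If the vector fields commute (all brackets vanish), the area term drops out and the step reduces to an ordinary increment-driven update; the bracket is exactly what distinguishes depth-2 Log-ODE steps from that simpler scheme.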
📝 Abstract
The vector field of a controlled differential equation (CDE) describes the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) treat time series data as observations from a control path, parameterise a CDE's vector field using a neural network, and use the solution path as a continuously evolving hidden state. As their formulation makes them robust to irregular sampling rates, NCDEs are a powerful approach for modelling real-world data. Building on neural rough differential equations (NRDEs), we introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs. The core component of Log-NCDEs is the Log-ODE method, a tool from the study of rough paths for approximating a CDE's solution. Log-NCDEs are shown to outperform NCDEs, NRDEs, the linear recurrent unit, S5, and MAMBA on a range of multivariate time series datasets with up to $50{,}000$ observations.
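The abstract's core object, a hidden state driven by a control path through a learned vector field, can be sketched in a few lines. The following is an illustrative Euler discretisation of $\mathrm{d}h = f_\theta(h)\,\mathrm{d}X$ with a toy single-layer "network" standing in for $f_\theta$; the weights, sizes, and data are all hypothetical, and time is included as a channel of $X$ as is standard for NCDEs. Because the update is driven by path increments, irregular sampling is handled without any special casing.

```python
import numpy as np

rng = np.random.default_rng(0)
d, c = 4, 2  # hidden dimension, control-path channels (hypothetical sizes)

# Toy vector field f: R^d -> R^{d x c}, a single tanh layer with random
# weights standing in for a trained neural network.
W = rng.normal(scale=0.3, size=(d * c, d))
b = rng.normal(scale=0.1, size=d * c)

def vector_field(h):
    return np.tanh(W @ h + b).reshape(d, c)

# Irregularly sampled control path: channel 0 is (uneven) time,
# channel 1 is a toy observed signal.
n = 50
X = np.empty((n, c))
X[:, 0] = np.cumsum(rng.exponential(scale=0.1, size=n))  # uneven stamps
X[:, 1] = np.cumsum(rng.normal(size=n))

# Euler discretisation of dh = f(h) dX: the hidden state evolves
# continuously, driven by increments of the control path.
h = np.zeros(d)
for k in range(n - 1):
    dX = X[k + 1] - X[k]
    h = h + vector_field(h) @ dX

print(h.shape)
```

A plain NCDE solves this system directly with an ODE solver; NRDEs and Log-NCDEs instead replace each small increment step with a step driven by a log-signature summary of the path over a longer interval.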