Log Neural Controlled Differential Equations: The Lie Brackets Make a Difference

📅 2024-02-28
🏛️ International Conference on Machine Learning
📈 Citations: 17
Influential: 3
🤖 AI Summary
To address the weak capture of long-term dependencies and the low training efficiency of Neural Controlled Differential Equations (NCDEs) on irregularly sampled multivariate time series, this paper proposes Log-NCDEs, a novel framework that introduces the Log-ODE method into NCDE training for the first time. Grounded in rough path theory, Log-NCDEs explicitly encode the non-commutative structure of the vector field via Lie bracket expansions, built from high-order terms of the path signature. This design improves the accuracy of the solution approximation and the stability of gradients, thereby strengthening long-horizon temporal dependency modelling. Extensive experiments on real-world and synthetic datasets with up to 50,000 irregular observations show that Log-NCDEs consistently outperform state-of-the-art models, including NCDEs, NRDEs, the linear recurrent unit (LRU), S5, and Mamba, in both predictive accuracy and training speed. Log-NCDEs thus combine theoretical rigor with practical efficiency for irregular time series modelling.
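The summary above hinges on the Lie bracket of two vector fields, [f, g](x) = Jg(x) f(x) − Jf(x) g(x), which measures how far the fields are from commuting. As a quick illustration (a generic numerical sketch, not the paper's construction; all names here are made up), for linear fields f(x) = Ax and g(x) = Bx the bracket reduces to (BA − AB)x, so it vanishes exactly when A and B commute:

```python
import numpy as np

# Illustrative sketch: Lie bracket [f, g](x) = Jg(x) f(x) - Jf(x) g(x),
# with Jacobians estimated by central finite differences.

def jacobian(f, x, eps=1e-6):
    n = x.size
    J = np.zeros((f(x).size, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return J

def lie_bracket(f, g, x):
    return jacobian(g, x) @ f(x) - jacobian(f, x) @ g(x)

# Two non-commuting linear vector fields f(x) = A x and g(x) = B x,
# for which the bracket is exactly (B A - A B) x.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])
f = lambda x: A @ x
g = lambda x: B @ x

x = np.array([1.0, 2.0])
print(lie_bracket(f, g, x))  # equals (B @ A - A @ B) @ x, i.e. [-1.  2.]
```

Since AB ≠ BA here, the bracket is nonzero; it is exactly these correction terms that the Log-ODE method feeds into the approximation of the CDE's solution.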

📝 Abstract
The vector field of a controlled differential equation (CDE) describes the relationship between a control path and the evolution of a solution path. Neural CDEs (NCDEs) treat time series data as observations from a control path, parameterise a CDE's vector field using a neural network, and use the solution path as a continuously evolving hidden state. As their formulation makes them robust to irregular sampling rates, NCDEs are a powerful approach for modelling real-world data. Building on neural rough differential equations (NRDEs), we introduce Log-NCDEs, a novel, effective, and efficient method for training NCDEs. The core component of Log-NCDEs is the Log-ODE method, a tool from the study of rough paths for approximating a CDE's solution. Log-NCDEs are shown to outperform NCDEs, NRDEs, the linear recurrent unit, S5, and MAMBA on a range of multivariate time series datasets with up to $50{,}000$ observations.
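The abstract describes a hidden state evolved by a CDE, dh = f(h) dX, driven by increments of the control path X. A toy explicit-Euler discretisation (an illustrative sketch only, not the paper's Log-ODE scheme; the shapes and the stand-in vector field are made up) looks like:

```python
import numpy as np

rng = np.random.default_rng(0)

hidden, channels = 4, 2
# Stand-in for a learned vector field f_theta: maps the hidden state h to a
# (hidden x channels) matrix that is contracted with increments of X.
W = rng.normal(scale=0.5, size=(hidden * channels, hidden))

def vector_field(h):
    return np.tanh(W @ h).reshape(hidden, channels)

# Irregularly sampled observations of a 2-d control path X.
X = rng.normal(size=(6, channels))

h = rng.normal(size=hidden)  # initial hidden state (would be learned)
for k in range(len(X) - 1):
    # One explicit Euler step of dh = f_theta(h) dX over the interval.
    h = h + vector_field(h) @ (X[k + 1] - X[k])
print(h.shape)  # (4,)
```

Because each step depends only on the increment X[k+1] − X[k], the update is well defined whatever the spacing of the observations, which is the sense in which NCDEs are robust to irregular sampling; Log-NCDEs replace this first-order update with higher-order Log-ODE steps.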
Problem

Research questions and friction points this paper is trying to address.

Enhancing neural CDE training efficiency using Lie brackets
Improving time series modeling with rough path approximations
Outperforming existing methods on large multivariate datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Log-NCDEs use Lie brackets for efficient training
Log-ODE method approximates CDE solutions effectively
Method outperforms existing models on time series
Benjamin Walker
Mathematical Institute, University of Oxford, Oxford, UK
Andrew D. McLeod
Mathematical Institute, University of Oxford, Oxford, UK
Tiexin Qin
City University of Hong Kong
Machine Learning · Dynamical Systems
Yichuan Cheng
Department of Electrical Engineering, City University of Hong Kong, Hong Kong
Haoliang Li
Department of Electrical Engineering, City University of Hong Kong
AI Security · Information Forensics and Security · Machine Learning
Terry Lyons
Mathematical Institute, University of Oxford, Oxford, UK