Oscillators Are All You Need: Irregular Time Series Modelling via Damped Harmonic Oscillators with Closed-Form Solutions

📅 2026-02-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Transformer architectures struggle to effectively model irregularly sampled time series, while methods based on Neural Ordinary Differential Equations (NODEs) enable continuous-time dynamics but incur high computational costs due to reliance on numerical solvers. This work proposes a novel continuous-time attention mechanism that models key-value pairs as linear damped driven harmonic oscillators admitting closed-form solutions, with query vectors expanded in a sinusoidal basis, thereby interpreting attention as a resonance process. The approach eliminates the need for numerical integration, removing the primary computational bottleneck while preserving universal approximation capability. Empirical results demonstrate that the proposed model achieves state-of-the-art performance across multiple irregular time series benchmarks and offers inference speeds several orders of magnitude faster than existing approaches such as ContiFormer.
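The summary's central claim is that a linear damped, driven harmonic oscillator can be evaluated at arbitrary irregular timestamps in closed form, with no numerical solver. A minimal sketch of that standard physics result (our illustration of the textbook underdamped solution, not the paper's exact parameterisation):

```python
# Closed-form evaluation of a damped, driven harmonic oscillator at
# irregular timestamps -- the mechanism that lets the paper skip
# numerical ODE solvers. Standard underdamped solution (gamma < omega0) of
#   x''(t) + 2*gamma*x'(t) + omega0**2 * x(t) = F * cos(Omega * t)
import numpy as np

def closed_form_oscillator(t, gamma, omega0, F, Omega, x0, v0):
    """Evaluate x(t) exactly at arbitrary (irregular) timestamps t."""
    t = np.asarray(t, dtype=float)
    # Steady-state (particular) response: amplitude and phase lag.
    denom = np.sqrt((omega0**2 - Omega**2) ** 2 + (2 * gamma * Omega) ** 2)
    A_p = F / denom
    phi = np.arctan2(2 * gamma * Omega, omega0**2 - Omega**2)
    x_p = A_p * np.cos(Omega * t - phi)
    # Transient (homogeneous) response, matched to the initial conditions.
    omega_d = np.sqrt(omega0**2 - gamma**2)            # damped frequency
    C1 = x0 - A_p * np.cos(phi)                        # from x(0) = x0
    C2 = (v0 + gamma * C1 - A_p * Omega * np.sin(phi)) / omega_d  # from x'(0) = v0
    x_h = np.exp(-gamma * t) * (C1 * np.cos(omega_d * t) + C2 * np.sin(omega_d * t))
    return x_h + x_p

# No solver loop: cost is O(1) per timestamp, however unevenly spaced.
ts = np.array([0.0, 0.13, 0.9, 2.71, 7.0])             # irregular grid
xs = closed_form_oscillator(ts, gamma=0.2, omega0=2.0, F=1.0, Omega=1.5,
                            x0=1.0, v0=0.0)
```

Because the trajectory is an explicit function of `t`, irregular sampling costs nothing extra, which is exactly the bottleneck NODE-based models pay solvers to overcome.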

📝 Abstract
Transformers excel at time series modelling through attention mechanisms that capture long-term temporal patterns. However, they assume uniform time intervals and therefore struggle with irregular time series. Neural Ordinary Differential Equations (NODEs) effectively handle irregular time series by modelling hidden states as continuously evolving trajectories. ContiFormer (arXiv:2402.10635) combines NODEs with Transformers, but inherits the computational bottleneck of the former by relying on heavy numerical solvers. This bottleneck could be removed by using a closed-form solution for the given dynamical system - but such solutions are intractable in general! We obviate this by replacing NODEs with a novel linear damped harmonic oscillator analogy, which has a known closed-form solution. We model keys and values as damped, driven oscillators and expand the query in a sinusoidal basis up to a suitable number of modes. This analogy naturally captures the query-key coupling that is fundamental to any transformer architecture by modelling attention as a resonance phenomenon. Our closed-form solution eliminates the computational overhead of numerical ODE solvers while preserving expressivity. We prove that this oscillator-based parameterisation maintains the universal approximation property of continuous-time attention; specifically, any discrete attention matrix realisable by ContiFormer's continuous keys can be approximated arbitrarily well by our fixed oscillator modes. Our approach delivers both theoretical guarantees and scalability, achieving state-of-the-art performance on irregular time series benchmarks while being orders of magnitude faster.
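The abstract frames attention as resonance: keys are oscillators with natural frequencies, queries carry a sinusoidal spectrum, and a key scores highly when the query drives it near resonance. A toy sketch of that idea (our own construction using the standard driven-oscillator amplitude formula; the function names and score definition are illustrative assumptions, not the paper's equations):

```python
# Toy "attention as resonance": score each key by the steady-state
# amplitude its oscillator would reach when driven at the query's
# sinusoidal-basis frequencies, then softmax over keys.
import numpy as np

def resonance_attention(q_coeffs, q_freqs, key_omegas, key_gammas, values):
    """q_coeffs/q_freqs: (M,) query expansion; key_*: (K,); values: (K, D)."""
    Om = q_freqs[None, :]      # (1, M) driving frequencies
    wk = key_omegas[:, None]   # (K, 1) natural frequencies
    gk = key_gammas[:, None]   # (K, 1) damping coefficients
    # Steady-state amplitude of oscillator (wk, gk) driven at frequency Om;
    # it peaks (resonates) when Om is close to wk.
    response = 1.0 / np.sqrt((wk**2 - Om**2) ** 2 + (2 * gk * Om) ** 2)  # (K, M)
    scores = response @ q_coeffs                   # (K,) sum over query modes
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over keys
    return weights, weights @ values               # attention weights and output
```

With a query concentrated at one frequency, the key whose natural frequency matches it receives the largest weight, which is the query-key coupling the abstract describes.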
Problem

Research questions and friction points this paper is trying to address.

irregular time series
computational bottleneck
neural ODEs
time series modelling
non-uniform sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Damped Harmonic Oscillators
Closed-Form Solution
Irregular Time Series
Continuous-Time Attention
Transformer
Yashas Shende
Department of Physics, Ashoka University, Haryana, India (HR – 131029)
Aritra Das
University of Maryland, College Park
Machine learning · Condensed matter theory · Lattice gauge theories
Reva Laxmi Chauhan
Department of Computer Science, Ashoka University, Haryana, India (HR – 131029)
Arghya Pathak
Department of Computer Science, Ashoka University, Haryana, India (HR – 131029)
Debayan Gupta
Massachusetts Institute of Technology
Cryptography · Secure Multi-Party Computation · Privacy · Databases