Deep Time-series Forecasting Needs Kernelized Moment Balancing

📅 2026-01-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a limitation of existing deep time-series forecasting methods: they rely on fixed balancing functions and fail to satisfy the first-moment matching condition required for distributional balance, leading to misalignment between predicted and true distributions. The authors reformulate forecasting as a distributional balancing problem and introduce a kernelized moment balancing mechanism that adaptively selects the most informative balancing functions in a reproducing kernel Hilbert space (RKHS). This approach achieves first-moment matching for arbitrary balancing functions, rigorously fulfilling Imbens' distributional balancing criterion. The resulting differentiable, tractable objective enables end-to-end optimization and seamless integration with mainstream deep forecasting models. Extensive experiments show that the proposed method consistently improves prediction accuracy across multiple benchmark models and datasets, achieving state-of-the-art performance and confirming its effectiveness and generalizability.

📝 Abstract
Deep time-series forecasting can be formulated as a distribution balancing problem aimed at aligning the distributions of forecasts and ground truths. According to Imbens' criterion, true distribution balance requires matching the first moments with respect to any balancing function. We demonstrate that existing objectives fail to meet this criterion, as they enforce moment matching only for one or two predefined balancing functions, and thus fail to achieve full distribution balance. To address this limitation, we propose direct forecasting with kernelized moment balancing (KMB-DF). Unlike existing objectives, KMB-DF adaptively selects the most informative balancing functions from a reproducing kernel Hilbert space (RKHS) to enforce sufficient distribution balance. We derive a tractable and differentiable objective that enables efficient estimation from empirical samples and seamless integration into gradient-based training pipelines. Extensive experiments across multiple models and datasets show that KMB-DF consistently improves forecasting accuracy and achieves state-of-the-art performance. Code is available at https://anonymous.4open.science/r/KMB-DF-403C.
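Matching first moments over all functions in an RKHS unit ball is the idea behind the maximum mean discrepancy (MMD), which such kernelized balancing objectives closely resemble. Below is a minimal NumPy sketch of an MMD-style moment-balancing loss between forecasts and ground truths; it is an illustration of the general technique, not the paper's exact KMB-DF objective, and the function names, RBF kernel choice, and `gamma` parameter are assumptions.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: k(x_i, y_j) = exp(-gamma * ||x_i - y_j||^2)
    d = x[:, None, :] - y[None, :, :]
    return np.exp(-gamma * np.sum(d * d, axis=-1))

def kernel_moment_balance_loss(preds, targets, gamma=1.0):
    # Squared MMD between the empirical distributions of forecasts and
    # ground truths. Driving it to zero matches the first moment of every
    # balancing function in the RKHS unit ball, not just one or two
    # predefined ones.
    k_pp = rbf_kernel(preds, preds, gamma).mean()
    k_tt = rbf_kernel(targets, targets, gamma).mean()
    k_pt = rbf_kernel(preds, targets, gamma).mean()
    return k_pp + k_tt - 2.0 * k_pt
```

Because the loss is a smooth function of `preds`, it can be minimized by gradient descent alongside a standard pointwise forecasting loss; the paper's contribution is a principled, adaptive version of this kind of objective.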
Problem

Research questions and friction points this paper is trying to address.

deep time-series forecasting
distribution balancing
moment matching
balancing function
Imbens' criterion
Innovation

Methods, ideas, or system contributions that make the work stand out.

kernelized moment balancing
distribution balancing
reproducing kernel Hilbert space
deep time-series forecasting
moment matching