🤖 AI Summary
Diffusion models for time-series forecasting suffer from fixed source distributions and inflexible, unidirectional sampling paths. Flow matching offers computational efficiency and modeling flexibility, but its potential to explicitly model the prediction errors of an auxiliary model has remained unexplored. This paper proposes Conditional Guided Flow Matching (CGFM), presented as the first method to incorporate a prior model's prediction errors as explicit conditioning signals, jointly with historical observations, to construct a two-sided conditional probability path. To enhance expressivity and robustness, CGFM uses a general affine transformation to expand the space of probability paths. By combining flow matching, error-guided conditioning, and this affine path design, CGFM balances high-fidelity generation with inference efficiency. Extensive experiments across diverse time-series forecasting benchmarks show that CGFM consistently outperforms state-of-the-art methods, achieving superior accuracy and strong generalization across domains.
📝 Abstract
Diffusion models, a class of generative models, have shown promise in time series forecasting, but they suffer from rigid source distributions and limited sampling paths, which hinder their performance. Flow matching offers faster generation, higher-quality outputs, and greater flexibility; it can also exploit valuable information in the prediction errors of prior models, information that was previously inaccessible yet critically important. To address these challenges and unlock this untapped potential, we propose Conditional Guided Flow Matching (CGFM). CGFM extends flow matching by incorporating the outputs of an auxiliary model, enabling a previously unattainable capability in the field: learning from the auxiliary model's errors. For time series forecasting, it integrates historical data as both conditions and guidance, constructs two-sided conditional probability paths, and uses a general affine path to expand the space of probability paths, ultimately yielding improved predictions. Extensive experiments show that CGFM consistently enhances and outperforms state-of-the-art models, highlighting its effectiveness in advancing forecasting methods.
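To make the core idea concrete, the sketch below shows *standard* conditional flow matching with a linear (affine) interpolation path and history conditioning, the generic machinery the abstract builds on. It is not the paper's CGFM method: the toy data, the linear feature map, and all variable names are illustrative assumptions, and a real model would use a neural velocity network and the paper's two-sided, error-guided path construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy setup (NOT the paper's data or architecture):
# a "future" window x1 correlated with a "history" window h,
# and Gaussian noise x0 as the source distribution.
n, d = 512, 4
h = rng.normal(size=(n, d))                    # conditioning: history window
x1 = 2.0 * h + 0.1 * rng.normal(size=(n, d))   # target: future values
x0 = rng.normal(size=(n, d))                   # source samples

def features(xt, t, h):
    # Stand-in for a velocity network: simple linear features [x_t, t, h, 1].
    return np.concatenate([xt, t, h, np.ones((xt.shape[0], 1))], axis=1)

W = np.zeros((2 * d + 2, d))
lr = 0.05
losses = []
for step in range(300):
    t = rng.uniform(size=(n, 1))
    xt = (1 - t) * x0 + t * x1        # linear (affine) probability path
    v_target = x1 - x0                # conditional velocity along that path
    phi = features(xt, t, h)
    err = phi @ W - v_target
    losses.append(float(np.mean(err ** 2)))
    W -= lr * phi.T @ err / n         # gradient step on the regression loss

# Sampling: Euler-integrate the learned, history-conditioned velocity field
# from source noise toward a forecast.
x = rng.normal(size=(n, d))
steps = 50
for i in range(steps):
    t = np.full((n, 1), i / steps)
    x = x + (1.0 / steps) * (features(x, t, h) @ W)
```

The regression target `x1 - x0` is the constant velocity of the straight-line path between a source and a target sample; conditioning on `h` is what makes the learned field a forecaster rather than an unconditional generator. CGFM additionally conditions on an auxiliary model's outputs so the flow can learn from that model's errors, which this generic sketch does not attempt.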