🤖 AI Summary
This work addresses the challenges of insufficient modeling of multi-scale temporal patterns and structural heterogeneity, along with data scarcity and privacy concerns, in time series generation. To this end, the authors propose a structure-disentangled, multi-scale generative framework that employs a dual-path Vector Quantized Variational Autoencoder (VQ-VAE) to separately capture trend and seasonal components, yielding semantically consistent discrete latent representations. A coarse-to-fine autoregressive generation strategy is introduced, augmented with a guidance-based reconstruction mechanism that leverages coarse-grained seasonal signals to inform fine-grained pattern synthesis. Evaluated on six benchmark datasets, the proposed method significantly outperforms existing approaches in generation quality and long-range dependency modeling while requiring substantially fewer parameters, demonstrating notable efficiency and scalability.
📝 Abstract
Generative modeling offers a promising solution to data scarcity and privacy challenges in time series analysis. However, the structural complexity of time series, characterized by multi-scale temporal patterns and heterogeneous components, remains insufficiently addressed. In this work, we propose a structure-disentangled multi-scale generation framework for time series. Our approach encodes sequences into discrete tokens at multiple temporal resolutions and performs autoregressive generation in a coarse-to-fine manner, thereby preserving hierarchical dependencies. To tackle structural heterogeneity, we introduce a dual-path VQ-VAE that disentangles trend and seasonal components, enabling the learning of semantically consistent latent representations. Additionally, we present a guidance-based reconstruction strategy in which coarse seasonal signals serve as priors to guide the reconstruction of fine-grained seasonal patterns. Experiments on six datasets show that our approach produces higher-quality time series than existing methods. Notably, our model achieves strong performance with a significantly reduced parameter count and exhibits superior capability in generating high-quality long-term sequences. Our implementation is available at https://anonymous.4open.science/r/TimeMAR-BC5B.
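To make the dual-path disentanglement idea concrete, here is a minimal, purely illustrative sketch: a series is split into trend and seasonal components, and each component is quantized against its own discrete codebook, as a dual-path VQ-VAE would do. Everything specific here is an assumption, not the paper's method: the trend path is a simple centered moving average rather than a learned encoder, the codebooks are fixed scalar grids rather than trained embeddings, and the quantization is per-timestep rather than over latent patches.

```python
import numpy as np

def decompose(x, window=24):
    # Stand-in for the learned trend path: centered moving average.
    pad = window // 2
    padded = np.pad(x, (pad, pad), mode="edge")
    kernel = np.ones(window + 1) / (window + 1)
    trend = np.convolve(padded, kernel, mode="valid")
    seasonal = x - trend  # residual treated as the seasonal component
    return trend, seasonal

def quantize(z, codebook):
    # VQ step: map each value to the index of its nearest code.
    idx = np.argmin(np.abs(z[:, None] - codebook[None, :]), axis=1)
    return idx, codebook[idx]

rng = np.random.default_rng(0)
t = np.arange(256)
# Toy series: linear trend + daily-like seasonality + noise.
x = 0.01 * t + np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(256)

trend, seasonal = decompose(x)
trend_cb = np.linspace(trend.min(), trend.max(), 16)  # toy trend codebook
seas_cb = np.linspace(-1.2, 1.2, 16)                  # toy seasonal codebook
t_idx, t_quant = quantize(trend, trend_cb)
s_idx, s_quant = quantize(seasonal, seas_cb)

# Each path yields its own discrete token stream; decoding recombines them.
recon = t_quant + s_quant
```

In the paper's framework the two token streams would then be modeled autoregressively across resolutions (coarse to fine); this sketch only shows why separate codebooks keep trend tokens and seasonal tokens semantically distinct.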