Closing the Gap Between Synthetic and Ground Truth Time Series Distributions via Neural Mapping

๐Ÿ“… 2025-01-29
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
To address fidelity degradation in time-series generation caused by distribution distortion and reconstruction artifacts, this paper proposes NM-VQTSGโ€”a novel framework that introduces the U-Net-based neural mapping mechanism into the post-processing stage of vector-quantized (VQ) time-series generation. Unlike prior approaches, NM-VQTSG enables plug-and-play, model-agnostic latent-space distribution alignment without retraining. It overcomes two fundamental bottlenecks inherent to discrete VQ: information loss due to quantization and mismatch between the learned codebook prior and true data distribution. Evaluated across multiple UCR time-series datasets, NM-VQTSG significantly improves both distributional and structural fidelity: Frรฉchet Inception Distance (FID) decreases by 37% on average, while Inception Score (IS) and conditional FID also improve consistently. Visual analysis confirms enhanced alignment between the data space and the VQ latent space, validating improved distributional consistency.

Technology Category

Application Category

๐Ÿ“ Abstract
In this paper, we introduce Neural Mapper for Vector Quantized Time Series Generator (NM-VQTSG), a novel method aimed at addressing fidelity challenges in vector quantized (VQ) time series generation. VQ-based methods, such as TimeVQVAE, have demonstrated success in generating time series but are hindered by two critical bottlenecks: information loss during compression into discrete latent spaces and deviations in the learned prior distribution from the ground truth distribution. These challenges result in synthetic time series with compromised fidelity and distributional accuracy. To overcome these limitations, NM-VQTSG leverages a U-Net-based neural mapping model to bridge the distributional gap between synthetic and ground truth time series. To be more specific, the model refines synthetic data by addressing artifacts introduced during generation, effectively aligning the distributions of synthetic and real data. Importantly, NM-VQTSG can be used for synthetic time series generated by any VQ-based generative method. We evaluate NM-VQTSG across diverse datasets from the UCR Time Series Classification archive, demonstrating its capability to consistently enhance fidelity in both unconditional and conditional generation tasks. The improvements are evidenced by significant improvements in FID, IS, and conditional FID, additionally backed up by visual inspection in a data space and a latent space. Our findings establish NM-VQTSG as a new method to improve the quality of synthetic time series. Our implementation is available on url{https://github.com/ML4ITS/TimeVQVAE}.
Problem

Research questions and friction points this paper is trying to address.

Time Series Generation
Fidelity Issue
Distribution Discrepancy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural Mapper for Vector Quantized Time Series Generator (NM-VQTSG)
U-Net Neural Network Model
Fidelity Improvement in Time Series Generation
๐Ÿ”Ž Similar Papers
No similar papers found.