STaTS: Structure-Aware Temporal Sequence Summarization via Statistical Window Merging

📅 2025-10-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Time series exhibit structural characteristics, including change points, repetitive patterns, and local stationarity, yet prevailing representation learning methods often neglect this non-stationarity and heterogeneity, leading to inefficient and brittle long-sequence modeling. To address this, the authors propose STaTS, a lightweight, unsupervised preprocessing framework: it first performs multi-scale statistical change-point detection guided by the Bayesian Information Criterion (BIC) for structure-aware segmentation; it then merges statistically similar windows and summarizes each segment with a simple statistic such as the mean, or a generative model such as a Gaussian Mixture Model (GMM), to produce compact tokenized sequences while preserving semantic fidelity. The method requires no model coupling and is fully plug-and-play. Evaluated across 150+ benchmark datasets, it achieves up to 30× sequence compression while retaining 85–90% of full-model performance, substantially outperforming uniform-sampling and clustering baselines, and demonstrates superior efficiency, robustness, and scalability.
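The BIC-guided segmentation step can be illustrated with a minimal sketch: fit a single Gaussian to a segment, and accept a split wherever modeling the two halves separately lowers the total BIC. This is an illustrative numpy-only toy, not the authors' implementation; the function names, the single-Gaussian cost, and the `min_seg` guard are assumptions, and the paper's multi-scale, multivariate procedure is richer than this.

```python
import numpy as np

def gaussian_bic(x):
    """BIC of a single-Gaussian fit to a 1-D segment (2 params: mean, variance)."""
    n = len(x)
    var = np.var(x) + 1e-8  # guard against zero-variance segments
    log_lik = -0.5 * n * (np.log(2 * np.pi * var) + 1)
    return -2 * log_lik + 2 * np.log(n)

def detect_change_point(x, min_seg=5):
    """Return the split index that lowers total BIC the most, or None.

    A split is accepted only if two separate Gaussian fits describe the
    data better (lower BIC) than one fit over the whole segment.
    """
    n = len(x)
    best_idx, best_bic = None, gaussian_bic(x)
    for i in range(min_seg, n - min_seg):
        bic_split = gaussian_bic(x[:i]) + gaussian_bic(x[i:])
        if bic_split < best_bic:
            best_idx, best_bic = i, bic_split
    return best_idx
```

Applied recursively to each resulting half (and across several window scales), this yields the structure-aware segmentation the summary describes.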

📝 Abstract
Time series data often contain latent temporal structure, such as transitions between locally stationary regimes, repeated motifs, and bursts of variability, that is rarely leveraged in standard representation learning pipelines. Existing models typically operate on raw or fixed-window sequences, treating all time steps as equally informative, which leads to inefficiencies, poor robustness, and limited scalability in long or noisy sequences. We propose STaTS, a lightweight, unsupervised framework for Structure-Aware Temporal Summarization that adaptively compresses both univariate and multivariate time series into compact, information-preserving token sequences. STaTS detects change points across multiple temporal resolutions using a BIC-based statistical divergence criterion, then summarizes each segment using simple functions like the mean or generative models such as GMMs. This process achieves up to 30x sequence compression while retaining core temporal dynamics. STaTS operates as a model-agnostic preprocessor and can be integrated with existing unsupervised time series encoders without retraining. Extensive experiments on 150+ datasets, including classification tasks on the UCR-85, UCR-128, and UEA-30 archives, and forecasting on ETTh1, ETTh2, ETTm1, and Electricity, demonstrate that STaTS enables 85-90% of the full-model performance while offering dramatic reductions in computational cost. Moreover, STaTS improves robustness under noise and preserves discriminative structure, outperforming uniform and clustering-based compression baselines. These results position STaTS as a principled, general-purpose solution for efficient, structure-aware time series modeling.
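The "statistical window merging" of the title can be sketched as a greedy pass that fuses adjacent fixed windows whose statistics do not diverge. This is a simplified stand-in under stated assumptions: a pooled-standard-deviation mean test replaces the paper's BIC-based divergence criterion, and `win` and `z_thresh` are illustrative parameters, not values from the paper.

```python
import numpy as np

def merge_windows(x, win=10, z_thresh=1.0):
    """Greedily merge adjacent fixed-size windows whose means differ by less
    than z_thresh pooled standard deviations, leaving longer segments over
    locally stationary stretches and boundaries at statistical changes."""
    windows = [x[i:i + win] for i in range(0, len(x), win)]
    merged = [windows[0]]
    for w in windows[1:]:
        prev = merged[-1]
        pooled_std = np.sqrt((prev.var() + w.var()) / 2) + 1e-8
        if abs(prev.mean() - w.mean()) / pooled_std < z_thresh:
            merged[-1] = np.concatenate([prev, w])  # statistically similar: fuse
        else:
            merged.append(w)  # divergent: start a new segment
    return merged
```

On a sequence with one regime shift, this collapses many raw windows into two segments, which a downstream summarizer can then reduce to two tokens.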
Problem

Research questions and friction points this paper is trying to address.

Detects change points in time series data
Compresses sequences while preserving temporal dynamics
Enables efficient modeling without retraining existing encoders
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptively compresses time series via statistical window merging
Detects change points using BIC-based divergence criterion
Summarizes segments with functions like mean or GMMs
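The last bullet, mean-based summarization, can be sketched in a few lines: given segment boundaries, each segment collapses to a single token. This is a toy numpy version covering only the mean option (the paper's GMM variant would fit a mixture per segment instead); the function name and boundary convention are assumptions.

```python
import numpy as np

def summarize_segments(x, boundaries):
    """Collapse each segment of x into one token: its mean.

    `boundaries` lists the change-point indices; segments are the spans
    between consecutive boundaries (plus the sequence ends).
    """
    edges = [0] + sorted(boundaries) + [len(x)]
    return np.array([x[s:e].mean() for s, e in zip(edges[:-1], edges[1:])])
```

For a length-60 series with one change point, this yields a 2-token summary, a 30x compression of the kind the abstract reports on favorable inputs.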
Disharee Bhowmick
CSSE Department, Auburn University
Ranjith Ramanathan
Department of Animal and Food Sciences, Oklahoma State University
Sathyanarayanan N. Aakur
Assistant Professor, Auburn University
Event Understanding, Visual Commonsense, Metagenome Analysis