Harnessing Contrastive Learning and Neural Transformation for Time Series Anomaly Detection

📅 2023-04-16
📈 Citations: 1
Influential: 0
🤖 AI Summary
In unsupervised industrial time-series anomaly detection, contrastive learning often suffers from representation collapse and missed anomalies, owing to the difficulty of constructing informative negative samples and the neglect of temporal context. To address this, we propose the Contrastive Neural Transformation (CNT) framework, which integrates window-level contrastive learning with learnable neural transformations. CNT is theoretically shown to avoid trivial constant-encoder solutions, introduces a parameterized time-warping module to explicitly model local temporal anomalies, and jointly optimizes a window-based contrastive loss, self-supervised representation learning, and a reconstruction-based auxiliary regularizer. Evaluated on multiple real-world industrial datasets, CNT achieves an average 9.2% F1-score improvement over state-of-the-art methods, demonstrating superior robustness without requiring any anomaly labels.
📝 Abstract
Time series anomaly detection (TSAD) plays a vital role in many industrial applications. While contrastive learning has gained momentum in the time series domain for its prowess in extracting meaningful representations from unlabeled data, its straightforward application to anomaly detection is not without hurdles. Firstly, contrastive learning typically requires negative sampling to avoid the representation collapse issue, where the encoder converges to a constant solution. However, drawing from the same dataset for dissimilar samples is ill-suited for TSAD as most samples are "normal" in the training dataset. Secondly, conventional contrastive learning focuses on instance discrimination, which may overlook anomalies that are detectable when compared to their temporal context. In this study, we propose a novel approach, CNT, that incorporates a window-based contrastive learning strategy fortified with learnable transformations. This dual configuration focuses on capturing temporal anomalies in local regions while simultaneously mitigating the representation collapse issue. Our theoretical analysis validates the effectiveness of CNT in circumventing constant encoder solutions. Through extensive experiments on diverse real-world industrial datasets, we show the superiority of our framework by outperforming various baselines and model variants.
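The abstract's core idea — score each window against its temporal context via a contrastive objective, with other windows serving as negatives — can be sketched minimally. This is an illustrative reconstruction, not the paper's actual method: the linear encoder `W`, the neighbour-as-positive choice, and the function names are assumptions standing in for CNT's neural encoder and learnable transformations.

```python
import numpy as np

def sliding_windows(series, size, stride=1):
    """Split a 1-D series into windows of length `size`."""
    return np.stack([series[i:i + size]
                     for i in range(0, len(series) - size + 1, stride)])

def embed(window, W):
    """Toy linear encoder with L2 normalisation; a stand-in for the
    paper's neural encoder (illustrative, not the actual architecture)."""
    z = W @ window
    return z / (np.linalg.norm(z) + 1e-8)

def window_contrastive_score(windows, idx, W, temperature=0.1):
    """InfoNCE-style score for window `idx`: its temporal neighbour is the
    positive and all other windows act as negatives. A high score means the
    window is hard to match to its temporal context -- a potential anomaly."""
    z = np.stack([embed(w, W) for w in windows])
    pos = idx + 1 if idx + 1 < len(windows) else idx - 1
    sims = (z[idx] @ z.T) / temperature
    sims[idx] = -np.inf                        # exclude self-similarity
    log_z = np.logaddexp.reduce(sims[np.isfinite(sims)])
    return log_z - sims[pos]                   # negative log-likelihood, >= 0
```

In CNT proper, the positive view would come from a learnable transformation of the window (e.g. the parameterized time-warping module) rather than a raw neighbour, and the encoder would be trained by minimising this loss jointly with the auxiliary objectives described above.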
Problem

Research questions and friction points this paper is trying to address.

Time Series Analysis
Anomaly Detection
Contrastive Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

CNT
Anomaly Detection
Window Contrastive Learning
Jiazhen Chen
Department of Statistics and Actuarial Science, University of Waterloo
Mingbin Feng
Department of Statistics and Actuarial Science, University of Waterloo
Tony S. Wirjanto
Department of Statistics & Actuarial Science, University of Waterloo
Financial Econometrics/Time Series, Financial Mathematics, Computational Finance