AI Summary
This work addresses anomaly detection in multivariate time series, where complex temporal dependencies and inherent uncertainty make rare events hard to identify reliably. To this end, we propose temporal conditional normalizing flows (tcNF), a novel autoregressive probabilistic model that integrates a temporal conditioning mechanism into normalizing flows. By modeling the joint probability distribution of each observation conditioned on its history, tcNF captures both evolving temporal patterns and their associated uncertainty, and detects anomalies as low-probability events under the learned distribution. Extensive experiments show that tcNF outperforms current state-of-the-art methods across multiple benchmark datasets in both accuracy and robustness. The source code is publicly available to support reproducible research.
Abstract
This paper introduces temporal conditional normalizing flows (tcNF), a framework for anomaly detection in time series that explicitly models temporal dependencies and uncertainty. By conditioning normalizing flows on previous observations, tcNF captures complex temporal dynamics and produces accurate probability distributions over expected behavior. This autoregressive formulation enables robust anomaly detection by identifying low-probability events under the learned distribution. We evaluate tcNF on diverse datasets, demonstrating competitive accuracy and robustness relative to existing methods. A comprehensive analysis of strengths and limitations, together with open-source code, is provided to facilitate reproducibility and future research.
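The core idea described above, conditioning a normalizing flow on a window of past observations and scoring each step by its negative log-likelihood, can be illustrated with a minimal sketch. The paper's conditioner architecture and flow layers are not specified here, so this sketch assumes a single conditional affine flow whose shift and log-scale come from a linear map of the flattened history window (the names `W_mu`, `W_logs`, `affine_flow_logprob`, and `anomaly_scores` are hypothetical, not from tcNF):

```python
import numpy as np

def affine_flow_logprob(x, h, W_mu, W_logs):
    """Log-density of observation x under a conditional affine flow.

    The flow maps x -> z = (x - mu(h)) * exp(-log_s(h)), where the shift
    mu and log-scale log_s depend on the history embedding h (a linear
    conditioner here; an assumed stand-in for the paper's mechanism).
    With a standard Gaussian base density, the change of variables gives
        log p(x | h) = log N(z; 0, I) - sum(log_s(h)).
    """
    mu = h @ W_mu          # conditional shift
    log_s = h @ W_logs     # conditional log-scale
    z = (x - mu) * np.exp(-log_s)
    log_base = -0.5 * np.sum(z**2 + np.log(2.0 * np.pi), axis=-1)
    return log_base - np.sum(log_s, axis=-1)

def anomaly_scores(series, window, W_mu, W_logs):
    """Score each time step by its negative log-likelihood given the
    previous `window` observations (flattened as the conditioning vector);
    higher scores mark lower-probability, i.e. more anomalous, events."""
    scores = []
    for t in range(window, len(series)):
        h = series[t - window:t].reshape(-1)
        scores.append(-affine_flow_logprob(series[t], h, W_mu, W_logs))
    return np.array(scores)
```

In practice the conditioner weights would be learned by maximizing the log-likelihood of normal training data, and a stack of flow layers would replace the single affine transform; the sketch only shows how the autoregressive conditioning and low-probability scoring fit together.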