Entropic Analysis of Time Series through Kernel Density Estimation

📅 2025-03-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the challenges of modeling multiscale complexity evolution in time series and robustly detecting dynamic change points. Methodologically, it proposes an entropy analysis framework integrating Takens’ embedding with kernel density estimation (KDE). First, it introduces a multiscale KDE entropy variation metric, ΔKE, which quantifies the scale-dependent information expansion within the reconstructed phase space—a novel characterization. Second, it designs a change-point detection algorithm based on sliding-baseline KL divergence, markedly enhancing noise resilience and robustness to outliers. Empirical evaluation demonstrates high-precision identification of injected events in RF signals, accurate ventricular fibrillation detection in ECG data, and successful recognition of intermittent chaotic states. These results validate the framework’s broad applicability and effectiveness across domains including wireless communications, clinical diagnostics, and nonlinear dynamical systems.

📝 Abstract
This work presents a novel framework for time series analysis using entropic measures based on the kernel density estimate (KDE) of the time series' Takens' embeddings. Using this framework we introduce two distinct analytical tools: (1) a multi-scale KDE entropy metric, denoted as $\Delta\text{KE}$, which quantifies the evolution of time series complexity across different scales by measuring certain entropy changes, and (2) a sliding baseline method that employs the Kullback-Leibler (KL) divergence to detect changes in time series dynamics through changes in KDEs. The $\Delta\text{KE}$ metric offers insights into the information content and "unfolding" properties of the time series' embedding related to dynamical systems, while the KL divergence-based approach provides a noise- and outlier-robust approach for identifying time series change points (e.g., injections in RF signals). We demonstrate the versatility and effectiveness of these tools through a set of experiments encompassing diverse domains. In the space of radio frequency (RF) signal processing, we achieve accurate detection of signal injections under varying noise and interference conditions. Furthermore, we apply our methodology to electrocardiography (ECG) data, successfully identifying instances of ventricular fibrillation with high accuracy. Finally, we demonstrate the potential of our tools for dynamic state detection by accurately identifying chaotic regimes within an intermittent signal. These results show the broad applicability of our framework for extracting meaningful insights from complex time series data across various scientific disciplines.
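The abstract's first tool, the $\Delta\text{KE}$ metric, combines a Takens' delay embedding with a KDE-based differential entropy estimate and tracks how that entropy changes as the embedding "unfolds". The following is a minimal sketch of that idea, assuming `scipy.stats.gaussian_kde` as the density estimator and a resubstitution Monte Carlo entropy estimate; the function names, embedding parameters, and test signal are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def takens_embedding(x, dim, delay):
    """Delay-coordinate (Takens') embedding of a 1-D series into R^dim."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

def kde_entropy(points):
    """Monte Carlo (resubstitution) estimate of the differential entropy
    of the distribution underlying `points`, via a Gaussian KDE fit."""
    kde = gaussian_kde(points.T)          # gaussian_kde expects shape (d, n)
    return -np.mean(kde.logpdf(points.T))  # H ≈ -E[log p(X)]

# Illustrative entropy change between two embedding dimensions of a noisy sine
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * rng.standard_normal(len(t))

h1 = kde_entropy(takens_embedding(x, dim=2, delay=5))
h2 = kde_entropy(takens_embedding(x, dim=3, delay=5))
delta_ke = h2 - h1  # entropy change as the embedding unfolds across scales
```

In this sketch, "scale" is varied through the embedding dimension; the paper's exact definition of the scale parameter and entropy estimator may differ.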
Problem

Research questions and friction points this paper is trying to address.

Develops multi-scale entropy metric for time series complexity analysis
Introduces Kullback-Leibler divergence method for change point detection
Applies framework to RF signals and ECG data for anomaly detection
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-scale KDE entropy for complexity evolution
KL divergence for robust change point detection
Takens' embeddings with kernel density estimation
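The second contribution above, sliding-baseline KL divergence for change-point detection, can be sketched as follows: fit a KDE to a trailing baseline window and to the current window, and score each step by the Monte Carlo KL divergence between them. This is a minimal 1-D illustration assuming `scipy.stats.gaussian_kde`; the window sizes, scoring loop, and variance-shift example are hypothetical stand-ins for the paper's actual pipeline.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kl_divergence_kde(sample_p, sample_q):
    """Monte Carlo estimate of KL(p || q) between two 1-D samples,
    using Gaussian KDEs and the samples of p as evaluation points."""
    p = gaussian_kde(sample_p)
    q = gaussian_kde(sample_q)
    return float(np.mean(p.logpdf(sample_p) - q.logpdf(sample_p)))

def sliding_kl(x, window=200, baseline=200):
    """Score each non-overlapping window against the immediately
    preceding baseline window; peaks flag candidate change points."""
    scores = []
    for start in range(baseline, len(x) - window + 1, window):
        base = x[start - baseline : start]
        test = x[start : start + window]
        scores.append(kl_divergence_kde(test, base))
    return np.array(scores)

# Illustrative example: a variance change injected mid-stream
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 1000), rng.normal(0, 3, 1000)])
scores = sliding_kl(x)
change_idx = int(np.argmax(scores))  # window where the dynamics shift
```

Because the score compares densities rather than raw amplitudes, isolated outliers perturb the KDEs only mildly, which is the intuition behind the robustness claim; in the paper the same idea is applied to KDEs of the Takens' embedding rather than the raw 1-D signal.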
Audun Myers
Data Scientist at Pacific Northwest National Laboratory
Complex Data Modeling, Topological Data Analysis, Machine Learning
Bill Kay
Mathematician
Combinatorics, Graph Theory, Information Theory
Iliana Alvarez
Pacific Northwest National Laboratory, California State University
Michael Hughes
Pacific Northwest National Laboratory
Cameron Mackenzie
Pacific Northwest National Laboratory
Carlos Ortiz Marrero
Pacific Northwest National Laboratory
Emily Ellwein
Pacific Northwest National Laboratory
Erik Lentz
Pacific Northwest National Laboratory