🤖 AI Summary
This work addresses the lack of a general theoretical understanding of when persistent entropy reliably detects phase transitions in stochastic and data-driven settings. We establish the first model-agnostic sufficient condition guaranteeing an asymptotically non-vanishing gap in persistent entropy across phase transitions, and introduce a sliding-window topological stabilization framework that enables robust phase transition detection from finite observations. The theory is unified across multiple data modalities, filtration schemes, and homological dimensions. Empirical validation on Kuramoto synchronization, Vicsek collective motion, and neural network training dynamics demonstrates that the stabilization of persistent entropy—accompanied by a collapse in its variability—serves as a universal and robust numerical signature of phase transitions.
📝 Abstract
Persistent entropy (PE) is an information-theoretic summary statistic of persistence barcodes that has been widely used to detect regime changes in complex systems. Despite its empirical success, a general theoretical understanding of when and why persistent entropy reliably detects phase transitions has remained limited, particularly in stochastic and data-driven settings. In this work, we establish a general, model-independent theorem providing sufficient conditions under which persistent entropy provably separates two phases: persistent entropy exhibits an asymptotically non-vanishing gap across phases. The result relies only on continuity of persistent entropy along convergent diagram sequences (or continuity under mild regularization), and is therefore broadly applicable across data modalities, filtrations, and homological degrees. To connect asymptotic theory with finite-time computations, we introduce an operational framework based on topological stabilization, defining a topological transition time by stabilizing a chosen topological statistic over sliding windows, together with a probability-based estimator of critical parameters within a finite observation horizon. We validate the framework on the Kuramoto synchronization transition, the Vicsek order-to-disorder transition in collective motion, and neural network training dynamics across multiple datasets and architectures. Across all experiments, stabilization of persistent entropy and collapse of its variability across realizations provide robust numerical signatures consistent with the theoretical mechanism.
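To make the two central quantities concrete, here is a minimal sketch (not the authors' implementation; function names, the window length, and the tolerance are illustrative assumptions). Persistent entropy is standardly defined as the Shannon entropy of normalized bar lifetimes, and a stabilization time can be read off as the first point where PE's sliding-window variability drops below a tolerance:

```python
import numpy as np

def persistent_entropy(lifetimes):
    """Shannon entropy of normalized barcode lifetimes: the standard PE
    definition, PE = -sum p_i log p_i with p_i = l_i / sum_j l_j."""
    ell = np.asarray(lifetimes, dtype=float)
    ell = ell[ell > 0]          # drop zero-length (diagonal) features
    p = ell / ell.sum()
    return float(-np.sum(p * np.log(p)))

def stabilization_time(pe_series, window=10, tol=1e-2):
    """First index t at which the standard deviation of PE over the
    sliding window [t, t+window) falls below tol; None if it never does.
    Window length and tolerance are illustrative choices."""
    pe = np.asarray(pe_series, dtype=float)
    for t in range(len(pe) - window + 1):
        if pe[t:t + window].std() < tol:
            return t
    return None

# Illustration: PE fluctuates, then settles — stabilization detected at
# the start of the constant tail.
pe_track = [0.5, 1.5, 0.3, 1.2, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0]
t_star = stabilization_time(pe_track, window=5, tol=1e-6)
```

For equal lifetimes, PE attains its maximum `log(n)`, so e.g. four equal bars give `log 4`; the collapse-of-variability signature in the experiments corresponds to the windowed standard deviation in `stabilization_time` shrinking across realizations.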