The many faces of multivariate information

📅 2026-01-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of uniformly characterizing higher-order synergy and redundancy in multivariate systems. The authors propose a general functional \( \Delta^k \), parameterized by an integer \( k \) and constructed from joint, conditional, and marginal entropies; its entropic conjugate, \( \Gamma^k \), captures higher-order redundancy structures. This unified framework subsumes existing measures, such as the S-information, dual total correlation, and O-information, as special cases of \( \Delta^k \) at specific values of \( k \). The sign of \( \Delta^k \) has a clear interpretation: \( \Delta^k > 0 \), \( < 0 \), or \( = 0 \) indicates that the dominant interaction order is, respectively, higher than, lower than, or exactly \( k \). The approach thus establishes a hierarchical, unified theory of higher-order information interactions, systematically integrating disparate measures from the current literature.

📝 Abstract
Extracting higher-order structures from multivariate data has become an area of intensive study in complex systems science, as these multipartite interactions can reveal insights into fundamental features of complex systems like emergent phenomena. Information theory provides a natural language for exploring these interactions, as it elegantly formalizes the problem of comparing "wholes" and "parts" using joint, conditional, and marginal entropies. A large number of distinct statistics have been developed over the years, all aiming to capture different aspects of "higher-order" information sharing. Here, we show that three of them (the dual total correlation, S-information, and O-information) are special cases of a more general function, $\Delta^{k}$, which is parameterized by a free parameter $k$. For different values of $k$, we recover different measures: $\Delta^{0}$ is equal to the S-information, $\Delta^{1}$ is equal to the dual total correlation, and $\Delta^{2}$ is equal to the negative O-information. Generally, the $\Delta^{k}$ function is arranged into a hierarchy of increasingly high-order synergies: for a given value of $k$, if $\Delta^{k}>0$, then the system is dominated by interactions with order greater than $k$, while if $\Delta^{k}<0$, then the system is dominated by interactions with order lower than $k$; $\Delta^{k}=0$ if the system is composed entirely of synergies of order $k$. Using the entropic conjugation framework, we also find that the conjugate of $\Delta^{k}$, which we term $\Gamma^{k}$, is arranged into a similar hierarchy of increasingly high-order redundancies. These results provide new insights into the nature of both higher-order redundant and synergistic interactions, and help unify the existing zoo of measures into a more coherent structure.
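The general $\Delta^{k}$ formula is not reproduced in this excerpt, but the three special cases it subsumes are standard entropy-based quantities. The sketch below is a minimal illustration, not the paper's implementation: it computes the total correlation, dual total correlation, S-information ($\Delta^{0}$), and O-information ($-\Delta^{2}$) for small discrete joint distributions, with helper names of my own choosing; a purely synergistic system (XOR) yields a negative O-information and a purely redundant one (three copies of a bit) yields a positive O-information, matching the sign interpretation above.

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits of a (flattened) probability array.
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def marginal(joint, keep):
    # Marginalize an n-dimensional joint pmf onto the axes in `keep`.
    drop = tuple(ax for ax in range(joint.ndim) if ax not in keep)
    return joint.sum(axis=drop)

def total_correlation(joint):
    # TC = sum_i H(X_i) - H(X_1, ..., X_n)
    n = joint.ndim
    return sum(entropy(marginal(joint, (i,))) for i in range(n)) - entropy(joint)

def dual_total_correlation(joint):
    # DTC = H(X) - sum_i H(X_i | X_{-i}) = (1 - n) H(X) + sum_i H(X_{-i})
    n = joint.ndim
    H = entropy(joint)
    return (1 - n) * H + sum(
        entropy(marginal(joint, tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )

def s_information(joint):
    # S = TC + DTC; equals Delta^0 in the paper's notation.
    return total_correlation(joint) + dual_total_correlation(joint)

def o_information(joint):
    # O = TC - DTC; Delta^2 equals the *negative* of this.
    return total_correlation(joint) - dual_total_correlation(joint)

# Purely synergistic system: X3 = X1 XOR X2 with uniform inputs.
xor = np.zeros((2, 2, 2))
for a in (0, 1):
    for b in (0, 1):
        xor[a, b, a ^ b] = 0.25
print(o_information(xor))   # -1.0: negative, i.e. synergy-dominated

# Purely redundant system: three identical copies of one fair bit.
copy = np.zeros((2, 2, 2))
copy[0, 0, 0] = copy[1, 1, 1] = 0.5
print(o_information(copy))  # 1.0: positive, i.e. redundancy-dominated
```

For the XOR triple the joint entropy is 2 bits while each marginal and each pair is uniform, giving TC = 1, DTC = 2, and hence S-information 3 and O-information -1, consistent with a system dominated by a third-order synergy.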
Problem

Research questions and friction points this paper is trying to address.

multivariate information
higher-order interactions
synergy
redundancy
information theory
Innovation

Methods, ideas, or system contributions that make the work stand out.

higher-order synergy
multivariate information
information decomposition
entropy conjugation
unified framework