A Class of Subadditive Information Measures and their Applications

📅 2026-01-22
📈 Citations: 1
Influential: 0
🤖 AI Summary
This work addresses the construction of generalized information measures satisfying subadditivity by introducing a class of $(G,f)$-divergences and their associated information measures, built by composing a non-decreasing function $G$ with an $f$-divergence. By establishing a simplification principle on binary alphabets, the authors develop a $(G,f)$-information framework that reduces the verification of subadditivity (under product distributions and product channels) to the binary case. This approach yields easily verifiable sufficient conditions for subadditivity across a range of classical $f$-divergences and enables new applications, including converse bounds for channel coding at finite blocklengths, performance bounds for binary hypothesis testing, and a generalization of the Shannon–Gallager–Berlekamp sphere-packing bound.
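In the standard $f$-divergence notation (not spelled out on this page, so taken here as an assumption consistent with the summary), the construction reads:

```latex
\[
  D_{G,f}(P \| Q) \;=\; G\bigl(D_f(P \| Q)\bigr),
  \qquad
  D_f(P \| Q) \;=\; \sum_x Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right),
\]
% Subadditivity over product distributions means
\[
  D_{G,f}(P_1 \times P_2 \,\|\, Q_1 \times Q_2)
  \;\le\;
  D_{G,f}(P_1 \| Q_1) + D_{G,f}(P_2 \| Q_2).
\]
```

The reduction principle described above asserts that, for suitable $G$, it is enough to check the second inequality when $P_i, Q_i$ are distributions on a binary alphabet.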

📝 Abstract
We introduce a two-parameter family of discrepancy measures, termed \emph{$(G,f)$-divergences}, obtained by applying a non-decreasing function $G$ to an $f$-divergence $D_f$. Building on Csisz\'ar's formulation of mutual $f$-information, we define a corresponding $(G,f)$-information measure $I_{G,f}(X;Y)$. A central theme of the paper is subadditivity over product distributions and product channels. We develop reduction principles showing that, for broad classes of $G$, it suffices to verify divergence subadditivity on binary alphabets. Specializing to the functions $G(x)\in\{x,\log(1+x),-\log(1-x)\}$, we derive tractable sufficient conditions on $f$ that guarantee subadditivity, covering many standard $f$-divergences. Finally, we present applications to finite-blocklength converses for channel coding, bounds in binary hypothesis testing, and an extension of the Shannon--Gallager--Berlekamp sphere-packing exponent framework to subadditive $(G,f)$-divergences.
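The abstract's construction is easy to probe numerically. The sketch below (an illustration, not code from the paper; the function names and the choice $G(x)=\log(1+x)$ with $f(t)=t\log t$, i.e. KL divergence, are my assumptions picked from the family the abstract lists) checks the subadditivity inequality on a single pair of binary product distributions:

```python
import math

def f_divergence(p, q, f):
    # D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)) on a finite alphabet.
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

def kl_f(t):
    # f(t) = t*log(t) generates the KL divergence; f(1) = 0.
    return t * math.log(t) if t > 0 else 0.0

def product(p1, p2):
    # Product distribution on the product alphabet.
    return [a * b for a in p1 for b in p2]

def Gf_divergence(p, q, G=math.log1p, f=kl_f):
    # (G,f)-divergence: a non-decreasing G applied to D_f.
    return G(f_divergence(p, q, f))

# Two pairs of binary distributions (arbitrary example values).
P1, Q1 = [0.8, 0.2], [0.5, 0.5]
P2, Q2 = [0.6, 0.4], [0.3, 0.7]

lhs = Gf_divergence(product(P1, P2), product(Q1, Q2))
rhs = Gf_divergence(P1, Q1) + Gf_divergence(P2, Q2)
print(lhs <= rhs + 1e-12)  # → True: subadditivity holds in this instance
```

For this particular pair, subadditivity follows from additivity of KL over products and concavity of $\log(1+x)$: $\log(1+a+b)\le\log(1+a)+\log(1+b)$ for $a,b\ge 0$. A single numeric check is of course no substitute for the binary-reduction arguments the paper develops.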
Problem

Research questions and friction points this paper is trying to address.

subadditivity
information measures
f-divergences
channel coding
hypothesis testing
Innovation

Methods, ideas, or system contributions that make the work stand out.

(G,f)-divergence
subadditivity
f-divergence
mutual information
finite-blocklength converse
Hamidreza Abin
Department of Information Engineering, The Chinese University of Hong Kong, Sha Tin, NT, Hong Kong
Mahdi Zinati
Department of Information Engineering, The Chinese University of Hong Kong, Sha Tin, NT, Hong Kong
Amin Gohari
Department of Information Engineering, The Chinese University of Hong Kong, Sha Tin, NT, Hong Kong
Mohammad Hossein Yassaee
Sharif University of Technology
Information theory · Learning theory
M. M. Mojahedian
Department of Electrical Engineering, Sharif University of Technology, Tehran, Iran