$α$-Mutual Information for the Gaussian Noise Channel

📅 2026-04-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the insufficient understanding of Sibson’s α-mutual information in additive Gaussian noise channels by systematically analyzing its regularity, continuity, and convexity properties, and establishing a generalized connection to the minimum mean-square error (MMSE). Leveraging tools such as α-tilted distributions, information geometry, and differential analysis, the authors derive an α-I-MMSE relationship and a generalized de Bruijn identity, yielding an estimator for Rényi entropy. Key contributions include characterizing asymptotic behaviors at high and low signal-to-noise ratios, proving that α-mutual information converges to the order-1/α Rényi entropy for discrete inputs, and uncovering its intrinsic link to the α-information dimension, thereby extending classical results for Shannon mutual information (α=1).
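The α = 1 baseline of the I-MMSE relationship that this work generalizes can be checked numerically for a standard Gaussian input, where both mutual information and MMSE have closed forms. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def mutual_info(snr):
    # Shannon MI of Y = sqrt(snr)*X + N with X, N ~ N(0, 1): 0.5 * ln(1 + snr)
    return 0.5 * math.log1p(snr)

def mmse(snr):
    # MMSE of estimating X from Y for a Gaussian input: 1 / (1 + snr)
    return 1.0 / (1.0 + snr)

# Classical I-MMSE identity (alpha = 1): dI/dsnr = 0.5 * mmse(snr).
# Verify via a central finite difference at snr = 2.
snr, h = 2.0, 1e-6
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
assert abs(deriv - 0.5 * mmse(snr)) < 1e-8
```

The paper's contribution is that the same derivative structure survives for general α, with the MMSE evaluated under α-tilted distributions rather than the original ones.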

📝 Abstract
In this paper, we study Sibson's $α$-mutual information in the context of the additive Gaussian noise channel. While the classical case $α = 1$ is well understood and admits deep connections to estimation-theoretic quantities, such as the minimum mean-square error (MMSE) and Fisher information, many of the corresponding structural properties for general $α$ remain less explored. Our goal is to develop a systematic understanding of $α$-mutual information in the Gaussian noise setting and to identify which properties extend beyond the Shannon case. To this end, we establish several regularity properties, including finiteness conditions, continuity with respect to the signal-to-noise ratio (SNR) and the input distribution, and strict concavity/convexity properties that ensure uniqueness in associated optimization problems. A central contribution is the development of an $α$-I-MMSE relationship, generalizing the classical identity by relating the derivative of $α$-mutual information with respect to SNR to the MMSE evaluated under appropriately tilted distributions. This connection further leads to a generalized de Bruijn identity and new estimation-theoretic representations of Rényi entropy and differential Rényi entropy. We also characterize the low- and high-SNR behavior. In the low-SNR regime, the first-order behavior depends only on the input variance. In the high-SNR regime, for discrete inputs, $α$-mutual information converges to the Rényi entropy of order $1/α$, while for general inputs we connect it to $α$-information dimension. Overall, our results show that many fundamental relationships between information and estimation extend beyond the Shannon setting, in a form involving $α$-tilted distributions.
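For reference, the classical $α = 1$ identities that the abstract says are being generalized are the Guo-Shamai-Verdú I-MMSE relation and the de Bruijn identity; in standard form (with unit Gaussian noise $N \sim \mathcal{N}(0,1)$) they read:

```latex
% I-MMSE identity: the SNR-derivative of mutual information
% equals half the minimum mean-square error.
\frac{\mathrm{d}}{\mathrm{d}\gamma}\, I\bigl(X;\sqrt{\gamma}\,X + N\bigr)
  = \frac{1}{2}\,\mathrm{mmse}\bigl(X \mid \sqrt{\gamma}\,X + N\bigr)

% De Bruijn identity: differential entropy along a Gaussian
% perturbation grows at half the Fisher information J.
\frac{\mathrm{d}}{\mathrm{d}t}\, h\bigl(X + \sqrt{t}\,N\bigr)
  = \frac{1}{2}\, J\bigl(X + \sqrt{t}\,N\bigr)
```

The paper's $α$-versions keep this derivative structure but evaluate the right-hand sides under $α$-tilted distributions.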
Problem

Research questions and friction points this paper is trying to address.

α-Mutual Information
Gaussian Noise Channel
Rényi Entropy
MMSE
Information-Estimation Relations
Innovation

Methods, ideas, or system contributions that make the work stand out.

α-Mutual Information
I-MMSE Relation
Gaussian Channel
Rényi Entropy
Tilted Distributions