On the Rényi Rate-Distortion-Perception Function and Functional Representations

📅 2026-01-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the fundamental trade-off between compression rate, distortion, and perceptual quality, a regime that classical rate-distortion theory captures only partially, by extending the rate-distortion-perception (RDP) framework to the Rényi information-theoretic setting. Focusing on scalar Gaussian sources under joint distortion and perception constraints, the study uses Sibson's α-mutual information to characterize the corresponding information-theoretic limits. The core contributions are: the first Rényi-generalized Strong Functional Representation Lemma; a closed-form expression for the Rényi RDP function in the scalar Gaussian case; and a phase transition governed by the order α, in which optimal codebooks exhibit heavy-tailed structure for 0.5 < α < 1 but collapse to finite support for α > 1. These findings clarify how perception-distortion trade-offs interact with abrupt changes in coding structure under shared randomness.

📝 Abstract
We extend the Rate-Distortion-Perception (RDP) framework to the Rényi information-theoretic regime, utilizing Sibson's α-mutual information to characterize the fundamental limits under distortion and perception constraints. For scalar Gaussian sources, we derive closed-form expressions for the Rényi RDP function, showing that the perception constraint induces a feasible interval for the reproduction variance. Furthermore, we establish a Rényi-generalized version of the Strong Functional Representation Lemma. Our analysis reveals a phase transition in the complexity of optimal functional representations: for 0.5 < α < 1, the coding cost is bounded by the α-divergence of order α+1, necessitating a codebook with heavy-tailed polynomial decay; conversely, for α > 1, the representation collapses to one with finite support, offering new insights into the compression of shared randomness under generalized notions of mutual information.
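For readers unfamiliar with the quantity the abstract builds on, Sibson's α-mutual information has the following standard definition (this is textbook background, not a formula taken from the paper itself): for continuous X and Y with densities,

```latex
% Sibson's alpha-mutual information of order \alpha > 0, \alpha \neq 1:
I_\alpha(X;Y)
  = \frac{\alpha}{\alpha-1}
    \log \int_{\mathcal{Y}}
      \left( \int_{\mathcal{X}} p_X(x)\, p_{Y|X}(y\mid x)^{\alpha}\, dx \right)^{1/\alpha} dy .
% Equivalent variational form via the Renyi divergence D_\alpha:
I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\!\left( P_{XY} \,\middle\|\, P_X \times Q_Y \right).
```

As α → 1 this recovers the ordinary Shannon mutual information I(X;Y), which is why the Rényi RDP function studied here generalizes the classical RDP trade-off.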
Problem

Research questions and friction points this paper is trying to address.

Rate-Distortion-Perception
Rényi information
perception constraint
functional representation
information-theoretic limits
Innovation

Methods, ideas, or system contributions that make the work stand out.

Rényi information theory
Rate-Distortion-Perception
α-mutual information
functional representation
phase transition