A Theoretically-Grounded Codebook for Digital Semantic Communications

📅 2025-10-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the low quantization efficiency, poor robustness, and channel-noise-induced semantic distortion that affect digital semantic communication, this paper proposes an information-theoretic, learnable codebook design framework. We first establish a theoretical equivalence between semantic synonym mapping and Voronoi partitioning, then formulate an end-to-end jointly optimized objective comprising an entropy-regularized quantization loss that maximizes semantic information and a channel-aware semantic distortion loss. The method integrates mutual information maximization, Voronoi-based quantization modeling, and channel distortion characterization. Evaluated on image reconstruction, the proposed approach achieves a 24.1% PSNR gain and a 46.5% improvement in LPIPS perceptual similarity at 10 dB SNR, significantly mitigating semantic distortion. This work introduces a novel paradigm for efficient and reliable semantic-driven transmission.
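The equivalence sketched above rests on the codebook's Voronoi partition: quantization assigns each high-dimensional semantic feature to the index of its nearest codeword, so every Voronoi cell collects many features under one discrete symbol. A minimal numpy sketch of this many-to-one mapping, with hypothetical sizes (512 features of dimension 8, a codebook of K = 16 codewords) not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 512 semantic feature vectors of dimension 8,
# quantized against a learnable codebook with K = 16 codewords.
features = rng.normal(size=(512, 8))
codebook = rng.normal(size=(16, 8))

# Many-to-one quantization: each feature is mapped to the index of its
# nearest codeword, i.e. the Voronoi cell it falls into.
dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=-1)
indices = dists.argmin(axis=1)

# Every feature lands in exactly one cell; with far more features than
# codewords, many features necessarily share an index.
assert indices.shape == (512,)
```

Only the discrete indices are transmitted, which is what makes the scheme compatible with digital modulation.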

📝 Abstract
The use of a learnable codebook provides an efficient way for semantic communications to map vector-based high-dimensional semantic features onto the discrete symbol representations required in digital communication systems. In this paper, the problem of codebook-enabled quantization mapping for digital semantic communications is studied from the perspective of information theory. In particular, a novel theoretically-grounded codebook design is proposed for jointly optimizing quantization efficiency, transmission efficiency, and robustness. First, a formal equivalence is established between the one-to-many synonymous mapping defined in semantic information theory and the many-to-one quantization mapping based on the codebook's Voronoi partitions. Then, the mutual information between semantic features and their quantized indices is derived in order to maximize the semantic information carried by the discrete indices. To realize this semantic-information maximum in practice, an entropy-regularized quantization loss based on empirical estimation is introduced for end-to-end codebook training. Next, the physical channel-induced semantic distortion and the optimal codebook size for semantic communications are characterized under bit-flip errors. To mitigate the semantic distortion caused by physical channel noise, a novel channel-aware semantic distortion loss is proposed. Simulation results on image reconstruction tasks demonstrate the superior performance of the proposed theoretically-grounded codebook, which achieves a 24.1% improvement in peak signal-to-noise ratio (PSNR) and a 46.5% improvement in learned perceptual image patch similarity (LPIPS) compared to existing codebook designs when the signal-to-noise ratio (SNR) is 10 dB.
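The entropy-regularized quantization loss mentioned in the abstract can be illustrated with a simple empirical estimate: penalize the quantization error while rewarding the entropy of the index distribution, since uniform codeword usage is a standard proxy for maximizing the mutual information between features and indices. The sketch below is a generic instance of this idea, not the paper's exact objective; the trade-off weight `beta` is a hypothetical parameter:

```python
import numpy as np

def entropy_regularized_quant_loss(features, codebook, beta=0.1):
    """Illustrative loss: mean quantization error minus beta * index entropy.

    Maximizing the empirical entropy of the quantized indices encourages
    uniform codeword usage, a common empirical surrogate for maximizing
    the mutual information between features and their indices. `beta` is
    a hypothetical trade-off weight, not a value from the paper.
    """
    # Nearest-codeword (Voronoi) assignment and squared quantization error.
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=-1)
    indices = dists.argmin(axis=1)
    quant_error = (dists[np.arange(len(features)), indices] ** 2).mean()

    # Empirical index distribution and its Shannon entropy H(Z).
    counts = np.bincount(indices, minlength=len(codebook))
    p = counts / counts.sum()
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))

    return quant_error - beta * entropy
```

Because the entropy term is non-negative, any `beta > 0` can only lower the loss relative to the plain quantization error, pushing training toward codebooks whose cells are used evenly.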
Problem

Research questions and friction points this paper is trying to address.

Optimizing codebook design for digital semantic communication systems
Maximizing semantic information transmission through discrete quantization mapping
Mitigating semantic distortion caused by physical channel noise effects
Innovation

Methods, ideas, or system contributions that make the work stand out.

Theoretically-grounded codebook optimizes quantization and transmission efficiency
Mutual information maximizes semantic information in discrete indices
Channel-aware semantic distortion loss mitigates physical noise effects
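The channel-aware piece of the design models how bit flips on the physical channel corrupt transmitted indices and hence the reconstructed codewords. A minimal simulation of this effect, assuming a binary symmetric channel with flip probability `p` and the same hypothetical codebook sizes as above (neither is specified by the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

K, B = 16, 4            # codebook size K = 2**B, so B bits per index
codebook = rng.normal(size=(K, 8))
indices = rng.integers(0, K, size=1000)

# Transmit each index as B bits; each bit flips independently with
# probability p (a simple binary symmetric channel model).
p = 0.05
bits = ((indices[:, None] >> np.arange(B)) & 1).astype(np.uint8)
flips = rng.random(bits.shape) < p
rx_bits = bits ^ flips
rx_indices = (rx_bits.astype(np.int64) * (1 << np.arange(B))).sum(axis=1)

# Channel-induced semantic distortion: distance between the intended
# codeword and the one reconstructed from the corrupted index.
distortion = np.linalg.norm(codebook[indices] - codebook[rx_indices], axis=1)
print(distortion.mean())
```

A channel-aware loss of the kind described here would penalize this expected distortion during training, so that codewords whose indices differ by few bits are also close in semantic space.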