Robust Transmission of Punctured Text with Large Language Model-based Recovery

📅 2025-03-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited robustness of semantic communication, whose learned feature extraction ties performance to the training dataset and task, this paper proposes a sparse text transmission paradigm: only a small set of semantically important characters is transmitted, and the full text is reconstructed at the receiver by a large language model (LLM). The key contributions are (1) a text transmission model in which an LLM recovers the untransmitted characters at the receiver, and (2) an importance character extractor (ICE) that selects which characters to transmit so as to maximize LLM recovery performance. Simulations show that ICE-based selection outperforms random character selection, that the model performs robustly across different datasets and tasks, and that it surpasses traditional bit-based communication under low signal-to-noise ratio (SNR) conditions.
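As a rough illustration of the proposed pipeline, the sketch below punctures a string down to a few characters and renders the survivors as a masked prompt for an LLM to complete. This is a minimal sketch under stated assumptions, not the authors' implementation: `puncture`, `to_prompt`, and the score-based selection are placeholders, and the paper's ICE is a learned module rather than a fixed heuristic.

```python
# Minimal sketch of the punctured-text idea (not the paper's implementation).
# Transmit only a few characters; the receiver asks an LLM to fill the gaps.
import random

def puncture(text: str, keep_ratio: float, scores=None) -> list[tuple[int, str]]:
    """Keep only the top-scoring characters (random when no scores are given)."""
    k = max(1, int(len(text) * keep_ratio))
    if scores is None:                      # random-selection baseline
        kept = random.sample(range(len(text)), k)
    else:                                   # importance-aware (ICE-style) selection
        kept = sorted(range(len(text)), key=lambda i: scores[i], reverse=True)[:k]
    return sorted((i, text[i]) for i in kept)

def to_prompt(kept: list[tuple[int, str]], length: int) -> str:
    """Render the received characters as a masked string for LLM recovery."""
    slots = ["_"] * length
    for i, ch in kept:
        slots[i] = ch
    return "Reconstruct the original sentence: " + "".join(slots)

message = "semantic communication"
print(to_prompt(puncture(message, keep_ratio=0.3), len(message)))
```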

📝 Abstract
With recent advancements in deep learning, semantic communication, which transmits only task-oriented features, has rapidly emerged. However, since feature extraction relies on learning-based models, its performance fundamentally depends on the training dataset or task. For practical scenarios, it is essential to design a model that performs robustly regardless of the dataset or task. In this correspondence, we propose a novel text transmission model that selects and transmits only a few characters and recovers the missing characters at the receiver using a large language model (LLM). Additionally, we propose a novel importance character extractor (ICE), which selects the transmitted characters so as to enhance LLM recovery performance. Simulations demonstrate that the proposed filter selection by ICE outperforms random filter selection, in which the transmitted characters are chosen at random. Moreover, the proposed model exhibits robust performance across different datasets and tasks and outperforms traditional bit-based communication in low signal-to-noise ratio (SNR) conditions.
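The abstract does not describe ICE's internals, so the snippet below uses corpus rarity as a toy stand-in for a learned importance score, on the assumption that less frequent characters carry more information for reconstruction. This heuristic is an illustration only; the actual ICE is trained to maximize LLM recovery performance.

```python
# Toy stand-in for the importance character extractor (ICE). The real ICE is
# learned; here characters are scored by corpus rarity purely for illustration.
from collections import Counter

def rarity_scores(text: str, corpus: str) -> list[float]:
    """Score each character by how rare it is in a reference corpus."""
    freq = Counter(corpus.lower())
    total = sum(freq.values())
    return [1.0 - freq.get(ch.lower(), 0) / total for ch in text]

reference = "the quick brown fox jumps over the lazy dog " * 100
scores = rarity_scores("semantic communication", reference)
# Spaces and very common letters score low; less frequent letters score higher.
print([round(s, 3) for s in scores])
```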
Problem

Research questions and friction points this paper is trying to address.

Design a robust text transmission model that uses an LLM for recovery
Enhance recovery with an importance character extractor (ICE)
Outperform traditional bit-based communication in low-SNR conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses a large language model to recover missing characters at the receiver
Introduces an importance character extractor (ICE) to select the transmitted characters
Improves performance in low signal-to-noise ratio (SNR) conditions, illustrated by the channel sketch below
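To make the low-SNR comparison concrete, the sketch below corrupts a bit-level transmission over a binary symmetric channel. The flip probability is an assumption, not a value from the paper; the point is that bit errors scramble characters that a classical receiver cannot repair, whereas the punctured scheme sends fewer symbols and relies on the LLM to restore the rest.

```python
# Illustrative binary symmetric channel over 8-bit characters; the flip
# probability is an assumption, not a value from the paper.
import random

def bsc_corrupt(text: str, flip_prob: float) -> str:
    """Flip each bit of each character independently with probability flip_prob."""
    out = []
    for ch in text:
        bits = ord(ch)
        for b in range(8):
            if random.random() < flip_prob:
                bits ^= 1 << b
        out.append(chr(bits % 128))  # clamp back into ASCII for display
    return "".join(out)

random.seed(0)
print(bsc_corrupt("semantic communication", flip_prob=0.02))
```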