🤖 AI Summary
To address the high computational overhead and insufficient robustness of end-to-end neural communication systems under high-order modulation, this paper proposes a joint optimization framework integrating a bit-level efficient receiver with a symbol-level autoencoder (AE). The symbol-level AE breaks the conventional modular physical-layer paradigm, enabling end-to-end joint learning of channel coding, modulation, and demodulation, and the approach unifies information-theoretic modeling with dual-granularity (bit- and symbol-level) cooperative training. Experiments under 16-QAM and 64-QAM demonstrate that the proposed scheme achieves significantly lower bit error rates (BER) than baseline methods and approaches the Shannon limit. The results also show that the training signal-to-noise ratio (SNR) strongly affects cross-SNR generalization: choosing it appropriately is essential for reaching near-optimal BER at the operating SNR.
📝 Abstract
Neural network (NN)-based end-to-end (E2E) communication systems, in which each system component may consist of a portion of a neural network, have been investigated as potential tools for developing artificial intelligence (AI)-native E2E systems. In this paper, we propose an NN-based bitwise receiver that improves computational efficiency while maintaining performance comparable to baseline demappers. Building on this foundation, we introduce a novel symbol-wise autoencoder (AE)-based E2E system that jointly optimizes the transmitter and receiver at the physical layer. We evaluate the proposed NN-based receiver using bit-error rate (BER) analysis to confirm that the numerical BER achieved by NN-based receivers and transceivers is accurate. Results demonstrate that the AE-based system outperforms baseline architectures, particularly for higher-order modulation schemes. We further show that the training signal-to-noise ratio (SNR) significantly affects system performance when inference is conducted at different SNR levels.
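To make the symbol-wise AE pipeline concrete, the following is a minimal NumPy sketch, not the paper's implementation: the "encoder" is a power-normalized constellation (here a standard 16-QAM grid standing in for trained transmitter weights), the channel is AWGN at an assumed SNR, and the "decoder" computes a softmax posterior over messages, the symbol-wise analogue of an NN receiver's output layer. All parameter values (SNR, symbol count) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 16            # alphabet size (16-QAM-sized message set)
n_symbols = 20000 # number of transmitted messages (assumed)
snr_db = 15.0     # assumed operating SNR in dB (Es/N0)

# "Encoder": a complex constellation; a trained transmitter NN would learn
# these points, here we use a 16-QAM grid as a stand-in.
re, im = np.meshgrid([-3, -1, 1, 3], [-3, -1, 1, 3])
constellation = (re + 1j * im).ravel().astype(complex)
constellation /= np.sqrt(np.mean(np.abs(constellation) ** 2))  # unit avg power

msgs = rng.integers(0, M, n_symbols)  # random messages to transmit
x = constellation[msgs]               # transmitter output symbols

# AWGN channel at the assumed SNR
noise_var = 10 ** (-snr_db / 10)
noise = np.sqrt(noise_var / 2) * (
    rng.standard_normal(n_symbols) + 1j * rng.standard_normal(n_symbols)
)
y = x + noise

# "Decoder": softmax over negative scaled distances gives a posterior
# over messages, mirroring the receiver NN's final softmax layer.
d2 = np.abs(y[:, None] - constellation[None, :]) ** 2
logits = -d2 / noise_var
post = np.exp(logits - logits.max(axis=1, keepdims=True))
post /= post.sum(axis=1, keepdims=True)

msg_hat = post.argmax(axis=1)          # hard decision
ser = np.mean(msg_hat != msgs)         # symbol error rate
print(f"SER at {snr_db} dB: {ser:.4f}")
```

In the paper's E2E system both the constellation and the decoder mapping are learned jointly by backpropagating a cross-entropy loss through the channel; this sketch only fixes them to show the data path being optimized.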