🤖 AI Summary
To address the high query overhead and the difficulty in balancing efficiency and performance in GRAND-type decoding algorithms, this paper proposes a constrained-generation method based on random invertible linear transformations and balanced-tree modeling. The core contribution is the first systematic reformulation of the parity-check matrix into a balanced-tree structure, integrated with random invertible transformations; this enables efficient derivation of up to log₂(n) linearly independent noise constraints without altering the original code structure, thereby achieving an exponential increase in the number of usable constraints. Theoretical analysis rigorously proves the validity and linear independence of the generated constraints. Monte Carlo simulations demonstrate that the proposed method significantly reduces decoding query counts while preserving near-maximum-likelihood performance, and achieves lower average computational complexity than Segmented GRAND.
📝 Abstract
Guessing Random Additive Noise Decoding (GRAND) and its variants, known for their near-maximum-likelihood performance, have been introduced in recent years. One such variant, Segmented GRAND, reduces decoding complexity by generating only noise patterns that meet specific constraints imposed by the linear code. In this paper, we introduce a new method to efficiently derive multiple constraints from the parity-check matrix. By applying a random invertible linear transformation and reorganizing the matrix into a tree structure, we extract up to log₂(n) constraints for a code of length n, reducing the number of decoding queries while preserving the structure of the original code. We validate the method through theoretical analysis and experimental simulations.
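The key property underlying the transformation step is that multiplying the parity-check matrix H by any invertible binary matrix T preserves its null space, so the rows of H′ = TH are equally valid noise constraints. The sketch below illustrates only this property; the (7,4) Hamming code, the rejection-sampling helper, and all names are illustrative assumptions, and the balanced-tree reorganization from the paper is not reproduced here.

```python
import numpy as np

def gf2_rank(A):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    A = A.copy() % 2
    rank = 0
    rows, cols = A.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if A[r, c]), None)
        if pivot is None:
            continue
        A[[rank, pivot]] = A[[pivot, rank]]   # bring pivot row into place
        for r in range(rows):
            if r != rank and A[r, c]:
                A[r] ^= A[rank]               # eliminate column c elsewhere
        rank += 1
    return rank

def random_invertible_gf2(m, rng):
    """Rejection-sample an m x m binary matrix invertible over GF(2)."""
    while True:
        T = rng.integers(0, 2, size=(m, m))
        if gf2_rank(T) == m:
            return T

# Parity-check matrix of the (7,4) Hamming code (illustrative example).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

rng = np.random.default_rng(0)
T = random_invertible_gf2(H.shape[0], rng)
H_prime = (T @ H) % 2   # transformed parity-check matrix, same null space

# Because T is invertible, H'x = 0 iff Hx = 0: each row of H' is a valid
# check on the noise pattern, and the code itself is unchanged.
c = np.array([1, 1, 1, 0, 0, 0, 0])   # a codeword of the Hamming code
e = np.array([1, 0, 0, 0, 0, 0, 0])   # a single-bit noise pattern
assert not ((H_prime @ c) % 2).any()      # codeword passes all transformed checks
assert ((H_prime @ (c ^ e)) % 2).any()    # corrupted word fails at least one
```

Since the null space is unchanged, any set of linearly independent rows of H′ can be used to pre-filter candidate noise patterns before a full codebook query, which is the mechanism Segmented GRAND exploits.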