Sparse Superposition Codes with Binomial Dictionary are Capacity-Achieving with Maximum Likelihood Decoding

📅 2025-04-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates whether Sparse Superposition Codes (SPARCs) whose dictionary entries are drawn from a binomial distribution can achieve the Shannon capacity of the additive white Gaussian noise (AWGN) channel under maximum-likelihood (ML) decoding. Prior theoretical analyses were largely confined to Gaussian or Bernoulli dictionaries; to address this limitation, the paper introduces a binomial dictionary model and analyzes it using asymptotic information-theoretic arguments, typical-set techniques, and probabilistic dictionary construction. It rigorously establishes that, under appropriate sparsity constraints, SPARCs with such dictionaries remain capacity-achieving under ML decoding. The key contribution is the first extension of capacity-achievability results to binomial dictionaries, which unifies and broadens the theoretical framework for non-Gaussian SPARC dictionaries, fills a gap in the capacity analysis of structured non-Gaussian dictionaries, and improves the practical applicability of SPARCs to real-world dictionary design.
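For context, the standard SPARC setup over the AWGN channel can be summarized as follows. This uses conventional Joseph and Barron style notation; the exact normalization in this paper may differ.

```latex
% Standard SPARC setup over the AWGN channel; conventional notation,
% not necessarily this paper's exact normalization.
The received word is
\[
  y = A\beta + z, \qquad z \sim \mathcal{N}(0, \sigma^2 I_n),
\]
where $A \in \mathbb{R}^{n \times LM}$ is the dictionary (here with
i.i.d.\ centered binomial entries) and $\beta$ has $L$ sections of
length $M$, each containing a single nonzero entry equal to
$\sqrt{P/L}$, so the average per-symbol codeword power is $P$ when the
dictionary entries have zero mean and unit variance. The rate and the
AWGN capacity are
\[
  R = \frac{L \log_2 M}{n}, \qquad
  C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{\sigma^2}\right),
\]
and capacity-achieving means the ML error probability vanishes as
$n \to \infty$ for any fixed rate $R < C$.
```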

📝 Abstract
It is known that sparse superposition codes asymptotically achieve the channel capacity over the additive white Gaussian noise channel with both maximum likelihood decoding and efficient decoding (Joseph and Barron in 2012, 2014). Takeishi et al. (in 2014, 2019) demonstrated that these codes can also asymptotically achieve the channel capacity with maximum likelihood decoding when the dictionary is drawn from a Bernoulli distribution. In this paper, we extend these results by showing that the dictionary distribution can be naturally generalized to the binomial distribution.
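The scheme described in the abstract can be sketched in a few lines of NumPy. This is a toy illustration, not the paper's construction: the block length, section sizes, binomial parameter, and power normalization below are illustrative choices, and the exhaustive ML search is only feasible at this tiny scale.

```python
import itertools
import numpy as np

# Minimal SPARC sketch with a binomial dictionary (illustrative toy
# parameters and normalizations chosen here, not taken from the paper).
rng = np.random.default_rng(0)

L, M = 3, 4            # L sections, M columns per section
n = 64                 # block length; rate R = L*log2(M)/n
P, sigma2 = 1.0, 0.25  # power constraint and AWGN noise variance

# Dictionary entries: Binomial(m, 1/2), centered and scaled to zero mean
# and unit variance (the variance of Binomial(m, 1/2) is m/4).
m = 4
A = (rng.binomial(m, 0.5, size=(n, L * M)) - m / 2) / np.sqrt(m / 4)

def encode(msg):
    """msg: length-L tuple of column indices, one nonzero per section."""
    beta = np.zeros(L * M)
    for sec, j in enumerate(msg):
        beta[sec * M + j] = np.sqrt(P / L)  # equal power split per section
    return A @ beta

def ml_decode(y):
    """Exhaustive ML decoding: nearest codeword in Euclidean distance."""
    cands = itertools.product(range(M), repeat=L)
    return min(cands, key=lambda c: np.sum((y - encode(c)) ** 2))

msg = (2, 0, 3)
y = encode(msg) + rng.normal(0.0, np.sqrt(sigma2), size=n)
decoded = ml_decode(y)
print(decoded)
```

Centering and scaling the binomial entries to zero mean and unit variance matches the first two moments of the Gaussian dictionary case, which is what makes non-Gaussian dictionaries plausible candidates for the same capacity result.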
Problem

Research questions and friction points this paper is trying to address.

Extend sparse codes' capacity-achieving proof to binomial dictionaries
Generalize dictionary distribution beyond Bernoulli for ML decoding
Achieve channel capacity with binomial sparse superposition codes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sparse superposition codes with binomial dictionary
Maximum likelihood decoding achieves capacity
Generalizes dictionary to binomial distribution