🤖 AI Summary
This work addresses a practical challenge in compressed sensing: arbitrary (even ill-conditioned) measurement matrices often fail to satisfy the theoretical conditions required for sparse reconstruction. We propose augmenting such matrices with lightweight additive random perturbations to endow them with the Robust Null Space Property (RNSP). This is the first incorporation of smoothed analysis into compressed sensing, and it unifies sub-Gaussian, sub-exponential, and extremely heavy-tailed perturbations, requiring only that roughly the first log n moments of each entry be bounded. Leveraging Mendelson's small-ball method and an RNSP-based recovery characterization, we rigorously establish that the perturbed matrix asymptotically almost surely admits unique and stable recovery via ℓ₁-minimization at the optimal sampling rate m = O(s log(n/s)). This provides the first theoretical guarantee that is both universally applicable and order-optimal for non-ideal, real-world measurement systems, such as those affected by hardware distortions or quantization noise.
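To make the pipeline above concrete, here is a minimal sketch (not the paper's code) of the kind of experiment it describes: a deliberately ill-conditioned measurement matrix is perturbed by an additive random matrix, and a sparse vector is recovered by ℓ₁-minimization (basis pursuit), solved as a linear program with NumPy and SciPy. The problem sizes, the sampling constant in m = O(s log(n/s)), the rank-one choice of M, and the Gaussian perturbation R are all illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch (illustrative assumptions throughout, not the paper's code):
# recover an s-sparse x from y = (M + R) x via basis pursuit, where M is
# ill-conditioned and R is a lightweight additive random perturbation.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, s = 200, 5                                # ambient dimension and sparsity
m = int(np.ceil(4 * s * np.log(n / s)))      # m = O(s log(n/s)); constant 4 assumed

# Ill-conditioned "hardware" matrix: rank one, so on its own it cannot
# satisfy a null space property of any useful order.
M = np.outer(rng.standard_normal(m), rng.standard_normal(n))

# Additive random perturbation (sub-Gaussian example; the paper also covers
# sub-exponential and heavy-tailed entries with ~log n bounded moments).
R = rng.standard_normal((m, n))
A = M + R

# Ground-truth s-sparse signal and its noiseless measurements.
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
y = A @ x_true

# Basis pursuit  min ||x||_1  s.t.  A x = y,  as an LP in variables [x; t]:
#   minimise sum(t)  subject to  -t <= x <= t  and  A x = y.
c = np.concatenate([np.zeros(n), np.ones(n)])
A_ub = np.block([[np.eye(n), -np.eye(n)],     #  x - t <= 0
                 [-np.eye(n), -np.eye(n)]])   # -x - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
bounds = [(None, None)] * n + [(0, None)] * n
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=bounds, method="highs")
x_hat = res.x[:n]

print(f"m = {m}, relative recovery error = "
      f"{np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.2e}")
```

With these assumed parameters the linear program typically returns the planted sparse vector up to solver tolerance; swapping the Gaussian entries of R for heavier-tailed draws (e.g., Student-t) is the regime the paper's heavy-tailed results are intended to cover.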
📝 Abstract
Arbitrary matrices $M \in \mathbb{R}^{m \times n}$, randomly perturbed in an additive manner by a random matrix $R \in \mathbb{R}^{m \times n}$, are shown to asymptotically almost surely satisfy the so-called *robust null space property*, whilst asymptotically meeting the optimal number of measurements required for *unique reconstruction* via $\ell_1$-minimisation algorithms. A wide range of random perturbation matrices is considered: $R$ is allowed to be sub-Gaussian, sub-exponential, or even extremely heavy-tailed, where only the first $\log n$ moments of each entry of $R$ are bounded. A key tool driving our proofs is *Mendelson's small-ball method* (*Learning without concentration*, J. ACM, Vol. 62, 2015).
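For context, the block below states a standard textbook formulation of the robust null space property referenced above (e.g., as in Foucart and Rauhut, *A Mathematical Introduction to Compressive Sensing*); the exact variant, norm on $Av$, and constants used in the paper may differ.

```latex
% Standard form of the robust null space property (RNSP) of order s.
% The paper's precise variant and constants may differ from this sketch.
A matrix $A \in \mathbb{R}^{m \times n}$ satisfies the RNSP of order $s$
with constants $\rho \in (0,1)$ and $\tau > 0$ if
\[
  \|v_S\|_1 \;\le\; \rho \,\|v_{S^c}\|_1 \;+\; \tau \,\|A v\|_2
  \qquad \text{for all } v \in \mathbb{R}^n
  \text{ and all } S \subseteq [n] \text{ with } |S| \le s .
\]
% Consequence: every s-sparse x is the unique minimiser of ||z||_1 subject to
% Az = Ax, and the reconstruction degrades gracefully under measurement noise.
```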