🤖 AI Summary
This paper studies the Sparse Integer Least Squares (SILS) problem, an NP-hard variant of least squares in which the solution is constrained to sparse {0, +1, −1}-vectors. To tackle it, the authors propose the first ℓ₁-regularized semidefinite programming (SDP) relaxation of SILS, together with a complementary randomized rounding algorithm that also handles a broader class of possibly non-convex binary quadratic programs with a cardinality constraint. Theoretically, they derive sufficient conditions under which the SDP relaxation exactly recovers the optimal solution when the sparsity parameter is fixed, leveraging sub-Gaussian analysis and second-moment conditions on the covariance matrix to establish exact recovery even under weakly correlated features. Empirically, the method's effectiveness and practicality are validated across diverse applications, including privacy-preserving identification, multi-user detection, feature extraction, and integer sparse signal recovery. The approach thus pairs rigorous theoretical guarantees (provable exact recovery under realistic statistical assumptions) with computational tractability and real-world deployability.
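The paper's exact rounding procedure is not spelled out above, but Goemans–Williamson-style randomized rounding from an SDP solution has a standard shape: sample Gaussian vectors with covariance equal to the SDP solution matrix, then project each sample onto the feasible set. The sketch below is a hypothetical illustration of that shape adapted to a cardinality constraint; it is not the authors' algorithm. The SDP solution `X` is faked as the rank-one matrix of a planted solution `x_star` (real usage would take `X` from an SDP solver), and the matrix `A`, vector `b`, and sparsity `k` are made-up toy data.

```python
# Hypothetical sketch of randomized rounding from an SDP solution matrix X,
# here faked as the rank-one matrix X = x* x*^T of a planted solution x*.
# All data is illustrative; this is not the paper's specific algorithm.
import math
import random

A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [2, 0, 1, 3]]
b = [1, -1, 1]              # chosen so that A @ (1, 0, -1, 0) == b
k = 2                       # sparsity: required number of nonzero entries
x_star = [1, 0, -1, 0]
n = len(x_star)
X = [[x_star[i] * x_star[j] for j in range(n)] for i in range(n)]

def residual(x):
    """Squared least-squares residual ||Ax - b||^2."""
    return sum((sum(row[j] * x[j] for j in range(n)) - bi) ** 2
               for row, bi in zip(A, b))

def cholesky(M, jitter=1e-9):
    """Lower-triangular factor L with L L^T ~= M; jitter handles rank deficiency."""
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][m] * L[j][m] for m in range(j))
            if i == j:
                L[i][i] = math.sqrt(max(M[i][i] - s, 0.0) + jitter)
            else:
                L[i][j] = (M[i][j] - s) / L[j][j]
    return L

def round_once(L, rng):
    """Sample v ~ N(0, X), keep the k largest |v_i|, and record their signs."""
    g = [rng.gauss(0.0, 1.0) for _ in range(n)]
    v = [sum(L[i][m] * g[m] for m in range(i + 1)) for i in range(n)]
    support = sorted(range(n), key=lambda i: -abs(v[i]))[:k]
    x = [0] * n
    for i in support:
        x[i] = 1 if v[i] >= 0 else -1
    # A homogenized relaxation determines x only up to sign, so try both.
    return min(x, [-xi for xi in x], key=residual)

rng = random.Random(0)
L = cholesky(X)
best = min((round_once(L, rng) for _ in range(20)), key=residual)
print(best, residual(best))
```

Because the faked `X` is exactly rank one, every sample is (up to tiny jitter noise) a scalar multiple of `x_star`, so the rounding recovers the planted solution; with a genuine SDP solution the samples would concentrate around it only when the relaxation is tight.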
📝 Abstract
In this paper, we study the polynomial solvability and approximability of the sparse integer least squares problem (SILS), an NP-hard variant of the least squares problem in which we consider only sparse {0, +1, −1}-vectors. We propose an ℓ₁-based SDP relaxation of SILS and introduce a randomized algorithm for SILS based on this relaxation. In fact, the proposed randomized algorithm works for a broader class of binary quadratic programs with a cardinality constraint, in which the objective function may be non-convex. Moreover, when the sparsity parameter is fixed, we provide sufficient conditions under which our SDP relaxation solves SILS exactly. The class of data inputs for which the SDP is guaranteed to solve SILS is broad enough to cover many real-world applications, such as privacy-preserving identification and multi-user detection. To show this, we specialize our sufficient conditions to two special cases of SILS with relevant applications: the feature extraction problem and the integer sparse recovery problem. We show that our SDP relaxation solves the feature extraction problem with sub-Gaussian data under weak conditions on the second moment of the covariance matrix. We also show that our SDP relaxation solves the integer sparse recovery problem under conditions that can be satisfied in both high- and low-coherence settings.
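To make the problem statement concrete: SILS asks for the {0, +1, −1}-vector with a prescribed number of nonzero entries that minimizes the least-squares residual. The snippet below illustrates the problem itself by brute-force enumeration on a toy instance (not the paper's SDP method, which exists precisely because this enumeration is exponential in the dimension); the matrix `A`, vector `b`, and sparsity `k` are made-up data.

```python
# Minimal illustration of the SILS problem: minimize ||Ax - b||^2 over
# x in {0, +1, -1}^n with exactly k nonzero entries, by brute force.
# Toy data only -- enumeration over 3^n candidates is exponential.
from itertools import product

A = [[1, 2, 0, 1],
     [0, 1, 1, 0],
     [2, 0, 1, 3]]
b = [1, -1, 1]   # chosen so that x* = (1, 0, -1, 0) satisfies Ax* = b
k = 2            # sparsity parameter: number of nonzero entries

def residual(x):
    """Squared least-squares residual ||Ax - b||^2."""
    return sum((sum(a_ij * x_j for a_ij, x_j in zip(row, x)) - b_i) ** 2
               for row, b_i in zip(A, b))

# Enumerate all {0, +1, -1}-vectors with exactly k nonzeros.
feasible = (x for x in product((-1, 0, 1), repeat=len(A[0]))
            if sum(v != 0 for v in x) == k)
best = min(feasible, key=residual)
print(best, residual(best))  # the planted x* attains residual 0
```

On this instance the planted vector `(1, 0, -1, 0)` is the unique feasible point with zero residual; the paper's contribution is recovering such solutions in polynomial time via the SDP relaxation instead of enumeration.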