Optimal Subspace Embeddings: Resolving Nelson-Nguyen Conjecture Up to Sub-Polylogarithmic Factors

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work resolves the central conjecture of Nelson and Nguyen (FOCS 2013) on the optimal dimension and sparsity of oblivious subspace embeddings (OSEs): for any $n \ge d$ and $\varepsilon \ge d^{-O(1)}$, does there exist a random matrix $\Pi$ such that, for any $A \in \mathbb{R}^{n \times d}$, with high probability, $(1-\varepsilon)\|Ax\| \le \|\Pi Ax\| \le (1+\varepsilon)\|Ax\|$ holds simultaneously for all $x \in \mathbb{R}^d$? We introduce an iterative decoupling technique for matrix concentration, overcoming the universality bottleneck of conventional higher-order moment analysis and enabling fine-grained control over trace moment bounds. Our approach yields the first OSE with embedding dimension $\tilde{O}(d/\varepsilon^2)$, achieving both sub-polylogarithmic overhead and a structure compatible with fast matrix multiplication. This improves upon prior constructions and significantly accelerates downstream applications such as large-scale linear regression.

📝 Abstract
We give a proof of the conjecture of Nelson and Nguyen [FOCS 2013] on the optimal dimension and sparsity of oblivious subspace embeddings, up to sub-polylogarithmic factors: For any $n \geq d$ and $\varepsilon \geq d^{-O(1)}$, there is a random $\tilde O(d/\varepsilon^2) \times n$ matrix $\Pi$ with $\tilde O(\log(d)/\varepsilon)$ non-zeros per column such that for any $A \in \mathbb{R}^{n \times d}$, with high probability, $(1-\varepsilon)\|Ax\| \leq \|\Pi Ax\| \leq (1+\varepsilon)\|Ax\|$ for all $x \in \mathbb{R}^d$, where $\tilde O(\cdot)$ hides only sub-polylogarithmic factors in $d$. Our result in particular implies a new fastest sub-current-matrix-multiplication-time reduction of size $\tilde O(d/\varepsilon^2)$ for a broad class of $n \times d$ linear regression tasks. A key novelty in our analysis is a matrix concentration technique we call iterative decoupling, which we use to fine-tune the higher-order trace moment bounds attainable via existing random matrix universality tools [Brailovskaya and van Handel, GAFA 2024].
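The guarantee in the abstract can be illustrated numerically. The sketch below is not the paper's construction; it is a minimal OSNAP-style sparse sketch (a small number of random $\pm 1/\sqrt{s}$ entries per column), with illustrative constants for the embedding dimension $m \sim d/\varepsilon^2$ and per-column sparsity $s$. The distortion of the embedding on the column span of $A$ is the largest deviation of the singular values of $\Pi Q$ from 1, where $Q$ is an orthonormal basis of that span.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_ose(m, n, s, rng):
    """Sparse sketching matrix with s nonzero +-1/sqrt(s) entries per column
    (a generic OSNAP-style construction; not the paper's exact sketch)."""
    Pi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)  # s random rows per column
        Pi[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return Pi

n, d = 2000, 10
eps = 0.5
m = int(16 * d / eps**2)  # embedding dimension ~ d/eps^2 (constant is illustrative)
s = 8                     # non-zeros per column (illustrative)

A = rng.standard_normal((n, d))
Pi = sparse_ose(m, n, s, rng)

# Distortion over the column span of A: how far the singular values
# of Pi @ Q deviate from 1, where Q is an orthonormal basis of span(A).
Q, _ = np.linalg.qr(A)
sigma = np.linalg.svd(Pi @ Q, compute_uv=False)
distortion = np.max(np.abs(sigma - 1.0))
print(distortion)  # small: (1 +- distortion) norm preservation for all x
```

Since $\|\Pi A x\| / \|Ax\|$ lies in $[\sigma_{\min}(\Pi Q), \sigma_{\max}(\Pi Q)]$, a small `distortion` certifies the $(1\pm\varepsilon)$ guarantee for every $x$ at once.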
Problem

Research questions and friction points this paper is trying to address.

Proving optimal dimension and sparsity bounds for oblivious subspace embeddings
Resolving the Nelson-Nguyen conjecture up to sub-polylogarithmic factors
Enabling faster reductions (below current matrix multiplication time) for linear regression tasks
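The regression application works by sketch-and-solve: replace the $n \times d$ least-squares problem with the much smaller sketched problem $\min_x \|\Pi A x - \Pi b\|$. The sketch below uses a dense Gaussian sketch as a simple stand-in (the paper's point is that a sparse $\Pi$ of dimension $\tilde O(d/\varepsilon^2)$ achieves the same guarantee while being far cheaper to apply); the sizes and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Dense Gaussian sketch as a stand-in for a sparse OSE (illustrative).
m = 400
Pi = rng.standard_normal((m, n)) / np.sqrt(m)

# Full least squares vs. the sketched m x d problem.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sk, *_ = np.linalg.lstsq(Pi @ A, Pi @ b, rcond=None)

res_full = np.linalg.norm(A @ x_full - b)
res_sk = np.linalg.norm(A @ x_sk - b)
print(res_sk / res_full)  # near 1: the sketched solution is near-optimal
```

If $\Pi$ is a subspace embedding for the span of $[A\ b]$ with distortion $\varepsilon$, the sketched residual is within a $(1+\varepsilon)/(1-\varepsilon)$ factor of optimal, which is what the ratio above reflects.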
Innovation

Methods, ideas, or system contributions that make the work stand out.

Subspace embeddings with optimal dimension and sparsity
Iterative decoupling technique for matrix concentration
Resolution of the conjecture up to sub-polylogarithmic factors
Shabarish Chenakkod
University of Michigan, Ann Arbor, MI, USA
Michał Dereziński
Assistant Professor at University of Michigan
machine learning, optimization, statistics, randomized algorithms
Xiaoyu Dong
National University of Singapore, Singapore