🤖 AI Summary
This work investigates the average-case complexity of the $k$-Orthogonal Vectors ($k$-OV) problem. The central challenge addressed is how to embed a unique solution into random instances while preserving computational hardness. We introduce the first "planting" mechanism that rigorously preserves all $(k-1)$-wise marginal distributions: given i.i.d. $p$-biased random vectors, we embed a unique orthogonal $k$-tuple such that the joint distribution of any $k-1$ vectors remains identical to that of the original unstructured model. This mechanism enables, for the first time, a fine-grained search-to-decision reduction for $k$-OV and establishes an average-case hardness framework for the problem. The authors conjecture that the planted instances remain hard: solving them requires $n^{k-o(1)}$ time on average, matching the worst-case $k$-OV conjecture and providing a rigorous average-case foundation for fine-grained complexity.
📝 Abstract
In the $k$-Orthogonal Vectors ($k$-OV) problem we are given $k$ sets, each containing $n$ binary vectors of dimension $d=n^{o(1)}$, and our goal is to pick one vector from each set so that at each coordinate at least one vector has a zero. It is a central problem in fine-grained complexity, conjectured to require $n^{k-o(1)}$ time in the worst case. We propose a way to *plant* a solution among vectors with i.i.d. $p$-biased entries, for appropriately chosen $p$, so that the planted solution is the unique one. Our conjecture is that the resulting $k$-OV instances still require time $n^{k-o(1)}$ to solve, *on average*. Our planted distribution has the property that any subset of strictly fewer than $k$ vectors has the *same* marginal distribution as in the model distribution, consisting of i.i.d. $p$-biased random vectors. We use this property to give average-case search-to-decision reductions for $k$-OV.
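As a concrete baseline (not the paper's planting mechanism or any fast algorithm), the $k$-OV predicate and the trivial exhaustive $n^k$ search over an i.i.d. $p$-biased instance can be sketched in Python; the function names and parameters here are purely illustrative:

```python
from itertools import product
import random

def is_orthogonal(vectors):
    """A k-tuple is 'orthogonal' if every coordinate has a zero in some vector."""
    d = len(vectors[0])
    return all(any(v[i] == 0 for v in vectors) for i in range(d))

def sample_instance(k, n, d, p, seed=0):
    """k sets of n vectors with i.i.d. p-biased {0,1} entries (Pr[entry = 1] = p)."""
    rng = random.Random(seed)
    return [[[1 if rng.random() < p else 0 for _ in range(d)]
             for _ in range(n)] for _ in range(k)]

def brute_force_kov(sets):
    """Exhaustive n^k search: return the first orthogonal k-tuple, or None."""
    for combo in product(*sets):
        if is_orthogonal(combo):
            return combo
    return None
```

This brute force takes $\Theta(n^k d)$ time; the conjecture above asserts that, on the planted distribution, no algorithm improves on it by a polynomial factor.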