🤖 AI Summary
The Online Boolean Matrix-Vector Multiplication (OMv) Hypothesis—widely used to establish conditional lower bounds for dynamic problems—is not as well studied or understood as the more popular hypotheses in fine-grained complexity, so it would be desirable to base the hardness of dynamic problems on a more believable assumption. Method: This paper introduces and systematically studies five non-Boolean OMv variants—Equality, Dominance, Min-Witness, Min-Max, and Bounded Monotone Min-Plus Product—whose offline counterparts are known to be harder than Boolean matrix multiplication. A priori, the corresponding online hypotheses are therefore weaker assumptions than the standard (Boolean) OMv Hypothesis. Through fine-grained reductions, the authors show that, somewhat surprisingly, each of these variants is in fact equivalent to the standard OMv Hypothesis. Contribution/Results: This is the first work to establish fine-grained equivalence between multiple non-Boolean online query hypotheses and OMv. It strengthens the theoretical foundation of lower bounds for dynamic problems and constructs the first fine-grained equivalence class for dynamic problems—providing a richer, more robust set of assumptions for future conditional lower bound research.
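To make the online model concrete, here is a minimal sketch (not from the paper) of the OMv setting: an n×n Boolean matrix M is fixed in advance, query vectors arrive one at a time, and each product M·v (over the Boolean semiring, i.e. OR of ANDs) must be output before the next vector is revealed. The naive algorithm below spends O(n²) per query, O(n³) in total; the OMv Hypothesis asserts that no algorithm achieves total time O(n^{3−ε}) for any ε > 0.

```python
def online_boolean_mv(M, vector_stream):
    """Naive online Boolean matrix-vector multiplication.

    M is a fixed n x n Boolean matrix (lists of 0/1 entries).
    Vectors arrive online; each answer M.v is yielded before the
    next vector in the stream is consumed.
    """
    n = len(M)
    for v in vector_stream:
        # Boolean semiring product: (M.v)[i] = OR over k of (M[i][k] AND v[k])
        yield [int(any(M[i][k] and v[k] for k in range(n))) for i in range(n)]
```

This sketch only fixes notation for the hypothesis; the point of the OMv Hypothesis is that even amortized preprocessing of M is conjectured not to beat this cubic total time polynomially.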
📝 Abstract
Most of the known tight lower bounds for dynamic problems are based on the Online Boolean Matrix-Vector Multiplication (OMv) Hypothesis, which is not as well studied and understood as some more popular hypotheses in fine-grained complexity. It would be desirable to base hardness of dynamic problems on a more believable hypothesis. We propose analogues of the OMv Hypothesis for variants of matrix multiplication that are known to be harder than Boolean product in the offline setting, namely: equality, dominance, min-witness, min-max, and bounded monotone min-plus products. These hypotheses are a priori weaker assumptions than the standard (Boolean) OMv Hypothesis. Somewhat surprisingly, we show that they are actually equivalent to it. This establishes the first such fine-grained equivalence class for dynamic problems.
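For reference, the textbook definitions of the non-Boolean products named above can be sketched as follows (naive O(n³) implementations for two n×n integer matrices A and B; these are standard definitions, not the paper's constructions):

```python
def equality_product(A, B):
    # C[i][j] = number of indices k with A[i][k] == B[k][j]
    n = len(A)
    return [[sum(A[i][k] == B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dominance_product(A, B):
    # C[i][j] = number of indices k with A[i][k] <= B[k][j]
    n = len(A)
    return [[sum(A[i][k] <= B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def min_witness_product(A, B):
    # For Boolean A, B: C[i][j] = smallest k with A[i][k] = B[k][j] = 1,
    # or -1 if no such witness exists.
    n = len(A)
    return [[next((k for k in range(n) if A[i][k] and B[k][j]), -1)
             for j in range(n)] for i in range(n)]

def min_max_product(A, B):
    # C[i][j] = min over k of max(A[i][k], B[k][j])
    n = len(A)
    return [[min(max(A[i][k], B[k][j]) for k in range(n)) for j in range(n)]
            for i in range(n)]

def min_plus_product(A, B):
    # C[i][j] = min over k of A[i][k] + B[k][j] (the distance product);
    # the paper's variant restricts to bounded monotone inputs.
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
```

Offline, each of these products is known to be at least as hard as Boolean matrix multiplication; the paper studies their online matrix-vector analogues.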