🤖 AI Summary
The $k$-submodular cover (kSC) problem suffers from weak approximation ratios and high query complexity in AI applications such as influence maximization and resource allocation.
Method: We propose the first randomized greedy algorithm for kSC that simultaneously achieves strong bi-criteria approximation guarantees and low query complexity. Our approach innovatively integrates fast random sampling into the kSC framework, augmented by adaptive thresholding and probabilistic estimation to drastically reduce redundant function evaluations.
Contribution/Results: We theoretically prove that the algorithm attains a $(1-varepsilon)$-approximate solution using only $O(n log k / varepsilon)$ function queries—surpassing the $Omega(nk)$ query lower bound of existing deterministic methods. Experiments on multiple real-world datasets demonstrate that our method reduces query counts by one to two orders of magnitude without sacrificing solution quality, significantly enhancing scalability and practicality for large-scale instances.
📝 Abstract
We study the $k$-Submodular Cover ($kSC$) problem, a natural generalization of the classical Submodular Cover problem that arises in artificial intelligence and combinatorial optimization tasks such as influence maximization, resource allocation, and sensor placement. Existing algorithms for $kSC$ often provide weak approximation guarantees or incur prohibitively high query complexity. To overcome these limitations, we propose a extit{Fast Stochastic Greedy} algorithm that achieves strong bicriteria approximation while substantially lowering query complexity compared to state-of-the-art methods. Our approach dramatically reduces the number of function evaluations, making it highly scalable and practical for large-scale real-world AI applications where efficiency is essential.