🤖 AI Summary
This paper studies the maximization of non-monotone DR-submodular functions under a cardinality constraint $k$. For ground sets of size $n$, we propose two efficient approximation algorithms, FastDrSub and FastDrSub++, that achieve, for the first time, constant-factor approximations with $O(n \log k)$ function-evaluation complexity; in particular, FastDrSub++ attains a $(1/4 - \epsilon)$-approximation guarantee. Our methods integrate a greedy framework with randomized sampling to substantially reduce query overhead while preserving theoretical performance bounds. Experiments on canonical tasks, including influence maximization, demonstrate that our algorithms outperform state-of-the-art approaches in both solution quality and computational efficiency.
📝 Abstract
This work studies non-monotone DR-submodular maximization over a ground set of size $n$ subject to a cardinality constraint $k$. We propose two approximation algorithms for this problem, named FastDrSub and FastDrSub++. FastDrSub offers an approximation ratio of $0.044$ with a query complexity of $O(n \log k)$. The second, FastDrSub++, improves this to a ratio of $1/4-\epsilon$ within the same $O(n \log k)$ query complexity, for an input parameter $\epsilon>0$. Our proposed algorithms are therefore the first constant-ratio approximation algorithms for the problem with the low complexity of $O(n \log k)$. Additionally, both algorithms are experimentally evaluated and compared against existing state-of-the-art methods on the Revenue Maximization problem with a DR-submodular objective function. The experimental results show that our proposed algorithms significantly outperform existing approaches in terms of both query complexity and solution quality.
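To make the "greedy framework with randomized sampling" idea concrete, here is a minimal sketch of a generic sample-based greedy for cardinality-constrained submodular maximization. This is an illustrative assumption, not the paper's FastDrSub/FastDrSub++ pseudocode: each round evaluates the objective only on a random candidate sample rather than all $n$ elements, which is how such methods trade a small $\epsilon$ loss in the guarantee for far fewer queries.

```python
import math
import random

def sampled_greedy(f, ground_set, k, eps=0.1, seed=0):
    """Hypothetical sample-based greedy sketch (NOT the paper's algorithm).

    f          -- set function, called as f(S) on a Python set
    ground_set -- list of candidate elements
    k          -- cardinality constraint
    eps        -- sampling accuracy parameter (smaller -> larger samples)
    """
    rng = random.Random(seed)
    n = len(ground_set)
    # Per-round sample size; with this choice the total number of
    # f-evaluations is roughly (n/k) * log(1/eps) per round, i.e.
    # O(n log(1/eps)) overall instead of O(nk) for plain greedy.
    s = min(n, max(1, math.ceil((n / k) * math.log(1.0 / eps))))
    S = set()
    value = f(S)
    for _ in range(k):
        remaining = [e for e in ground_set if e not in S]
        if not remaining:
            break
        sample = rng.sample(remaining, min(s, len(remaining)))
        best, best_gain = None, 0.0
        for e in sample:
            gain = f(S | {e}) - value
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:
            break  # no sampled element has positive marginal gain
        S.add(best)
        value += best_gain
    return S

# Usage on a toy coverage objective (coverage is submodular):
universe_sets = {1: {1, 2, 3}, 2: {3, 4}, 3: {5}, 4: {1, 5, 6}}
cover = lambda S: len(set().union(*(universe_sets[e] for e in S))) if S else 0
solution = sampled_greedy(cover, list(universe_sets), k=2)
```

The skipping of zero-gain elements above matters in the non-monotone setting, where adding an element can decrease the objective; the paper's algorithms add further machinery on top of sampling to secure their constant-factor ratios.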