Title: Optimal Algorithms for Continuous Non-monotone Submodular and DR-Submodular Maximization
This paper considers the problems of maximizing a continuous non-monotone submodular function over the hypercube, both with and without coordinate-wise concavity. This family of optimization problems has several applications in machine learning, economics, and communication systems. The main result is the first 1/2-approximation algorithm for continuous submodular function maximization; this approximation factor of 1/2 is the best possible for algorithms that only query the objective function at polynomially many points. For the special case of DR-submodular maximization, i.e., when the submodular function is also coordinate-wise concave along all coordinates, the paper provides a different 1/2-approximation algorithm that runs in quasi-linear time.
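To make the stronger of the two settings concrete, below is a minimal numerical check of the DR (diminishing-returns) property, i.e. coordinate-wise concavity: increasing a coordinate should yield a smaller marginal gain at a coordinate-wise larger point. The objective `f`, the step size `delta`, and the sampling scheme are illustrative assumptions, not anything taken from the paper.

```python
import numpy as np

def dr_violations(f, n, delta=0.1, trials=200, tol=1e-9, seed=0):
    """Empirical test of the diminishing-returns (DR) property on [0,1]^n:
    for x <= y (coordinate-wise) and a coordinate i with y_i + delta <= 1,
    f(x + delta*e_i) - f(x) >= f(y + delta*e_i) - f(y).
    Counts how many randomly sampled (x, y, i) triples violate the inequality."""
    rng = np.random.default_rng(seed)
    violations = 0
    for _ in range(trials):
        x = rng.uniform(0.0, 1.0 - delta, size=n)
        y = x + rng.uniform(0.0, 1.0, size=n) * (1.0 - delta - x)  # x <= y <= 1 - delta
        i = int(rng.integers(n))
        e = np.zeros(n)
        e[i] = delta
        gain_small = f(x + e) - f(x)   # marginal gain at the smaller point
        gain_large = f(y + e) - f(y)   # marginal gain at the larger point
        if gain_large > gain_small + tol:
            violations += 1
    return violations

# Hypothetical DR-submodular objective: a concave function of a non-negative
# linear form is coordinate-wise concave with non-positive cross derivatives.
if __name__ == "__main__":
    w = np.array([0.3, 0.5, 0.2, 0.7])
    f = lambda x: float(np.sqrt(w @ x))
    print("DR violations:", dr_violations(f, n=4))   # expected: 0
```

Continuous submodularity alone only constrains the cross second derivatives (non-positive for distinct coordinates); DR-submodularity additionally requires concavity along every single coordinate, which is what this test probes.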
Award ID(s): 1813188
PAR ID: 10096612
Author(s) / Creator(s): ; ;
Date Published:
Journal Name: 31st Annual Conference on Neural Information Processing Systems (NeurIPS), December 2018.
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Kraus, Andreas (Ed.)
    In this paper we study the fundamental problems of maximizing a continuous non-monotone submodular function over the hypercube, both with and without coordinate-wise concavity. This family of optimization problems has several applications in machine learning, economics, and communication systems. Our main result is the first 1/2-approximation algorithm for continuous submodular function maximization; this approximation factor of 1/2 is the best possible for algorithms that only query the objective function at polynomially many points. For the special case of DR-submodular maximization, i.e., when the submodular function is also coordinate-wise concave along all coordinates, we provide a different 1/2-approximation algorithm that runs in quasi-linear time. Both these results improve upon prior work (Bian et al., 2017a,b; Soma and Yoshida, 2017). Our first algorithm uses novel ideas such as reducing the guaranteed approximation problem to analyzing a zero-sum game for each coordinate, and incorporates the geometry of this zero-sum game to fix the value at this coordinate. Our second algorithm exploits coordinate-wise concavity to identify a monotone equilibrium condition sufficient for obtaining the required approximation guarantee, and hunts for the equilibrium point using binary search; an illustrative sketch of this coordinate-wise binary-search idea appears after this list. We further run experiments to verify the performance of our proposed algorithms in related machine learning applications.
    We study submodular maximization problems with matroid constraints, in particular, problems where the objective can be expressed via compositions of analytic and multilinear functions. We show that for functions of this form, the so-called continuous greedy algorithm attains a ratio arbitrarily close to (1 – 1/e) ≈ 0.63 using a deterministic estimate obtained via a Taylor-series approximation. This drastically reduces execution time over prior art that uses sampling. A minimal sketch of the generic continuous greedy update appears after this list.
    We consider the problem of maximizing the multilinear extension of a submodular function subject to a single matroid constraint or multiple packing constraints with a small number of adaptive rounds of evaluation queries. We obtain the first algorithms with low adaptivity for submodular maximization with a matroid constraint. Our algorithms achieve a $$1-1/e-\epsilon$$ approximation for monotone functions and a $$1/e-\epsilon$$ approximation for non-monotone functions, which nearly matches the best guarantees known in the fully adaptive setting. The number of rounds of adaptivity is $$O(\log^2{n}/\epsilon^3)$$, which is an exponential speedup over the existing algorithms. We obtain the first parallel algorithm for non-monotone submodular maximization subject to packing constraints. Our algorithm achieves a $$1/e-\epsilon$$ approximation using $$O(\log(n/\epsilon) \log(1/\epsilon) \log(n+m)/ \epsilon^2)$$ parallel rounds, which is again an exponential speedup in parallel time over the existing algorithms. For monotone functions, we obtain a $$1-1/e-\epsilon$$ approximation in $$O(\log(n/\epsilon)\log(m)/\epsilon^2)$$ parallel rounds. The number of parallel rounds of our algorithm matches that of the state-of-the-art algorithm for solving packing LPs with a linear objective (Mahoney et al., 2016). Our results apply more generally to the problem of maximizing a diminishing returns submodular (DR-submodular) function.
    We study the problem of maximizing a non-monotone submodular function subject to a cardinality constraint in the streaming model. Our main contribution is a single-pass (semi-)streaming algorithm that uses roughly $$O(k / \varepsilon^2)$$ memory, where $$k$$ is the size constraint. At the end of the stream, our algorithm post-processes its data structure using any offline algorithm for submodular maximization, and obtains a solution whose approximation guarantee is $$\frac{\alpha}{1+\alpha}-\varepsilon$$, where $$\alpha$$ is the approximation of the offline algorithm. If we use an exact (exponential-time) post-processing algorithm, this leads to a $$\frac{1}{2}-\varepsilon$$ approximation (which is nearly optimal). If we post-process with the algorithm of \cite{buchbinder2019constrained}, which achieves the state-of-the-art offline approximation guarantee of $$\alpha=0.385$$, we obtain a $$0.2779$$-approximation in polynomial time, improving over the previously best polynomial-time approximation of $$0.1715$$ due to \cite{feldman2018less}. It is also worth mentioning that our algorithm is combinatorial and deterministic, which is rare for an algorithm for non-monotone submodular maximization, and enjoys a fast update time of $$O\left(\frac{\log k + \log(1/\alpha)}{\varepsilon^2}\right)$$ per element.
    Maximizing a monotone k-submodular function subject to cardinality constraints is a general model for several applications ranging from influence maximization with multiple products to sensor placement with multiple sensor types and online ad allocation. Due to the large problem scale in many applications and the online nature of ad allocation, a need arises for algorithms that process elements in a streaming fashion and possibly make online decisions. In this work, we develop a new streaming algorithm for maximizing a monotone k-submodular function subject to a per-coordinate cardinality constraint, attaining an approximation guarantee close to the state-of-the-art guarantee in the offline setting. Though not typical for streaming algorithms, our streaming algorithm also readily applies to the online setting with free disposal. Our algorithm is combinatorial and enjoys a fast running time and a small number of function evaluations. Furthermore, its guarantee improves as the cardinality constraints get larger, which makes it especially suited for large-scale applications. For the special case of maximizing a submodular function with large budgets, our combinatorial algorithm matches the guarantee of the state-of-the-art continuous algorithm, which requires significantly more time and function evaluations.
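The second algorithm in item 1 exploits coordinate-wise concavity and locates an equilibrium point by binary search. The sketch below is not that equilibrium-based procedure; it only illustrates the underlying primitive: along a concave coordinate the finite-difference derivative is non-increasing, so a binary search on its sign finds an approximate one-dimensional maximizer. The starting point, objective, and iteration counts are assumptions for illustration.

```python
import numpy as np

def coordinate_binary_search_ascent(f, n, passes=2, iters=40, eps=1e-6):
    """Coordinate ascent over the hypercube [0,1]^n for an objective that is
    concave along every coordinate.  The one-dimensional restriction along a
    coordinate is concave, so its finite-difference slope is non-increasing
    and a binary search on its sign finds an approximate maximizer."""
    x = np.full(n, 0.5)                      # arbitrary interior starting point
    for _ in range(passes):
        for i in range(n):
            lo, hi = 0.0, 1.0
            for _ in range(iters):           # binary search on the slope sign
                mid = (lo + hi) / 2.0
                xm = x.copy(); xm[i] = mid
                xp = x.copy(); xp[i] = min(mid + eps, 1.0)
                slope = f(xp) - f(xm)        # sign of the directional derivative
                if slope > 0:                # still increasing: move right
                    lo = mid
                else:
                    hi = mid
            x[i] = (lo + hi) / 2.0
    return x

# Hypothetical DR-submodular objective for a smoke test (illustration only):
# the Hessian of x.sum() - x @ A @ x has non-positive entries everywhere.
if __name__ == "__main__":
    A = np.array([[0.0, 0.4], [0.4, 0.0]])
    f = lambda x: float(x.sum() - x @ A @ x)
    x_hat = coordinate_binary_search_ascent(f, n=2)
    print("point:", np.round(x_hat, 3), "value:", round(f(x_hat), 4))
```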
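Item 2 refers to the continuous greedy algorithm on the multilinear extension. The sketch below shows the generic update specialized to a cardinality (uniform-matroid) constraint, with a Monte-Carlo gradient estimate standing in where the deterministic Taylor-series estimator from that work would be plugged in. The set function, the budget k, and the sample counts are illustrative assumptions.

```python
import numpy as np

def continuous_greedy_cardinality(grad_F, n, k, steps=50):
    """Generic continuous greedy for maximizing the multilinear extension F of a
    monotone submodular set function subject to sum(x) <= k over [0,1]^n.
    Each step moves by 1/steps toward the polytope vertex maximizing the inner
    product with an estimated gradient; for a cardinality constraint that vertex
    is the indicator of the k largest gradient coordinates."""
    x = np.zeros(n)
    for _ in range(steps):
        g = grad_F(x)                        # estimated gradient of F at x
        top = np.argsort(g)[-k:]             # indices of the k largest coordinates
        v = np.zeros(n)
        v[top] = 1.0
        x = np.minimum(x + v / steps, 1.0)
    return x

def monte_carlo_gradient(f_set, n, samples=64, seed=0):
    """Sampling-based estimate of the multilinear-extension gradient:
    dF/dx_i = E[f(S + i) - f(S - i)], where S includes each j independently
    with probability x_j.  (A deterministic Taylor-series estimator, as in
    item 2, could be substituted for this routine.)"""
    rng = np.random.default_rng(seed)
    def grad(x):
        g = np.zeros(n)
        for _ in range(samples):
            S = {int(j) for j in np.nonzero(rng.random(n) < x)[0]}
            for i in range(n):
                g[i] += f_set(S | {i}) - f_set(S - {i})
        return g / samples
    return grad

# Hypothetical coverage objective for a quick smoke test (illustration only).
if __name__ == "__main__":
    cover = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {0, 4}]
    f_set = lambda S: len(set().union(*[cover[i] for i in S])) if S else 0
    x = continuous_greedy_cardinality(monte_carlo_gradient(f_set, n=5), n=5, k=2)
    print(np.round(x, 2))
```

The output is a fractional point in the constraint polytope; in a full pipeline it would then be rounded (e.g. by pipage or swap rounding) to an integral solution.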