Beyond Submodular Maximization

04/19/2019
by Mehrdad Ghadiri, et al.

While there are well-developed tools for maximizing a submodular function subject to a matroid constraint, there is much less work on the corresponding supermodular maximization problems. We develop new techniques for attacking these problems, inspired by the continuous greedy method applied to the multilinear extension of a submodular function. We first adapt the continuous greedy algorithm to work for general twice-continuously differentiable functions. The performance of the adapted algorithm depends on a new smoothness parameter: if F:[0,1]^n → R_≥0 is one-sided σ-smooth, then the approximation factor depends only on σ. We apply the new algorithm to a broad class of quadratic supermodular functions arising in diversity maximization; the case σ=2 captures metric diversity maximization. We also develop new methods (inspired by swap rounding and approximate integer decomposition) for rounding quadratics over a matroid polytope. Together with the adapted continuous greedy, this leads to an O(σ^(3/2))-approximation. This is the best asymptotic approximation known for this class of diversity maximization problems, and the evidence suggests that it may be tight. We then consider general (non-quadratic) functions. We give a broad parameterized family of monotone functions which includes submodular functions and the supermodular family of discrete quadratics just discussed; such set functions are called γ-meta-submodular. We develop local search algorithms with approximation factors that depend only on γ. We show that the γ-meta-submodular families include several well-known function classes: meta-submodular functions (γ=0), proportionally submodular functions (γ=1), diversity functions based on negative-type distances or the Jensen-Shannon divergence (both γ=2), and (semi-)metric diversity functions.
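To make the continuous-greedy adaptation concrete, here is a minimal Python sketch of the general Frank-Wolfe-style ascent over a matroid polytope, applied to the multilinear extension of a quadratic diversity objective. Everything below is illustrative and not from the paper: the function names (continuous_greedy, quadratic_gradient, topk_vertex), the fixed step schedule, the choice of a uniform (cardinality-k) matroid, and the toy data are assumptions for exposition; the paper's one-sided σ-smoothness analysis and its swap-rounding-based rounding step are not reproduced here.

```python
import numpy as np

def continuous_greedy(grad_F, linear_opt, n, steps=200):
    """Frank-Wolfe-style continuous greedy over a matroid polytope.

    grad_F     -- gradient oracle for a differentiable objective F on [0,1]^n
    linear_opt -- given a linear objective c, returns a vertex of the matroid
                  polytope maximizing c.x (for a matroid this is the classic
                  greedy algorithm over independent sets)
    """
    x = np.zeros(n)
    dt = 1.0 / steps
    for _ in range(steps):
        v = linear_opt(grad_F(x))  # best ascent direction inside the polytope
        x = x + dt * v             # take a small step toward that vertex
    return x                       # fractional point; still needs rounding


def quadratic_gradient(w, D):
    # Multilinear extension of the discrete quadratic
    #   f(S) = sum_{i in S} w_i + sum_{i<j in S} D_ij
    # is F(x) = w.x + 0.5 * x^T D x (D symmetric, zero diagonal),
    # so its gradient is w + D x.
    return lambda x: w + D @ x


def topk_vertex(c, k):
    # Linear maximization over the uniform matroid of rank k:
    # pick the k coordinates with the largest weights.
    v = np.zeros(len(c))
    v[np.argsort(-c)[:k]] = 1.0
    return v


if __name__ == "__main__":
    # Toy metric-diversity instance (random points, Euclidean distances);
    # assumed data for illustration only.
    rng = np.random.default_rng(0)
    pts = rng.random((8, 2))
    D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    w = np.zeros(8)
    x = continuous_greedy(quadratic_gradient(w, D),
                          lambda c: topk_vertex(c, 3), n=8)
    print(np.round(x, 3))  # fractional solution over the cardinality-3 polytope
```

The output is a fractional point in the matroid polytope; turning it into a feasible set is exactly the rounding problem the paper addresses with its swap-rounding- and approximate-integer-decomposition-inspired methods, which this sketch leaves out.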
