
Nearly Linear-Time, Parallelizable Algorithms for Non-Monotone Submodular Maximization

by Alan Kuhnle et al.

We study parallelizable algorithms for maximization of a submodular function, not necessarily monotone, with respect to a cardinality constraint k. We improve the best approximation factor achieved by an algorithm that has optimal adaptivity and query complexity, up to logarithmic factors in the size n of the ground set, from 0.039 - ϵ to 0.193 - ϵ. We provide two algorithms; the first has approximation ratio 1/6 - ϵ, adaptivity O(log n), and query complexity O(n log k), while the second has approximation ratio 0.193 - ϵ, adaptivity O(log^2 n), and query complexity O(n log k). Heuristic versions of our algorithms are empirically validated to use a low number of adaptive rounds and total queries while obtaining solutions with high objective value in comparison with highly adaptive approximation algorithms.
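The abstract does not spell out the algorithms themselves. As a rough, illustrative baseline only (not the paper's method), the sketch below shows a standard threshold-greedy routine for cardinality-constrained submodular maximization, the kind of sequential building block that low-adaptivity algorithms parallelize by evaluating many marginal gains per round. The toy coverage objective and all names here are our own assumptions for the demo, and the toy objective is monotone for simplicity, whereas the paper targets the non-monotone case.

```python
def threshold_greedy(f, ground, k, eps=0.1):
    """Illustrative threshold greedy for a cardinality constraint k.

    f: set function mapping a collection of elements to a float;
       assumed submodular. This is a textbook-style baseline, NOT the
    algorithm from the paper, which achieves low adaptivity by
    processing many candidate elements per round.
    """
    S = set()
    # Largest singleton value sets the initial threshold.
    d = max(f({e}) for e in ground)
    tau = d
    # Geometrically decrease the threshold; add any element whose
    # current marginal gain still meets it.
    while tau > (eps / len(ground)) * d and len(S) < k:
        for e in ground:
            if e in S or len(S) >= k:
                continue
            gain = f(S | {e}) - f(S)
            if gain >= tau:
                S.add(e)
        tau *= 1 - eps
    return S

# Toy coverage objective (assumed for the demo): f(S) = size of the
# union of the sets covered by the chosen elements.
cover = {0: {1, 2, 3}, 1: {3, 4}, 2: {5}, 3: {1, 2, 3, 4, 5}}

def f(S):
    covered = set()
    for e in S:
        covered |= cover[e]
    return len(covered)

S = threshold_greedy(f, list(cover), k=2)
```

On this toy instance, element 3 alone covers all five items, so the routine picks it at the initial threshold and no remaining element offers positive marginal gain. Each pass over the ground set corresponds to one adaptive round when the marginal-gain queries are issued in parallel.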




Related research:
- Best of Both Worlds: Practical and Theoretically Optimal Submodular Maximization in Parallel
- Beyond Pointwise Submodularity: Non-Monotone Adaptive Submodular Maximization in Linear Time
- On the Complexity of Dynamic Submodular Maximization
- The FAST Algorithm for Submodular Maximization
- DASH: Distributed Adaptive Sequencing Heuristic for Submodular Maximization
- A polynomial lower bound on adaptive complexity of submodular maximization