
Stochastic-Greedy++: Closing the Optimality Gap in Exact Weak Submodular Maximization

by Gustavo de Veciana, et al.

Many problems in discrete optimization can be formulated as maximizing a monotone, weak submodular function subject to a cardinality constraint. For such problems, the simple greedy algorithm is guaranteed to find a solution whose value is no worse than a 1-1/e fraction of the optimal. Although the computational complexity of Greedy is linear in the size of the data m and the cardinality constraint k, even this linear complexity becomes prohibitive for large-scale datasets. Recently, Mirzasoleiman et al. proposed a randomized greedy algorithm, Stochastic-Greedy, with an expected worst-case approximation ratio of 1-1/e-ϵ, where 0<ϵ<1 is a parameter controlling the trade-off between complexity and performance. We consider the following question: given the small gap ϵ between the worst-case performance guarantees of Stochastic-Greedy and Greedy, can we expect nearly equivalent conditions for exact identification of the optimal subset? In this paper we show that, in general, there is an unbounded gap between the exact performance of Stochastic-Greedy and Greedy by considering the problem of sparse support selection. Tropp and Gilbert showed that Greedy finds the optimal solution with high probability given n = O(k log(m/k)) measurements, the information-theoretic lower bound on the minimum number of measurements for exact identification of the optimal subset. By contrast, we show that, irrespective of the number of measurements, Stochastic-Greedy fails with overwhelming probability to find the optimal subset. We reveal that this failure can be circumvented by progressively increasing the size of the stochastic search space. Employing this insight, we present the first sparse support selection algorithm that achieves exact identification of the optimal subset from O(k log(m/k)) measurements with complexity Õ(m) for arbitrary sparse vectors.
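To make the contrast concrete, the following is a minimal sketch of the two baseline algorithms the abstract compares, plus an illustration of the paper's key insight (a growing sample size), for maximizing a generic monotone set function under a cardinality constraint. The function names, the coverage objective used in the example, and the particular growth schedule in `progressive_stochastic_greedy` are our own illustrative choices, not the paper's exact algorithm; the Stochastic-Greedy sample size (m/k)·log(1/ϵ) follows Mirzasoleiman et al.

```python
import math
import random

def greedy(ground_set, f, k):
    """Standard Greedy: k rounds, each scanning every remaining element
    and adding the one with the largest marginal gain. Cost: O(m * k) calls to f."""
    S = []
    for _ in range(k):
        best = max((e for e in ground_set if e not in S),
                   key=lambda e: f(S + [e]) - f(S))
        S.append(best)
    return S

def stochastic_greedy(ground_set, f, k, eps=0.1, rng=random):
    """Stochastic-Greedy (Mirzasoleiman et al.): each round draws a uniform
    random sample of size (m/k) * log(1/eps) and adds the best element
    found in the sample only. Expected guarantee: 1 - 1/e - eps."""
    m = len(ground_set)
    s = max(1, math.ceil((m / k) * math.log(1.0 / eps)))
    S = []
    for _ in range(k):
        candidates = [e for e in ground_set if e not in S]
        sample = rng.sample(candidates, min(s, len(candidates)))
        best = max(sample, key=lambda e: f(S + [e]) - f(S))
        S.append(best)
    return S

def progressive_stochastic_greedy(ground_set, f, k, eps=0.1, rng=random):
    """Illustration of the abstract's insight: grow the stochastic search
    space across rounds (here, doubling the base sample size each round,
    a hypothetical schedule) so later, harder selections see more candidates."""
    m = len(ground_set)
    base = max(1, math.ceil((m / k) * math.log(1.0 / eps)))
    S = []
    for t in range(k):
        candidates = [e for e in ground_set if e not in S]
        s = min(base * (2 ** t), len(candidates))  # progressively larger sample
        sample = rng.sample(candidates, s)
        best = max(sample, key=lambda e: f(S + [e]) - f(S))
        S.append(best)
    return S
```

As a quick usage example, with a coverage objective f(S) = |union of the sets indexed by S| over `{0: {1,2,3}, 1: {3,4}, 2: {5}, 3: {1,2}}` and k = 2, `greedy` picks element 0 (gain 3) and then element 1 (gain 1), covering four items.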



