Stochastic Submodular Maximization: The Case of Coverage Functions

11/05/2017
by Mohammad Reza Karimi, et al.

Stochastic optimization of continuous objectives is at the heart of modern machine learning. However, many important problems are discrete in nature and often involve submodular objectives. We seek to unleash the power of stochastic continuous optimization, namely stochastic gradient descent and its variants, on such discrete problems. We first introduce the problem of stochastic submodular optimization, where one needs to optimize a submodular objective that is given as an expectation. Our model captures situations where the discrete objective arises as an empirical risk (e.g., in exemplar-based clustering) or is given by an explicit stochastic model (e.g., in influence maximization in social networks). By exploiting the fact that common extensions act linearly on the class of submodular functions, we employ projected stochastic gradient ascent and its variants in the continuous domain, and perform rounding to obtain discrete solutions. We focus on the rich and widely used family of weighted coverage functions. We show that our approach yields solutions that provably match the optimal approximation guarantees, while reducing the computational cost by several orders of magnitude, as we demonstrate empirically.
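To make the described pipeline concrete, below is a minimal sketch (not the authors' code) of the "relax, optimize, round" idea for a weighted coverage objective under a cardinality constraint: projected stochastic gradient ascent on a concave relaxation of the expected coverage, followed by rounding. The toy stochastic coverage model, the step size, and the simple top-k rounding heuristic are illustrative assumptions; the approximation guarantees in the paper rely on dependent rounding schemes rather than top-k selection.

# Sketch: stochastic submodular maximization for weighted coverage functions.
# Maximize E_gamma[f_gamma(S)] subject to |S| <= k, where each f_gamma is a weighted
# coverage function, by running projected stochastic gradient ascent on the concave
# relaxation F(x) = E_gamma[sum_u w_u * min(1, sum_{i covers u} x_i)] over
# {x in [0,1]^n : sum(x) <= k}, then rounding the fractional solution.
import numpy as np

rng = np.random.default_rng(0)

n_items, n_elems, k = 30, 100, 5
# Toy stochastic coverage model (illustrative assumption): item i covers element u
# with probability p[i, u] in a random realization (think of live edges in
# influence maximization).
p = rng.uniform(0.0, 0.15, size=(n_items, n_elems))
w = rng.uniform(0.5, 1.5, size=n_elems)          # element weights


def sample_realization():
    """Draw one realization: boolean matrix cover[i, u] = item i covers element u."""
    return rng.random((n_items, n_elems)) < p


def stochastic_supergradient(x, cover):
    """Supergradient of the concave relaxation for one sampled realization.

    F_gamma(x) = sum_u w_u * min(1, sum_i cover[i, u] * x_i); its supergradient
    w.r.t. x_i sums w_u over elements u covered by item i whose fractional
    coverage is still below 1.
    """
    frac_cov = cover.T @ x                       # shape (n_elems,)
    active = (frac_cov < 1.0).astype(float)      # elements not yet saturated
    return cover @ (w * active)                  # shape (n_items,)


def project(v, k):
    """Euclidean projection onto {x in [0,1]^n : sum(x) <= k} via bisection."""
    x = np.clip(v, 0.0, 1.0)
    if x.sum() <= k:
        return x
    lo, hi = 0.0, v.max()
    for _ in range(60):                          # bisection on the shift tau
        tau = 0.5 * (lo + hi)
        if np.clip(v - tau, 0.0, 1.0).sum() > k:
            lo = tau
        else:
            hi = tau
    return np.clip(v - hi, 0.0, 1.0)


# Projected stochastic gradient ascent on the relaxation.
x = np.full(n_items, k / n_items)
for t in range(1, 2001):
    g = stochastic_supergradient(x, sample_realization())
    x = project(x + (1.0 / np.sqrt(t)) * g, k)

# Rounding: a simple top-k selection, used here only as a stand-in for the
# dependent-rounding schemes that carry the (1 - 1/e) guarantee.
S = np.argsort(-x)[:k]

# Monte Carlo estimate of the discrete objective for the rounded solution.
vals = [w[sample_realization()[S].any(axis=0)].sum() for _ in range(200)]
print("selected items:", sorted(S.tolist()), "est. expected coverage:", np.mean(vals))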


research · 04/24/2018
Stochastic Conditional Gradient Methods: From Convex Minimization to Submodular Maximization
This paper considers stochastic optimization problems for a large class ...

research · 03/17/2023
Stochastic Submodular Maximization via Polynomial Estimators
In this paper, we study stochastic submodular maximization problems with...

research · 02/22/2018
Projection-Free Online Optimization with Stochastic Gradient: From Convexity to Submodularity
Online optimization has been a successful framework for solving large-sc...

research · 02/16/2018
Online Continuous Submodular Maximization
In this paper, we consider an online optimization process, where the obj...

research · 07/04/2017
Unsupervised Submodular Rank Aggregation on Score-based Permutations
Unsupervised rank aggregation on score-based permutations, which is wide...

research · 06/22/2021
Reusing Combinatorial Structure: Faster Iterative Projections over Submodular Base Polytopes
Optimization algorithms such as projected Newton's method, FISTA, mirror...

research · 09/10/2019
Distorted stochastic dominance: a generalized family of stochastic orders
We study a generalized family of stochastic orders, semiparametrized by ...
