Stochastic Submodular Probing with State-Dependent Costs

09/01/2019
by   Shaojie Tang, et al.

In this paper, we study a new stochastic submodular maximization problem with state-dependent costs and rejections. The input of our problem consists of a budget constraint B and a set of items whose states (i.e., the marginal contribution and the cost of an item) are drawn from a known probability distribution. The only way to learn the realized state of an item is to probe that item. We allow rejections: after probing an item and observing its actual state, we must decide immediately and irrevocably whether to add that item to our solution. Our objective is to maximize the objective function subject to a budget constraint on the total cost of the selected items. We present a constant-factor approximate solution to this problem, and we show that our solution also applies to an online setting.
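To make the probing model concrete, here is a minimal sketch of the probe-then-commit interaction described above. It is not the paper's constant-factor algorithm; it uses a simple additive (rather than general submodular) objective and a naive accept-if-it-fits rule, and all names (`probe_and_select`, the item dictionaries) are illustrative assumptions.

```python
import random

def probe_and_select(items, budget, seed=0):
    """Illustrative probe-then-commit policy (NOT the paper's algorithm).

    Items are probed in a fixed order; probing an item reveals its
    realized state (value, cost), after which we must irrevocably
    accept or reject it. Accepted items consume the shared budget.
    """
    rng = random.Random(seed)
    remaining = budget
    total_value = 0.0
    chosen = []
    for item in items:
        value, cost = item["sample"](rng)  # probing reveals the state
        # Irrevocable decision: accept only if it fits the remaining budget.
        if cost <= remaining and value > 0:
            chosen.append(item["name"])
            remaining -= cost
            total_value += value
    return chosen, total_value, remaining

# Toy instance: each item's state is drawn from a known distribution.
items = [
    {"name": f"item{i}",
     "sample": (lambda r, i=i: (r.uniform(0, i + 1), r.uniform(0.5, 2.0)))}
    for i in range(5)
]
chosen, val, remaining = probe_and_select(items, budget=4.0)
```

The key constraint the sketch captures is irrevocability: once an item's state is revealed, the decision is made on the spot, so the policy cannot defer and compare against items probed later.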


Related research

11/11/2021 · Constrained Stochastic Submodular Maximization with State-Dependent Costs
In this paper, we study the constrained stochastic submodular maximizati...

07/07/2020 · Adaptive Cascade Submodular Maximization
In this paper, we propose and study the cascade submodular maximization ...

05/23/2019 · Price of Dependence: Stochastic Submodular Maximization with Dependent Items
In this paper, we study the stochastic submodular maximization problem w...

02/28/2021 · Adaptive Regularized Submodular Maximization
In this paper, we study the problem of maximizing the difference between...

04/08/2022 · Ranking with submodular functions on a budget
Submodular maximization has been the backbone of many important machine-...

12/11/2020 · Adaptive Submodular Meta-Learning
Meta-Learning has gained increasing attention in the machine learning an...

02/01/2022 · Sketching stochastic valuation functions
We consider the problem of sketching a stochastic valuation function, de...
