Data Summarization beyond Monotonicity: Non-monotone Two-Stage Submodular Maximization

09/11/2023
by Shaojie Tang, et al.

The objective of a two-stage submodular maximization problem is to reduce the ground set, using a set of provided submodular training functions, so that optimizing new objective functions over the reduced ground set yields results comparable to those obtained over the original ground set. This problem has applications in various domains, including data summarization. Existing studies typically assume that the objective functions are monotone; our work extends this line of research to non-monotone submodular functions, and we introduce the first constant-factor approximation algorithms for this more general case.
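
To make the two-stage setup concrete, here is a minimal illustrative sketch (all data, weights, and function names are hypothetical, not from the paper). For simplicity the training functions are weighted-coverage functions, which are monotone submodular; the paper's contribution is precisely the harder non-monotone case, which this toy does not capture. Stage 1 greedily builds a reduced ground set S; stage 2 checks that the best small subset of S scores close to the best small subset of the full ground set:

```python
from itertools import combinations

# Hypothetical ground set: element -> set of items it covers.
coverage = {
    'a': {1, 2}, 'b': {2, 3}, 'c': {4}, 'd': {1, 4, 5}, 'e': {3, 5},
}
V = list(coverage)

# Hypothetical item weights defining two training functions f_1, f_2.
train_fns = [{1: 1.0, 2: 1.0, 5: 2.0}, {3: 1.5, 4: 1.0}]

def f(A, weights):
    # Weighted coverage: submodular (and monotone, in this toy example).
    covered = set().union(*(coverage[x] for x in A)) if A else set()
    return sum(weights.get(i, 0) for i in covered)

def best_subset_value(fn_weights, ground, k):
    # Stage-2 optimum: best subset of size <= k, by brute force
    # (fine at toy scale; real instances would use a greedy subroutine).
    return max(f(set(A), fn_weights)
               for r in range(k + 1)
               for A in combinations(ground, r))

def greedy_reduce(size_limit, k):
    # Stage 1: grow S greedily, maximizing the sum over training
    # functions of their stage-2 optima restricted to S.
    S = set()
    while len(S) < size_limit:
        gains = {x: sum(best_subset_value(w, S | {x}, k) for w in train_fns)
                 for x in set(V) - S}
        S.add(max(gains, key=gains.get))
    return S

S = greedy_reduce(size_limit=3, k=2)
full = sum(best_subset_value(w, V, 2) for w in train_fns)
reduced = sum(best_subset_value(w, S, 2) for w in train_fns)
print(S, reduced, full)  # reduced should be close to full
```

The greedy stage-1 rule shown here is one natural heuristic for the reduction step; the paper's algorithms and their constant-factor guarantees for the non-monotone setting are more involved than this sketch.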


