On the Complexity of Constrained Determinantal Point Processes

08/01/2016
by   L. Elisa Celis, et al.

Determinantal Point Processes (DPPs) are probabilistic models that arise in quantum physics and random matrix theory and have recently found numerous applications in computer science. DPPs define distributions over subsets of a given ground set; they exhibit interesting properties such as negative correlation and, unlike many other models, admit efficient sampling algorithms. When applied to kernel methods in machine learning, DPPs favor subsets of the given data with more diverse features. However, many real-world applications require efficient algorithms to sample from DPPs with additional constraints on the subset, e.g., partition or matroid constraints, which are important for encoding priors or for enforcing resource or fairness requirements on the sampled subset. Whether one can efficiently sample from DPPs in such constrained settings is an important problem that was first raised in the survey of DPPs by Kulesza and Taskar (2012) and has been studied in several recent works in the machine learning literature. The main contribution of our paper is the first resolution of the complexity of sampling from constrained DPPs. We give exact, efficient algorithms for sampling from constrained DPPs when the constraints are specified in unary. Furthermore, we prove that when the constraints are specified in binary, the problem is #P-hard via a reduction from the problem of computing mixed discriminants, which suggests that even an FPRAS is unlikely to exist. Our results benefit from viewing the constrained sampling problem through the lens of polynomials. Consequently, we obtain algorithms of independent interest: 1) an algorithm to count over the base polytope of regular matroids when there are additional (succinct) budget constraints, and 2) an algorithm to evaluate and compute mixed characteristic polynomials, which played a central role in the resolution of the Kadison-Singer problem, for certain special cases.
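To make the constrained sampling problem concrete, below is a minimal brute-force sketch, not the paper's algorithm: a DPP with PSD kernel L assigns each subset S a probability proportional to det(L_S), and a partition constraint restricts the support to subsets containing a prescribed number of elements from each group. The kernel, the partition, and the per-group budgets in the example are hypothetical illustration inputs, and the enumeration takes exponential time; the point of the paper is precisely how to avoid such enumeration.

```python
# A minimal, illustrative sketch of partition-constrained DPP sampling by
# brute-force enumeration. The kernel L, the partition `parts`, and the
# budgets are hypothetical inputs chosen only for illustration.
import itertools
import numpy as np

def det_weight(L, S):
    """Unnormalized DPP weight of subset S: det(L restricted to rows/cols S)."""
    if not S:
        return 1.0
    idx = np.array(sorted(S))
    return float(np.linalg.det(L[np.ix_(idx, idx)]))

def sample_constrained_dpp(L, parts, budgets, rng=None):
    """Sample S with P(S) proportional to det(L_S), subject to choosing
    exactly budgets[i] elements from parts[i].  Exponential-time; only
    meant to make the constrained-sampling problem concrete."""
    rng = rng or np.random.default_rng()
    # Enumerate every subset that satisfies the partition constraint.
    per_part_choices = [
        list(itertools.combinations(part, b)) for part, b in zip(parts, budgets)
    ]
    candidates, weights = [], []
    for combo in itertools.product(*per_part_choices):
        S = tuple(sorted(itertools.chain.from_iterable(combo)))
        w = det_weight(L, S)
        if w > 0:
            candidates.append(S)
            weights.append(w)
    probs = np.array(weights) / np.sum(weights)
    return candidates[rng.choice(len(candidates), p=probs)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    B = rng.normal(size=(6, 6))
    L = B @ B.T                      # a PSD kernel, so all minors are >= 0
    parts = [[0, 1, 2], [3, 4, 5]]   # ground set split into two groups
    budgets = [1, 1]                 # pick exactly one element per group
    print(sample_constrained_dpp(L, parts, budgets, rng))
```

The constraint here is a simple partition constraint; the paper's results cover more general matroid and budget constraints and characterize when the exponential enumeration above can be replaced by an efficient exact sampler (unary constraint descriptions) versus when the problem becomes #P-hard (binary descriptions).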


