The Bayesian Low-Rank Determinantal Point Process Mixture Model

08/15/2016
by   Mike Gartrell, et al.

Determinantal point processes (DPPs) are an elegant model for encoding probabilities over subsets, such as shopping baskets, of a ground set, such as an item catalog. They are useful for a number of machine learning tasks, including product recommendation. DPPs are parametrized by a positive semi-definite kernel matrix. Recent work has shown that using a low-rank factorization of this kernel provides remarkable scalability improvements that open the door to training on large-scale datasets and computing online recommendations, both of which are infeasible with standard DPP models that use a full-rank kernel. In this paper we present a low-rank DPP mixture model that allows us to represent the latent structure present in observed subsets as a mixture of a number of component low-rank DPPs, where each component DPP is responsible for representing a portion of the observed data. The mixture model allows us to effectively address the capacity constraints of the low-rank DPP model. We present an efficient and scalable Markov Chain Monte Carlo (MCMC) learning algorithm for our model that uses Gibbs sampling and stochastic gradient Hamiltonian Monte Carlo (SGHMC). Using an evaluation on several real-world product recommendation datasets, we show that our low-rank DPP mixture model provides substantially better predictive performance than is possible with a single low-rank or full-rank DPP, and significantly better performance than several other competing recommendation methods in many cases.
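To make the abstract concrete, here is a minimal sketch of the two quantities it describes: the probability a low-rank DPP assigns to a subset, and the mixture of such DPPs. A DPP with kernel L assigns P(Y) = det(L_Y) / det(L + I); with the low-rank factorization L = V Vᵀ, the normalizer reduces to det(I_k + Vᵀ V) by the matrix determinant lemma, so its cost scales with the rank k rather than the catalog size. The function names, the mixture-weight representation, and the toy dimensions below are illustrative assumptions, not the authors' implementation (their learning procedure uses Gibbs sampling and SGHMC, which is not shown here).

```python
import numpy as np

def low_rank_dpp_logprob(V, subset):
    """Log-probability of `subset` under a DPP with kernel L = V V^T.

    V : (n_items, k) factor matrix, k << n_items.
    The normalizer uses det(L + I) = det(I_k + V^T V), so it is a
    k x k determinant instead of an n x n one.
    """
    V_sub = V[subset]                       # factor rows for the chosen items
    L_sub = V_sub @ V_sub.T                 # |Y| x |Y| principal minor of L
    _, log_num = np.linalg.slogdet(L_sub)   # det of the empty minor is 1
    k = V.shape[1]
    _, log_den = np.linalg.slogdet(np.eye(k) + V.T @ V)
    return log_num - log_den

def dpp_mixture_logprob(factors, weights, subset):
    """Mixture of low-rank DPPs: P(Y) = sum_m w_m * P_m(Y).

    factors : list of (n_items, k_m) component factor matrices.
    weights : mixture weights summing to 1.
    """
    comp = np.array([low_rank_dpp_logprob(V, subset) for V in factors])
    return np.log(np.sum(np.asarray(weights) * np.exp(comp)))
```

Because each component is itself a valid distribution over subsets, the mixture probabilities sum to 1 over all subsets of the ground set; each component DPP can then specialize to a different portion of the observed baskets, which is how the mixture relaxes the capacity limit of a single rank-k kernel.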

Related research

Low-Rank Factorization of Determinantal Point Processes for Recommendation (02/17/2016)
Determinantal point processes (DPPs) have garnered attention as an elega...

Scalable MCMC Sampling for Nonsymmetric Determinantal Point Processes (07/01/2022)
A determinantal point process (DPP) is an elegant model that assigns a p...

Matrix Approximation under Local Low-Rank Assumption (01/15/2013)
Matrix approximation is a common tool in machine learning for building a...

Multivariate Hawkes Processes for Large-scale Inference (02/26/2016)
In this paper, we present a framework for fitting multivariate Hawkes pr...

Bayesian Complementary Kernelized Learning for Multidimensional Spatiotemporal Data (08/21/2022)
Probabilistic modeling of multidimensional spatiotemporal data is critic...

DAMM: Directionality-Aware Mixture Model Parallel Sampling for Efficient Dynamical System Learning (09/05/2023)
The Linear Parameter Varying Dynamical System (LPV-DS) is a promising fr...

Scalable Spatiotemporally Varying Coefficient Modeling with Bayesian Kernelized Tensor Regression (08/31/2021)
As a regression technique in spatial statistics, spatiotemporally varyin...
