Particle Gibbs Sampling for Bayesian Feature Allocation Models

Bayesian feature allocation models are a popular tool for modelling data with a combinatorial latent structure. Exact inference in these models is generally intractable, so practitioners typically apply Markov chain Monte Carlo (MCMC) methods for posterior inference. The most widely used MCMC strategies rely on element-wise Gibbs updates of the feature allocation matrix. These element-wise updates can be inefficient because features are typically strongly correlated. To overcome this problem we have developed a Gibbs sampler that can update an entire row of the feature allocation matrix in a single move. However, this sampler is impractical for models with a large number of features, as its computational complexity scales exponentially in the number of features. We therefore develop a Particle Gibbs (PG) sampler that targets the same distribution as the row-wise Gibbs updates, but whose computational complexity grows only linearly in the number of features. We compare the performance of our proposed methods to the standard Gibbs sampler using synthetic data from a range of feature allocation models. Our results suggest that row-wise updates using the PG methodology can significantly improve the performance of samplers for feature allocation models.
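The row-wise update described above can be sketched as a conditional SMC (Particle Gibbs) sweep over the K entries of one row of the binary feature matrix. The sketch below is a minimal illustration only, assuming a linear-Gaussian likelihood x_n ~ N(z_n A, sigma_x^2 I), independent Bernoulli(pi_k) feature priors, and partial-reconstruction intermediate targets; the function name `pg_row_update` and its arguments are hypothetical and not taken from the paper's implementation.

```python
import numpy as np

def log_gauss(x, mean, sigma):
    # Isotropic Gaussian log-density, up to constants that cancel in the weights.
    return -0.5 * np.sum((x - mean) ** 2) / sigma ** 2

def pg_row_update(x_n, z_ref, A, sigma_x, pi, n_particles=16, rng=None):
    """One conditional-SMC (Particle Gibbs) sweep over the K entries of a row
    z_n of the feature matrix, targeting p(z_n | x_n, A) under
    x_n ~ N(z_n A, sigma_x^2 I) with independent Bernoulli(pi_k) priors.
    Cost is O(n_particles * K), versus O(2^K) for an exact row-wise update.
    z_ref is the retained reference row from the previous MCMC iteration."""
    rng = np.random.default_rng() if rng is None else rng
    K, D = A.shape
    P = n_particles
    parts = np.zeros((P, K), dtype=int)
    means = np.zeros((P, D))  # running partial reconstructions z_{1:k} A_{1:k}
    logw = np.zeros(P)
    prev_ll = np.full(P, log_gauss(x_n, np.zeros(D), sigma_x))
    for k in range(K):
        if k > 0:  # multinomial resampling, retaining the reference in slot 0
            w = np.exp(logw - logw.max())
            w /= w.sum()
            idx = rng.choice(P, size=P, p=w)
            idx[0] = 0  # conditional SMC: the reference path always survives
            parts, means, prev_ll = parts[idx], means[idx], prev_ll[idx]
            logw[:] = 0.0
        # Propose each particle's k-th entry from the prior; particle 0 is
        # clamped to the retained reference path (the defining PG step).
        z_k = (rng.random(P) < pi[k]).astype(int)
        z_k[0] = z_ref[k]
        parts[:, k] = z_k
        means = means + z_k[:, None] * A[k]
        # Incremental weight: partial likelihood using features 1..k only,
        # treating not-yet-visited features as switched off (an assumed
        # choice of intermediate targets for this sketch).
        ll = np.array([log_gauss(x_n, means[p], sigma_x) for p in range(P)])
        logw += ll - prev_ll
        prev_ll = ll
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return parts[rng.choice(P, p=w)]
```

A single call replaces one row of the feature matrix; embedding it in an outer loop over rows (with the other conditionals of the model) yields the full sampler. The multinomial resampling step could be swapped for systematic resampling or combined with ancestor sampling to further reduce path degeneracy.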

