Adaptive Scan Gibbs Sampler for Large Scale Inference Problems

01/27/2018
by Vadim Smolyakov, et al.

For large-scale online inference problems, the update strategy is critical for performance. We derive an adaptive-scan Gibbs sampler that optimizes the update frequency by selecting an optimal mini-batch size. We demonstrate the performance of our adaptive batch-size Gibbs sampler by comparing it against the collapsed Gibbs sampler for the Bayesian Lasso, Dirichlet Process Mixture Model (DPMM), and Latent Dirichlet Allocation (LDA) graphical models.
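To illustrate the general idea of an adaptive scan, here is a minimal toy sketch: a Gibbs sampler for a bivariate normal in which the coordinate to update is chosen with probabilities that adapt online to a running measure of each coordinate's movement. The weighting scheme and all names below are illustrative assumptions for exposition; they are not the paper's mini-batch-size criterion.

```python
import math
import random

def adaptive_scan_gibbs(n_iters=5000, rho=0.8, seed=0):
    """Toy adaptive-scan Gibbs sampler for a bivariate normal with
    unit variances and correlation rho.

    Each step picks one coordinate with probability proportional to an
    adaptive weight, then draws it from its exact Gaussian conditional.
    The weights track an exponential moving average of squared
    innovations (an illustrative stand-in for an adaptive update
    frequency, not the authors' method).
    """
    rng = random.Random(seed)
    x = [0.0, 0.0]
    weights = [1.0, 1.0]             # adaptive coordinate-selection weights
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    samples = []
    for _ in range(n_iters):
        # Pick a coordinate with probability proportional to its weight.
        i = 0 if rng.random() < weights[0] / (weights[0] + weights[1]) else 1
        j = 1 - i
        # Exact conditional: x_i | x_j ~ N(rho * x_j, 1 - rho^2).
        x[i] = rng.gauss(rho * x[j], sd)
        # Adapt: favor coordinates with larger recent squared moves.
        weights[i] = 0.99 * weights[i] + 0.01 * (x[i] - rho * x[j]) ** 2
        samples.append(tuple(x))
    return samples

samples = adaptive_scan_gibbs()
mean_x = sum(s[0] for s in samples) / len(samples)
mean_y = sum(s[1] for s in samples) / len(samples)
```

In this sketch both marginals are standard normal, so the sample means should hover near zero; the adaptive weights merely bias which coordinate gets refreshed more often, which is the flavor of trade-off an adaptive-scan sampler tunes at scale.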

Related research

Truncation Approximation for Enriched Dirichlet Process Mixture Models (05/02/2023)
Enriched Dirichlet process mixture (EDPM) models are Bayesian nonparamet...

Blocking Collapsed Gibbs Sampler for Latent Dirichlet Allocation Models (08/02/2016)
The latent Dirichlet allocation (LDA) model is a widely-used latent vari...

Adapting The Gibbs Sampler (01/28/2018)
The popularity of Adaptive MCMC has been fueled on the one hand by its s...

Comments on: A Gibbs sampler for a class of random convex polytopes (04/15/2021)
In this comment we discuss relative strengths and weaknesses of simplex ...

Amended Gibbs samplers for Cosmic Microwave Background power spectrum estimation (11/15/2021)
We study different variants of the Gibbs sampler algorithm from the pers...

Interdependent Gibbs Samplers (04/11/2018)
Gibbs sampling, as a model learning method, is known to produce the most...

Variational Bayes for Gaussian Factor Models under the Cumulative Shrinkage Process (08/12/2020)
The cumulative shrinkage process is an increasing shrinkage prior that c...
