SAME but Different: Fast and High-Quality Gibbs Parameter Estimation

09/18/2014
by Huasha Zhao, et al.

Gibbs sampling is a workhorse for Bayesian inference but has several limitations when used for parameter estimation, and is often much slower than non-sampling inference methods. SAME (State Augmentation for Marginal Estimation) (Doucet et al., 1999; 2002) is an approach to MAP parameter estimation which gives improved parameter estimates over direct Gibbs sampling. SAME can be viewed as cooling the posterior parameter distribution and allows annealed search for the MAP parameters, often yielding very high quality (lower loss) estimates. But it does so at the expense of additional samples per iteration and generally slower performance. On the other hand, SAME dramatically increases the parallelism in the sampling schedule, and is an excellent match for modern (SIMD) hardware. In this paper we explore the application of SAME to graphical model inference on modern hardware. We show that combining SAME with factored sample representation (or approximation) gives throughput competitive with the fastest symbolic methods, but with potentially better quality. We describe experiments on Latent Dirichlet Allocation, achieving speeds similar to the fastest reported methods (online Variational Bayes) and lower cross-validated loss than other LDA implementations. The method is simple to implement and should be applicable to many other models.
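As a rough illustration of the core idea, the sketch below applies SAME-style Gibbs sampling to a toy two-component Gaussian mixture rather than to LDA. Replicating the latent assignments m times and conditioning the parameter update on all m copies concentrates (cools) the parameter posterior toward the MAP estimate, and m can be annealed upward over iterations. The toy model, variable names, and annealing schedule here are illustrative assumptions, not the paper's implementation.

```python
"""Minimal sketch of SAME (State Augmentation for Marginal Estimation)
Gibbs sampling on a toy two-component Gaussian mixture with known unit
variances, equal mixing weights, and unknown component means.  This is
an assumed illustration of the general technique, not the paper's LDA code."""
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a two-component mixture (toy model assumption).
true_means = np.array([-2.0, 3.0])
n = 500
z_true = rng.integers(0, 2, size=n)
x = rng.normal(true_means[z_true], 1.0)

def same_gibbs(x, n_iters=200, m_final=20, prior_var=100.0):
    """SAME Gibbs: keep m replicated copies of every latent assignment.
    Conditioning the parameter update on all m copies raises the
    parameter posterior to roughly its m-th power, i.e. cools it toward
    the MAP estimate.  m is annealed from 1 up to m_final."""
    means = np.array([-1.0, 1.0])  # parameter initialisation
    for it in range(n_iters):
        # Annealing schedule: grow the number of replicated sample groups.
        m = 1 + int((m_final - 1) * it / max(n_iters - 1, 1))

        # Sample m independent copies of each assignment z_i given the
        # current means (mixing weights are fixed at 0.5 each).
        logp = -0.5 * (x[:, None] - means[None, :]) ** 2      # shape (n, 2)
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        # counts[i, k] = how many of the m copies assigned x_i to component k.
        counts = np.stack([rng.multinomial(m, p_i) for p_i in p])

        # Parameter step: Gaussian posterior for each mean given ALL m
        # copies.  The effective sample size is about m times larger, so
        # the posterior is m times more concentrated -- the "cooling".
        for k in range(2):
            eff_n = counts[:, k].sum()
            eff_sum = (counts[:, k] * x).sum()
            post_var = 1.0 / (1.0 / prior_var + eff_n)
            post_mean = post_var * eff_sum
            means[k] = rng.normal(post_mean, np.sqrt(post_var))
    return means

print("estimated means:", np.sort(same_gibbs(x)))
```

Each of the m replicated assignment groups can be sampled independently given the current parameters, which is the source of the extra per-iteration parallelism the abstract attributes to SAME.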


Related research

05/08/2015
Dense Distributions from Sparse Samples: Improved Gibbs Sampling Parameter Estimators for LDA
We introduce a novel approach for estimating Latent Dirichlet Allocation...

09/08/2019
Evaluating Topic Quality with Posterior Variability
Probabilistic topic models such as latent Dirichlet allocation (LDA) are...

12/25/2017
On Statistical Optimality of Variational Bayes
The article addresses a long-standing open problem on the justification...

11/19/2015
Fast Parallel SAME Gibbs Sampling on General Discrete Bayesian Networks
A fundamental task in machine learning and related fields is to perform...

11/03/2021
Perturb-and-max-product: Sampling and learning in discrete energy-based models
Perturb-and-MAP offers an elegant approach to approximately sample from...

01/06/2016
Streaming Gibbs Sampling for LDA Model
Streaming variational Bayes (SVB) is successful in learning LDA models i...

10/09/2013
Discriminative Relational Topic Models
Many scientific and engineering fields involve analyzing network data. F...
