Online but Accurate Inference for Latent Variable Models with Local Gibbs Sampling

03/08/2016
by Christophe Dupuy et al.

We study parameter inference in large-scale latent variable models. We first propose a unified treatment of online inference for latent variable models from a non-canonical exponential family, and draw explicit links between several previously proposed frequentist and Bayesian methods. We then propose a novel inference method for the frequentist estimation of parameters that adapts MCMC methods to online inference of latent variable models through the proper use of local Gibbs sampling. For latent Dirichlet allocation, we provide an extensive set of experiments and comparisons with existing work, in which our new approach outperforms all previously proposed methods. In particular, using Gibbs sampling for latent variable inference is superior to variational inference in terms of test log-likelihood. Moreover, Bayesian inference through variational methods performs poorly, sometimes leading to worse fits with latent variables of higher dimensionality.
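The "local" step the abstract refers to can be illustrated on LDA: given fixed topic-word probabilities, the topic assignments of a single document are resampled by Gibbs sampling while the global parameters are updated elsewhere. The sketch below is a hypothetical minimal implementation of such a per-document Gibbs step (function name, arguments, and the fixed-`phi` assumption are illustrative, not the paper's exact algorithm):

```python
import numpy as np

def local_gibbs_lda(doc_words, phi, alpha, n_iters=50, seed=None):
    """Resample topic assignments for one document by Gibbs sampling,
    holding the topic-word probabilities phi (K x V) fixed.

    doc_words : list of word indices in the document
    phi       : K x V array, phi[k, w] = p(word w | topic k)
    alpha     : symmetric Dirichlet prior on topic proportions
    """
    rng = np.random.default_rng(seed)
    K = phi.shape[0]
    z = rng.integers(K, size=len(doc_words))          # initial assignments
    counts = np.bincount(z, minlength=K).astype(float)
    for _ in range(n_iters):
        for i, w in enumerate(doc_words):
            counts[z[i]] -= 1                          # remove word i
            # conditional: p(z_i = k | rest) ∝ (n_k + alpha) * phi[k, w]
            p = (counts + alpha) * phi[:, w]
            p /= p.sum()
            z[i] = rng.choice(K, p=p)
            counts[z[i]] += 1                          # put word i back
    return z, counts
```

In an online scheme, the topic counts returned for each document would then feed a stochastic update of the global topic-word statistics before the next document is processed.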


Related research

07/01/2013
Algorithms of the LDA model [REPORT]
We review three algorithms for Latent Dirichlet Allocation (LDA). Two of...

03/12/2020
Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning
Optimization problems with an auxiliary latent variable structure in add...

11/23/2017
Diversity-Promoting Bayesian Learning of Latent Variable Models
To address three important issues involved in latent variable models (LV...

12/12/2017
GibbsNet: Iterative Adversarial Inference for Deep Graphical Models
Directed latent variable models that formulate the joint distribution as...

11/25/2021
Variational Gibbs inference for statistical model estimation from incomplete data
Statistical models are central to machine learning with broad applicabil...

06/05/2018
Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference
We formalize the problem of learning interdomain correspondences in the ...

11/13/2020
Ultimate Pólya Gamma Samplers – Efficient MCMC for possibly imbalanced binary and categorical data
Modeling binary and categorical data is one of the most commonly encount...
