Sparse Stochastic Inference for Latent Dirichlet Allocation

06/27/2012
by David Mimno et al.

We present a hybrid algorithm for Bayesian topic models that combines the efficiency of sparse Gibbs sampling with the scalability of online stochastic inference. We used our algorithm to analyze a corpus of 1.2 million books (33 billion words) with thousands of topics. Our approach reduces the bias of variational inference and generalizes to many Bayesian hidden-variable models.
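The abstract describes a hybrid scheme: sample topic assignments for a minibatch of documents with sparse Gibbs sampling, then fold the resulting sufficient statistics into the global topic-word parameters with a stochastic (Robbins-Monro) step. A minimal sketch of that idea, assuming a toy synthetic corpus and simplified dense counts (the paper's actual implementation exploits sparsity and scales to billions of words; all sizes, hyperparameters, and the step-size schedule below are illustrative assumptions, not the authors' settings):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): tiny vocabulary and corpus.
V, K = 20, 3            # vocabulary size, number of topics
alpha, beta = 0.1, 0.01 # Dirichlet hyperparameters
D = 40                  # notional corpus size, used to rescale minibatch stats
docs = [rng.integers(0, V, size=int(rng.integers(5, 15))) for _ in range(8)]

# Global topic-word statistics, maintained online.
topic_word = np.ones((K, V))

def gibbs_sweep(doc, z, topic_word, alpha, beta):
    """One collapsed-Gibbs sweep over a document's topic assignments,
    holding the global topic-word statistics fixed."""
    K, V = topic_word.shape
    topic_totals = topic_word.sum(axis=1)
    doc_topic = np.bincount(z, minlength=K)
    for i, w in enumerate(doc):
        doc_topic[z[i]] -= 1
        # p(z_i = k | rest) proportional to
        #   (n_dk + alpha) * (lambda_kw + beta) / (lambda_k + V * beta)
        p = (doc_topic + alpha) * (topic_word[:, w] + beta) / (topic_totals + V * beta)
        z[i] = rng.choice(K, p=p / p.sum())
        doc_topic[z[i]] += 1
    return z

# Online loop: draw a minibatch, run Gibbs sweeps per document,
# then take a stochastic step on the global statistics.
B = 2
for t in range(1, 20):
    batch = [docs[i] for i in rng.choice(len(docs), size=B, replace=False)]
    batch_counts = np.zeros((K, V))
    for doc in batch:
        z = rng.integers(0, K, size=len(doc))
        for _ in range(5):  # short burn-in of Gibbs sweeps
            z = gibbs_sweep(doc, z, topic_word, alpha, beta)
        for w, k in zip(doc, z):
            batch_counts[k, w] += 1
    rho = (t + 10) ** -0.7  # decaying Robbins-Monro step size (assumed schedule)
    # Rescale minibatch counts to the full corpus, then interpolate.
    topic_word = (1 - rho) * topic_word + rho * (D / B) * batch_counts
```

The Gibbs step replaces the closed-form variational update over topic assignments, which is where the claimed bias reduction relative to standard variational inference comes from: the samples target the true local posterior rather than a factorized approximation.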


Related research:

06/29/2012: Stochastic Variational Inference
We develop stochastic variational inference, a scalable algorithm for ap...

05/01/2017: Stochastic Divergence Minimization for Biterm Topic Model
With the emergence and thriving development of social networks, a huge...

02/23/2017: Scalable Inference for Nested Chinese Restaurant Process Topic Models
Nested Chinese Restaurant Process (nCRP) topic models are powerful nonpa...

08/04/2014: Modulation Classification via Gibbs Sampling Based on a Latent Dirichlet Bayesian Network
A novel Bayesian modulation classification scheme is proposed for a sing...

05/09/2012: On Smoothing and Inference for Topic Models
Latent Dirichlet analysis, or topic modeling, is a flexible latent varia...

06/26/2015: An Empirical Study of Stochastic Variational Algorithms for the Beta Bernoulli Process
Stochastic variational inference (SVI) is emerging as the most promising...

03/31/2020: Exact marginal inference in Latent Dirichlet Allocation
Assume we have potential "causes" z∈ Z, which produce "events" w with kn...
