Polya Urn Latent Dirichlet Allocation: a doubly sparse massively parallel sampler

04/12/2017
by Alexander Terenin, et al.

Latent Dirichlet Allocation (LDA) is a topic model widely used in natural language processing and machine learning. Most approaches to training the model rely on iterative algorithms, which makes it difficult to run LDA on big data sets that are best analyzed in parallel and distributed computational environments. Indeed, current approaches to parallel inference either do not converge to the correct posterior or require storage of large dense matrices in memory. We present a novel sampler that overcomes both problems, and we show that this sampler is faster, both empirically and theoretically, than previous Gibbs samplers for LDA. We do so by employing a novel Pólya-urn-based approximation in the sparse partially collapsed sampler for LDA. We prove that the approximation error vanishes with data size, making our algorithm asymptotically exact, a property of importance for large-scale topic models. In addition, we show, via an explicit example, that -- contrary to popular belief in the topic modeling literature -- partially collapsed samplers can be more efficient than fully collapsed samplers. We conclude by comparing the performance of our algorithm with that of other approaches on well-known corpora.
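To make the sequential bottleneck concrete, the sketch below implements the classical fully collapsed Gibbs sampler for LDA that the abstract compares against: each token's topic depends on the global count matrices, which is what makes naive parallelization hard. This is a minimal illustration of the baseline, not the paper's Pólya-urn sampler; the function name and defaults are illustrative.

```python
import numpy as np

def collapsed_gibbs_lda(docs, V, K, alpha=0.1, beta=0.01, iters=50, seed=0):
    """Fully collapsed Gibbs sampler for LDA (classical baseline).
    docs: list of documents, each a list of word ids in [0, V)."""
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), K))  # document-topic counts
    nkw = np.zeros((K, V))          # topic-word counts
    nk = np.zeros(K)                # topic totals
    z = []                          # topic assignment per token
    for d, doc in enumerate(docs):  # random initialization
        zs = rng.integers(K, size=len(doc))
        z.append(zs)
        for w, k in zip(doc, zs):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # remove the token's current assignment from the counts
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # full conditional: p(z=k) ∝ (n_dk+α)(n_kw+β)/(n_k+Vβ)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```

Because every update reads and writes the shared `nkw` and `nk` counts, tokens cannot be sampled independently; partially collapsed samplers relax exactly this dependence to enable parallelism.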

Related research:

06/11/2015: Sparse Partially Collapsed MCMC for Parallel Inference in Topic Models
Topic models, and more specifically the class of Latent Dirichlet Alloca...

06/06/2019: Sparse Parallel Training of Hierarchical Dirichlet Process Topic Models
Nonparametric extensions of topic models such as Latent Dirichlet Alloca...

11/17/2013: Towards Big Topic Modeling
To solve the big topic modeling problem, we need to reduce both time and...

09/02/2013: Scalable Probabilistic Entity-Topic Modeling
We present an LDA approach to entity disambiguation. Each topic is assoc...

06/13/2019: Topic Modeling via Full Dependence Mixtures
We consider the topic modeling problem for large datasets. For this prob...

10/29/2015: WarpLDA: a Cache Efficient O(1) Algorithm for Latent Dirichlet Allocation
Developing efficient and scalable algorithms for Latent Dirichlet Alloca...

05/24/2016: Computing Web-scale Topic Models using an Asynchronous Parameter Server
Topic models such as Latent Dirichlet Allocation (LDA) have been widely ...
