Expectation-Propagation for the Generative Aspect Model

12/12/2012
by Thomas P. Minka, et al.

The generative aspect model is an extension of the multinomial model for text that allows word probabilities to vary stochastically across documents. Previous results with aspect models have been promising, but hindered by the computational difficulty of carrying out inference and learning. This paper demonstrates that the simple variational methods of Blei et al. (2001) can lead to inaccurate inferences and biased learning for the generative aspect model. We develop an alternative approach that leads to higher accuracy at comparable cost. An extension of Expectation-Propagation is used for inference and then embedded in an EM algorithm for learning. Experimental results are presented for both synthetic and real data sets.
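For orientation, below is a minimal sketch of the generative process the abstract refers to, assuming the usual formulation in which each document draws aspect mixing weights from a Dirichlet prior and each word's probability is a convex combination of aspect-specific multinomials. The names (alpha, phi, lam) and the toy numbers are illustrative assumptions, not taken from the paper, and this sketch covers only sampling, not the EP inference or EM learning the paper develops.

    import numpy as np

    def sample_document(alpha, phi, n_words, rng=None):
        """Draw one document from a generative aspect model.

        alpha : Dirichlet parameters over aspects, shape (K,)
        phi   : per-aspect word distributions, shape (K, V)
        """
        rng = np.random.default_rng() if rng is None else rng
        # Document-specific mixture weights over aspects.
        lam = rng.dirichlet(alpha)
        # Word probabilities are a mixture of the aspect distributions,
        # so they vary stochastically from document to document.
        word_probs = lam @ phi
        return rng.choice(len(word_probs), size=n_words, p=word_probs)

    # Toy example: 3 aspects over a 5-word vocabulary.
    alpha = np.array([0.5, 0.5, 0.5])
    phi = np.array([
        [0.6, 0.1, 0.1, 0.1, 0.1],
        [0.1, 0.6, 0.1, 0.1, 0.1],
        [0.1, 0.1, 0.1, 0.1, 0.6],
    ])
    doc = sample_document(alpha, phi, n_words=20)
    print(doc)

Under this formulation, inference amounts to computing the posterior over the mixture weights lam given a document's words, which is the step the paper approximates with Expectation-Propagation rather than the variational bound of Blei et al. (2001).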
