Evaluating Topic Quality with Posterior Variability

09/08/2019
by Linzi Xing, et al.

Probabilistic topic models such as latent Dirichlet allocation (LDA) are commonly used with Bayesian inference methods such as Gibbs sampling to learn posterior distributions over topic model parameters. We derive a novel measure of LDA topic quality using the variability of the posterior distributions. Compared to several existing baselines for automatic topic evaluation, the proposed metric achieves state-of-the-art correlations with human judgments of topic quality in experiments on three corpora. We additionally demonstrate that topic quality estimation can be further improved using a supervised estimator that combines multiple metrics.
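The abstract leaves the variability metric unspecified, so the following Python sketch is only an illustration of the general idea, not the paper's estimator: it scores a topic from retained Gibbs samples of its word distribution, using the negated mean coefficient of variation of the top words across samples, so that a more stable posterior yields a higher score. The function name, the top-n cutoff, and the choice of coefficient of variation are all assumptions made for illustration.

```python
import numpy as np

def posterior_variability_score(phi_samples, top_n=10):
    """Score one topic's quality from the variability of its posterior.

    phi_samples: (S, V) array holding S posterior (Gibbs) samples of a
    single topic's distribution over a V-word vocabulary.

    Illustrative proxy, not the paper's exact metric: topics whose
    top-word probabilities fluctuate less across samples score higher.
    """
    phi_samples = np.asarray(phi_samples, dtype=float)
    mean = phi_samples.mean(axis=0)           # per-word posterior mean
    std = phi_samples.std(axis=0)             # per-word posterior spread
    top = np.argsort(mean)[::-1][:top_n]      # indices of the top-n words
    cv = std[top] / mean[top]                 # coefficient of variation
    return -cv.mean()                         # less variability -> higher score

# Toy check: Dirichlet draws stand in for retained Gibbs samples.
rng = np.random.default_rng(0)
stable = rng.dirichlet(np.full(50, 5.0), size=20)   # concentrated posterior
noisy = rng.dirichlet(np.full(50, 0.1), size=20)    # diffuse posterior
assert posterior_variability_score(stable) > posterior_variability_score(noisy)
```

A supervised estimator in the spirit of the abstract's last sentence could then treat this score as one feature among several (e.g., alongside a coherence measure such as NPMI) and fit a regressor against human topic-quality ratings.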


Related research

10/22/2015 · A 'Gibbs-Newton' Technique for Enhanced Inference of Multivariate Polya Parameters and Topic Models
Hyper-parameters play a major role in the learning and inference process...

04/10/2018 · Towards Training Probabilistic Topic Models on Neuromorphic Multi-chip Systems
Probabilistic topic models are popular unsupervised learning methods, in...

04/07/2016 · Combinatorial Topic Models using Small-Variance Asymptotics
Topic models have emerged as fundamental tools in unsupervised machine l...

12/10/2015 · Inference in topic models: sparsity and trade-off
Topic models are popular for modeling discrete data (e.g., texts, images...

12/10/2015 · Guaranteed inference in topic models
One of the core problems in statistical models is the estimation of a po...

09/18/2014 · SAME but Different: Fast and High-Quality Gibbs Parameter Estimation
Gibbs sampling is a workhorse for Bayesian inference but has several lim...

09/12/2023 · Evaluating Dynamic Topic Models
There is a lack of quantitative measures to evaluate the progression of ...
