Convergence Rates of Latent Topic Models Under Relaxed Identifiability Conditions

10/30/2017
by Yining Wang et al.

In this paper we study the frequentist convergence rate of maximum likelihood estimation for Latent Dirichlet Allocation (LDA; Blei et al., 2003) topic models. We show that the maximum likelihood estimator converges, in the Wasserstein distance, to one of the finitely many equivalent parameters at a rate of n^-1/4, without assuming separability or non-degeneracy of the underlying topics, or the existence of more than three words per document, thus generalizing the previous works of Anandkumar et al. (2012, 2014) from an information-theoretic perspective. We also show that the n^-1/4 convergence rate is optimal in the worst case.
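As a rough illustration of the kind of statement involved (the notation below, including G, \hat{G}_n and \mathcal{E}, is ours and not taken from the abstract): identify the topic parameters with a mixing measure

    G = \sum_{k=1}^{K} p_k \, \delta_{\varphi_k},

where \varphi_k is the k-th topic-word distribution and p_k its mixing weight. The first-order Wasserstein distance between two such measures is

    W_1(G, G') = \min_{q \in \Pi(p, p')} \sum_{j,k} q_{jk} \, \lVert \varphi_j - \varphi'_k \rVert_1,

where \Pi(p, p') denotes the set of couplings of the weight vectors. In this notation the main result reads, roughly,

    \min_{G_0 \in \mathcal{E}} W_1(\hat{G}_n, G_0) = O_P(n^{-1/4}),

where \hat{G}_n is the maximum likelihood estimator computed from n documents and \mathcal{E} is the finite set of parameters equivalent to the truth; the matching lower bound says that no estimator can improve on the n^{-1/4} exponent in the worst case.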


