The Exact Asymptotic Form of Bayesian Generalization Error in Latent Dirichlet Allocation

Latent Dirichlet allocation (LDA) extracts essential information from data via Bayesian inference, and is applied to knowledge discovery through dimensionality reduction and clustering in many fields. However, its generalization error had not yet been clarified, since LDA is a singular statistical model: there is no one-to-one map from parameters to probability distributions. In this paper, we give the exact asymptotic forms of its generalization error and marginal likelihood by theoretically analyzing its learning coefficient using algebraic geometry. The result shows that the Bayesian generalization error in LDA is expressed in terms of that in matrix factorization plus a penalty arising from the simplex restriction on LDA's parameter region.
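For context, the abstract's claim rests on singular learning theory, where the expected Bayesian generalization error \(G_n\) and free energy (negative log marginal likelihood) \(F_n\) admit well-known asymptotic expansions governed by the learning coefficient \(\lambda\) (the real log canonical threshold obtained by resolving the model's singularities) and its multiplicity \(m\). The following is a sketch of those standard expansions, not the paper's specific value of \(\lambda\) for LDA:

```latex
\mathbb{E}[G_n] = \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right),
\qquad
\mathbb{E}[F_n] = nS + \lambda \log n - (m-1)\log\log n + O(1),
```

where \(n\) is the sample size and \(S\) is the entropy of the true distribution. Determining \(\lambda\) for LDA is exactly what the algebraic-geometric analysis in the paper provides.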

09/13/2017