The Exact Asymptotic Form of Bayesian Generalization Error in Latent Dirichlet Allocation

by Naoki Hayashi, et al.

Latent Dirichlet allocation (LDA) extracts essential information from data by Bayesian inference, and it is applied to knowledge discovery via dimensionality reduction and clustering in many fields. However, its generalization error had not yet been clarified, since LDA is a singular statistical model: there is no one-to-one map from parameters to probability distributions. In this paper, we derive the exact asymptotic forms of its generalization error and marginal likelihood by theoretically analyzing its learning coefficient using algebraic geometry. The result shows that the Bayesian generalization error in LDA is expressed in terms of that in matrix factorization plus a penalty arising from the simplex restriction on LDA's parameter region.
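The connection to matrix factorization can be made concrete. In singular learning theory, the expected Bayesian generalization error behaves asymptotically as λ/n, where λ is the learning coefficient the paper analyzes. Structurally, LDA factorizes each document's word distribution as a product of two stochastic matrices whose rows lie on probability simplices — the "simplex restriction" mentioned above. A minimal sketch of that structure (dimensions and variable names here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# LDA viewed as stochastic matrix factorization (illustrative sketch).
rng = np.random.default_rng(0)
V, K, M = 50, 3, 10  # vocabulary size, number of topics, number of documents

# Topic-word matrix: each row is a distribution over the vocabulary.
B = rng.dirichlet(np.ones(V), size=K)      # shape (K, V), rows on the simplex

# Document-topic matrix: each row is a distribution over topics.
Theta = rng.dirichlet(np.ones(K), size=M)  # shape (M, K), rows on the simplex

# Per-document word distributions: a stochastic matrix factored into
# two stochastic matrices -- the parameterization whose singularities
# determine the learning coefficient.
P = Theta @ B                              # shape (M, V)

# The simplex restriction is preserved under the product:
assert np.allclose(P.sum(axis=1), 1.0)
```

Because many distinct pairs (Theta, B) yield the same product P, the parameter-to-distribution map is not one-to-one, which is exactly the singularity that makes the asymptotic analysis nontrivial.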

