
Consistency of ELBO maximization for model selection

The evidence lower bound (ELBO) plays a key role in variational inference, and it can also serve as a criterion for model selection. However, although ELBO-based selection is extremely popular in practice in the variational Bayes community, it has so far lacked a general theoretical justification. In this short paper, we show that the ELBO maximization strategy enjoys strong theoretical guarantees and is robust to model misspecification, whereas most existing works rely on the assumption that one of the candidate models is correctly specified. We illustrate our theoretical results with an application to selecting the number of principal components in probabilistic PCA.
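The selection rule studied in the paper, fit each candidate model by variational inference and keep the one with the largest optimized ELBO, can be illustrated on a toy conjugate example where everything is available in closed form. The sketch below is our own illustration, not code from the paper: it compares a fixed-mean Gaussian model against a Gaussian-mean model with a standard normal prior. Because the Gaussian variational family contains the exact posterior here, the optimized ELBO coincides with the log evidence, so the rule reduces to Bayesian model comparison.

```python
import numpy as np

def elbo_gaussian_mean(y, m, v):
    """ELBO for the model y_i ~ N(mu, 1), mu ~ N(0, 1),
    with variational approximation q(mu) = N(m, v)."""
    n = len(y)
    # E_q[log p(y | mu)]: Gaussian likelihood averaged over q
    exp_loglik = -0.5 * n * np.log(2 * np.pi) - 0.5 * (np.sum((y - m) ** 2) + n * v)
    # E_q[log p(mu)]: standard normal prior averaged over q
    exp_logprior = -0.5 * np.log(2 * np.pi) - 0.5 * (m ** 2 + v)
    # Entropy of the Gaussian q
    entropy = 0.5 * np.log(2 * np.pi * np.e * v)
    return exp_loglik + exp_logprior + entropy

def select_by_elbo(y):
    """Pick between model A (mu fixed at 0) and model B (mu ~ N(0,1)) by ELBO."""
    n = len(y)
    # Model A has no latent variable, so its "ELBO" is the exact log-likelihood.
    elbo_A = -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum(y ** 2)
    # Model B: the conjugate posterior is N(sum(y)/(n+1), 1/(n+1)); plugging it in
    # as q maximizes the ELBO, which then equals the log evidence.
    m_opt, v_opt = np.sum(y) / (n + 1), 1.0 / (n + 1)
    elbo_B = elbo_gaussian_mean(y, m_opt, v_opt)
    return ("A" if elbo_A >= elbo_B else "B"), elbo_A, elbo_B

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=50)  # data with a clearly nonzero mean
choice, elbo_A, elbo_B = select_by_elbo(y)
print(choice, elbo_A, elbo_B)  # ELBO maximization favors model B on this data
```

In this conjugate setting one can verify that the optimized ELBO for model B matches the closed-form log evidence, log p(y) = -(n/2) log 2π - (1/2) Σ y_i² + (Σ y_i)² / (2(n+1)) - (1/2) log(n+1). In non-conjugate models the optimized ELBO is only a lower bound on the evidence, which is exactly the regime the paper's consistency guarantees address.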


