Consistency of ELBO maximization for model selection

The Evidence Lower Bound (ELBO) plays a key role in variational inference, and it can also serve as a criterion for model selection. However, although the strategy is extremely popular in practice in the variational Bayes community, selecting a model by maximizing the ELBO has never received a general theoretical justification. In this short paper, we show that ELBO maximization enjoys strong theoretical guarantees and remains robust under model misspecification, whereas most existing works rely on the assumption that one of the candidate models is correctly specified. We illustrate our theoretical results with an application to selecting the number of principal components in probabilistic PCA.
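The selection rule studied in the paper is simply "fit each candidate model, then keep the one with the highest ELBO." The sketch below illustrates this rule on the paper's running example, choosing the number of components in probabilistic PCA. As a stand-in for a full variational Bayes fit (which is beyond a short sketch), it uses the BIC-penalized closed-form pPCA log-likelihood of Tipping & Bishop as the per-model score; like the ELBO, this approximates the log evidence and penalizes model complexity, but it is not the criterion analyzed in the paper.

```python
import numpy as np

def ppca_bic(X, k):
    """BIC-penalized profile log-likelihood of probabilistic PCA with k
    components (Tipping & Bishop closed form). Used here as a crude
    stand-in for the ELBO: both approximate the log evidence and
    implicitly penalize model complexity."""
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    evals = np.linalg.eigvalsh(Xc.T @ Xc / n)[::-1]  # eigenvalues, descending
    sigma2 = evals[k:].mean()  # ML noise variance: mean of discarded eigenvalues
    loglik = -0.5 * n * (d * np.log(2 * np.pi)
                         + np.log(evals[:k]).sum()
                         + (d - k) * np.log(sigma2)
                         + d)
    # free parameters: loadings (minus rotational symmetry), mean, noise variance
    p = d * k - k * (k - 1) // 2 + d + 1
    return loglik - 0.5 * p * np.log(n)

# simulate data with a 3-dimensional latent structure
rng = np.random.default_rng(0)
n, d, k_true = 2000, 10, 3
Z = rng.normal(size=(n, k_true))
W = 2.0 * rng.normal(size=(k_true, d))
X = Z @ W + 0.1 * rng.normal(size=(n, d))

# the selection rule: score every candidate k, keep the maximizer
scores = {k: ppca_bic(X, k) for k in range(1, d)}
k_hat = max(scores, key=scores.get)
print(k_hat)
```

With a genuine variational Bayes implementation, `ppca_bic` would be replaced by the optimized ELBO of the k-component model; the argmax step is unchanged.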


