
Consistency of Variational Bayes Inference for Estimation and Model Selection in Mixtures

Mixture models are widely used in Bayesian statistics and machine learning and have proved efficient in many fields, such as computational biology and natural language processing. Variational inference, a technique for approximating intractable posteriors via optimization, is extremely popular in practice when dealing with complex models such as mixtures. The contribution of this paper is two-fold. First, we study the concentration of variational approximations of posteriors, which remains an open problem for general mixtures, and we derive consistency and rates of convergence. Second, we tackle model selection for the number of components: we study the approach already used in practice, which consists in maximizing a numerical criterion (the ELBO), and we prove that this strategy indeed leads to strong oracle inequalities. We illustrate our theoretical results with applications to Gaussian and multinomial mixtures.
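The ELBO-maximization strategy studied in the paper can be sketched in a few lines: fit a variational approximation for each candidate number of components K, record the fitted ELBO, and keep the K with the largest value. The sketch below is a minimal illustration, not the authors' exact procedure; it assumes scikit-learn, whose `BayesianGaussianMixture.lower_bound_` attribute is the (per-sample average) ELBO at convergence. The helper `select_k_by_elbo` and the synthetic two-cluster data are hypothetical names for illustration.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

def select_k_by_elbo(X, k_max=5, seed=0):
    """Fit a variational Gaussian mixture for each candidate K and
    return the K whose fitted ELBO (lower bound) is largest."""
    elbos = {}
    for k in range(1, k_max + 1):
        vb = BayesianGaussianMixture(
            n_components=k, max_iter=500, random_state=seed
        ).fit(X)
        elbos[k] = vb.lower_bound_  # per-sample average ELBO at convergence
    best_k = max(elbos, key=elbos.get)
    return best_k, elbos

rng = np.random.default_rng(0)
# Synthetic data: two well-separated Gaussian clusters in 2D
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(6.0, 1.0, size=(200, 2))])
best_k, elbos = select_k_by_elbo(X, k_max=4)
```

Note that comparing ELBOs across K is the numerical criterion the paper analyzes; the oracle inequalities it proves are what justify treating the maximizer as a principled model-selection rule.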



Related research:
- Consistency of ELBO maximization for model selection
- Variational Inference and Sparsity in High-Dimensional Deep Gaussian Mixture Models
- Convergence Rates of Variational Inference in Sparse Deep Learning
- Frequentist Consistency of Variational Bayes
- A Variational Infinite Mixture for Probabilistic Inverse Dynamics Learning
- Fast Dual Variational Inference for Non-Conjugate LGMs