A Deterministic Global Optimization Method for Variational Inference

03/21/2017
by Hachem Saddiki, et al.

Variational inference methods for latent variable statistical models have gained popularity because they are relatively fast, can handle large data sets, and have deterministic convergence guarantees. However, in practice it is unclear whether the fixed point identified by the variational inference algorithm is a local or a global optimum. Here, we propose a method for constructing iterative optimization algorithms for variational inference problems that are guaranteed to converge to the ϵ-global variational lower bound on the log-likelihood. We derive inference algorithms for two variational approximations to a standard Bayesian Gaussian mixture model (BGMM). We present a minimal data set for empirically testing convergence and show that a variational inference algorithm frequently converges to a local optimum while our algorithm always converges to the globally optimal variational lower bound. We characterize the loss incurred by choosing a non-optimal variational approximation distribution, suggesting that the selection of the approximating variational distribution deserves as much attention as the selection of the original statistical model for a given data set.
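To make the local-optimum issue concrete, the sketch below (not the paper's algorithm) runs standard coordinate ascent variational inference (CAVI) on a toy Bayesian Gaussian mixture with unit-variance components and a N(0, sigma2) prior on the component means. The hyperparameters, the toy data, and the function name cavi_gmm are illustrative assumptions; the point is only that different random initializations can converge to fixed points with different ELBO values, which is the behavior the proposed deterministic global method is designed to avoid.

```python
# Minimal illustrative sketch: mean-field CAVI for a toy Bayesian Gaussian
# mixture (unit-variance components, N(0, sigma2) prior on means).
# This is NOT the paper's global optimization method; it only demonstrates
# how ordinary CAVI can settle at different local optima across restarts.
import numpy as np


def cavi_gmm(x, K, sigma2=10.0, iters=200, seed=0):
    """Run CAVI; return (ELBO up to an additive constant, means of q(mu), responsibilities)."""
    rng = np.random.default_rng(seed)
    N = x.shape[0]
    m = rng.normal(scale=1.0, size=K)      # variational means of q(mu_k)
    s2 = np.ones(K)                        # variational variances of q(mu_k)
    phi = np.full((N, K), 1.0 / K)         # responsibilities q(c_i = k)
    for _ in range(iters):
        # Update q(c_i): phi_ik proportional to exp(x_i m_k - (s2_k + m_k^2)/2)
        logits = np.outer(x, m) - 0.5 * (s2 + m ** 2)
        logits -= logits.max(axis=1, keepdims=True)
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)
        # Update q(mu_k): Gaussian with precision 1/sigma2 + sum_i phi_ik
        nk = phi.sum(axis=0)
        s2 = 1.0 / (1.0 / sigma2 + nk)
        m = s2 * (phi.T @ x)
    # ELBO up to an additive constant, sufficient for comparing fixed points
    elbo = (-0.5 * (s2 + m ** 2) / sigma2).sum()
    elbo += (phi * (np.outer(x, m) - 0.5 * (s2 + m ** 2))).sum()
    elbo -= (phi * np.log(phi + 1e-12)).sum()
    elbo += 0.5 * np.log(s2).sum()
    return elbo, m, phi


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Toy data: two well-separated clusters (illustrative, not the paper's data set)
    x = np.concatenate([rng.normal(-3, 1, 50), rng.normal(3, 1, 50)])
    elbos = [cavi_gmm(x, K=2, seed=s)[0] for s in range(10)]
    print("ELBO at converged fixed points:", np.round(elbos, 2))
    # Restarts that land at different ELBO values indicate local optima.
```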
