Boosting Variational Inference: an Optimization Perspective

08/05/2017
by Francesco Locatello, et al.

Variational Inference is a popular technique for approximating a possibly intractable Bayesian posterior with a more tractable one. Recently, Boosting Variational Inference has been proposed as a new paradigm that approximates the posterior with a mixture of densities, built by greedily adding components to the mixture. In the present work, we study the convergence properties of this approach from a modern optimization viewpoint by establishing connections to the classic Frank-Wolfe algorithm. Our analysis yields novel theoretical insights on Boosting Variational Inference regarding sufficient conditions for convergence, explicit sublinear/linear rates, and algorithmic simplifications.
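To make the Frank-Wolfe view concrete, here is a minimal numerical sketch of boosting-style greedy mixture building: at each step a linear minimization oracle (LMO) picks the new component most negatively correlated with the gradient of the KL divergence, and the classic step size 2/(t+2), the one behind the sublinear O(1/t) rate, blends it into the current mixture. This is a toy, grid-based illustration under assumptions of our own (a 1-D bimodal target and a fixed dictionary of Gaussian atoms), not the paper's algorithm or implementation.

```python
# Illustrative sketch of boosting variational inference as Frank-Wolfe:
# grow the variational mixture one greedily chosen component at a time.
import numpy as np

x = np.linspace(-8.0, 8.0, 2001)          # discretization grid
dx = x[1] - x[0]

def gaussian(mu, sigma):
    g = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return g / (g.sum() * dx)             # normalize on the grid

# "Intractable" posterior: a bimodal mixture we only evaluate pointwise.
p = 0.4 * gaussian(-2.0, 0.7) + 0.6 * gaussian(2.5, 1.0)

# Candidate atoms for the LMO: a small dictionary of Gaussians
# over a grid of means and scales (an illustrative assumption).
atoms = [gaussian(m, s) for m in np.linspace(-6, 6, 49)
                        for s in (0.5, 1.0, 2.0)]

q = gaussian(0.0, 3.0)                    # initial broad guess
eps = 1e-12
for t in range(30):
    # Gradient of KL(q || p) with respect to q, on the grid.
    grad = np.log((q + eps) / (p + eps)) + 1.0
    # LMO: atom minimizing the linearized objective <s, grad>.
    s = min(atoms, key=lambda a: np.sum(a * grad) * dx)
    gamma = 2.0 / (t + 2.0)               # classic Frank-Wolfe step size
    q = (1.0 - gamma) * q + gamma * s     # convex mixture update

kl = np.sum(q * np.log((q + eps) / (p + eps))) * dx
print(f"KL(q || p) after 30 boosting steps: {kl:.4f}")
```

Because each update is a convex combination of normalized densities, the iterate stays a valid (and increasingly rich) mixture; the fixed 2/(t+2) schedule is what the standard sublinear Frank-Wolfe analysis relies on.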


Related research

06/06/2018 · Boosting Black Box Variational Inference
Approximating a probability density in a tractable manner is a central t...

05/19/2021 · Boosting Variational Inference With Locally Adaptive Step-Sizes
Variational Inference makes a trade-off between the capacity of the vari...

10/19/2020 · Statistical Guarantees and Algorithmic Convergence Issues of Variational Boosting
We provide statistical guarantees for Bayesian variational boosting by p...

11/20/2016 · Variational Boosting: Iteratively Refining Posterior Approximations
We propose a black-box variational inference method to approximate intra...

09/10/2012 · A Bayesian Boosting Model
We offer a novel view of AdaBoost in a statistical setting. We propose a...

10/26/2021 · Relay Variational Inference: A Method for Accelerated Encoderless VI
Variational Inference (VI) offers a method for approximating intractable...

06/04/2019 · Universal Boosting Variational Inference
Boosting variational inference (BVI) approximates an intractable probabi...
