Variational Boosting: Iteratively Refining Posterior Approximations

11/20/2016
by Andrew C. Miller, et al.

We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, termed variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing the practitioner to trade computation time for accuracy. We show how to expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that the resulting posterior inferences compare favorably to existing posterior approximation algorithms in both accuracy and efficiency.
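
To make the iterative scheme concrete, here is a minimal sketch of the boosting idea: start from a single Gaussian and greedily add mixture components, fitting each new component and its mixing weight by maximizing a Monte Carlo estimate of the ELBO while earlier components stay fixed. This is not the authors' implementation; the banana-shaped target, diagonal-Gaussian components, and derivative-free Nelder-Mead optimizer are illustrative assumptions (the paper additionally expands covariance structure and works with gradient-based black-box updates).

```python
# Sketch of variational boosting: greedily grow a mixture-of-Gaussians
# approximation to an unnormalized target density. Illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def log_target(x):
    # Unnormalized log-density of a banana-shaped 2-D target (assumed example).
    return -0.5 * (x[..., 0]**2 / 4.0
                   + (x[..., 1] + 0.5 * x[..., 0]**2 - 2.0)**2)

def mixture_logpdf(x, comps, weights):
    # Log-density of a mixture of diagonal Gaussians; comps holds
    # (mean, log_std) pairs and weights sum to one.
    parts = []
    for (m, ls), w in zip(comps, weights):
        lp = -0.5 * np.sum(((x - m) / np.exp(ls))**2
                           + 2 * ls + np.log(2 * np.pi), axis=-1)
        parts.append(np.log(w) + lp)
    return logsumexp(np.stack(parts, axis=-1), axis=-1)

def add_component(comps, weights, dim=2, n_samples=400, seed=0):
    # One boosting step: optimize a new component (mean, log_std) and its
    # weight with existing components frozen. The mixture ELBO decomposes as
    # sum_c w_c * E_{q_c}[log p(x) - log q(x)], so each component expectation
    # is estimated with fixed base noise (common random numbers), giving a
    # smooth objective for the derivative-free optimizer.
    eps = np.random.default_rng(seed).standard_normal((n_samples, dim))

    def neg_elbo(params):
        m, ls, rho = params[:dim], params[dim:2 * dim], params[-1]
        w_new = 1.0 / (1.0 + np.exp(-rho))  # sigmoid keeps weight in (0, 1)
        new_comps = comps + [(m, ls)]
        new_weights = [w * (1.0 - w_new) for w in weights] + [w_new]
        total = 0.0
        for (mc, lsc), wc in zip(new_comps, new_weights):
            x = mc + np.exp(lsc) * eps  # reparameterized samples from q_c
            total += wc * np.mean(log_target(x)
                                  - mixture_logpdf(x, new_comps, new_weights))
        return -total

    rng = np.random.default_rng(seed + 1)
    x0 = np.concatenate([rng.standard_normal(dim), np.zeros(dim), [0.0]])
    res = minimize(neg_elbo, x0, method="Nelder-Mead",
                   options={"maxiter": 3000, "xatol": 1e-5, "fatol": 1e-8})
    m, ls, rho = res.x[:dim], res.x[dim:2 * dim], res.x[-1]
    w_new = 1.0 / (1.0 + np.exp(-rho))
    return (comps + [(m, ls)],
            [w * (1.0 - w_new) for w in weights] + [w_new])

# Boost: begin with a standard Gaussian, then add components one at a time,
# trading extra computation for a richer posterior approximation.
comps, weights = [(np.zeros(2), np.zeros(2))], [1.0]
for k in range(3):
    comps, weights = add_component(comps, weights, seed=k)
    print(f"{len(comps)} components, weights = {np.round(weights, 3)}")
```

Each call to add_component is one of the sequence of optimization problems the abstract describes: the approximating family grows by one component, and earlier fits are reused rather than discarded.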

Related Research

Boosting Variational Inference (11/17/2016)
Variational inference (VI) provides fast approximations of a Bayesian po...

Boosting Variational Inference: an Optimization Perspective (08/05/2017)
Variational Inference is a popular technique to approximate a possibly i...

Universal Boosting Variational Inference (06/04/2019)
Boosting variational inference (BVI) approximates an intractable probabi...

Boosting Variational Inference With Locally Adaptive Step-Sizes (05/19/2021)
Variational Inference makes a trade-off between the capacity of the vari...

A Bagging and Boosting Based Convexly Combined Optimum Mixture Probabilistic Model (06/08/2021)
Unlike previous studies on mixture distributions, a bagging and boosting...

Boosting Black Box Variational Inference (06/06/2018)
Approximating a probability density in a tractable manner is a central t...

Asymptotic Consistency of α-Rényi-Approximate Posteriors (02/05/2019)
In this work, we study consistency properties of α-Rényi approximate pos...
