Boosting Black Box Variational Inference

06/06/2018
by Francesco Locatello, et al.

Approximating a probability density in a tractable manner is a central task in Bayesian statistics. Variational Inference (VI) is a popular technique that achieves tractability by choosing a relatively simple variational family. Borrowing ideas from the classic boosting framework, recent approaches attempt to boost VI by replacing the selection of a single density with a greedily constructed mixture of densities. In order to guarantee convergence, previous works impose stringent assumptions that require significant effort for practitioners. Specifically, they require a custom implementation of the greedy step (called the LMO) for every probabilistic model with respect to an unnatural variational family of truncated distributions. Our work fixes these issues with novel theoretical and algorithmic insights. On the theoretical side, we show that boosting VI satisfies a relaxed smoothness assumption which is sufficient for the convergence of the functional Frank-Wolfe (FW) algorithm. Furthermore, we rephrase the LMO problem and propose to maximize the Residual ELBO (RELBO) which replaces the standard ELBO optimization in VI. These theoretical enhancements allow for black box implementation of the boosting subroutine. Finally, we present a stopping criterion drawn from the duality gap in the classic FW analyses and exhaustive experiments to illustrate the usefulness of our theoretical and algorithmic contributions.
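To make the greedy boosting loop concrete, here is a minimal, self-contained sketch of the functional Frank-Wolfe scheme the abstract describes: at each round a greedy step (an LMO surrogate) picks the single density best aligned with the current residual, the mixture is updated with the classic FW step size 2/(t+2), and a duality-gap quantity serves as the stopping criterion. This is not the authors' implementation: it uses a toy 1D target on a grid, a fixed dictionary of Gaussian components, and a simple L2 residual-matching objective in place of the KL-based RELBO; all names (`boosted_vi`, `gauss`) and parameters are illustrative assumptions.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Normalized Gaussian density evaluated pointwise."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def boosted_vi(n_iters=10, tol=1e-3):
    """Greedy mixture construction via functional Frank-Wolfe (sketch).

    Minimizes the surrogate objective D(q) = 0.5 * ||q - p||^2 on a grid,
    standing in for the KL divergence / RELBO used in the paper.
    """
    # Toy bimodal target: 0.5 N(-2, 0.5^2) + 0.5 N(2, 0.5^2), tabulated on a grid.
    grid = np.linspace(-6.0, 6.0, 1201)
    dx = grid[1] - grid[0]
    inner = lambda f, g: float(np.sum(f * g) * dx)  # discretized L2 inner product
    p = 0.5 * gauss(grid, -2.0, 0.5) + 0.5 * gauss(grid, 2.0, 0.5)

    q = np.zeros_like(grid)                      # current mixture approximation
    candidate_means = np.linspace(-5.0, 5.0, 41) # fixed component dictionary (assumption)
    sigma = 0.5                                  # fixed component width (assumption)

    for t in range(n_iters):
        residual = p - q  # negative functional gradient of D at q
        # Greedy step (LMO surrogate): the component most aligned with the residual.
        scores = [inner(residual, gauss(grid, mu, sigma)) for mu in candidate_means]
        s = gauss(grid, candidate_means[int(np.argmax(scores))], sigma)

        # FW duality gap: <s - q, p - q> >= D(q) - D(q*); small gap => near-optimal.
        gap = inner(s - q, residual)
        if t > 0 and gap < tol:
            break

        gamma = 1.0 if t == 0 else 2.0 / (t + 2.0)  # classic Frank-Wolfe step size
        q = (1.0 - gamma) * q + gamma * s           # convex mixture update
    return grid, p, q
```

Because every update is a convex combination of normalized densities, `q` remains a valid (grid-discretized) density at every iteration; on this toy target the greedy step alternates between the two modes and the mixture weights converge toward 1/2 each at the usual O(1/t) Frank-Wolfe rate.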


