Black-Box Variational Inference Converges

05/24/2023
by Kyurae Kim et al.

We provide the first convergence guarantee for full black-box variational inference (BBVI), also known as Monte Carlo variational inference. While preliminary investigations worked on simplified versions of BBVI (e.g., bounded domain, bounded support, optimizing only the scale, and so on), our setup does not require any such algorithmic modifications. Our results hold for log-smooth posterior densities, with and without strong log-concavity, and for the location-scale variational family. Our analysis also reveals that certain algorithm design choices commonly employed in practice, in particular nonlinear parameterizations of the scale of the variational approximation, can result in suboptimal convergence rates. Fortunately, running BBVI with proximal stochastic gradient descent removes these limitations and thus achieves the strongest known convergence rate guarantees. We evaluate this theoretical insight by comparing proximal SGD against other standard implementations of BBVI on large-scale Bayesian inference problems.
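As a concrete illustration of the design choice discussed above, here is a minimal, self-contained Python sketch (not the paper's implementation) contrasting two ways of handling the scale of a Gaussian location-scale variational approximation on a toy one-dimensional target: plain SGD ascent on the ELBO with a nonlinear (softplus) parameterization of the scale, versus proximal SGD in which the scale remains a direct parameter and the entropy term is handled through the closed-form proximal operator of -log s. The toy target, step sizes, sample counts, and helper names (grad_log_p, energy_grads, prox_neg_log) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a 1-D Gaussian N(2.0, 0.5^2), so its score is available in closed form.
MU_TRUE, SIGMA_TRUE = 2.0, 0.5


def grad_log_p(z):
    """Score (gradient of the log density) of the illustrative toy target."""
    return -(z - MU_TRUE) / SIGMA_TRUE**2


def energy_grads(m, s, n_samples=32):
    """Reparameterization-trick gradients of E_q[log p(z)] for q = N(m, s^2)."""
    eps = rng.standard_normal(n_samples)
    g = grad_log_p(m + s * eps)
    return g.mean(), (g * eps).mean()  # d/dm and d/ds of the energy term


def softplus(x):
    return np.log1p(np.exp(x))


def prox_neg_log(x, gamma):
    """Closed-form proximal operator of gamma * (-log s); its output is always > 0."""
    return 0.5 * (x + np.sqrt(x**2 + 4.0 * gamma))


def bbvi_sgd_softplus(steps=2000, lr=1e-2):
    """Plain SGD ascent with a nonlinear (softplus) parameterization of the scale."""
    m, rho = 0.0, 0.0                          # scale is s = softplus(rho)
    for _ in range(steps):
        s = softplus(rho)
        gm, gs_energy = energy_grads(m, s)
        gs = gs_energy + 1.0 / s               # add d/ds of the Gaussian entropy (log s + const)
        m += lr * gm
        rho += lr * gs / (1.0 + np.exp(-rho))  # chain rule: ds/drho = sigmoid(rho)
    return m, softplus(rho)


def bbvi_prox_sgd(steps=2000, lr=1e-2):
    """Proximal SGD: the scale is optimized directly; the entropy term is handled by
    its proximal operator rather than being differentiated stochastically."""
    m, s = 0.0, 1.0
    for _ in range(steps):
        gm, gs_energy = energy_grads(m, s)
        m += lr * gm
        s = prox_neg_log(s + lr * gs_energy, lr)  # gradient step on the energy, then prox
    return m, s


if __name__ == "__main__":
    print("SGD with softplus scale:", bbvi_sgd_softplus())
    print("proximal SGD           :", bbvi_prox_sgd())
```

On this toy problem both variants recover the target's location and scale; the sketch only makes explicit where the nonlinear parameterization of the scale enters through the chain rule and where the proximal step takes its place.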

Related research

06/04/2023  Provable convergence guarantees for black-box variational inference
While black-box variational inference is widely used, there is no proof ...

03/18/2023  Practical and Matching Gradient Variance Bounds for Black-Box Variational Bayesian Inference
Understanding the gradient variance of black-box variational inference (...

07/27/2023  Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?
We prove that black-box variational inference (BBVI) with control variat...

01/24/2019  Provable Smoothness Guarantees for Black-Box Variational Inference
Black-box variational inference tries to approximate a complex target di...

11/09/2022  Regularized Rényi divergence minimization through Bregman proximal gradient algorithms
We study the variational inference problem of minimizing a regularized R...

06/06/2018  Boosting Black Box Variational Inference
Approximating a probability density in a tractable manner is a central t...

10/31/2015  Faster Stochastic Variational Inference using Proximal-Gradient Methods with General Divergence Functions
Several recent works have explored stochastic gradient methods for varia...
