Provable Gradient Variance Guarantees for Black-Box Variational Inference

06/19/2019
by Justin Domke, et al.

Recent variational inference methods use stochastic gradient estimators whose variance is not well understood. Theoretical guarantees on this variance are important for understanding when these methods will or will not work. This paper gives bounds for the common "reparameterization" estimators when the target is smooth and the variational family is a location-scale distribution. These bounds are unimprovable and thus provide the best possible guarantees under the stated assumptions.
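As a concrete illustration, below is a minimal sketch of the kind of estimator the paper analyzes: a reparameterization gradient for a Gaussian (location-scale) variational family under a smooth target. The target log-density `log_p`, the sample count, and the parameter values are illustrative assumptions, not the paper's setup.

```python
import jax
import jax.numpy as jnp

def log_p(z):
    # Hypothetical smooth target: an unnormalized standard-Gaussian log-density.
    return -0.5 * jnp.sum(z ** 2)

def elbo_estimate(params, eps):
    m, log_s = params                    # location and log-scale of q
    z = m + jnp.exp(log_s) * eps         # location-scale transform: z = m + s * eps
    # Monte Carlo estimate of E_q[log p(z)] plus the Gaussian entropy of q
    # (up to an additive constant that does not affect gradients).
    return jnp.mean(jax.vmap(log_p)(z)) + jnp.sum(log_s)

key = jax.random.PRNGKey(0)
d, n_samples = 2, 8
eps = jax.random.normal(key, (n_samples, d))  # base randomness, eps ~ N(0, I)
params = (jnp.zeros(d), jnp.zeros(d))
# The reparameterization gradient: differentiate through the transform.
grad_m, grad_log_s = jax.grad(elbo_estimate)(params, eps)
print(grad_m, grad_log_s)
```

The paper's variance guarantees concern exactly the randomness that `eps` injects into this gradient estimate.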

Related research

06/04/2023 · Provable convergence guarantees for black-box variational inference
While black-box variational inference is widely used, there is no proof ...

01/24/2019 · Provable Smoothness Guarantees for Black-Box Variational Inference
Black-box variational inference tries to approximate a complex target di...

03/31/2019 · Perturbative estimation of stochastic gradients
In this paper we introduce a family of stochastic gradient estimation te...

07/27/2023 · Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?
We prove that black-box variational inference (BBVI) with control variat...

06/28/2016 · Automatic Variational ABC
Approximate Bayesian Computation (ABC) is a framework for performing lik...

10/12/2022 · Alpha-divergence Variational Inference Meets Importance Weighted Auto-Encoders: Methodology and Asymptotics
Several algorithms involving the Variational Rényi (VR) bound have been ...

03/18/2023 · Practical and Matching Gradient Variance Bounds for Black-Box Variational Bayesian Inference
Understanding the gradient variance of black-box variational inference (...
