Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference

03/27/2017
by Geoffrey Roeder, et al.

We propose a simple and general variant of the standard reparameterized gradient estimator for the variational evidence lower bound. Specifically, we remove a part of the total derivative with respect to the variational parameters that corresponds to the score function. Removing this term produces an unbiased gradient estimator whose variance approaches zero as the approximate posterior approaches the exact posterior. We analyze the behavior of this gradient estimator theoretically and empirically, and generalize it to more complex variational distributions such as mixtures and importance-weighted posteriors.
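In code, the proposed modification amounts to stopping the gradient through the variational parameters inside the log q term of the Monte Carlo ELBO, so that only the path (reparameterization) derivative remains. Below is a minimal JAX sketch of that idea for a diagonal-Gaussian approximate posterior; the target log_joint, the function names, and the parameterization are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the "sticking the landing" (path-derivative) gradient,
# assuming a diagonal-Gaussian approximate posterior and a user-supplied
# log_joint(z) = log p(x, z). All names here are illustrative.
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm


def log_joint(z):
    # Stand-in target: an isotropic Gaussian "posterior" for demonstration.
    return jnp.sum(norm.logpdf(z, loc=1.5, scale=0.5))


def sample_q(params, eps):
    # Reparameterized sample z = mu + sigma * eps.
    mu, log_sigma = params
    return mu + jnp.exp(log_sigma) * eps


def log_q(params, z):
    mu, log_sigma = params
    return jnp.sum(norm.logpdf(z, loc=mu, scale=jnp.exp(log_sigma)))


def elbo_standard(params, eps):
    # Standard reparameterized estimator: the total derivative w.r.t. params
    # includes the score-function term of log q.
    z = sample_q(params, eps)
    return log_joint(z) - log_q(params, z)


def elbo_stl(params, eps):
    # "Sticking the landing": stop the gradient through the variational
    # parameters inside log q. This drops the score-function term (which has
    # zero expectation) without changing the estimator's value, so the
    # gradient stays unbiased and its variance vanishes as q approaches p.
    z = sample_q(params, eps)
    return log_joint(z) - log_q(jax.lax.stop_gradient(params), z)


if __name__ == "__main__":
    key = jax.random.PRNGKey(0)
    params = (jnp.zeros(2), jnp.zeros(2))  # (mu, log_sigma)
    eps = jax.random.normal(key, (2,))
    print("standard grad:", jax.grad(elbo_standard)(params, eps))
    print("STL grad:     ", jax.grad(elbo_stl)(params, eps))
```

If q matches the exact posterior, the two terms inside elbo_stl cancel pointwise in z, so the STL gradient of a single sample is exactly zero, which is the zero-variance behavior described in the abstract; the standard estimator's gradient generally is not.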

Related research:

VarGrad: A Low-Variance Gradient Estimator for Variational Inference (10/20/2020)
We analyse the properties of an unbiased gradient estimator of the ELBO ...

Generalized Doubly Reparameterized Gradient Estimators (01/26/2021)
Efficient low-variance gradient estimation enabled by the reparameteriza...

U-Statistics for Importance-Weighted Variational Inference (02/27/2023)
We propose the use of U-statistics to reduce variance for gradient estim...

Variational Determinant Estimation with Spherical Normalizing Flows (12/24/2020)
This paper introduces the Variational Determinant Estimator (VDE), a var...

Path-Gradient Estimators for Continuous Normalizing Flows (06/17/2022)
Recent work has established a path-gradient estimator for simple variati...

Quantized Variational Inference (11/04/2020)
We present Quantized Variational Inference, a new algorithm for Evidence...

Automatic Differentiation Variational Inference with Mixtures (03/03/2020)
Automatic Differentiation Variational Inference (ADVI) is a useful tool ...
