Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows

07/17/2022
by Lorenz Vaitl, et al.

We propose an algorithm to estimate the path-gradient of both the reverse and forward Kullback-Leibler divergence for an arbitrary manifestly invertible normalizing flow. The resulting path-gradient estimators are straightforward to implement, have lower variance, and lead not only to faster convergence of training but also to better overall approximation results compared to standard total gradient estimators. We also demonstrate that path-gradient training is less susceptible to mode-collapse. In light of our results, we expect that path-gradient estimators will become the new standard method to train normalizing flows for variational inference.
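The idea behind path-gradient estimators is to drop the score term of the total gradient: when evaluated at a reparameterized sample, the score has zero expectation but adds variance, so the estimator keeps only the gradient flowing through the sample path. The sketch below illustrates this for the reverse KL in PyTorch; it is a minimal illustration in the spirit of the paper, not the authors' implementation, and `AffineFlow`, `push_forward`, and `log_p` are hypothetical stand-ins for an arbitrary invertible flow and target density.

```python
import torch
import torch.nn as nn
from torch.func import functional_call

class AffineFlow(nn.Module):
    """Toy invertible flow z = mu + exp(log_sigma) * eps; a hypothetical
    stand-in for an arbitrary manifestly invertible normalizing flow."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_sigma = nn.Parameter(torch.zeros(dim))

    def push_forward(self, eps):
        # Reparameterized sample: gradients w.r.t. theta flow through here.
        return self.mu + torch.exp(self.log_sigma) * eps

    def forward(self, z):
        # log q_theta(z) via the inverse pass and the change of variables.
        eps = (z - self.mu) * torch.exp(-self.log_sigma)
        base = torch.distributions.Normal(0.0, 1.0)
        return base.log_prob(eps).sum(-1) - self.log_sigma.sum()

def path_gradient_reverse_kl_loss(flow, log_p, batch_size):
    """Surrogate loss whose gradient is a path-gradient estimator of
    D_KL(q_theta || p): the score term is removed by detaching the flow
    parameters in the density evaluation while keeping the dependence
    on theta through the sample z itself."""
    eps = torch.randn(batch_size, flow.mu.numel())
    z = flow.push_forward(eps)  # gradient flows only via the sample path
    detached = {k: v.detach() for k, v in flow.named_parameters()}
    log_q = functional_call(flow, detached, (z,))  # log q_{sg(theta)}(z)
    return (log_q - log_p(z)).mean()
```

Because the score term is removed per sample rather than merely in expectation, the resulting gradient vanishes exactly at the optimum q_theta = p, which is the intuition behind the lower variance near convergence; the forward-KL estimator proposed in the paper is constructed analogously through the inverse pass of the flow.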


Related research
05/29/2018

Forward Amortized Inference for Likelihood-Free Variational Marginalization

In this paper, we introduce a new form of amortized variational inferenc...
09/21/2017

Perturbative Black Box Variational Inference

Black box variational inference (BBVI) with reparameterization gradients...
06/17/2022

Path-Gradient Estimators for Continuous Normalizing Flows

Recent work has established a path-gradient estimator for simple variati...
02/03/2022

Transport Score Climbing: Variational Inference Using Forward KL and Adaptive Neural Transport

Variational inference often minimizes the "reverse" Kullback-Leibler (KL...
01/17/2022

Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models

Diffusion probabilistic models (DPMs) represent a class of powerful gene...
05/11/2019

Hessian Transport Gradient Flows

We derive new gradient flows of divergence functions in the probability ...
02/04/2021

Variational Inference for Deblending Crowded Starfields

In the image data collected by astronomical surveys, stars and galaxies ...
