KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support

06/16/2021
by Pierre Glaser et al.

We study the gradient flow for a relaxed approximation to the Kullback-Leibler (KL) divergence between a moving source and a fixed target distribution. This approximation, termed the KALE (KL approximate lower-bound estimator), solves a regularized version of the Fenchel dual problem defining the KL over a restricted class of functions. When using a Reproducing Kernel Hilbert Space (RKHS) to define the function class, we show that the KALE continuously interpolates between the KL and the Maximum Mean Discrepancy (MMD). Like the MMD and other Integral Probability Metrics, the KALE remains well defined for mutually singular distributions. Nonetheless, the KALE inherits from the limiting KL a greater sensitivity to mismatch in the support of the distributions, compared with the MMD. These two properties make the KALE gradient flow particularly well suited when the target distribution is supported on a low-dimensional manifold. Under an assumption of sufficient smoothness of the trajectories, we show the global convergence of the KALE flow. We propose a particle implementation of the flow given initial samples from the source and the target distribution, which we use to empirically confirm the KALE's properties.
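For concreteness, the variational problem the abstract refers to can be sketched as follows. The Fenchel dual representation of the KL is the supremum over test functions of a linear term under P minus an exponential term under Q; the KALE restricts the supremum to an RKHS H and adds a squared-norm penalty with strength lambda > 0. This is a sketch of the standard form; the exact scaling constant used in the paper may differ:

    \mathrm{KL}(P \,\|\, Q) = \sup_{h} \; \mathbb{E}_P[h] - \mathbb{E}_Q\!\left[e^{h} - 1\right],
    \qquad
    \mathrm{KALE}(P \,\|\, Q) = \sup_{h \in \mathcal{H}} \; \mathbb{E}_P[h] - \mathbb{E}_Q\!\left[e^{h} - 1\right] - \frac{\lambda}{2}\,\|h\|_{\mathcal{H}}^2.

This makes the claimed interpolation plausible: as lambda grows the optimal witness shrinks, so e^h - 1 is well approximated by h and the problem approaches the (scaled) MMD witness problem, while as lambda tends to 0 the only remaining relaxation is the restriction to H.

The particle implementation mentioned at the end of the abstract can likewise be illustrated with a minimal sketch: alternately fit the regularized dual witness from the current particles, then move the source particles a small step against the witness gradient (for KALE(source || target), the witness roughly tracks the log of the source-to-target density ratio, so descending it pushes source mass toward the target). Everything below (the Gaussian kernel, plain gradient ascent on the dual coefficients, step sizes, the toy data) is an illustrative assumption, not the authors' implementation.

    import numpy as np

    def gaussian_kernel(X, Y, sigma=1.0):
        # k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def fit_witness(X_src, Y_tgt, lam=0.1, steps=300, lr=0.1, sigma=1.0):
        # Witness h(.) = sum_i alpha_i k(z_i, .), with basis points z_i at all
        # particles, fitted by gradient ascent on the regularized dual objective
        #   J(alpha) = mean_src[h] - mean_tgt[exp(h) - 1] - (lam / 2) alpha' K alpha
        Z = np.concatenate([X_src, Y_tgt])
        K = gaussian_kernel(Z, Z, sigma)
        n_s, n_t = len(X_src), len(Y_tgt)
        alpha = np.zeros(len(Z))
        for _ in range(steps):
            h = K @ alpha
            grad = (K[:n_s].mean(axis=0)                  # from mean_src[h]
                    - K[n_s:].T @ np.exp(h[n_s:]) / n_t   # from mean_tgt[e^h]
                    - lam * (K @ alpha))                  # from the RKHS penalty
            alpha += lr * grad
        return Z, alpha

    def witness_grad(X, Z, alpha, sigma=1.0):
        # Gaussian kernel: grad_x h(x) = sum_i alpha_i k(z_i, x) (z_i - x) / sigma^2
        Kxz = gaussian_kernel(X, Z, sigma)                    # (m, n)
        diff = (Z[None, :, :] - X[:, None, :]) / sigma ** 2  # (m, n, d)
        return np.einsum("mn,mnd,n->md", Kxz, diff, alpha)

    # Toy flow: an offset source blob is transported onto a concentrated target.
    rng = np.random.default_rng(0)
    X_src = rng.normal(size=(200, 2)) + 2.0   # moving source particles
    Y_tgt = 0.1 * rng.normal(size=(200, 2))   # fixed target samples
    for _ in range(100):
        Z, alpha = fit_witness(X_src, Y_tgt)
        X_src -= 0.5 * witness_grad(X_src, Z, alpha)   # descend the witness

Because the witness and its gradient are defined even when the two particle sets occupy disjoint regions, no density-ratio estimate is needed, which is the property the abstract highlights for targets on low-dimensional manifolds.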

Related research

02/10/2021 · On the Properties of Kullback-Leibler Divergence Between Gaussians
Kullback-Leibler (KL) divergence is one of the most important divergence...

06/11/2019 · Maximum Mean Discrepancy Gradient Flow
We construct a Wasserstein gradient flow of the maximum mean discrepancy...

05/25/2017 · Convergence of Langevin MCMC in KL-divergence
Langevin diffusion is a commonly used tool for sampling from a given dis...

05/24/2023 · Variational Gradient Descent using Local Linear Models
Stein Variational Gradient Descent (SVGD) can transport particles along ...

04/05/2022 · Practical Bounds of Kullback-Leibler Divergence Using Maximum Mean Discrepancy
Estimating Kullback-Leibler (KL) divergence from data samples is a stren...

06/01/2019 · BreGMN: scaled-Bregman Generative Modeling Networks
The family of f-divergences is ubiquitously applied to generative modeli...

02/25/2020 · Reliable Estimation of Kullback-Leibler Divergence by Controlling Discriminator Complexity in the Reproducing Kernel Hilbert Space
Several scalable methods to compute the Kullback-Leibler (KL) divergence...
