An Explicit Expansion of the Kullback-Leibler Divergence along its Fisher-Rao Gradient Flow

02/23/2023
by Carles Domingo-Enrich, et al.

Let V_* : ℝ^d → ℝ be some (possibly non-convex) potential function, and consider the probability measure π ∝ e^-V_*. When π exhibits multiple modes, it is known that sampling techniques based on Wasserstein gradient flows of the Kullback-Leibler (KL) divergence (e.g. Langevin Monte Carlo) suffer from poor rates of convergence, as the dynamics are unable to easily traverse between modes. In stark contrast, the work of Lu et al. (2019; 2022) has shown that the gradient flow of the KL with respect to the Fisher-Rao (FR) geometry exhibits a convergence rate to π that is independent of the potential function. In this short note, we complement these existing results in the literature by providing an explicit expansion of KL(ρ_t^FR ‖ π) in terms of e^-t, where (ρ_t^FR)_{t≥0} is the FR gradient flow of the KL divergence. In turn, we are able to provide a clean asymptotic convergence rate, with a burn-in time that is guaranteed to be finite. Our proof is based on observing a similarity between FR gradient flows and simulated annealing with linear scaling, together with facts about cumulant generating functions. We conclude with simple synthetic experiments that demonstrate that our theoretical findings are indeed tight. Based on our numerics, we conjecture that the asymptotic rates of convergence of Wasserstein-Fisher-Rao gradient flows may be related to this expansion in some cases.
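To make the object of study concrete, here is a minimal numerical sketch (not the paper's code). It uses the well-known closed-form solution of the FR gradient flow of the KL divergence, ρ_t ∝ ρ_0^{e^-t} π^{1-e^-t}, which is exactly a simulated-annealing path between ρ_0 and π; the bimodal potential, grid, and initialization below are illustrative choices of ours, not the paper's experimental setup. Note that along this path the log-normalizer equals the cumulant generating function of log(ρ_0/π) under π evaluated at e^-t, which suggests how an expansion of KL(ρ_t^FR ‖ π) in powers of e^-t can arise.

```python
import numpy as np

# Minimal synthetic check of the Fisher-Rao (FR) gradient flow of the KL
# divergence on a 1-d grid. We use the closed-form solution of the FR flow,
#     rho_t ∝ rho_0^{exp(-t)} * pi^{1 - exp(-t)},
# i.e. a geometric annealing path from rho_0 to pi, and track KL(rho_t || pi).
# The double-well potential V_* and the initialization are illustrative
# assumptions, not the experimental setup of the paper.

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]

V_star = 0.25 * (x**2 - 4.0) ** 2        # non-convex, modes near x = ±2
pi = np.exp(-V_star)
pi /= pi.sum() * dx                      # target pi ∝ exp(-V_*)

rho0 = np.exp(-0.5 * (x - 3.0) ** 2)     # initialization near one mode only
rho0 /= rho0.sum() * dx

def kl(p, q):
    """Grid approximation of KL(p || q), skipping zero-mass points."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

for t in np.arange(0.0, 8.0, 1.0):
    lam = np.exp(-t)
    # Closed-form FR flow: geometric mixture of rho_0 and pi, renormalized.
    rho_t = rho0**lam * pi ** (1.0 - lam)
    rho_t /= rho_t.sum() * dx
    print(f"t = {t:4.1f}   KL(rho_t || pi) = {kl(rho_t, pi):.3e}")
```

On a log scale, the printed KL values should eventually decay linearly in t; from the CGF expansion one expects a leading term proportional to e^-2t (with constant (1/2) Var_π(log(ρ_0/π))) whenever that variance is nonzero, so the asymptotic slope is potential-independent even though the constant and the burn-in time are not.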

