SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

by Sinho Chewi et al.

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow of the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective that instead views SVGD as the (kernelized) gradient flow of the chi-squared divergence which, we show, exhibits a strong form of uniform exponential ergodicity under conditions as weak as a Poincaré inequality. This perspective leads us to propose an alternative to SVGD, called Laplacian Adjusted Wasserstein Gradient Descent (LAWGD), that can be implemented from the spectral decomposition of the Laplacian operator associated with the target density. We show that LAWGD exhibits strong convergence guarantees and good practical performance.
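For intuition about the algorithm under discussion, here is a minimal NumPy sketch of one SVGD particle update. This is not the authors' implementation: the RBF kernel with a fixed bandwidth `h` and the step size are illustrative assumptions (practical implementations often choose the bandwidth by the median heuristic). Each particle moves along a kernel-weighted average of the score, plus a repulsive term that keeps the particles spread out.

```python
import numpy as np

def svgd_step(x, score, h=1.0, step=0.1):
    """One SVGD update on particles x of shape (n, d).

    score(x) returns grad log p(x) at each particle, shape (n, d).
    h is a fixed RBF bandwidth (an assumption; the median heuristic
    is a common alternative).
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]           # (n, n, d): diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)    # (n, n) RBF kernel matrix
    # Attractive term: kernel-weighted average of the scores.
    attract = K.T @ score(x) / n                   # (n, d)
    # Repulsive term: (1/n) sum_j grad_{x_j} k(x_j, x_i).
    repulse = -2.0 / h * np.einsum('ji,jid->id', K, diff) / n
    return x + step * (attract + repulse)
```

For a standard Gaussian target, `score = lambda z: -z`; iterating `svgd_step` transports an initial particle cloud toward the target while the repulsive term prevents collapse onto the mode.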



