Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition

by Lukang Sun, et al.
King Abdullah University of Science and Technology

Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form π(x) ∝ exp(-V(x)). In the existing theory of Langevin-type algorithms and SVGD, the potential function V is often assumed to be L-smooth. This restrictive condition, however, excludes a large class of potential functions, such as polynomials of degree greater than 2. Our paper studies the convergence of the SVGD algorithm for distributions with (L_0, L_1)-smooth potentials, a relaxed smoothness assumption introduced by Zhang et al. [2019a] for the analysis of gradient-clipping algorithms. With the help of trajectory-independent auxiliary conditions, we prove a descent lemma establishing that the algorithm decreases the KL divergence at each iteration, and we derive a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
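For context: a potential V is (L_0, L_1)-smooth (in the sense of Zhang et al. [2019a]) when ‖∇²V(x)‖ ≤ L_0 + L_1‖∇V(x)‖, which admits potentials like V(x) = x⁴/4 whose Hessian is unbounded. The SVGD update itself averages a kernel-weighted score term and a repulsive kernel-gradient term over the current particles. Below is a minimal sketch of one SVGD iteration with an RBF kernel; the quartic potential, step size, and bandwidth are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[j, i] = exp(-||x_j - x_i||^2 / h) and its
    gradient grad_K[j, i] = d/dx_j k(x_j, x_i)."""
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)
    grad_K = -2.0 / h * diff * K[:, :, None]
    return K, grad_K

def svgd_step(X, score, step=1e-2, h=1.0):
    """One SVGD update:
    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    K, grad_K = rbf_kernel(X, h)
    phi = (K.T @ score + grad_K.sum(axis=0)) / X.shape[0]
    return X + step * phi

# Illustrative target: pi(x) ∝ exp(-x^4 / 4), a degree-4 potential that is
# not L-smooth but fits the (L_0, L_1)-smooth regime studied here.
score_fn = lambda X: -X ** 3                      # grad log pi = -grad V

X = np.random.default_rng(0).normal(size=(50, 1))
for _ in range(200):
    X = svgd_step(X, score_fn(X))
```

The repulsive term `grad_K.sum(axis=0)` is what keeps the particles spread out; dropping it would collapse all particles onto the mode of π.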




Related research:
- A Note on the Convergence of Mirrored Stein Variational Gradient Descent under (L_0,L_1)-Smoothness Condition
- A Non-Asymptotic Analysis for Stein Variational Gradient Descent
- Complexity Analysis of Stein Variational Gradient Descent Under Talagrand's Inequality T1
- Analysis of Gradient Clipping and Adaptive Scaling with a Relaxed Smoothness Condition
- On the geometry of Stein variational gradient descent
- Improved Stein Variational Gradient Descent with Importance Weights
- Optimizing quantum optimization algorithms via faster quantum gradient computation
