A Finite-Particle Convergence Rate for Stein Variational Gradient Descent

11/17/2022
by   Jiaxin Shi, et al.

We provide a first finite-particle convergence rate for Stein variational gradient descent (SVGD). Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with n particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order 1/sqrt(log log n) rate. We suspect that the dependence on n can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
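To make the setting concrete, here is a minimal sketch of the standard SVGD update that the abstract analyzes: each of the n particles moves along the direction phi(x_i) = (1/n) Σ_j [k(x_j, x_i) ∇ log p(x_j) + ∇_{x_j} k(x_j, x_i)]. This is generic SVGD with an RBF kernel, not code from the paper; the function names, the fixed bandwidth h, and the standard-Gaussian target in the usage example are illustrative assumptions.

```python
import numpy as np

def rbf_kernel_and_grad(X, h=1.0):
    """RBF kernel matrix and its gradient w.r.t. the first argument.

    X: (n, d) array of particles; h: bandwidth (illustrative fixed value).
    """
    diffs = X[:, None, :] - X[None, :, :]      # diffs[j, i] = x_j - x_i, shape (n, n, d)
    sq = np.sum(diffs**2, axis=-1)             # squared pairwise distances, (n, n)
    K = np.exp(-sq / (2.0 * h**2))             # k(x_j, x_i), (n, n)
    gradK = -diffs * K[..., None] / h**2       # ∇_{x_j} k(x_j, x_i), (n, n, d)
    return K, gradK

def svgd_step(X, score, eps, h=1.0):
    """One SVGD update with step size eps.

    score: callable returning ∇ log p at each row of X, shape (n, d).
    """
    n = X.shape[0]
    K, gradK = rbf_kernel_and_grad(X, h)
    S = score(X)                               # (n, d)
    # Attraction term K @ S plus repulsion term Σ_j ∇_{x_j} k(x_j, ·).
    phi = (K @ S + gradK.sum(axis=0)) / n      # (n, d) update direction
    return X + eps * phi
```

For a standard Gaussian target, `score = lambda X: -X`; iterating `svgd_step` drives the particles toward the target, and the rate result in the abstract controls how fast the kernel Stein discrepancy of the particle measure decays as n grows under a suitable step size sequence.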


Related research

06/17/2020 · A Non-Asymptotic Analysis for Stein Variational Gradient Descent
We study the Stein Variational Gradient Descent (SVGD) algorithm, which ...

05/18/2023 · Augmented Message Passing Stein Variational Gradient Descent
Stein Variational Gradient Descent (SVGD) is a popular particle-based me...

05/27/2023 · Provably Fast Finite Particle Variants of SVGD via Virtual Particle Stochastic Approximation
Stein Variational Gradient Descent (SVGD) is a popular variational infer...

05/21/2018 · Frank-Wolfe Stein Sampling
In Bayesian inference, the posterior distributions are difficult to obta...

09/29/2020 · Unbalanced Sobolev Descent
We introduce Unbalanced Sobolev Descent (USD), a particle descent algori...

05/23/2023 · Towards Understanding the Dynamics of Gaussian-Stein Variational Gradient Descent
Stein Variational Gradient Descent (SVGD) is a nonparametric particle-ba...

10/20/2021 · Adaptive Gradient Descent for Optimal Control of Parabolic Equations with Random Parameters
In this paper we extend the adaptive gradient descent (AdaGrad) algorith...
