
Complexity Analysis of Stein Variational Gradient Descent Under Talagrand's Inequality T1

06/06/2021
by Adil Salim, et al.

We study the complexity of Stein Variational Gradient Descent (SVGD), an algorithm to sample from π(x) ∝ exp(-F(x)) where F is smooth and nonconvex. We provide a clean complexity bound for SVGD in the population limit in terms of the Stein Fisher Information (or squared Kernelized Stein Discrepancy), as a function of the dimension of the problem d and the desired accuracy ε. Unlike existing work, we do not make any assumption on the trajectory of the algorithm. Instead, our key assumption is that the target distribution satisfies Talagrand's inequality T1.
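
For context, Talagrand's inequality T1 with constant λ > 0 states that W_1(μ, π) ≤ √(2 KL(μ‖π)/λ) for every distribution μ, where W_1 is the 1-Wasserstein distance. The paper works in the population limit, but the finite-particle SVGD update it idealizes is the standard one of Liu and Wang (2016). The NumPy sketch below shows that update; the RBF kernel, fixed bandwidth, step size, and Gaussian example target are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def svgd_update(X, grad_log_pi, step_size, bandwidth=1.0):
    """One SVGD step on an (n, d) array of particles X.

    grad_log_pi(X) returns the (n, d) array of scores
    grad log pi(x_j) = -grad F(x_j), one row per particle.
    """
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]               # (n, n, d): x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)              # (n, n) squared distances
    K = np.exp(-sq_dists / bandwidth)                   # RBF kernel k(x_j, x_i)
    grad_K = (-2.0 / bandwidth) * diffs * K[..., None]  # grad of k w.r.t. x_j
    scores = grad_log_pi(X)                             # (n, d)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ scores + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Illustrative usage: particles driven toward a standard Gaussian target,
# i.e. F(x) = ||x||^2 / 2, so grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) * 3.0
for _ in range(500):
    X = svgd_update(X, lambda X: -X, step_size=0.1)
```

In the population limit analyzed in the paper, the empirical average over particles is replaced by an expectation under the current distribution, and the complexity bound controls the Stein Fisher Information of the iterates.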

Related research

A Note on the Convergence of Mirrored Stein Variational Gradient Descent under (L_0,L_1)-Smoothness Condition (06/20/2022)
In this note, we establish a descent lemma for the population limit Mirr...

Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition (06/01/2022)
Stein Variational Gradient Descent (SVGD) is an important alternative to...

Improved Stein Variational Gradient Descent with Importance Weights (10/02/2022)
Stein Variational Gradient Descent (SVGD) is a popular sampling algorith...

Agnostic Learning of a Single Neuron with Gradient Descent (05/29/2020)
We consider the problem of learning the best-fitting single neuron as me...

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence (06/03/2020)
Stein Variational Gradient Descent (SVGD), a popular sampling algorithm,...

How to trap a gradient flow (01/09/2020)
We consider the problem of finding an ε-approximate stationary point of ...

Efficiently testing local optimality and escaping saddles for ReLU networks (09/28/2018)
We provide a theoretical algorithm for checking local optimality and esc...