Complexity Analysis of Stein Variational Gradient Descent Under Talagrand's Inequality T1

06/06/2021
by Adil Salim, et al.

We study the complexity of Stein Variational Gradient Descent (SVGD), an algorithm to sample from π(x) ∝ exp(-F(x)) where F is smooth and nonconvex. We provide a clean complexity bound for SVGD in the population limit in terms of the Stein Fisher Information (or squared Kernelized Stein Discrepancy), as a function of the dimension d of the problem and the desired accuracy ε. Unlike existing work, we make no assumptions on the trajectory of the algorithm. Instead, our key assumption is that the target distribution satisfies Talagrand's inequality T1.
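The paper's analysis takes place in the population (infinite-particle) limit, but the algorithm itself is simple to state. Below is a minimal sketch of the standard finite-particle SVGD update of Liu and Wang (2016) with an RBF kernel; the function name svgd_step, the step size, and the median-heuristic bandwidth are illustrative choices, not taken from this paper.

```python
import numpy as np

def svgd_step(X, grad_F, step_size=0.1, h=None):
    """One Stein Variational Gradient Descent update.

    X:       (n, d) array of particles.
    grad_F:  callable mapping (n, d) particles to (n, d) gradients of F,
             where the target is pi(x) ∝ exp(-F(x)), so grad log pi = -grad F.
    h:       RBF kernel bandwidth; defaults to a median heuristic.
    """
    n, _ = X.shape
    diffs = X[:, None, :] - X[None, :, :]          # diffs[i, j] = x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)         # pairwise squared distances
    if h is None:                                  # median-heuristic bandwidth
        h = np.sqrt(0.5 * np.median(sq_dists) / np.log(n + 1) + 1e-8)
    K = np.exp(-sq_dists / (2 * h ** 2))           # RBF kernel matrix k(x_i, x_j)
    score = -grad_F(X)                             # grad log pi at each particle
    # Driving term: kernel-weighted average of the score (pulls toward modes).
    drive = K @ score / n
    # Repulsive term: gradient of the kernel (keeps particles spread apart).
    repulse = np.einsum('ij,ijd->id', K, diffs) / (n * h ** 2)
    return X + step_size * (drive + repulse)

# Example: sample a standard Gaussian, F(x) = ||x||^2 / 2, so grad F(x) = x.
X = np.random.randn(200, 2) + 5.0                  # start far from the target
for _ in range(500):
    X = svgd_step(X, grad_F=lambda X: X)
```

The driving term moves particles toward high-density regions of π while the kernel-gradient term repels them from one another. The Kernelized Stein Discrepancy controlled in the paper is the RKHS norm of this update direction, so the complexity bound quantifies how quickly the update vanishes in the population limit.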
