A Note on the Convergence of Mirrored Stein Variational Gradient Descent under (L_0,L_1)-Smoothness Condition

06/20/2022
by   Lukang Sun, et al.

In this note, we establish a descent lemma for the population limit of Mirrored Stein Variational Gradient Descent (MSVGD). This descent lemma does not rely on the path information of MSVGD but only on a simple assumption on the mirrored distribution ∇Ψ_#π ∝ exp(-V). Our analysis shows that MSVGD applies to a broader class of constrained sampling problems with non-smooth V. We also investigate the complexity of population-limit MSVGD in terms of the dimension d.
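To make the mirrored update concrete, below is a minimal, hypothetical Python sketch of one MSVGD iteration approximated with finitely many particles. The mirror map (negative entropy on the positive orthant, so ∇Ψ(θ) = log θ and ∇Ψ*(η) = exp(η)), the exponential target π, the RBF kernel, and the step size are illustrative assumptions, not taken from the paper; the sketch only shows how SVGD runs on the mirrored distribution ∇Ψ_#π before particles are mapped back through ∇Ψ*.

```python
import numpy as np

def rbf_kernel(etas, h=1.0):
    """RBF kernel matrix and kernel gradients on the dual particles."""
    diffs = etas[:, None, :] - etas[None, :, :]     # diffs[i, j] = eta_i - eta_j, shape (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)
    K = np.exp(-sq_dists / (2 * h ** 2))            # K[i, j] = k(eta_i, eta_j)
    # grad_{eta_j} k(eta_j, eta_i) = (eta_i - eta_j) / h^2 * k(eta_i, eta_j)
    grad_K = diffs * (K[:, :, None] / h ** 2)
    return K, grad_K

def msvgd_step(etas, lam=1.0, step=0.05, h=1.0):
    """One finite-particle MSVGD update, carried out entirely in the dual (mirrored) space."""
    n, _ = etas.shape
    thetas = np.exp(etas)                           # theta = grad Psi*(eta), stays in the positive orthant
    # Score of the mirrored distribution nu = (grad Psi)_# pi for this toy target
    # pi(theta) ∝ exp(-lam * sum(theta)):
    #   log nu(eta) = log pi(exp(eta)) + sum(eta)  =>  grad_eta log nu = -lam * exp(eta) + 1
    score = -lam * thetas + 1.0
    K, grad_K = rbf_kernel(etas, h)
    # Standard SVGD direction, applied to the dual particles
    phi = (K @ score + grad_K.sum(axis=1)) / n
    return etas + step * phi

# Usage (illustrative): 50 particles in d = 2, initialised in the dual space
rng = np.random.default_rng(0)
etas = rng.normal(size=(50, 2))
for _ in range(500):
    etas = msvgd_step(etas, lam=1.0)
thetas = np.exp(etas)                               # particles on the constrained domain
print(thetas.mean(axis=0))                          # roughly 1/lam for this exponential target
```

Note that only the score of the mirrored distribution ∇Ψ_#π enters the update, which is in line with the descent lemma's reliance on an assumption about that distribution rather than on the algorithm's path.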


