A stochastic version of Stein Variational Gradient Descent for efficient sampling

02/09/2019
by   Lei Li, et al.

We propose RBM-SVGD, a stochastic version of the Stein Variational Gradient Descent (SVGD) method for efficiently sampling from a given probability measure, and thus useful for Bayesian inference. The method applies the Random Batch Method (RBM) for interacting particle systems, proposed by Jin et al., to the interacting particle system in SVGD. While preserving the behavior of SVGD, it reduces the computational cost, especially when the interaction kernel is long-range. Numerical examples verify the efficiency of this new version of SVGD.
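The idea in the abstract can be sketched in code. A minimal, hypothetical NumPy implementation (not the authors' code): each iteration shuffles the particles into small random batches and performs the standard SVGD update within each batch only, so the O(N²) kernel evaluations per step become O(N·p) for batch size p. The RBF kernel with the median heuristic is a common default; the function names and parameters here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X):
    """RBF kernel matrix for particles X (n, d), with median-heuristic bandwidth."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    h = np.median(sq_dists) / np.log(X.shape[0] + 1) + 1e-8
    return np.exp(-sq_dists / h), h

def rbm_svgd(grad_log_p, x0, n_iter=500, batch_size=10, step=0.1, seed=None):
    """Sketch of RBM-SVGD: SVGD updates restricted to random batches.

    grad_log_p: callable returning the score of the target, shape (n, d).
    x0: initial particles, shape (N, d).
    """
    rng = np.random.default_rng(seed)
    X = x0.copy()
    N = X.shape[0]
    for _ in range(n_iter):
        perm = rng.permutation(N)          # reshuffle particles into batches
        for start in range(0, N, batch_size):
            idx = perm[start:start + batch_size]
            Xb = X[idx]
            n = len(idx)
            K, h = rbf_kernel(Xb)
            G = grad_log_p(Xb)
            # Driving term: sum_j k(x_j, x_i) grad log p(x_j)
            drive = K @ G
            # Repulsive term: sum_j grad_{x_j} k(x_j, x_i)
            #   = (2/h) * (x_i * sum_j K_ij - sum_j K_ij x_j) for the RBF kernel
            repulse = (2.0 / h) * (K.sum(axis=1, keepdims=True) * Xb - K @ Xb)
            X[idx] = Xb + step * (drive + repulse) / n
    return X

if __name__ == "__main__":
    # Toy check: sample a 1D standard Gaussian (score = -x) from a shifted start.
    rng = np.random.default_rng(0)
    x0 = rng.normal(3.0, 0.5, size=(50, 1))
    samples = rbm_svgd(lambda x: -x, x0, n_iter=500, batch_size=10, step=0.1, seed=0)
    print(samples.mean(), samples.std())
```

Because interactions are confined to each batch, a long-range kernel no longer forces every particle pair to be evaluated at every step, which is the cost reduction the abstract refers to.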

