Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

08/16/2016
by Qiang Liu, et al.

We propose a general-purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. Our method iteratively transports a set of particles to match the target distribution by applying a form of functional gradient descent that minimizes the KL divergence. Empirical studies on various real-world models and datasets show that our method is competitive with existing state-of-the-art methods. The derivation of our method is based on a new theoretical result that connects the derivative of the KL divergence under smooth transforms with Stein's identity and a recently proposed kernelized Stein discrepancy, which is of independent interest.
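The particle-transport idea in the abstract can be sketched in a few lines of NumPy. The update below implements the SVGD rule x_i <- x_i + eps * phi(x_i), where phi is the kernel-smoothed gradient of log p plus a repulsive term from the kernel gradient. The RBF bandwidth h, step size eps, particle count, and the 1-D Gaussian target are illustrative choices, not the paper's exact setup (the paper uses a median-heuristic bandwidth and AdaGrad-style step sizes):

```python
import numpy as np

def svgd_step(particles, grad_logp, h=1.0, eps=0.1):
    """One SVGD iteration: x_i <- x_i + eps * phi(x_i), where
    phi(x) = (1/n) sum_j [k(x_j, x) grad_{x_j} log p(x_j) + grad_{x_j} k(x_j, x)].
    Assumes an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    n, d = particles.shape
    diffs = particles[:, None, :] - particles[None, :, :]    # (n, n, d): x_i - x_j
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / (2 * h ** 2))  # (n, n) kernel matrix
    grads = grad_logp(particles)                             # (n, d) score at each particle
    # Attractive term (kernel-weighted scores) plus repulsive term (kernel gradients),
    # which keeps the particles spread out instead of collapsing onto the mode.
    phi = (K @ grads + np.sum(K[:, :, None] * diffs, axis=1) / h ** 2) / n
    return particles + eps * phi

# Illustrative target: a 1-D Gaussian N(2, 1), whose score is -(x - 2).
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))          # initial particles drawn from N(0, 1)
for _ in range(500):
    x = svgd_step(x, lambda p: -(p - 2.0))
```

After the loop, the particle cloud's mean and spread approximate those of the target; the repulsive term is what distinguishes this from plain gradient ascent, which would drive every particle to the same mode.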

Related research

- Stein Variational Gradient Descent as Gradient Flow (04/25/2017)
  Stein variational gradient descent (SVGD) is a deterministic sampling al...
- A Stein variational Newton method (06/08/2018)
  Stein variational gradient descent (SVGD) was recently proposed as a gen...
- Analyzing and Improving Stein Variational Gradient Descent for High-dimensional Marginal Inference (11/13/2017)
  Stein variational gradient descent (SVGD) is a nonparametric inference m...
- Variational Gradient Descent using Local Linear Models (05/24/2023)
  Stein Variational Gradient Descent (SVGD) can transport particles along ...
- Kernel Stein Generative Modeling (07/06/2020)
  We are interested in gradient-based Explicit Generative Modeling where s...
- Learning to Draw Samples with Amortized Stein Variational Gradient Descent (07/20/2017)
  We propose a simple algorithm to train stochastic neural networks to dra...
- Stein Variational Inference for Discrete Distributions (03/01/2020)
  Gradient-based approximate inference methods, such as Stein variational ...
