Riemannian Stein Variational Gradient Descent for Bayesian Inference

11/30/2017
by Chang Liu, et al.

We develop Riemannian Stein Variational Gradient Descent (RSVGD), a Bayesian inference method that generalizes Stein Variational Gradient Descent (SVGD) to Riemannian manifolds. The benefits are twofold: (i) for inference tasks in Euclidean spaces, RSVGD has the advantage over SVGD of utilizing information geometry, and (ii) for inference tasks on Riemannian manifolds, RSVGD brings the unique advantages of SVGD to the Riemannian world. To transfer the method appropriately, we devise novel and non-trivial techniques for RSVGD, required because general Riemannian manifolds differ intrinsically from Euclidean spaces. We also derive a Riemannian Stein's Identity and a Riemannian Kernelized Stein Discrepancy. Experimental results show RSVGD's advantage over SVGD in exploiting distribution geometry, and its advantages in particle efficiency, iteration effectiveness, and approximation flexibility over other inference methods on Riemannian manifolds.
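For context, the Euclidean SVGD update that RSVGD generalizes moves a set of particles by the kernelized Stein direction, phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]. Below is a minimal NumPy sketch of this standard (Euclidean) update with an RBF kernel of fixed bandwidth; it is an illustration of plain SVGD, not of the paper's Riemannian algorithm, and the target (a standard normal) and all names are chosen here for demonstration.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel matrix and its gradient for an (n, d) particle array.

    K[j, i]      = exp(-||x_j - x_i||^2 / (2 h^2))
    grad_K[j, i] = gradient of k(x_j, x_i) with respect to x_j.
    """
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))
    grad_K = -diff / h ** 2 * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_log_p, step=0.1, h=1.0):
    """One SVGD update: attraction toward high density plus kernel repulsion."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

# Demo: transport particles from N(5, 0.5^2) toward a standard normal target,
# whose score function is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=0.5, size=(100, 1))
for _ in range(500):
    X = svgd_step(X, lambda X: -X, step=0.1)
```

After the loop the particle cloud sits near the target mean (0) with spread on the order of the target's unit standard deviation; the second term of phi is what keeps the particles from collapsing onto the mode.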

