Stein variational gradient descent with local approximations

04/13/2021
by Liang Yan et al.

Bayesian computation plays an important role in modern machine learning and statistics for reasoning about uncertainty. A key computational challenge in Bayesian inference is to develop efficient techniques to approximate, or draw samples from, posterior distributions. Stein variational gradient descent (SVGD) has been shown to be a powerful approximate inference algorithm for this task. However, vanilla SVGD requires calculating the gradient of the target density and cannot be applied when the gradient is unavailable or too expensive to evaluate. In this paper we address this challenge by constructing a local surrogate for the target distribution from which the gradient can be obtained in a much more computationally feasible manner. More specifically, we approximate the forward model using a deep neural network (DNN) trained on a carefully chosen training set, which also determines the quality of the surrogate. To this end, we propose a general adaptation procedure that refines the local approximation online without destroying the convergence of the resulting SVGD. This significantly reduces the computational cost of SVGD and leads to a suite of algorithms that are straightforward to implement. The new algorithm is illustrated on a set of challenging Bayesian inverse problems, and numerical experiments demonstrate a clear improvement in performance and applicability over standard SVGD.
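To make the gradient requirement concrete, the vanilla SVGD update that the paper builds on can be sketched as follows. This is a minimal illustrative implementation of standard SVGD (not the paper's surrogate-accelerated variant); the target, kernel bandwidth, step size, and all function names are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(x, h):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diff = x[:, None] - x[None, :]           # (n, n) pairwise differences x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))        # kernel values
    gradK = -diff / h**2 * K                 # d k(x_j, x_i) / d x_j
    return K, gradK

def svgd(x0, grad_log_p, steps=500, eps=0.05, h=0.5):
    """Vanilla SVGD: transport particles x toward the target density p.

    Note that every step calls grad_log_p on all particles -- this is the
    expensive quantity the paper replaces with a DNN surrogate gradient.
    """
    x = x0.copy()
    n = len(x)
    for _ in range(steps):
        K, gradK = rbf_kernel(x, h)
        # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + d k(x_j, x_i)/d x_j ]
        phi = (K.T @ grad_log_p(x) + gradK.sum(axis=0)) / n
        x = x + eps * phi                    # driving term + repulsion term
    return x

# Toy target: 1-D standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
particles = svgd(rng.normal(5.0, 0.5, size=50), lambda x: -x)
print(particles.mean(), particles.std())
```

The first term in `phi` drives particles toward high-density regions; the second (the kernel gradient) repels them from each other so they spread out over the posterior rather than collapsing to the mode. In the surrogate setting, `grad_log_p` would be evaluated through the trained DNN approximation of the forward model instead of the exact model.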
