Stein variational gradient descent with local approximations

04/13/2021
by Liang Yan, et al.

Bayesian computation plays an important role in modern machine learning and statistics for reasoning about uncertainty. A key computational challenge in Bayesian inference is to develop efficient techniques for approximating, or drawing samples from, posterior distributions. Stein variational gradient descent (SVGD) has been shown to be a powerful approximate inference algorithm for this task. However, vanilla SVGD requires the gradient of the target density and cannot be applied when the gradient is unavailable or too expensive to evaluate. In this paper we address this challenge by constructing a local surrogate for the target distribution from which the gradient can be obtained at much lower computational cost. More specifically, we approximate the forward model with a deep neural network (DNN) trained on a carefully chosen training set, whose design in turn determines the quality of the surrogate. To this end, we propose a general adaptation procedure that refines the local approximation online without compromising the convergence of the resulting SVGD. This significantly reduces the computational cost of SVGD and yields a suite of algorithms that are straightforward to implement. The new algorithm is illustrated on a set of challenging Bayesian inverse problems, and numerical experiments demonstrate a clear improvement in performance and applicability over standard SVGD.
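For concreteness, the following is a minimal sketch of the SVGD update the abstract builds on, in which the score ∇ log p is supplied as a user-provided callable. In the paper's setting that gradient would come from differentiating a DNN surrogate of the forward model; here a toy analytic score stands in. All names (`rbf_kernel`, `svgd_step`, `grad_logp`) and parameter choices are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j.
    diffs = X[:, None, :] - X[None, :, :]        # diffs[j, i] = x_j - x_i
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / (2 * h ** 2))
    grad_K = -diffs * K[:, :, None] / h ** 2     # d k(x_j, x_i) / d x_j
    return K, grad_K

def svgd_step(X, grad_logp, eps=0.1, h=1.0):
    # One SVGD update: phi(x_i) = mean_j [ k(x_j, x_i) * grad log p(x_j)
    #                                      + grad_{x_j} k(x_j, x_i) ]
    K, grad_K = rbf_kernel(X, h)
    G = grad_logp(X)                             # (n, d); surrogate-based in the paper
    phi = (K @ G + grad_K.sum(axis=0)) / X.shape[0]
    return X + eps * phi

# Toy usage: target is a standard Gaussian, so grad log p(x) = -x.
# With a surrogate, this lambda would instead evaluate the gradient of
# the DNN approximation to the log-posterior.
X = np.random.randn(100, 2) + 5.0                # particles start far from the target
for _ in range(500):
    X = svgd_step(X, grad_logp=lambda X: -X)
print(X.mean(axis=0), X.std(axis=0))             # particles should drift toward N(0, I)
```

The update only ever touches the target through `grad_logp`, which is what makes swapping in a cheap surrogate gradient, and refining that surrogate online, a natural modification of standard SVGD.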
