Stein variational gradient descent on infinite-dimensional space and applications to statistical inverse problems

by Junxiong Jia, et al.

For solving Bayesian inverse problems governed by large-scale forward problems, we present an infinite-dimensional version of the Stein variational gradient descent (iSVGD) method, which can efficiently generate approximate samples from the posterior. Specifically, we introduce the concept of an operator-valued kernel and the corresponding function-valued reproducing kernel Hilbert space (RKHS). Using the properties of this RKHS, we give an explicit meaning to the infinite-dimensional objects (e.g., the Stein operator) and prove that they are indeed limits of their finite-dimensional counterparts. Furthermore, by generalizing the change-of-variables formula, we construct iSVGD with preconditioning operators, yielding a more efficient method. In these generalizations we introduce a regularity parameter s ∈ [0, 1]. Our analysis shows that the naive version of iSVGD with preconditioning operators (s = 0), obtained by directly treating finite-dimensional objects as infinite-dimensional ones, yields inaccurate estimates; the parameter s should instead be chosen larger than 0 and smaller than 0.5. Finally, the proposed algorithms are applied to an inverse problem governed by the Helmholtz equation. Numerical results confirm our theoretical findings and demonstrate the potential of the proposed approach for posterior sampling in large-scale nonlinear statistical inverse problems.
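For readers unfamiliar with the algorithm the paper generalizes, the standard finite-dimensional SVGD update can be sketched as follows. This is a minimal NumPy illustration with an RBF kernel, not the authors' infinite-dimensional or preconditioned variant; the Gaussian target, fixed bandwidth, step size, and particle count are assumptions chosen only for the demo:

```python
import numpy as np

def svgd_update(x, grad_logp, h=1.0):
    """One SVGD direction for particles x of shape (n, d).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j)
                               + grad_{x_j} k(x_j, x_i) ]
    with the RBF kernel k(x_j, x_i) = exp(-||x_j - x_i||^2 / h).
    """
    diff = x[:, None, :] - x[None, :, :]            # (n, n, d): x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)     # symmetric kernel matrix
    g = grad_logp(x)                                # (n, d) scores at particles
    # Driving term: kernel-weighted average of scores pulls particles to the mode
    drive = K @ g
    # Repulsive term: sum_j grad_{x_j} k = (2/h) * sum_j (x_i - x_j) k(x_j, x_i)
    repulse = (2.0 / h) * (x * K.sum(axis=1, keepdims=True) - K @ x)
    return (drive + repulse) / x.shape[0]

# Demo: transport particles from N(0, 1) toward the target N(2, 1),
# whose score is grad log p(x) = -(x - 2).
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
for _ in range(500):
    x = x + 0.1 * svgd_update(x, lambda p: -(p - 2.0))
print(x.mean(), x.std())
```

The repulsive term is what distinguishes SVGD from plain gradient ascent on log p: it keeps the particle ensemble spread out so that it approximates the whole posterior rather than collapsing onto the mode, which is exactly the mechanism the paper lifts to function space via operator-valued kernels.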



