Stein variational reduced basis Bayesian inversion

02/25/2020
by Peng Chen, et al.

We propose and analyze a Stein variational reduced basis method (SVRB) to solve large-scale PDE-constrained Bayesian inverse problems. To address the computational challenge of drawing numerous samples from the posterior distribution, each of which requires an expensive PDE solve, we integrate an adaptive and goal-oriented model reduction technique with the optimization-based Stein variational gradient descent method (SVGD). The samples are drawn from the prior distribution and iteratively pushed to the posterior by a sequence of transport maps constructed by SVGD, which requires evaluating the potential (the negative log of the likelihood function) and its gradient with respect to the random parameters, both of which depend on the solution of the PDE. To reduce this computational cost, we develop an adaptive and goal-oriented model reduction technique based on reduced basis approximations of the potential and its gradient. We present a detailed analysis of the reduced basis approximation errors of the potential and its gradient, the induced errors of the posterior distribution measured by the Kullback–Leibler divergence, and the errors of the samples. To demonstrate the computational accuracy and efficiency of SVRB, we report results of numerical experiments on a Bayesian inverse problem governed by a diffusion PDE with random parameters, using both uniform and Gaussian prior distributions. Speedups of over 100X are achieved while the accuracy of the approximation of the potential and its gradient is preserved.
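To make the transport-map construction concrete, the sketch below illustrates a plain SVGD update of the kind the abstract describes: particles sampled from the prior are moved by kernelized gradient steps toward the posterior. This is a minimal illustration only; the RBF kernel, the toy Gaussian log-posterior standing in for the PDE-based potential, and all parameter values are assumptions, not the paper's reduced basis implementation.

```python
# Minimal SVGD sketch (illustrative; not the authors' SVRB code).
import numpy as np

def rbf_kernel(X, h=1.0):
    """Pairwise RBF kernel matrix and its gradients w.r.t. the first argument."""
    diff = X[:, None, :] - X[None, :, :]          # (n, n, d)
    sq_dist = np.sum(diff**2, axis=-1)            # (n, n)
    K = np.exp(-sq_dist / (2.0 * h**2))
    grad_K = -diff / h**2 * K[:, :, None]         # grad_{x_i} k(x_i, x_j), shape (n, n, d)
    return K, grad_K

def svgd_step(X, grad_log_post, step=1e-1, h=1.0):
    """One SVGD transport-map update: X <- X + step * phi(X)."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # phi(x_j) = (1/n) sum_i [ k(x_i, x_j) * grad log p(x_i) + grad_{x_i} k(x_i, x_j) ]
    phi = (K.T @ grad_log_post(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for the PDE-based potential: a 2D Gaussian posterior N(mu, I).
    mu = np.array([1.0, -1.0])
    grad_log_post = lambda X: -(X - mu)           # gradient of log N(mu, I)
    X = rng.normal(size=(100, 2))                 # particles drawn from a standard normal prior
    for _ in range(500):
        X = svgd_step(X, grad_log_post)
    print("particle mean ~", X.mean(axis=0))      # should approach mu
```

In the paper's setting, `grad_log_post` would involve the potential and its parameter gradient obtained from PDE solves, which is precisely the evaluation the adaptive, goal-oriented reduced basis approximation is designed to accelerate.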
