
Random forward models and log-likelihoods in Bayesian inverse problems

by H. C. Lie et al.
Freie Universität Berlin
Zuse Institute Berlin

We consider the use of randomised forward models and log-likelihoods within the Bayesian approach to inverse problems. Such random approximations to the exact forward model or log-likelihood arise naturally when a computationally expensive model is approximated using a cheaper stochastic surrogate, as in Gaussian process emulation (kriging), or in the field of probabilistic numerical methods. We show that the Hellinger distance between the exact and approximate Bayesian posteriors is bounded by moments of the difference between the true and approximate log-likelihoods. Example applications of these stability results are given for randomised misfit models in large data applications and the probabilistic solution of ordinary differential equations.
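The central stability result can be illustrated numerically on a toy problem. The sketch below (all names and parameter values are illustrative assumptions, not from the paper) sets up a 1D Bayesian inverse problem, replaces the exact log-likelihood with a randomly perturbed one standing in for a stochastic surrogate such as a Gaussian process emulator sample, and computes the Hellinger distance between the exact and approximate posteriors on a grid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D inverse problem: recover u from a noisy observation y = G(u) + noise.
# G, sigma, and the perturbation scale are hypothetical, for illustration only.
theta = np.linspace(-3.0, 3.0, 2001)       # parameter grid
prior = np.exp(-0.5 * theta**2)            # standard normal prior (unnormalised)

G = lambda u: u**2                         # exact (cheap, here) forward model
y, sigma = 1.0, 0.5                        # observed datum and noise level

def posterior(loglik):
    """Normalised posterior density on the grid for a given log-likelihood."""
    w = prior * np.exp(loglik)
    return w / np.trapz(w, theta)

exact_loglik = -0.5 * ((y - G(theta)) / sigma) ** 2

# Randomised log-likelihood: perturb the forward model with a small random
# error, a stand-in for one draw of a stochastic surrogate of G.
eps = 0.05 * rng.standard_normal(theta.shape)
approx_loglik = -0.5 * ((y - (G(theta) + eps)) / sigma) ** 2

p, q = posterior(exact_loglik), posterior(approx_loglik)

# Hellinger distance between exact and approximate posteriors; by the paper's
# stability results this is controlled by moments of exact_loglik - approx_loglik.
hellinger = np.sqrt(0.5 * np.trapz((np.sqrt(p) - np.sqrt(q)) ** 2, theta))
print(f"Hellinger distance: {hellinger:.4f}")
```

Shrinking the perturbation scale toward zero drives the Hellinger distance toward zero, which is the qualitative behaviour the moment bounds quantify.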



