
Random forward models and log-likelihoods in Bayesian inverse problems

12/15/2017
by H. C. Lie et al.
Freie Universität Berlin
Zuse Institute Berlin

We consider the use of randomised forward models and log-likelihoods within the Bayesian approach to inverse problems. Such random approximations to the exact forward model or log-likelihood arise naturally when a computationally expensive model is approximated using a cheaper stochastic surrogate, as in Gaussian process emulation (kriging), or in the field of probabilistic numerical methods. We show that the Hellinger distance between the exact and approximate Bayesian posteriors is bounded by moments of the difference between the true and approximate log-likelihoods. Example applications of these stability results are given for randomised misfit models in large data applications and the probabilistic solution of ordinary differential equations.
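The central stability result can be illustrated numerically: replace an exact log-likelihood with a randomised approximation, form both posteriors, and measure the Hellinger distance between them. The sketch below is a minimal, hypothetical 1-D example (the forward map `G`, noise levels, and grid are all invented for illustration, not taken from the paper); the randomised misfit simply perturbs the forward model output with Gaussian noise, standing in for a cheap stochastic surrogate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D inverse problem: recover theta from y = G(theta) + noise.
def G(theta):
    return np.sin(theta) + 0.5 * theta

theta_true = 1.0
sigma = 0.1
y = G(theta_true) + sigma * rng.normal()

# Exact negative log-likelihood (misfit), and a randomised approximation in
# which the forward model is perturbed by noise of size eps.
def misfit_exact(theta):
    return 0.5 * ((y - G(theta)) / sigma) ** 2

def misfit_random(theta, eps=0.05):
    noisy_G = G(theta) + eps * rng.standard_normal(np.shape(theta))
    return 0.5 * ((y - noisy_G) / sigma) ** 2

# Posterior densities on a grid, under a flat prior on [-3, 3].
grid = np.linspace(-3.0, 3.0, 2001)
dx = grid[1] - grid[0]

def posterior(misfit_vals):
    w = np.exp(-(misfit_vals - misfit_vals.min()))  # stabilised exponentiation
    return w / (w.sum() * dx)                       # normalise to a density

p_exact = posterior(misfit_exact(grid))
p_approx = posterior(misfit_random(grid))

# Hellinger distance between the exact and approximate posteriors (in [0, 1]).
hellinger = np.sqrt(0.5 * dx * np.sum((np.sqrt(p_exact) - np.sqrt(p_approx)) ** 2))
print(f"Hellinger distance: {hellinger:.3f}")
```

Shrinking `eps` (i.e. improving the surrogate, and hence the moments of the log-likelihood error) drives the Hellinger distance toward zero, which is the qualitative behaviour the paper's bounds quantify.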


Related research

Error bounds for some approximate posterior measures in Bayesian inference (11/13/2019)
In certain applications involving the solution of a Bayesian inverse pro...

Differentiable Likelihoods for Fast Inversion of 'Likelihood-Free' Dynamical Systems (02/21/2020)
Likelihood-free (a.k.a. simulation-based) inference problems are inverse...

Sampling Methods for Bayesian Inference Involving Convergent Noisy Approximations of Forward Maps (11/05/2021)
We present Bayesian techniques for solving inverse problems which involv...

Data Mining for Faster, Interpretable Solutions to Inverse Problems: A Case Study Using Additive Manufacturing (06/07/2023)
Solving inverse problems, where we find the input values that result in ...

A note on Γ-convergence of Tikhonov functionals for nonlinear inverse problems (08/11/2022)
We consider variational regularization of nonlinear inverse problems in ...