Adaptive particle-based approximations of the Gibbs posterior for inverse problems

07/02/2019
by Zilong Zou et al.

In this work, we adopt a general framework based on the Gibbs posterior to update belief distributions for inverse problems governed by partial differential equations (PDEs). The Gibbs posterior formulation is a generalization of standard Bayesian inference that relies only on a loss function connecting the unknown parameters to the data. It is particularly useful when the true data-generating mechanism (or noise distribution) is unknown or difficult to specify. The Gibbs posterior coincides with Bayesian updating when a true likelihood function is known and the loss function corresponds to the negative log-likelihood, yet provides subjective inference in more general settings. We employ a sequential Monte Carlo (SMC) approach to approximate the Gibbs posterior using particles. To manage the computational cost of evaluating the loss function at a growing number of particles, we employ a recently developed local reduced basis method to build an efficient surrogate loss function that is used in the Gibbs update formula in place of the true loss. We derive error bounds for our approximation and propose an adaptive approach to construct the surrogate model in an efficient manner. We demonstrate the efficiency of our approach through several numerical examples.
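
As a rough illustration of the particle update described above, the following Python sketch approximates a generic Gibbs posterior pi_w(theta) proportional to exp(-w * loss(theta)) * prior(theta) with a tempered sequential Monte Carlo sampler: particles are drawn from the prior, incrementally reweighted by the Gibbs factor, resampled, and moved with a Metropolis step. The quadratic loss, Gaussian prior, linear tempering schedule, and names such as loss and gibbs_smc are illustrative assumptions only; the paper's actual loss involves PDE solves and a local reduced basis surrogate, neither of which is shown here.

```python
# Minimal sketch (not the authors' code) of tempered SMC targeting a Gibbs
# posterior pi_w(theta) ~ exp(-w * loss(theta)) * prior(theta).
import numpy as np

rng = np.random.default_rng(0)

def loss(theta):
    # Hypothetical stand-in for the (possibly surrogate) PDE-based loss.
    return 0.5 * np.sum(theta**2, axis=-1)

def log_prior(theta):
    # Standard normal prior, assumed for illustration.
    return -0.5 * np.sum(theta**2, axis=-1)

def gibbs_smc(n_particles=1000, dim=2, omegas=np.linspace(0.0, 1.0, 11), step=0.5):
    theta = rng.standard_normal((n_particles, dim))  # sample from the prior
    for w_prev, w_next in zip(omegas[:-1], omegas[1:]):
        # Reweight by the incremental Gibbs factor exp(-(w_next - w_prev) * loss).
        log_w = -(w_next - w_prev) * loss(theta)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Multinomial resampling to equalize weights.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        theta = theta[idx]
        # One random-walk Metropolis move per particle, targeting the
        # current tempered Gibbs posterior.
        prop = theta + step * rng.standard_normal(theta.shape)
        log_acc = (log_prior(prop) - w_next * loss(prop)) \
                  - (log_prior(theta) - w_next * loss(theta))
        accept = np.log(rng.uniform(size=n_particles)) < log_acc
        theta[accept] = prop[accept]
    return theta

particles = gibbs_smc()
print(particles.mean(axis=0))  # posterior mean estimate from the particles
```

In the paper's setting, the surrogate loss would replace the true loss inside this loop, which is what makes propagating many particles affordable.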

Related research

09/14/2021 · Gibbs posterior inference on a Levy density under discrete sampling
In mathematical finance, Levy processes are widely used for their abilit...

03/01/2021 · General Bayesian L^2 calibration of mathematical models
A general Bayesian method for L^2 calibration of a mathematical model is...

05/14/2021 · Bayesian inference under model misspecification using transport-Lagrangian distances: an application to seismic inversion
Model misspecification constitutes a major obstacle to reliable inferenc...

02/09/2023 · Introduction To Gaussian Process Regression In Bayesian Inverse Problems, With New Results On Experimental Design For Weighted Error Measures
Bayesian posterior distributions arising in modern applications, includi...

12/08/2020 · Gibbs posterior concentration rates under sub-exponential type losses
Bayesian posterior distributions are widely used for inference, but thei...

03/07/2022 · Discovering Inductive Bias with Gibbs Priors: A Diagnostic Tool for Approximate Bayesian Inference
Full Bayesian posteriors are rarely analytically tractable, which is why...

04/07/2020 · Stability of Gibbs Posteriors from the Wasserstein Loss for Bayesian Full Waveform Inversion
Recently, the Wasserstein loss function has been proven to be effective ...