On the Locally Lipschitz Robustness of Bayesian Inverse Problems

06/17/2019
by   Björn Sprungk, et al.

In this note we consider the robustness of posterior measures occurring in Bayesian inference w.r.t. perturbations of the prior measure and the log-likelihood function. This extends the well-posedness analysis of Bayesian inverse problems. In particular, we prove a general locally Lipschitz continuous dependence of the posterior on the prior and the log-likelihood w.r.t. various common distances between probability measures, including the Hellinger and Wasserstein distances and the Kullback-Leibler divergence. We assume only the boundedness of the likelihoods and measure their perturbations in an L^p-norm w.r.t. the prior. Our results indicate an increasing sensitivity of Bayesian inference as the posterior becomes more concentrated, e.g., due to more data or more accurate data. This confirms and extends previous observations made in the sensitivity analysis of Bayesian inference.
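The kind of robustness statement described above can be illustrated numerically. The sketch below (not taken from the paper; all distributions, parameter values, and the observation are illustrative choices) perturbs the mean of a 1D Gaussian prior, forms the posterior under a bounded Gaussian likelihood on a grid, and compares the Hellinger distance between the two posteriors with that between the two priors:

```python
import numpy as np

# Grid for 1D densities; wide enough that tail mass is negligible.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    """Gaussian density with mean mu and standard deviation sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def hellinger(p, q):
    """Hellinger distance between two densities evaluated on the grid."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx)

def posterior(prior):
    """Posterior for an (illustrative) observation y = 1 with noise std 0.5.
    The likelihood is bounded, matching the paper's standing assumption."""
    likelihood = np.exp(-0.5 * ((1.0 - x) / 0.5) ** 2)
    unnorm = likelihood * prior
    return unnorm / (np.sum(unnorm) * dx)

# Perturb the prior mean by eps and watch how the posterior responds.
for eps in [0.10, 0.05, 0.01]:
    p0 = gaussian(x, 0.0, 1.0)        # reference prior N(0, 1)
    p_eps = gaussian(x, eps, 1.0)     # perturbed prior N(eps, 1)
    d_prior = hellinger(p0, p_eps)
    d_post = hellinger(posterior(p0), posterior(p_eps))
    print(f"eps={eps:5.2f}  d_H(priors)={d_prior:.4f}  "
          f"d_H(posteriors)={d_post:.4f}  ratio={d_post / d_prior:.3f}")
```

As the perturbation size shrinks, the ratio of posterior to prior Hellinger distance stays essentially constant, which is exactly the locally Lipschitz behavior the note establishes in much greater generality.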
