Bernstein-von Mises theorems and uncertainty quantification for linear inverse problems

11/09/2018
by   Matteo Giordano, et al.

We consider the statistical inverse problem of approximating an unknown function f from a linear measurement corrupted by additive Gaussian white noise. We employ a nonparametric Bayesian approach with standard Gaussian priors, for which the posterior-based reconstruction of f corresponds to a Tikhonov regulariser f̅ with a Cameron-Martin space norm penalty. We prove a semiparametric Bernstein-von Mises theorem for a large collection of linear functionals of f, implying that semiparametric posterior estimation and uncertainty quantification are valid and optimal from a frequentist point of view. The result is illustrated and further developed in several examples, covering both mildly and severely ill-posed cases. For the problem of recovering the source function in elliptic partial differential equations, we also obtain a nonparametric version of the theorem that entails the convergence of the posterior distribution to a fixed infinite-dimensional Gaussian probability measure with minimal covariance in suitable function spaces. As a consequence, we show that the distribution of the Tikhonov regulariser f̅ is asymptotically normal and attains the information lower bound, and that credible sets centred at f̅ have correct frequentist coverage and optimal diameter.
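For readers who want the setting in symbols, the following is a schematic sketch in our own notation, not quoted from the paper: the white-noise observation model, the variational characterisation of the posterior mean as a Tikhonov regulariser, and the typical shape of a semiparametric Bernstein-von Mises statement. Any prior rescaling with the noise level ε, and the precise conditions on A, the prior, and the test functions ψ, are suppressed here and should be taken from the paper itself.

```latex
% Schematic sketch (notation ours, conditions omitted).
% Observation model: linear forward operator A, noise level \varepsilon \to 0,
% Gaussian white noise \mathbb{W}.
\[
  Y \;=\; A f \;+\; \varepsilon\,\mathbb{W}, \qquad \varepsilon \to 0 .
\]
% With a centred Gaussian prior whose Cameron--Martin space is
% (\mathcal{H}, \|\cdot\|_{\mathcal{H}}), the posterior is Gaussian and its mean
% coincides with a Tikhonov-type regulariser:
\[
  \bar f \;=\; \arg\min_{f \in \mathcal{H}}
    \Big( \|Y - A f\|_{L^2}^2 \;+\; \varepsilon^{2}\,\|f\|_{\mathcal{H}}^{2} \Big).
\]
% Semiparametric Bernstein--von Mises statement (schematic form): for suitable
% linear functionals \langle f, \psi \rangle, e.g. with \psi = A^{*}\varphi,
\[
  \mathcal{L}\!\Big( \varepsilon^{-1}\big( \langle f, \psi \rangle
      - \langle \bar f, \psi \rangle \big) \,\Big|\, Y \Big)
  \;\longrightarrow\; N\!\big( 0, \|\varphi\|_{L^2}^{2} \big)
  \quad \text{weakly, in probability,}
\]
% where \|\varphi\|_{L^2}^{2} plays the role of the semiparametric information
% lower bound for estimating \langle f, \psi \rangle in this model.
```

Weak convergence of the rescaled posterior to a fixed normal limit with minimal variance is what underlies the abstract's claim: credible intervals for such functionals, centred at f̅, can be read as asymptotically valid and efficient frequentist confidence intervals.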
