Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors in High Dimensions

03/09/2018
by Victor Chen, et al.
A key problem in inference for high-dimensional unknowns is the design of sampling algorithms whose performance scales favourably with the dimension of the unknown. A typical setting in which these problems arise is Bayesian inverse problems. In such problems, which include graph-based learning, nonparametric regression and PDE-based inversion, the unknown can be viewed as an infinite-dimensional parameter (such as a function) that has been discretised, resulting in a high-dimensional space for inference. Here we study the robustness of MCMC algorithms for posterior inference: robustness refers to convergence rates that do not deteriorate as the discretisation becomes finer. When a Gaussian prior is employed there is an established methodology for the design of robust MCMC samplers. However, one often requires more flexibility than a Gaussian prior can provide: hierarchical models are used to enable inference of the parameters underlying a Gaussian prior; non-Gaussian priors, such as Besov priors, are employed to induce sparse MAP estimators; deep Gaussian priors are used to represent other non-Gaussian phenomena; and piecewise constant functions, which are necessarily non-Gaussian, are required for classification problems. The purpose of this article is to show that the simulation technology available for Gaussian priors can be exported to such non-Gaussian priors. The underlying methodology is based on a white noise representation of the unknown: a draw from the prior is written as a transformation of Gaussian white noise, and sampling is performed in the white-noise coordinates. This representation is exploited both for robust posterior sampling and for joint inference of the function and the parameters involved in the specification of its prior, in which case our framework borrows strength from the well-developed non-centred methodology for Bayesian hierarchical models. The desired robustness of the proposed sampling algorithms is supported by some theory and by extensive numerical evidence from several challenging problems.
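As a concrete illustration of the white-noise idea, below is a minimal sketch (in Python) of a preconditioned Crank-Nicolson (pCN) sampler run in whitened coordinates. The names whitened_pcn, phi, T and beta are illustrative choices, not taken from the paper: T stands for any map taking Gaussian white noise to a draw from the prior, and phi is the negative log-likelihood of the data given the unknown. Because the pCN proposal preserves the standard Gaussian, the prior cancels in the acceptance ratio and only phi appears, which is the mechanism behind the dimension robustness discussed above.

    import numpy as np

    def whitened_pcn(phi, T, dim, beta=0.2, n_iter=5000, rng=None):
        # pCN MCMC in white-noise coordinates: phi is the negative
        # log-likelihood, T maps white noise xi to a prior draw u = T(xi).
        rng = np.random.default_rng() if rng is None else rng
        xi = rng.standard_normal(dim)        # current white-noise state
        phi_cur = phi(T(xi))
        samples = []
        for _ in range(n_iter):
            # Proposal preserves N(0, I), so the acceptance probability
            # is min(1, exp(phi_cur - phi_prop)), likelihood terms only.
            xi_prop = np.sqrt(1.0 - beta**2) * xi + beta * rng.standard_normal(dim)
            phi_prop = phi(T(xi_prop))
            if np.log(rng.uniform()) < phi_cur - phi_prop:
                xi, phi_cur = xi_prop, phi_prop
            samples.append(T(xi))
        return np.array(samples)

With T the identity this reduces to the standard pCN sampler for a Gaussian prior; pushing the white noise through, for example, a Besov or deep Gaussian construction gives a non-Gaussian case, and hyperparameters entering T can be updated jointly with xi, in the spirit of the non-centred parameterisation referred to in the abstract.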


