Hyperparameter Estimation in Bayesian MAP Estimation: Parameterizations and Consistency

05/10/2019
by Matthew M. Dunlop, et al.

The Bayesian formulation of inverse problems is attractive for three primary reasons: it provides a clear modelling framework; a means for uncertainty quantification; and principled learning of hyperparameters. The posterior distribution may be explored by sampling methods, but for many problems this is computationally infeasible. In such situations maximum a posteriori (MAP) estimators are often sought instead. Whilst these are relatively cheap to compute and admit an attractive variational formulation, a key drawback is their lack of invariance under changes of parameterization. This is a particularly significant issue when hierarchical priors are employed to learn hyperparameters. In this paper we study the effect of the choice of parameterization on MAP estimators when a conditionally Gaussian hierarchical prior distribution is employed. Specifically, we consider the centred parameterization, the natural parameterization in which the unknown state is solved for directly, and the noncentred parameterization, which works with a whitened Gaussian as the unknown state variable and arises when considering dimension-robust MCMC algorithms; MAP estimation is well-defined in the nonparametric setting only for the noncentred parameterization. However, we show that MAP estimates based on the noncentred parameterization are not consistent as estimators of hyperparameters; conversely, we show that limits of finite-dimensional centred MAP estimators are consistent as the dimension tends to infinity. We also consider empirical Bayesian hyperparameter estimation, show consistency of these estimates, and demonstrate that they are more robust with respect to noise than centred MAP estimates. An underpinning concept throughout is that hyperparameters may only be recovered up to measure equivalence, a well-known phenomenon in the context of the Ornstein-Uhlenbeck process.
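The centred/noncentred distinction described in the abstract can be illustrated with a small finite-dimensional sketch. The setup below is hypothetical and not taken from the paper: a conditionally Gaussian prior u | θ ~ N(0, θ²K) with a fixed correlation kernel K, a Gaussian likelihood, and placeholder data. The point it shows is structural: in the centred parameterization the prior term of the negative log posterior carries a θ-dependent log-determinant contribution, while in the noncentred parameterization (u = θLξ with ξ whitened) that contribution disappears.

```python
import numpy as np

# Hypothetical finite-dimensional example (not the paper's model):
# prior u | theta ~ N(0, theta^2 * K), likelihood y = u + N(0, sigma^2 I).
rng = np.random.default_rng(0)
n = 50
x = np.linspace(0.0, 1.0, n)
K = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)  # fixed correlation kernel
L = np.linalg.cholesky(K + 1e-10 * np.eye(n))       # K = L @ L.T (with jitter)
sigma = 0.1                                         # assumed noise std
y = rng.standard_normal(n)                          # placeholder data

def neg_log_post_centred(u, theta):
    # Centred parameterization: unknowns are (u, theta).
    # The prior term includes the log-determinant of theta^2 * K,
    # which contributes n * log(theta) (up to a theta-free constant).
    xi = np.linalg.solve(L, u) / theta              # whitened state
    misfit = 0.5 * np.sum((y - u) ** 2) / sigma**2
    prior = 0.5 * np.sum(xi ** 2) + n * np.log(theta)
    return misfit + prior

def neg_log_post_noncentred(xi, theta):
    # Noncentred parameterization: unknowns are (xi, theta), with
    # u = theta * L @ xi and xi ~ N(0, I); the prior term on xi is
    # independent of theta (no log-determinant contribution).
    u = theta * (L @ xi)
    misfit = 0.5 * np.sum((y - u) ** 2) / sigma**2
    prior = 0.5 * np.sum(xi ** 2)
    return misfit + prior
```

For the same state (u = θLξ) the two objectives differ by exactly n·log θ, the log-determinant term. This θ-dependence is what survives (and blows up) as n → ∞ in the centred form, consistent with the abstract's observation that only the noncentred MAP problem is well-defined in the nonparametric setting, while the missing term is also why noncentred MAP estimates fail to be consistent for the hyperparameter.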

