Iterative Construction of Gaussian Process Surrogate Models for Bayesian Inference

by Leen Alawieh, et al.

A new algorithm is developed to tackle the problem of sampling the non-Gaussian model-parameter posterior probability distributions that arise from solutions to Bayesian inverse problems. The algorithm aims to mitigate some of the hurdles faced by traditional Markov chain Monte Carlo (MCMC) samplers by constructing proposal probability densities that are both easy to sample and a better approximation to the target density than a simple Gaussian proposal distribution would be. To achieve this, a Gaussian proposal distribution is augmented with a Gaussian process (GP) surface that helps capture non-linearities in the log-likelihood function. To train the GP surface, an iterative approach is adopted for the optimal selection of points in parameter space. Optimality is sought by maximizing the information gain of the GP surface using a minimum number of forward model simulation runs. The accuracy of the GP-augmented surface approximation is assessed in two ways. The first compares predictions obtained from the approximate surface with those obtained by running the actual simulation model at hold-out points in parameter space. The second is a measure based on the relative variance of sample weights obtained from sampling the approximate posterior probability distribution of the model parameters. The efficacy of this new algorithm is tested on inferring reaction rate parameters in 3-node and 6-node network toy problems that imitate idealized reaction networks in combustion applications.
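The ingredients described above can be sketched in a few lines of code. The following is a minimal, hypothetical 1-D illustration, not the authors' implementation: a zero-mean RBF-kernel GP learns the non-Gaussian residual of a toy log-likelihood on top of a Gaussian proposal, training points are added where the GP predictive variance is largest (a simple stand-in for the information-gain criterion), and the relative quality of the surrogate is checked via the effective sample size of importance weights, which is one way to monitor the relative variance of sample weights mentioned above. All function names and the toy log-likelihood are invented for illustration.

```python
import numpy as np

def rbf_kernel(A, B, ell=1.0, sig=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sig ** 2 * np.exp(-0.5 * d2 / ell ** 2)

class SimpleGP:
    """Minimal zero-mean GP regressor; hyperparameters are fixed, not optimized."""
    def __init__(self, ell=1.0, sig=1.0, noise=1e-8):
        self.ell, self.sig, self.noise = ell, sig, noise
    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.ell, self.sig) + self.noise * np.eye(len(X))
        self.L = np.linalg.cholesky(K)
        self.alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, y))
        return self
    def predict(self, Xs):
        Ks = rbf_kernel(Xs, self.X, self.ell, self.sig)
        mean = Ks @ self.alpha
        v = np.linalg.solve(self.L, Ks.T)
        var = self.sig ** 2 - (v ** 2).sum(0)
        return mean, np.maximum(var, 0.0)

def effective_sample_size(log_w):
    """ESS of self-normalized importance weights; values near len(log_w) are good."""
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

# Toy 1-D log-likelihood: a Gaussian part plus a smooth non-linearity.
def true_loglik(x):
    return -0.5 * x ** 2 + 0.5 * np.sin(2.0 * x)

gauss_part = lambda x: -0.5 * x ** 2              # captured by the Gaussian proposal
residual = lambda x: true_loglik(x) - gauss_part(x)  # what the GP must learn

# Iterative design: start from a coarse grid, then repeatedly add the candidate
# point where the GP is most uncertain (a crude information-gain proxy).
X_train = np.linspace(-3.0, 3.0, 5)[:, None]
candidates = np.linspace(-3.0, 3.0, 61)[:, None]
for _ in range(10):
    gp = SimpleGP(ell=1.0).fit(X_train, residual(X_train[:, 0]))
    _, var = gp.predict(candidates)
    X_train = np.vstack([X_train, candidates[np.argmax(var)]])
gp = SimpleGP(ell=1.0).fit(X_train, residual(X_train[:, 0]))

# Diagnostic: importance weights of the GP-augmented surrogate against the truth.
rng = np.random.default_rng(0)
xs = rng.standard_normal(2000)
approx_loglik = gauss_part(xs) + gp.predict(xs[:, None])[0]
log_w = true_loglik(xs) - approx_loglik
print(effective_sample_size(log_w) / len(xs))  # approaches 1 as the surrogate improves
```

A flat or slowly decaying ESS ratio as samples accumulate signals that the GP surface has captured the non-Gaussian structure; a collapsing ratio signals that more (or better-placed) training points are needed.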


Related papers:

- Approximate Bayesian inference from noisy likelihoods with Gaussian process emulated MCMC
- Adaptive Gaussian process approximation for Bayesian inference with expensive likelihood functions
- Accelerating ABC methods using Gaussian processes
- Approximate Sampling using an Accelerated Metropolis-Hastings based on Bayesian Optimization and Gaussian Processes
- A Bayesian Approach To Graph Partitioning
- Bayesian learning of orthogonal embeddings for multi-fidelity Gaussian Processes
