Importance Sampling with the Integrated Nested Laplace Approximation

03/03/2021
by Martin Outzen Berild, et al.

The Integrated Nested Laplace Approximation (INLA) is a deterministic approach to Bayesian inference on latent Gaussian models (LGMs), focused on fast and accurate approximation of the posterior marginals of the model parameters. Recently, methods have been developed to extend this class to models that can be expressed as conditional LGMs by fixing some of the parameters at descriptive values; these methods differ in how the descriptive values are chosen. This paper proposes combining importance sampling with INLA (IS-INLA) and extends this approach with the more robust adaptive multiple importance sampling algorithm combined with INLA (AMIS-INLA). The proposed approaches are compared with existing methods on a series of applications with simulated and observed datasets, and their performance is evaluated in terms of accuracy, efficiency, and robustness. The approaches are first validated against exact posteriors in a simple bivariate linear model; they are then applied to a Bayesian lasso model, Bayesian imputation of missing covariate values, and, lastly, parametric Bayesian quantile regression. The applications show that the AMIS-INLA approach generally outperforms the other methods, while the IS-INLA algorithm can be considered for faster inference when good proposals are available.
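To illustrate the core idea, the sketch below is a minimal, self-contained Python implementation of the importance-sampling step that IS-INLA builds on: conditioning parameters are drawn from a proposal, a conditional model is fitted for each draw, and the resulting conditional marginal likelihoods define the importance weights. Everything here is an illustrative assumption rather than the paper's actual implementation: the toy linear model, the prior and proposal choices, and the closed-form `log_marginal_likelihood` function, which stands in for the conditional fit that would normally be produced by R-INLA.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# --- Toy data: y = b0 + b1 * x + noise (illustrative, not from the paper) ---
n = 50
x = rng.normal(size=n)
b0_true, b1_true, sigma = 1.0, 2.0, 0.5
y = b0_true + b1_true * x + rng.normal(scale=sigma, size=n)

def log_marginal_likelihood(z):
    """Log p(y | b1 = z): the 'conditional fit' that INLA would provide.

    With b1 fixed at z, the residuals y - z * x follow a conjugate Gaussian
    model in b0 (prior b0 ~ N(0, 10^2), known noise sd), so the marginal
    likelihood is available in closed form for this toy example.
    """
    r = y - z * x
    prior_var, noise_var = 10.0**2, sigma**2
    cov = prior_var * np.ones((n, n)) + noise_var * np.eye(n)
    return stats.multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(r)

# --- Importance sampling over the conditioning parameter z = b1 ---
M = 2000
proposal = stats.norm(loc=0.0, scale=5.0)   # q(z), assumed proposal
prior = stats.norm(loc=0.0, scale=10.0)     # p(z), assumed prior on b1

z = proposal.rvs(size=M, random_state=rng)
# log weights: log p(y | z) + log p(z) - log q(z)
log_w = np.array([log_marginal_likelihood(zi) for zi in z]) \
        + prior.logpdf(z) - proposal.logpdf(z)
w = np.exp(log_w - log_w.max())
w /= w.sum()                                 # self-normalised weights

post_mean = np.sum(w * z)
post_sd = np.sqrt(np.sum(w * (z - post_mean) ** 2))
ess = 1.0 / np.sum(w ** 2)                   # effective sample size
print(f"E[b1 | y] approx {post_mean:.3f}, sd approx {post_sd:.3f}, ESS approx {ess:.0f}")
```

In the adaptive variant described in the paper, the proposal q(z) would not stay fixed as above; loosely, AMIS-INLA refines the proposal over several iterations using the weighted samples, which makes the procedure less sensitive to a poor initial choice of q(z).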


