Fast, Scalable Approximations to Posterior Distributions in Extended Latent Gaussian Models

03/12/2021 · by Alex Stringer et al.

We define a novel class of additive models called Extended Latent Gaussian Models and develop a fast, scalable approximate Bayesian inference methodology for this class. The new class covers a wide range of interesting models, and the new methodology is better suited to large samples than existing approaches. We discuss convergence theory for our posterior approximations. We then illustrate the computational aspects of our approach through a comparison to existing methods, and demonstrate its application in three challenging examples: the analysis of aggregated spatial point process data, the fitting of a Cox proportional hazards model with partial likelihood and a latent spatial point process, and an astrophysical model for estimating the mass of the Milky Way in the presence of multivariate measurement uncertainties. Computations make use of the publicly available aghq package in the R language, and code for the examples in the paper is available from https://github.com/awstringer1/elgm-paper-code.
