Scalable computation for Bayesian hierarchical models

03/19/2021
by Omiros Papaspiliopoulos, et al.

This article concerns algorithms for learning Bayesian hierarchical models whose computational complexity scales linearly with the number of observations and the number of parameters in the model. It focuses on crossed random effects and nested multilevel models, which are used ubiquitously in the applied sciences, and illustrates the methodology on two challenging real-data analyses: predicting electoral results and real-estate prices. The posterior dependence in both classes is sparse: in crossed random effects models it resembles a random graph, whereas in nested multilevel models it is tree-structured. For each class we develop a framework for scalable computation, based on collapsed Gibbs sampling and belief propagation respectively. We provide a number of negative (for crossed) and positive (for nested) results on the scalability, or lack thereof, of methods based on sparse linear algebra; these results are also relevant to Laplace approximation methods for such models. Our numerical experiments compare against off-the-shelf variational approximations and Hamiltonian Monte Carlo. Our theoretical results, although partial, are useful in suggesting interesting methodologies and lead to conclusions that our numerics suggest hold well beyond the scope of the underlying assumptions.
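To make the setting concrete, the sketch below is a minimal vanilla Gibbs sampler for a two-factor Gaussian crossed random effects model with known precisions. It is an illustration of the model class only, not the collapsed sampler developed in the paper; all variable names, the synthetic data, and the fixed hyperparameters (`tau_a`, `tau_b`, `tau_e`) are assumptions of this sketch. Note that each full sweep costs O(N + I + J), i.e. linear in data and parameters, which is the scaling regime the paper studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic crossed design: each observation n is indexed by a row
# factor g1(n) and a column factor g2(n), y_n = a_{g1(n)} + b_{g2(n)} + noise.
I, J, N = 10, 8, 200
rows = rng.integers(0, I, N)
cols = rng.integers(0, J, N)
a_true = rng.normal(0.0, 1.0, I)
b_true = rng.normal(0.0, 1.0, J)
y = a_true[rows] + b_true[cols] + rng.normal(0.0, 0.5, N)

# Precisions assumed known for this sketch (prior and noise).
tau_a, tau_b, tau_e = 1.0, 1.0, 4.0

a = np.zeros(I)
b = np.zeros(J)
n_iter = 2000
a_draws = np.zeros((n_iter, I))

for t in range(n_iter):
    # Block-update all row effects given the column effects:
    # conjugate Gaussian full conditionals.
    resid = y - b[cols]
    for i in range(I):
        mask = rows == i
        prec = tau_a + tau_e * mask.sum()
        mean = tau_e * resid[mask].sum() / prec
        a[i] = rng.normal(mean, 1.0 / np.sqrt(prec))
    # Block-update all column effects given the row effects.
    resid = y - a[rows]
    for j in range(J):
        mask = cols == j
        prec = tau_b + tau_e * mask.sum()
        mean = tau_e * resid[mask].sum() / prec
        b[j] = rng.normal(mean, 1.0 / np.sqrt(prec))
    a_draws[t] = a

# Posterior mean of the row effects, discarding a burn-in half.
a_hat = a_draws[n_iter // 2:].mean(axis=0)
```

The paper's point of departure is that this plain alternating scheme can mix poorly on crossed designs, motivating the collapsed updates analysed there.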


