Locking and Quacking: Stacking Bayesian model predictions by log-pooling and superposition

05/12/2023
by   Yuling Yao, et al.

Combining predictions from different models is a central problem in Bayesian inference and machine learning more broadly. Currently, these predictive distributions are almost exclusively combined using linear mixtures such as Bayesian model averaging, Bayesian stacking, and mixture of experts. Such linear mixtures impose idiosyncrasies that might be undesirable for some applications, such as multi-modality. While there exist alternative strategies (e.g. geometric bridge or superposition), optimising their parameters usually involves computing an intractable normalising constant repeatedly. We present two novel Bayesian model combination tools. These are generalisations of model stacking, but combine posterior densities by log-linear pooling (locking) and quantum superposition (quacking). To optimise model weights while avoiding the burden of normalising constants, we investigate the Hyvärinen score of the combined posterior predictions. We demonstrate locking with an illustrative example and discuss its practical application with importance sampling.
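To make the contrast with linear mixtures concrete, here is a minimal numerical sketch of log-linear pooling ("locking") for two Gaussian predictive densities. The component densities, weights, and grid are illustrative assumptions, and the normalising constant is handled by simple grid quadrature rather than the Hyvärinen-score approach the paper proposes:

```python
import numpy as np

def gauss_logpdf(x, mu, sigma):
    """Log-density of a Normal(mu, sigma) evaluated on x."""
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def log_pool(log_dens_list, weights, grid):
    """Log-linear pooling: the combined log-density is the weighted sum
    of component log-densities, renormalised numerically over the grid."""
    log_p = sum(w * ld for w, ld in zip(weights, log_dens_list))
    log_p -= np.max(log_p)          # stabilise before exponentiating
    p = np.exp(log_p)
    p /= np.trapz(p, grid)          # trapezoidal-rule normalisation
    return p

grid = np.linspace(-10.0, 10.0, 2001)
log_p1 = gauss_logpdf(grid, -2.0, 1.0)   # illustrative model 1
log_p2 = gauss_logpdf(grid, 3.0, 1.5)    # illustrative model 2

pooled = log_pool([log_p1, log_p2], weights=[0.6, 0.4], grid=grid)
```

With these two well-separated components, a linear mixture would be bimodal, whereas the log-pooled density stays unimodal (for Gaussians it is again Gaussian, with precision-weighted mean), which is exactly the kind of qualitative difference the abstract alludes to.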


research
10/08/2019

Distilling importance sampling

The two main approaches to Bayesian inference are sampling and optimisat...
research
02/01/2022

Bayesian inference and prediction for mean-mixtures of normal distributions

We study frequentist risk properties of predictive density estimators fo...
research
01/22/2021

Bayesian hierarchical stacking

Stacking is a widely used model averaging technique that yields asymptot...
research
03/03/2021

Importance Sampling with the Integrated Nested Laplace Approximation

The Integrated Nested Laplace Approximation (INLA) is a deterministic ap...
research
11/01/2021

Swift sky localization of gravitational waves using deep learning seeded importance sampling

Fast, highly accurate, and reliable inference of the sky origin of gravi...
research
07/01/2021

q-Paths: Generalizing the Geometric Annealing Path using Power Means

Many common machine learning methods involve the geometric annealing pat...
research
05/14/2021

Posterior Regularisation on Bayesian Hierarchical Mixture Clustering

We study a recent inferential framework, named posterior regularisation,...
