Guaranteed Accuracy of Semi-Modular Posteriors

01/26/2023
by David T. Frazier, et al.

Bayesian inference has widely acknowledged advantages in many problems, but it can also be unreliable when the model is misspecified. Bayesian modular inference is concerned with complex models which have been specified through a collection of coupled submodels, and is useful when there is misspecification in some of the submodels. The submodels are often called modules in the literature. Cutting feedback is a widely used Bayesian modular inference method which ensures that information from suspect model components is not used in making inferences about parameters in correctly specified modules. However, it may be hard to decide in what circumstances this “cut posterior” is preferred to the exact posterior. When misspecification is not severe, cutting feedback may increase the uncertainty in Bayesian posterior inference greatly without reducing estimation bias substantially. This motivates semi-modular inference methods, which avoid the binary cut of cutting feedback approaches. In this work, we precisely formalize the bias-variance trade-off involved in semi-modular inference for the first time in the literature, using a framework of local model misspecification. We then implement a mixture-based semi-modular inference approach, demonstrating theoretically that it delivers inferences that are more accurate, in terms of a user-defined loss function, than either the cut or full posterior on its own. The new method is demonstrated in a number of applications.
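The mixture idea described above can be illustrated with a toy sketch. Here the cut and full posteriors are stood in for by Gaussians (in practice both would come from MCMC output of the respective modular analyses), and the semi-modular posterior is formed as a mixture of the two with a user-chosen weight; the specific function name, the Gaussian stand-ins, and the weight value are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D stand-ins: the "full" posterior uses all modules,
# including a possibly misspecified one (here: biased but concentrated),
# while the "cut" posterior blocks feedback from the suspect module
# (here: unbiased but more diffuse).
full_samples = rng.normal(loc=0.5, scale=0.2, size=10_000)
cut_samples = rng.normal(loc=0.0, scale=0.5, size=10_000)

def semi_modular_samples(cut, full, gamma, rng):
    """Draw from the mixture gamma * cut + (1 - gamma) * full.

    gamma = 1 recovers the cut posterior, gamma = 0 the full posterior;
    intermediate values trade estimation bias against posterior variance.
    """
    use_cut = rng.random(len(cut)) < gamma
    return np.where(use_cut, cut, full)

mixed = semi_modular_samples(cut_samples, full_samples, gamma=0.5, rng=rng)
print(mixed.mean(), mixed.std())
```

With `gamma = 0.5` the mixture mean sits between the cut posterior's (unbiased) location and the full posterior's (biased) one, which is the bias-variance trade-off the paper formalizes; choosing `gamma` to minimize a user-defined loss is the substantive problem the method addresses.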


research
03/18/2022

Modularized Bayesian analyses and cutting feedback in likelihood-free inference

There has been much recent interest in modifying Bayesian inference for ...
research
03/15/2020

Semi-Modular Inference: enhanced learning in multi-modular models by tempering the influence of components

Bayesian statistical inference loses predictive optimality when generati...
research
04/01/2022

Scalable Semi-Modular Inference with Variational Meta-Posteriors

The Cut posterior and related Semi-Modular Inference are Generalised Bay...
research
01/24/2022

Valid belief updates for prequentially additive loss functions arising in Semi-Modular Inference

Model-based Bayesian evidence combination leads to models with multiple ...
research
06/02/2020

Stochastic Approximation Cut Algorithm for Inference in Modularized Bayesian Models

Bayesian modelling enables us to accommodate complex forms of data and m...
research
10/21/2021

Asymptotics of cut distributions and robust modular inference using Posterior Bootstrap

Bayesian inference provides a framework to combine an arbitrary number o...
research
02/21/2022

Cutting feedback and modularized analyses in generalized Bayesian inference

Even in relatively simple settings, model misspecification can make the ...
