Generalized Bayesian Updating and the Loss-Likelihood Bootstrap

09/22/2017
by Simon Lyddon et al.

In this paper, we revisit the weighted likelihood bootstrap and show that it is well-motivated for Bayesian inference under misspecified models. We extend the underlying idea to a wider family of inferential problems. This allows us to calibrate an analogue of the likelihood function in situations where little is known about the data-generating mechanism. We demonstrate our method on a number of examples.
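To make the underlying mechanism concrete, the sketch below shows the re-weighted loss minimisation at the heart of the weighted likelihood / loss-likelihood bootstrap: each approximate posterior draw is the minimiser of a randomly weighted empirical loss. This is a minimal illustration assuming a generic user-supplied loss; the function name `loss_likelihood_bootstrap`, the Exp(1) weighting scheme, and the squared-error example are illustrative choices of ours, not code from the paper.

```python
import numpy as np
from scipy.optimize import minimize


def loss_likelihood_bootstrap(data, loss, theta0, n_draws=500, rng=None):
    """Approximate posterior sampling by repeatedly minimising a randomly
    re-weighted empirical loss (one optimisation per posterior draw)."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(data)
    draws = []
    for _ in range(n_draws):
        # Exp(1) weights are equivalent, up to scale, to flat Dirichlet weights.
        w = rng.exponential(scale=1.0, size=n)
        res = minimize(lambda theta: np.sum(w * loss(theta, data)), theta0)
        draws.append(res.x)
    return np.array(draws)


# Illustrative use: a scalar location parameter under squared-error loss.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=200)
sq_loss = lambda theta, data: (data - theta[0]) ** 2
posterior_draws = loss_likelihood_bootstrap(x, sq_loss, theta0=np.array([0.0]), rng=rng)
print(posterior_draws.mean(), posterior_draws.std())
```

When the loss is a negative log-likelihood, this reduces to the weighted likelihood bootstrap; the paper's calibration of the loss scale (the "analogue of the likelihood function") is not shown in this sketch.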
