Principled Bayesian Minimum Divergence Inference

02/26/2018
by Jack Jewson, et al.

When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker must concern themselves with the KL-divergence minimising parameter in order to maintain principled statistical practice (Walker, 2013). However, it has long been known that the KL-divergence places a large weight on correctly capturing the tails of the data generating process. As a result, traditional inference can be very non-robust. In this paper we advance recent methodological developments in general Bayesian updating (Bissiri, Holmes and Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of any statistical divergence. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker and Vidyashankar, 2014; Ghosh and Basu, 2016), for the first time allowing the principled Bayesian to target predictions from the model that are close to the data generating process in terms of some alternative divergence measure to the KL-divergence. We argue that defining this divergence measure forms an important, subjective part of any statistical analysis. We illustrate our method using a broad array of divergence measures. We then compare the performance of the different divergence measures for conducting simple inference tasks on both simulated and real data sets, and discuss how our methods might apply to more complicated, high-dimensional models.
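As a concrete, hypothetical illustration of the kind of update the abstract describes, the sketch below performs a grid-based general Bayesian update using the beta-divergence (density power divergence) loss associated with Ghosh and Basu (2016), for a normal location model with known scale. The data, prior, grid and tuning value beta = 0.5 are our own illustrative choices, not taken from the paper. As beta tends to 0 the loss reduces to the negative log-likelihood, recovering the standard KL-targeting Bayesian posterior; for beta > 0 the update downweights observations in the misspecified tail.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical sketch (not the paper's code): a grid-based general Bayesian
# update (Bissiri, Holmes and Walker, 2016) with the beta-divergence
# (density power divergence) loss, for a normal location model with
# known scale. All names and settings here are illustrative assumptions.

beta = 0.5      # divergence tuning parameter (beta -> 0 recovers the log-loss)
sigma = 1.0     # known model scale
rng = np.random.default_rng(0)

# Data with a misspecified tail: 5% gross outliers contaminate the sample.
x = np.concatenate([rng.normal(0.0, sigma, 95), rng.normal(8.0, sigma, 5)])

mu_grid = np.linspace(-2.0, 4.0, 2001)
dmu = mu_grid[1] - mu_grid[0]
prior = norm.pdf(mu_grid, loc=0.0, scale=10.0)   # vague normal prior on mu

# beta-divergence loss per observation:
#   -f(x; mu)^beta / beta + (1 / (1 + beta)) * int f(z; mu)^{1+beta} dz,
# where for a normal density the integral term has the closed form below.
int_term = (2.0 * np.pi * sigma**2) ** (-beta / 2.0) / np.sqrt(1.0 + beta)
f = norm.pdf(x[None, :], loc=mu_grid[:, None], scale=sigma)   # (grid, n)
loss = (-(f**beta) / beta + int_term / (1.0 + beta)).sum(axis=1)

# General Bayesian posterior: pi(mu | x) propto pi(mu) * exp(-loss(mu)).
post = prior * np.exp(-(loss - loss.min()))      # subtract min for stability
post /= post.sum() * dmu

print("generalised-Bayes posterior mean:", (mu_grid * post).sum() * dmu)
```

Run as written, the posterior concentrates near the uncontaminated centre rather than being dragged towards the outliers, which is the robustness behaviour the abstract contrasts with KL-targeting inference.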


