Generalised Bayes Updates with f-divergences through Probabilistic Classifiers

by Owen Thomas et al.

A stream of algorithmic advances has steadily increased the popularity of the Bayesian approach as an inference paradigm, from both theoretical and applied perspectives. Despite apparent successes in numerous application fields, a rising concern is the robustness of Bayesian inference under model misspecification, which may lead to undesirable extreme behavior of posterior distributions as the sample size grows. Generalized belief updating with a loss function is a central principle for making Bayesian inference more robust and less vulnerable to deviations from the assumed model. Here we consider such updates with f-divergences, which quantify the discrepancy between the assumed statistical model and the probability distribution that generated the observed data. Since the latter is generally unknown, estimating the divergence appears intractable. We show that the divergence becomes accessible through probabilistic classifiers, which can estimate the ratio of two probability distributions even when one or both are known only through samples. We demonstrate the behavior of generalized belief updates for several specific choices within the f-divergence family, and show that for certain divergence functions this approach can even improve on methods that evaluate the correct model likelihood function analytically.
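The key computational idea — that a probabilistic classifier gives access to an otherwise intractable density ratio, and hence to an f-divergence — can be illustrated with a minimal sketch. The example below is not the authors' implementation: it assumes a one-dimensional Gaussian toy setting, hand-rolled logistic regression in NumPy, and the KL divergence as the chosen member of the f-divergence family. A classifier trained to separate data samples from model samples has log-odds that approximate the log density ratio log p_data(x)/p_model(x); averaging this log-ratio over data samples estimates KL(p_data || p_model).

```python
import numpy as np

rng = np.random.default_rng(0)


def estimate_kl_via_classifier(x_data, x_model, iters=3000, lr=0.1):
    """Estimate KL(p_data || p_model), an f-divergence, from samples alone.

    Trains a logistic classifier to distinguish the two sample sets; with
    balanced classes its logit approximates log p_data(x)/p_model(x).
    """
    # Label data samples 1 and model samples 0. Features (1, x, x^2)
    # suffice here because the log-ratio of two Gaussians is quadratic.
    x = np.concatenate([x_data, x_model])
    y = np.concatenate([np.ones(len(x_data)), np.zeros(len(x_model))])
    X = np.column_stack([np.ones_like(x), x, x**2])
    w = np.zeros(3)
    for _ in range(iters):  # plain gradient ascent on the log-likelihood
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    # Average the estimated log density ratio over the *data* samples.
    return (X[: len(x_data)] @ w).mean()


# Data generated by N(1, 1); misspecified model N(0, 1).
# The true KL divergence is 0.5 * (1 - 0)^2 = 0.5.
x_data = rng.normal(1.0, 1.0, 5000)
x_model = rng.normal(0.0, 1.0, 5000)
kl_hat = estimate_kl_via_classifier(x_data, x_model)
```

In a generalized Bayes update, a divergence estimate of this kind (computed per candidate parameter value, with model samples drawn from that parameter) would enter the belief update in place of the log-likelihood, e.g. weighting the prior by exp(-w · D_f).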




Related papers

- Diagnosing model misspecification and performing generalized Bayes' updates via probabilistic classifiers
- Robust Generalised Bayesian Inference for Intractable Likelihoods
- Computing Bayes: Bayesian Computation from 1763 to the 21st Century
- Bayes posterior convergence for loss functions via almost additive Thermodynamic Formalism
- A path integral approach to Bayesian inference in Markov processes
- General Bayesian Loss Function Selection and the use of Improper Models
- On the Stability of General Bayesian Inference
