On the Robustness to Misspecification of α-Posteriors and Their Variational Approximations

04/16/2021
by   Marco Avella-Medina, et al.

α-posteriors and their variational approximations distort standard posterior inference by downweighting the likelihood and introducing variational approximation errors. We show that such distortions, if tuned appropriately, reduce the Kullback-Leibler (KL) divergence from the true, but perhaps infeasible, posterior distribution when there is potential parametric model misspecification. To make this point, we derive a Bernstein-von Mises theorem showing convergence in total variation distance of α-posteriors and their variational approximations to limiting Gaussian distributions. We use these distributions to evaluate the KL divergence between true and reported posteriors. We show this divergence is minimized by choosing α strictly smaller than one, assuming there is a vanishingly small probability of model misspecification. The optimized value becomes smaller as the misspecification becomes more severe. The optimized KL divergence increases logarithmically in the degree of misspecification and not linearly as with the usual posterior.
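The mechanism can be sketched in a hypothetical toy case (not the paper's general setting): a Gaussian location model whose assumed sampling variance `sigma2_model` understates the actual variance `sigma2_true`. In the Bernstein-von Mises limit with a flat prior and common centering, the α-posterior has variance `sigma2_model / (alpha * n)` while the infeasible "true" posterior has variance `sigma2_true / n`; scanning α over a grid then recovers a KL-minimizing α strictly below one, equal here to the analytic value `sigma2_model / sigma2_true`. All numbers are assumptions chosen for illustration.

```python
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    # KL( N(mu0, var0) || N(mu1, var1) ) for univariate Gaussians
    return 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1
                  - 1.0 + np.log(var1 / var0))

n = 100
sigma2_model = 1.0   # variance assumed by the (misspecified) model
sigma2_true = 4.0    # actual sampling variance (illustrative misspecification)

# Limiting Gaussians (flat prior, common centering at the sample mean):
#   alpha-posterior variance:  sigma2_model / (alpha * n)
#   "true" posterior variance: sigma2_true / n
alphas = np.linspace(0.05, 1.5, 400)
kls = np.array([
    kl_gauss(0.0, sigma2_true / n, 0.0, sigma2_model / (a * n))
    for a in alphas
])

alpha_star = alphas[np.argmin(kls)]
# In this toy case the minimizer is sigma2_model / sigma2_true = 0.25,
# strictly below one, and it shrinks as sigma2_true grows.
```

Tempering by α < 1 inflates the reported posterior's variance by 1/α, which in this toy case exactly compensates for the model's understated variance; more severe misspecification (larger `sigma2_true`) pushes the minimizer further below one, matching the abstract's claim.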

