An Easy to Interpret Diagnostic for Approximate Inference: Symmetric Divergence Over Simulations

02/25/2021
by   Justin Domke, et al.

It is important to estimate the errors of probabilistic inference algorithms. Existing diagnostics for Markov chain Monte Carlo methods assume inference is asymptotically exact, and are not appropriate for approximate methods like variational inference or Laplace's method. This paper introduces a diagnostic based on repeatedly simulating datasets from the prior and performing inference on each. The central observation is that it is possible to estimate a symmetric KL-divergence defined over these simulations.


Related research

05/19/2017  AIDE: An algorithm for measuring the accuracy of probabilistic inference algorithms
    Approximate probabilistic inference algorithms are central to many field...

05/10/2019  A Contrastive Divergence for Combining Variational Inference and MCMC
    We develop a method to combine Markov chain Monte Carlo (MCMC) and varia...

12/28/2018  Divergence Triangle for Joint Training of Generator Model, Energy-based Model, and Inference Model
    This paper proposes the divergence triangle as a framework for joint tra...

05/24/2023  Discriminative calibration
    To check the accuracy of Bayesian computations, it is common to use rank...

05/02/2018  Alpha-Beta Divergence For Variational Inference
    This paper introduces a variational approximation framework using direct...

03/05/2022  Recursive Monte Carlo and Variational Inference with Auxiliary Variables
    A key challenge in applying Monte Carlo and variational inference (VI) i...

12/16/2022  Estimating truncation effects of quantum bosonic systems using sampling algorithms
    To simulate bosons on a qubit- or qudit-based quantum computer, one has ...
