Note on the bias and variance of variational inference

06/09/2019
by Chin-Wei Huang et al.

In this note, we study the relationship between the variational gap and the variance of the (log) likelihood ratio. We show that the gap can be upper bounded by a dispersion measure of the likelihood ratio, which suggests that the bias of variational inference can be reduced by making the distribution of the likelihood ratio more concentrated, for example through averaging and variance reduction.
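
As a concrete illustration of that last point, here is a minimal sketch (using a conjugate Gaussian toy model and an importance-weighted average of likelihood ratios, which are assumptions for illustration and not the note's exact construction). Averaging K likelihood-ratio samples before taking the log concentrates the ratio, and the Jensen gap to the exact log-evidence shrinks accordingly:

```python
# Sketch: averaging likelihood ratios concentrates them and tightens the bound.
# Toy model (assumed for illustration): z ~ N(0,1), x|z ~ N(z,1), observe x = 2.
# Proposal q(z) = prior N(0,1), so the likelihood ratio is w(z) = p(x,z)/q(z) = p(x|z).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = 2.0
log_px = norm.logpdf(x, loc=0.0, scale=np.sqrt(2.0))  # exact log-evidence of the toy model

def averaged_bound(K, n_outer=20000):
    """Monte Carlo estimate of E[log (1/K) sum_k w(z_k)]; K = 1 recovers the ELBO."""
    z = rng.standard_normal((n_outer, K))          # z ~ q
    log_w = norm.logpdf(x, loc=z, scale=1.0)       # log likelihood ratio per sample
    log_avg_w = np.logaddexp.reduce(log_w, axis=1) - np.log(K)  # log of averaged ratio
    return log_avg_w.mean(), log_avg_w.std()

for K in (1, 10, 100):
    bound, spread = averaged_bound(K)
    print(f"K={K:3d}  bound={bound:+.4f}  gap={log_px - bound:.4f}  "
          f"std of averaged log-ratio={spread:.4f}")
print(f"exact log p(x) = {log_px:+.4f}")
```

Running this, the gap between the bound and the exact log-evidence decreases as K grows, alongside the standard deviation of the averaged log-ratio, matching the note's suggestion that a more concentrated likelihood ratio yields a smaller variational bias.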


