f-Divergence Variational Inference

by   Neng Wan, et al.

This paper introduces f-divergence variational inference (f-VI), which generalizes variational inference to the entire family of f-divergences. Starting from the minimization of a surrogate f-divergence that is statistically consistent with the original f-divergence, the f-VI framework not only unifies a number of existing VI methods, e.g. Kullback-Leibler VI, Rényi's α-VI, and χ-VI, but also offers a standardized toolkit for VI under arbitrary divergences from the f-divergence family. A general f-variational bound is derived, which provides a sandwich estimate of the marginal likelihood (or evidence). The development of f-VI unfolds with a stochastic optimization scheme that utilizes the reparameterization trick, importance weighting, and Monte Carlo approximation; a mean-field approximation scheme that generalizes the well-known coordinate ascent variational inference (CAVI) is also proposed for f-VI. Empirical examples, including variational autoencoders and Bayesian neural networks, demonstrate the effectiveness and wide applicability of f-VI.
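The sandwich estimate mentioned in the abstract can be illustrated on a toy problem. The sketch below is a hypothetical 1-D example, not the paper's algorithm: it uses the reparameterization trick and importance weights to compute two classical members of the family the paper unifies — the KL-based lower bound (ELBO, the f-VI special case for Kullback-Leibler VI) and a χ²-based upper bound (the χ-VI special case) — which together sandwich the true log-evidence. The target `p`, the variational parameters `mu` and `sigma`, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): the target is known only up
# to its normalizing constant, p(z) ∝ N(1, 0.5^2), and the variational
# family is a Gaussian q(z) = N(mu, sigma^2).
def log_p_unnorm(z):
    return -0.5 * ((z - 1.0) / 0.5) ** 2                # log p(z) up to log Z

def log_q(z, mu, sigma):
    return (-0.5 * ((z - mu) / sigma) ** 2
            - np.log(sigma) - 0.5 * np.log(2.0 * np.pi))

mu, sigma, n = 0.8, 0.6, 200_000
z = mu + sigma * rng.standard_normal(n)                 # reparameterization trick
log_w = log_p_unnorm(z) - log_q(z, mu, sigma)           # log importance weights

elbo = np.mean(log_w)                                   # KL lower bound on log Z
cubo = 0.5 * np.log(np.mean(np.exp(2.0 * log_w)))       # chi^2 upper bound on log Z

# Analytic evidence for this toy target: Z = 0.5 * sqrt(2*pi)
log_z = np.log(0.5 * np.sqrt(2.0 * np.pi))
print(elbo <= log_z <= cubo)                            # the sandwich holds
```

As the variational parameters approach the target (here, `mu → 1`, `sigma → 0.5`), both bounds tighten toward `log_z`; the gap between them gives a computable handle on the quality of the approximation.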


Rényi Divergence Variational Inference

This paper introduces the variational Rényi bound (VR) that extends trad...

Fast Variational Inference in the Conjugate Exponential Family

We present a general method for deriving collapsed variational inference...

Alpha-Beta Divergence For Variational Inference

This paper introduces a variational approximation framework using direct...

Variational Inference MPC using Tsallis Divergence

In this paper, we provide a generalized framework for Variational Infere...

Distribution Matching in Variational Inference

The difficulties in matching the latent posterior to the prior, balancin...

Bayesian Neural Networks With Maximum Mean Discrepancy Regularization

Bayesian Neural Networks (BNNs) are trained to optimize an entire distri...

Variational Inference with Holder Bounds

The recent introduction of thermodynamic integration techniques has prov...