
Quantized Variational Inference

11/04/2020
by Amir Dib, et al.

We present Quantized Variational Inference, a new algorithm for Evidence Lower Bound (ELBO) maximization. We show how Optimal Voronoi Tessellation produces variance-free gradients for ELBO optimization, at the cost of introducing an asymptotically decaying bias. We then propose a Richardson-extrapolation-type method to improve the asymptotic bound. We show that the Quantized Variational Inference framework leads to fast convergence for both the score-function and reparameterized gradient estimators at comparable computational cost. Finally, we present several experiments assessing the performance of our method and its limitations.
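The core idea can be illustrated with a short, self-contained sketch (not the authors' code): replace the Monte Carlo samples in a reparameterized ELBO estimate with a fixed, weighted quantization grid of the base distribution, so the estimate becomes deterministic but slightly biased, and then combine two grid sizes in a Richardson-style extrapolation. The one-dimensional Gaussian variational family, the Lloyd's-algorithm quantizer, the names gaussian_quantizer / elbo_quantized / elbo_richardson, and the specific extrapolation weights are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np


def gaussian_quantizer(n_points, n_iters=50, n_samples=100_000, seed=0):
    """Approximate an optimal (Voronoi) quantizer of N(0, 1) with Lloyd's
    algorithm, returning code points c_k and the probability w_k of each cell."""
    rng = np.random.default_rng(seed)
    samples = rng.standard_normal(n_samples)
    # Initialize code points at evenly spaced quantiles of the samples.
    points = np.quantile(samples, (np.arange(n_points) + 0.5) / n_points)
    for _ in range(n_iters):
        cells = np.argmin(np.abs(samples[:, None] - points[None, :]), axis=1)
        for k in range(n_points):
            members = samples[cells == k]
            if members.size:
                points[k] = members.mean()
    cells = np.argmin(np.abs(samples[:, None] - points[None, :]), axis=1)
    weights = np.bincount(cells, minlength=n_points) / n_samples
    return points, weights


def elbo_quantized(mu, log_sigma, log_joint, points, weights):
    """Deterministic ELBO estimate for q(z) = N(mu, sigma^2): the expectation
    over eps ~ N(0, 1) is replaced by a weighted sum over the quantization grid."""
    sigma = np.exp(log_sigma)
    z = mu + sigma * points  # reparameterization applied to the grid points
    entropy = log_sigma + 0.5 * np.log(2.0 * np.pi * np.e)
    return float(np.sum(weights * log_joint(z)) + entropy)


def elbo_richardson(mu, log_sigma, log_joint, coarse, fine):
    """Richardson-style extrapolation with illustrative weights, assuming the
    leading bias shrinks roughly by half when the grid size doubles."""
    e_coarse = elbo_quantized(mu, log_sigma, log_joint, *coarse)
    e_fine = elbo_quantized(mu, log_sigma, log_joint, *fine)
    return 2.0 * e_fine - e_coarse


if __name__ == "__main__":
    def log_joint(z):
        # Unnormalized log-density of a toy target, N(2, 0.5^2).
        return -0.5 * ((z - 2.0) / 0.5) ** 2

    coarse = gaussian_quantizer(8)
    fine = gaussian_quantizer(16)
    print("quantized ELBO:   ", elbo_quantized(0.0, 0.0, log_joint, *fine))
    print("extrapolated ELBO:", elbo_richardson(0.0, 0.0, log_joint, coarse, fine))
```

Because the grid and weights are fixed, repeated evaluations of elbo_quantized return the same value (zero gradient variance); the price is a bias that decays as the number of code points grows, which the extrapolated estimate partially cancels.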


Related research

06/22/2012  Fast Variational Inference in the Conjugate Exponential Family
We present a general method for deriving collapsed variational inference...

03/27/2017  Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference
We propose a simple and general variant of the standard reparameterized ...

01/15/2021  Efficient Semi-Implicit Variational Inference
In this paper, we propose CI-VI an efficient and scalable solver for sem...

06/12/2020  Approximate Inference for Spectral Mixture Kernel
A spectral mixture (SM) kernel is a flexible kernel used to model any st...

09/12/2018  Discretely Relaxing Continuous Variables for tractable Variational Inference
We explore a new research direction in Bayesian variational inference wi...

06/23/2020  Variational Orthogonal Features
Sparse stochastic variational inference allows Gaussian process models t...

10/30/2018  Using Large Ensembles of Control Variates for Variational Inference
Variational inference is increasingly being addressed with stochastic op...