
Quantized Variational Inference

by   Amir Dib, et al.

We present Quantized Variational Inference, a new algorithm for Evidence Lower Bound (ELBO) maximization. We show how Optimal Voronoi Tessellation produces variance-free gradients for ELBO optimization, at the cost of introducing an asymptotically decaying bias. We then propose a Richardson-extrapolation-type method to improve the asymptotic bound. We show that the Quantized Variational Inference framework leads to fast convergence for both the score-function and the reparameterized gradient estimators at comparable computational cost. Finally, we present several experiments assessing the performance of our method and its limitations.
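The idea in the abstract can be sketched concretely: replace Monte Carlo samples in the reparameterized ELBO gradient with a fixed, weighted set of quantization points for the base distribution, making the gradient deterministic (variance-free) but biased for a finite number of points. The sketch below is illustrative only: it uses Gauss-Hermite points as a stand-in for the paper's optimal Voronoi quantizer, and a toy Gaussian target whose exact posterior is N(2, 1); these choices are assumptions, not the paper's construction.

```python
import numpy as np

def log_joint(z):
    # Toy unnormalized log target, log p(z) up to a constant; posterior is N(2, 1).
    return -0.5 * (z - 2.0) ** 2

def quantized_elbo_grad(mu, log_sigma, n_points=8):
    """Deterministic ('variance-free') reparameterized ELBO gradient.

    Uses Gauss-Hermite points/weights for N(0, 1) as a stand-in quantizer
    (an assumption; the paper builds an Optimal Voronoi Tessellation instead).
    """
    sigma = np.exp(log_sigma)
    # Points/weights for the weight function exp(-x^2 / 2); normalize weights to sum to 1.
    eps, w = np.polynomial.hermite_e.hermegauss(n_points)
    w = w / np.sqrt(2.0 * np.pi)
    z = mu + sigma * eps                  # reparameterization z = mu + sigma * eps
    dlogp = -(z - 2.0)                    # d/dz log_joint(z)
    # Weighted sums replace Monte Carlo averages; the entropy of q is
    # log(sigma) + const, contributing +1 to the log_sigma gradient.
    grad_mu = np.sum(w * dlogp)
    grad_log_sigma = np.sum(w * dlogp * sigma * eps) + 1.0
    return grad_mu, grad_log_sigma

# Gradient ascent on the ELBO; every run produces identical (noise-free) iterates.
mu, log_sigma = 0.0, 0.0
for _ in range(200):
    g_mu, g_ls = quantized_elbo_grad(mu, log_sigma)
    mu += 0.1 * g_mu
    log_sigma += 0.1 * g_ls
# Converges toward the exact posterior N(2, 1): mu -> 2, sigma -> 1.
```

Because the quantization points are fixed, successive gradient evaluations carry no sampling noise, which is the source of the fast convergence the abstract claims; the finite-point approximation is where the asymptotically decaying bias enters.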



