Importance Sampled Stochastic Optimization for Variational Inference

04/19/2017
by Joseph Sakaya, et al.

Variational inference approximates the posterior distribution of a probabilistic model with a parameterized density by maximizing a lower bound on the model evidence. Modern solutions fit a flexible approximation with stochastic gradient descent, using Monte Carlo approximation for the gradients. This enables variational inference for arbitrary differentiable probabilistic models, and consequently makes variational inference feasible for probabilistic programming languages. In this work we develop more efficient inference algorithms for this task by considering importance sampling estimates of the gradients. We show how the gradient with respect to the approximation parameters can often be evaluated efficiently without re-computing gradients of the model itself, and then derive practical algorithms that use importance sampled estimates to speed up computation. We present an importance sampled stochastic gradient descent that outperforms standard stochastic gradient descent by a clear margin for a range of models, and provide a justifiable variant of stochastic average gradients for variational inference.
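The core idea, reusing expensive model gradients from samples drawn under an earlier approximation by importance weighting them under the current one, can be sketched on a toy model. Everything below (the Gaussian target, the step sizes, and the `elbo_grads` helper) is an illustrative assumption of ours, not the paper's implementation: we use a reparameterized Gaussian approximation and take one cheap importance-weighted step after each ordinary step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: log p(z) = log N(z; 2, 1) up to a constant, so its gradient is 2 - z.
def grad_log_p(z):
    return 2.0 - z

# Log-density of the Gaussian approximation q(z; mu, sigma), up to a constant.
def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma

def elbo_grads(mu, log_sigma, z, g, mu0=None, log_sigma0=None):
    """Reparameterization-gradient estimate of the ELBO gradients.

    z, g are samples and model gradients. If (mu0, log_sigma0) are given,
    the samples came from that older approximation and are reused via
    self-normalized importance weights instead of fresh model evaluations.
    """
    if mu0 is None:
        w = np.ones_like(z)
    else:
        w = np.exp(log_q(z, mu, log_sigma) - log_q(z, mu0, log_sigma0))
    w = w / w.sum()
    grad_mu = np.sum(w * g)                           # dz/dmu = 1
    grad_log_sigma = np.sum(w * g * (z - mu)) + 1.0   # dz/dlog_sigma = z - mu; +1 from entropy
    return grad_mu, grad_log_sigma

mu, log_sigma = -1.0, 0.0
for step in range(2000):
    # Ordinary step: fresh samples, fresh (expensive) model gradients.
    z = mu + np.exp(log_sigma) * rng.standard_normal(10)
    g = grad_log_p(z)
    gm, gs = elbo_grads(mu, log_sigma, z, g)
    mu0, ls0 = mu, log_sigma
    mu, log_sigma = mu + 0.05 * gm, log_sigma + 0.05 * gs
    # Cheap extra step: reuse z and g, reweighted under the updated parameters.
    gm, gs = elbo_grads(mu, log_sigma, z, g, mu0, ls0)
    mu, log_sigma = mu + 0.05 * gm, log_sigma + 0.05 * gs

print(mu, np.exp(log_sigma))  # should approach the target's mean 2 and std 1
```

The importance weights stay close to one because consecutive approximations differ only by a small gradient step, which is what makes reusing the stale model gradients a reasonable estimate.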

