Vprop: Variational Inference using RMSprop

12/04/2017
by Mohammad Emtiyaz Khan, et al.

Many computationally-efficient methods for Bayesian deep learning rely on continuous optimization algorithms, but the implementation of these methods requires significant changes to existing code-bases. In this paper, we propose Vprop, a method for Gaussian variational inference that can be implemented with two minor changes to the off-the-shelf RMSprop optimizer. Vprop also reduces the memory requirements of Black-Box Variational Inference by half. We derive Vprop using the conjugate-computation variational inference method, and establish its connections to Newton's method, natural-gradient methods, and extended Kalman filters. Overall, this paper presents Vprop as a principled, computationally-efficient, and easy-to-implement method for Bayesian deep learning.
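The two changes the abstract mentions can be sketched as follows. This is a minimal illustration, not the paper's reference implementation: compared with RMSprop, (1) the gradient is evaluated at weights sampled from the Gaussian posterior rather than at the mean, and (2) the mean update divides by the scale vector itself (plus a damping term) instead of its square root, so the scale vector doubles as the posterior precision. The variable names, the damping constant `lam`, the data-size factor `N`, and the step sizes below are all assumptions for the sake of a runnable example.

```python
import numpy as np

def vprop(grad, mu0, num_steps=200, lr=0.1, beta=0.1, lam=1.0, N=100, seed=0):
    """Sketch of a Vprop-style update built on the RMSprop recursion.

    grad : callable returning the stochastic gradient at a weight vector.
    mu0  : initial posterior mean.
    Returns the final posterior mean and standard deviation.
    """
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu0, dtype=float)
    s = np.ones_like(mu)  # RMSprop scale vector, reused here as a precision
    for _ in range(num_steps):
        # Change 1 (assumed form): sample weights from the current Gaussian,
        # with variance read off from the scale vector.
        sigma = 1.0 / np.sqrt(N * (s + lam))
        w = mu + sigma * rng.standard_normal(mu.shape)
        g = grad(w)
        # Same exponential moving average of squared gradients as RMSprop.
        s = (1.0 - beta) * s + beta * g**2
        # Change 2: no square root in the preconditioner.
        mu = mu - lr * g / (s + lam)
    return mu, sigma

# Example: mean of the quadratic loss L(w) = 0.5 * (w - 3)^2,
# whose gradient is w - 3; the mean should approach 3.
mu, sigma = vprop(lambda w: w - 3.0, np.array([0.0]))
```

Because the same scale vector serves both as the RMSprop preconditioner and as the Gaussian variance, only the mean and one extra vector are stored, which is the halving of Black-Box Variational Inference's memory cost that the abstract refers to.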


Related research

- Black Box Variational Inference (12/31/2013)
- Variational Adaptive-Newton Method for Explorative Learning (11/15/2017)
- Exact Manifold Gaussian Variational Bayes (10/26/2022)
- Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam (06/13/2018)
- Bayesian sparsification for deep neural networks with Bayesian model reduction (09/21/2023)
- Truncation-free Hybrid Inference for DPMM (01/13/2017)
- Black-box Coreset Variational Inference (11/04/2022)
