Good Initializations of Variational Bayes for Deep Models

10/18/2018
by Simone Rossi, et al.

Stochastic variational inference is an established way to carry out approximate Bayesian inference for deep models. While there have been effective proposals for good initializations for loss minimization in deep learning, far less attention has been devoted to the initialization of stochastic variational inference. We address this by proposing a novel layer-wise initialization strategy based on Bayesian linear models. The proposed method is extensively validated on regression and classification tasks, including Bayesian DeepNets and ConvNets, showing faster convergence than alternatives inspired by the literature on initializations for loss minimization.
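The abstract only describes the idea at a high level. As an illustration, here is a minimal sketch of what a layer-wise initialization via Bayesian linear models could look like for a mean-field Gaussian variational posterior over the weights: each layer's variational mean and variance are seeded from the conjugate posterior of a Bayesian linear regression fit on that layer's input activations. All names and details here (`bayesian_linear_posterior`, `init_variational_params`, the tanh nonlinearity, using the labels y as a proxy target for hidden layers, the symmetry-breaking noise) are assumptions for illustration, not the paper's exact recipe.

```python
import numpy as np

def bayesian_linear_posterior(X, y, alpha=1.0, beta=1.0):
    """Conjugate Gaussian posterior for the Bayesian linear model
    y ~ N(X w, beta^{-1} I) with prior w ~ N(0, alpha^{-1} I).
    Returns the posterior mean m and covariance S."""
    d = X.shape[1]
    S = np.linalg.inv(alpha * np.eye(d) + beta * X.T @ X)
    m = beta * S @ X.T @ y
    return m, S

def init_variational_params(widths, X, y, rng=None):
    """Layer-wise initialization sketch: for each layer, regress the current
    input activations h onto the targets y with a Bayesian linear model, then
    use the posterior mean and the diagonal of the posterior covariance to
    initialize that layer's variational Gaussian q(W) = N(mu, exp(log_var))."""
    rng = rng if rng is not None else np.random.default_rng(0)
    params, h = [], X
    for fan_out in widths:
        m, S = bayesian_linear_posterior(h, y)
        # Tile the single regression solution across output units, adding a
        # small perturbation to break symmetry between units (a heuristic
        # added here, not taken from the paper).
        mu = np.tile(m[:, None], (1, fan_out)) \
            + 0.01 * rng.standard_normal((h.shape[1], fan_out))
        log_var = np.tile(np.log(np.diag(S))[:, None], (1, fan_out))
        params.append((mu, log_var))
        # Propagate the posterior-mean activations to the next layer.
        h = np.tanh(h @ mu)
    return params

# Usage on toy data: a 5-dimensional input, two hidden layers, scalar output.
X, y = np.random.randn(100, 5), np.random.randn(100)
params = init_variational_params([16, 16, 1], X, y)
```

One design point this sketch glosses over: the regression targets are well defined only for the output layer, so hidden layers need some proxy target (here, simply y itself); the paper's actual layer-wise construction may handle this differently.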


