Walsh-Hadamard Variational Inference for Bayesian Deep Learning

05/27/2019
by Simone Rossi, et al.

Over-parameterized models, such as DeepNets and ConvNets, are routinely adopted in a wide variety of applications, and for these models Bayesian inference is desirable but extremely challenging. Variational inference offers tools to tackle this problem in a scalable way and with some flexibility in the choice of approximation; for over-parameterized models, however, it suffers from the over-regularization property of the variational objective. Inspired by the literature on kernel methods, and in particular on structured approximations of distributions of random matrices, this paper proposes Walsh-Hadamard Variational Inference (WHVI), which uses Walsh-Hadamard-based factorization strategies to reduce the parameterization and accelerate computations, thus avoiding over-regularization issues with the variational objective. Extensive theoretical and empirical analyses demonstrate that WHVI yields considerable speedups and model reductions compared with other techniques for approximate inference in over-parameterized models, and ultimately show how advances in kernel methods can be translated into advances in approximate Bayesian inference.
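To give a flavor of the structured parameterization, WHVI represents a weight matrix through a factorization of the form W = S1 H diag(g) H S2, where S1 and S2 are fixed diagonal matrices, H is a Walsh-Hadamard matrix, and g carries the (Gaussian) variational parameters. Because H admits a fast transform, the matrix-vector product costs O(d log d) time and O(d) parameters instead of O(d^2). The sketch below (the function names `fwht` and `whvi_matvec` are illustrative, not from the paper) shows this product without ever materializing the d x d matrix:

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform (unnormalized, d a power of two).

    Implements the standard butterfly recursion in O(d log d) time,
    equivalent to multiplying by the Sylvester-ordered Hadamard matrix.
    """
    x = x.astype(float).copy()
    d = len(x)
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def whvi_matvec(s1, g, s2, x):
    """Compute (S1 H diag(g) H S2) x using two fast transforms.

    s1, g, s2 are length-d vectors holding the diagonals; the dense
    d x d weight matrix is never formed.
    """
    return s1 * fwht(g * fwht(s2 * x))
```

In a variational setting, g would be sampled from the approximate posterior at each forward pass; the key point of the factorization is that the per-layer cost and parameter count grow near-linearly in d rather than quadratically.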


Related research:

- Efficient Approximate Inference with Walsh-Hadamard Variational Inference (11/29/2019)
- Good Initializations of Variational Bayes for Deep Models (10/18/2018)
- Function Space Particle Optimization for Bayesian Neural Networks (02/26/2019)
- On the use of bootstrap with variational inference: Theory, interpretation, and a two-sample test example (11/29/2017)
- A Generalization Bound for Online Variational Inference (04/08/2019)
- PAVI: Plate-Amortized Variational Inference (06/10/2022)
- Langevin Diffusion Variational Inference (08/16/2022)
