Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise

03/15/2021
by Jannik Schmitt et al.

To adopt neural networks in safety-critical domains, knowing whether we can trust their predictions is crucial. Bayesian neural networks (BNNs) provide uncertainty estimates by averaging predictions with respect to the posterior weight distribution. Variational inference methods for BNNs approximate the intractable weight posterior with a tractable distribution, yet they mostly rely on sampling from the variational distribution during training and inference. Recent sampling-free approaches offer an alternative but incur a significant parameter overhead. We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference that relies on the distribution induced by multiplicative Gaussian activation noise. This allows us to combine parameter efficiency with the benefits of sampling-free variational inference. Our approach yields competitive results on standard regression problems and scales well to large-scale image classification tasks, including ImageNet.
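To illustrate the core idea behind sampling-free inference with multiplicative activation noise, the sketch below (a toy NumPy example, not the authors' implementation) propagates the first two moments of a linear layer's pre-activations in closed form when each input unit is scaled by i.i.d. Gaussian noise eps ~ N(1, alpha); the layer sizes, `alpha`, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear layer: z = W (x * eps), with multiplicative noise eps ~ N(1, alpha)
# applied per input unit. Sizes and alpha are illustrative assumptions.
d_in, d_out, alpha = 4, 3, 0.1
W = rng.standard_normal((d_out, d_in))
x = rng.standard_normal(d_in)

# Sampling-free moment propagation (closed form, no noise samples drawn):
mean = W @ x                    # E[z] = W x, since E[eps] = 1
var = (W**2) @ (alpha * x**2)   # Var[z_j] = sum_i W_ji^2 * alpha * x_i^2

# Monte Carlo check that the closed-form moments match sampled behaviour:
eps = rng.normal(1.0, np.sqrt(alpha), size=(100_000, d_in))
samples = (x * eps) @ W.T
print(np.abs(samples.mean(0) - mean).max())
print(np.abs(samples.var(0) - var).max())
```

In a full network, such mean/variance pairs would be propagated layer by layer (e.g. with Gaussian moment matching through nonlinearities), which is what makes training and inference sampling-free.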


Related research:

- 11/29/2019, Efficient Approximate Inference with Walsh-Hadamard Variational Inference: "Variational inference offers scalable and flexible tools to tackle intra..."
- 10/13/2017, Bayesian Hypernetworks: "We propose Bayesian hypernetworks: a framework for approximate Bayesian ..."
- 10/24/2020, Implicit Variational Inference: the Parameter and the Predictor Space: "Having access to accurate confidence levels along with the predictions a..."
- 02/07/2020, The k-tied Normal Distribution: A Compact Parameterization of Gaussian Mean Field Posteriors in Bayesian Neural Networks: "Variational Bayesian Inference is a popular methodology for approximatin..."
- 10/30/2019, Thompson Sampling via Local Uncertainty: "Thompson sampling is an efficient algorithm for sequential decision maki..."
- 09/21/2022, Variational Inference for Infinitely Deep Neural Networks: "We introduce the unbounded depth neural network (UDN), an infinitely dee..."
- 11/08/2017, Variational Gaussian Dropout is not Bayesian: "Gaussian multiplicative noise is commonly used as a stochastic regularis..."
