Scalable Bayesian neural networks by layer-wise input augmentation

10/26/2020
by Trung Trinh, et al.

We introduce implicit Bayesian neural networks, a simple and scalable approach to uncertainty representation in deep learning. The standard Bayesian approach to deep learning requires inferring a posterior distribution over millions of parameters, which is impractical. Instead, we propose to induce a distribution that captures the uncertainty over neural networks by augmenting each layer's inputs with latent variables. We present appropriate input distributions and demonstrate state-of-the-art performance in calibration, robustness and uncertainty characterisation on large-scale, multi-million-parameter image classification tasks.
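To make the idea concrete, below is a minimal PyTorch sketch of layer-wise input augmentation. The multiplicative Gaussian latent variables, their parameterisation, and the class name AugmentedLinear are illustrative assumptions, not the paper's method; the paper presents its own choice of input distributions and training objective.

```python
import torch
import torch.nn as nn

class AugmentedLinear(nn.Module):
    """Linear layer whose inputs are augmented with latent variables.

    Sketch only: each forward pass rescales the input by a sample z
    drawn from a learned latent distribution, so repeated passes
    induce a distribution over the network's outputs. The Gaussian
    form of z is an assumption made here for illustration.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Variational parameters of the latent input-scaling variables z.
        self.z_mean = nn.Parameter(torch.ones(in_features))
        self.z_logvar = nn.Parameter(torch.full((in_features,), -3.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reparameterised sample: z = mean + std * eps, eps ~ N(0, I).
        eps = torch.randn_like(self.z_mean)
        z = self.z_mean + torch.exp(0.5 * self.z_logvar) * eps
        # Augment the layer's input, then apply deterministic weights.
        return self.linear(x * z)
```

At test time, predictions from several samples of z can be averaged to estimate predictive uncertainty. Because only the low-dimensional latent variables are stochastic while the weights stay deterministic, this construction sidesteps inference over millions of weight parameters, which is what makes the approach scalable.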

Related research

06/06/2022
Tackling covariate shift with node-based Bayesian neural networks
Bayesian neural networks (BNNs) promise improved generalization under co...

12/04/2020
Encoding the latent posterior of Bayesian Neural Networks for uncertainty quantification
Bayesian neural networks (BNNs) have been long considered an ideal, yet ...

12/12/2021
Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer
Bayesian neural networks (BNNs) have become a principal approach to alle...

04/02/2019
Correlated Parameters to Accurately Measure Uncertainty in Deep Neural Networks
In this article a novel approach for training deep neural networks using...

05/30/2019
Modeling Uncertainty by Learning a Hierarchy of Deep Neural Connections
Quantifying and measuring uncertainty in deep neural networks, despite r...

12/23/2021
Improving Robustness and Uncertainty Modelling in Neural Ordinary Differential Equations
Neural ordinary differential equations (NODE) have been proposed as a co...

12/06/2019
Sampling-Free Learning of Bayesian Quantized Neural Networks
Bayesian learning of model parameters in neural networks is important in...
