Variational Neural Networks

07/04/2022
by Illia Oleksiienko, et al.

Bayesian Neural Networks (BNNs) provide a tool to estimate the uncertainty of a neural network by considering a distribution over weights and sampling a different model for each input. In this paper, we propose a method for uncertainty estimation in neural networks called Variational Neural Networks (VNNs) that, instead of considering a distribution over weights, generates the parameters of a layer's output distribution by transforming the layer's inputs with learnable sub-layers. In uncertainty quality estimation experiments, we show that VNNs achieve better uncertainty quality than the Monte Carlo Dropout or Bayes By Backpropagation methods.
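The idea described in the abstract can be sketched as follows: rather than sampling weights, a VNN layer uses two learnable sub-layers that map the input to the mean and (log-)variance of a Gaussian over the layer's output, and then samples from that distribution. This is a minimal NumPy sketch under assumed design choices (linear sub-layers, a diagonal Gaussian, a log-variance parameterisation); the class and parameter names are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class VariationalLayer:
    """Sketch of a VNN-style layer: two sub-layers produce the mean and
    log-variance of a Gaussian over the output, which is then sampled."""

    def __init__(self, in_dim, out_dim):
        # One linear sub-layer per distribution parameter (assumed design).
        self.W_mu = rng.normal(0.0, 0.1, (out_dim, in_dim))
        self.b_mu = np.zeros(out_dim)
        self.W_logvar = rng.normal(0.0, 0.1, (out_dim, in_dim))
        self.b_logvar = np.full(out_dim, -3.0)  # start with small variance

    def __call__(self, x):
        mu = self.W_mu @ x + self.b_mu
        logvar = self.W_logvar @ x + self.b_logvar
        # Reparameterised sample from N(mu, diag(exp(logvar)))
        eps = rng.standard_normal(mu.shape)
        return mu + np.exp(0.5 * logvar) * eps

layer = VariationalLayer(4, 3)
x = rng.standard_normal(4)
# Repeated forward passes for the same input give different samples;
# their spread is a per-output uncertainty estimate.
samples = np.stack([layer(x) for _ in range(100)])
uncertainty = samples.std(axis=0)
print(samples.shape, uncertainty.shape)
```

Averaging the samples gives a point prediction, while their standard deviation quantifies the model's uncertainty for that input.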


