BaCOUn: Bayesian Classifiers with Out-of-Distribution Uncertainty

07/12/2020
by Théo Guénais et al.

Traditional training of deep classifiers yields overconfident models that are not reliable under dataset shift. We propose a Bayesian framework to obtain reliable uncertainty estimates for deep classifiers. Our approach consists of a plug-in "generator" used to augment the data with an additional class of points that lie on the boundary of the training data, followed by Bayesian inference on top of features that are trained to distinguish these "out-of-distribution" points.
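
The abstract does not spell out the boundary generator or the exact inference procedure, so the following is only a hypothetical sketch of the overall recipe it describes: augment the training set with an extra "boundary" class, train a classifier over K+1 classes, and place an approximate Bayesian treatment on top of the learned features. Here a simple Gaussian-noise perturbation stands in for the paper's plug-in generator, Monte Carlo dropout over the classification head stands in for its Bayesian inference, and the toy two-blob dataset, network sizes, and hyperparameters are all illustrative assumptions, not the authors' setup.

```python
# Minimal sketch, assuming a toy 2-D dataset and stand-in components; see the
# full paper for BaCOUn's actual boundary generator and Bayesian inference.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy two-class training data: two Gaussian blobs (assumption for illustration).
n, d, n_classes = 200, 2, 2
x_in = torch.cat([torch.randn(n, d) - 2.0, torch.randn(n, d) + 2.0])
y_in = torch.cat([torch.zeros(n, dtype=torch.long), torch.ones(n, dtype=torch.long)])

# Placeholder "generator": noisy copies of the training points labeled as an
# extra (K+1)-th class, standing in for points on the data boundary.
x_out = x_in + 1.5 * torch.randn_like(x_in)
y_out = torch.full((x_out.shape[0],), n_classes, dtype=torch.long)

x = torch.cat([x_in, x_out])
y = torch.cat([y_in, y_out])

# Feature extractor plus a (K+1)-way head; dropout gives a crude Bayesian proxy.
model = nn.Sequential(
    nn.Linear(d, 64), nn.ReLU(),
    nn.Dropout(p=0.2),
    nn.Linear(64, n_classes + 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# Predictive uncertainty via Monte Carlo dropout (a stand-in for the paper's
# Bayesian inference on top of the learned features).
model.train()  # keep dropout active at test time
x_test = torch.tensor([[0.0, 0.0], [-2.0, -2.0]])  # near the boundary vs. in-distribution
with torch.no_grad():
    probs = torch.stack([model(x_test).softmax(-1) for _ in range(50)]).mean(0)
print(probs)  # mass on the extra class signals "out-of-distribution"
```

In this sketch, the probability mass assigned to the extra class serves as the out-of-distribution signal; the components named above are substitutes chosen only to make the idea concrete.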
