Dropout Distillation for Efficiently Estimating Model Confidence

09/27/2018
by Corina Gurau, et al.

We propose an efficient way to obtain better calibrated uncertainty scores from neural networks. The Distilled Dropout Network (DDN) makes standard (non-Bayesian) neural networks more introspective by adding a new training loss that prevents them from becoming overconfident. Our method is more efficient than Bayesian neural networks or model ensembles, which, despite providing more reliable uncertainty scores, are more cumbersome to train and slower at test time. We evaluate DDN on image classification on the CIFAR-10 dataset and show that our calibration results are competitive even with 100 Monte Carlo samples from a dropout network, while also improving classification accuracy. We further integrate our calibration approach into the state-of-the-art Faster R-CNN object detection framework and show, on the COCO dataset, that DDN helps train better calibrated object detectors.
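The core idea described above can be sketched in a few lines: generate soft targets by averaging the softmax outputs of many Monte Carlo dropout forward passes, then train a single deterministic network against those soft targets with a distillation loss. The snippet below is a minimal numpy sketch under illustrative assumptions (a single linear layer as the "network", inverted dropout on the input features, cross-entropy as the distillation loss); the function names and exact loss formulation are ours, not necessarily the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_targets(x, W, b, p=0.5, T=100):
    """Soft targets: average softmax over T stochastic dropout passes.

    x: (n, d) features, W: (d, k) weights, b: (k,) bias (toy linear model).
    """
    probs = np.zeros((x.shape[0], W.shape[1]))
    for _ in range(T):
        mask = rng.random(x.shape) > p           # Bernoulli dropout mask
        logits = (x * mask / (1.0 - p)) @ W + b  # inverted-dropout scaling
        probs += softmax(logits)
    return probs / T

def distillation_loss(student_logits, teacher_probs):
    """Cross-entropy of the deterministic student against the soft MC targets."""
    log_q = np.log(softmax(student_logits) + 1e-12)
    return -np.mean(np.sum(teacher_probs * log_q, axis=-1))
```

Because the soft targets spread probability mass across classes wherever the dropout passes disagree, minimizing this loss discourages the student from emitting near-one-hot (overconfident) predictions, while still requiring only a single forward pass at test time.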


