Self-Distribution Distillation: Efficient Uncertainty Estimation

03/15/2022
by Yassir Fathullah, et al.

Deep learning is increasingly being applied in safety-critical domains. In these scenarios it is important to know the level of uncertainty in a model's prediction to ensure that the system makes appropriate decisions. Deep ensembles are the de facto standard approach to obtaining various measures of uncertainty. However, ensembles often significantly increase the resources required in the training and/or deployment phases, and existing approaches typically address the cost of only one of these phases. In this work we propose a novel training approach, self-distribution distillation (S2D), which efficiently trains a single model that can estimate uncertainties. Furthermore, it is possible to build ensembles of these models and apply hierarchical ensemble distillation approaches. Experiments on CIFAR-100 showed that S2D models outperform standard models and Monte-Carlo dropout. Additional out-of-distribution detection experiments on LSUN, Tiny ImageNet and SVHN showed that even a standard deep ensemble can be outperformed by S2D-based ensembles and novel distilled models.
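No implementation accompanies this abstract, but the core mechanism can be sketched. In S2D, a single network carries two output branches: a stochastic "teacher" branch that produces several categorical predictions within one training step, and a "student" branch that predicts a Dirichlet distribution distilled from the spread of those predictions, so that a single deterministic pass at test time yields both a prediction and its uncertainty. Below is a minimal PyTorch-style sketch under those assumptions; the names (S2DHead, s2d_loss, knowledge_uncertainty), the use of dropout as the source of teacher stochasticity, and all hyperparameters are illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class S2DHead(nn.Module):
    """Dual-headed output layer, sketching the S2D idea: a stochastic
    'teacher' branch and a 'student' branch that parameterises a
    Dirichlet over class probabilities. Names and hyperparameters
    are illustrative assumptions."""

    def __init__(self, feat_dim, num_classes, n_teacher_samples=5, p_drop=0.3):
        super().__init__()
        self.teacher = nn.Linear(feat_dim, num_classes)
        self.student = nn.Linear(feat_dim, num_classes)
        self.dropout = nn.Dropout(p=p_drop)
        self.n_teacher_samples = n_teacher_samples

    def forward(self, feats):
        # Student branch: positive Dirichlet concentrations alpha(x).
        alphas = F.softplus(self.student(feats)) + 1e-3
        # Teacher branch: several dropout-perturbed categorical
        # predictions, all from the same backbone features in one pass.
        teacher_probs = torch.stack([
            F.softmax(self.teacher(self.dropout(feats)), dim=-1)
            for _ in range(self.n_teacher_samples)
        ])  # shape: (samples, batch, classes)
        return alphas, teacher_probs


def s2d_loss(alphas, teacher_probs, targets):
    """Cross-entropy trains the teacher branch as usual; the Dirichlet
    negative log-likelihood of the teacher samples distils the spread
    of those predictions into the student in the same training step."""
    ce = F.nll_loss(torch.log(teacher_probs.mean(dim=0) + 1e-12), targets)
    probs = teacher_probs.clamp_min(1e-6)
    probs = probs / probs.sum(dim=-1, keepdim=True)  # stay on the simplex
    nll = -torch.distributions.Dirichlet(alphas).log_prob(probs).mean()
    return ce + nll


def knowledge_uncertainty(alphas):
    """Mutual information between label and categorical parameters under
    the predicted Dirichlet: H[E[pi]] - E[H[pi]]. High values flag
    inputs unlike the training data (knowledge uncertainty)."""
    alpha0 = alphas.sum(dim=-1, keepdim=True)
    mean = alphas / alpha0
    total = -(mean * mean.log()).sum(dim=-1)
    expected = -(mean * (torch.digamma(alphas + 1)
                         - torch.digamma(alpha0 + 1))).sum(dim=-1)
    return total - expected
```

At deployment only the student branch is needed: the predicted concentrations give the class prediction via their normalised mean, while quantities such as the mutual-information estimate above separate knowledge uncertainty (useful for out-of-distribution detection) from data uncertainty. This single-pass property is what would let one S2D model stand in for an ensemble at test time.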

Related research

02/26/2020 - A general framework for ensemble distribution distillation
Ensembles of neural networks have been shown to give better performance ...

05/17/2023 - Logit-Based Ensemble Distribution Distillation for Robust Autoregressive Sequence Uncertainties
Efficiently and reliably estimating uncertainty is an important objectiv...

06/08/2022 - Ensembles for Uncertainty Estimation: Benefits of Prior Functions and Bootstrapping
In machine learning, an agent needs to estimate uncertainty to efficient...

10/06/2021 - Deep Classifiers with Label Noise Modeling and Distance Awareness
Uncertainty estimation in deep learning has recently emerged as a crucia...

06/05/2022 - Functional Ensemble Distillation
Bayesian models have many desirable properties, most notable is their ab...

03/17/2023 - DUDES: Deep Uncertainty Distillation using Ensembles for Semantic Segmentation
Deep neural networks lack interpretability and tend to be overconfident,...

05/19/2022 - Simple Regularisation for Uncertainty-Aware Knowledge Distillation
Considering uncertainty estimation of modern neural networks (NNs) is on...
