Density-Softmax: Scalable and Distance-Aware Uncertainty Estimation under Distribution Shifts

02/13/2023
by   Ha Manh Bui, et al.

Prevalent deep learning models suffer from significant over-confidence under distribution shifts. In this paper, we propose Density-Softmax, a single deterministic approach to uncertainty estimation that combines a density function with the softmax layer. By using the likelihood of the latent representation, our approach produces more uncertain predictions when test samples are distant from the training samples. Theoretically, we prove that Density-Softmax is distance-aware, meaning its associated uncertainty metrics are monotonic functions of distance metrics; this has been shown to be a necessary condition for a neural network to produce high-quality uncertainty estimates. Empirically, our method enjoys computational efficiency similar to standard softmax on the shifted CIFAR-10, CIFAR-100, and ImageNet datasets across modern deep learning architectures. Notably, Density-Softmax uses 4 times fewer parameters than Deep Ensembles and has 6 times lower latency than Rank-1 Bayesian Neural Networks, while obtaining competitive predictive performance and lower calibration errors under distribution shifts.
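The core idea in the abstract — scaling the softmax by the latent representation's likelihood so that low-density (far-from-training) inputs yield flatter, more uncertain predictions — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the diagonal-Gaussian density model and all function names here are illustrative assumptions (the actual method may use a different density estimator, such as a normalizing flow).

```python
import numpy as np

def fit_gaussian_density(train_latents):
    """Fit a diagonal Gaussian to training latent vectors (illustrative density model)."""
    mean = train_latents.mean(axis=0)
    var = train_latents.var(axis=0) + 1e-6  # small floor for numerical stability
    return mean, var

def log_likelihood(z, mean, var):
    """Diagonal-Gaussian log-density of a latent vector z."""
    return -0.5 * np.sum((z - mean) ** 2 / var + np.log(2 * np.pi * var))

def density_softmax(logits, z, mean, var, max_loglik):
    """Softmax with logits scaled by a normalized likelihood in (0, 1].

    Low likelihood (z far from training latents) shrinks the logits toward
    zero, flattening the softmax and so raising predictive uncertainty.
    """
    scale = np.exp(log_likelihood(z, mean, var) - max_loglik)
    scale = min(scale, 1.0)
    scaled = logits * scale
    e = np.exp(scaled - scaled.max())  # stable softmax
    return e / e.sum()
```

For an in-distribution latent the scale is near 1 and the prediction is the ordinary softmax; for a distant latent the scale approaches 0 and the output approaches the uniform distribution, which is the monotone distance-to-uncertainty behavior the abstract describes.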


research
06/21/2023

Density Uncertainty Layers for Reliable Uncertainty Estimation

Assessing the predictive uncertainty of deep neural networks is crucial ...
research
07/10/2020

Revisiting One-vs-All Classifiers for Predictive Uncertainty and Out-of-Distribution Detection in Neural Networks

Accurate estimation of predictive uncertainty in modern neural networks ...
research
07/01/2022

Robustness of Epinets against Distributional Shifts

Recent work introduced the epinet as a new approach to uncertainty model...
research
02/11/2020

Fine-grained Uncertainty Modeling in Neural Networks

Existing uncertainty modeling approaches try to detect an out-of-distrib...
research
02/23/2021

Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty

We show that a single softmax neural net with minimal changes can beat t...
research
06/09/2020

A t-distribution based operator for enhancing out of distribution robustness of neural network classifiers

Neural Network (NN) classifiers can assign extreme probabilities to samp...
research
12/09/2020

Know Your Limits: Monotonicity & Softmax Make Neural Classifiers Overconfident on OOD Data

A crucial requirement for reliable deployment of deep learning models fo...
