Distinction Maximization Loss: Fast, Scalable, Turnkey, and Native Neural Networks Out-of-Distribution Detection simply by Replacing the SoftMax Loss

08/15/2019
by David Macêdo, et al.

Recently, many methods to reduce neural network uncertainty have been proposed. However, most of these techniques suffer from severe drawbacks. In this paper, we argue that the low out-of-distribution detection performance of neural networks is mainly due to the anisotropy of the SoftMax loss. We therefore built an isotropic loss that reduces neural network uncertainty in a fast, scalable, turnkey, and native approach. Our experiments showed that our proposal typically outperforms ODIN by a large margin and usually produces results competitive with the state-of-the-art Mahalanobis method while avoiding its limitations.
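For intuition, the sketch below shows one way an isotropic, distance-based loss can replace the usual SoftMax (affine) output layer: class logits become negative Euclidean distances to learnable class prototypes, so the score decays uniformly in every direction of the feature space. This is a minimal PyTorch illustration of the general idea, not the authors' released implementation; the class name IsotropicLoss and all initialization choices are hypothetical.

    # Minimal sketch (assumed names, not the authors' code): an isotropic,
    # distance-based replacement for the SoftMax output layer.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class IsotropicLoss(nn.Module):  # hypothetical name
        def __init__(self, num_features: int, num_classes: int):
            super().__init__()
            # One learnable prototype per class in feature space.
            self.prototypes = nn.Parameter(torch.randn(num_classes, num_features))

        def forward(self, features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
            # Euclidean distance from each feature vector to every prototype.
            distances = torch.cdist(features, self.prototypes)  # (batch, classes)
            # Isotropic logits: the closer the prototype, the higher the score.
            logits = -distances
            # Standard cross-entropy over the distance-based logits.
            return F.cross_entropy(logits, targets)

At inference, the maximum class probability (equivalently, the minimum prototype distance) can then serve directly as an out-of-distribution score, which is what would make such an approach turnkey and native compared with detectors like ODIN or Mahalanobis that require extra tuning or input preprocessing.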


Related research:

10/03/2018 · Inhibited Softmax for Uncertainty Estimation in Neural Networks
We present a new method for uncertainty estimation and out-of-distributi...

05/30/2021 · Improving Entropic Out-of-Distribution Detection using Isometric Distances and the Minimum Distance Score
Current out-of-distribution detection approaches usually present special...

06/09/2021 · Understanding Softmax Confidence and Uncertainty
It is often remarked that neural networks fail to increase their uncerta...

09/17/2019 · Relaxed Softmax for learning from Positive and Unlabeled data
In recent years, the softmax model and its fast approximations have beco...

10/31/2022 · Probability-Dependent Gradient Decay in Large Margin Softmax
In the past few years, Softmax has become a common component in neural n...
