Being Bayesian about Categorical Probability

02/19/2020
by Taejong Joo, et al.

Neural networks use the softmax as a building block in classification tasks, but it suffers from overconfidence and lacks a principled way to represent uncertainty. As a Bayesian alternative to the softmax, we treat the categorical probability over class labels as a random variable. In this framework, the prior distribution explicitly models the presumed noise inherent in the observed labels, which yields consistent gains in generalization performance across multiple challenging tasks. The proposed method inherits the advantages of Bayesian approaches, achieving better uncertainty estimation and model calibration. It can be implemented as a plug-and-play loss function with negligible computational overhead compared to the softmax with cross-entropy loss.
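
The abstract describes the idea only at a high level. As a rough illustration, the sketch below shows how a Dirichlet distribution over the categorical class probability could be turned into such a plug-and-play loss. This is not the authors' reference implementation: the softplus mapping from logits to concentration parameters, the symmetric Dirichlet prior, and the kl_weight knob are assumptions made for this example.

```python
# Minimal sketch (assumptions noted above) of a Dirichlet-over-categorical loss
# used as a drop-in replacement for softmax cross-entropy.
import torch
import torch.nn.functional as F
from torch.distributions import Dirichlet, kl_divergence


def dirichlet_categorical_loss(logits, targets, prior_concentration=1.0, kl_weight=1.0):
    """Negative ELBO for a Dirichlet posterior over class probabilities.

    logits:  (batch, num_classes) raw network outputs
    targets: (batch,) integer class labels
    """
    # Map logits to positive Dirichlet concentration parameters (one possible choice).
    alphas = F.softplus(logits) + 1e-6
    alpha0 = alphas.sum(dim=-1, keepdim=True)

    # Expected log-likelihood of the observed label under the Dirichlet:
    # E_q[log pi_y] = digamma(alpha_y) - digamma(alpha_0).
    alpha_y = alphas.gather(-1, targets.unsqueeze(-1))
    expected_log_lik = (torch.digamma(alpha_y) - torch.digamma(alpha0)).squeeze(-1)

    # KL between the per-example Dirichlet and a symmetric Dirichlet prior,
    # which plays the role of the presumed noise in the observed labels.
    prior = Dirichlet(torch.full_like(alphas, prior_concentration))
    kl = kl_divergence(Dirichlet(alphas), prior)

    # Negative ELBO, averaged over the batch.
    return (kl_weight * kl - expected_log_lik).mean()


# Usage, analogous to F.cross_entropy(logits, targets):
# loss = dirichlet_categorical_loss(model(x), y)
```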


Related research

What's in a Loss Function for Image Classification? (10/30/2020)
It is common to use the softmax cross-entropy loss to train neural netwo...

A two-head loss function for deep Average-K classification (03/31/2023)
Average-K classification is an alternative to top-K classification in wh...

Revisiting One-vs-All Classifiers for Predictive Uncertainty and Out-of-Distribution Detection in Neural Networks (07/10/2020)
Accurate estimation of predictive uncertainty in modern neural networks ...

GSC Loss: A Gaussian Score Calibrating Loss for Deep Learning (03/02/2022)
Cross entropy (CE) loss integrated with softmax is an orthodox component...

Chi-square Loss for Softmax: an Echo of Neural Network Structure (08/31/2021)
Softmax working with cross-entropy is widely used in classification, whi...

Towards Unbiased Exploration in Partial Label Learning (07/02/2023)
We consider learning a probabilistic classifier from partially-labelled ...

Energy-Based Open-World Uncertainty Modeling for Confidence Calibration (07/27/2021)
Confidence calibration is of great importance to the reliability of deci...
