Revisiting Softmax for Uncertainty Approximation in Text Classification

10/25/2022
by Andreas Nugaard Holm, et al.

Uncertainty approximation in text classification is an important area with applications in domain adaptation and interpretability. The most widely used uncertainty approximation method is Monte Carlo Dropout, which is computationally expensive as it requires multiple forward passes through the model. A cheaper alternative is to simply use the softmax to estimate model uncertainty. However, prior work has indicated that the softmax can generate overconfident uncertainty estimates and can thus be tricked into producing incorrect predictions. In this paper, we perform a thorough empirical analysis of both methods on five datasets with two base neural architectures in order to provide insight into the trade-offs between the two. We compare the methods' uncertainty approximations and downstream text classification performance, weighing performance against computational complexity as a cost-benefit analysis by measuring runtime (cost) and downstream performance (benefit). We find that, while Monte Carlo Dropout produces the best uncertainty approximations, using a simple softmax leads to competitive uncertainty estimation for text classification at a much lower computational cost, suggesting that the softmax can in fact be a sufficient uncertainty estimate when computational resources are a concern.
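The two uncertainty estimates compared in the abstract can be sketched in a few lines: a single deterministic forward pass whose softmax output serves as a confidence distribution, versus Monte Carlo Dropout, which keeps dropout active at test time and averages the predictions of several stochastic forward passes. The toy two-class model and its weights below are purely illustrative assumptions, not the paper's architecture:

```python
import math
import random

def softmax(logits):
    # Numerically stable softmax: subtract the max logit before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Predictive entropy: a common scalar uncertainty score for a distribution.
    return -sum(p * math.log(p) for p in probs if p > 0)

def forward(x, w_hidden, w_out, dropout_p=0.0, rng=None):
    # Toy one-hidden-layer classifier. With dropout_p > 0 and an rng supplied,
    # hidden units are randomly zeroed and rescaled, as in MC Dropout.
    hidden = []
    for w_row in w_hidden:
        h = max(0.0, sum(wi * xi for wi, xi in zip(w_row, x)))  # ReLU
        if rng is not None and dropout_p > 0.0:
            h = 0.0 if rng.random() < dropout_p else h / (1.0 - dropout_p)
        hidden.append(h)
    logits = [sum(wi * hi for wi, hi in zip(w_row, hidden)) for w_row in w_out]
    return softmax(logits)

def mc_dropout_predict(x, w_hidden, w_out, n_passes=50, dropout_p=0.5, seed=0):
    # Monte Carlo Dropout: average predicted distributions over n_passes
    # stochastic forward passes (n_passes times the cost of one softmax pass).
    rng = random.Random(seed)
    sums = None
    for _ in range(n_passes):
        probs = forward(x, w_hidden, w_out, dropout_p, rng)
        sums = probs if sums is None else [s + p for s, p in zip(sums, probs)]
    return [s / n_passes for s in sums]

# Hypothetical weights: 3 features, 4 hidden units, 2 classes.
w_hidden = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5], [-0.4, 0.2, 0.9], [0.1, 0.1, 0.1]]
w_out = [[1.0, -0.5, 0.3, 0.2], [-0.8, 0.6, 0.4, -0.1]]
x = [1.0, 2.0, -1.0]

# Cheap estimate: one deterministic pass, softmax confidence.
p_softmax = forward(x, w_hidden, w_out)
# Expensive estimate: many stochastic passes averaged.
p_mc = mc_dropout_predict(x, w_hidden, w_out)

print("softmax probs:", p_softmax, "entropy:", entropy(p_softmax))
print("MC Dropout probs:", p_mc, "entropy:", entropy(p_mc))
```

The runtime gap the paper measures falls out directly: the softmax estimate costs one forward pass, while MC Dropout costs `n_passes` of them.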


Related research

10/03/2018
Inhibited Softmax for Uncertainty Estimation in Neural Networks
We present a new method for uncertainty estimation and out-of-distributi...

12/27/2021
Transformer Uncertainty Estimation with Hierarchical Stochastic Attention
Transformers are state-of-the-art in a wide range of NLP tasks and have ...

09/14/2022
Distribution Calibration for Out-of-Domain Detection with Bayesian Approximation
Out-of-Domain (OOD) detection is a key component in a task-oriented dial...

09/11/2022
Learning When to Say "I Don't Know"
We propose a new Reject Option Classification technique to identify and ...

10/15/2021
Identifying Incorrect Classifications with Balanced Uncertainty
Uncertainty estimation is critical for cost-sensitive deep-learning appl...

08/02/2023
Global Hierarchical Neural Networks using Hierarchical Softmax
This paper presents a framework in which hierarchical softmax is used to...

04/15/2021
Text Guide: Improving the quality of long text classification by a text selection method based on feature importance
The performance of text classification methods has improved greatly over...
