Locally Adaptive Label Smoothing for Predictive Churn

02/09/2021
by Dara Bahri, et al.

Training modern neural networks is an inherently noisy process that can lead to high prediction churn – disagreements between re-trainings of the same model due to factors such as randomization in the parameter initialization and mini-batches – even when the trained models all attain similar accuracies. Such prediction churn can be very undesirable in practice. In this paper, we present several baselines for reducing churn and show that training on soft labels obtained by adaptively smoothing each example's label based on the example's neighboring labels often outperforms the baselines on churn while improving accuracy on a variety of benchmark classification tasks and model architectures.
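
The abstract describes the method only at a high level: each training example's one-hot label is smoothed toward the labels of its nearest neighbors, and the model is then trained on the resulting soft targets. The sketch below is one way to realize that idea; the neighbor count k, the mixing weight alpha, and the use of Euclidean k-NN over raw features are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of locally adaptive label smoothing: each example's hard
# label is mixed with the average label distribution of its k nearest neighbors.
# k, alpha, and raw-input Euclidean distance are assumptions for illustration.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def locally_adaptive_soft_labels(X, y, num_classes, k=10, alpha=0.1):
    """Return soft targets: (1 - alpha) * one_hot(y_i) + alpha * mean_{j in kNN(i)} one_hot(y_j)."""
    one_hot = np.eye(num_classes)[y]                     # (n, num_classes)
    knn = NearestNeighbors(n_neighbors=k + 1).fit(X)     # k + 1 because each point is its own nearest neighbor
    _, idx = knn.kneighbors(X)
    neighbor_dist = one_hot[idx[:, 1:]].mean(axis=1)     # drop self, average neighbors' one-hot labels
    return (1.0 - alpha) * one_hot + alpha * neighbor_dist


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))          # toy features
    y = rng.integers(0, 3, size=100)       # toy hard labels for 3 classes
    soft = locally_adaptive_soft_labels(X, y, num_classes=3)
    print(soft[:3])                        # rows sum to 1; training uses cross-entropy against these soft targets
```

In this reading, examples whose neighborhoods disagree with their own label receive softer targets, which is the sense in which the smoothing is locally adaptive rather than applied uniformly as in standard label smoothing.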

Related research

06/15/2022 · ALASCA: Rethinking Label Smoothing for Deep Learning Under Label Noise
As label noise, one of the most popular distribution shifts, severely de...

11/25/2020 · Delving Deep into Label Smoothing
Label smoothing is an effective regularization tool for deep neural netw...

12/09/2020 · Label Confusion Learning to Enhance Text Classification Models
Representing a true label as a one-hot vector is a common practice in tr...

02/05/2021 · On the Reproducibility of Neural Network Predictions
Standard training techniques for neural networks involve multiple source...

01/28/2022 · Calibrating Histopathology Image Classifiers using Label Smoothing
The classification of histopathology images fundamentally differs from t...

05/30/2021 · Diversifying Dialog Generation via Adaptive Label Smoothing
Neural dialogue generation models trained with the one-hot target distri...

02/04/2022 · Learning with Neighbor Consistency for Noisy Labels
Recent advances in deep learning have relied on large, labelled datasets...
