Symmetric Cross Entropy for Robust Learning with Noisy Labels

08/16/2019
by Yisen Wang, et al.

Training accurate deep neural networks (DNNs) in the presence of noisy labels is an important and challenging task. Though a number of approaches have been proposed for learning with noisy labels, many open issues remain. In this paper, we show that DNN learning with Cross Entropy (CE) overfits to noisy labels on some classes ("easy" classes), but, more surprisingly, it also suffers from significant under-learning on other classes ("hard" classes). Intuitively, CE requires an extra term to facilitate the learning of hard classes, and, more importantly, this term should be noise-tolerant so as to avoid overfitting to noisy labels. Inspired by the symmetric KL-divergence, we propose Symmetric Cross Entropy Learning (SL), which boosts CE symmetrically with a noise-robust counterpart, Reverse Cross Entropy (RCE). Our proposed SL approach simultaneously addresses both the under-learning and overfitting problems of CE in the presence of noisy labels. We provide a theoretical analysis of SL and also show empirically, on a range of benchmark and real-world datasets, that SL outperforms state-of-the-art methods. We further show that SL can be easily incorporated into existing methods to enhance their performance.
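The abstract describes SL only at a high level. As a concrete illustration, below is a minimal NumPy sketch of the symmetric loss, assuming the combination L = alpha * CE + beta * RCE, with the undefined log 0 inside RCE clipped to a negative constant A. The weights alpha and beta and the constant A here are illustrative assumptions, not values stated on this page.

# A minimal sketch of the Symmetric Cross Entropy loss described above:
# standard CE combined with its noise-robust reverse, RCE. alpha, beta,
# and A are illustrative assumptions, not values taken from this page.
import numpy as np

def symmetric_cross_entropy(probs, labels, alpha=0.1, beta=1.0, A=-4.0):
    """probs: (N, K) predicted class probabilities; labels: (N,) int class ids."""
    n = probs.shape[0]
    eps = 1e-12  # numerical floor to keep log() finite

    # Cross Entropy: -log q(y|x), drives learning of the labeled class.
    ce = -np.log(probs[np.arange(n), labels] + eps)

    # Reverse Cross Entropy: -sum_k q(k|x) * log p(k|x), where the one-hot
    # target p has its log 0 entries clipped to the constant A. With one-hot
    # targets this reduces to -A * (1 - q(y|x)).
    rce = -A * (1.0 - probs[np.arange(n), labels])

    # Symmetric combination: the CE term targets under-learning on hard
    # classes, while the bounded RCE term tolerates noisy labels.
    return np.mean(alpha * ce + beta * rce)

# Example: three samples over four classes.
probs = np.array([[0.70, 0.10, 0.10, 0.10],
                  [0.25, 0.25, 0.25, 0.25],
                  [0.05, 0.05, 0.85, 0.05]])
labels = np.array([0, 1, 2])
print(symmetric_cross_entropy(probs, labels))

Note that with one-hot targets RCE collapses to -A * (1 - q(y|x)), which is bounded by -A; unlike CE, whose -log q(y|x) grows without bound on a confidently misclassified sample, the RCE term cannot be dominated by a few mislabeled examples.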


Related research:
- Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels (05/20/2018)
- Preserving Fine-Grain Feature Information in Classification via Entropic Regularization (08/07/2022)
- Noisy Label Classification using Label Noise Selection with Test-Time Augmentation Cross-Entropy and NoiseMix Learning (12/01/2022)
- Deep Learning From Crowdsourced Labels: Coupled Cross-entropy Minimization, Identifiability, and Regularization (06/05/2023)
- Matrix Smoothing: A Regularization for DNN with Transition Matrix under Noisy Labels (03/26/2020)
- Learning with Noisy Labels via Sparse Regularization (07/31/2021)
- Improved Mix-up with KL-Entropy for Learning From Noisy Labels (08/15/2019)
