Emphasis Regularisation by Gradient Rescaling for Training Deep Neural Networks with Noisy Labels

05/27/2019
by   Xinshao Wang, et al.

It is fundamental yet challenging to train robust and accurate Deep Neural Networks (DNNs) when noisy labels exist. Although great progress has been made, one crucial research question remains under-explored: which training examples should be focused on, and how much should they be emphasised, when training DNNs under label noise? In this work, we study this question and propose gradient rescaling (GR) to address it. GR modifies the magnitude of the logit vector's gradient to emphasise relatively easier training data points when severe noise exists, functioning as explicit emphasis regularisation that improves the generalisation performance of DNNs. Beyond regularisation, we also interpret GR from the perspectives of sample reweighting and robust loss function design; our proposed GR therefore helps connect these three approaches in the literature. We empirically demonstrate that GR is highly noise-robust and outperforms state-of-the-art noise-tolerant algorithms by a large margin, e.g., increasing accuracy by 7% under 40% label noise. Furthermore, we present comprehensive ablation studies exploring the behaviour of GR in different settings, which is informative for applying GR in real-world scenarios.
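To make the core idea concrete, the following is a minimal toy sketch of gradient rescaling in the spirit described above, not the paper's actual method: each example's cross-entropy gradient with respect to the logits is rescaled by a per-example weight derived from how confidently the model predicts the labelled class (so "easier" examples receive more emphasis). The weighting function `p_true ** beta` and the parameter `beta` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def gr_logit_gradient(logits, labels, beta=2.0):
    """Toy gradient rescaling: scale each example's logit gradient by a
    weight that grows with the probability assigned to its label.

    `beta` (assumed, illustrative) controls how strongly easier examples
    are emphasised relative to harder (possibly mislabelled) ones.
    """
    p = softmax(logits)
    n = logits.shape[0]
    grad = p.copy()
    grad[np.arange(n), labels] -= 1.0          # standard CE gradient w.r.t. logits
    p_true = p[np.arange(n), labels]           # confidence on the labelled class
    w = p_true ** beta                         # emphasise higher-confidence examples
    w = w / (w.mean() + 1e-12)                 # keep the average gradient scale unchanged
    return grad * w[:, None]
```

Under this toy weighting, an example the model already fits well has its gradient scaled up relative to a low-confidence (potentially noisy) one, which is the qualitative behaviour the abstract attributes to GR under severe noise.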

