Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach

09/13/2016
by   Giorgio Patrini, et al.

We present a theoretically grounded approach to train deep neural networks, including recurrent networks, subject to class-dependent label noise. We propose two procedures for loss correction that are agnostic to both application domain and network architecture. They amount to at most a matrix inversion and a multiplication, provided that we know the probability of each class being corrupted into another. We further show how these probabilities can be estimated, adapting a recent technique for noise estimation to the multi-class setting and thus providing an end-to-end framework. Extensive experiments on MNIST, IMDB, CIFAR-10, CIFAR-100 and a large-scale dataset of clothing images, employing a diversity of architectures (stacking dense, convolutional, pooling, dropout, batch normalization, word embedding, LSTM and residual layers), demonstrate the noise robustness of our proposals. Incidentally, we also prove that, when ReLU is the only non-linearity, the loss curvature is immune to class-dependent label noise.
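The two corrections described in the abstract can be sketched concretely. Assuming a known noise transition matrix T, where T[i, j] is the probability that a clean label i is flipped to a noisy label j, "backward" correction multiplies the vector of per-class losses by T⁻¹, while "forward" correction pushes the model's predicted distribution through T before computing the loss. The function names below are illustrative, not from the paper; this is a minimal NumPy sketch of the idea for a single example.

```python
import numpy as np

def backward_corrected_loss(probs, noisy_label, T):
    """Backward correction: multiply the per-class loss vector by T^{-1}.

    probs: model's predicted class probabilities, shape (num_classes,)
    noisy_label: the (possibly corrupted) observed label
    T: noise transition matrix, T[i, j] = p(noisy=j | clean=i)
    """
    T_inv = np.linalg.inv(T)
    losses = -np.log(probs)        # per-class cross-entropy: loss if label were c
    corrected = T_inv @ losses     # linear combination of losses across classes
    return corrected[noisy_label]

def forward_corrected_loss(probs, noisy_label, T):
    """Forward correction: map clean predictions to noisy-label space via T."""
    noisy_probs = T.T @ probs      # p(noisy=j) = sum_i T[i, j] * p(clean=i)
    return -np.log(noisy_probs[noisy_label])
```

With T equal to the identity (no noise), both reduce to the ordinary cross-entropy loss. The backward correction has the unbiasedness property that, averaging over the noise process, the corrected loss on noisy labels equals the loss on the clean label.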

