Noisy Concurrent Training for Efficient Learning under Label Noise

09/17/2020 · by Fahad Sarfraz, et al.

Deep neural networks (DNNs) fail to learn effectively under label noise and have been shown to memorize random labels, which degrades their generalization performance. We identify learning in isolation, the use of one-hot encoded labels as the sole source of supervision, and the lack of regularization to discourage memorization as the major shortcomings of the standard training procedure. We therefore propose Noisy Concurrent Training (NCT), which leverages collaborative learning to use the consensus between two models as an additional source of supervision. Furthermore, inspired by trial-to-trial variability in the brain, we propose a counter-intuitive regularization technique, target variability, which entails randomly changing the labels of a percentage of the training samples in each batch as a deterrent to memorization and over-generalization in DNNs. Target variability is applied independently to each model to keep the models diverged and to avoid confirmation bias. Since DNNs tend to learn simple patterns first before memorizing noisy labels, we employ a dynamic learning scheme in which the two models rely increasingly on their consensus as training progresses. NCT also progressively increases the target variability rate to avoid memorization in the later stages of training. We demonstrate the effectiveness of our approach on both synthetic and real-world noisy benchmark datasets.
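To make the two core ingredients concrete, here is a minimal PyTorch-style sketch of target variability and the consensus objective. The function names (`apply_target_variability`, `nct_losses`), the KL-divergence form of the consensus term, and the scalars `alpha` (consensus weight) and `gamma` (label-perturbation rate) are illustrative assumptions for exposition, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def apply_target_variability(labels, num_classes, rate):
    # Randomly reassign a fraction `rate` of the labels in the batch to
    # uniformly sampled classes (deliberately, possibly incorrect ones).
    labels = labels.clone()
    mask = torch.rand(labels.size(0), device=labels.device) < rate
    n_flipped = int(mask.sum().item())
    labels[mask] = torch.randint(num_classes, (n_flipped,), device=labels.device)
    return labels

def nct_losses(model_a, model_b, x, y, num_classes, alpha, gamma):
    # Hypothetical helper: one NCT loss computation for a pair of models.
    logits_a, logits_b = model_a(x), model_b(x)

    # Target variability is applied independently per model so the two
    # networks stay diverged and confirmation bias is reduced.
    y_a = apply_target_variability(y, num_classes, gamma)
    y_b = apply_target_variability(y, num_classes, gamma)

    # Supervised term: cross-entropy on the perturbed one-hot targets.
    sup_a = F.cross_entropy(logits_a, y_a)
    sup_b = F.cross_entropy(logits_b, y_b)

    # Consensus term (assumed KL form): each model matches its peer's
    # detached predictions, which serve as an extra source of supervision.
    con_a = F.kl_div(F.log_softmax(logits_a, dim=1),
                     F.softmax(logits_b.detach(), dim=1),
                     reduction="batchmean")
    con_b = F.kl_div(F.log_softmax(logits_b, dim=1),
                     F.softmax(logits_a.detach(), dim=1),
                     reduction="batchmean")

    # alpha grows over training so the models increasingly rely on their
    # consensus; gamma is likewise increased in later stages.
    loss_a = (1.0 - alpha) * sup_a + alpha * con_a
    loss_b = (1.0 - alpha) * sup_b + alpha * con_b
    return loss_a, loss_b
```

In a full training loop, `alpha` and `gamma` would follow ramp-up schedules over epochs, reflecting the observation that DNNs learn simple patterns first and memorize noisy labels only later.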

