Pumpout: A Meta Approach for Robustly Training Deep Neural Networks with Noisy Labels

09/28/2018
by   Bo Han, et al.

It is challenging to train deep neural networks robustly on industrial-scale data, since the labels of such data are heavily noisy and their label generation processes are typically agnostic. To handle these issues, by exploiting the memorization effect of deep neural networks, we may train deep neural networks on the whole dataset for only the first few iterations. Then, we may employ early stopping or the small-loss trick to train them on selected instances. However, in such training procedures, deep neural networks inevitably memorize some noisy labels, which degrades their generalization. In this paper, we propose a meta algorithm called Pumpout to overcome the problem of memorizing noisy labels. Using scaled stochastic gradient ascent, Pumpout actively squeezes out the negative effects of noisy labels from the training model, instead of passively forgetting them. We leverage Pumpout to upgrade two representative methods: MentorNet and Backward Correction. Empirical results on benchmark datasets demonstrate that Pumpout significantly improves the robustness of these representative methods.
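Concretely, a Pumpout-style update can be read as a per-batch rule: perform gradient descent on instances judged likely clean, and scaled gradient ascent on the rest to squeeze their influence back out of the model. The PyTorch sketch below is a minimal illustration under assumptions of ours, not the paper's exact algorithm: the small-loss selection rule and the hyper-parameters `gamma` and `fit_rate` are hypothetical stand-ins for how an upgraded base method such as MentorNet or Backward Correction would flag non-fitting labels.

```python
import torch

def pumpout_step(model, optimizer, criterion, x, y, gamma=0.1, fit_rate=0.7):
    # Per-instance losses; criterion must use reduction='none',
    # e.g. torch.nn.CrossEntropyLoss(reduction='none').
    losses = criterion(model(x), y)

    # Small-loss trick (illustrative): treat the fit_rate fraction of the
    # batch with the smallest losses as likely-clean, the rest as noisy.
    idx = torch.argsort(losses)
    n_fit = int(fit_rate * losses.numel())

    # Gradient descent on likely-clean instances.
    loss = losses[idx[:n_fit]].mean()
    if n_fit < losses.numel():
        # Scaled stochastic gradient ascent on likely-noisy instances:
        # minimizing -gamma * loss ascends their loss, scaled by gamma.
        loss = loss - gamma * losses[idx[n_fit:]].mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the actual methods, the likely-noisy set would come from the upgraded base algorithm (e.g. MentorNet's sample weights or the sign of the backward-corrected loss) rather than from a fixed fraction as assumed here.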


