Limited Gradient Descent: Learning With Noisy Labels

11/20/2018
by Yi Sun, et al.

Label noise can impair the generalization of classifiers, and how to effectively learn the main pattern from samples with noisy labels is an important problem. Recent studies have shown that deep neural networks tend to prioritize learning simple patterns first and memorize noise patterns later. This suggests a way to search for the best generalization: learn the main pattern and stop training just before the noise begins to be memorized. A natural idea is to determine this stopping point in a supervised manner, for example by monitoring a clean validation set; in practice, however, a clean validation set is often hard to obtain. To address this problem, we propose an unsupervised method, limited gradient descent, which estimates the best stopping point. We modify the labels of a few samples in the noisy dataset so that they are almost certainly false, forming a reverse pattern. By monitoring the learning progress on the noisy samples and the reverse samples, we can determine when to stop training. In this paper, we also provide some sufficient conditions for learning with noisy labels. Experimental results on CIFAR-10 demonstrate that our approach achieves generalization comparable to supervised methods. For simpler datasets such as MNIST, we add a relabeling strategy to further improve generalization and achieve state-of-the-art performance.
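The core idea can be illustrated with a minimal sketch: flip the labels of a small "reverse" subset so they are almost certainly wrong, train on the combined data, and treat rising accuracy on the flipped labels as a sign that memorization has started. The synthetic dataset, model, subset size, and stopping threshold below are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch of the limited-gradient-descent stopping idea: stop training when the
# network begins to fit a small set of deliberately flipped ("reverse") labels,
# which serves as a proxy for the onset of noise memorization.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic 2-class data with symmetric label noise (stand-in for a real dataset).
n, d, num_classes = 2000, 20, 2
x = torch.randn(n, d)
true_y = (x[:, 0] > 0).long()
noisy_y = true_y.clone()
noise_mask = torch.rand(n) < 0.3          # 30% label noise (assumption)
noisy_y[noise_mask] = 1 - noisy_y[noise_mask]

# Reverse subset: a few samples whose labels we intentionally flip so that they
# are (almost) certainly false. Fitting them signals memorization.
reverse_idx = torch.randperm(n)[:100]     # subset size is an assumption
train_y = noisy_y.clone()
train_y[reverse_idx] = 1 - train_y[reverse_idx]

model = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, num_classes))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def accuracy(logits, labels):
    return (logits.argmax(dim=1) == labels).float().mean().item()

stop_epoch = None
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), train_y)
    loss.backward()
    opt.step()

    with torch.no_grad():
        logits = model(x)
        noisy_acc = accuracy(logits, train_y)                          # overall training fit
        reverse_acc = accuracy(logits[reverse_idx], train_y[reverse_idx])

    # Heuristic stopping rule (an assumption): while only the main pattern is
    # being learned, the flipped labels remain poorly fit; once accuracy on the
    # reverse subset climbs well above chance, memorization has begun.
    if reverse_acc > 0.6:
        stop_epoch = epoch
        break

print("estimated stop epoch:", stop_epoch)
```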

