Instance Correction for Learning with Open-set Noisy Labels

06/01/2021
by Xiaobo Xia et al.

In the problem of open-set noisy labels, part of the training data comes from a label space that does not contain the true class. Many approaches, e.g., loss correction and label correction, cannot handle open-set noisy labels well, since they require the training and test data to share the same label space, which does not hold when learning with open-set noisy labels. State-of-the-art methods therefore employ the sample selection approach, which tries to select clean data from the noisy data for network parameter updates. The discarded data are regarded as mislabeled and do not participate in training. Such an approach is intuitive and reasonable at first glance. However, a natural question arises: can such data only be discarded during training? In this paper, we show that the answer is no. Specifically, we argue that the instances of the discarded data can contain meaningful information for generalization. For this reason, we do not abandon such data, but use instance correction to modify the instances of the discarded data so that the predictions for the discarded data become consistent with the given labels. Instance correction is performed by targeted adversarial attacks. The corrected data are then exploited for training to improve generalization. In addition to the analytical results, extensive empirical evidence is provided to justify our claims.
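As a rough illustration of how instance correction by a targeted adversarial attack could look, the sketch below perturbs the discarded instances so that the network's predictions move toward their given labels. It assumes PyTorch, and the names (model, discarded_x, given_y, epsilon, steps) are illustrative placeholders, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F


def correct_instances(model, discarded_x, given_y, epsilon=8 / 255, steps=5):
    """Targeted PGD-style correction: nudge discarded instances so that the
    model's predictions become consistent with their given labels."""
    model.eval()
    x_adv = discarded_x.clone().detach()
    alpha = epsilon / steps
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), given_y)
        grad, = torch.autograd.grad(loss, x_adv)
        # Targeted step: descend the loss w.r.t. the given label.
        x_adv = x_adv.detach() - alpha * grad.sign()
        # Keep the perturbation bounded and the inputs in a valid range.
        x_adv = torch.min(torch.max(x_adv, discarded_x - epsilon),
                          discarded_x + epsilon)
        x_adv = x_adv.clamp(0.0, 1.0)
    return x_adv.detach()
```

Under this reading, the corrected instances can then be mixed back with the selected clean data for subsequent parameter updates.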

Related research

11/19/2020  Error-Bounded Correction of Noisy Labels
To collect large scale annotated data, it is inevitable to introduce lab...

03/31/2018  Iterative Learning with Open-set Noisy Labels
Large-scale datasets possessing clean label annotations are crucial for ...

11/29/2018  Learning with Labels of Existing and Nonexisting
We study the classification or detection problems where the label only s...

09/02/2023  Regularly Truncated M-estimators for Learning with Noisy Labels
The sample selection approach is very popular in learning with noisy lab...

08/23/2022  Learning from Noisy Labels with Coarse-to-Fine Sample Credibility Modeling
Training deep neural network (DNN) with noisy labels is practically chal...

11/06/2019  Searching to Exploit Memorization Effect in Learning from Corrupted Labels
Sample-selection approaches, which attempt to pick up clean instances fr...

11/24/2018  Alternating Loss Correction for Preterm-Birth Prediction from EHR Data with Noisy Labels
In this paper we are interested in the prediction of preterm birth based...
