LNL+K: Learning with Noisy Labels and Noise Source Distribution Knowledge

06/20/2023
by Siqi Wang, et al.

Learning with noisy labels (LNL) is challenging because models tend to memorize noisy labels, which leads to overfitting. Many LNL methods detect clean samples by maximizing the similarity between samples within each category, making no assumptions about likely noise sources. In practice, however, we often have some knowledge about the potential source(s) of noisy labels. For example, an image mislabeled as a cheetah is more likely a leopard than a hippopotamus due to their visual similarity. We therefore introduce a new task, Learning with Noisy Labels and noise source distribution Knowledge (LNL+K), which assumes we have some knowledge about the likely source(s) of label noise that we can take advantage of. Under this assumption, methods are better equipped to distinguish hard negatives between categories from label noise. It also enables us to explore datasets where the noise may represent the majority of samples, a setting that breaks a critical premise of most methods developed for the LNL task. We explore several baseline LNL+K approaches that integrate noise source knowledge into state-of-the-art LNL methods across three diverse datasets and three types of noise, where we report a 5-15% boost in performance. Critically, we find that LNL methods do not generalize well in every setting, highlighting the importance of directly exploring our LNL+K task.
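The core intuition of using noise source knowledge can be illustrated with a toy sketch. This is an illustrative heuristic, not the paper's actual method: all function names and the centroid-distance rule below are assumptions. The idea is that a sample whose features lie closer to a known noise-source class centroid than to its own labeled class centroid is a likely candidate for label noise, while samples merely far from their class centroid may just be hard negatives.

```python
import numpy as np

def flag_likely_noise(features, labels, centroids, noise_sources, margin=0.0):
    """Flag samples whose features lie closer to a known noise-source
    class centroid than to their own labeled class centroid.

    noise_sources maps each class to the classes its noisy labels
    likely come from (e.g., cheetah -> [leopard]).
    """
    flags = []
    for x, y in zip(features, labels):
        d_label = np.linalg.norm(x - centroids[y])
        d_src = [np.linalg.norm(x - centroids[s]) for s in noise_sources.get(y, [])]
        # Flag only when some known noise-source class is clearly closer.
        flags.append(bool(d_src) and min(d_src) + margin < d_label)
    return np.array(flags)

# Toy example: class 1 (leopard) is the known noise source for class 0 (cheetah).
centroids = {0: np.array([0.0, 0.0]), 1: np.array([10.0, 0.0])}
features = np.array([[9.0, 0.0], [1.0, 0.0]])  # both labeled as class 0
labels = [0, 0]
print(flag_likely_noise(features, labels, centroids, {0: [1]}))
# The first sample sits near the noise-source centroid and is flagged;
# the second is a plausible clean sample.
```

Note that without the `noise_sources` prior, a plain similarity-based detector would treat both samples identically based only on distance to their labeled class, which is exactly the limitation the LNL+K task targets.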

Related research:

- Class2Simi: A New Perspective on Learning with Label Noise (06/14/2020). Label noise is ubiquitous in the era of big data. Deep learning algorith...
- PropMix: Hard Sample Filtering and Proportional MixUp for Learning with Noisy Labels (10/22/2021). The most competitive noisy label learning methods rely on an unsupervise...
- Towards Robust Learning with Different Label Noise Distributions (12/18/2019). Noisy labels are an unavoidable consequence of automatic image labeling ...
- One-Step Abductive Multi-Target Learning with Diverse Noisy Samples (10/20/2021). One-step abductive multi-target learning (OSAMTL) was proposed to handle...
- Instance-Dependent Noisy Label Learning via Graphical Modelling (09/02/2022). Noisy labels are unavoidable yet troublesome in the ecosystem of deep le...
- ReSup: Reliable Label Noise Suppression for Facial Expression Recognition (05/29/2023). Because of the ambiguous and subjective property of the facial expressio...
- Differences Between Hard and Noisy-labeled Samples: An Empirical Study (07/20/2023). Extracting noisy or incorrectly labeled samples from a labeled dataset w...
