PARS: Pseudo-Label Aware Robust Sample Selection for Learning with Noisy Labels

01/26/2022
by Arushi Goel, et al.

Acquiring accurate labels on large-scale datasets is both time-consuming and expensive. To reduce the dependency of deep learning models on clean labeled data, several recent research efforts focus on learning with noisy labels. These methods typically fall into three design categories for learning a noise-robust model: sample selection approaches, noise-robust loss functions, or label correction methods. In this paper, we propose PARS: Pseudo-Label Aware Robust Sample Selection, a hybrid approach that combines the strengths of all three categories in a joint-training framework to achieve robustness to noisy labels. Specifically, PARS exploits all training samples using both the raw/noisy labels and estimated/refurbished pseudo-labels obtained via self-training, divides samples into an ambiguous and a noisy subset via loss analysis, and designs label-dependent noise-aware loss functions for both sets of filtered labels. Results show that PARS significantly outperforms the state of the art in extensive studies on the noisy CIFAR-10 and CIFAR-100 datasets, particularly in challenging high-noise and low-resource settings. In particular, PARS achieves an absolute 12% improvement in test accuracy with 90% label noise when only 1/5 of the noisy labels are available during training as an additional restriction. On a real-world noisy dataset, Clothing1M, PARS achieves results competitive with the state of the art.
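To make the three ingredients named in the abstract concrete, here is a minimal PyTorch sketch of that style of training step: per-sample loss analysis splits a batch into an "ambiguous" and a "noisy" subset, self-training produces confident pseudo-labels, and a raw-label loss is combined with a pseudo-label loss. This is not the authors' implementation; the quantile-based split, the confidence threshold, and names such as `pars_style_batch_loss`, `loss_split_quantile`, and `pseudo_conf_threshold` are illustrative assumptions, not parameters from the paper.

```python
# Illustrative sketch of a loss-analysis + pseudo-label training step.
# Assumes `model` maps a batch of inputs to class logits; hyperparameters are placeholders.
import torch
import torch.nn.functional as F


def pars_style_batch_loss(model, x, noisy_y,
                          loss_split_quantile=0.5,
                          pseudo_conf_threshold=0.9):
    """Return a combined noise-aware loss for one batch (illustrative only)."""
    logits = model(x)

    # 1) Loss analysis against the raw/noisy labels: small-loss samples are treated
    #    as "ambiguous" (labels likely usable), the rest as "noisy".
    per_sample = F.cross_entropy(logits, noisy_y, reduction="none")
    with torch.no_grad():
        cutoff = torch.quantile(per_sample, loss_split_quantile)
        ambiguous = per_sample <= cutoff

    # 2) Pseudo-labels via self-training: keep only confident model predictions.
    with torch.no_grad():
        probs = logits.softmax(dim=-1)
        conf, pseudo_y = probs.max(dim=-1)
        trusted = conf >= pseudo_conf_threshold

    # 3) Label-dependent losses: raw labels on the ambiguous subset,
    #    confident pseudo-labels on every sample that has one.
    loss = logits.new_zeros(())
    if ambiguous.any():
        loss = loss + per_sample[ambiguous].mean()
    if trusted.any():
        loss = loss + F.cross_entropy(logits[trusted], pseudo_y[trusted])
    return loss
```

The single-batch quantile split here stands in for the more careful loss analysis used by sample-selection methods (e.g., fitting a mixture model over per-sample losses across an epoch), and the pseudo-labels would typically come from a momentum or earlier-epoch model rather than the current forward pass; both simplifications are for brevity.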


Related Research

04/30/2022
Reliable Label Correction is a Good Booster When Learning with Extremely Noisy Labels
Learning with noisy labels has aroused much research interest since data...

10/10/2022
Is your noise correction noisy? PLS: Robustness to label noise with two stage detection
Designing robust algorithms capable of training accurate neural networks...

07/06/2021
An Ensemble Noise-Robust K-fold Cross-Validation Selection Method for Noisy Labels
We consider the problem of training robust and accurate deep neural netw...

12/02/2022
Model and Data Agreement for Learning with Noisy Labels
Learning with noisy labels is a vital topic for practical deep learning ...

11/16/2022
Learning with Noisy Labels over Imbalanced Subpopulations
Learning with Noisy Labels (LNL) has attracted significant attention fro...

08/17/2022
CTRL: Clustering Training Losses for Label Error Detection
In supervised machine learning, use of correct labels is extremely impor...

08/24/2022
Self-Filtering: A Noise-Aware Sample Selection for Label Noise with Confidence Penalization
Sample selection is an effective strategy to mitigate the effect of labe...
