Bootstrapping the Relationship Between Images and Their Clean and Noisy Labels

10/17/2022
by Brandon Smart, et al.

Many state-of-the-art noisy-label learning methods rely on learning mechanisms that estimate the samples' clean labels during training and discard their original noisy labels. However, this approach prevents the learning of the relationship between images, noisy labels and clean labels, which has been shown to be useful when dealing with instance-dependent label noise problems. Furthermore, methods that do aim to learn this relationship require cleanly annotated subsets of data, as well as distillation or multi-faceted models for training. In this paper, we propose a new training algorithm that relies on a simple model to learn the relationship between clean and noisy labels without the need for a cleanly labelled subset of data. Our algorithm follows a 3-stage process, namely: 1) self-supervised pre-training followed by an early-stopping training of the classifier to confidently predict clean labels for a subset of the training set; 2) use the clean set from stage (1) to bootstrap the relationship between images, noisy labels and clean labels, which we exploit for effective relabelling of the remaining training set using semi-supervised learning; and 3) supervised training of the classifier with all relabelled samples from stage (2). By learning this relationship, we achieve state-of-the-art performance in asymmetric and instance-dependent label noise problems.
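
To make the abstract's 3-stage process concrete, here is a minimal, self-contained PyTorch sketch of that recipe. It rests on heavy simplifying assumptions: a tiny MLP on random features instead of an image backbone, a hypothetical 0.9 confidence threshold for selecting the clean subset, and plain pseudo-labelling standing in for the paper's learned image/noisy-label/clean-label relationship and semi-supervised relabelling. It illustrates the flow of the three stages, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_classifier(num_features=32, num_classes=10):
    # Tiny MLP standing in for the image classifier.
    return nn.Sequential(nn.Linear(num_features, 64), nn.ReLU(), nn.Linear(64, num_classes))


def fit(model, x, y, epochs):
    # Plain supervised training loop with cross-entropy.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt.step()


# Toy data standing in for images and their (possibly corrupted) labels.
x = torch.randn(512, 32)
noisy_y = torch.randint(0, 10, (512,))

# Stage 1: self-supervised pre-training of the backbone would go here, followed
# by early-stopped training on the noisy labels so the classifier has not yet
# memorised the noise.
model = make_classifier()
fit(model, x, noisy_y, epochs=5)  # few epochs as a crude early-stopping proxy

with torch.no_grad():
    probs = F.softmax(model(x), dim=1)
    conf, pred = probs.max(dim=1)
clean_mask = conf > 0.9  # hypothetical confidence threshold for the "clean" subset

# Stage 2: relabel the remaining samples. The paper bootstraps the relationship
# between images, noisy labels and clean labels and relabels with
# semi-supervised learning; plain pseudo-labelling stands in for that here.
relabelled_y = noisy_y.clone()
relabelled_y[clean_mask] = pred[clean_mask]    # trusted predictions on the confident subset
relabelled_y[~clean_mask] = pred[~clean_mask]  # pseudo-labels for everything else

# Stage 3: ordinary supervised training of the classifier on the relabelled set.
final_model = make_classifier()
fit(final_model, x, relabelled_y, epochs=20)
```

In the actual method, stage 2 keeps the original noisy labels and models their relationship to the images and estimated clean labels, rather than overwriting them with raw model predictions as this sketch does.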

Related research

03/06/2021
LongReMix: Robust Learning with High Confidence Samples in a Noisy Label Environment
Deep neural network models are robust to a limited amount of label noise...

11/20/2018
Limited Gradient Descent: Learning With Noisy Labels
Label noise may handicap the generalization of classifiers, and it is an...

07/31/2018
A Robust Deep Attention Network to Noisy Labels in Semi-supervised Biomedical Segmentation
Learning-based methods suffer from limited clean annotations, especially...

05/05/2023
Uncertainty-Aware Bootstrap Learning for Joint Extraction on Distantly-Supervised Data
Jointly extracting entity pairs and their relations is challenging when ...

11/20/2022
SplitNet: Learnable Clean-Noisy Label Splitting for Learning with Noisy Labels
Annotating the dataset with high-quality labels is crucial for performan...

07/29/2022
Centrality and Consistency: Two-Stage Clean Samples Identification for Learning with Instance-Dependent Noisy Labels
Deep models trained with noisy labels are prone to over-fitting and stru...

11/20/2019
Robust Triple-Matrix-Recovery-Based Auto-Weighted Label Propagation for Classification
The graph-based semi-supervised label propagation algorithm has delivere...
