Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels

03/25/2021
by Evgenii Zheltonozhskii, et al.

The success of learning with noisy labels (LNL) methods relies heavily on the success of a warm-up stage where standard supervised training is performed using the full (noisy) training set. In this paper, we identify a "warm-up obstacle": the inability of standard warm-up stages to train high-quality feature extractors and avert memorization of noisy labels. We propose "Contrast to Divide" (C2D), a simple framework that solves this problem by pre-training the feature extractor in a self-supervised fashion. Using self-supervised pre-training boosts the performance of existing LNL approaches by drastically reducing the warm-up stage's susceptibility to noise level, shortening its duration, and increasing extracted feature quality. C2D works out of the box with existing methods and demonstrates markedly improved performance, especially in the high noise regime, where we get a boost of more than 27% for CIFAR-100 with 90% noise over the previous state of the art. In real-life noise settings, C2D trained on mini-WebVision outperforms previous works on both the WebVision and ImageNet validation sets by 3% top-1 accuracy. We perform an in-depth analysis of the framework, including investigating the performance of different pre-training approaches and estimating the effective upper bound of LNL performance with semi-supervised learning. Code for reproducing our experiments is available at https://github.com/ContrastToDivide/C2D
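Conceptually, the C2D recipe replaces the randomly initialized warm-up with one that starts from self-supervised weights. Below is a minimal PyTorch sketch of that warm-up stage, assuming a SimCLR-style checkpoint on disk and a standard DataLoader over the noisy training set; the function name, checkpoint path, and hyperparameters are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torchvision


def c2d_warmup(ssl_checkpoint: str, noisy_loader, num_classes: int = 100,
               warmup_epochs: int = 5, device: str = "cuda"):
    """Warm-up stage of the C2D recipe (illustrative sketch): start from
    self-supervised weights, then briefly train with standard cross-entropy
    on the noisy labels before handing off to an LNL method."""
    # Backbone initialized from self-supervised pre-training (e.g., SimCLR)
    # instead of random weights; only the classifier head starts random.
    model = torchvision.models.resnet50(weights=None)
    state = torch.load(ssl_checkpoint, map_location="cpu")  # hypothetical checkpoint
    model.load_state_dict(state, strict=False)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    model.to(device).train()

    optimizer = torch.optim.SGD(model.parameters(), lr=0.02,
                                momentum=0.9, weight_decay=5e-4)
    criterion = nn.CrossEntropyLoss()

    # A short warm-up suffices: the pre-trained features are already good,
    # which limits memorization of the noisy labels.
    for _ in range(warmup_epochs):
        for images, noisy_targets in noisy_loader:
            images, noisy_targets = images.to(device), noisy_targets.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), noisy_targets)
            loss.backward()
            optimizer.step()

    # The warmed-up model is then passed to an existing LNL method
    # (e.g., DivideMix-style loss modeling + semi-supervised training).
    return model
```

The design point illustrated here is that C2D changes only the initialization: because the backbone already extracts useful features, the warm-up can be much shorter and is far less prone to memorizing noisy labels before the downstream LNL method takes over.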

