A Framework using Contrastive Learning for Classification with Noisy Labels

04/19/2021
by Madalina Ciortan, et al.

We propose a framework that uses contrastive learning as a pre-training task for image classification in the presence of noisy labels. Recent strategies, such as pseudo-labeling, sample selection with Gaussian mixture models, and weighted supervised contrastive learning, are combined into a fine-tuning phase that follows the pre-training. This paper provides an extensive empirical study showing that a preliminary contrastive learning step brings a significant gain in performance across different loss functions: non-robust, robust, and early-learning regularized. Our experiments on standard benchmarks and real-world datasets demonstrate that: i) contrastive pre-training increases the robustness of any loss function to noisy labels, and ii) the additional fine-tuning phase can further improve accuracy, but at the cost of additional complexity.
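To make the two-stage recipe concrete, the sketch below shows the standard building blocks this kind of pipeline relies on: a SimCLR-style NT-Xent loss for the contrastive pre-training phase, and a DivideMix-style two-component Gaussian mixture fit on per-sample losses for clean-sample selection during fine-tuning. This is a minimal illustration under our own assumptions, not the authors' implementation; the function names nt_xent_loss and select_clean_samples, and all hyperparameter values, are ours for illustration.

import numpy as np
import torch
import torch.nn.functional as F
from sklearn.mixture import GaussianMixture


def nt_xent_loss(z1, z2, temperature=0.5):
    # SimCLR-style contrastive loss over two augmented views of a batch.
    # z1, z2: (N, d) projections of the two views of the same N images.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d), unit norm
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))                   # exclude self-pairs
    n = z1.size(0)
    # The positive of view-1 sample i is view-2 sample i (index i + n), and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)


def select_clean_samples(per_sample_losses, threshold=0.5):
    # DivideMix-style sample selection: fit a two-component Gaussian mixture
    # to per-sample training losses; the low-mean component is assumed to
    # hold the cleanly labeled samples, since noisily labeled samples keep
    # a higher loss early in training before being memorized.
    losses = np.asarray(per_sample_losses, dtype=np.float64).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, max_iter=100).fit(losses)
    clean_component = int(gmm.means_.argmin())
    p_clean = gmm.predict_proba(losses)[:, clean_component]
    return p_clean > threshold  # boolean mask of samples to treat as clean

During pre-training, nt_xent_loss would be applied to the projections of two random augmentations of each image; during fine-tuning, the mask from select_clean_samples would gate which labels contribute to the supervised (or weighted supervised contrastive) objective.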

Related research

02/27/2023 · The Role of Pre-training Data in Transfer Learning
The transfer learning paradigm of model pre-training and subsequent fine...

02/13/2023 · Understanding Multimodal Contrastive Learning and Incorporating Unpaired Data
Language-supervised vision models have recently attracted great attentio...

12/08/2020 · Multi-Objective Interpolation Training for Robustness to Label Noise
Deep neural networks trained with standard cross-entropy loss memorize n...

10/09/2021 · Adversarial Training for Face Recognition Systems using Contrastive Adversarial Learning and Triplet Loss Fine-tuning
Though much work has been done in the domain of improving the adversaria...

05/27/2021 · Contrastive Fine-tuning Improves Robustness for Neural Rankers
The performance of state-of-the-art neural rankers can deteriorate subst...

01/03/2023 · Graph Contrastive Learning for Multi-omics Data
Advancements in technologies related to working with omics data require ...

08/07/2023 · Towards General Text Embeddings with Multi-stage Contrastive Learning
We present GTE, a general-purpose text embedding model trained with mult...