
On Learning Contrastive Representations for Learning with Noisy Labels

03/03/2022
by Li Yi, et al.
Western University
NYU college

Deep neural networks can easily memorize noisy labels when trained with a softmax cross-entropy (CE) loss. Previous studies that attempted to address this issue focused on incorporating a noise-robust loss function alongside the CE loss. However, although this alleviates the memorization issue, it does not remove it, because the non-robust CE loss remains. To address this issue, we focus on learning robust contrastive representations of data on which it is hard for the classifier to memorize label noise under the CE loss. We propose a novel contrastive regularization function to learn such representations over noisy data, so that label noise does not dominate representation learning. By theoretically investigating the representations induced by the proposed regularization function, we show that the learned representations keep information related to true labels and discard information related to corrupted labels. Moreover, our theoretical results indicate that the learned representations are robust to label noise. The effectiveness of this method is demonstrated with experiments on benchmark datasets.
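To make the general recipe concrete, here is a minimal, hypothetical sketch (not the authors' exact objective) of training with a CE loss on possibly noisy labels plus an InfoNCE-style contrastive regularizer computed on two augmented views of each batch. The function names and the hyperparameters `lam` and `temperature` are illustrative assumptions, not values from the paper.

import torch
import torch.nn.functional as F


def contrastive_regularizer(z1, z2, temperature=0.5):
    """InfoNCE-style term over two L2-normalized views of the same batch."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    reps = torch.cat([z1, z2], dim=0)            # (2B, d): both views stacked
    sim = reps @ reps.t() / temperature          # pairwise cosine similarities
    n = reps.size(0)
    sim.fill_diagonal_(float('-inf'))            # exclude self-similarity
    # The positive for row i is its other augmented view (i + B mod 2B).
    targets = torch.arange(n, device=reps.device).roll(n // 2)
    return F.cross_entropy(sim, targets)


def total_loss(logits, labels, z1, z2, lam=1.0):
    """CE on (possibly noisy) labels plus the contrastive regularization term."""
    return F.cross_entropy(logits, labels) + lam * contrastive_regularizer(z1, z2)

The contrastive term depends only on the augmented views and not on the (noisy) labels, which is the intuition behind regularizing representation learning so that corrupted labels cannot dominate it; the paper's specific regularization function and its theoretical guarantees are given in the full text.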

12/08/2020

Multi-Objective Interpolation Training for Robustness to Label Noise

Deep neural networks trained with standard cross-entropy loss memorize n...
11/02/2017

Deep Learning from Noisy Image Labels with Quality Embedding

There is an emerging trend to leverage noisy image datasets in many visu...
12/08/2020

A Deep Marginal-Contrastive Defense against Adversarial Attacks on 1D Models

Deep learning algorithms have been recently targeted by attackers due to...
03/08/2022

Selective-Supervised Contrastive Learning with Noisy Labels

Deep networks have strong capacities of embedding data into latent repre...
06/27/2022

Compressing Features for Learning with Noisy Labels

Supervised learning can be viewed as distilling relevant information fro...
04/26/2021

An Exploration into why Output Regularization Mitigates Label Noise

Label noise presents a real challenge for supervised learning algorithms...
01/29/2022

Investigating Why Contrastive Learning Benefits Robustness Against Label Noise

Self-supervised contrastive learning has recently been shown to be very ...