Selective-Supervised Contrastive Learning with Noisy Labels

03/08/2022
by Shikun Li, et al.

Deep networks have strong capacities for embedding data into latent representations and performing downstream tasks. However, these capacities largely depend on high-quality annotated labels, which are expensive to collect. Noisy labels are more affordable, but they corrupt the learned representations and lead to poor generalization. To learn robust representations and handle noisy labels, we propose selective-supervised contrastive learning (Sel-CL) in this paper. Specifically, Sel-CL extends supervised contrastive learning (Sup-CL), which is powerful for representation learning but degrades when labels are noisy. Sel-CL tackles the direct cause of this degradation: because Sup-CL works in a pair-wise manner, noisy pairs built from noisy labels mislead representation learning. To alleviate this issue, we select confident pairs out of the noisy ones for Sup-CL, without requiring knowledge of noise rates. In the selection process, we first identify confident examples by measuring the agreement between learned representations and given labels, and use them to build confident pairs. The representation-similarity distribution over these confident pairs is then exploited to identify additional confident pairs among the remaining noisy ones. All obtained confident pairs are finally used in Sup-CL to enhance representations. Experiments on multiple noisy datasets demonstrate the robustness of the representations learned by our method, which achieves state-of-the-art performance. Source code is available at https://github.com/ShikunLi/Sel-CL
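To make the two-stage selection idea concrete, below is a minimal PyTorch sketch of how confident pairs might be chosen before applying the Sup-CL loss. This is an illustration under assumptions, not the authors' implementation: the neighbor-voting agreement measure, the temperature, and the thresholds (example_thresh, pair_quantile) are simplified, hypothetical stand-ins; see the linked repository for the actual method.

```python
# Minimal sketch of Sel-CL-style confident-pair selection (assumptions noted above).
import torch
import torch.nn.functional as F

def select_confident_pairs(features, labels, example_thresh=0.9, pair_quantile=0.5):
    """features: L2-normalized embeddings (N x D); labels: noisy class labels (N,), LongTensor.
    Returns a boolean N x N mask of pairs considered confident for Sup-CL."""
    sim = features @ features.t()  # pairwise cosine similarities

    # Stage 1: confident examples. Agreement between representations and given
    # labels is approximated here by soft label votes from similar examples.
    num_classes = int(labels.max()) + 1
    one_hot = F.one_hot(labels, num_classes).float()
    weights = F.softmax(sim / 0.1, dim=1)          # similarity-weighted neighbors
    soft_labels = weights @ one_hot                # per-example label distribution
    agreement = soft_labels.gather(1, labels.view(-1, 1)).squeeze(1)
    confident_examples = agreement > example_thresh

    # Stage 2: confident pairs are same-label pairs between confident examples.
    same_label = labels.view(-1, 1) == labels.view(1, -1)
    conf_mask = confident_examples.view(-1, 1) & confident_examples.view(1, -1)
    confident_pairs = same_label & conf_mask

    # Use the similarity distribution of confident pairs to admit more pairs
    # from the remaining noisy ones, without assuming a known noise rate.
    if confident_pairs.any():
        sim_thresh = sim[confident_pairs].quantile(pair_quantile)
        confident_pairs = confident_pairs | (same_label & (sim >= sim_thresh))

    confident_pairs.fill_diagonal_(False)
    return confident_pairs
```

A mask like this would then gate which pairs contribute as positives to the supervised contrastive loss, for example by zeroing out non-confident entries of the pairwise loss matrix.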

Related research

03/13/2023 · Twin Contrastive Learning with Noisy Labels
Learning from noisy data is a challenging task that significantly degene...

03/03/2022 · On Learning Contrastive Representations for Learning with Noisy Labels
Deep neural networks are able to memorize noisy labels easily with a sof...

01/29/2022 · Investigating Why Contrastive Learning Benefits Robustness Against Label Noise
Self-supervised contrastive learning has recently been shown to be very ...

02/01/2022 · HCSC: Hierarchical Contrastive Selective Coding
Hierarchical semantic structures naturally exist in an image dataset, in...

01/27/2022 · Ranking Info Noise Contrastive Estimation: Boosting Contrastive Learning via Ranked Positives
This paper introduces Ranking Info Noise Contrastive Estimation (RINCE),...

11/18/2021 · CLMB: deep contrastive learning for robust metagenomic binning
The reconstruction of microbial genomes from large metagenomic datasets ...

10/11/2022 · C-Mixup: Improving Generalization in Regression
Improving the generalization of deep networks is an important open chall...
