Boosting Semi-Supervised Learning with Contrastive Complementary Labeling

12/13/2022
by Qinyi Deng, et al.

Semi-supervised learning (SSL) has achieved great success in leveraging large amounts of unlabeled data to learn a promising classifier. A popular approach is pseudo-labeling, which generates pseudo labels only for those unlabeled samples with high-confidence predictions. As for the low-confidence ones, existing methods often simply discard them because their unreliable pseudo labels may mislead the model. Nevertheless, we highlight that data with low-confidence pseudo labels can still be beneficial to the training process. Specifically, although the class with the highest probability in the prediction is unreliable, we can assume that the sample is very unlikely to belong to the classes with the lowest probabilities. In this way, such data can also be very informative if we effectively exploit these complementary labels, i.e., the classes that a sample does not belong to. Inspired by this, we propose a novel Contrastive Complementary Labeling (CCL) method that constructs a large number of reliable negative pairs based on the complementary labels and adopts contrastive learning to make use of all the unlabeled data. Extensive experiments demonstrate that CCL significantly improves performance on top of existing methods. More critically, our CCL is particularly effective under label-scarce settings. For example, we yield an improvement of 2.43…
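The labeling scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the confidence threshold `tau`, and the number of complementary classes `k` are all illustrative assumptions. High-confidence predictions become pseudo labels; for the remaining low-confidence samples, the k classes with the lowest predicted probability serve as complementary labels, from which reliable negative pairs can be formed for a contrastive loss.

```python
import numpy as np

def split_pseudo_and_complementary(probs, tau=0.95, k=3):
    """Split unlabeled predictions into high-confidence pseudo labels
    and complementary labels for the low-confidence remainder.

    probs: (N, C) array of per-class softmax probabilities.
    Returns (pseudo_mask, pseudo_labels, complementary), where
    complementary[i] holds the k classes sample i most likely does
    NOT belong to. (tau and k are illustrative hyperparameters.)
    """
    probs = np.asarray(probs)
    confidence = probs.max(axis=1)
    pseudo_mask = confidence >= tau          # keep only confident pseudo labels
    pseudo_labels = probs.argmax(axis=1)
    # The k lowest-probability classes are taken as complementary labels.
    complementary = np.argsort(probs, axis=1)[:, :k]
    return pseudo_mask, pseudo_labels, complementary

def negative_pairs(complementary, anchors_by_class):
    """Pair each low-confidence sample with anchor samples belonging to
    its complementary classes, yielding reliable negative pairs for a
    contrastive objective (sketch; anchors_by_class maps class -> ids)."""
    pairs = []
    for i, comp_classes in enumerate(complementary):
        for c in comp_classes:
            for j in anchors_by_class.get(int(c), []):
                pairs.append((i, j))
    return pairs
```

In a full pipeline, the resulting negative pairs would be pushed apart by a contrastive loss (e.g., an InfoNCE-style objective) alongside the standard supervised loss on pseudo-labeled data, so that even discarded-in-practice low-confidence samples contribute a training signal.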


