Contrastive Regularization for Semi-Supervised Learning

01/17/2022
by   Doyup Lee, et al.

Consistency regularization on label predictions has become a fundamental technique in semi-supervised learning, but it still requires a large number of training iterations to reach high performance. In this study, we show that consistency regularization restricts the propagation of labeling information because samples with unconfident pseudo-labels are excluded from model updates. We then propose contrastive regularization, which improves both the efficiency and accuracy of consistency regularization through well-clustered features of unlabeled data. Specifically, after strongly augmented samples are assigned to clusters by their pseudo-labels, our contrastive regularization updates the model so that features with confident pseudo-labels pull together features in the same cluster while pushing away features in different clusters. As a result, the information in confident pseudo-labels is effectively propagated to more unlabeled samples during training via the well-clustered features. On semi-supervised learning benchmarks, our contrastive regularization improves on previous consistency-based methods and achieves state-of-the-art results, especially with fewer training iterations. Our method also performs robustly in open-set semi-supervised learning, where unlabeled data includes out-of-distribution samples.
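The clustering idea in the abstract can be illustrated with a minimal sketch: confident pseudo-labeled samples act as anchors, samples sharing the anchor's pseudo-label are treated as positives, and all other samples as negatives in an InfoNCE-style loss. This is a simplified pure-Python illustration of the general technique, not the paper's exact loss; the function name, the confidence threshold of 0.95, and the temperature of 0.1 are all illustrative assumptions.

```python
import math

def cosine(u, v):
    # Cosine similarity between two feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_reg_loss(features, pseudo_labels, confidences,
                         threshold=0.95, tau=0.1):
    """Sketch of a pseudo-label-driven contrastive loss (hypothetical
    simplification, not the paper's exact formulation).

    Only samples whose pseudo-label confidence exceeds `threshold`
    serve as anchors; positives share the anchor's pseudo-label,
    and every other sample in the batch acts as a negative."""
    n = len(features)
    losses = []
    for i in range(n):
        if confidences[i] < threshold:
            continue  # unconfident samples do not anchor the loss
        positives = [j for j in range(n)
                     if j != i and pseudo_labels[j] == pseudo_labels[i]]
        if not positives:
            continue
        # Denominator sums similarities to all other samples in the batch.
        denom = sum(math.exp(cosine(features[i], features[k]) / tau)
                    for k in range(n) if k != i)
        for j in positives:
            num = math.exp(cosine(features[i], features[j]) / tau)
            losses.append(-math.log(num / denom))
    return sum(losses) / len(losses) if losses else 0.0
```

Minimizing this quantity pulls same-cluster features together and pushes different clusters apart: a batch whose features already agree with the pseudo-label clusters yields a lower loss than one where clusters are mixed, and batches with no confident anchors contribute nothing.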


research
08/08/2019

Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning

Semi-supervised learning, i.e. jointly learning from labeled an unlabele...
research
11/17/2022

Contrastive Credibility Propagation for Reliable Semi-Supervised Learning

Inferencing unlabeled data from labeled data is an error-prone process. ...
research
12/13/2022

Boosting Semi-Supervised Learning with Contrastive Complementary Labeling

Semi-supervised learning (SSL) has achieved great success in leveraging ...
research
06/28/2022

Semi-supervised Contrastive Outlier removal for Pseudo Expectation Maximization (SCOPE)

Semi-supervised learning is the problem of training an accurate predicti...
research
09/29/2021

Cross-domain Semi-Supervised Audio Event Classification Using Contrastive Regularization

In this study, we proposed a novel semi-supervised training method that ...
research
11/28/2022

Deep Semi-supervised Learning with Double-Contrast of Features and Semantics

In recent years, the field of intelligent transportation systems (ITS) h...
research
06/19/2020

Statistical and Algorithmic Insights for Semi-supervised Learning with Self-training

Self-training is a classical approach in semi-supervised learning which ...
