Interpolation-based Contrastive Learning for Few-Label Semi-Supervised Learning

02/24/2022
by Xihong Yang et al.

Semi-supervised learning (SSL) has long been proven an effective technique for building powerful models with limited labels. In the existing literature, consistency regularization-based methods, which force perturbed samples to have predictions similar to those of the original samples, have attracted much attention for their promising accuracy. However, we observe that the performance of such methods decreases drastically when labels become extremely limited, e.g., 2 or 3 labels per category. Our empirical study finds that the main problem lies in the drifting of semantic information during data augmentation. This problem can be alleviated when enough supervision is provided; when little guidance is available, however, the incorrect regularization misleads the network and undermines the algorithm's performance. To tackle this problem, we (1) propose an interpolation-based method to construct more reliable positive sample pairs; and (2) design a novel contrastive loss that guides the embedding of the learned network to change linearly between samples, improving the discriminative capability of the network by enlarging the margin of the decision boundaries. Since no destructive regularization is introduced, the performance of the proposed algorithm is largely improved. Specifically, the proposed algorithm outperforms the second-best algorithm (CoMatch) by 5.3% in classification accuracy when only two labels are available for each class on the CIFAR-10 dataset. Moreover, we demonstrate the generality of the proposed method by considerably improving the performance of existing state-of-the-art algorithms with our strategy.
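The two ingredients described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names (`interpolate_pair`, `contrastive_loss`) and the use of a mixup-style Beta-distributed coefficient with an InfoNCE-style loss are assumptions for illustration. The idea is that the embedding of an interpolated sample is treated as a positive of the matching interpolation of the two original embeddings, which encourages the network's embedding to change linearly between samples.

```python
import numpy as np


def interpolate_pair(x_a, x_b, alpha=0.75, rng=None):
    """Mixup-style interpolation: return a convex combination of two
    samples and the mixing coefficient drawn from Beta(alpha, alpha)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    return lam * x_a + (1.0 - lam) * x_b, lam


def contrastive_loss(z_anchor, z_pos, z_negs, temperature=0.5):
    """InfoNCE-style contrastive loss: pull the anchor embedding toward
    its positive and push it away from the negatives."""
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)

    pos = np.exp(cos(z_anchor, z_pos) / temperature)
    negs = sum(np.exp(cos(z_anchor, z) / temperature) for z in z_negs)
    return -np.log(pos / (pos + negs))


# Sketch of one training pair: mix two samples, then (with a real encoder)
# the embedding of x_mix would be the anchor and lam*z_a + (1-lam)*z_b its
# positive; here the raw vectors stand in for embeddings.
rng = np.random.default_rng(42)
x_a, x_b = rng.normal(size=8), rng.normal(size=8)
x_mix, lam = interpolate_pair(x_a, x_b, rng=rng)
z_pos = lam * x_a + (1.0 - lam) * x_b  # linear-interpolation target
loss = contrastive_loss(x_mix, z_pos, z_negs=[x_a, x_b])
```

In a full pipeline the encoder network would map `x_mix`, `x_a`, and `x_b` to embeddings first; the loss then rewards embeddings that interpolate linearly, which is how the reliable positive pairs avoid the semantic drift of heavier augmentations.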


