Shrinking Class Space for Enhanced Certainty in Semi-Supervised Learning

by Lihe Yang, et al.

Semi-supervised learning is attracting increasing attention owing to its success in leveraging unlabeled data. To mitigate potentially incorrect pseudo labels, recent frameworks typically set a fixed confidence threshold and discard uncertain samples. This practice ensures high-quality pseudo labels, but it leads to relatively low utilization of the unlabeled set. In this work, our key insight is that these uncertain samples can be turned into certain ones, as long as the classes confused with the top-1 class are detected and removed. Motivated by this, we propose a novel method dubbed ShrinkMatch to learn from uncertain samples. For each uncertain sample, it adaptively seeks a shrunk class space that contains only the original top-1 class and the remaining less likely classes. Since the confusion classes are removed from this space, the re-computed top-1 confidence can satisfy the pre-defined threshold. We then impose a consistency regularization between a pair of strongly and weakly augmented samples in the shrunk space to learn discriminative representations. Furthermore, considering the varied reliability among uncertain samples and the gradually improving model during training, we design two corresponding reweighting principles for our uncertain loss. Our method exhibits impressive performance on widely adopted benchmarks. Code is available at
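The shrinking step described in the abstract can be sketched as follows, assuming a softmax output and a fixed confidence threshold. This is an illustrative reading of the idea, not the paper's implementation: the function name `shrink_class_space` and the greedy removal order (drop the most likely confusion classes one rank at a time) are assumptions for the sketch.

```python
import numpy as np

def shrink_class_space(probs, tau=0.95):
    """Illustrative sketch: shrink the class space of an uncertain prediction.

    Greedily removes the most likely confusion classes (ranks 1..k) until the
    top-1 confidence, re-normalized over the kept classes, reaches the
    threshold `tau`. Returns (kept class indices, re-normalized confidence).
    In the degenerate case the space shrinks to the top-1 class alone, whose
    re-normalized confidence is 1.
    """
    probs = np.asarray(probs, dtype=float)
    order = np.argsort(probs)[::-1]  # class indices, most to least confident
    top1 = order[0]
    for k in range(1, len(probs)):
        # Keep the top-1 class plus every class ranked below the k removed ones.
        kept = np.concatenate(([top1], order[k + 1:]))
        conf = probs[top1] / probs[kept].sum()
        if conf >= tau:
            return kept, conf
    return np.array([top1]), 1.0  # only reached for a single-class space


# An uncertain sample: top-1 class 0 at 0.45 is confused with class 1 at 0.40.
kept, conf = shrink_class_space([0.45, 0.40, 0.10, 0.05], tau=0.85)
```

Here removing the two confusion classes (1 and 2) leaves the shrunk space {0, 3}, in which class 0's re-normalized confidence 0.45 / (0.45 + 0.05) = 0.9 clears the threshold. In the full method, the consistency loss between the strongly and weakly augmented views would then be computed over this shrunk space rather than the full class space.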


Related papers:

- FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence
- MarginMatch: Improving Semi-Supervised Learning with Pseudo-Margins
- InPL: Pseudo-labeling the Inliers First for Imbalanced Semi-supervised Learning
- Towards Semi-Supervised Deep Facial Expression Recognition with An Adaptive Confidence Margin
- ConMatch: Semi-Supervised Learning with Confidence-Guided Consistency Regularization
- FlexMatch: Boosting Semi-Supervised Learning with Curriculum Pseudo Labeling