On the Importance of Calibration in Semi-supervised Learning

10/10/2022
by Charlotte Loh, et al.

State-of-the-art (SOTA) semi-supervised learning (SSL) methods have been highly successful in leveraging a mix of labeled and unlabeled data by combining techniques of consistency regularization and pseudo-labeling. During pseudo-labeling, the model's predictions on unlabeled data are used for training; model calibration is therefore important in mitigating confirmation bias. Yet, many SOTA methods are optimized for model performance, with little focus directed toward improving model calibration. In this work, we empirically demonstrate that model calibration is strongly correlated with model performance and propose to improve calibration via approximate Bayesian techniques. We introduce a family of new SSL models that optimize for calibration and demonstrate their effectiveness across the standard vision benchmarks of CIFAR-10, CIFAR-100 and ImageNet, giving up to 15.9% improvements in test accuracy. Furthermore, we also demonstrate their effectiveness in additional realistic and challenging problems, such as class-imbalanced datasets and in photonics science.
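To make the two ideas in the abstract concrete, here is a minimal, hypothetical sketch (not the authors' released code): expected calibration error (ECE) as the standard measure of the confidence/accuracy mismatch behind confirmation bias, and Monte-Carlo dropout, one common approximate Bayesian technique, used to produce averaged confidences before thresholding pseudo-labels. Function names, the threshold value, and the toy model are illustrative assumptions.

```python
# Hypothetical sketch of calibration measurement and approximate-Bayesian pseudo-labeling.
import torch
import torch.nn as nn
import torch.nn.functional as F

def expected_calibration_error(probs, labels, n_bins=15):
    """probs: (N, C) softmax outputs; labels: (N,) ground-truth class indices."""
    conf, pred = probs.max(dim=1)
    correct = pred.eq(labels).float()
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = torch.zeros(1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            # gap between average accuracy and average confidence, weighted by bin mass
            ece += in_bin.float().mean() * (correct[in_bin].mean() - conf[in_bin].mean()).abs()
    return ece.item()

def mc_dropout_pseudo_labels(model, x_unlabeled, n_samples=10, threshold=0.95):
    """Average softmax outputs over stochastic forward passes (dropout kept active),
    then keep only pseudo-labels whose averaged confidence clears the threshold."""
    model.train()  # keep dropout active at inference time for MC sampling
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(x_unlabeled), dim=1) for _ in range(n_samples)]
        ).mean(dim=0)
    conf, pseudo = probs.max(dim=1)
    mask = conf >= threshold  # only confident predictions are used as training targets
    return pseudo[mask], mask

# Toy usage with a small dropout classifier on random data (illustrative only).
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Dropout(0.5), nn.Linear(64, 10))
x_u = torch.randn(128, 32)
pseudo, mask = mc_dropout_pseudo_labels(model, x_u)
print(f"kept {mask.sum().item()} / {len(x_u)} pseudo-labels")
```

The design intuition, as the abstract frames it: better-calibrated confidences mean the threshold admits fewer wrong pseudo-labels, which mitigates confirmation bias during training on unlabeled data.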
