Contrastive Credibility Propagation for Reliable Semi-Supervised Learning

11/17/2022
by Brody Kutt, et al.

Inferring labels for unlabeled data from labeled data is an error-prone process, and conventional neural network training is highly sensitive to supervision errors. These two realities make semi-supervised learning (SSL) troublesome: SSL approaches often fail to outperform their fully supervised baseline. We propose a novel framework for deep SSL, specifically pseudo-labeling, called contrastive credibility propagation (CCP). Through an iterative process of generating and refining soft pseudo-labels, CCP unifies a novel contrastive approach to generating pseudo-labels with a powerful technique for overcoming instance-based label noise. The result is a semi-supervised classification framework explicitly designed to overcome inevitable pseudo-label errors and thereby reliably boost performance over a supervised baseline. Our empirical evaluation across five benchmark classification datasets suggests that prior approaches force a choice between reliability and effectiveness, while CCP delivers both. We also demonstrate an unsupervised signal for subsampling pseudo-labels to eliminate errors between iterations of CCP and after its conclusion.
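The iterative generate-and-refine loop the abstract describes can be sketched in miniature. The sketch below uses a simple nearest-centroid model standing in for the neural network, and a fixed confidence threshold standing in for the paper's unsupervised subsampling signal; the function name, the centroid classifier, and the threshold are all illustrative assumptions, not details of the actual CCP method.

```python
import numpy as np

def iterative_soft_pseudo_labeling(X_lab, y_lab, X_unlab, n_classes,
                                   n_iters=5, threshold=0.8):
    """Toy sketch of iterative soft pseudo-labeling (not CCP itself).

    A nearest-centroid classifier is refit each round on the labeled
    data plus soft-weighted unlabeled data, then the soft pseudo-labels
    are regenerated from the refit model.
    """
    # Soft labels for unlabeled points start uniform (maximally uncertain).
    soft = np.full((len(X_unlab), n_classes), 1.0 / n_classes)
    for _ in range(n_iters):
        # Fit class centroids on labeled data plus soft-weighted unlabeled data.
        centroids = []
        for c in range(n_classes):
            w_lab = (y_lab == c).astype(float)
            num = w_lab @ X_lab + soft[:, c] @ X_unlab
            den = w_lab.sum() + soft[:, c].sum()
            centroids.append(num / den)
        centroids = np.stack(centroids)
        # Refine soft labels: softmax over negative distances to centroids.
        d = -np.linalg.norm(X_unlab[:, None] - centroids[None], axis=2)
        e = np.exp(d - d.max(axis=1, keepdims=True))
        soft = e / e.sum(axis=1, keepdims=True)
    # Subsample: keep only confident pseudo-labels to suppress label noise.
    keep = soft.max(axis=1) >= threshold
    return soft, keep

# Usage: two labeled anchors, three unlabeled points (one ambiguous).
X_lab = np.array([[0.0, 0.0], [10.0, 10.0]])
y_lab = np.array([0, 1])
X_unlab = np.array([[0.5, 0.5], [9.5, 9.5], [5.0, 5.0]])
soft, keep = iterative_soft_pseudo_labeling(X_lab, y_lab, X_unlab, n_classes=2)
```

In this toy run the two points near the labeled anchors receive confident pseudo-labels and survive the threshold, while the midpoint stays near-uniform and is dropped, which is the role the abstract assigns to subsampling between iterations.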


