Semi-supervised Contrastive Outlier removal for Pseudo Expectation Maximization (SCOPE)

06/28/2022
by   Sumeet Menon, et al.

Semi-supervised learning is the problem of training an accurate predictive model by combining a small labeled dataset with a presumably much larger unlabeled dataset. Many methods for semi-supervised deep learning have been developed, including pseudolabeling, consistency regularization, and contrastive learning techniques. Pseudolabeling methods, however, are highly susceptible to confounding, in which erroneous pseudolabels are assumed to be true labels in early iterations, causing the model to reinforce its prior biases and consequently fail to generalize to strong predictive performance. We present a new approach to suppress confounding errors through a method we describe as Semi-supervised Contrastive Outlier removal for Pseudo Expectation Maximization (SCOPE). Like basic pseudolabeling, SCOPE is related to Expectation Maximization (EM), a latent-variable framework that can be extended toward understanding cluster-assumption deep semi-supervised algorithms. However, unlike basic pseudolabeling, which fails to adequately account for the probability of the unlabeled samples given the model, SCOPE introduces an outlier suppression term designed to improve the behavior of EM iteration with a discriminative DNN backbone in the presence of outliers. Our results show that SCOPE greatly improves semi-supervised classification accuracy over a baseline, and furthermore, when combined with consistency regularization, achieves the highest reported accuracy for the semi-supervised CIFAR-10 classification task using 250 and 4000 labeled samples. Moreover, we show that SCOPE reduces the prevalence of confounding errors during pseudolabeling iterations by pruning erroneous high-confidence pseudolabeled samples that would otherwise contaminate the labeled set in subsequent retraining iterations.
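The core idea described in the abstract, selecting high-confidence pseudolabels and then pruning likely-erroneous ones before they contaminate the labeled set, can be sketched as follows. This is a minimal illustrative stand-in, not the paper's actual algorithm: it gates on softmax confidence and then removes samples that are feature-space outliers relative to their pseudo-class centroid, whereas SCOPE's outlier suppression term is contrastive and integrated into EM iteration. The function name, thresholds, and the centroid-distance criterion are all assumptions made for the sketch.

```python
import numpy as np

def select_pseudolabels(probs, feats, conf_thresh=0.95, z_max=2.0):
    """Illustrative pseudolabel selection with outlier pruning.

    probs: (N, C) softmax outputs of the current model on unlabeled data.
    feats: (N, D) feature embeddings of the same samples.
    Returns (kept_indices, argmax_labels). Thresholds are hypothetical.
    """
    conf = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    selected = np.flatnonzero(conf >= conf_thresh)   # confidence gate
    final = []
    for c in np.unique(labels[selected]):
        idx = selected[labels[selected] == c]
        centroid = feats[idx].mean(axis=0)
        d = np.linalg.norm(feats[idx] - centroid, axis=1)
        # Prune samples whose distance to their pseudo-class centroid is
        # anomalously large (z-score above z_max): a crude proxy for the
        # contrastive outlier removal described in the abstract.
        z = (d - d.mean()) / (d.std() + 1e-8)
        final.extend(idx[z <= z_max].tolist())
    return np.array(sorted(final)), labels
```

In a full pseudolabeling loop, the surviving samples and their argmax labels would be merged into the labeled set before the next retraining iteration; the point of the pruning step is that a confidently wrong sample often sits far from the cluster of its assigned class in feature space.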

