Robust Semi-Supervised Learning with Out of Distribution Data

10/07/2020
by   Xujiang Zhao, et al.

Semi-supervised learning (SSL) based on deep neural networks (DNNs) has recently proven effective. However, recent work [Oliver et al., 2018] shows that the performance of SSL can degrade substantially when the unlabeled set contains out-of-distribution examples (OODs). In this work, we first study the key causes of the negative impact of OODs on SSL. We find that (1) OODs close to the decision boundary harm the performance of existing SSL algorithms more than OODs far from the decision boundary, and (2) Batch Normalization (BN), a popular module in deep networks, can substantially degrade a DNN's SSL performance when the unlabeled set contains OODs. To address these causes, we propose a novel unified robust SSL approach that improves the robustness of many existing SSL algorithms against OODs. In particular, we propose a simple modification to batch normalization, called weighted batch normalization, which improves the robustness of BN against OODs. We also develop two efficient hyperparameter optimization algorithms, meta-approximation and implicit-differentiation-based approximation, which offer different tradeoffs between computational efficiency and accuracy; both learn to reweight the unlabeled samples to improve the robustness of SSL against OODs. Extensive experiments on both synthetic and real-world datasets demonstrate that our approach significantly improves the robustness of four representative SSL algorithms against OODs, compared with four state-of-the-art robust SSL approaches. An ablation study identifies which components of our approach contribute most to its success.
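The abstract does not include code, but the core idea of weighted batch normalization — computing the batch statistics from a weighted average so that suspected OOD samples contribute less — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `weighted_batch_norm` and the plain numpy formulation are assumptions, and the learnable scale/shift parameters of a full BN layer are omitted for brevity.

```python
import numpy as np

def weighted_batch_norm(x, w, eps=1e-5):
    """Normalize a batch using per-sample weights.

    x: (N, D) batch of activations.
    w: (N,) nonnegative per-sample weights; samples suspected to be
       out-of-distribution would receive small weights, so they barely
       influence the estimated batch mean and variance.
    """
    w = w / w.sum()                                    # normalize weights to sum to 1
    mean = (w[:, None] * x).sum(axis=0)                # weighted batch mean
    var = (w[:, None] * (x - mean) ** 2).sum(axis=0)   # weighted batch variance
    return (x - mean) / np.sqrt(var + eps)             # standardize with weighted stats
```

With uniform weights this reduces to the standard BN statistics; driving an OOD sample's weight toward zero removes its influence on the mean and variance, which is the failure mode of vanilla BN that the abstract identifies.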


