Pseudo-Representation Labeling Semi-Supervised Learning

05/31/2020
by   Song-Bo Yang, et al.

In recent years, semi-supervised learning (SSL) has shown tremendous success in leveraging unlabeled data to improve the performance of deep learning models, significantly reducing the demand for large amounts of labeled data. Many SSL techniques have been proposed and have shown promising performance on well-known datasets such as ImageNet and CIFAR-10. However, some existing techniques, especially those based on data augmentation, have empirically proven unsuitable for industrial applications. Therefore, this work proposes pseudo-representation labeling, a simple and flexible framework that uses pseudo-labeling to iteratively label small batches of unlabeled data and add them to the training set. In addition, the framework integrates self-supervised representation learning, so the classifier benefits from representations learned on both labeled and unlabeled data. The framework is not tied to any specific model architecture; rather, it is a general technique for improving existing models. Compared with existing approaches, pseudo-representation labeling is more intuitive and can effectively solve practical real-world problems. Empirically, it outperforms current state-of-the-art semi-supervised learning methods on industrial classification tasks such as the WM-811K wafer map and MIT-BIH Arrhythmia datasets.
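The iterative pseudo-labeling loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the nearest-centroid classifier, the distance-margin confidence score, and all parameter names (`n_rounds`, `threshold`, `per_round`) are stand-ins chosen for brevity; the paper's framework would use a trained deep classifier and self-supervised representations instead.

```python
import numpy as np

def pseudo_label_rounds(X_lab, y_lab, X_unlab, n_rounds=3, threshold=2.0, per_round=20):
    """Iterative pseudo-labeling sketch: in each round, fit a simple classifier
    (here a nearest-centroid stand-in), then move only the most confidently
    predicted unlabeled points into the training set with their pseudo-labels."""
    X_train, y_train = X_lab.copy(), y_lab.copy()
    pool = X_unlab.copy()
    classes = np.unique(y_lab)
    centroids = None
    for _ in range(n_rounds):
        # "Train": recompute per-class centroids on the current training set.
        centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in classes])
        if len(pool) == 0:
            break
        # Distance from each pooled point to every class centroid.
        d = np.linalg.norm(pool[:, None, :] - centroids[None, :, :], axis=2)
        pred = d.argmin(axis=1)
        # Confidence margin: gap between the two closest centroids.
        sorted_d = np.sort(d, axis=1)
        margin = sorted_d[:, 1] - sorted_d[:, 0]
        # Label only a small, high-confidence batch per round.
        order = np.argsort(-margin)
        picked = [i for i in order[:per_round] if margin[i] >= threshold]
        if not picked:
            break
        X_train = np.vstack([X_train, pool[picked]])
        y_train = np.concatenate([y_train, classes[pred[picked]]])
        pool = np.delete(pool, picked, axis=0)
    return centroids, X_train, y_train
```

Labeling only a small, high-confidence batch per round (rather than all unlabeled data at once) limits the propagation of noisy pseudo-labels, which is the intuition behind the iterative scheme in the abstract.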

