Building One-Shot Semi-supervised (BOSS) Learning up to Fully Supervised Performance

06/16/2020
by   Leslie N. Smith, et al.

Reaching the performance of fully supervised learning with unlabeled data and only one labeled sample per class would be ideal for deep learning applications. We demonstrate for the first time the potential of building one-shot semi-supervised (BOSS) learning on CIFAR-10 and SVHN to attain test accuracies comparable to fully supervised learning. Our method combines class prototype refining, class balancing, and self-training. A good prototype choice is essential, and we propose a practical technique for obtaining iconic examples. In addition, we demonstrate that class balancing methods substantially improve accuracy in semi-supervised learning, to levels that allow self-training to reach fully supervised performance. Rigorous empirical evaluations provide evidence that labeling large datasets is not necessary for training deep neural networks. We have made our code available at <https://github.com/lnsmith54/BOSS> to facilitate replication and use in future real-world applications.
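The self-training component of the pipeline relies on pseudo-labeling: the model's confident predictions on unlabeled data are treated as labels for further training, with a per-class cap to keep the pseudo-labeled set balanced. Below is a minimal sketch of these two steps; the threshold value, function names, and cap are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Accept unlabeled samples whose top predicted class probability
    exceeds the confidence threshold; return their indices and labels."""
    conf = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = conf >= threshold
    return np.flatnonzero(keep), labels[keep]

def class_balance(indices, labels, per_class):
    """Cap the number of accepted pseudo-labels per class so that no
    single class dominates the self-training set."""
    out_idx, out_lab = [], []
    for c in np.unique(labels):
        sel = indices[labels == c][:per_class]
        out_idx.extend(sel)
        out_lab.extend([c] * len(sel))
    return np.array(out_idx), np.array(out_lab)

# Example: three unlabeled samples, two classes.
probs = np.array([[0.99, 0.01],   # confident class 0 -> accepted
                  [0.60, 0.40],   # below threshold   -> rejected
                  [0.02, 0.98]])  # confident class 1 -> accepted
idx, lab = pseudo_label(probs)
idx, lab = class_balance(idx, lab, per_class=1)
```

In an actual training loop these accepted samples would be merged with the labeled prototypes for the next round of supervised updates, and the threshold or per-class cap tuned to the dataset.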


