
Empirical Perspectives on One-Shot Semi-supervised Learning

by Leslie N. Smith, et al.
U.S. Navy

One of the greatest obstacles to the adoption of deep neural networks for new applications is that training the network typically requires a large number of manually labeled training samples. We empirically investigate the scenario where one has access to large amounts of unlabeled data but needs to label only a single prototypical sample per class in order to train a deep network (i.e., one-shot semi-supervised learning). Specifically, we investigate the recent results reported in FixMatch for one-shot semi-supervised learning to understand the factors that affect and impede high accuracy and reliability in one-shot semi-supervised learning on CIFAR-10. For example, we discover that one barrier to one-shot semi-supervised learning for high-performance image classification is the unevenness of class accuracy during training. These results point to solutions that might enable more widespread adoption of one-shot semi-supervised training methods for new applications.
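The uneven per-class accuracy the abstract identifies can be tracked with a simple helper during training. The sketch below is illustrative only (the function name and example data are hypothetical, not from the paper): it computes the accuracy for each class separately, so an imbalance across the ten CIFAR-10 classes becomes visible even when the overall accuracy looks reasonable.

```python
def per_class_accuracy(preds, labels, num_classes=10):
    """Return the fraction of correct predictions for each class.

    preds and labels are parallel sequences of integer class ids.
    Classes with no examples get an accuracy of 0.0.
    """
    correct = [0] * num_classes
    total = [0] * num_classes
    for p, y in zip(preds, labels):
        total[y] += 1
        if p == y:
            correct[y] += 1
    return [c / t if t else 0.0 for c, t in zip(correct, total)]

# Toy example with 3 classes: class 0 is always right,
# class 1 is right half the time, class 2 never appears.
preds = [0, 0, 1, 2]
labels = [0, 0, 1, 1]
acc = per_class_accuracy(preds, labels, num_classes=3)
# acc == [1.0, 0.5, 0.0]
```

Logging such a vector once per evaluation pass would reveal the kind of class-accuracy unevenness the paper reports as a barrier to reliable one-shot semi-supervised training.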

