
FROST: Faster and more Robust One-shot Semi-supervised Training

by Helena E. Liu et al.

Recent advances in one-shot semi-supervised learning have lowered the barrier to applying deep learning to new applications. However, state-of-the-art semi-supervised methods are slow to train, and their performance is sensitive to the choice of labeled samples and to hyper-parameter values. In this paper, we present a one-shot semi-supervised learning method that trains up to an order of magnitude faster and is more robust than state-of-the-art methods. Specifically, we show that by combining semi-supervised learning with a one-stage, single-network version of self-training, our FROST methodology trains faster and is more robust to the choice of labeled samples and to changes in hyper-parameters. Our experiments demonstrate FROST's ability to perform well when the composition of the unlabeled data is unknown; that is, when the unlabeled data contain unequal numbers of each class and may contain out-of-distribution examples that do not belong to any of the training classes. High performance, fast training, and insensitivity to hyper-parameters make FROST the most practical method for one-shot semi-supervised training. Our code is available at
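The single-network self-training the abstract describes is typically driven by confidence-thresholded pseudo-labels: unlabeled examples whose predicted class probability exceeds a threshold are treated as labeled for the next training step. A minimal sketch of that selection step, assuming a FixMatch-style fixed threshold (the function name and the 0.95 value are illustrative, not taken from the paper):

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Select pseudo-labels for unlabeled examples.

    probs: (N, C) array of per-class predicted probabilities.
    Returns the argmax label for each example and a mask marking
    which examples are confident enough to train on.
    """
    confidence = probs.max(axis=1)      # highest class probability per example
    labels = probs.argmax(axis=1)       # tentative pseudo-label per example
    mask = confidence >= threshold      # keep only high-confidence predictions
    return labels, mask

# Toy predictions for three unlabeled examples over two classes.
probs = np.array([[0.97, 0.03],
                  [0.60, 0.40],
                  [0.01, 0.99]])
labels, mask = pseudo_label(probs)
print(labels.tolist())  # [0, 0, 1]
print(mask.tolist())    # [True, False, True]
```

In a one-shot setting, the masked examples would be added to the tiny labeled set and the same network retrained, so label quality early in training drives the final result.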

