Temporal Ensembling for Semi-Supervised Learning

10/07/2016
by Samuli Laine, et al.

In this paper, we present a simple and efficient method for training deep neural networks in a semi-supervised setting where only a small portion of the training data is labeled. We introduce self-ensembling, where we form a consensus prediction of the unknown labels using the outputs of the network-in-training on different epochs and, most importantly, under different regularization and input augmentation conditions. This ensemble prediction can be expected to be a better predictor for the unknown labels than the output of the network at the most recent training epoch, and can thus be used as a target for training. Using our method, we set new records for two standard semi-supervised learning benchmarks, reducing the (non-augmented) classification error rate from 18.44% to 7.05% in SVHN with 500 labels and from 18.63% to 16.55% in CIFAR-10 with 4000 labels, and further to 5.12% and 12.16% by enabling the standard augmentations. We additionally obtain a clear improvement in CIFAR-100 classification accuracy by using random images from the Tiny Images dataset as unlabeled extra inputs during training. Finally, we demonstrate good tolerance to incorrect labels.
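The temporal-ensembling variant described in the abstract can be sketched in a few lines: a running exponential moving average of each sample's past predictions serves as the training target, with a startup bias correction. The sketch below assumes the paper's momentum value α = 0.6; the helper name `update_targets` and the toy data are illustrative, not from the paper.

```python
import numpy as np

def update_targets(Z, z_epoch, t, alpha=0.6):
    """One temporal-ensembling update.

    Z       : running ensemble of per-sample predictions, shape (N, C)
    z_epoch : network outputs for all N samples at epoch t, shape (N, C)
    t       : 1-based epoch index
    alpha   : EMA momentum (0.6 in the paper)

    Returns the updated ensemble Z and the bias-corrected targets z_tilde.
    """
    Z = alpha * Z + (1 - alpha) * z_epoch  # accumulate predictions across epochs
    z_tilde = Z / (1 - alpha ** t)         # correct the zero-initialization bias
    return Z, z_tilde

# Toy usage: 3 samples, 2 classes, first epoch.
Z = np.zeros((3, 2))
preds = np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]])
Z, targets = update_targets(Z, preds, t=1)
```

At t = 1 the bias correction exactly cancels the EMA discount, so the targets equal the first epoch's predictions; in later epochs they blend in the history, which is what makes them a better label estimate than any single epoch's output. The unsupervised loss term then penalizes the mean squared error between current predictions and these targets for all inputs, labeled or not.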


research
12/22/2019

Learning to Impute: A General Framework for Semi-supervised Learning

Recent semi-supervised learning methods have shown to achieve comparable...
research
02/08/2018

A Semi-Supervised Two-Stage Approach to Learning from Noisy Labels

The recent success of deep neural networks is powered in part by large-s...
research
05/12/2021

A function approximation approach to the prediction of blood glucose levels

The problem of real time prediction of blood glucose (BG) levels based o...
research
07/10/2021

Semi-Supervised Learning with Multi-Head Co-Training

Co-training, extended from self-training, is one of the frameworks for s...
research
11/01/2017

Smooth Neighbors on Teacher Graphs for Semi-supervised Learning

The paper proposes an inductive semi-supervised learning method, called ...
research
09/07/2023

Fast FixMatch: Faster Semi-Supervised Learning with Curriculum Batch Size

Advances in Semi-Supervised Learning (SSL) have almost entirely closed t...
research
09/13/2023

Reliability-based cleaning of noisy training labels with inductive conformal prediction in multi-modal biomedical data mining

Accurately labeling biomedical data presents a challenge. Traditional se...
