EnAET: Self-Trained Ensemble AutoEncoding Transformations for Semi-Supervised Learning

11/21/2019
by Xiao Wang, et al.

Deep neural networks have been successfully applied to many real-world applications. However, these successes rely heavily on large amounts of labeled data, which is expensive to obtain. Recently, Auto-Encoding Transformation (AET) and MixMatch have been proposed and achieved state-of-the-art results for unsupervised and semi-supervised learning, respectively. In this study, we train an Ensemble of Auto-Encoding Transformations (EnAET) to learn from both labeled and unlabeled data, based on embedded representations obtained by decoding both spatial and non-spatial transformations. This distinguishes EnAET from conventional semi-supervised methods, which focus on improving the prediction consistency and confidence of different models on both unlabeled and labeled examples. In contrast, we propose to explore the role of self-supervised representations in semi-supervised learning under a rich family of transformations. Experimental results on CIFAR-10, CIFAR-100, SVHN and STL10 demonstrate that the proposed EnAET outperforms state-of-the-art semi-supervised methods by significant margins. In particular, we apply the proposed method to an extremely challenging scenario with only 10 images per class, and show that EnAET can achieve an error rate of 9.35%, compared to 16.92% for fully supervised learning using all labeled data with the same network architecture. The performance on CIFAR-10, CIFAR-100 and SVHN with a smaller network is even more competitive than state-of-the-art supervised learning methods based on a larger network. We also set a new performance record with an error rate of 1.99% on SVHN. The code and experiment records are released at https://github.com/maple-research-lab/EnAET.
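To make the auto-encoding-transformation idea concrete, below is a minimal, self-contained sketch (not the authors' released implementation) of the self-supervised loss it builds on: sample a random transformation t, apply it to an input x, encode both x and t(x) with a shared encoder, and train a decoder to regress the parameters of t from the two embeddings. The names `AETHead` and `sample_affine`, the stand-in linear encoder, and all layer sizes are illustrative assumptions; the ensemble described in the paper covers several spatial transformations plus non-spatial, color-based ones.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class AETHead(nn.Module):
    """Shared encoder plus a decoder that regresses the transformation."""
    def __init__(self, encoder: nn.Module, feat_dim: int, n_params: int = 6):
        super().__init__()
        self.encoder = encoder
        # The decoder predicts the 2x3 affine matrix (6 parameters) from the
        # concatenated embeddings of the original and transformed images.
        self.decoder = nn.Sequential(
            nn.Linear(2 * feat_dim, 256), nn.ReLU(inplace=True),
            nn.Linear(256, n_params),
        )

    def forward(self, x: torch.Tensor, theta: torch.Tensor) -> torch.Tensor:
        # Warp x with the sampled affine matrices theta (shape [B, 2, 3]).
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        x_t = F.grid_sample(x, grid, align_corners=False)
        z = torch.cat([self.encoder(x), self.encoder(x_t)], dim=1)
        theta_hat = self.decoder(z).view(-1, 2, 3)
        # AET reconstructs the transformation rather than the data itself.
        return F.mse_loss(theta_hat, theta)

def sample_affine(batch: int) -> torch.Tensor:
    """Sample random rotation/scale/translation as 2x3 affine matrices."""
    angle = (torch.rand(batch) - 0.5) * math.pi / 2      # +-45 degrees
    scale = 0.8 + 0.4 * torch.rand(batch)                # 0.8x to 1.2x
    shift = (torch.rand(batch, 2) - 0.5) * 0.5           # small translation
    cos, sin = torch.cos(angle) * scale, torch.sin(angle) * scale
    return torch.stack(
        [torch.stack([cos, -sin, shift[:, 0]], dim=1),
         torch.stack([sin, cos, shift[:, 1]], dim=1)], dim=1)

# Usage: the loss needs no labels, so it can be computed on unlabeled batches.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128))  # stand-in
aet = AETHead(encoder, feat_dim=128)
x = torch.randn(8, 3, 32, 32)          # unlabeled CIFAR-sized batch
loss = aet(x, sample_affine(8))
loss.backward()
```

In EnAET, a loss of this form is computed for each transformation in the ensemble and added as a self-supervised regularizer to a MixMatch-style semi-supervised objective, alongside a consistency term between predictions on the original and transformed images.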

