AET vs. AED: Unsupervised Representation Learning by Auto-Encoding Transformations rather than Data

01/14/2019
by Liheng Zhang, et al.

The success of deep neural networks often relies on a large number of labeled examples, which can be difficult to obtain in many real-world scenarios. To address this challenge, unsupervised methods are strongly preferred for training neural networks without using any labeled data. In this paper, we present a novel paradigm of unsupervised representation learning by Auto-Encoding Transformations (AET), in contrast to the conventional Auto-Encoding Data (AED) approach. Given a randomly sampled transformation, AET seeks to predict it at the output end as accurately as possible, merely from the encoded features of the original and transformed images. The idea is the following: as long as the unsupervised features successfully encode the essential information about the visual structures of the original and transformed images, the transformation can be well predicted. We will show that this AET paradigm allows us to instantiate a large variety of transformations, from parameterized to non-parameterized and GAN-induced ones. Our experiments show that AET greatly improves over existing unsupervised approaches, setting new state-of-the-art performance that comes much closer to the upper bounds set by the fully supervised counterparts on the CIFAR-10, ImageNet, and Places datasets.
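The core idea above, training a decoder to recover the applied transformation from the encoded features of an image pair rather than to reconstruct the data, can be sketched in a few lines of NumPy. Everything here is a toy assumption for illustration, not the paper's architecture: the "images" are random 8x8 arrays, the encoder is a frozen random orthonormal projection standing in for a trained CNN, the transformation family is the four 90-degree rotations, and `encode`, `pair_features`, and the logistic-regression decoder are hypothetical names introduced for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: "images" are random 8x8 arrays; the transformation family is
# the four 90-degree rotations, so the decoder classifies k in {0,1,2,3}.
H = 8
D = H * H
imgs = rng.normal(size=(200, H, H))

# Frozen toy encoder E(.): a random orthonormal projection, a stand-in
# for a trained CNN feature extractor (an assumption for this sketch).
Q, _ = np.linalg.qr(rng.normal(size=(D, D)))

def encode(img):
    return Q @ img.ravel()

def pair_features(img, t_img):
    """Joint representation of (E(x), E(t(x))): a normalized bilinear
    (outer-product) feature, so a linear decoder can match the pair."""
    f = np.outer(encode(img), encode(t_img)).ravel()
    return f / np.linalg.norm(f)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Transformation decoder D(.): multinomial logistic regression over the
# pair features. Note the AET loss is on predicting the transformation,
# never on reconstructing the image (which is what AED would do).
W = np.zeros((4, D * D))
lr = 1.0
for epoch in range(20):
    for img in imgs:
        k = rng.integers(4)                  # sample a transformation
        t_img = np.rot90(img, k)             # apply it to the image
        phi = pair_features(img, t_img)
        p = softmax(W @ phi)
        W -= lr * np.outer(p - np.eye(4)[k], phi)   # cross-entropy step

# Evaluate: how often is the applied rotation recovered from the
# encodings alone?
trials, correct = 400, 0
for _ in range(trials):
    img = imgs[rng.integers(len(imgs))]
    k = rng.integers(4)
    phi = pair_features(img, np.rot90(img, k))
    correct += int(np.argmax(W @ phi) == k)
accuracy = correct / trials
print(f"rotation-prediction accuracy: {accuracy:.2f}")
```

Because the rotation of a random image leaves a strong, linearly detectable signature in the bilinear pair features, the decoder recovers the sampled transformation far above the 25% chance rate; the analogous signal in real AET is what forces the encoder to retain the visual structure of its input.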

Related research

- GraphTER: Unsupervised Learning of Graph Transformation Equivariant Representations via Auto-Encoding Node-wise Transformations (11/19/2019)
  Recent advances in Graph Convolutional Neural Networks (GCNNs) have show...

- EnAET: Self-Trained Ensemble AutoEncoding Transformations for Semi-Supervised Learning (11/21/2019)
  Deep neural networks have been successfully applied to many real-world a...

- AVT: Unsupervised Learning of Transformation Equivariant Representations by Autoencoding Variational Transformations (03/23/2019)
  The learning of Transformation-Equivariant Representations (TERs), which...

- Neural Optimal Control for Representation Learning (06/16/2020)
  The intriguing connections recently established between neural networks ...

- Learning Generalized Transformation Equivariant Representations via Autoencoding Transformations (06/19/2019)
  Learning Transformation Equivariant Representations (TERs) seeks to capt...

- Saddlepoints in Unsupervised Least Squares (04/11/2021)
  This paper sheds light on the risk landscape of unsupervised least squar...

- Efficient Convolutional Auto-Encoding via Random Convexification and Frequency-Domain Minimization (11/28/2016)
  The omnipresence of deep learning architectures such as deep convolution...
