Learning Deep Representations Using Convolutional Auto-encoders with Symmetric Skip Connections

11/28/2016
by Jianfeng Dong, et al.

Unsupervised pre-training was once a critical technique for training deep neural networks. With sufficient labeled data and modern training techniques, it is now possible to train very deep neural networks from scratch in a purely supervised manner. However, unlabeled data are easier to obtain and typically available at much larger scale, so how best to exploit them to aid supervised learning remains an important question. In this paper, we investigate convolutional denoising auto-encoders and show that unsupervised pre-training can still improve performance on high-level image tasks such as image classification and semantic segmentation. The architecture we use is a convolutional auto-encoder network with symmetric shortcut connections. We empirically show that symmetric shortcut connections are crucial for learning abstract representations via image reconstruction. When no extra unlabeled data are available, unsupervised pre-training with our network regularizes the supervised training and thereby leads to better generalization. With the help of unsupervised pre-training, our method achieves very competitive image-classification results using very simple all-convolution networks. When labeled data are limited but extra unlabeled data are available, our method achieves good results on several semi-supervised learning tasks.
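The core architectural idea, a mirrored encoder/decoder where each decoder layer receives an elementwise-added feature map from its symmetric encoder layer, can be illustrated with a toy 1-D sketch. This is a minimal illustration with untrained random filters, not the paper's actual network: the filter names, depths, and the use of `numpy.convolve` as a stand-in for learned convolutions are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def conv(x, w):
    # 1-D "same" convolution as a stand-in for a learned conv layer
    return np.convolve(x, w, mode="same")

# toy input signal and random (untrained) 3-tap filters -- hypothetical
x = rng.standard_normal(32)
w_enc1, w_enc2 = rng.standard_normal(3), rng.standard_normal(3)
w_dec2, w_dec1 = rng.standard_normal(3), rng.standard_normal(3)

# encoder: two conv + ReLU layers
e1 = relu(conv(x, w_enc1))
e2 = relu(conv(e1, w_enc2))

# decoder with symmetric skips: decoder layer i adds the feature map
# of the mirrored encoder layer elementwise before the nonlinearity
d2 = relu(conv(e2, w_dec2) + e1)   # skip from encoder layer 1
recon = conv(d2, w_dec1) + x       # skip from the input

print(recon.shape)  # reconstruction has the input's shape: (32,)
```

The additive skips give the decoder direct access to fine-grained encoder features (and shorten gradient paths), which is what lets reconstruction-based pre-training retain detail while the bottleneck learns abstract structure.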


