Swapout: Learning an ensemble of deep architectures

05/20/2016
by   Saurabh Singh, et al.

We describe Swapout, a new stochastic training method that outperforms ResNets of identical network structure, yielding impressive results on CIFAR-10 and CIFAR-100. Swapout samples from a rich set of architectures that includes dropout, stochastic depth, and residual architectures as special cases. Viewed as a regularization method, swapout inhibits co-adaptation of units not only within a layer, as dropout does, but also across network layers. We conjecture that swapout achieves strong regularization by implicitly tying the parameters across layers. Viewed as an ensemble training method, it samples a much richer set of architectures than existing methods such as dropout or stochastic depth. We propose a parameterization that reveals connections to existing architectures and suggests a much richer set of architectures to explore. We show that our formulation admits an efficient training method, and we validate our conclusions on CIFAR-10 and CIFAR-100, matching state-of-the-art accuracy. Remarkably, our 32-layer wider model performs comparably to a 1001-layer ResNet model.
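The swapout unit combines the input and the layer's residual function with independent per-unit Bernoulli masks, y = Θ1 ⊙ x + Θ2 ⊙ F(x), which is how dropout, stochastic depth, and plain residual units arise as special mask settings. A minimal NumPy sketch of one such unit; the function name `swapout` and the keep-probability parameters are illustrative, not from the paper's code:

```python
import numpy as np

def swapout(x, fx, theta1_keep, theta2_keep, rng=None):
    """One swapout unit: y = theta1 * x + theta2 * F(x).

    theta1 and theta2 are independent per-unit Bernoulli masks with the
    given keep probabilities. Special cases (as described in the paper):
      theta1_keep=1, theta2_keep=1  -> plain residual unit, y = x + F(x)
      theta1_keep=1, theta2_keep<1  -> dropout applied to F(x)
      a single shared mask per layer -> stochastic depth-style layer drop
    """
    rng = np.random.default_rng() if rng is None else rng
    theta1 = rng.random(x.shape) < theta1_keep
    theta2 = rng.random(x.shape) < theta2_keep
    return theta1 * x + theta2 * fx
```

At test time the paper advocates averaging over sampled networks rather than the deterministic mask-expectation rule, which is one way swapout differs from standard dropout inference.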


Related Research

11/21/2019

Regularizing Neural Networks by Stochastically Training Layer Ensembles

Dropout and similar stochastic neural network regularization methods are...
11/22/2015

Gradual DropIn of Layers to Train Very Deep Neural Networks

We introduce the concept of dynamically growing a neural network during ...
04/13/2019

Shakeout: A New Approach to Regularized Deep Neural Network Training

Recent years have witnessed the success of deep neural networks in deali...
04/09/2019

Intra-Ensemble in Neural Networks

Improving model performance is always the key problem in machine learnin...
06/20/2017

Analysis of dropout learning regarded as ensemble learning

Deep learning is the state-of-the-art in fields such as visual object re...
10/02/2018

Multi-scale Convolution Aggregation and Stochastic Feature Reuse for DenseNets

Recently, Convolution Neural Networks (CNNs) obtained huge success in nu...
06/04/2017

Segmentation of Intracranial Arterial Calcification with Deeply Supervised Residual Dropout Networks

Intracranial carotid artery calcification (ICAC) is a major risk factor ...

Code Repositories

swapout

Implementation of Swapout by chainer (Swapout: Learning an ensemble of deep architectures: https://arxiv.org/abs/1605.06465)

