ShakeDrop regularization

02/07/2018
by Yoshihiro Yamada, et al.

This paper proposes a powerful regularization method named ShakeDrop regularization. ShakeDrop is inspired by Shake-Shake regularization, which decreases error rates by disturbing learning. While Shake-Shake can be applied only to ResNeXt, which has multiple branches, ShakeDrop can be applied not only to ResNeXt but also to ResNet, Wide ResNet, and PyramidNet in a memory-efficient way. An important and interesting feature of ShakeDrop is that it strongly disturbs learning by multiplying the output of a convolutional layer by a factor that can even be negative in the forward training pass. The effectiveness of ShakeDrop is confirmed by experiments on the CIFAR-10/100 and Tiny ImageNet datasets.
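The mechanism the abstract describes can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' code: it implements the residual gate G(x) = x + (b + α − bα)·F(x) from the full paper, where b ~ Bernoulli(p_l), α ~ U[−1, 1] scales the forward training pass (when b = 0 the branch output is multiplied by α, which can be negative), and an independent β ~ U[0, 1] scales the backward pass. Sampling α and β per image and passing a fixed keep probability p are simplifying assumptions here; the paper explores several sampling granularities and uses a layer-dependent linear decay schedule for p_l.

```python
import torch

class ShakeDropFunction(torch.autograd.Function):
    """ShakeDrop gate applied to a residual-branch output F(x).

    Training forward:  (b + alpha - b*alpha) * F(x), alpha ~ U[-1, 1]
    Training backward: (b + beta  - b*beta)  * grad, beta  ~ U[0, 1]
    Test forward:      p * F(x), the expected training factor (E[alpha] = 0)
    """

    @staticmethod
    def forward(ctx, x, p, training):
        ctx.training = training
        ctx.p = p
        if not training:
            return p * x
        # b = 1 leaves the branch untouched; b = 0 exposes it to alpha,
        # which can flip the sign of the branch output.
        b = torch.bernoulli(torch.full((1,), p, device=x.device))
        # alpha is sampled per image here (one factor per sample in the batch).
        alpha = torch.empty(x.size(0), 1, 1, 1, device=x.device).uniform_(-1.0, 1.0)
        ctx.save_for_backward(b)
        return (b + alpha - b * alpha) * x

    @staticmethod
    def backward(ctx, grad_output):
        if not ctx.training:
            return ctx.p * grad_output, None, None
        (b,) = ctx.saved_tensors
        # An independent random factor disturbs the backward pass as well.
        beta = torch.empty(grad_output.size(0), 1, 1, 1,
                           device=grad_output.device).uniform_(0.0, 1.0)
        return (b + beta - b * beta) * grad_output, None, None
```

Inside a residual block this would be used as, for example, out = x + ShakeDropFunction.apply(branch(x), p, self.training), where branch and p are placeholders for the block's convolutional path and its keep probability.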


