
Regularization in ResNet with Stochastic Depth

by Soufiane Hayou, et al.

Regularization plays a major role in modern deep learning. From classic techniques such as L1 and L2 penalties to noise-based methods such as Dropout, regularization often yields better generalization by avoiding overfitting. Recently, Stochastic Depth (SD) has emerged as an alternative regularization technique for residual neural networks (ResNets) and has been shown to boost the performance of ResNets on many tasks [Huang et al., 2016]. Despite the recent success of SD, little is known about this technique from a theoretical perspective. This paper provides a hybrid analysis combining perturbation analysis and signal propagation to shed light on different regularization effects of SD. Our analysis allows us to derive principled guidelines for choosing the survival rates used for training with SD.
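To make the setting concrete, here is a minimal NumPy sketch of Stochastic Depth as introduced by Huang et al. (2016): during training, each residual branch is kept with a survival probability p_l (often linearly decaying with depth), and at test time the branch output is scaled by p_l. The helper names (`linear_survival_rates`, `residual_block`, `resnet_forward`) and the toy linear residual branch are illustrative assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_survival_rates(num_blocks, p_last=0.5):
    # Linearly decaying schedule: p_l = 1 - (l / L) * (1 - p_last),
    # so early blocks survive more often than deep ones.
    L = num_blocks
    return [1.0 - (l / L) * (1.0 - p_last) for l in range(1, L + 1)]

def residual_block(x, weight):
    # Toy residual branch: a single linear map standing in for conv/BN/ReLU.
    return weight @ x

def resnet_forward(x, weights, survival_rates, training=True):
    # Forward pass with Stochastic Depth: each residual branch is gated
    # by an independent Bernoulli(p_l) draw during training, and replaced
    # by its expectation (scaling by p_l) at test time.
    for w, p in zip(weights, survival_rates):
        if training:
            if rng.random() < p:          # Bernoulli(p_l) gate
                x = x + residual_block(x, w)
        else:
            x = x + p * residual_block(x, w)
    return x
```

With `p_last = 0.5` and four blocks, the schedule gives survival rates 0.875, 0.75, 0.625, 0.5; choosing these rates well is exactly the question the paper's analysis addresses.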




On the relationship between Dropout and Equiangular Tight Frames

Dropout is a popular regularization technique in neural networks. Yet, t...

Stable ResNet

Deep ResNet architectures have achieved state of the art performance on ...

Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization

Overfitting is one of the most critical challenges in deep neural networ...

Information Geometry of Dropout Training

Dropout is one of the most popular regularization techniques in neural n...

ShakeDrop regularization

This paper proposes a powerful regularization method named ShakeDrop reg...

The Hybrid Bootstrap: A Drop-in Replacement for Dropout

Regularization is an important component of predictive model building. T...

Disturbing Target Values for Neural Network Regularization

Diverse regularization techniques have been developed such as L2 regular...