Regularization in ResNet with Stochastic Depth

06/06/2021
by Soufiane Hayou, et al.

Regularization plays a major role in modern deep learning. From classic techniques such as L1 and L2 penalties to noise-based methods such as Dropout, regularization often yields better generalization by avoiding overfitting. Recently, Stochastic Depth (SD) has emerged as an alternative regularization technique for residual neural networks (ResNets) and has proven to boost the performance of ResNets on many tasks [Huang et al., 2016]. Despite the recent success of SD, little is known about this technique from a theoretical perspective. This paper provides a hybrid analysis, combining perturbation analysis and signal propagation, to shed light on the different regularization effects of SD. Our analysis allows us to derive principled guidelines for choosing the survival rates used for training with SD.
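For readers unfamiliar with the mechanism, the sketch below illustrates Stochastic Depth on a single residual block. It is a minimal illustration assuming a PyTorch-style module; the class name StochasticDepthBlock, the survival_rate parameter, and the residual branch passed in are all illustrative choices, not the authors' implementation.

```python
# Minimal Stochastic Depth sketch for one residual block (illustrative only;
# assumes PyTorch is available; this is not the paper's code).
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    """During training: y = x + b * f(x), with b ~ Bernoulli(survival_rate).
    At inference: the residual branch is scaled by the survival rate."""

    def __init__(self, residual_branch: nn.Module, survival_rate: float = 0.8):
        super().__init__()
        self.residual_branch = residual_branch
        self.survival_rate = survival_rate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Randomly skip the whole residual branch with probability 1 - p.
            if torch.rand(1).item() < self.survival_rate:
                return x + self.residual_branch(x)
            return x  # block reduces to the identity for this forward pass
        # Inference: use the expected forward pass, scaling the branch by p.
        return x + self.survival_rate * self.residual_branch(x)
```

In Huang et al. [2016], the survival rate is typically set to decay linearly with depth; the guidelines derived in this paper concern how such per-block rates should be chosen.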

Related research

10/14/2018  On the relationship between Dropout and Equiangular Tight Frames
Dropout is a popular regularization technique in neural networks. Yet, t...

10/24/2020  Stable ResNet
Deep ResNet architectures have achieved state of the art performance on ...

10/14/2017  Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization
Overfitting is one of the most critical challenges in deep neural networ...

06/22/2022  Information Geometry of Dropout Training
Dropout is one of the most popular regularization techniques in neural n...

02/07/2018  ShakeDrop regularization
This paper proposes a powerful regularization method named ShakeDrop reg...

01/22/2018  The Hybrid Bootstrap: A Drop-in Replacement for Dropout
Regularization is an important component of predictive model building. T...

10/11/2021  Disturbing Target Values for Neural Network Regularization
Diverse regularization techniques have been developed such as L2 regular...
