Guidelines for the Regularization of Gammas in Batch Normalization for Deep Residual Networks

05/15/2022
by   Bum Jun Kim, et al.

L2 regularization of weights in neural networks is widely used as a standard training trick. However, L2 regularization of gamma, a trainable scale parameter of batch normalization, remains largely unexamined and is applied inconsistently across libraries and practitioners. In this paper, we study whether L2 regularization of gamma is valid. To explore this issue, we consider two approaches: 1) variance control, which makes the residual network behave like an identity mapping, and 2) stable optimization through improvement of the effective learning rate. Through these two analyses, we identify the gammas for which L2 regularization is desirable and those for which it is not, and propose four guidelines for managing them. In several experiments, we observed performance increases and decreases caused by applying L2 regularization to the four categories of gamma, consistent with our guidelines. The proposed guidelines were validated across various tasks and architectures, including variants of residual networks and transformers.
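The practical inconsistency described above usually comes down to whether batch-norm gammas are placed in the optimizer's weight-decay group. The sketch below illustrates this selective grouping with plain Python; the parameter names (e.g. "bn1.weight") follow PyTorch's convention, where a BatchNorm layer's gamma is stored as its "weight", but the function and names are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch: splitting parameters into weight-decay / no-decay groups so that
# L2 regularization can be applied selectively to batch-norm gammas.
# This mirrors the common practice of decaying conv/linear weights while
# choosing per-parameter whether a gamma ("bn*.weight") is decayed.

def split_decay_groups(param_names, decay_bn_gamma=False):
    """Return (decay, no_decay) lists of parameter names.

    Conv/linear weights always receive L2 regularization; batch-norm
    gammas are included only when decay_bn_gamma is True, reflecting the
    point that some gammas should be decayed and others should not.
    """
    decay, no_decay = [], []
    for name in param_names:
        is_bn_gamma = (".bn" in name or name.startswith("bn")) and name.endswith(".weight")
        if name.endswith(".bias"):
            no_decay.append(name)      # biases are conventionally not decayed
        elif is_bn_gamma and not decay_bn_gamma:
            no_decay.append(name)      # exempt gamma from L2 regularization
        else:
            decay.append(name)
    return decay, no_decay

params = ["conv1.weight", "bn1.weight", "bn1.bias", "fc.weight", "fc.bias"]
decay, no_decay = split_decay_groups(params, decay_bn_gamma=False)
# decay     -> ["conv1.weight", "fc.weight"]
# no_decay  -> ["bn1.weight", "bn1.bias", "fc.bias"]
```

In PyTorch, the two name lists would typically become two optimizer parameter groups, one with `weight_decay` set and one with `weight_decay=0`.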


Related research

- L2 Regularization versus Batch and Weight Normalization (06/16/2017): Batch Normalization is a commonly used trick to improve the training of ...
- How to Use Dropout Correctly on Residual Networks with Batch Normalization (02/13/2023): For the stable optimization of deep neural networks, regularization meth...
- Batch Normalization Biases Deep Residual Networks Towards Shallow Paths (02/24/2020): Batch normalization has multiple benefits. It improves the conditioning ...
- Fixup Initialization: Residual Learning Without Normalization (01/27/2019): Normalization layers are a staple in state-of-the-art deep neural networ...
- Theoretical Insight into Batch Normalization: Data Dependant Auto-Tuning of Regularization Rate (09/15/2022): Batch normalization is widely used in deep learning to normalize interme...
- Analysis on Gradient Propagation in Batch Normalized Residual Networks (12/02/2018): We conduct mathematical analysis on the effect of batch normalization (B...
