Theoretical Insight into Batch Normalization: Data-Dependent Auto-Tuning of Regularization Rate

09/15/2022
by Lakshmi Annamalai, et al.

Batch normalization is widely used in deep learning to normalize intermediate activations. Deep networks are notoriously difficult to train, demanding careful weight initialization, low learning rates, and similar precautions. Batch Normalization (BN) addresses these issues by normalizing the inputs of activation functions to zero mean and unit standard deviation. Making this normalization part of the training process dramatically accelerates the training of very deep networks. A growing body of research seeks a precise theoretical explanation for the success of BN. Most of these theoretical insights attribute the benefits of BN to its influence on optimization, weight-scale invariance, and regularization. Despite BN's undeniable success in accelerating generalization, an analytical result relating the effect of BN to the regularization parameter has been missing. This paper derives, with analytical proofs, the data-dependent auto-tuning of the regularization parameter by BN. We pose BN as a constrained optimization imposed on non-BN weights, through which we demonstrate that BN auto-tunes the regularization parameter as a function of the data statistics. We also give an analytical proof of its behavior under noisy inputs, which reveals that the regularization parameter is tuned according to the signal-to-noise characteristics of the data. We substantiate our claims with empirical results from experiments on the MNIST dataset.
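As a concrete illustration of the normalization step the abstract describes, below is a minimal sketch of a BN forward pass over a mini-batch. The function name, the NumPy implementation, and the learnable scale gamma, shift beta, and epsilon constant are standard BN conventions assumed for illustration, not details taken from this paper. Note that the normalization statistics are computed from the batch itself, which is the data dependence the paper's analysis builds on.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Minimal batch-normalization forward pass (illustrative sketch).

    x:     (batch, features) pre-activation inputs
    gamma: (features,) learnable scale
    beta:  (features,) learnable shift
    """
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit std per feature
    return gamma * x_hat + beta             # learnable affine restores capacity

# The statistics (mu, var) are functions of the current batch, so the
# transform applied to the weights changes with the data distribution.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 4))
y = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # approximately 0 per feature
print(y.std(axis=0))   # approximately 1 per feature
```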

Related research

06/01/2018  Understanding Batch Normalization
Batch normalization is a ubiquitous deep learning technique that normali...

08/18/2020  Training Deep Neural Networks Without Batch Normalization
Training neural networks is an optimization problem, and finding a decen...

05/21/2015  Why Regularized Auto-Encoders learn Sparse Representation?
While the authors of Batch Normalization (BN) identify and address an im...

09/04/2018  Understanding Regularization in Batch Normalization
Batch Normalization (BN) makes the output of hidden neurons have zero mean and...

03/05/2018  Norm matters: efficient and accurate normalization schemes in deep networks
Over the past few years batch-normalization has been commonly used in de...

05/15/2022  Guidelines for the Regularization of Gammas in Batch Normalization for Deep Residual Networks
L2 regularization for weights in neural networks is widely used as a sta...

06/01/2023  Spreads in Effective Learning Rates: The Perils of Batch Normalization During Early Training
Excursions in gradient magnitude pose a persistent challenge when traini...
