Batch Normalization Preconditioning for Neural Network Training

08/02/2021
by Susanna Lange et al.

Batch normalization (BN) is a popular and ubiquitous method in deep learning that has been shown to decrease training time and improve the generalization performance of neural networks. Despite its success, BN is not theoretically well understood, and it is not suitable for use with very small mini-batch sizes or online learning. In this paper, we propose a new method called Batch Normalization Preconditioning (BNP). Instead of applying normalization explicitly through a batch normalization layer, as is done in BN, BNP applies normalization by preconditioning the parameter gradients directly during training. This is designed to improve the conditioning of the Hessian of the loss function and hence accelerate convergence. One benefit is that BNP is not constrained by the mini-batch size and works in the online learning setting. Furthermore, its connection to BN provides theoretical insight into how BN improves training and how BN is applied to special architectures such as convolutional neural networks.
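The following is a minimal sketch of the idea described above, not the paper's exact update rule. It assumes a single fully connected layer y = Wx + b and illustrates one plausible form of gradient preconditioning: right-multiplying the gradient of the augmented parameters [W | b] by the inverse of a regularized second-moment matrix of the mini-batch inputs, which is the kind of Hessian conditioning the abstract attributes to BNP. The function name `bnp_precondition`, the augmented-input construction, and the `eps` regularizer are illustrative choices, not taken from the paper.

```python
import numpy as np

def bnp_precondition(grad_Wb, x_batch, eps=1e-5):
    """Illustrative sketch of BN-style gradient preconditioning.

    grad_Wb : (out, in+1) loss gradient w.r.t. augmented parameters [W | b]
    x_batch : (batch, in) inputs to the layer for the current mini-batch
    eps     : regularizer keeping the statistics matrix invertible
    """
    n = x_batch.shape[0]
    # Append a constant 1 to each input so the bias is handled uniformly
    # with the weights (the augmented parameters act on [x; 1]).
    x_aug = np.hstack([x_batch, np.ones((n, 1))])
    # Regularized second-moment matrix of the augmented inputs: its last
    # row/column holds the per-feature means and its diagonal the raw
    # second moments, the statistics BN would normalize explicitly.
    S = x_aug.T @ x_aug / n + eps * np.eye(x_aug.shape[1])
    # Right-preconditioning by S^{-1} rescales and decorrelates the
    # gradient directions, mimicking the effect of feeding the layer
    # normalized inputs without inserting a BN layer. For a quadratic
    # loss in this layer's parameters, this is a Newton-like step.
    return grad_Wb @ np.linalg.inv(S)
```

An optimizer would take its step with the returned matrix in place of the raw gradient. Because the mini-batch enters only through the statistics in S, running estimates could be substituted when batches are very small, which is consistent with the abstract's claim that a scheme of this kind is not constrained by the mini-batch size and extends to online learning.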


