Understanding and Improving Group Normalization

by Agus Gunawan, et al.
KAIST, Department of Mathematical Sciences

Various normalization layers have been proposed to help the training of neural networks. Group Normalization (GN) is one of the most effective and attractive of these, achieving strong performance on visual recognition tasks. Despite this success, GN still has several issues that may negatively impact neural network training. In this paper, we introduce an analysis framework and discuss the working principles of GN and how it affects the training process of a neural network. From experimental results, we identify the real causes of GN's inferior performance relative to Batch Normalization (BN): 1) unstable training performance, and 2) greater sensitivity to distortion, whether it comes from external noise or from perturbations introduced by regularization. In addition, we find that GN helps neural network training only during a specific period, unlike BN, which helps the network throughout training. To address these issues, we propose a new normalization layer built on top of GN that incorporates the advantages of BN. Experimental results on image classification demonstrate that the proposed normalization layer outperforms the official GN, improving recognition accuracy regardless of batch size and stabilizing network training.
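For reference, standard Group Normalization (Wu & He, 2018) divides the channels of each sample into groups and normalizes each group by its own mean and variance, independently of the batch size. A minimal NumPy sketch of that baseline computation (the function name, `eps` default, and optional affine parameters are illustrative choices, not the paper's proposed layer) might look like:

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5, gamma=None, beta=None):
    """Group Normalization over an NCHW tensor.

    Each sample's channels are split into `num_groups` groups;
    mean/variance are computed per (sample, group) over the
    group's channels and all spatial positions.
    """
    N, C, H, W = x.shape
    assert C % num_groups == 0, "channels must divide evenly into groups"
    g = x.reshape(N, num_groups, C // num_groups, H, W)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    out = g.reshape(N, C, H, W)
    # Optional learnable per-channel scale and shift, as in GN/BN.
    if gamma is not None:
        out = out * gamma.reshape(1, C, 1, 1)
    if beta is not None:
        out = out + beta.reshape(1, C, 1, 1)
    return out
```

Because the statistics are computed per sample rather than across the batch, the output is identical for batch size 1 or 128; this is the property that makes GN attractive when large batches are infeasible, while BN's batch-level statistics are what GN lacks and what the proposed layer aims to recover.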




