Understanding and Improving Group Normalization

07/05/2022
by   Agus Gunawan, et al.

Various normalization layers have been proposed to help the training of neural networks. Group Normalization (GN) is one of the most effective approaches, achieving strong performance on visual recognition tasks. Despite this success, GN still has several issues that can negatively affect neural network training. In this paper, we introduce an analysis framework and discuss the working principles of GN and how it affects the training process. From experimental results, we identify the real causes of GN's inferior performance relative to Batch Normalization (BN): 1) unstable training performance, and 2) higher sensitivity to distortion, whether it comes from external noise or from perturbations introduced by regularization. In addition, we find that GN helps neural network training only during specific periods, unlike BN, which helps the network throughout training. To address these issues, we propose a new normalization layer built on top of GN that incorporates the advantages of BN. Experimental results on image classification demonstrate that the proposed layer outperforms the official GN, improving recognition accuracy regardless of batch size and stabilizing network training.
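For context, the sketch below shows the standard Group Normalization computation (Wu & He, 2018) that the paper builds on, not the proposed layer itself: each sample's channels are split into groups, and the mean and variance are computed per group rather than across the batch, which is why GN's statistics do not depend on batch size. This is a minimal NumPy illustration; the function name, tensor shapes, and default epsilon are illustrative assumptions.

```python
import numpy as np

def group_norm(x, num_groups, gamma, beta, eps=1e-5):
    """Standard Group Normalization for an NCHW activation tensor.

    x:     activations of shape (N, C, H, W)
    gamma: learnable per-channel scale, shape (C,)
    beta:  learnable per-channel shift, shape (C,)
    """
    n, c, h, w = x.shape
    assert c % num_groups == 0, "channels must be divisible by num_groups"
    # Normalize each group of channels independently for every sample,
    # so the statistics never mix information across the batch dimension.
    xg = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = xg.mean(axis=(2, 3, 4), keepdims=True)
    var = xg.var(axis=(2, 3, 4), keepdims=True)
    xg = (xg - mean) / np.sqrt(var + eps)
    x_hat = xg.reshape(n, c, h, w)
    # Per-channel affine transform, as in BN and GN.
    return x_hat * gamma.reshape(1, c, 1, 1) + beta.reshape(1, c, 1, 1)

# Example: the output is identical whether the batch size is 1 or 32,
# since normalization statistics are computed per sample and per group.
x = np.random.randn(2, 8, 4, 4).astype(np.float32)
y = group_norm(x, num_groups=4,
               gamma=np.ones(8, np.float32), beta=np.zeros(8, np.float32))
print(y.shape)  # (2, 8, 4, 4)
```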

