Group Normalization

03/22/2018
by Yuxin Wu, et al.

Batch Normalization (BN) is a milestone technique in the development of deep learning, enabling various networks to train. However, normalizing along the batch dimension introduces problems --- BN's error increases rapidly when the batch size becomes smaller, caused by inaccurate batch statistics estimation. This limits BN's usage for training larger models and transferring features to computer vision tasks including detection, segmentation, and video, which require small batches constrained by memory consumption. In this paper, we present Group Normalization (GN) as a simple alternative to BN. GN divides the channels into groups and computes within each group the mean and variance for normalization. GN's computation is independent of batch sizes, and its accuracy is stable in a wide range of batch sizes. On ResNet-50 trained in ImageNet, GN has 10.6% lower error than its BN counterpart when using a batch size of 2; when using typical batch sizes, GN is comparably good with BN and outperforms other normalization variants. Moreover, GN can be naturally transferred from pre-training to fine-tuning. GN can outperform or compete with its BN-based counterparts for object detection and segmentation in COCO, and for video classification in Kinetics, showing that GN can effectively replace the powerful BN in a variety of tasks. GN can be easily implemented by a few lines of code in modern libraries.
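
The grouping scheme described in the abstract is straightforward to express directly. Below is a minimal NumPy sketch of the per-group statistics computation for an NCHW tensor, assuming 32 groups (the paper's default); the function and argument names are illustrative, not any particular library's API.

```python
import numpy as np

def group_norm(x, gamma, beta, num_groups=32, eps=1e-5):
    # x: input of shape (N, C, H, W); gamma, beta: per-channel scale/shift of shape (C,)
    # C must be divisible by num_groups.
    N, C, H, W = x.shape
    # Split the channel axis into groups: (N, G, C // G, H, W)
    x = x.reshape(N, num_groups, C // num_groups, H, W)
    # Mean and variance are computed per sample and per group,
    # so the statistics do not depend on the batch size.
    mean = x.mean(axis=(2, 3, 4), keepdims=True)
    var = x.var(axis=(2, 3, 4), keepdims=True)
    x = (x - mean) / np.sqrt(var + eps)
    # Restore the original layout and apply the learned per-channel affine transform.
    x = x.reshape(N, C, H, W)
    return x * gamma.reshape(1, C, 1, 1) + beta.reshape(1, C, 1, 1)

# Example usage with an illustrative input:
x = np.random.randn(2, 64, 56, 56).astype(np.float32)
y = group_norm(x, gamma=np.ones(64, np.float32), beta=np.zeros(64, np.float32))
```

Because the mean and variance are taken over each sample's group of channels rather than over the batch axis, the computation is identical whether the batch holds one example or many, which is the property the abstract emphasizes.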


