Group Whitening: Balancing Learning Efficiency and Representational Capacity

09/28/2020
by   Lei Huang, et al.

Batch normalization (BN) is an important technique commonly incorporated into deep learning models to perform standardization within mini-batches. The merits of BN in improving a model's learning efficiency can be further amplified by applying whitening, while its drawback of having to estimate population statistics for inference can be avoided through group normalization (GN). This paper proposes group whitening (GW), which exploits the advantages of the whitening operation while avoiding the disadvantages of normalization within mini-batches. Specifically, GW divides the neurons of a sample into groups for standardization, like GN, and then further decorrelates the groups. In addition, we quantitatively analyze the constraint imposed by normalization and show how the batch size (group number) affects the performance of batch (group) normalized networks, from the perspective of the model's representational capacity. This analysis provides theoretical guidance for applying GW in practice. Finally, we apply the proposed GW to ResNet and ResNeXt architectures and conduct experiments on the ImageNet and COCO benchmarks. Results show that GW consistently improves the performance of different architectures, with absolute gains of 1.02% ∼ 1.49% in top-1 accuracy on ImageNet and 1.82% ∼ 3.21% in bounding box AP on COCO.
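The operation the abstract describes (GN-style per-sample group standardization followed by decorrelation of the groups) can be sketched as follows. This is a minimal NumPy illustration under our own assumptions, not the paper's implementation: it processes a single sample, uses ZCA whitening for the decorrelation step, and the function name, group count, and `eps` are our choices.

```python
import numpy as np

def group_whitening(x, num_groups=16, eps=1e-5):
    """Illustrative sketch of group whitening (GW) for one sample.

    x: array of shape (C, H, W). The C channels are split into
    `num_groups` groups; each group is standardized to zero mean and
    unit variance (as in group normalization), and the groups are then
    decorrelated with ZCA whitening.
    """
    C, H, W = x.shape
    g = num_groups
    assert C % g == 0, "channel count must be divisible by the group number"
    X = x.reshape(g, -1)  # each row holds one group: (g, C//g * H * W)

    # Step 1: standardize each group, as in GN.
    X = (X - X.mean(axis=1, keepdims=True)) / np.sqrt(
        X.var(axis=1, keepdims=True) + eps)

    # Step 2: decorrelate the g groups via ZCA whitening.
    m = X.shape[1]
    cov = X @ X.T / m + eps * np.eye(g)            # (g, g) group covariance
    vals, vecs = np.linalg.eigh(cov)               # symmetric eigendecomposition
    whiten = vecs @ np.diag(vals ** -0.5) @ vecs.T  # cov^{-1/2}
    return (whiten @ X).reshape(C, H, W)
```

Because the statistics are computed per sample over groups rather than over the mini-batch, the output is independent of batch size, which is the property that lets GW sidestep BN's train/inference estimation mismatch. After whitening, the g×g covariance of the group matrix is (up to `eps`) the identity.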


Related research

- 03/21/2022 · Delving into the Estimation Shift of Batch Normalization in a Network. "Batch normalization (BN) is a milestone technique in deep learning. It n..."
- 11/21/2019 · Filter Response Normalization Layer: Eliminating Batch Dependence in the Training of Deep Neural Networks. "Batch Normalization (BN) is a highly successful and widely used batch de..."
- 03/22/2018 · Group Normalization. "Batch Normalization (BN) is a milestone technique in the development of ..."
- 03/27/2020 · An Investigation into the Stochasticity of Batch Whitening. "Batch Normalization (BN) is extensively employed in various network arch..."
- 06/07/2021 · Making EfficientNet More Efficient: Exploring Batch-Independent Normalization, Group Convolutions and Reduced Resolution Training. "Much recent research has been dedicated to improving the efficiency of t..."
- 04/06/2019 · Instance-Level Meta Normalization. "This paper presents a normalization mechanism called Instance-Level Meta..."
- 04/12/2019 · EvalNorm: Estimating Batch Normalization Statistics for Evaluation. "Batch normalization (BN) has been very effective for deep learning and i..."
