Cross-Iteration Batch Normalization

02/13/2020
by Zhuliang Yao et al.

A well-known issue of Batch Normalization is its significantly reduced effectiveness in the case of small mini-batch sizes. When a mini-batch contains few examples, the statistics upon which the normalization is defined cannot be reliably estimated from it during a training iteration. To address this problem, we present Cross-Iteration Batch Normalization (CBN), in which examples from multiple recent iterations are jointly utilized to enhance estimation quality. A challenge of computing statistics over multiple iterations is that the network activations from different iterations are not comparable to each other due to changes in network weights. We thus compensate for the network weight changes via a proposed technique based on Taylor polynomials, so that the statistics can be accurately estimated and batch normalization can be effectively applied. On object detection and image classification with small mini-batch sizes, CBN is found to outperform the original batch normalization and a direct calculation of statistics over previous iterations without the proposed compensation technique.
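The abstract describes the mechanism only at a high level. Below is a minimal PyTorch-style sketch of the idea, not the authors' released implementation: it assumes the normalization follows a single Conv2d layer and compensates stale per-channel statistics with a first-order Taylor term taken with respect to that layer's own weight. The class name CrossIterationBN2d, the window size, and the buffer layout are illustrative choices, and compensation with respect to the conv bias or earlier layers is omitted.

```python
import torch
import torch.nn as nn


class CrossIterationBN2d(nn.Module):
    """Sketch of cross-iteration BN applied to the output of a given Conv2d layer."""

    def __init__(self, conv: nn.Conv2d, window: int = 4, eps: float = 1e-5):
        super().__init__()
        self.conv = conv            # preceding conv; its weight drives the compensation
        self.window = window        # number of recent iterations contributing statistics
        self.eps = eps
        c = conv.out_channels
        self.gamma = nn.Parameter(torch.ones(c))
        self.beta = nn.Parameter(torch.zeros(c))
        self.history = []           # past stats, their gradients, and weight snapshots

    def forward(self, x):
        w = self.conv.weight
        y = self.conv(x)                           # pre-normalization activations
        mu = y.mean(dim=(0, 2, 3))                 # per-channel mean
        nu = (y * y).mean(dim=(0, 2, 3))           # per-channel mean of squares

        if self.training:
            # Gradients of the statistics w.r.t. this layer's own weight. Each output
            # channel's statistics depend only on that channel's filter, so a single
            # grad call recovers the per-channel derivatives.
            d_mu = torch.autograd.grad(mu.sum(), w, retain_graph=True)[0]
            d_nu = torch.autograd.grad(nu.sum(), w, retain_graph=True)[0]

            mus, nus = [mu.detach()], [nu.detach()]
            for past in self.history:
                dw = w.detach() - past["w"]        # weight change since that iteration
                # First-order Taylor compensation of the stale statistics.
                mu_tau = past["mu"] + (past["d_mu"] * dw).sum(dim=(1, 2, 3))
                nu_tau = past["nu"] + (past["d_nu"] * dw).sum(dim=(1, 2, 3))
                nu_tau = torch.maximum(nu_tau, mu_tau * mu_tau)  # keep variance non-negative
                mus.append(mu_tau)
                nus.append(nu_tau)

            mu_bar = torch.stack(mus).mean(dim=0)
            var_bar = (torch.stack(nus).mean(dim=0) - mu_bar * mu_bar).clamp_min(0.0)

            # Save this iteration's statistics for reuse over the next few iterations.
            self.history.insert(0, {"w": w.detach().clone(), "mu": mu.detach(),
                                    "nu": nu.detach(), "d_mu": d_mu.detach(),
                                    "d_nu": d_nu.detach()})
            self.history = self.history[: self.window - 1]
        else:
            # Inference would normally use running averages; this sketch simply
            # falls back to the current batch statistics.
            mu_bar = mu
            var_bar = (nu - mu * mu).clamp_min(0.0)

        y_hat = (y - mu_bar[None, :, None, None]) / torch.sqrt(
            var_bar[None, :, None, None] + self.eps)
        return self.gamma[None, :, None, None] * y_hat + self.beta[None, :, None, None]
```

Restricting the Taylor expansion to the layer's own weight is an approximation made here for simplicity: the statistics of a channel depend most strongly on that channel's filter, so a single autograd.grad call recovers the needed derivatives at modest cost. How the paper handles inference-time statistics and upstream dependencies is not covered by the abstract and is left out of this sketch.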
