Double Forward Propagation for Memorized Batch Normalization

10/10/2020
by Yong Guo, et al.

Batch Normalization (BN) has become a standard component in the design of deep neural networks (DNNs). Although standard BN can significantly accelerate the training of DNNs and improve their generalization performance, it has several underlying limitations that may hamper performance in both training and inference. In the training stage, BN relies on estimating the mean and variance of the data from a single mini-batch; consequently, BN can be unstable when the batch size is very small or the data is poorly sampled. In the inference stage, BN often uses the so-called moving mean and moving variance instead of batch statistics, i.e., the training and inference rules of BN are inconsistent. To address these issues, we propose Memorized Batch Normalization (MBN), which considers multiple recent batches to obtain more accurate and robust statistics. Note that after the SGD update for each batch, the model parameters change and the features change accordingly, leading to a distribution shift for the considered batch before and after the update. To alleviate this issue, we present a simple Double-Forward scheme for MBN that further improves performance. Compared with related methods, the proposed MBN behaves consistently in training and inference. Empirical results show that MBN-based models trained with the Double-Forward scheme greatly reduce sensitivity to data sampling and significantly improve generalization performance.
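To make the idea concrete, below is a minimal PyTorch sketch of the two components the abstract describes: a normalization layer that pools statistics over a memory of recent batches, and a training step with a second forward pass after the SGD update. This is an illustration, not the authors' implementation; the memory size, the uniform averaging over memorized batches (the paper may weight recent batches differently), and the `double_forward_step` helper are assumptions made for the sketch.

```python
import torch
import torch.nn as nn
from collections import deque


class MemorizedBatchNorm2d(nn.Module):
    """Sketch of MBN: normalize with statistics pooled over the k most
    recent mini-batches instead of the current batch alone."""

    def __init__(self, num_features, memory_size=5, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(num_features))
        self.beta = nn.Parameter(torch.zeros(num_features))
        # FIFO memories of per-batch means and variances (an assumption;
        # the paper's exact bookkeeping is not reproduced here).
        self.mean_memory = deque(maxlen=memory_size)
        self.var_memory = deque(maxlen=memory_size)

    def forward(self, x):  # x: (N, C, H, W)
        if self.training:
            batch_mean = x.mean(dim=(0, 2, 3))
            batch_var = x.var(dim=(0, 2, 3), unbiased=False)
            # Pool the current batch with detached statistics from the
            # memorized recent batches for a more robust estimate.
            mean = torch.stack([batch_mean] + list(self.mean_memory)).mean(0)
            var = torch.stack([batch_var] + list(self.var_memory)).mean(0)
            self.mean_memory.append(batch_mean.detach())
            self.var_memory.append(batch_var.detach())
        else:
            # Inference reuses the same pooled statistics, so the
            # training and inference rules stay consistent.
            mean = torch.stack(list(self.mean_memory)).mean(0)
            var = torch.stack(list(self.var_memory)).mean(0)
        x_hat = (x - mean[None, :, None, None]) / torch.sqrt(
            var[None, :, None, None] + self.eps)
        return (self.gamma[None, :, None, None] * x_hat
                + self.beta[None, :, None, None])


def double_forward_step(model, x, y, loss_fn, optimizer):
    """One training step with a second, gradient-free forward pass so the
    memorized statistics reflect the updated parameters, alleviating the
    distribution shift introduced by the SGD update."""
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        # Refresh memorized statistics under the new parameters. In this
        # sketch the post-update statistics are simply appended; a full
        # implementation would replace the stale pre-update entry instead.
        model(x)
    return loss.item()
```

In this sketch the memory deques are ordinary Python attributes rather than registered buffers, so they are not saved in checkpoints; that, too, is a simplification for readability.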


