Controlling Covariate Shift using Equilibrium Normalization of Weights

12/11/2018
by Aaron Defazio et al.

We introduce a new normalization technique that exhibits the fast convergence properties of batch normalization by transforming layer weights instead of layer outputs. The proposed technique keeps the contributions of positive and negative weights to the layer output in equilibrium. We validate our method on a set of standard benchmarks, including CIFAR-10/100, SVHN, and ILSVRC 2012 ImageNet.
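
The abstract does not spell out how the equilibrium between positive and negative weight contributions is enforced, so the sketch below is only one plausible realization rather than the paper's actual re-parametrization: for each output unit, it rescales the positive and negative parts of the incoming weights so that their total magnitudes meet at their geometric mean. The function name balance_pos_neg, the per-output-unit granularity, and the geometric-mean target are all assumptions made for illustration.

```python
import numpy as np

def balance_pos_neg(W, eps=1e-8):
    """Rescale each output unit's incoming weights so that the total
    positive and total negative contributions have equal magnitude.

    Illustrative sketch only: the paper's exact weight transformation
    may differ from this geometric-mean balancing.
    """
    W = np.asarray(W, dtype=np.float64).copy()
    pos = np.clip(W, 0.0, None)    # positive parts of the weights
    neg = np.clip(-W, 0.0, None)   # magnitudes of the negative parts

    p = pos.sum(axis=1, keepdims=True) + eps   # per-unit positive mass
    n = neg.sum(axis=1, keepdims=True) + eps   # per-unit negative mass

    target = np.sqrt(p * n)        # balance both masses at their geometric mean
    return pos * (target / p) - neg * (target / n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 8))
    Wb = balance_pos_neg(W)
    # After balancing, positive and negative masses match per output unit.
    print(np.clip(Wb, 0, None).sum(axis=1))
    print(np.clip(-Wb, 0, None).sum(axis=1))
```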

Related research

Delving into Variance Transmission and Normalization: Shift of Average Gradient Makes the Network Collapse (03/22/2021)
Normalization operations are essential for state-of-the-art neural netwo...

Batch Layer Normalization, A new normalization layer for CNNs and RNN (09/19/2022)
This study introduces a new normalization layer termed Batch Layer Norma...

Equi-normalization of Neural Networks (02/27/2019)
Modern neural networks are over-parametrized. In particular, each rectif...

An Internal Covariate Shift Bounding Algorithm for Deep Neural Networks by Unitizing Layers' Outputs (01/09/2020)
Batch Normalization (BN) techniques have been proposed to reduce the so-...

A New Look at Ghost Normalization (07/16/2020)
Batch normalization (BatchNorm) is an effective yet poorly understood te...

Understanding and Improving Layer Normalization (11/16/2019)
Layer normalization (LayerNorm) is a technique to normalize the distribu...

On the Convergence and Robustness of Batch Normalization (09/29/2018)
Despite its empirical success, the theoretical underpinnings of the stab...
