
Proxy-Normalizing Activations to Match Batch Normalization while Removing Batch Dependence

06/07/2021
by Antoine Labatie, et al.

We investigate the reasons for the performance degradation incurred by batch-independent normalization. We find that the prototypical techniques of layer normalization and instance normalization both induce failure modes in the neural network's pre-activations: (i) layer normalization induces a collapse towards channel-wise constant functions; (ii) instance normalization induces a lack of variability in instance statistics, symptomatic of an alteration of expressivity. To alleviate failure mode (i) without aggravating failure mode (ii), we introduce the technique "Proxy Normalization," which normalizes post-activations using a proxy distribution. When combined with layer normalization or group normalization, this batch-independent normalization emulates batch normalization's behavior and consistently matches or exceeds its performance.
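The abstract only outlines the mechanism, so the sketch below is one hypothetical reading of what "normalizing post-activations using a proxy distribution" could look like when combined with layer normalization. The Gaussian proxy, the sampling-based estimate of its statistics, and all names (LayerNormWithProxyNorm, proxy_samples, etc.) are assumptions made for illustration; the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayerNormWithProxyNorm(nn.Module):
    """Layer normalization followed by a proxy-based normalization of the
    post-activations. Sketch only: the Gaussian proxy and the sampled
    estimate of its statistics are illustrative assumptions."""

    def __init__(self, num_channels, activation=F.gelu, eps=1e-5, proxy_samples=256):
        super().__init__()
        self.activation = activation
        self.eps = eps
        # Per-channel affine parameters applied after layer normalization.
        self.gamma = nn.Parameter(torch.ones(num_channels))
        self.beta = nn.Parameter(torch.zeros(num_channels))
        # Fixed standard-normal draws used to estimate the proxy statistics.
        self.register_buffer("proxy_z", torch.randn(proxy_samples))

    def forward(self, x):  # x: (N, C, H, W)
        # 1) Batch-independent normalization (layer norm over C, H, W).
        mean = x.mean(dim=(1, 2, 3), keepdim=True)
        var = x.var(dim=(1, 2, 3), keepdim=True, unbiased=False)
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        # 2) Channel-wise affine transform and nonlinearity.
        y = self.activation(self.gamma.view(1, -1, 1, 1) * x_hat
                            + self.beta.view(1, -1, 1, 1))
        # 3) Proxy normalization (assumed form): treat each channel's
        #    pre-activations as a Gaussian proxy N(beta_c, gamma_c^2),
        #    push the proxy through the same activation, and normalize the
        #    post-activations by the proxy's mean and standard deviation.
        proxy = self.activation(self.gamma.view(-1, 1) * self.proxy_z
                                + self.beta.view(-1, 1))          # (C, S)
        proxy_mean = proxy.mean(dim=1).view(1, -1, 1, 1)
        proxy_std = proxy.std(dim=1, unbiased=False).view(1, -1, 1, 1)
        return (y - proxy_mean) / (proxy_std + self.eps)
```

Under these assumptions, the proxy statistics depend only on the learned channel parameters and not on the batch, which is what keeps the overall normalization batch-independent while re-centering and re-scaling the post-activations in a batch-norm-like way.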


Related research

U-Net Training with Instance-Layer Normalization (08/21/2019)
Normalization layers are essential in a Deep Convolutional Neural Networ...

Stochastic Normalizations as Bayesian Learning (11/01/2018)
In this work we investigate the reasons why Batch Normalization (BN) imp...

A New Look at Ghost Normalization (07/16/2020)
Batch normalization (BatchNorm) is an effective yet poorly understood te...

Mode Normalization (10/12/2018)
Normalization methods are a central building block in the deep learning ...

Permuted AdaIN: Enhancing the Representation of Local Cues in Image Classifiers (10/09/2020)
Recent work has shown that convolutional neural network classifiers over...

Normalizing the Normalizers: Comparing and Extending Network Normalization Schemes (11/14/2016)
Normalization techniques have only recently begun to be exploited in sup...