Online Normalization for Training Neural Networks

05/15/2019
by Vitaliy Chiley, et al.

Online Normalization is a new technique for normalizing the hidden activations of a neural network. Like Batch Normalization, it normalizes along the sample dimension. While Online Normalization does not use batches, it is as accurate as Batch Normalization. We resolve a theoretical limitation of Batch Normalization by introducing an unbiased technique for computing the gradient of normalized activations. Online Normalization works with automatic differentiation by adding statistical normalization as a primitive. This technique can be used in cases not covered by some other normalizers, such as recurrent networks, fully connected networks, and networks with activation memory requirements prohibitive for batching. We show its applications to image classification, image segmentation, and language modeling. We present formal proofs and experimental results on ImageNet, CIFAR, and PTB datasets.
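To make the batch-free idea concrete, here is a minimal sketch of the forward pass in Python/NumPy: each incoming sample is normalized with exponentially decaying running estimates of the per-feature mean and variance rather than batch statistics. The class name `OnlineNorm1d`, the decay rate `alpha_fwd`, and the exponential Welford-style update are illustrative assumptions, and the paper's unbiased backward pass (its key contribution) is omitted here.

```python
import numpy as np

class OnlineNorm1d:
    """Illustrative sketch of online normalization (forward pass only).

    Maintains exponentially decaying running estimates of each feature's
    mean and variance, so a single sample can be normalized without a
    batch. The update rule below is an assumption for illustration; the
    paper additionally derives an unbiased gradient for the backward
    pass, which this sketch does not implement.
    """

    def __init__(self, num_features, alpha_fwd=0.999, eps=1e-5):
        self.alpha = alpha_fwd               # forward decay rate (assumed)
        self.eps = eps
        self.mu = np.zeros(num_features)     # running mean estimate
        self.var = np.ones(num_features)     # running variance estimate

    def __call__(self, x):
        # Normalize the incoming sample with the current estimates.
        y = (x - self.mu) / np.sqrt(self.var + self.eps)
        # Exponential Welford-style update of the running statistics.
        delta = x - self.mu
        self.mu = self.mu + (1.0 - self.alpha) * delta
        self.var = self.alpha * (self.var + (1.0 - self.alpha) * delta ** 2)
        return y

# Usage: normalize a stream of single samples, no batching required.
norm = OnlineNorm1d(num_features=4)
for _ in range(10):
    x = np.random.randn(4)
    y = norm(x)
```

Because the statistics are updated one sample at a time, activation memory stays constant in the batch size, which is what makes the method applicable to recurrent networks and other settings where batching is prohibitive.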


