Revisiting Batch Normalization

10/26/2021
by Jim Davis, et al.

Batch normalization (BN) consists of a normalization component followed by an affine transformation and has become essential for training deep neural networks. Standard initialization of each BN layer in a network sets the affine scale and shift to 1 and 0, respectively. However, after training we observed that these parameters deviate little from their initialization. Furthermore, we noticed that the normalization process can still yield overly large values, which is undesirable for training. We revisit the BN formulation and present a new initialization method and update approach for BN to address these issues. Experimental results using the proposed alterations to BN show statistically significant performance gains in a variety of scenarios. The approach can be used with existing implementations at no additional computational cost. We also present a new online BN-based input data normalization technique to alleviate the need for other offline or fixed methods. Source code is available at https://github.com/osu-cvl/revisiting-bn.
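To make the two-part structure of BN concrete, here is a minimal NumPy sketch of the standard BN forward pass described above (not the paper's proposed variant): a normalization step followed by an affine transformation, with the usual initialization of scale = 1 and shift = 0. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Standard batch normalization over the batch axis.

    x: (N, D) activations; gamma/beta: (D,) affine scale and shift.
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalization component
    return gamma * x_hat + beta              # affine transformation

# Standard initialization discussed in the abstract: scale = 1, shift = 0
N, D = 8, 4
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(N, D))
gamma, beta = np.ones(D), np.zeros(D)
y = batch_norm(x, gamma, beta)
```

With this default initialization the affine transformation is the identity, so the output is simply the normalized activations (zero mean, unit variance per feature over the batch); the paper's observation is that training moves gamma and beta only slightly away from these defaults.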


