
Normalization Techniques in Training DNNs: Methodology, Analysis and Application

by   Lei Huang, et al.

Normalization techniques are essential for accelerating the training and improving the generalization of deep neural networks (DNNs), and have been used successfully in a wide range of applications. This paper reviews and comments on the past, present and future of normalization methods in the context of DNN training. We provide a unified picture of the main motivation behind different approaches from the perspective of optimization, and present a taxonomy for understanding the similarities and differences between them. Specifically, we decompose the pipeline of the most representative normalizing-activation methods into three components: normalization area partitioning, the normalization operation, and normalization representation recovery. In doing so, we provide insight for designing new normalization techniques. Finally, we discuss the current progress in understanding normalization methods, and provide a comprehensive review of the applications of normalization to particular tasks, where it can effectively address key issues.
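To make the three-component decomposition concrete, here is a minimal NumPy sketch of batch normalization with each stage labeled. This is an illustrative assumption about how the abstract's pipeline maps onto BatchNorm, not code from the paper; the function name and signature are invented for this example.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Illustrative batch normalization, split into the survey's three components."""
    # 1. Normalization area partitioning: choose which axes the statistics
    #    are computed over. For BatchNorm, each feature is normalized over
    #    the batch axis (axis 0); other methods (LayerNorm, GroupNorm)
    #    differ mainly in this choice.
    mean = x.mean(axis=0)
    var = x.var(axis=0)

    # 2. Normalization operation: standardize to zero mean and unit variance
    #    within the chosen partition.
    x_hat = (x - mean) / np.sqrt(var + eps)

    # 3. Normalization representation recovery: a learnable affine transform
    #    (gamma, beta) restores the representational capacity removed by
    #    standardization.
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 8))  # batch of 64, 8 features
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

Swapping step 1's partition (e.g., computing statistics per sample over the feature axis instead) yields other members of the taxonomy while steps 2 and 3 stay unchanged.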




Context Normalization for Robust Image Classification

Normalization is a pre-processing step that converts the data into a mor...

Towards a Theoretical Understanding of Batch Normalization

Normalization techniques such as Batch Normalization have been applied v...

A Unified Framework for Training Neural Networks

The lack of mathematical tractability of Deep Neural Networks (DNNs) has...

Beyond BatchNorm: Towards a General Understanding of Normalization in Deep Learning

Inspired by BatchNorm, there has been an explosion of normalization laye...

Towards Understanding Normalization in Neural ODEs

Normalization is an important and vastly investigated technique in deep ...

A survey of deep neural network watermarking techniques

Protecting the Intellectual Property Rights (IPR) associated to Deep Neu...