Do Normalization Layers in a Deep ConvNet Really Need to Be Distinct?

11/19/2018
by   Ping Luo, et al.

Yes, they do. This work investigates a fundamental question in deep learning: whether different normalization layers in a ConvNet require different normalizers. This is a first step towards understanding this phenomenon. We allow each convolutional layer to be stacked before a switchable normalization (SN) layer that learns to choose a normalizer from a pool of normalization methods. Through systematic experiments on ImageNet, COCO, Cityscapes, and ADE20K, we answer three questions: (a) Is it useful to allow each normalization layer to select its own normalizer? (b) What impacts the choice of normalizers? (c) Do different tasks and datasets prefer different normalizers? Our results suggest that (1) using distinct normalizers improves both learning and generalization of a ConvNet; (2) the choice of normalizers is more related to depth and batch size, but less relevant to parameter initialization, learning rate decay, and solver; (3) different tasks and datasets exhibit different behaviors when learning to select normalizers.
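The core mechanism the abstract describes — letting each layer blend statistics from a pool of normalizers (Instance Norm, Layer Norm, Batch Norm) via learned softmax weights — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function and weight names (`switchable_norm`, `w_mean`, `w_var`) are made up for this example, and the learnable affine parameters of the real SN layer are omitted.

```python
import numpy as np

def switchable_norm(x, w_mean, w_var, eps=1e-5):
    """Sketch of Switchable Normalization for a 4-D activation x of
    shape (N, C, H, W). SN computes Instance Norm, Layer Norm, and
    Batch Norm statistics, blends them with softmax importance
    weights, and normalizes with the blended mean and variance."""
    # Instance Norm statistics: one (mean, var) per (sample, channel).
    mu_in = x.mean(axis=(2, 3), keepdims=True)
    var_in = x.var(axis=(2, 3), keepdims=True)
    # Layer Norm statistics: aggregate the IN statistics over channels.
    mu_ln = mu_in.mean(axis=1, keepdims=True)
    var_ln = (var_in + mu_in**2).mean(axis=1, keepdims=True) - mu_ln**2
    # Batch Norm statistics: aggregate the IN statistics over the batch.
    mu_bn = mu_in.mean(axis=0, keepdims=True)
    var_bn = (var_in + mu_in**2).mean(axis=0, keepdims=True) - mu_bn**2

    def softmax(w):
        e = np.exp(np.asarray(w, dtype=float) - np.max(w))
        return e / e.sum()

    # Importance weights over {IN, LN, BN}; in SN these are learned
    # end-to-end per layer, which is what lets layers pick distinct
    # normalizers.
    p_mu, p_var = softmax(w_mean), softmax(w_var)
    mu = p_mu[0] * mu_in + p_mu[1] * mu_ln + p_mu[2] * mu_bn
    var = p_var[0] * var_in + p_var[1] * var_ln + p_var[2] * var_bn
    return (x - mu) / np.sqrt(var + eps)
```

With the weights pushed heavily toward one entry of the pool, the layer degenerates to that single normalizer, which is how one can check the blending logic: near-one-hot IN weights should give zero mean and unit variance per (sample, channel).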


