Exemplar Normalization for Learning Deep Representation

03/19/2020
by Ruimao Zhang, et al.

Normalization techniques are essential in modern deep neural networks and across a wide range of tasks. This work investigates a novel dynamic learning-to-normalize (L2N) problem by proposing Exemplar Normalization (EN), which learns different normalization methods for different convolutional layers and image samples of a deep network. EN significantly improves the flexibility of the recently proposed Switchable Normalization (SN), which solves a static L2N problem by linearly combining several normalizers in each normalization layer, with a combination that is the same for all samples. Rather than directly employing a multi-layer perceptron (MLP) to learn data-dependent parameters, as conditional batch normalization (cBN) does, the internal architecture of EN is carefully designed to stabilize its optimization, leading to several appealing benefits. (1) EN enables different convolutional layers, image samples, categories, benchmarks, and tasks to use different normalization methods, shedding light on analyzing them in a holistic view. (2) EN is effective for various network architectures and tasks. (3) It can replace any normalization layer in a deep network and still yield stable training. Extensive experiments demonstrate the effectiveness of EN across a wide spectrum of tasks, including image recognition, noisy-label learning, and semantic segmentation. For example, by replacing BN in an ordinary ResNet50, the improvement produced by EN is 300% greater than that of SN on both ImageNet and the noisy WebVision dataset.
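To make the per-sample selection idea concrete, below is a minimal PyTorch-style sketch of an EN-like layer, assuming three candidate normalizers (batch, instance, and layer normalization) whose statistics are mixed with importance ratios predicted from the input itself. The name ExemplarNorm2d and the pooled-feature gating subnetwork are illustrative simplifications, not the authors' exact module design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExemplarNorm2d(nn.Module):
    """Sketch of Exemplar Normalization: mix the statistics of several
    normalizers (BN, IN, LN) with importance ratios predicted per sample,
    so each image effectively chooses its own normalization method."""

    def __init__(self, num_channels, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        self.weight = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        # Running statistics for the BN branch (used at inference).
        self.register_buffer("running_mean", torch.zeros(num_channels))
        self.register_buffer("running_var", torch.ones(num_channels))
        # Lightweight gate: pooled features -> 3 per-sample ratios.
        # (An assumption for illustration; the paper uses a carefully
        # designed internal architecture rather than a plain MLP.)
        self.gate = nn.Linear(num_channels, 3)

    def forward(self, x):
        N, C, H, W = x.shape

        # Statistics of each candidate normalizer.
        if self.training:
            bn_mean = x.mean(dim=(0, 2, 3))                      # (C,)
            bn_var = x.var(dim=(0, 2, 3), unbiased=False)        # (C,)
            with torch.no_grad():
                self.running_mean.mul_(1 - self.momentum).add_(self.momentum * bn_mean)
                self.running_var.mul_(1 - self.momentum).add_(self.momentum * bn_var)
        else:
            bn_mean, bn_var = self.running_mean, self.running_var
        in_mean = x.mean(dim=(2, 3), keepdim=True)               # (N, C, 1, 1)
        in_var = x.var(dim=(2, 3), unbiased=False, keepdim=True)
        ln_mean = x.mean(dim=(1, 2, 3), keepdim=True)            # (N, 1, 1, 1)
        ln_var = x.var(dim=(1, 2, 3), unbiased=False, keepdim=True)

        # Per-sample importance ratios over the three normalizers.
        pooled = x.mean(dim=(2, 3))                              # (N, C)
        ratios = F.softmax(self.gate(pooled), dim=1)             # (N, 3)
        r_bn, r_in, r_ln = (ratios[:, i].reshape(N, 1, 1, 1) for i in range(3))

        # Mix statistics (as SN does), but with sample-dependent ratios.
        mean = (r_bn * bn_mean.view(1, C, 1, 1)
                + r_in * in_mean + r_ln * ln_mean)
        var = (r_bn * bn_var.view(1, C, 1, 1)
               + r_in * in_var + r_ln * ln_var)

        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return x_hat * self.weight + self.bias
```

A layer like this is a drop-in replacement for nn.BatchNorm2d in, e.g., a ResNet50 block; the key difference from SN is that the softmax ratios vary per input sample rather than being shared, fixed parameters.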

