An Empirical Study of Batch Normalization and Group Normalization in Conditional Computation

07/31/2019
by Vincent Michalski, et al.

Batch normalization has been widely used to improve optimization in deep neural networks. While the uncertainty in batch statistics can act as a regularizer, using statistics specific to the training set impairs generalization in certain tasks. Recently, alternative methods for normalizing feature activations in neural networks have been proposed. Among them, group normalization has been shown to yield performance similar, and in some domains even superior, to batch normalization. All these methods apply a learned affine transformation after the normalization operation to increase representational power. Methods used in conditional computation define the parameters of these transformations as learnable functions of conditioning information. In this work, we study whether, and where, the conditional formulation of group normalization improves generalization compared with conditional batch normalization. We evaluate performance on the tasks of visual question answering, few-shot learning, and conditional image generation.
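To make the conditional formulation concrete, the PyTorch sketch below shows one way such a layer can be written: the per-channel scale and shift that follow group normalization are predicted by linear maps of a conditioning vector, rather than learned as fixed parameters. Class and parameter names (ConditionalGroupNorm, cond_dim, etc.) are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalGroupNorm(nn.Module):
    """Group normalization whose affine parameters are functions of a
    conditioning vector (a minimal sketch of the conditional formulation
    described in the abstract)."""

    def __init__(self, num_groups, num_channels, cond_dim, eps=1e-5):
        super().__init__()
        self.num_groups = num_groups
        self.num_channels = num_channels
        self.eps = eps
        # Per-channel scale (gamma) and shift (beta) predicted from the condition.
        self.to_gamma = nn.Linear(cond_dim, num_channels)
        self.to_beta = nn.Linear(cond_dim, num_channels)
        # Initialize so the layer starts out as plain group normalization
        # (gamma = 1, beta = 0 regardless of the condition).
        nn.init.zeros_(self.to_gamma.weight)
        nn.init.ones_(self.to_gamma.bias)
        nn.init.zeros_(self.to_beta.weight)
        nn.init.zeros_(self.to_beta.bias)

    def forward(self, x, cond):
        # x: (N, C, H, W) feature map; cond: (N, cond_dim) conditioning vector.
        normalized = F.group_norm(x, self.num_groups, eps=self.eps)
        gamma = self.to_gamma(cond).view(-1, self.num_channels, 1, 1)
        beta = self.to_beta(cond).view(-1, self.num_channels, 1, 1)
        return gamma * normalized + beta

# Example usage with arbitrary shapes:
layer = ConditionalGroupNorm(num_groups=8, num_channels=64, cond_dim=128)
features = torch.randn(4, 64, 16, 16)
condition = torch.randn(4, 128)
out = layer(features, condition)  # same shape as `features`
```

A conditional batch normalization layer follows the same pattern, with F.group_norm replaced by a batch-statistics normalization step; the point of the paper is to compare how these two choices generalize across tasks.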

