
Filter Response Normalization Layer: Eliminating Batch Dependence in the Training of Deep Neural Networks

by Saurabh Singh et al.

Batch Normalization (BN) is a highly successful and widely used batch-dependent training method. Its use of mini-batch statistics to normalize the activations introduces dependence between samples, which can hurt training if the mini-batch size is too small or if the samples are correlated. Several alternatives, such as Batch Renormalization and Group Normalization (GN), have been proposed to address these issues. However, they either do not match the performance of BN for large batches, still exhibit performance degradation for smaller batches, or introduce artificial constraints on the model architecture. In this paper we propose the Filter Response Normalization (FRN) layer, a novel combination of a normalization and an activation function that can be used as a drop-in replacement for other normalizations and activations. Our method operates on each activation map of each batch sample independently, eliminating the dependency on other batch samples or on other channels of the same sample. Our method outperforms BN and all alternatives in a variety of settings for all batch sizes. The FRN layer achieves ≈0.7–1.0% higher top-1 validation accuracy than BN with large mini-batch sizes on ImageNet classification with the InceptionV3 and ResNetV2-50 architectures. Further, it performs >1% better than GN on the same problem in the small mini-batch-size regime. For object detection on the COCO dataset, the FRN layer outperforms all other methods by at least 0.3–0.5% in all batch-size regimes.
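
The abstract describes the layer only at a high level: per-sample, per-channel normalization followed by a learned activation. As a concrete illustration, below is a minimal, unofficial PyTorch sketch of an FRN layer with its Thresholded Linear Unit (TLU), based on the formulation in the full paper (ν² is the mean squared activation over each channel's spatial extent; γ, β, and τ are learned per-channel parameters). The module name and the fixed value of ε are assumptions for this sketch, not the authors' code.

import torch
import torch.nn as nn

class FilterResponseNorm2d(nn.Module):
    """Sketch of FRN + TLU: normalizes each channel of each sample independently."""

    def __init__(self, num_channels, eps=1e-6):
        super().__init__()
        shape = (1, num_channels, 1, 1)
        self.eps = eps                                # small constant; the paper also studies a learned eps
        self.gamma = nn.Parameter(torch.ones(shape))  # learned per-channel scale
        self.beta = nn.Parameter(torch.zeros(shape))  # learned per-channel bias
        self.tau = nn.Parameter(torch.zeros(shape))   # learned TLU threshold

    def forward(self, x):
        # x has shape (N, C, H, W). nu2 is the mean squared activation of each
        # (sample, channel) pair over its spatial extent only, so no statistics
        # are shared across batch samples or across channels.
        nu2 = x.pow(2).mean(dim=(2, 3), keepdim=True)
        x = x * torch.rsqrt(nu2 + self.eps)
        # Affine transform followed by the Thresholded Linear Unit max(y, tau).
        return torch.max(self.gamma * x + self.beta, self.tau)

A layer like FilterResponseNorm2d(64) would stand in for a BatchNorm2d + ReLU pair. Because ν² involves no batch statistics, the layer behaves identically at every batch size and needs no separate inference-time statistics.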


Related research

Batch Kalman Normalization: Towards Training Deep Neural Networks with Micro-Batches

As an indispensable component, Batch Normalization (BN) has successfully...

Cross-Iteration Batch Normalization

A well-known issue of Batch Normalization is its significantly reduced e...

EvalNorm: Estimating Batch Normalization Statistics for Evaluation

Batch normalization (BN) has been very effective for deep learning and i...

Batch Normalization Sampling

Deep Neural Networks (DNNs) thrive in recent years in which Batch Normal...

Group Whitening: Balancing Learning Efficiency and Representational Capacity

Batch normalization (BN) is an important technique commonly incorporated...

Micro Batch Streaming: Allowing the Training of DNN Models Using a Large Batch Size on Small Memory Systems

The size of the deep learning models has greatly increased over the past...

Batch Renormalization: Towards Reducing Minibatch Dependence in Batch-Normalized Models

Batch Normalization is quite effective at accelerating and improving the...