An Empirical Study on Position of the Batch Normalization Layer in Convolutional Neural Networks

12/09/2019
by Moein Hasani et al.

In this paper, we study how the training of convolutional neural networks (CNNs) is affected by the position of the batch normalization (BN) layer. Three convolutional neural networks were chosen for our experiments: AlexNet, VGG-16, and ResNet-20. We show that the training speed-up provided by the BN algorithm can be improved by placing the BN layer in positions other than the one suggested in its original paper. We also discuss how a BN layer in a given position can aid the training of one network but not another. Three positions for the BN layer are studied in this research: between the convolutional layer and the non-linear activation function, after the non-linear activation function, and before each convolutional layer.
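The three placements compared in the paper can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: `batch_norm` normalizes each feature over the batch axis and applies a learnable scale and shift, while `conv` stands in for a convolutional layer as a simple linear map; all function names and shapes here are illustrative assumptions.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature over the batch axis, then scale and shift.
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def relu(x):
    return np.maximum(x, 0.0)

def conv(x, w):
    # Stand-in for a convolutional layer: a plain linear map.
    return x @ w

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))   # batch of 8 feature vectors
w = rng.normal(size=(4, 4))   # hypothetical layer weights

# Position 1 (original BN paper): Conv -> BN -> activation
y1 = relu(batch_norm(conv(x, w)))

# Position 2: Conv -> activation -> BN
y2 = batch_norm(relu(conv(x, w)))

# Position 3: BN before each convolutional layer
y3 = relu(conv(batch_norm(x), w))
```

All three orderings compute the same kinds of operations; the paper's point is that which ordering trains fastest depends on the network.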

