Batch Normalization and the impact of batch structure on the behavior of deep convolution networks

02/21/2018
by Mohamed Hajaj, et al.

Batch normalization was introduced in 2015 to speed up the training of deep convolution networks by normalizing the activations across the current batch to have zero mean and unit variance. The results presented here show an interesting aspect of batch normalization: controlling the composition of the training batches can influence what the network learns. If training batches are structured as balanced batches (one image per class), and inference is also carried out on balanced test batches using the batch's own means and variances, then the conditional results improve considerably. The network exploits the strong signal from easy images in a balanced batch and propagates it, through the shared means and variances, to help decide the identity of harder images in the same batch. Balancing the test batches requires the labels of the test images, which are not available in practice; however, further investigation can be done using batch structures that are less strict and might not require the test image labels. The conditional results show the error rate reduced almost to zero for nontrivial datasets with a small number of classes, such as CIFAR-10.
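The coupling the abstract describes comes from computing the normalization statistics over the batch itself, at inference time as well as training time. A minimal NumPy sketch of that computation is below; the function name, the batch size of ten "one image per class" examples, and the feature dimension are illustrative assumptions, not the paper's code.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize activations across the batch dimension (axis 0) to
    zero mean and unit variance, then apply a learned scale and shift.
    Using the batch's own statistics (rather than stored running
    averages) couples every example in the batch through the shared
    mean and variance, which is the mechanism the abstract exploits."""
    mu = x.mean(axis=0)   # per-feature mean over the batch
    var = x.var(axis=0)   # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Hypothetical "balanced batch": one example per class (the labels are
# only conceptual here; any batch of activations normalizes the same way).
rng = np.random.default_rng(0)
batch = rng.normal(loc=2.0, scale=3.0, size=(10, 4))  # 10 examples, 4 features
out = batch_norm(batch, gamma=np.ones(4), beta=np.zeros(4))

# After normalization each feature has approximately zero mean and
# unit variance across the batch.
print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))
print(np.allclose(out.var(axis=0), 1.0, atol=1e-3))
```

Note the contrast with standard practice: typical deployments freeze running mean/variance estimates at inference so each test image is processed independently, whereas the setup studied here deliberately keeps the per-batch statistics so that easy images in a balanced test batch can inform the harder ones.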

