Task-Balanced Batch Normalization for Exemplar-based Class-Incremental Learning

01/29/2022
by Sungmin Cha, et al.

Batch Normalization (BN) is an essential layer for training neural network models in various computer vision tasks. It has been widely used in continual learning scenarios with little discussion, but we find that BN should be applied carefully, particularly in exemplar-memory-based class-incremental learning (CIL). We first show that the empirical mean and variance computed for normalization in a BN layer become highly biased toward the current task. To tackle the problems this causes in both the training and test phases, we propose Task-Balanced Batch Normalization (TBBN). Given a mini-batch that is imbalanced between the current and previous tasks, TBBN first reshapes and repeats the batch to compute near task-balanced mean and variance. Second, we show that when the affine transformation parameters of BN are learned from the reshaped feature map, they become less biased toward the current task. Extensive CIL experiments on the CIFAR-100 and ImageNet-100 datasets demonstrate that TBBN is easily applicable to most existing exemplar-based CIL algorithms, improving their performance by reducing forgetting of previous tasks.
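To illustrate the core idea, here is a minimal sketch of computing task-balanced normalization statistics. It is a simplified assumption-laden toy (plain NumPy on flat feature vectors, with a hypothetical helper name `task_balanced_bn_stats`), not the authors' actual implementation, which operates on feature maps inside the network: the few exemplar samples from previous tasks are repeated so that the batch used for the mean and variance is near balanced between tasks, and the original batch is then normalized with those statistics.

```python
import numpy as np

def task_balanced_bn_stats(x_cur, x_prev, eps=1e-5):
    """Toy sketch of task-balanced statistics (hypothetical helper).

    x_cur:  (n_cur, d) features from the current task (many samples).
    x_prev: (n_prev, d) features from the exemplar memory (few samples).
    """
    # Repetition factor so previous-task samples roughly match the
    # current-task count in the batch used for statistics.
    r = max(1, round(len(x_cur) / max(len(x_prev), 1)))
    balanced = np.concatenate([x_cur, np.tile(x_prev, (r, 1))], axis=0)

    # Near task-balanced mean and variance.
    mean = balanced.mean(axis=0)
    var = balanced.var(axis=0)

    # Normalize the original (imbalanced) batch with balanced statistics.
    x = np.concatenate([x_cur, x_prev], axis=0)
    return (x - mean) / np.sqrt(var + eps), mean, var
```

With 90 current-task samples and 10 exemplars, plain BN statistics would be dominated by the current task; the repetition above pulls the mean and variance back toward a task-balanced estimate.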


Related research

- Continual Normalization: Rethinking Batch Normalization for Online Continual Learning (03/30/2022)
- Batch Normalization and the impact of batch structure on the behavior of deep convolution networks (02/21/2018)
- Diagnosing Batch Normalization in Class Incremental Learning (02/16/2022)
- Regularity Normalization: Constraining Implicit Space with Minimum Description Length (02/27/2019)
- Mode Normalization (10/12/2018)
- Group Normalization (03/22/2018)
- Attentive Normalization (08/04/2019)
