DILF-EN framework for Class-Incremental Learning

12/23/2021
by Mohammed Asad Karim, et al.

In the class-incremental learning setting, deep learning models catastrophically forget classes from earlier phases as they are trained on the classes introduced in each new phase. In this work, we show, for the first time, that the effect of catastrophic forgetting on a model's prediction varies with the orientation of the same image. Based on this finding, we propose a novel data-ensemble approach that combines the predictions for different orientations of an image, helping the model retain more information about previously seen classes and thereby reducing the effect of forgetting on its predictions. However, the data-ensemble approach cannot be applied directly to a model trained with traditional techniques. We therefore also propose a novel dual-incremental learning framework that jointly trains the network with two incremental learning objectives: the class-incremental learning objective and our proposed data-incremental learning objective. In this framework, each image belongs to two classes: its image class (for class-incremental learning) and its orientation class (for data-incremental learning). In class-incremental learning, each new phase introduces a new set of classes, and the model cannot access the complete training data from earlier phases. In our proposed data-incremental learning, the orientation classes remain the same across all phases, and the data introduced in each new phase of class-incremental learning serves as new training data for these orientation classes. We empirically demonstrate that the dual-incremental learning framework is vital to the data-ensemble approach. Applying our approach to state-of-the-art class-incremental learning methods, we empirically show that it significantly improves their performance.
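The two ideas in the abstract can be illustrated with a minimal numpy sketch. The paper itself does not publish this code; the toy linear "network", the head shapes, and the loss weight `lam` are all assumptions made purely for illustration. The sketch shows (a) a dual objective that sums a cross-entropy loss over image classes and one over orientation classes, and (b) test-time data ensembling that averages the image-class softmax over the four 90-degree rotations of an input.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, label):
    return -np.log(probs[label] + 1e-12)

# Hypothetical stand-in for the trained backbone: a fixed random linear
# map from pixels to two heads (10 image classes, 4 orientation classes).
RNG = np.random.default_rng(0)
W_CLS = RNG.normal(size=(9, 10))   # image-class head
W_ORI = RNG.normal(size=(9, 4))    # orientation-class head

def forward(image):
    x = image.ravel()
    return x @ W_CLS, x @ W_ORI

def dual_loss(image, image_label, orientation_label, lam=1.0):
    # Dual-incremental objective: class-incremental CE on the image class
    # plus data-incremental CE on the orientation class (lam is an
    # assumed weighting, not taken from the paper).
    cls_logits, ori_logits = forward(image)
    return (cross_entropy(softmax(cls_logits), image_label)
            + lam * cross_entropy(softmax(ori_logits), orientation_label))

def ensemble_predict(image):
    # Data ensemble at test time: average the image-class softmax
    # over the four 90-degree rotations of the input.
    probs = [softmax(forward(np.rot90(image, k))[0]) for k in range(4)]
    return np.mean(probs, axis=0)

img = np.arange(9.0).reshape(3, 3)
p = ensemble_predict(img)                                   # (10,) probabilities
loss = dual_loss(img, image_label=3, orientation_label=0)   # scalar
```

The key dependency between the two pieces, per the abstract, is that the ensemble only helps when the network has also been trained on the orientation objective, since otherwise its predictions on rotated inputs are not meaningful.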

Related research

- Class-incremental Learning via Deep Model Consolidation (03/19/2019). Deep neural networks (DNNs) often suffer from "catastrophic forgetting" ...
- Class Impression for Data-free Incremental Learning (06/26/2022). Standard deep learning-based classification approaches require collectin...
- Mimicking the Oracle: An Initial Phase Decorrelation Approach for Class Incremental Learning (12/09/2021). Class Incremental Learning (CIL) aims at learning a multi-class classifi...
- ACIL: Analytic Class-Incremental Learning with Absolute Memorization and Privacy Protection (05/30/2022). Class-incremental learning (CIL) learns a classification model with trai...
- Alleviate Representation Overlapping in Class Incremental Learning by Contrastive Class Concentration (07/26/2021). The challenge of the Class Incremental Learning (CIL) lies in difficulty...
- Deep Incremental Boosting (08/11/2017). This paper introduces Deep Incremental Boosting, a new technique derived...
- Attaining Class-level Forgetting in Pretrained Model using Few Samples (10/19/2022). In order to address real-world problems, deep learning models are jointl...
