Class-incremental Learning via Deep Model Consolidation

03/19/2019
by   Junting Zhang, et al.

Deep neural networks (DNNs) often suffer from "catastrophic forgetting" during incremental learning (IL): an abrupt degradation of performance on the original set of classes when the training objective is adapted to a newly added set of classes. Existing IL approaches that attempt to overcome catastrophic forgetting tend to produce a model that is biased towards either the old classes or the new classes, unless they have access to exemplars of the old data. To address this issue, we propose a class-incremental learning paradigm called Deep Model Consolidation (DMC), which works well even when the original training data is not available. The idea is to first train a separate model only for the new classes, and then combine the two individual models, trained on data of two distinct sets of classes (old classes and new classes), via a novel dual distillation training objective. The two models are consolidated by exploiting publicly available unlabeled auxiliary data, which overcomes the potential difficulties caused by the unavailability of the original training data. Compared to state-of-the-art techniques, DMC demonstrates significantly better performance on the CIFAR-100 image classification and PASCAL VOC 2007 object detection benchmarks in the IL setting.
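
The abstract describes consolidating two frozen teachers (an old-class model and a new-class model) into a single student by distilling both onto unlabeled auxiliary data. The following is a minimal sketch of how such a dual-distillation step could look; the function and variable names (`dual_distillation_loss`, `consolidation_step`, `old_model`, `new_model`, `student`) are hypothetical and not taken from the paper's code, and details such as the exact logit normalization are assumptions for illustration.

```python
# Hypothetical sketch of a dual-distillation consolidation step (not the authors' code).
# Two frozen teachers -- one trained on the old classes, one on the new classes --
# jointly supervise a combined student on unlabeled auxiliary images.
import torch
import torch.nn.functional as F

def dual_distillation_loss(student_logits, old_logits, new_logits):
    """L2 regression of the student's logits onto the concatenated,
    per-sample zero-centered logits of the two teachers.
    Shapes: student_logits (B, n_old + n_new); old_logits (B, n_old); new_logits (B, n_new)."""
    # Zero-center each teacher's logits per sample so the two output scales are comparable
    # (an assumed normalization choice for this sketch).
    targets = torch.cat([
        old_logits - old_logits.mean(dim=1, keepdim=True),
        new_logits - new_logits.mean(dim=1, keepdim=True),
    ], dim=1)
    return F.mse_loss(student_logits, targets)

def consolidation_step(student, old_model, new_model, aux_images, optimizer):
    """One training step on a batch of unlabeled auxiliary images."""
    with torch.no_grad():                      # teachers are frozen
        old_logits = old_model(aux_images)     # scores for the old classes
        new_logits = new_model(aux_images)     # scores for the new classes
    student_logits = student(aux_images)       # scores for old + new classes
    loss = dual_distillation_loss(student_logits, old_logits, new_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The key point this sketch illustrates is that no labeled data from either task is required: the auxiliary images only need to elicit informative responses from the two teachers, and the student learns to reproduce both sets of responses simultaneously.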

