eTag: Class-Incremental Learning with Embedding Distillation and Task-Oriented Generation

04/20/2023
by Libo Huang, et al.

Class-Incremental Learning (CIL) aims to solve neural networks' catastrophic forgetting problem: once a network is updated on a new task, its performance on previously learned tasks drops dramatically. Most successful CIL methods either incrementally train a feature extractor with the aid of stored exemplars, or estimate the feature distribution from stored prototypes. However, storing exemplars raises data-privacy concerns, and stored prototypes may not be consistent with a proper feature distribution; both issues hinder real-world CIL applications. In this paper, we propose embedding distillation and Task-oriented generation (eTag) for CIL, which requires neither exemplars nor prototypes; instead, eTag trains the neural network incrementally in a data-free manner. To prevent the feature extractor from forgetting, eTag distills the embeddings of the network's intermediate blocks. Additionally, eTag trains a generative network to produce features that fit the needs of the top incremental classifier. Experimental results confirm that the proposed eTag considerably outperforms state-of-the-art methods on CIFAR-100 and ImageNet-sub. (Our code is available in the Supplementary Materials.)
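The embedding-distillation idea sketched in the abstract can be illustrated in a few lines. The following is a minimal NumPy toy, not the authors' implementation: a frozen "old" extractor and a trainable "new" extractor are run block by block on the same input, and a mean-squared penalty is summed over all intermediate-block embeddings (the block structure, `tanh` nonlinearity, and function names are assumptions for illustration only).

```python
import numpy as np

def block_embeddings(blocks, x):
    """Run the input through a list of blocks, collecting each block's output."""
    feats = []
    h = x
    for w in blocks:
        h = np.tanh(h @ w)  # toy block: linear map plus nonlinearity
        feats.append(h)
    return feats

def embedding_distillation_loss(old_blocks, new_blocks, x):
    """Sum of per-block MSEs between the frozen old network's intermediate
    embeddings and the new network's (hypothetical sketch of the idea)."""
    old_feats = block_embeddings(old_blocks, x)  # treated as fixed targets
    new_feats = block_embeddings(new_blocks, x)
    return sum(np.mean((o - n) ** 2) for o, n in zip(old_feats, new_feats))

rng = np.random.default_rng(0)
old = [rng.standard_normal((8, 8)) for _ in range(3)]
new = [w + 0.1 * rng.standard_normal((8, 8)) for w in old]
x = rng.standard_normal((4, 8))

print(embedding_distillation_loss(old, old, x))  # 0.0 for identical networks
print(embedding_distillation_loss(old, new, x))  # positive once weights drift
```

Penalizing every intermediate block, rather than only the final feature, constrains the whole extractor to stay close to its previous state while the new task is learned.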


