FOSTER: Feature Boosting and Compression for Class-Incremental Learning

04/10/2022
by Fu-Yun Wang, et al.

The ability to continually learn new concepts is essential in an ever-changing world, yet deep neural networks suffer from catastrophic forgetting when learning new categories. Many methods have been proposed to alleviate this phenomenon, but most either fall into the stability-plasticity dilemma or incur excessive computation or storage overhead. Inspired by gradient boosting, which gradually fits the residuals between the target and the current approximation, we propose FOSTER, a novel two-stage learning paradigm that empowers the model to learn new categories adaptively. Specifically, we first dynamically expand new modules to fit the residuals between the target and the output of the original model. Next, we remove redundant parameters and feature dimensions through an effective distillation strategy, maintaining a single backbone model. We validate FOSTER on CIFAR-100 and ImageNet-100/1000 under different settings, and experimental results show that it achieves state-of-the-art performance.
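The gradient-boosting principle the abstract invokes — repeatedly fitting a new weak learner to the residuals between the target and the current ensemble — can be illustrated with a minimal, self-contained sketch. This is a toy 1-D regression with decision stumps, not FOSTER itself; the function names (`fit_stump`, `gradient_boost`) and all hyperparameters are illustrative assumptions, not part of the paper.

```python
import numpy as np

def fit_stump(x, residual):
    """Fit a depth-1 regression tree (stump) to the current residuals."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        lm, rm = left.mean(), right.mean()
        err = ((left - lm) ** 2).sum() + ((right - rm) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda q: np.where(q <= t, lm, rm)

def gradient_boost(x, y, n_rounds=20, lr=0.5):
    """Boosted ensemble: each round, a new stump fits what is still wrong."""
    pred = np.full_like(y, y.mean())     # initial constant approximation
    for _ in range(n_rounds):
        residual = y - pred              # gap between target and ensemble
        stump = fit_stump(x, residual)   # new module fits that gap
        pred = pred + lr * stump(x)      # ensemble absorbs the correction
    return pred

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.1 * rng.normal(size=200)
pred = gradient_boost(x, y)
mse = float(np.mean((y - pred) ** 2))
```

FOSTER's first stage plays an analogous role in classification: a newly expanded module is trained to fit the residual between the target and the frozen original model, and the second stage then distills the two-branch model back into a single backbone.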

Related research:
- SRIL: Selective Regularization for Class-Incremental Learning (05/09/2023)
- Maintaining Discrimination and Fairness in Class Incremental Learning (11/16/2019)
- FearNet: Brain-Inspired Model for Incremental Learning (11/28/2017)
- Incremental Learning with Unlabeled Data in the Wild (03/29/2019)
- Density Map Distillation for Incremental Object Counting (04/11/2023)
- Learning with Long-term Remembering: Following the Lead of Mixed Stochastic Gradient (09/25/2019)
- Lifelong Learning Process: Self-Memory Supervising and Dynamically Growing Networks (04/27/2020)
