FFNB: Forgetting-Free Neural Blocks for Deep Continual Visual Learning

11/22/2021
by Hichem Sahbi et al.

Deep neural networks (DNNs) have recently achieved great success in computer vision and several related fields. Despite this progress, current neural architectures still suffer from catastrophic interference (a.k.a. forgetting), which prevents DNNs from learning continually. While several state-of-the-art methods have been proposed to mitigate forgetting, existing solutions are either highly rigid (e.g., regularization-based) or time- and memory-demanding (e.g., replay-based). An intermediate class of methods, based on dynamic networks, provides a reasonable balance between task memorization and computational footprint. In this paper, we devise a dynamic network architecture for continual learning built on a novel forgetting-free neural block (FFNB). FFNB features are trained on new tasks using a novel procedure that constrains the underlying parameters to the null space of the previous tasks, while training the classifier parameters reduces to Fisher discriminant analysis. The latter yields an effective incremental process that is also optimal from a Bayesian perspective. The trained features and classifiers are further enhanced with incremental "end-to-end" fine-tuning. Extensive experiments, conducted on different challenging classification problems, show the high effectiveness of the proposed method.
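Two ingredients of the abstract lend themselves to a compact illustration: (i) confining feature updates to the null space of previous tasks, and (ii) obtaining classifier weights in closed form via Fisher discriminant analysis. The PyTorch sketch below is a minimal reading of these ideas under standard assumptions (a single linear block, an eigendecomposition-based projector, equal class priors); it is not the authors' reference implementation, and the function names, the eps threshold, and the ridge term reg are illustrative choices rather than details taken from the paper.

    import torch

    def nullspace_projector(cov: torch.Tensor, eps: float = 1e-3) -> torch.Tensor:
        # Projector onto the (approximate) null space of the uncentered
        # feature covariance accumulated over previous tasks; eigenvalues
        # below eps are treated as null directions.
        eigvals, eigvecs = torch.linalg.eigh(cov)       # ascending order
        null_basis = eigvecs[:, eigvals < eps]
        return null_basis @ null_basis.T                # (dim, dim)

    def constrained_step(W: torch.nn.Parameter, proj: torch.Tensor, lr: float):
        # For a linear block y = W x, projecting the gradient onto the null
        # space leaves W x unchanged for any x spanned by earlier tasks.
        with torch.no_grad():
            W -= lr * (W.grad @ proj)
            W.grad = None

    def fda_classifier(means: torch.Tensor, Sw: torch.Tensor, reg: float = 1e-4):
        # Closed-form Fisher/LDA weights from per-class means (n_classes, dim)
        # and a shared within-class scatter Sw; both statistics can be updated
        # incrementally as new classes arrive. Equal class priors are assumed.
        eye = torch.eye(Sw.shape[0], dtype=Sw.dtype, device=Sw.device)
        Sw_inv = torch.linalg.inv(Sw + reg * eye)
        W = means @ Sw_inv                              # (n_classes, dim)
        b = -0.5 * (W * means).sum(dim=1)               # per-class bias
        return W, b                                     # scores for x: x @ W.T + b

In this reading, the running covariance would be updated after each task with the new features (cov += feats.T @ feats), so the projector used for the next task accounts for everything seen so far.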

