Catastrophic forgetting: still a problem for DNNs

05/20/2019
by B. Pfülb, et al.

We investigate the performance of DNNs when trained on class-incremental visual problems consisting of initial training, followed by retraining with added visual classes. Catastrophic forgetting (CF) behavior is measured using a new evaluation procedure that aims at an application-oriented view of incremental learning. In particular, it imposes that model selection be performed on the initial dataset alone, and that retraining control be performed using the retraining dataset only, since the initial dataset is usually too large to be kept. Experiments are conducted on class-incremental problems derived from MNIST, using a variety of different DNN models, some of them recently proposed to avoid catastrophic forgetting. When we compare our new evaluation procedure to previous approaches for assessing CF, the findings of those approaches are completely negated: none of the tested methods avoids CF in all experiments. This stresses the importance of a realistic empirical measurement procedure for catastrophic forgetting, and the need for further research on incremental learning for DNNs.
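
To make the evaluation protocol concrete, here is a minimal sketch in PyTorch of a two-task class-incremental MNIST experiment: train on an initial class subset D1, retrain on a disjoint subset D2 without access to D1, and measure how test accuracy on D1 degrades. The 5+5 class split, the network architecture, and all hyperparameters are illustrative assumptions, not the authors' exact experimental setup.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms

def class_subset(dataset, classes):
    # Select only the examples whose label is in the given class set.
    mask = torch.isin(dataset.targets, torch.tensor(sorted(classes)))
    return Subset(dataset, mask.nonzero(as_tuple=True)[0].tolist())

def train(model, loader, epochs=1):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

def accuracy(model, loader):
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            correct += (model(x).argmax(1) == y).sum().item()
            total += y.numel()
    return correct / total

tfm = transforms.ToTensor()
train_set = datasets.MNIST("data", train=True, download=True, transform=tfm)
test_set = datasets.MNIST("data", train=False, download=True, transform=tfm)

# Assumed split: D1 = classes 0-4 (initial task), D2 = classes 5-9 (added classes).
d1_classes, d2_classes = set(range(5)), set(range(5, 10))
d1_train = DataLoader(class_subset(train_set, d1_classes), batch_size=128, shuffle=True)
d2_train = DataLoader(class_subset(train_set, d2_classes), batch_size=128, shuffle=True)
d1_test = DataLoader(class_subset(test_set, d1_classes), batch_size=256)

# A single output head over all 10 classes, as in a class-incremental setup.
model = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

train(model, d1_train, epochs=1)   # initial training: model selection would use D1 only
acc_before = accuracy(model, d1_test)
train(model, d2_train, epochs=1)   # retraining: control uses D2 only, D1 assumed unavailable
acc_after = accuracy(model, d1_test)
print(f"D1 accuracy before retraining: {acc_before:.3f}, after: {acc_after:.3f}")

The drop from acc_before to acc_after quantifies forgetting on the initial task; under the paper's protocol, any hyperparameter tuning during the retraining phase would be driven by D2 alone.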


Related research:
- A comprehensive, application-oriented study of catastrophic forgetting in DNNs (05/20/2019)
- Continuous Learning in a Single-Incremental-Task Scenario with Spike Features (05/03/2020)
- An Analysis of Initial Training Strategies for Exemplar-Free Class-Incremental Learning (08/22/2023)
- Utilizing Priming to Identify Optimal Class Ordering to Alleviate Catastrophic Forgetting (12/24/2022)
- Diagnosing Batch Normalization in Class Incremental Learning (02/16/2022)
- Progressive Voronoi Diagram Subdivision: Towards A Holistic Geometric Framework for Exemplar-free Class-Incremental Learning (07/28/2022)
- FFNB: Forgetting-Free Neural Blocks for Deep Continual Visual Learning (11/22/2021)
