Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting

03/31/2019
by   Xilai Li, et al.

Addressing catastrophic forgetting is one of the key challenges in continual learning, where machine learning systems are trained on sequential or streaming tasks. Despite remarkable recent progress in deep learning, deep neural networks (DNNs) are still plagued by catastrophic forgetting. This paper presents a conceptually simple yet general and effective framework for handling catastrophic forgetting in continual learning with DNNs. The proposed method consists of two components: a neural structure optimization component and a parameter learning and/or fine-tuning component. The former learns the best neural structure for the current task on top of the current DNN trained on previous tasks: under a differentiable neural architecture search framework, it learns whether to reuse or adapt building blocks in the current DNN, or to create new ones if needed. The latter estimates parameters for newly introduced structures and fine-tunes the old ones if preferred. By separating explicit neural structure learning from parameter estimation, the proposed method not only evolves neural structures in an intuitively meaningful way but also shows a strong ability to alleviate catastrophic forgetting in experiments. Furthermore, it outperforms all baselines on the permuted MNIST, split CIFAR-100, and Visual Domain Decathlon datasets in the continual learning setting.
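
To make the two components concrete, below is a minimal, hypothetical PyTorch sketch of the per-layer search the abstract describes; it is not the authors' released implementation, and all names (LayerChoice, alpha, the toy network) are illustrative assumptions. Each layer of the previously trained network is wrapped in a module that mixes a frozen "reuse" copy, a tunable "adapt" copy, and a freshly initialized "new" layer, with a softmax over architecture logits making the structural choice differentiable in the DARTS style; after the search, one would keep only the highest-weight option per layer.

```python
# Hypothetical sketch of "reuse / adapt / new" structure search; names are
# illustrative assumptions, not the paper's actual code.
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class LayerChoice(nn.Module):
    """One searchable layer: softmax-weighted mix of reuse, adapt, new."""

    def __init__(self, old_layer: nn.Module):
        super().__init__()
        # 'adapt' starts from the old weights but stays trainable.
        self.adapt = copy.deepcopy(old_layer)
        # 'new' has the same shape but freshly initialized parameters.
        self.new = copy.deepcopy(old_layer)
        for p in self.new.parameters():
            nn.init.normal_(p, std=0.01)
        # 'reuse' shares the old layer and is frozen to protect old tasks.
        self.reuse = old_layer
        for p in self.reuse.parameters():
            p.requires_grad_(False)
        # Architecture logits over (reuse, adapt, new), trained by gradient.
        self.alpha = nn.Parameter(torch.zeros(3))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return w[0] * self.reuse(x) + w[1] * self.adapt(x) + w[2] * self.new(x)


if __name__ == "__main__":
    # Toy "previous-task" network; its Linear layers get wrapped for search.
    old = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
    net = nn.Sequential(*[LayerChoice(m) if isinstance(m, nn.Linear) else m
                          for m in old])

    arch = [m.alpha for m in net.modules() if isinstance(m, LayerChoice)]
    weights = [p for p in net.parameters()
               if p.requires_grad and all(p is not a for a in arch)]
    opt_w = torch.optim.SGD(weights, lr=0.01)   # parameter estimation
    opt_a = torch.optim.Adam(arch, lr=3e-3)     # structure optimization

    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    for _ in range(3):
        # Weight step on a training batch.
        opt_w.zero_grad()
        F.cross_entropy(net(x), y).backward()
        opt_w.step()
        # Architecture step; a real run would use a held-out batch here.
        opt_a.zero_grad()
        F.cross_entropy(net(x), y).backward()
        opt_a.step()

    # Discretize: keep the strongest option per layer.
    for i, lc in enumerate([m for m in net.modules()
                            if isinstance(m, LayerChoice)]):
        print(f"layer {i}:", ["reuse", "adapt", "new"][int(lc.alpha.argmax())])
```

Alternating the weight and architecture updates mirrors the bi-level optimization common in differentiable architecture search; freezing the "reuse" path is what protects parameters shared with earlier tasks.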

Related research

04/08/2023 · A multifidelity approach to continual learning for physical systems
We introduce a novel continual learning method based on multifidelity de...

11/22/2021 · FFNB: Forgetting-Free Neural Blocks for Deep Continual Visual Learning
Deep neural networks (DNNs) have recently achieved a great success in co...

03/19/2020 · Lifelong Learning with Searchable Extension Units
Lifelong learning remains an open problem. One of its main difficulties ...

09/19/2019 · ContCap: A comprehensive framework for continual image captioning
While cutting-edge image captioning systems are increasingly describing ...

02/17/2021 · Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks
We propose firefly neural architecture descent, a general framework for ...

04/10/2022 · Edge Continual Learning for Dynamic Digital Twins over Wireless Networks
Digital twins (DTs) constitute a critical link between the real-world an...

12/03/2019 · Overcoming Catastrophic Forgetting by Generative Regularization
In this paper, we propose a new method to overcome catastrophic forgetti...
