Lifelong Learning with Searchable Extension Units

03/19/2020
by Wenjin Wang, et al.

Lifelong learning remains an open problem, and one of its main difficulties is catastrophic forgetting. Many dynamic expansion approaches have been proposed to address this problem, but they all use homogeneous models with a predefined structure for every task. Sharing one original model and one expansion structure across tasks ignores the fact that different tasks may require different model structures; this yields a less compact multi-task model and causes model size to grow rapidly as the number of tasks increases. Moreover, a single fixed structure cannot perform best on all tasks. To address these problems, in this paper we propose a new lifelong learning framework named Searchable Extension Units (SEU), which introduces Neural Architecture Search into lifelong learning. SEU removes the need for a predefined original model and searches for task-specific extension units, without compromising performance on any task, so it obtains a much more compact model while avoiding catastrophic forgetting. Experimental results on PMNIST, split CIFAR10, split CIFAR100, and the Mixture dataset empirically show that our method achieves higher accuracy with a much smaller model, about 25-33 percent of the size of state-of-the-art methods.
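To make the idea concrete, below is a minimal sketch of the search-then-freeze pattern the abstract describes: for each new task, a handful of candidate extension units (here, toy conv blocks varying in width and kernel size) are briefly trained, the best one is kept together with a task-specific head, and its parameters are frozen so later tasks cannot overwrite it. All names (make_unit, search_extension_unit, the candidate space) are illustrative assumptions, not the authors' implementation, and the brief-training loss is a stand-in for a real NAS search objective.

import torch
import torch.nn as nn

def make_unit(in_ch, out_ch, kernel_size):
    # One candidate extension unit: a small conv block ending in a feature vector.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
    )

def evaluate_candidate(unit, head, loader, steps=50):
    # Briefly train a candidate unit plus its task head; the final loss serves
    # as a cheap stand-in for the validation score a real NAS loop would use.
    opt = torch.optim.Adam(list(unit.parameters()) + list(head.parameters()), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    last = float("inf")
    for step, (x, y) in enumerate(loader):
        if step >= steps:
            break
        opt.zero_grad()
        loss = loss_fn(head(unit(x)), y)
        loss.backward()
        opt.step()
        last = loss.item()
    return last

def search_extension_unit(loader, in_ch, num_classes, widths=(8, 16), kernels=(3, 5)):
    # Exhaustively score a tiny candidate space and keep the best unit/head pair.
    best = None
    for w in widths:
        for k in kernels:
            unit, head = make_unit(in_ch, w, k), nn.Linear(w, num_classes)
            score = evaluate_candidate(unit, head, loader)
            if best is None or score < best[0]:
                best = (score, unit, head)
    return best[1], best[2]

# Toy usage on random "task" data: search a unit for this task, then freeze it
# so that training on future tasks cannot cause catastrophic forgetting here.
xs, ys = torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(xs, ys), batch_size=32)
unit, head = search_extension_unit(loader, in_ch=1, num_classes=10)
for p in unit.parameters():
    p.requires_grad_(False)

An actual SEU search would use a richer operation space and a proper search strategy; the point of the sketch is only the per-task architecture search combined with freezing previously learned units.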


