Efficient Architecture Search for Continual Learning

06/07/2020
by Qiang Gao, et al.

Continual learning with neural networks is an important learning framework in AI that aims to learn a sequence of tasks well. However, it faces three challenges: (1) overcoming the catastrophic forgetting problem, (2) adapting the current network to new tasks, and meanwhile (3) controlling model complexity. To reach these goals, we propose a novel approach named Continual Learning with Efficient Architecture Search, or CLEAS for short. CLEAS works closely with neural architecture search (NAS), leveraging reinforcement learning techniques to search for the best neural architecture that fits a new task. In particular, we design a neuron-level NAS controller that decides which old neurons from previous tasks should be reused (knowledge transfer) and which new neurons should be added (to learn new knowledge). Such a fine-grained controller allows one to find a very concise architecture that fits each new task well. Meanwhile, since we do not alter the weights of the reused neurons, the knowledge learned from previous tasks is preserved exactly. We evaluate CLEAS on numerous sequential classification tasks, and the results demonstrate that CLEAS outperforms other state-of-the-art methods, achieving higher classification accuracy while using simpler neural architectures.
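To make the neuron-level controller idea concrete, the sketch below illustrates, under assumptions, how per-neuron reuse decisions and new-neuron additions could be assembled into a task-specific layer. All names (`controller_decisions`, `build_task_architecture`) and the sampling scheme are hypothetical illustrations, not the paper's actual implementation; in CLEAS the decisions would come from a trained RL controller rather than random sampling.

```python
import random

def controller_decisions(num_old_neurons, reuse_prob=0.7, max_new=4, rng=None):
    """Sample a per-neuron reuse mask and a count of new neurons to add.

    Hypothetical stand-in for the RL controller's policy: in CLEAS these
    decisions would be sampled from a learned policy and reinforced by a
    reward (e.g., validation accuracy minus a complexity penalty).
    """
    rng = rng or random.Random(0)
    reuse_mask = [rng.random() < reuse_prob for _ in range(num_old_neurons)]
    num_new = rng.randrange(max_new + 1)  # how many fresh neurons to grow
    return reuse_mask, num_new

def build_task_architecture(old_neuron_ids, reuse_mask, num_new):
    """Assemble the new task's layer from frozen reused neurons plus
    trainable new ones, so old-task knowledge is never overwritten."""
    layer = []
    for neuron_id, reused in zip(old_neuron_ids, reuse_mask):
        if reused:
            # Reused neuron: weights are frozen (knowledge transfer without
            # catastrophic forgetting).
            layer.append({"id": neuron_id, "trainable": False})
    for i in range(num_new):
        # New neuron: trainable, dedicated to learning the new task.
        layer.append({"id": f"new_{i}", "trainable": True})
    return layer
```

In this reading, only the `trainable: True` units receive gradient updates on the new task, which is what keeps previously learned knowledge intact while still allowing the network to grow just enough capacity for each task.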


Related research:

05/31/2018 · Reinforced Continual Learning
Most artificial intelligence models have limiting ability to solve new t...

06/12/2019 · Continual and Multi-Task Architecture Search
Architecture search is the process of automatically learning the neural ...

02/17/2021 · Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks
We propose firefly neural architecture descent, a general framework for ...

09/14/2019 · Neural Architecture Search for Class-incremental Learning
In class-incremental learning, a model learns continuously from a sequen...

08/03/2023 · Efficient Model Adaptation for Continual Learning at the Edge
Most machine learning (ML) systems assume stationary and matching data d...

04/14/2021 · Neural Architecture Search of Deep Priors: Towards Continual Learning without Catastrophic Interference
In this paper we analyze the classification performance of neural networ...

06/11/2022 · A Review on Plastic Artificial Neural Networks: Exploring the Intersection between Neural Architecture Search and Continual Learning
Despite the significant advances achieved in Artificial Neural Networks ...
