Neural Architecture Search for Class-incremental Learning

09/14/2019
by Shenyang Huang, et al.

In class-incremental learning, a model learns continuously from a sequential data stream in which new classes occur. Existing methods often rely on static, manually crafted architectures. These methods are prone to capacity saturation because a neural network's ability to generalize to new concepts is limited by its fixed capacity. To understand how to expand a continual learner, we focus on the neural architecture design problem in the context of class-incremental learning: at each time step, the learner must optimize its performance on all classes observed so far by selecting the most competitive neural architecture. To tackle this problem, we propose Continual Neural Architecture Search (CNAS): an AutoML approach that takes advantage of the sequential nature of class-incremental learning to efficiently and adaptively identify strong architectures in a continual learning setting. We employ a task network to perform the classification task and a reinforcement learning agent as the meta-controller for architecture search. In addition, we apply network transformations to transfer weights from the previous learning step and to reduce the size of the architecture search space, thus saving substantial computational resources. We evaluate CNAS on the CIFAR-100 dataset under varied incremental learning scenarios with limited computational power (1 GPU). Experimental results demonstrate that CNAS outperforms architectures that are optimized for the entire dataset. In addition, CNAS is at least an order of magnitude more efficient than naively applying existing AutoML methods.
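To make the moving parts concrete, here is a minimal, self-contained Python sketch of what such an outer loop could look like: at each incremental step, a meta-controller proposes a network transformation, the transformed task network is evaluated on all classes seen so far, and the better architecture is kept. Everything specific here is an illustrative assumption, not CNAS's actual design: the action space (keep/wider/deeper), the toy epsilon-greedy value table standing in for the paper's reinforcement-learning agent, and the evaluate() stub are all hypothetical, and the function-preserving weight transfer is only noted in comments, not implemented.

```python
"""Sketch of a CNAS-style outer loop, based only on the abstract.

Assumptions (not from the paper): architectures are encoded as lists of
hidden-layer widths, the controller is a toy epsilon-greedy agent, and
evaluate() is a stub in place of actually training a task network.
"""
import random

ACTIONS = ("keep", "wider", "deeper")


def transform(arch, action):
    """Return a new architecture encoding after one transformation.

    In CNAS the transformation also transfers weights from the previous
    task network (function-preserving); that weight surgery is omitted.
    """
    arch = list(arch)
    if action == "wider":
        i = random.randrange(len(arch))
        arch[i] *= 2                # double one layer's width
    elif action == "deeper":
        i = random.randrange(len(arch))
        arch.insert(i, arch[i])     # duplicate a layer
    return arch


def evaluate(arch, classes_seen):
    """Stub: train the task network on all classes seen so far and return
    validation accuracy. Here, a toy score that rewards capacity and
    lightly penalizes depth."""
    capacity = sum(arch)
    return min(1.0, capacity / (50.0 * classes_seen)) - 0.001 * len(arch)


def cnas(num_steps, classes_per_step=10, epsilon=0.2):
    arch = [64, 64]                       # initial hand-designed task network
    q = {a: 0.0 for a in ACTIONS}         # toy action-value table (controller)
    for t in range(1, num_steps + 1):
        classes_seen = t * classes_per_step
        # The meta-controller picks a transformation; CNAS uses an RL agent
        # over a richer action space.
        action = (random.choice(ACTIONS) if random.random() < epsilon
                  else max(q, key=q.get))
        candidate = transform(arch, action)
        reward = evaluate(candidate, classes_seen)
        baseline = evaluate(arch, classes_seen)
        q[action] += 0.5 * ((reward - baseline) - q[action])
        if reward >= baseline:            # keep the stronger architecture
            arch = candidate
        print(f"step {t}: classes={classes_seen} arch={arch} "
              f"acc={max(reward, baseline):.3f}")
    return arch


if __name__ == "__main__":
    random.seed(0)
    cnas(num_steps=5)
```

Because each step starts from the previous architecture and only considers local transformations, the search space per step stays small, which is where the claimed efficiency over running a full AutoML search from scratch at every step would come from.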


Related research

06/07/2020 · Efficient Architecture Search for Continual Learning
Continual learning with neural networks is an important learning framewo...

06/18/2019 · A Study of the Learning Progress in Neural Architecture Search Techniques
In neural architecture search, the structure of the neural network to be...

04/14/2021 · Neural Architecture Search of Deep Priors: Towards Continual Learning without Catastrophic Interference
In this paper we analyze the classification performance of neural networ...

06/07/2018 · Path-Level Network Transformation for Efficient Architecture Search
We introduce a new function-preserving transformation for efficient neur...

11/11/2021 · Learning from Mistakes – A Framework for Neural Architecture Search
Learning from one's mistakes is an effective human learning technique wh...

07/01/2021 · AdaXpert: Adapting Neural Architecture for Growing Data
In real-world applications, data often come in a growing manner, where t...

12/01/2021 · Learning from Mistakes based on Class Weighting with Application to Neural Architecture Search
Learning from mistakes is an effective learning approach widely used in ...
