Continual and Multi-Task Architecture Search

06/12/2019
by Ramakanth Pasunuru et al.
Architecture search is the process of automatically learning the neural model or cell structure that best suits a given task. Recently, this approach has shown promising performance improvements (on language modeling and image classification) with reasonable training speed, using a weight-sharing strategy called Efficient Neural Architecture Search (ENAS). In our work, we first introduce a novel continual architecture search (CAS) approach that continually evolves the model parameters during the sequential training of several tasks without losing performance on previously learned tasks (via block-sparsity and orthogonality constraints), thus enabling lifelong learning. Next, we explore a multi-task architecture search (MAS) approach over ENAS for finding a unified, single cell structure that performs well across multiple tasks (via joint controller rewards), and hence allows more generalizable transfer of the cell-structure knowledge to an unseen new task. We empirically show the effectiveness of our sequential continual-learning and parallel multi-task-learning based architecture search approaches on diverse sentence-pair classification tasks (GLUE) and multimodal-generation based video captioning tasks. Further, we present several ablations and analyses of the learned cell structures.
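To make the two key ingredients concrete, the following is a minimal numpy sketch, not the authors' implementation: a CAS-style regularizer combining a block-sparsity (group-L1) penalty on the new task's weight delta with an orthogonality penalty against previously learned weights, plus a MAS-style joint controller reward that simply averages per-task rewards. All function names and the penalty weights `lam_sparse`/`lam_orth` are hypothetical placeholders for illustration.

```python
import numpy as np

def cas_penalties(w_prev, w_delta, lam_sparse=1e-3, lam_orth=1e-3):
    """Illustrative CAS-style regularizers (hypothetical API):
    - block sparsity: group-L1 over columns of the new-task delta,
      pushing most columns to stay exactly zero;
    - orthogonality: penalize overlap between the delta and the
      subspace spanned by the previously learned weights."""
    # group-(2,1) norm: sum of per-column L2 norms of the delta
    block_sparsity = np.sum(np.linalg.norm(w_delta, axis=0))
    # squared Frobenius norm of the cross-term w_prev^T w_delta
    orthogonality = np.sum((w_prev.T @ w_delta) ** 2)
    return lam_sparse * block_sparsity + lam_orth * orthogonality

def mas_joint_reward(task_rewards):
    """MAS-style joint reward: average the validation rewards of all
    tasks so the controller is pushed toward cells that do well on
    every task, not just one."""
    return float(np.mean(task_rewards))

rng = np.random.default_rng(0)
w_prev = rng.standard_normal((8, 4))
# build a delta orthogonal to w_prev: QR of [w_prev | random] makes the
# last columns orthonormal to span(w_prev), so the cross-term vanishes
q, _ = np.linalg.qr(np.concatenate([w_prev, rng.standard_normal((8, 4))], axis=1))
w_delta = q[:, 4:]
```

With a zero delta both penalties vanish, and with the orthogonal delta above only the block-sparsity term contributes; during sequential training, minimizing this penalty keeps the new task's updates from interfering with the subspace used by earlier tasks.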

