MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning

03/31/2020
by Yuan Gao, et al.

We propose to incorporate neural architecture search (NAS) into general-purpose multi-task learning (GP-MTL). Existing NAS methods typically define different search spaces for different tasks. To adapt to different task combinations (i.e., task sets), we disentangle GP-MTL networks into single-task backbones (which optionally encode task priors) and a hierarchical, layerwise feature sharing/fusing scheme across them. This enables us to design a novel and general task-agnostic search space, which inserts cross-task edges (i.e., feature fusion connections) into fixed single-task network backbones. We also propose a novel single-shot gradient-based search algorithm that closes the performance gap between the searched architectures and the final evaluation architecture. This is realized with a minimum entropy regularization on the architecture weights during the search phase, which drives the architecture weights toward near-discrete values and therefore yields a single model. As a result, the searched model can be used directly for evaluation without (re-)training from scratch. We perform extensive experiments using different single-task backbones on various task sets, demonstrating the promising performance obtained by exploiting the hierarchical and layerwise features, as well as the desirable generalizability to different i) task sets and ii) single-task backbones. The code of our paper is available at https://github.com/bhpfelix/MTLNAS.
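To illustrate the minimum entropy regularization on architecture weights, here is a minimal PyTorch sketch. All names, shapes, and the regularization strength below are illustrative assumptions, not the authors' implementation: the entropy of the softmax-normalized weights on each candidate cross-task edge is penalized, so each edge's distribution collapses toward a near-discrete (one-hot) choice during the search.

    import torch
    import torch.nn.functional as F

    # Hypothetical architecture logits for 10 candidate cross-task edges,
    # each with a binary keep/drop choice (shape and names are assumptions).
    arch_logits = torch.randn(10, 2, requires_grad=True)

    def entropy_regularizer(logits):
        # Mean entropy of the per-edge distributions; minimizing it pushes
        # the architecture weights toward near-discrete values.
        log_probs = F.log_softmax(logits, dim=-1)
        probs = log_probs.exp()
        return -(probs * log_probs).sum(dim=-1).mean()

    optimizer = torch.optim.Adam([arch_logits], lr=3e-4)

    # One toy search step; in practice task_loss would be the multi-task
    # loss of the network built from the current (soft) cross-task edges.
    task_loss = torch.tensor(0.0)   # placeholder for the actual MTL loss
    lambda_ent = 0.1                # regularization strength (assumed)
    loss = task_loss + lambda_ent * entropy_regularizer(arch_logits)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because the edge distributions are near-discrete by the end of the search, the discretized (argmax) architecture coincides with the one actually optimized, which is why the searched model can be evaluated without retraining from scratch.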
