Extensible Proxy for Efficient NAS

10/17/2022
by   Yuhong Li, et al.

Neural Architecture Search (NAS) has become a de facto approach in the recent trend of AutoML to design deep neural networks (DNNs). Efficient or near-zero-cost NAS proxies have been proposed to address the demanding computational cost of NAS, where each candidate architecture requires only one iteration of backpropagation. The values obtained from these proxies are treated as predictions of architecture performance on downstream tasks. However, two significant drawbacks hinder the wider use of efficient NAS proxies: (1) they are not adaptive to different search spaces, and (2) they are not extensible to multi-modality downstream tasks. Based on these observations, we design an Extensible proxy (Eproxy) that uses self-supervised, few-shot training (i.e., 10 iterations of backpropagation), which yields near-zero cost. The key component that makes Eproxy efficient is an untrainable convolution layer, termed the barrier layer, which adds non-linearities to the optimization space so that Eproxy can discriminate between the performance of architectures at an early stage. Furthermore, to make Eproxy adaptive to different downstream tasks and search spaces, we propose a Discrete Proxy Search (DPS) that finds optimized training settings for Eproxy using only a handful of benchmarked architectures on the target task. Our extensive experiments confirm the effectiveness of both Eproxy and Eproxy+DPS. Code is available at https://github.com/leeyeehoo/GenNAS-Zero.
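As a rough illustration of the mechanism described in the abstract, the sketch below scores a candidate network with a few iterations of self-supervised regression while passing its output through a frozen (untrainable) convolution acting as the barrier layer. The names (BarrierLayer, eproxy_score), the synthetic regression target, and the hyperparameters are illustrative assumptions only; the authors' actual implementation is in the linked repository.

```python
# Minimal sketch of the Eproxy idea: score a candidate architecture by a few
# iterations (~10) of self-supervised regression against a fixed target, with a
# frozen ("untrainable") convolution as the barrier layer. Details are assumed,
# not taken from the authors' code.
import torch
import torch.nn as nn

class BarrierLayer(nn.Module):
    """A convolution whose weights are frozen at random initialization."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        for p in self.conv.parameters():
            p.requires_grad = False  # untrainable: adds non-linearity to the optimization space

    def forward(self, x):
        return torch.relu(self.conv(x))

def eproxy_score(candidate, channels=16, steps=10, lr=1e-3):
    """Few-shot proxy: lower final regression loss -> higher predicted performance."""
    barrier = BarrierLayer(channels)
    x = torch.randn(8, channels, 32, 32)           # synthetic input batch (assumed)
    target = torch.randn(8, channels, 32, 32)      # fixed self-supervised target (assumed)
    opt = torch.optim.Adam(candidate.parameters(), lr=lr)
    for _ in range(steps):                         # ~10 iterations of backpropagation
        opt.zero_grad()
        loss = nn.functional.mse_loss(barrier(candidate(x)), target)
        loss.backward()
        opt.step()
    return -loss.item()  # negate so that a larger score means a better architecture

# Example: rank two toy candidates by their proxy scores
if __name__ == "__main__":
    cand_a = nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU())
    cand_b = nn.Sequential(nn.Conv2d(16, 16, 1), nn.ReLU())
    print(eproxy_score(cand_a), eproxy_score(cand_b))
```

In this reading, DPS would then search over discrete choices of such training settings (target construction, loss, number of steps, etc.) using a handful of benchmarked architectures, so the proxy correlates better with the target task.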


