Generic Neural Architecture Search via Regression

08/04/2021
by Yuhong Li, et al.

Most existing neural architecture search (NAS) algorithms are dedicated to specific downstream tasks, e.g., image classification in computer vision. However, extensive experiments have shown that prominent neural architectures, such as ResNet in computer vision and LSTM in natural language processing, are generally good at extracting patterns from input data and perform well across different downstream tasks. These observations inspire us to ask: Is it necessary to use the performance of specific downstream tasks to evaluate and search for good neural architectures? Can we perform NAS effectively and efficiently while remaining agnostic to the downstream task? In this work, we attempt to answer these two questions affirmatively and improve on state-of-the-art NAS solutions by proposing a novel and generic NAS framework, termed Generic NAS (GenNAS). GenNAS does not use task-specific labels; instead, it adopts regression on a set of manually designed synthetic signal bases for architecture evaluation. Such a self-supervised regression task can effectively evaluate an architecture's intrinsic power to capture and transform the input signal patterns, and it allows fuller use of the training samples. We then propose an automatic task search that optimizes the combination of synthetic signals using limited downstream-task-specific labels, further improving the performance of GenNAS. We also thoroughly evaluate GenNAS's generality and end-to-end NAS performance across all evaluated search spaces, where it outperforms almost all existing works with significant speedup.
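The core idea, scoring an architecture by how well it regresses manually designed synthetic signals rather than by downstream-task accuracy, can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the sinusoidal signal bases, the `regression_score` function, and the toy random-feature "architectures" are hypothetical stand-ins, not the paper's actual signal tensors, training procedure, or search spaces (GenNAS trains real candidate networks for a few steps to regress spatial feature targets).

```python
import numpy as np

def synthetic_signal_bases(n, length, freqs=(1, 2, 4), seed=0):
    """Hypothetical stand-in for manually designed signal bases:
    each target is a random combination of low-frequency sinusoids."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 2.0 * np.pi, length)
    return np.stack([
        sum(rng.normal() * np.sin(f * t + rng.uniform(0.0, 2.0 * np.pi))
            for f in freqs)
        for _ in range(n)
    ])

def regression_score(feature_map, x, targets):
    """Score an architecture by how well a linear readout on its
    features regresses the synthetic targets (negative MSE, so
    higher is better). No task-specific labels are involved."""
    feats = feature_map(x)                            # (n, d) features
    w, *_ = np.linalg.lstsq(feats, targets, rcond=None)
    mse = float(np.mean((feats @ w - targets) ** 2))
    return -mse

def make_random_features(in_dim, width, seed):
    """Toy 'architecture': a fixed random nonlinear feature extractor.
    Purely illustrative; GenNAS evaluates real candidate networks."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(in_dim, width)) / np.sqrt(in_dim)
    return lambda x: np.tanh(x @ W)

rng = np.random.default_rng(42)
x = rng.normal(size=(256, 16))                        # unlabeled inputs
targets = synthetic_signal_bases(n=256, length=8)     # (256, 8) targets

s_narrow = regression_score(make_random_features(16, 8, seed=1), x, targets)
s_wide = regression_score(make_random_features(16, 128, seed=1), x, targets)
print(s_narrow, s_wide)   # candidates ranked by self-supervised regression fit
```

In this sketch, candidates are ranked entirely from unlabeled inputs and synthetic targets; the paper's automatic task search would then tune the combination of signal bases (here, the hypothetical `freqs` mixture) using a small amount of downstream labels.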

Related research

10/17/2022  Extensible Proxy for Efficient NAS
Neural Architecture Search (NAS) has become a de facto approach in the r...

05/30/2021  NAS-BERT: Task-Agnostic and Adaptive-Size BERT Compression with Neural Architecture Search
While pre-trained language models (e.g., BERT) have achieved impressive ...

03/26/2020  Are Labels Necessary for Neural Architecture Search?
Existing neural network architectures in computer vision — whether desig...

06/12/2020  NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing
Neural Architecture Search (NAS) is a promising and rapidly evolving res...

06/12/2020  Does Unsupervised Architecture Representation Learning Help Neural Architecture Search?
Existing Neural Architecture Search (NAS) methods either encode neural a...

04/25/2020  Deep Multimodal Neural Architecture Search
Designing effective neural networks is fundamentally important in deep m...

11/30/2022  AIO-P: Expanding Neural Performance Predictors Beyond Image Classification
Evaluating neural network performance is critical to deep neural network...
