DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search

03/27/2020
by   Xiyang Dai, et al.

Efficient search is a core issue in Neural Architecture Search (NAS). It is difficult for conventional NAS algorithms to directly search architectures on large-scale tasks like ImageNet, because the GPU-hour cost of NAS grows with both training dataset size and candidate set size. One common workaround is to search on a smaller proxy dataset (e.g., CIFAR-10) and then transfer to the target task (e.g., ImageNet), but architectures optimized on proxy data are not guaranteed to be optimal on the target task. Another is to learn with a smaller candidate set, which may require expert knowledge and runs counter to the spirit of NAS. In this paper, we present DA-NAS, which can directly search architectures for large-scale target tasks in a more efficient manner while still allowing a large candidate set. Our method is based on an interesting observation: the learning speed of blocks in deep neural networks is related to the difficulty of recognizing distinct categories. We carefully design a progressive, data-adapted pruning strategy for efficient architecture search. It quickly trims low-performing blocks on a subset of the target dataset (e.g., easy classes), and then gradually finds the best blocks on the whole target dataset. By then the candidate set has become as compact as possible, providing a faster search on the target task. Experiments on ImageNet verify the effectiveness of our approach: it is 2x faster than previous methods while its accuracy, 76.2%, is currently state-of-the-art, and it supports an enlarged search space (i.e., more candidate blocks) to efficiently search the best-performing architecture.
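The progressive pruning loop described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the paper's implementation: the `score_block` function, the `keep_ratio` parameter, and the grouping of classes by difficulty are all placeholder assumptions standing in for DA-NAS's actual scoring and scheduling.

```python
def progressive_prune(candidate_blocks, classes_by_difficulty, score_block, keep_ratio=0.5):
    """Sketch of data-adapted progressive pruning (hypothetical helper names).

    Start training/scoring on the easiest classes, trim the lowest-scoring
    candidate blocks, then enlarge the data subset stage by stage until the
    whole dataset is used and only the strongest blocks remain.
    """
    blocks = list(candidate_blocks)
    active_classes = []
    for class_group in classes_by_difficulty:
        active_classes.extend(class_group)  # grow the target-data subset (easy -> hard)
        # score_block is a placeholder for whatever block-quality signal is used
        scores = {b: score_block(b, active_classes) for b in blocks}
        ranked = sorted(blocks, key=scores.get, reverse=True)
        keep = max(1, int(len(ranked) * keep_ratio))
        blocks = ranked[:keep]  # trim low-performing blocks at this stage
    return blocks
```

With each stage the candidate set shrinks while the data subset grows, so the expensive final search on the full dataset only considers a compact set of surviving blocks.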

Related research

- ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware (12/02/2018)
- Accelerating Neural Architecture Search via Proxy Data (06/09/2021)
- Large Scale Neural Architecture Search with Polyharmonic Splines (11/20/2020)
- Core-set Sampling for Efficient Neural Architecture Search (07/08/2021)
- Data Proxy Generation for Fast and Efficient Neural Architecture Search (11/21/2019)
- Less is More: Proxy Datasets in NAS approaches (03/14/2022)
- PONAS: Progressive One-shot Neural Architecture Search for Very Efficient Deployment (03/11/2020)
