Core-set Sampling for Efficient Neural Architecture Search

07/08/2021
by Jae-hun Shim, et al.

Neural architecture search (NAS), an important branch of automatic machine learning, has become an effective approach to automating the design of deep learning models. However, a major issue in NAS is how to reduce the long search time imposed by the heavy computational burden. While most recent approaches focus on pruning redundant candidate sets or developing new search methodologies, this paper formulates the problem from a data curation perspective. Our key strategy is to search for the architecture using a summarized data distribution, i.e., a core-set. Typically, NAS algorithms separate the search and training stages; since the proposed core-set methodology is used only in the search stage, any performance degradation can be minimized. In our experiments, we reduced the overall computational time from 30.8 hours to 3.5 hours, an 8.8x reduction, on a single RTX 3090 GPU without sacrificing accuracy.
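To illustrate how a core-set can stand in for the full dataset during the search stage, below is a minimal sketch of greedy k-center selection, a common core-set construction operating on per-sample feature embeddings. The embeddings, the greedy_k_center function, and the budget are illustrative assumptions for this sketch, not the authors' actual implementation.

```python
import numpy as np

def greedy_k_center(features: np.ndarray, budget: int, seed: int = 0) -> np.ndarray:
    """Greedy k-center core-set selection (illustrative sketch).

    features: (N, D) array of per-sample embeddings, e.g. from a
              pretrained encoder (an assumption, not the paper's setup).
    budget:   number of samples to keep in the core-set.
    Returns the indices of the selected core-set.
    """
    rng = np.random.default_rng(seed)
    n = features.shape[0]
    # Start from one random sample and track each point's distance
    # to its nearest already-selected center.
    selected = [int(rng.integers(n))]
    min_dist = np.linalg.norm(features - features[selected[0]], axis=1)
    while len(selected) < budget:
        # Greedily add the point farthest from the current core-set,
        # shrinking the covering radius at each step.
        idx = int(np.argmax(min_dist))
        selected.append(idx)
        dist_to_new = np.linalg.norm(features - features[idx], axis=1)
        min_dist = np.minimum(min_dist, dist_to_new)
    return np.asarray(selected)

# Hypothetical usage: keep a fraction of the training set for the NAS
# search stage only; the final architecture is retrained on all data.
# core_idx = greedy_k_center(embeddings, budget=5000)
# search_loader = DataLoader(Subset(train_set, core_idx), batch_size=128)
```

Because the core-set is consumed only by the search stage, the selection cost is paid once and the final training stage still sees the full data distribution, which is why the abstract argues the accuracy loss can be kept negligible.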


