Rethinking Performance Estimation in Neural Architecture Search

05/20/2020
by   Xiawu Zheng, et al.

Neural architecture search (NAS) remains a challenging problem, largely because of its indispensable and time-consuming performance estimation (PE) component. In this paper, we provide a novel yet systematic rethinking of PE in a resource-constrained regime, termed budgeted PE (BPE), which precisely and efficiently estimates the performance of an architecture sampled from an architecture space. Since searching for an optimal BPE is extremely time-consuming, as it requires training a large number of networks for evaluation, we propose a Minimum Importance Pruning (MIP) approach. Given a dataset and a BPE search space, MIP estimates the importance of the hyper-parameters using a random forest and then prunes the least important one from the next iteration. In this way, MIP effectively prunes less important hyper-parameters, allocating more computational resources to the more important ones and thus achieving an effective exploration. By combining BPE with various search algorithms, including reinforcement learning, evolutionary algorithms, random search, and differentiable architecture search, we achieve a 1,000x NAS speed-up with a negligible performance drop compared to the SOTA.
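The pruning loop described above can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the hyper-parameter names and the proxy scoring function are assumptions, and scikit-learn's random-forest feature importances stand in for the importance estimation step.

```python
# Sketch of Minimum Importance Pruning (MIP): each iteration fits a random
# forest on sampled (BPE hyper-parameters, score) pairs, then drops the
# hyper-parameter with minimum importance from the search.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical BPE hyper-parameters; names are illustrative, not from the paper.
hyper_params = ["epochs", "learning_rate", "batch_size", "channels"]

def evaluate_bpe(samples):
    """Toy proxy score: only the first two hyper-parameters matter here."""
    return (2.0 * samples[:, 0] + 1.0 * samples[:, 1]
            + 0.01 * rng.standard_normal(len(samples)))

active = list(range(len(hyper_params)))
while len(active) > 2:  # prune down to the two most important ones
    X = rng.random((200, len(hyper_params)))
    for j in range(len(hyper_params)):
        if j not in active:
            X[:, j] = 0.5  # pruned hyper-parameters are frozen at a default
    y = evaluate_bpe(X)
    forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
    # Remove the active hyper-parameter with minimum estimated importance.
    weakest = min(active, key=lambda j: forest.feature_importances_[j])
    active.remove(weakest)

print([hyper_params[j] for j in active])
```

Because pruned hyper-parameters are frozen, the forest assigns them zero importance in later iterations, so the budget is concentrated on the dimensions that actually affect the score.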


Related research

AutoHAS: Differentiable Hyper-parameter and Architecture Search (06/05/2020)
Neural Architecture Search (NAS) has achieved significant progress in pu...

You Only Search Once: Single Shot Neural Architecture Search via Direct Sparse Optimization (11/05/2018)
Recently Neural Architecture Search (NAS) has aroused great interest in ...

Guided Evolutionary Neural Architecture Search With Efficient Performance Estimation (07/22/2022)
Neural Architecture Search (NAS) methods have been successfully applied ...

Efficient Model Performance Estimation via Feature Histories (03/07/2021)
An important step in the task of neural network design, such as hyper-pa...

CARS: Continuous Evolution for Efficient Neural Architecture Search (09/11/2019)
Searching techniques in most of existing neural architecture search (NAS...

Understanding the wiring evolution in differentiable neural architecture search (09/02/2020)
Controversy exists on whether differentiable neural architecture search ...

DHA: End-to-End Joint Optimization of Data Augmentation Policy, Hyper-parameter and Architecture (09/13/2021)
Automated machine learning (AutoML) usually involves several crucial com...
