AutoSpace: Neural Architecture Search with Less Human Interference

03/22/2021
by Daquan Zhou, et al.

Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction. In this paper, we consider automating the search space design to minimize human interference, which however faces two challenges: the explosive complexity of the exploration space and the expensive computation cost of evaluating the quality of different search spaces. To address them, we propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one with the following novel techniques: a differentiable fitness scoring function to efficiently evaluate the performance of cells, and a reference architecture to speed up the evolution procedure and avoid falling into sub-optimal solutions. The framework is generic and compatible with additional computational constraints, making it feasible to learn specialized search spaces that fit different computational budgets. With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using previously manually designed spaces. Remarkably, the models generated from the new search space achieve 77.8% top-1 accuracy under the mobile setting (MAdds < 500M), outperforming the previous SOTA EfficientNet-B0 by 0.7%.
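The abstract describes the method only at a high level; the toy Python below is a minimal sketch of what such an evolution-with-a-reference-architecture loop could look like: a pool of candidate cells is scored by a fitness function and repeatedly mutated, with mutations biased toward a reference cell. The operation names, the scoring function, and the mutation scheme are illustrative assumptions, not the paper's actual implementation (in particular, the real fitness scoring function is a learned, differentiable module rather than the hand-written stand-in used here).

# Hypothetical sketch of an evolutionary search-space update loop.
# All names and operators below are assumptions for illustration only.
import random

OPS = ["conv3x3", "conv5x5", "dwconv3x3", "dwconv5x5", "skip", "se"]
REFERENCE_CELL = ["dwconv3x3", "se", "skip", "conv3x3"]  # assumed reference architecture

def fitness(cell):
    """Stand-in for the differentiable fitness scoring function: here we
    simply reward overlap with the reference cell plus a small noise term."""
    overlap = sum(op in REFERENCE_CELL for op in cell)
    return overlap + 0.1 * random.random()

def mutate(cell, p=0.3):
    """Randomly swap operations; half of the mutations are drawn from the
    reference cell to bias the search toward it (an assumption of this sketch)."""
    new_cell = list(cell)
    for i in range(len(new_cell)):
        if random.random() < p:
            pool = REFERENCE_CELL if random.random() < 0.5 else OPS
            new_cell[i] = random.choice(pool)
    return new_cell

def evolve_search_space(pop_size=16, cell_len=4, generations=20, top_k=4):
    """Evolve a population of candidate cells; the survivors define the learned search space."""
    population = [[random.choice(OPS) for _ in range(cell_len)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[:top_k]                                   # keep the fittest cells
        children = [mutate(random.choice(parents)) for _ in range(pop_size - top_k)]
        population = parents + children
    return population

if __name__ == "__main__":
    space = evolve_search_space()
    print("Evolved cells:", space[:4])

In this toy version the fitness is cheap to compute; the point of the paper's differentiable scoring function is precisely to make this evaluation step efficient enough to run inside the evolution loop, and the reference architecture plays the role that REFERENCE_CELL plays above: steering mutations away from clearly sub-optimal regions.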

