Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics

08/26/2021
by   Wuyang Chen, et al.

This work designs a principled and unified training-free framework for Neural Architecture Search (NAS) that combines high performance, low cost, and in-depth interpretability. NAS has been studied intensively to automate the discovery of top-performing neural networks, but it suffers from heavy resource consumption and often incurs search bias due to truncated training or approximations. Recent NAS work has begun to explore indicators that can predict a network's performance without training; however, these indicators either leverage only limited properties of deep networks, or their benefits have not been extended to more general search methods. Through rigorous correlation analysis, we present a unified framework to understand and accelerate NAS by disentangling three "TEG" characteristics of searched networks - Trainability, Expressivity, Generalization - all assessed in a training-free manner. The TEG indicators can be scaled up and integrated with various NAS search methods, including both supernet and single-path approaches. Extensive studies validate the effective and efficient guidance of our TEG-NAS framework, which yields both improved search accuracy and more than a 2.3x reduction in search time. Moreover, we visualize search trajectories on the three "TEG" landscapes and observe that, while a good local minimum is easy to find on NAS-Bench-201 given its simple topology, balancing the "TEG" characteristics is much harder on the DARTS search space due to its complex landscape geometry. Our code is available at https://github.com/VITA-Group/TEGNAS.
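For intuition, here is a minimal sketch of one kind of training-free indicator in this line of work: scoring a candidate network by the condition number of its empirical Neural Tangent Kernel, a common trainability proxy. This is an illustrative assumption, not the authors' implementation; the toy network, batch size, and function name are hypothetical.

```python
# A minimal sketch (not the TEG-NAS code) of a training-free trainability
# proxy: the condition number of the empirical Neural Tangent Kernel (NTK).
# A smaller condition number is usually read as better trainability.
import torch
import torch.nn as nn

def empirical_ntk_condition_number(net: nn.Module, x: torch.Tensor) -> float:
    """Condition number of the empirical NTK on a small input batch x.

    NTK[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>, where f is the
    network output reduced to a scalar per example.
    """
    params = [p for p in net.parameters() if p.requires_grad]
    grads = []
    for i in range(x.size(0)):
        out = net(x[i:i + 1]).sum()  # scalarize the per-example output
        g = torch.autograd.grad(out, params)
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    J = torch.stack(grads)           # shape: (batch, n_params)
    ntk = J @ J.t()                  # empirical NTK Gram matrix
    eigvals = torch.linalg.eigvalsh(ntk)  # ascending eigenvalues
    return (eigvals[-1] / eigvals[0].clamp_min(1e-12)).item()

if __name__ == "__main__":
    # Hypothetical toy architecture; in NAS this would be a candidate cell.
    net = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
    x = torch.randn(8, 16)           # small random batch, no training needed
    print(f"NTK condition number: {empirical_ntk_condition_number(net, x):.2f}")
```

In a search loop, such a score would be computed at initialization for each candidate and used to rank architectures without any gradient updates, which is what makes this family of indicators cheap.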


Related research

Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective (02/23/2021)

Auto-scaling Vision Transformers without Training (02/24/2022)

Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search (01/24/2022)

BATS: Binary ArchitecTure Search (03/03/2020)

Demystifying the Neural Tangent Kernel from a Practical Perspective: Can it be trusted for Neural Architecture Search without training? (03/28/2022)

NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies (10/06/2022)

LidarNAS: Unifying and Searching Neural Architectures for 3D Point Clouds (10/10/2022)
