Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search

01/24/2022
by   Yao Shu, et al.

Neural architecture search (NAS) has gained immense popularity owing to its ability to automate neural architecture design. A number of training-free metrics have recently been proposed to realize NAS without training, thereby making NAS more scalable. Despite their competitive empirical performance, a unified theoretical understanding of these training-free metrics is lacking. As a consequence, (a) the relationships among these metrics are unclear, (b) there is no theoretical guarantee for their empirical performance and transferability, and (c) there may exist untapped potential in training-free NAS that can be unveiled through a unified theoretical understanding. To this end, this paper presents a unified theoretical analysis of gradient-based training-free NAS, which allows us to (a) theoretically study the relationships among these metrics, (b) theoretically guarantee their generalization performance and transferability, and (c) exploit our unified theoretical understanding to develop a novel framework named hybrid NAS (HNAS), which consistently boosts training-free NAS in a principled way. Interestingly, HNAS enjoys the advantages of both training-free NAS (i.e., superior search efficiency) and training-based NAS (i.e., remarkable search effectiveness), as demonstrated through extensive experiments.
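The abstract does not spell out which gradient-based training-free metrics are analyzed, so the sketch below is only a rough illustration of the general idea: score un-trained candidate architectures with a zero-cost proxy computed from one forward/backward pass (here, the gradient norm at initialization, in the spirit of proxies such as GradNorm/SNIP), then rank candidates by that score. The function names, the toy candidates, and the fake batch are all hypothetical, not the paper's HNAS algorithm.

```python
import torch
import torch.nn as nn

def grad_norm_score(model, inputs, targets):
    """One representative gradient-based training-free metric:
    the summed L2 norm of parameter gradients at initialization.
    No training is performed; a single forward/backward pass suffices."""
    model.zero_grad()
    loss = nn.CrossEntropyLoss()(model(inputs), targets)
    loss.backward()
    return sum(p.grad.norm().item()
               for p in model.parameters() if p.grad is not None)

def rank_architectures(candidates, inputs, targets):
    """Score every candidate at initialization and rank them, best first.
    `candidates` maps an architecture name to an un-trained nn.Module."""
    scores = {name: grad_norm_score(m, inputs, targets)
              for name, m in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)

if __name__ == "__main__":
    # Hypothetical usage: two toy candidates on a fake CIFAR-sized batch.
    x = torch.randn(8, 3 * 32 * 32)
    y = torch.randint(0, 10, (8,))
    candidates = {
        "shallow": nn.Sequential(nn.Linear(3 * 32 * 32, 10)),
        "deeper": nn.Sequential(nn.Linear(3 * 32 * 32, 64),
                                nn.ReLU(), nn.Linear(64, 10)),
    }
    print(rank_architectures(candidates, x, y))
```

A hybrid scheme in the spirit of HNAS could then spend the (expensive) training budget only on a short list of top-ranked candidates, combining the proxy's search efficiency with training-based search effectiveness; the paper's actual HNAS procedure is given in the full text.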


Related research

04/16/2020 · Geometry-Aware Gradient Algorithms for Neural Architecture Search
Many recent state-of-the-art methods for neural architecture search (NAS...

08/26/2021 · Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics
This work targets designing a principled and unified training-free frame...

10/16/2021 · GradSign: Model Performance Inference with Theoretical Insights
A key challenge in neural architecture search (NAS) is quickly inferring...

06/01/2023 · Training-free Neural Architecture Search for RNNs and Transformers
Neural architecture search (NAS) has allowed for the automatic creation ...

05/21/2019 · Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search
High sensitivity of neural architecture search (NAS) methods against the...

09/15/2022 · Generalization Properties of NAS under Activation and Skip Connection Search
Neural Architecture Search (NAS) has fostered the automatic discovery of...

12/23/2022 · DAS: Neural Architecture Search via Distinguishing Activation Score
Neural Architecture Search (NAS) is an automatic technique that can sear...
