Geometry-Aware Gradient Algorithms for Neural Architecture Search

04/16/2020
by Liam Li, et al.

Many recent state-of-the-art methods for neural architecture search (NAS) relax the NAS problem into a joint continuous optimization over architecture parameters and shared weights, enabling the application of standard gradient-based optimizers. However, this training process remains poorly understood, as evidenced by the multitude of gradient-based heuristics that have recently been proposed. Invoking the theory of mirror descent, we present a unifying framework for designing and analyzing gradient-based NAS methods that exploit the underlying problem structure to quickly find high-performance architectures. Our geometry-aware framework leads to simple yet novel algorithms that (1) enjoy faster convergence guarantees than existing gradient-based methods and (2) achieve state-of-the-art accuracy on the latest NAS benchmarks in computer vision. Notably, we exceed the best published results for both CIFAR and ImageNet on both the DARTS search space and NAS-Bench-201; on the latter benchmark we achieve close to oracle-optimal performance on CIFAR-10 and CIFAR-100. Together, our theory and experiments demonstrate a principled way to co-design optimizers and continuous parameterizations of discrete NAS search spaces.
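To make the geometry-aware idea concrete: a mirror-descent step with the negative-entropy mirror map reduces to an exponentiated-gradient (multiplicative) update, which keeps the architecture mixture weights on the probability simplex by construction. The sketch below is an illustration of that generic update only, not the paper's actual algorithm; the function name and the toy gradient values are hypothetical.

```python
import numpy as np

def exponentiated_gradient_step(theta, grad, lr=0.1):
    """One mirror-descent step under negative-entropy regularization.

    theta: current architecture mixture weights (a point on the simplex).
    grad:  gradient of the validation loss w.r.t. theta (toy values here).
    """
    updated = theta * np.exp(-lr * grad)   # multiplicative update
    return updated / updated.sum()         # renormalize onto the simplex

# Example: three candidate operations on one edge of a search cell.
theta = np.array([1 / 3, 1 / 3, 1 / 3])
grad = np.array([0.5, -0.2, 0.1])          # hypothetical gradient
theta = exponentiated_gradient_step(theta, grad)
```

After the step the weights still sum to one, and mass shifts toward the operation with the most negative gradient; contrast this with a plain gradient step on the simplex, which would require an explicit projection.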
