Single-Path Mobile AutoML: Efficient ConvNet Design and NAS Hyperparameter Optimization

07/01/2019
by   Dimitrios Stamoulis, et al.

Can we reduce the search cost of Neural Architecture Search (NAS) from days to only a few hours? NAS methods automate the design of Convolutional Networks (ConvNets) under hardware constraints and have emerged as key components of AutoML frameworks. However, NAS remains challenging due to the combinatorially large design space and the significant search time (at least 200 GPU-hours). In this work, we reduce the NAS search cost to less than 3 hours, while achieving state-of-the-art image classification results under mobile latency constraints. We propose a novel differentiable NAS formulation, namely Single-Path NAS, which uses a single over-parameterized ConvNet to encode all architectural decisions through shared convolutional kernel parameters, hence drastically decreasing the search overhead. Single-Path NAS achieves state-of-the-art top-1 ImageNet accuracy (75.62%) under similar mobile latency settings (~80ms). In particular, we enhance the accuracy-runtime trade-off in differentiable NAS by treating the Squeeze-and-Excitation path as a fully searchable operation with our novel single-path encoding. Our method has an overall cost of only 8 epochs (24 TPU-hours), which is up to 5,000x faster than prior work. Moreover, we study how different NAS formulation choices affect the performance of the designed ConvNets. Furthermore, we exploit the efficiency of our method to answer an interesting question: instead of empirically tuning the hyperparameters of the NAS solver (as in prior work), can we automatically find the hyperparameter values that yield the desired accuracy-runtime trade-off? We open-source our entire codebase at: https://github.com/dstamoulis/single-path-nas.
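The core single-path idea can be illustrated with a toy example: instead of maintaining separate candidate kernels, a single 5x5 "superkernel" holds all the weights, its inner 3x3 core doubles as the smaller-kernel candidate, and the outer shell is included only when an architectural decision fires. The sketch below is a minimal illustration, not the paper's implementation: the function name `effective_kernel` and the hard norm-based threshold are simplifications (the actual method relaxes this indicator so it is differentiable and trainable end-to-end).

```python
import numpy as np

def effective_kernel(w_5x5, threshold):
    """Toy single-path kernel selection on one 5x5 weight tensor."""
    # The inner 3x3 core is shared: it IS the 3x3-kernel candidate.
    inner = np.zeros_like(w_5x5)
    inner[1:4, 1:4] = w_5x5[1:4, 1:4]
    # The outer "shell" holds the extra weights of the 5x5 candidate.
    outer = w_5x5 - inner
    # Architectural decision: include the shell only if its squared norm
    # exceeds a threshold (a hard stand-in for the relaxed indicator).
    use_outer = 1.0 if np.linalg.norm(outer) ** 2 > threshold else 0.0
    return inner + use_outer * outer

w = np.arange(25, dtype=float).reshape(5, 5)
k3 = effective_kernel(w, threshold=1e9)  # shell suppressed: behaves as 3x3
k5 = effective_kernel(w, threshold=0.0)  # shell included: full 5x5 kernel
```

Because both candidates share the same underlying weights, the search never instantiates multiple parallel paths, which is what keeps the memory and compute overhead close to training a single ConvNet.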


Related research:

- 04/05/2019: Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours
  "Can we automatically design a Convolutional Network (ConvNet) with the h..."

- 05/10/2019: Single-Path NAS: Device-Aware Efficient ConvNet Design
  "Can we automatically design a Convolutional Network (ConvNet) with the h..."

- 03/12/2021: Searching by Generating: Flexible and Efficient One-Shot NAS with Architecture Generator
  "In one-shot NAS, sub-networks need to be searched from the supernet to m..."

- 08/12/2020: TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search
  "With the flourish of differentiable neural architecture search (NAS), au..."

- 12/23/2019: Progressive DARTS: Bridging the Optimization Gap for NAS in the Wild
  "With the rapid development of neural architecture search (NAS), research..."

- 06/10/2021: A multi-objective perspective on jointly tuning hardware and hyperparameters
  "In addition to the best model architecture and hyperparameters, a full A..."

- 07/07/2020: Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery
  "In this paper, we study two important problems in the automated design o..."
