sharpDARTS: Faster and More Accurate Differentiable Architecture Search

03/23/2019
by Andrew Hundt, et al.

Neural Architecture Search (NAS) has been a source of dramatic improvements in neural network design, with recent results meeting or exceeding the performance of hand-tuned architectures. However, our understanding of how to represent the search space for neural net architectures and how to search that space efficiently are both still in their infancy. We have performed an in-depth analysis to identify limitations in a widely used search space and a recent architecture search method, Differentiable Architecture Search (DARTS). These findings led us to introduce novel network blocks with a more general, balanced, and consistent design; a better-optimized Cosine Power Annealing learning rate schedule; and other improvements. Our resulting sharpDARTS search is 50% faster with a 20-30% relative improvement in final model error on CIFAR-10 when compared to DARTS. Our best single model run has 1.93% (1.98±0.07) validation error on CIFAR-10 and 5.5% error (5.8±0.3) on the recently released CIFAR-10.1 test set. To our knowledge, both are state of the art for models of similar size. This model also generalizes competitively to ImageNet at 25.1% top-1 (7.8% top-5) error. We found improvements for existing search spaces, but does DARTS generalize to new domains? We propose Differentiable Hyperparameter Grid Search and the HyperCuboid search space, representations designed to leverage DARTS for more general parameter optimization. Here we find that DARTS fails to generalize when compared against a human's one-shot choice of models. We look back to the DARTS and sharpDARTS search spaces to understand why, and an ablation study reveals an unusual generalization gap. We finally propose Max-W regularization to solve this problem, which proves significantly better than the handmade design. Code will be made available.
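The Cosine Power Annealing schedule named in the abstract warps the standard cosine annealing curve so the learning rate lingers near its peak before dropping off more sharply. Below is a minimal Python sketch of that idea; the exact parameterization and the power hyperparameter `p` are assumptions for illustration, not the paper's published formula.

```python
import math

def cosine_power_annealing(t, t_max, lr_max, lr_min, p=10.0):
    """Sketch of a cosine power annealing learning rate schedule.

    Warps the usual cosine annealing factor through an exponential so the
    rate stays near lr_max longer, then decays faster. The base `p` is an
    assumed hyperparameter; the paper's parameterization may differ.
    """
    # Standard cosine factor: decays from 1 (t = 0) to 0 (t = t_max).
    cosine = (1.0 + math.cos(math.pi * t / t_max)) / 2.0
    # Exponential warp of the cosine factor, normalized back into [0, 1].
    warped = (p ** cosine - 1.0) / (p - 1.0)
    return lr_min + (lr_max - lr_min) * warped
```

For example, `cosine_power_annealing(0, 300, 0.025, 0.0)` returns the peak rate 0.025, and the value decays toward 0.0 by epoch 300; with p > 1 the curve stays higher for longer than plain cosine annealing before falling off.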

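For context on the method being analyzed: DARTS makes architecture search differentiable by relaxing the discrete choice among candidate operations into a softmax-weighted mixture over learnable architecture weights, and the abstract's Differentiable Hyperparameter Grid Search applies the same relaxation to discrete hyperparameter choices. The PyTorch sketch below shows the mixture mechanism; the candidate operations here are illustrative stand-ins, not the actual DARTS or HyperCuboid search space.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Minimal sketch of the DARTS continuous relaxation.

    Each candidate operation's output is blended by a softmax over
    learnable architecture weights alpha, so the choice of operation
    can be optimized by gradient descent alongside the model weights.
    """
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # candidate: 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),  # candidate: 5x5 conv
            nn.Identity(),                                # candidate: skip connection
        ])
        # One architecture weight per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```

After search, a discrete architecture is typically recovered by keeping the operation with the largest alpha; the Max-W regularization proposed in the abstract addresses a generalization gap the authors observe in this setup.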

Related research:

AutoSpace: Neural Architecture Search with Less Human Interference (03/22/2021)
Current neural architecture search (NAS) algorithms still require expert...

BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures (11/21/2020)
Binary Neural Networks (BNNs) have received significant attention due to...

Efficient and Joint Hyperparameter and Architecture Search for Collaborative Filtering (07/12/2023)
Automated Machine Learning (AutoML) techniques have recently been introd...

Understanding and Robustifying Differentiable Architecture Search (09/20/2019)
Differentiable Architecture Search (DARTS) has attracted a lot of attent...

EDAS: Efficient and Differentiable Architecture Search (12/03/2019)
Transferrable neural architecture search can be viewed as a binary optim...

GroSS: Group-Size Series Decomposition for Whole Search-Space Training (12/02/2019)
We present Group-size Series (GroSS) decomposition, a mathematical formu...

DeepArchitect: Automatically Designing and Training Deep Architectures (04/28/2017)
In deep learning, performance is strongly affected by the choice of arch...
