Scalable NAS with Factorizable Architectural Parameters

12/31/2019
by   Lanfei Wang, et al.

Neural architecture search (NAS) is an emerging topic in machine learning and computer vision. The fundamental idea of NAS is to replace manual design with an automatic mechanism for exploring powerful network architectures. A key factor in NAS is scaling up the search space, e.g., increasing the number of operators, so that more possibilities are covered; however, existing search algorithms often get lost among a large number of operators. This paper presents a scalable NAS algorithm built on a factorizable set of architectural parameters, so that the size of the search space grows quadratically while the burden of optimization grows only linearly. As a practical example, we add a set of activation functions to the original set of operators containing convolution, pooling, skip-connect, etc. With a marginal increase in search costs and no extra cost in retraining, we find interesting architectures that were not explored before and achieve state-of-the-art performance on CIFAR-10 and ImageNet, two standard image classification benchmarks.
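The quadratic-space / linear-cost trade-off described above can be illustrated with a minimal sketch (not the authors' code; variable names and sizes are hypothetical): instead of keeping one architectural parameter per (operator, activation) pair, which would require M*N parameters, we keep two factor vectors of sizes M and N and form the joint weights as an outer product of their softmax distributions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D vector.
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical sizes: 7 candidate operators, 4 candidate activations.
M, N = 7, 4
alpha_op = np.zeros(M)    # architectural parameters for operators
alpha_act = np.zeros(N)   # architectural parameters for activations

# The joint weight over all M*N combined candidates is the outer product
# of the two factor distributions: only M + N parameters are optimized,
# yet the search space still covers M * N operator-activation pairs.
joint = np.outer(softmax(alpha_op), softmax(alpha_act))

assert joint.shape == (M, N)           # one weight per combined candidate
assert np.isclose(joint.sum(), 1.0)    # joint weights form a distribution
```

Under this factorization, gradients with respect to `alpha_op` and `alpha_act` are computed separately, so the optimization burden scales with M + N rather than M * N.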


Related research:

- Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic Image Segmentation (01/10/2019)
- GOLD-NAS: Gradual, One-Level, Differentiable (07/07/2020)
- Generalization Properties of NAS under Activation and Skip Connection Search (09/15/2022)
- Neural Architecture Generator Optimization (04/03/2020)
- StyleNAS: An Empirical Study of Neural Architecture Search to Uncover Surprisingly Fast End-to-End Universal Style Transfer Networks (06/06/2019)
- On the Bounds of Function Approximations (08/26/2019)
- Two-Stage Architectural Fine-Tuning with Neural Architecture Search using Early-Stopping in Image Classification (02/17/2022)
