Adaptive Stochastic Natural Gradient Method for One-Shot Neural Architecture Search

05/21/2019
by Youhei Akimoto, et al.

The high sensitivity of neural architecture search (NAS) methods to their inputs, such as the step-size (i.e., learning rate) and the search space, prevents practitioners from applying them out-of-the-box to their own problems, even though the purpose of NAS is to automate part of the tuning process. Aiming at a fast, robust, and widely applicable NAS, we develop a generic optimization framework for NAS. We turn the coupled optimization of connection weights and neural architecture into a differentiable optimization by means of stochastic relaxation. The framework accepts arbitrary search spaces (widely applicable) and enables gradient-based simultaneous optimization of weights and architecture (fast). We propose a stochastic natural gradient method with an adaptive step-size mechanism built upon our theoretical investigation (robust). Despite its simplicity and the absence of problem-dependent parameter tuning, our method exhibits near state-of-the-art performance with a low computational budget on both image classification and inpainting tasks.
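
As a rough illustration of the core idea, below is a minimal NumPy sketch of stochastic relaxation with a stochastic natural gradient update over a single categorical architecture choice. The `score` function, the `true_quality` values, the ranking-based utilities, and the step-size adaptation constants are all illustrative assumptions; this is a toy variant written in the spirit of the paper's approach, not the authors' exact algorithm.

```python
import numpy as np

# Toy sketch: a single architecture decision among K candidate operations,
# relaxed into a categorical distribution with probabilities `theta`.
# The stochastic relaxation objective is J(theta) = E_{k ~ theta}[score(k)],
# which is differentiable in theta even though k is discrete.

rng = np.random.default_rng(0)
K = 5                                        # number of candidate operations
true_quality = np.array([0.1, 0.3, 0.9, 0.2, 0.5])  # assumed ground truth

def score(k):
    # Hypothetical stand-in for the validation performance of the
    # sampled architecture after a weight-update step; noisy on purpose.
    return true_quality[k] + 0.1 * rng.standard_normal()

theta = np.full(K, 1.0 / K)   # distribution (expectation) parameters
delta = 0.1                   # step-size, adapted below
beta = 0.1                    # accumulation rate for the adaptation signal
s = np.zeros(K)               # accumulated update direction
lam = 2                       # architectures sampled per iteration

for t in range(300):
    ks = rng.choice(K, size=lam, p=theta)
    fs = np.array([score(k) for k in ks])
    # Ranking-based utilities: +1 for the better sample, -1 for the worse.
    u = np.where(fs >= fs.max(), 1.0, -1.0)
    # Natural gradient estimate for the categorical exponential family:
    # sufficient statistics (one-hot vectors) minus the parameters theta.
    T = np.eye(K)[ks]
    G = (u[:, None] * (T - theta)).sum(axis=0) / lam
    g = G / (np.linalg.norm(G) + 1e-12)
    theta = np.clip(theta + delta * g, 1e-6, None)
    theta /= theta.sum()
    # Simplified step-size adaptation: accumulate normalized updates so that
    # ||s||^2 hovers near 1 under pure noise; grow delta when successive
    # updates align, shrink it when they cancel out.
    s = (1.0 - beta) * s + np.sqrt(beta * (2.0 - beta)) * g
    delta = min(delta * np.exp(beta * (np.linalg.norm(s) ** 2 - 1.0)), 0.5)

print("learned distribution:", np.round(theta, 3))
```

For the categorical exponential family in expectation parameterization, the natural gradient of the log-likelihood reduces to the sufficient statistics minus theta, which is why no explicit Fisher matrix inversion appears in the loop above.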

Related research:

12/24/2018 - SNAS: Stochastic Neural Architecture Search
We propose Stochastic Neural Architecture Search (SNAS), an economical e...

04/16/2020 - Geometry-Aware Gradient Algorithms for Neural Architecture Search
Many recent state-of-the-art methods for neural architecture search (NAS...

07/13/2020 - MS-NAS: Multi-Scale Neural Architecture Search for Medical Image Segmentation
The recent breakthroughs of Neural Architecture Search (NAS) have motiva...

05/07/2019 - Neural Architecture Refinement: A Practical Way for Avoiding Overfitting in NAS
Neural architecture search (NAS) is proposed to automate the architectur...

12/24/2021 - DARTS without a Validation Set: Optimizing the Marginal Likelihood
The success of neural architecture search (NAS) has historically been li...

01/24/2022 - Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search
Neural architecture search (NAS) has gained immense popularity owing to ...

05/11/2023 - Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search
Continuous Ant-based Topology Search (CANTS) is a previously introduced ...
