ASAP: Architecture Search, Anneal and Prune

04/08/2019
by Asaf Noy, et al.

Automatic methods for Neural Architecture Search (NAS) have been shown to produce state-of-the-art network models; yet, their main drawback is the computational complexity of the search process. Since early methods optimized over a discrete search space, thousands of GPU-days were required for convergence. A more recent approach constructs a differentiable search space that enables gradient-based optimization, reducing the search time to a few days. While successful, such methods still include discontinuous steps, e.g., the pruning of many weak connections at once. In this paper, we propose a differentiable search space that allows the annealing of architecture weights while gradually pruning inferior operations, so that the search converges to a single output network in a continuous manner. Experiments on several vision datasets demonstrate the effectiveness of our method with respect to search cost, accuracy, and the memory footprint of the resulting model.
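To make the anneal-and-prune idea concrete, here is a minimal PyTorch sketch in the spirit of the abstract. It is not the paper's implementation: the class name AnnealedMixedOp, the exponential temperature decay factor of 0.9, and the 0.05 pruning threshold are all illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AnnealedMixedOp(nn.Module):
        # A mixture of candidate operations on one edge of a search cell.
        # Architecture weights pass through a temperature-annealed softmax;
        # as the temperature decays the distribution sharpens, and candidates
        # whose weight falls below a threshold are pruned away.
        def __init__(self, ops):
            super().__init__()
            self.ops = nn.ModuleList(ops)
            self.alpha = nn.Parameter(torch.zeros(len(ops)))  # architecture weights
            self.active = list(range(len(ops)))               # not-yet-pruned indices

        def forward(self, x, temperature):
            # Annealed softmax over the surviving candidates only.
            weights = F.softmax(self.alpha[self.active] / temperature, dim=0)
            return sum(w * self.ops[i](x) for w, i in zip(weights, self.active))

        def prune(self, temperature, threshold=0.05):
            # Gradually drop candidates whose annealed weight is below the
            # threshold, always keeping at least the strongest operation.
            if len(self.active) <= 1:
                return
            weights = F.softmax(self.alpha[self.active] / temperature, dim=0)
            keep = [i for w, i in zip(weights, self.active) if w.item() >= threshold]
            self.active = keep or [self.active[int(weights.argmax())]]

    # Toy usage: two candidates (identity vs. 3x3 conv) on a single edge.
    edge = AnnealedMixedOp([nn.Identity(), nn.Conv2d(8, 8, 3, padding=1)])
    opt = torch.optim.Adam(edge.parameters(), lr=1e-3)
    x = torch.randn(4, 8, 16, 16)
    T = 1.0
    for step in range(30):
        loss = edge(x, temperature=T).pow(2).mean()  # stand-in search loss
        opt.zero_grad()
        loss.backward()
        opt.step()
        T *= 0.9      # exponential annealing schedule (assumed, not the paper's)
        edge.prune(T)
    print("surviving ops:", edge.active)

The property the abstract emphasizes is that pruning happens gradually as the annealed softmax sharpens, rather than in one discrete cut at the end of the search; the fixed decay rate and threshold above are just placeholders for whatever schedule the paper derives.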


Related research

05/30/2019 · Differentiable Neural Architecture Search via Proximal Iterations
Neural architecture search (NAS) recently attracts much research attenti...

12/03/2019 · EDAS: Efficient and Differentiable Architecture Search
Transferrable neural architecture search can be viewed as a binary optim...

04/07/2021 · Partially-Connected Differentiable Architecture Search for Deepfake and Spoofing Detection
This paper reports the first successful application of a differentiable ...

08/18/2020 · NASE: Learning Knowledge Graph Embedding for Link Prediction via Neural Architecture Search
Link prediction is the task of predicting missing connections between en...

09/01/2023 · ICDARTS: Improving the Stability and Performance of Cyclic DARTS
This work introduces improvements to the stability and generalizability ...

03/30/2021 · Differentiable Network Adaption with Elastic Search Space
In this paper we propose a novel network adaption method called Differen...

02/03/2022 · Learning strides in convolutional neural networks
Convolutional neural networks typically contain several downsampling ope...
