Efficient Search of Multiple Neural Architectures with Different Complexities via Importance Sampling

07/21/2022
by   Yuhei Noda, et al.

Neural architecture search (NAS) aims to automate architecture design and improve the performance of deep neural networks. Platform-aware NAS methods consider both performance and complexity and can find well-performing architectures under limited computational resources. Whereas ordinary NAS methods incur tremendous computational costs owing to repeated model training, one-shot NAS, which trains the weights of a supernetwork containing all candidate architectures only once during the search, has been reported to lower the search cost. This study focuses on architecture-complexity-aware one-shot NAS, which optimizes an objective function composed of a weighted sum of two metrics, such as predictive performance and the number of parameters. In existing methods, the architecture search must be run multiple times with different coefficients of the weighted sum to obtain multiple architectures with different complexities. This study aims to reduce the search cost of finding such multiple architectures. The proposed method uses multiple distributions to generate architectures with different complexities and updates each distribution using samples drawn from all distributions via importance sampling. This allows multiple architectures with different complexities to be obtained in a single architecture search, thereby reducing the search cost. The proposed method is applied to the architecture search of convolutional neural networks on the CIFAR-10 and ImageNet datasets. Compared with baseline methods, it finds multiple architectures with varying complexities while requiring less computational effort.
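The sampling-and-update scheme described above can be sketched in pure Python. The following is a minimal toy illustration, not the authors' implementation: the search space (three layers, each choosing a channel width), the accuracy proxy, the lambda values, and all hyperparameters are invented for the example. K factorized categorical distributions, each paired with a different complexity coefficient, share one pool of sampled architectures; each distribution is updated with a REINFORCE-style gradient whose samples are reweighted by the importance ratio p_k(a)/q(a) against the mixture proposal q, so every sample contributes to every distribution's update.

```python
import math
import random

random.seed(0)

# Toy search space: 3 layers, each choosing a channel width (hypothetical).
OPTIONS = [16, 32, 64]
N_LAYERS = 3
# One complexity coefficient per distribution (values are illustrative).
LAMBDAS = [0.0, 4.0, 16.0]
K = len(LAMBDAS)
MAX_PARAMS = N_LAYERS * max(OPTIONS)

def softmax(row):
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def arch_prob(logits_k, arch):
    """Probability of an architecture under one factorized categorical model."""
    p = 1.0
    for layer, choice in enumerate(arch):
        p *= softmax(logits_k[layer])[choice]
    return p

def params(arch):
    return sum(OPTIONS[c] for c in arch)

def score(arch):
    # Stand-in for supernetwork validation accuracy: diminishing returns
    # in total width (purely illustrative, no model is trained here).
    return math.log(1.0 + params(arch))

def objective(arch, lam):
    # Weighted sum of performance and normalized complexity.
    return score(arch) - lam * params(arch) / MAX_PARAMS

# One logit table (layers x options) per distribution.
logits = [[[0.0] * len(OPTIONS) for _ in range(N_LAYERS)] for _ in range(K)]

LR = 0.1
for step in range(300):
    # Draw one architecture from each distribution; the pool is shared.
    pool = []
    for k in range(K):
        arch = tuple(
            random.choices(range(len(OPTIONS)), weights=softmax(logits[k][l]))[0]
            for l in range(N_LAYERS)
        )
        pool.append(arch)

    for k in range(K):
        for arch in pool:
            # Importance weight: target p_k over the mixture proposal q.
            q = sum(arch_prob(logits[j], arch) for j in range(K)) / K
            w = arch_prob(logits[k], arch) / q
            f = objective(arch, LAMBDAS[k])
            # REINFORCE-style update of each categorical, scaled by w.
            for layer, choice in enumerate(arch):
                probs = softmax(logits[k][layer])
                for o in range(len(OPTIONS)):
                    grad = (1.0 if o == choice else 0.0) - probs[o]
                    logits[k][layer][o] += LR * w * f * grad / len(pool)

# Most probable architecture under each distribution: one search run
# yields one architecture per complexity level.
best = [
    tuple(logits[k][l].index(max(logits[k][l])) for l in range(N_LAYERS))
    for k in range(K)
]
```

With small lambda the objective rewards the wider (higher-capacity) choices, while a large lambda makes the parameter penalty dominate, so the K distributions drift toward architectures of different sizes within a single search.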


Related research:

- ShiftNAS: Improving One-shot NAS via Probability Shift (07/17/2023)
- Variational Depth Search in ResNets (02/06/2020)
- Multi-objective Architecture Search for CNNs (04/24/2018)
- AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment (12/11/2020)
- EPE-NAS: Efficient Performance Estimation Without Training for Neural Architecture Search (02/16/2021)
- Understanding and Improving One-shot Neural Architecture Optimization (09/24/2019)
- Depth-Wise Neural Architecture Search (04/23/2020)
