Efficient Neural Architecture Search: A Broad Version

01/18/2020
by   Zixiang Ding, et al.

Efficient Neural Architecture Search (ENAS) achieves remarkable efficiency in learning high-performance architectures via parameter sharing, but it suffers from slow forward and backward propagation through its deep search model. In this paper, we propose a Broad version of ENAS (BENAS) that addresses this issue by learning a broad architecture with fast propagation speed, using the reinforcement learning and parameter sharing of ENAS, thereby achieving higher search efficiency. In particular, we elaborately design the Broad Convolutional Neural Network (BCNN), the search paradigm of BENAS, which obtains satisfactory performance with a broad topology, i.e., fast forward and backward propagation. The proposed BCNN extracts multi-scale features and enhancement representations and feeds them into a global average pooling layer to yield more reasonable and comprehensive representations, so that the performance of a BCNN with shallow topology can be guaranteed. To verify the effectiveness of BENAS, we perform several experiments. The results show that 1) BENAS delivers a search cost of 0.23 days, about 2x less expensive than ENAS; 2) small-size BCNNs built from the architecture learned by BENAS, with 0.5 and 1.1 million parameters, obtain state-of-the-art performance; and 3) the learned architecture-based BCNN achieves 25.3% top-1 error on ImageNet using just 3.9 million parameters.
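To make the "broad topology" idea concrete, below is a minimal PyTorch sketch of a shallow network in the spirit of BCNN: a few convolution blocks produce multi-scale feature maps, enhancement blocks refine them, and every branch is globally average-pooled and concatenated before the classifier. All class names, layer counts, and widths here are hypothetical placeholders; the paper's actual cells are discovered by the BENAS controller, not fixed as below.

```python
# A minimal sketch of a broad (shallow) CNN, assuming simple conv blocks in
# place of the learned cells. Names and sizes are illustrative, not the
# paper's exact architecture.
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """Conv -> BN -> ReLU block standing in for a searched cell."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.op = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.op(x)

class BroadCNNSketch(nn.Module):
    """
    Broad topology: two feature blocks yield multi-scale feature maps,
    two enhancement blocks refine them, and the global-average-pooled
    outputs of all branches are concatenated for the classifier.
    """
    def __init__(self, num_classes=10, width=64):
        super().__init__()
        self.stem = ConvBlock(3, width)
        # Multi-scale feature blocks (each halves spatial resolution).
        self.feat1 = ConvBlock(width, width, stride=2)
        self.feat2 = ConvBlock(width, width * 2, stride=2)
        # Enhancement blocks operating on the feature outputs.
        self.enh1 = ConvBlock(width, width)
        self.enh2 = ConvBlock(width * 2, width * 2)
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(6 * width, num_classes)

    def forward(self, x):
        x = self.stem(x)
        f1 = self.feat1(x)
        f2 = self.feat2(f1)
        e1 = self.enh1(f1)
        e2 = self.enh2(f2)
        # Concatenate the pooled output of every branch into one
        # comprehensive representation.
        pooled = [self.gap(t).flatten(1) for t in (f1, f2, e1, e2)]
        return self.classifier(torch.cat(pooled, dim=1))

# Usage: logits = BroadCNNSketch()(torch.randn(2, 3, 32, 32))
```

Because depth stays small and every branch contributes directly to the pooled representation, forward and backward passes through such a network are short, which is the property BENAS exploits to speed up the search.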


