Pseudo-Inverted Bottleneck Convolution for DARTS Search Space

12/31/2022
by Arash Ahmadian, et al.

Differentiable Architecture Search (DARTS) has attracted considerable attention as a gradient-based Neural Architecture Search (NAS) method. Since the introduction of DARTS, however, little work has been done on adapting its search space to state-of-the-art architecture design principles for CNNs. In this work, we aim to address this gap by incrementally augmenting the DARTS search space with micro-design changes inspired by ConvNeXt and studying the trade-off between accuracy, evaluation layer count, and computational cost. To this end, we introduce the Pseudo-Inverted Bottleneck conv block, intended to reduce the computational footprint of the inverted bottleneck block proposed in ConvNeXt. Our proposed architecture is much less sensitive to evaluation layer count and significantly outperforms a DARTS network of similar size, at layer counts as small as 2. Furthermore, with fewer layers, it not only achieves higher accuracy with lower GMACs and parameter count; GradCAM comparisons also show that our network detects distinctive features of target objects better than DARTS.
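The ConvNeXt inverted bottleneck block referenced above pairs a depthwise convolution with two pointwise (1x1) projections that expand and then reduce the channel dimension. As a rough, back-of-the-envelope illustration of where that block's computational footprint comes from, the sketch below counts MACs (multiply-accumulate operations) per layer; the function names and example shape are ours, and the paper's Pseudo-Inverted variant is not reproduced here.

```python
# MAC counts for a ConvNeXt-style inverted bottleneck: a depthwise k x k
# conv over C channels, a 1x1 expansion (C -> expand*C), and a 1x1
# reduction (expand*C -> C). This is an illustrative sketch only; the
# Pseudo-Inverted Bottleneck's exact layout is not specified here.

def depthwise_macs(c: int, h: int, w: int, k: int) -> int:
    """Depthwise k x k conv over c channels on an h x w feature map."""
    return h * w * c * k * k

def pointwise_macs(c_in: int, c_out: int, h: int, w: int) -> int:
    """1x1 (pointwise) conv from c_in to c_out channels."""
    return h * w * c_in * c_out

def inverted_bottleneck_macs(c: int, h: int, w: int,
                             k: int = 7, expand: int = 4) -> int:
    """Depthwise conv + 1x1 expand + 1x1 reduce (ConvNeXt-style block)."""
    return (depthwise_macs(c, h, w, k)
            + pointwise_macs(c, expand * c, h, w)
            + pointwise_macs(expand * c, c, h, w))

# For a typical early-stage feature map (hypothetical shape), the two
# pointwise projections dominate the depthwise conv:
c, h, w = 96, 56, 56
print(depthwise_macs(c, h, w, 7))         # 14,751,744
print(pointwise_macs(c, 4 * c, h, w))     # 115,605,504
print(inverted_bottleneck_macs(c, h, w))  # 245,962,752
```

Since the pointwise layers account for most of the MACs, a block that reduces their cost is the natural place to shrink the footprint, which is the trade-off the abstract describes.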

