DropNAS: Grouped Operation Dropout for Differentiable Architecture Search

01/27/2022
by Weijun Hong, et al.

Neural architecture search (NAS) has shown encouraging results in automating architecture design. Recently, DARTS relaxed the search process with a differentiable formulation that leverages weight sharing and SGD, in which all candidate operations are trained simultaneously. Our empirical results show that this procedure leads to a co-adaptation problem and the Matthew Effect: operations with fewer parameters mature earlier in training. This causes two problems. First, the operations with more parameters may never get the chance to express the desired function, since those with fewer have already done the job. Second, the system punishes the underperforming operations by lowering their architecture parameters, so they receive smaller loss gradients, which in turn causes the Matthew Effect. In this paper, we systematically study these problems and propose a novel grouped operation dropout algorithm, named DropNAS, to fix them in DARTS. Extensive experiments demonstrate that DropNAS resolves the above issues and achieves promising performance: specifically, it reaches 2.26% test error on CIFAR-10, competitive error on CIFAR-100, and 23.4% top-1 error on ImageNet (with the same training hyperparameters as DARTS for a fair comparison). We also observe that DropNAS is robust across variants of the DARTS search space. Code is available at https://github.com/wiljohnhong/DropNAS.
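To make grouped operation dropout concrete, here is a minimal sketch of one DARTS-style mixed edge, written as an illustration under assumptions rather than the authors' implementation (see the linked repository for that). It assumes the candidates on an edge are split into a parameterized group (e.g. convolutions) and a non-parameterized group (e.g. pooling, identity); during each training step, every operation is dropped independently with probability drop_p, but at least one operation per group is always kept, so the cheap operations cannot monopolize training. The class name, candidate sets, and drop_p are placeholders, and details such as drop-rate scheduling and output rescaling are omitted.

import torch
import torch.nn as nn

class GroupedDropoutMixedOp(nn.Module):
    """Illustrative DARTS mixed op with grouped operation dropout (hypothetical sketch)."""

    def __init__(self, param_ops, nonparam_ops, drop_p=0.5):
        super().__init__()
        self.param_ops = nn.ModuleList(param_ops)
        self.nonparam_ops = nn.ModuleList(nonparam_ops)
        # One architecture parameter per candidate operation, as in DARTS.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(param_ops) + len(nonparam_ops)))
        self.drop_p = drop_p

    def _group_mask(self, size):
        # Drop each operation in the group with probability drop_p,
        # but always keep at least one so the group stays trainable.
        mask = (torch.rand(size) > self.drop_p).float()
        if mask.sum() == 0:
            mask[torch.randint(size, (1,))] = 1.0
        return mask

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        ops = list(self.param_ops) + list(self.nonparam_ops)
        if self.training:
            # Independent masks per group: neither group can be dropped entirely.
            mask = torch.cat([
                self._group_mask(len(self.param_ops)),
                self._group_mask(len(self.nonparam_ops)),
            ]).to(x.device)
        else:
            mask = torch.ones(len(ops), device=x.device)
        # Weighted sum over the surviving candidates; dropped ops are skipped.
        out = 0
        for w, m, op in zip(weights, mask, ops):
            if m > 0:
                out = out + w * m * op(x)
        return out

A hypothetical usage for one edge with 16 channels:

C = 16
edge = GroupedDropoutMixedOp(
    param_ops=[nn.Conv2d(C, C, 3, padding=1), nn.Conv2d(C, C, 5, padding=2)],
    nonparam_ops=[nn.MaxPool2d(3, stride=1, padding=1), nn.Identity()],
)
y = edge(torch.randn(2, C, 8, 8))  # the surviving subset varies per forward pass in training mode

Because both groups always retain at least one live operation, the parameterized candidates keep receiving gradient signal even when the cheap operations already fit the target function, which is exactly the failure mode the abstract describes.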

Related research

02/11/2023

Operation-level Progressive Differentiable Architecture Search

Differentiable Neural Architecture Search (DARTS) is becoming more and m...

08/12/2020

TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search

With the flourish of differentiable neural architecture search (NAS), au...

08/01/2022

Partial Connection Based on Channel Attention for Differentiable Neural Architecture Search

Differentiable neural architecture search (DARTS), as a gradient-guided ...

06/20/2022

Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search

In this paper, we propose a Shapley value based method to evaluate opera...

11/27/2019

Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search

Differentiable Architecture Search (DARTS) is now a widely disseminated we...

02/12/2020

Stabilizing Differentiable Architecture Search via Perturbation-based Regularization

Differentiable architecture search (DARTS) is a prevailing NAS solution ...

04/01/2021

EfficientNetV2: Smaller Models and Faster Training

This paper introduces EfficientNetV2, a new family of convolutional netw...
