Rethinking Architecture Selection in Differentiable NAS

08/10/2021
by Ruochen Wang et al.

Differentiable Neural Architecture Search is one of the most popular Neural Architecture Search (NAS) methods owing to its search efficiency and simplicity, accomplished by jointly optimizing the model weights and architecture parameters in a weight-sharing supernet via gradient-based algorithms. At the end of the search phase, the operations with the largest architecture parameters are selected to form the final architecture, under the implicit assumption that the values of the architecture parameters reflect the operations' strength. While much has been discussed about the supernet's optimization, the architecture selection process has received little attention. We provide empirical and theoretical analysis showing that the magnitude of an architecture parameter does not necessarily indicate how much the operation contributes to the supernet's performance. We propose an alternative, perturbation-based architecture selection that directly measures each operation's influence on the supernet. We re-evaluate several differentiable NAS methods with the proposed architecture selection and find that it consistently extracts significantly improved architectures from the underlying supernets. Furthermore, we find that several failure modes of DARTS are greatly alleviated by the proposed selection method, indicating that much of the poor generalization observed in DARTS can be attributed to the failure of magnitude-based architecture selection rather than solely to the optimization of its supernet.
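The core idea of perturbation-based selection can be sketched as follows: instead of reading off the largest architecture parameter, remove (mask) each candidate operation on an edge in turn, re-measure the supernet's validation accuracy, and keep the operation whose removal hurts accuracy the most. The sketch below is a minimal, hypothetical illustration; `evaluate` stands in for an actual supernet validation pass, and the operation names and contribution values are invented for the example.

```python
def select_op(edge_ops, evaluate):
    """Pick the operation on one edge by perturbation.

    edge_ops: list of candidate operation names on the edge.
    evaluate(masked): hypothetical callable returning the supernet's
        validation accuracy with the ops in `masked` disabled.
    """
    base_acc = evaluate(set())  # accuracy of the unperturbed supernet
    # Accuracy drop caused by removing each operation individually
    drops = {op: base_acc - evaluate({op}) for op in edge_ops}
    # The op whose removal degrades accuracy the most contributes the most
    return max(drops, key=drops.get)


# Toy evaluator: 'sep_conv_3x3' contributes most to accuracy, so it is
# selected even if (hypothetically) its architecture weight were small.
contrib = {'skip_connect': 0.01, 'sep_conv_3x3': 0.05, 'max_pool_3x3': 0.02}
acc = lambda masked: 0.90 - sum(contrib[o] for o in masked)
print(select_op(list(contrib), acc))  # → sep_conv_3x3
```

In the paper's setting this evaluation would be done on the trained supernet (optionally with a few fine-tuning steps after each perturbation), one edge at a time; the toy evaluator here only mimics that signal.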


Related research

- 10/14/2022: Λ-DARTS: Mitigating Performance Collapse by Harmonizing Operation Selection among Cells
- 12/24/2021: DARTS without a Validation Set: Optimizing the Marginal Likelihood
- 09/28/2021: Delve into the Performance Degradation of Differentiable Architecture Search
- 09/02/2020: Understanding the wiring evolution in differentiable neural architecture search
- 06/12/2023: Small Temperature is All You Need for Differentiable Architecture Search
- 11/18/2020: Explicitly Learning Topology for Differentiable Neural Architecture Search
- 05/13/2021: Neighborhood-Aware Neural Architecture Search
