Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search

06/20/2022
by Han Xiao, et al.

In this paper, we propose a Shapley-value-based method, Shapley-NAS, to evaluate operation contribution for neural architecture search. Differentiable architecture search (DARTS) acquires optimal architectures by optimizing the architecture parameters with gradient descent, which significantly reduces the search cost. However, the magnitudes of the architecture parameters updated by gradient descent fail to reveal the actual importance of operations to task performance, which harms the effectiveness of the obtained architectures. By contrast, we propose to evaluate the direct influence of operations on validation accuracy. To deal with the complex relationships between supernet components, we leverage the Shapley value to quantify their marginal contributions, considering all possible combinations. Specifically, we iteratively optimize the supernet weights and update the architecture parameters by evaluating operation contributions via Shapley values, so that the optimal architectures are derived by selecting the operations that contribute significantly to the task. Since the exact computation of Shapley values is NP-hard, a Monte-Carlo sampling-based algorithm with early truncation is employed for efficient approximation, and a momentum update mechanism is adopted to alleviate the fluctuation of the sampling process. Extensive experiments on various datasets and search spaces show that Shapley-NAS outperforms state-of-the-art methods by a considerable margin at a light search cost. The code is available at https://github.com/Euphoria16/Shapley-NAS.git
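The abstract appeals to the standard Shapley value definition: for the set N of candidate operations and a value function v(S) giving the validation accuracy of the supernet restricted to a subset S of operations, each operation's contribution is its marginal effect averaged over all possible coalitions:

\phi_i(v) = \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(|N|-|S|-1)!}{|N|!} \bigl[ v(S \cup \{i\}) - v(S) \bigr]

This sum is exponential in |N|, which is why the paper resorts to sampling. The sketch below illustrates a Monte-Carlo estimate with early truncation and a momentum update as described in the abstract; the `evaluate(mask)` callable, thresholds, and function name are assumptions for illustration, not the authors' released implementation (see the linked repository for that).

```python
# A minimal sketch of Monte-Carlo Shapley estimation for supernet operations,
# assuming a hypothetical evaluate(mask) that returns validation accuracy when
# only the operations selected by `mask` are active.
import numpy as np

def shapley_estimate(evaluate, num_ops, num_samples=100,
                     truncation_tol=1e-3, momentum=0.8, phi_prev=None):
    """Estimate per-operation Shapley values on the validation objective."""
    full_acc = evaluate(np.ones(num_ops, dtype=bool))   # v(N): all ops active
    base_acc = evaluate(np.zeros(num_ops, dtype=bool))  # v(empty set)
    phi = np.zeros(num_ops)

    for _ in range(num_samples):
        perm = np.random.permutation(num_ops)  # random coalition order
        mask = np.zeros(num_ops, dtype=bool)
        prev_acc = base_acc
        for op in perm:
            # Early truncation: once the coalition is close to full-supernet
            # accuracy, remaining marginal contributions are treated as ~0.
            if full_acc - prev_acc < truncation_tol:
                break
            mask[op] = True
            cur_acc = evaluate(mask)
            phi[op] += cur_acc - prev_acc  # marginal contribution of `op`
            prev_acc = cur_acc

    phi /= num_samples
    # Momentum update to smooth fluctuation across search iterations.
    if phi_prev is not None:
        phi = momentum * phi_prev + (1.0 - momentum) * phi
    return phi
```

Per the abstract, these estimates, rather than the raw architecture-parameter magnitudes, drive the architecture-parameter updates; the discrete architecture is then obtained by selecting the operations with the largest estimated contributions.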


Related research

Differentiable Neural Architecture Search via Proximal Iterations (05/30/2019)
Neural architecture search (NAS) recently attracts much research attenti...

PA&DA: Jointly Sampling PAth and DAta for Consistent NAS (02/28/2023)
Based on the weight-sharing mechanism, one-shot NAS methods train a supe...

Approximate Neural Architecture Search via Operation Distribution Learning (11/08/2021)
The standard paradigm in Neural Architecture Search (NAS) is to search f...

ZARTS: On Zero-order Optimization for Neural Architecture Search (10/10/2021)
Differentiable architecture search (DARTS) has been a popular one-shot p...

Prioritized Architecture Sampling with Monto-Carlo Tree Search (03/22/2021)
One-shot neural architecture search (NAS) methods significantly reduce t...

Stabilizing Differentiable Architecture Search via Perturbation-based Regularization (02/12/2020)
Differentiable architecture search (DARTS) is a prevailing NAS solution ...

DropNAS: Grouped Operation Dropout for Differentiable Architecture Search (01/27/2022)
Neural architecture search (NAS) has shown encouraging results in automa...
