NAS-LID: Efficient Neural Architecture Search with Local Intrinsic Dimension

11/23/2022
by   Xin He, et al.

One-shot neural architecture search (NAS) substantially improves search efficiency by training one supernet to estimate the performance of every possible child architecture (i.e., subnet). However, the inconsistency of characteristics among subnets incurs serious interference in the optimization, resulting in poor ranking correlation among subnets. Subsequent explorations decompose supernet weights via a particular criterion, e.g., gradient matching, to reduce the interference; yet they suffer from high computational cost and low space separability. In this work, we propose NAS-LID, a lightweight and effective method based on local intrinsic dimension (LID). NAS-LID evaluates the geometrical properties of architectures by calculating low-cost LID features layer by layer, and the similarity characterized by LID enjoys better separability than gradient-based similarity, which thus effectively reduces the interference among subnets. Extensive experiments on NASBench-201 indicate that NAS-LID achieves superior performance with better efficiency. Specifically, compared to the gradient-driven method, NAS-LID can save up to 86% of the GPU memory overhead when searching on NASBench-201. We also demonstrate the effectiveness of NAS-LID on the ProxylessNAS and OFA spaces. Source code: https://github.com/marsggbo/NAS-LID.
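The abstract mentions computing low-cost LID features layer by layer. As a rough illustration, here is a minimal NumPy sketch of the maximum-likelihood LID estimator (Levina-Bickel style) that is commonly used in this line of work, applied to a batch of layer activations; this is an assumption about the general technique, not necessarily the paper's exact implementation:

```python
import numpy as np

def lid_mle(activations, k=20):
    """MLE-style LID estimate for each row of `activations`
    (a batch of flattened layer outputs), using the k nearest
    neighbours within the same batch:
        LID(x) ~ -( (1/k) * sum_i log(r_i / r_k) )^{-1}
    where r_1 <= ... <= r_k are distances to the k nearest neighbours."""
    X = np.asarray(activations, dtype=np.float64)
    # Pairwise Euclidean distances within the batch.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    n = d.shape[0]
    d[np.arange(n), np.arange(n)] = np.inf  # exclude self-distance
    knn = np.sort(d, axis=1)[:, :k]         # k smallest distances per point
    r_max = knn[:, -1:]                     # distance to the k-th neighbour
    eps = 1e-12                             # guard against log(0)
    return -1.0 / np.mean(np.log(knn / r_max + eps), axis=1)

# A layer's LID feature could then be summarized as, e.g., the batch mean:
# layer_lid = lid_mle(layer_output.reshape(batch_size, -1)).mean()
```

With per-layer LID vectors in hand, architecture similarity can be measured by comparing these vectors (e.g., cosine similarity), which is far cheaper than comparing full gradients.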


research
06/13/2022

Improve Ranking Correlation of Super-net through Training Scheme from One-shot NAS to Few-shot NAS

The algorithms of one-shot neural architecture search(NAS) have been wid...
research
08/29/2021

Analyzing and Mitigating Interference in Neural Architecture Search

Weight sharing has become the de facto approach to reduce the training c...
research
07/28/2023

Shrink-Perturb Improves Architecture Mixing during Population Based Training for Neural Architecture Search

In this work, we show that simultaneously training and mixing neural net...
research
12/11/2020

AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment

Neural architecture search (NAS) is an approach for automatically design...
research
08/22/2021

Pi-NAS: Improving Neural Architecture Search by Reducing Supernet Training Consistency Shift

Recently proposed neural architecture search (NAS) methods co-train bill...
research
03/29/2022

Generalizing Few-Shot NAS with Gradient Matching

Efficient performance estimation of architectures drawn from large searc...
research
01/24/2023

RD-NAS: Enhancing One-shot Supernet Ranking Ability via Ranking Distillation from Zero-cost Proxies

Neural architecture search (NAS) has made tremendous progress in the aut...
