LISSNAS: Locality-based Iterative Search Space Shrinkage for Neural Architecture Search

07/06/2023
by   Bhavna Gopal, et al.

Search spaces are a hallmark of the advancement of Neural Architecture Search (NAS). Large, complex search spaces with versatile building operators and structures provide more opportunities to discover promising architectures, yet pose severe challenges for efficient exploration and exploitation. Consequently, several search-space shrinkage methods optimize by selecting a single sub-region that contains some well-performing networks. These methods yield modest performance and efficiency gains, but they leave substantial room for improved search performance and are ineffective at retaining architectural diversity. We propose LISSNAS, an automated algorithm that shrinks a large space into a small, diverse search space with SOTA search performance. Our approach leverages locality, the relationship between structural similarity and performance similarity, to efficiently extract many pockets of well-performing networks. We showcase our method on an array of search spaces spanning various sizes and datasets. We demonstrate the effectiveness of our shrunk spaces when used in one-shot search by achieving the best Top-1 accuracy in two different search spaces. Our method achieves a SOTA Top-1 accuracy of 77.6% on ImageNet under mobile constraints, along with best-in-class Kendall-Tau correlation, architectural diversity, and search-space size.
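To make the locality idea concrete, here is a minimal sketch (not the paper's actual algorithm; the encoding, the Hamming distance metric, the seed count, and the neighborhood radius are all illustrative assumptions). Architectures are encoded as operator tuples, structural similarity is Hamming distance, and the shrunk space keeps the neighborhoods around several well-performing "seed" architectures rather than a single sub-region, preserving multiple diverse pockets:

```python
# Hypothetical illustration of locality-based search-space shrinkage.
# Assumptions (not from the paper): architectures are fixed-length operator
# tuples, structural distance is Hamming distance, and we already have a
# (proxy) accuracy estimate for every architecture.

def hamming(a, b):
    """Structural distance: number of differing operator slots."""
    return sum(x != y for x, y in zip(a, b))

def shrink(space, accuracy, num_seeds=3, radius=1):
    """Keep architectures within `radius` edits of any top-`num_seeds` seed.

    Using several seeds (instead of one best sub-region) retains multiple
    pockets of strong networks, which is the diversity argument above.
    """
    seeds = sorted(space, key=accuracy.get, reverse=True)[:num_seeds]
    return {arch for arch in space
            if any(hamming(arch, s) <= radius for s in seeds)}

# Toy space: 3-slot cells with operator choices 0/1/2, synthetic accuracies.
space = [(i, j, k) for i in range(3) for j in range(3) for k in range(3)]
accuracy = {arch: sum(arch) / 6 for arch in space}  # fake proxy metric
small = shrink(space, accuracy)
print(len(space), "->", len(small))  # the shrunk space is much smaller
```

In this toy setup the 27-architecture space collapses to the union of small neighborhoods around the three highest-scoring encodings; the locality assumption is that networks structurally close to a strong seed tend to perform similarly.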


Related research

- A Novel Evolutionary Algorithm for Hierarchical Neural Architecture Search (07/18/2021). In this work, we propose a novel evolutionary algorithm for neural archi...
- AutoSpace: Neural Architecture Search with Less Human Interference (03/22/2021). Current neural architecture search (NAS) algorithms still require expert...
- Redefining Neural Architecture Search of Heterogeneous Multi-Network Models by Characterizing Variation Operators and Model Components (06/16/2021). With neural architecture search methods gaining ground on manually desig...
- ImmuNeCS: Neural Committee Search by an Artificial Immune System (11/18/2019). Current Neural Architecture Search techniques can suffer from a few shor...
- Stabilizing DARTS with Amended Gradient Estimation on Architectural Parameters (10/25/2019). Differentiable neural architecture search has been a popular methodology...
- Using Supervised Deep-Learning to Model Edge-FBG Shape Sensors (10/28/2022). Continuum robots in robot-assisted minimally invasive surgeries provide ...
- Distilling Optimal Neural Networks: Rapid Search in Diverse Spaces (12/16/2020). This work presents DONNA (Distilling Optimal Neural Network Architecture...
