Towards Less Constrained Macro-Neural Architecture Search

03/10/2022
by Vasco Lopes et al.

Networks found with Neural Architecture Search (NAS) achieve state-of-the-art performance in a variety of tasks, outperforming human-designed networks. However, most NAS methods rely heavily on human-defined assumptions that constrain the search: architectures' outer skeletons, number of layers, parameter heuristics and search spaces. Additionally, common search spaces consist of repeatable modules (cells) instead of fully exploring the architecture search space by designing entire architectures (macro-search). Imposing such constraints requires deep human expertise and restricts the search to pre-defined settings. In this paper, we propose LCMNAS, a method that pushes NAS to less constrained search spaces by performing macro-search without relying on pre-defined heuristics or bounded search spaces. LCMNAS introduces three components for the NAS pipeline: i) a method that leverages information about well-known architectures to autonomously generate complex search spaces based on Weighted Directed Graphs with hidden properties, ii) an evolutionary search strategy that generates complete architectures from scratch, and iii) a mixed-performance estimation approach that combines information about architectures at the initialization stage with lower-fidelity estimates to infer their trainability and capacity to model complex functions. We present experiments showing that LCMNAS generates state-of-the-art architectures from scratch with minimal GPU computation. We also study the importance of different NAS components in a macro-search setting. Code for reproducibility is publicly available at <https://github.com/VascoLopes/LCMNAS>.
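
The abstract only sketches these three components. As a rough, hedged illustration of the first one, the Python snippet below (hypothetical helper names, not the authors' code) builds a weighted directed graph of layer-type transitions from a few known architectures and then samples a complete macro-architecture from scratch with a weighted random walk; the paper's "hidden properties" (e.g., per-layer hyperparameters), the evolutionary search, and the mixed-performance estimation are omitted.

```python
import random
from collections import defaultdict

def build_weighted_digraph(architectures):
    """Count layer-type transitions across known architectures to form a
    weighted directed graph (edge weight = transition frequency)."""
    graph = defaultdict(lambda: defaultdict(int))
    for layers in architectures:  # each architecture is a list of layer-type names
        for src, dst in zip(layers, layers[1:]):
            graph[src][dst] += 1
    return graph

def sample_macro_architecture(graph, start="input", end="output", max_depth=20):
    """Sample an entire architecture (no repeated cells) by walking the graph,
    choosing each next layer type proportionally to its edge weight."""
    arch, node = [], start
    for _ in range(max_depth):
        successors = graph.get(node)
        if not successors:
            break
        node = random.choices(list(successors), weights=list(successors.values()))[0]
        if node == end:
            break
        arch.append(node)
    return arch

# Toy usage with two hand-written "known" architectures.
known = [
    ["input", "conv3x3", "batchnorm", "relu", "conv3x3", "pool", "fc", "output"],
    ["input", "conv5x5", "relu", "pool", "conv3x3", "relu", "fc", "output"],
]
graph = build_weighted_digraph(known)
print(sample_macro_architecture(graph))  # e.g. ['conv3x3', 'batchnorm', 'relu', ...]
```

In LCMNAS the sampled architectures would then be refined by the evolutionary search and ranked with a mix of initialization-time statistics and short, lower-fidelity training runs; those steps are not reproduced here.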


Related research

HMCNAS: Neural Architecture Search using Hidden Markov Chains and Bayesian Optimization (07/31/2020)
BLOX: Macro Neural Architecture Search Benchmark and Algorithms (10/13/2022)
Transfer NAS: Knowledge Transfer between Search Spaces with Transformer Agents (06/19/2019)
Towards Discovering Neural Architectures from Scratch (11/03/2022)
ModuleNet: Knowledge-inherited Neural Architecture Search (04/10/2020)
NAS evaluation is frustratingly hard (12/28/2019)
On Redundancy and Diversity in Cell-based Neural Architecture Search (03/16/2022)
