
Towards Less Constrained Macro-Neural Architecture Search

by   Vasco Lopes, et al.
Universidade da Beira Interior

Networks found with Neural Architecture Search (NAS) achieve state-of-the-art performance in a variety of tasks, outperforming human-designed networks. However, most NAS methods rely heavily on human-defined assumptions that constrain the search: architecture outer-skeletons, number of layers, parameter heuristics, and search spaces. Additionally, common search spaces consist of repeatable modules (cells) instead of fully exploring the architecture search space by designing entire architectures (macro-search). Imposing such constraints requires deep human expertise and restricts the search to pre-defined settings. In this paper, we propose LCMNAS, a method that pushes NAS toward less constrained search spaces by performing macro-search without relying on pre-defined heuristics or bounded search spaces. LCMNAS introduces three components to the NAS pipeline: i) a method that leverages information about well-known architectures to autonomously generate complex search spaces based on Weighted Directed Graphs with hidden properties, ii) an evolutionary search strategy that generates complete architectures from scratch, and iii) a mixed-performance estimation approach that combines information about architectures at initialization with lower-fidelity estimates to infer their trainability and capacity to model complex functions. We present experiments showing that LCMNAS generates state-of-the-art architectures from scratch with minimal GPU computation. We also study the importance of different NAS components in a macro-search setting. Code for reproducibility is public at <>.
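To make the first component concrete, the following is a minimal sketch of the general idea, not the paper's implementation: layer-type transitions observed in known architectures are counted into a weighted directed graph, and a new macro-architecture is generated by a weighted random walk over that graph. All function and layer names here are illustrative assumptions.

```python
import random

def build_transition_graph(architectures):
    """Count layer-type transitions across example architectures.

    `architectures` is a list of layer-type sequences, e.g.
    [["conv", "bn", "relu", "pool"], ...] (hypothetical layer names).
    Returns a weighted directed graph as {layer: {next_layer: count}}.
    """
    graph = {}
    for layers in architectures:
        for a, b in zip(layers, layers[1:]):
            graph.setdefault(a, {})
            graph[a][b] = graph[a].get(b, 0) + 1
    return graph

def sample_architecture(graph, start="conv", max_depth=8, seed=None):
    """Generate a layer sequence by a weighted random walk over the graph."""
    rng = random.Random(seed)
    layers = [start]
    current = start
    for _ in range(max_depth - 1):
        successors = graph.get(current)
        if not successors:  # dead end: no observed transition from this layer
            break
        choices, weights = zip(*successors.items())
        current = rng.choices(choices, weights=weights)[0]
        layers.append(current)
    return layers

# Toy "well-known architectures" used to seed the search space.
known = [
    ["conv", "bn", "relu", "conv", "bn", "relu", "pool"],
    ["conv", "relu", "pool", "conv", "relu", "pool"],
]
g = build_transition_graph(known)
print(sample_architecture(g, seed=0))
```

In LCMNAS the graphs additionally carry hidden properties (e.g. per-layer parameter settings), so this sketch only captures the skeleton-generation aspect of the search-space construction.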

