NASGEM: Neural Architecture Search via Graph Embedding Method

07/08/2020
by   Hsin-Pai Cheng, et al.

Neural Architecture Search (NAS) automates and accelerates the design of neural networks. Recent studies show that mapping the discrete neural architecture search space into a continuous space that is more compact, more representative, and easier to optimize can significantly reduce the exploration cost. However, existing differentiable methods cannot preserve the graph information when projecting a neural architecture into a continuous space, causing inaccuracy and/or reduced representation capability in the mapped space. Moreover, existing methods can explore only a very limited inner-cell search space, due either to limitations of the cell representation or to poor scalability. To enable quick search of more sophisticated neural architectures while preserving graph information, we propose NASGEM, which stands for Neural Architecture Search via Graph Embedding Method. NASGEM is driven by a novel graph embedding method integrated with similarity estimation to capture the inner-cell information in the discrete space. Thus, NASGEM is able to search a wider space (e.g., 30 nodes in a cell). By precisely estimating the graph distance, NASGEM can efficiently explore a large number of candidate cells, enabling a more flexible cell design while still keeping the search cost low. GEMNet, a set of networks discovered by NASGEM, achieves higher accuracy with fewer parameters (up to 62% fewer) compared to networks crafted by existing differentiable search methods. Our ablation study on NASBench-101 further validates the effectiveness of the proposed graph embedding method, which is complementary to many existing NAS approaches and can be combined with them to achieve better performance.
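To make the idea of graph embedding with similarity estimation concrete, here is a minimal illustrative sketch (not the paper's actual method): each cell's graph is embedded into a fixed-length vector via its Laplacian spectrum, and cosine similarity between embeddings serves as a cheap proxy for graph distance. All function names and the choice of spectral embedding are assumptions for illustration only.

```python
# Illustrative sketch only: embed a cell's graph into a fixed-length vector
# and estimate graph similarity in the embedded space. NASGEM's actual
# embedding and similarity estimation differ; this just shows the principle.
import numpy as np

def spectral_embedding(adj: np.ndarray, dim: int = 8) -> np.ndarray:
    """Embed a graph via its sorted Laplacian eigenvalues, padded to `dim`.

    `adj` is a symmetric adjacency matrix (symmetrize a DAG's matrix first).
    """
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                      # graph Laplacian L = D - A
    eig = np.sort(np.linalg.eigvalsh(lap))
    vec = np.zeros(dim)
    vec[: min(dim, eig.size)] = eig[:dim]
    return vec

def graph_similarity(a1: np.ndarray, a2: np.ndarray) -> float:
    """Cosine similarity between spectral embeddings of two graphs."""
    e1, e2 = spectral_embedding(a1), spectral_embedding(a2)
    denom = np.linalg.norm(e1) * np.linalg.norm(e2)
    return float(e1 @ e2 / denom) if denom else 1.0

# Two toy 3-node cells: a chain vs. the same chain with an extra skip edge.
chain = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
skip  = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
print(graph_similarity(chain, chain))  # identical graphs -> 1.0
print(graph_similarity(chain, skip))   # similar but distinct -> < 1.0
```

A search procedure can use such a similarity score to cluster or rank candidate cells without training each one, which is what keeps exploration of a large candidate pool cheap.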


