DC-NAS: Divide-and-Conquer Neural Architecture Search

05/29/2020
by Yunhe Wang, et al.

Most applications demand high-performance deep neural architectures under tight resource budgets. Neural architecture search (NAS) automatically explores a given, very large search space for optimal deep neural networks. However, all sub-networks are usually evaluated with the same criterion, namely early stopping on a small proportion of the training dataset, which is both inaccurate and computationally expensive. In contrast to conventional methods, we present a divide-and-conquer (DC) approach to search deep neural architectures effectively and efficiently. Given an arbitrary search space, we first extract feature representations of all sub-networks according to changes in the parameters or output features of each layer, and then calculate the similarity between any two sampled networks from these representations. Next, k-means clustering aggregates similar architectures into the same cluster, and sub-network evaluation is executed separately within each cluster. The best architecture from each cluster is then merged to obtain the optimal neural architecture. Experimental results on several benchmarks show that DC-NAS overcomes the inaccurate-evaluation problem, achieving 75.1% top-1 accuracy on the ImageNet dataset, higher than state-of-the-art methods using the same search space.
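The pipeline described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the per-layer feature vectors and the evaluation scores are random stand-ins for the representations and early-stopped accuracies the paper actually computes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each sampled sub-network is summarized by a feature
# vector (in DC-NAS, derived from per-layer parameter/output changes).
num_networks, feat_dim, k = 32, 8, 4
features = rng.normal(size=(num_networks, feat_dim))

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: aggregate similar architecture representations."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each architecture to its nearest cluster center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(features, k)

# Stand-in for sub-network evaluation; the real method would train and
# score candidates here (e.g., early-stopped validation accuracy).
scores = rng.uniform(size=num_networks)

# Divide and conquer: keep the best-scoring architecture in each cluster,
# then merge the per-cluster winners into the final candidate pool.
winners = []
for j in range(k):
    members = np.flatnonzero(labels == j)
    if members.size:
        winners.append(int(members[scores[members].argmax()]))
print(winners)
```

With real representations and evaluation scores, each cluster is scored independently (so architectures are only compared against similar ones), and the merged winners form the pool from which the final architecture is selected.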

Related research

- RNAS: Architecture Ranking for Powerful Networks (09/30/2019). Neural Architecture Search (NAS) is attractive for automatically produci...
- Multi-Objective Neural Architecture Search Based on Diverse Structures and Adaptive Recommendation (07/06/2020). The search space of neural architecture search (NAS) for convolutional n...
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search (03/23/2021). A myriad of recent breakthroughs in hand-crafted neural architectures fo...
- AutoGrow: Automatic Layer Growing in Deep Convolutional Networks (06/07/2019). We propose AutoGrow to automate depth discovery in Deep Neural Networks ...
- A Semi-Supervised Assessor of Neural Architectures (05/14/2020). Neural architecture search (NAS) aims to automatically design deep neura...
- Divide-and-Conquer the NAS puzzle in Resource Constrained Federated Learning Systems (05/11/2023). Federated Learning (FL) is a privacy-preserving distributed machine lear...
- Neural Architecture Searching for Facial Attributes-based Depression Recognition (01/24/2022). Recent studies show that depression can be partially reflected from huma...
