UENAS: A Unified Evolution-based NAS Framework

03/08/2022
by   Zimian Wei, et al.

Neural architecture search (NAS) has gained significant attention for automatic network design in recent years. Previous NAS methods suffer from limited search spaces, which may lead to sub-optimal results. In this paper, we propose UENAS, an evolution-based NAS framework with a broader search space that supports optimizing network architectures, pruning strategies, and hyperparameters simultaneously. To alleviate the huge search cost caused by the expanded search space, three strategies are adopted. First, an adaptive pruning strategy iteratively trims the average model size in the population without compromising performance. Second, child networks share the weights of overlapping layers with pre-trained parent networks, which reduces the number of training epochs. Third, an online predictor scores the joint representations of architecture, pruning strategy, and hyperparameters to filter out inferior combinations. Together, these three strategies significantly improve search efficiency and yield well-performing compact networks with tailored hyperparameters. In experiments, UENAS achieves an error rate of 2.81%, demonstrating the effectiveness of our method.
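To make the described search loop concrete, below is a minimal Python sketch of an evolution-style search over a joint (architecture, pruning, hyperparameter) encoding, where a predictor pre-screens mutated children before the expensive evaluation step. Everything here is illustrative, not the paper's implementation: the search-space dimensions, the `predictor_score` heuristic, and the `evaluate` stand-in (which in UENAS would be short training with weights inherited from the parent's overlapping layers) are all assumptions chosen to keep the example self-contained and runnable.

```python
import random

# Hypothetical joint search space over architecture, pruning strategy,
# and training hyperparameters. Names and ranges are illustrative only.
SEARCH_SPACE = {
    "depth":        [8, 12, 16, 20],
    "width":        [32, 64, 96, 128],
    "prune_ratio":  [0.0, 0.2, 0.4, 0.6],
    "lr":           [0.01, 0.05, 0.1],
    "weight_decay": [1e-4, 5e-4],
}

def sample_candidate():
    """Sample a joint (architecture, pruning, hyperparameter) combination."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(parent):
    """Mutate one randomly chosen dimension of the parent's encoding."""
    child = dict(parent)
    key = random.choice(list(SEARCH_SPACE))
    child[key] = random.choice(SEARCH_SPACE[key])
    return child

def predictor_score(cand):
    """Stand-in for the online predictor scoring a joint encoding.
    A real predictor would be fit online on (encoding, accuracy) pairs
    collected during the search; this toy heuristic just keeps the
    example executable."""
    return cand["width"] * (1.0 - cand["prune_ratio"]) + 0.1 * cand["depth"]

def evaluate(cand):
    """Stand-in for short training of the child network (in UENAS, with
    weights shared from the parent's overlapping layers). Modeled here
    as a noisy version of the predictor score."""
    return predictor_score(cand) + random.gauss(0, 5)

def evolve(pop_size=16, generations=10, k_filter=4):
    population = [sample_candidate() for _ in range(pop_size)]
    fitness = {i: evaluate(c) for i, c in enumerate(population)}
    for _ in range(generations):
        # Tournament selection of a parent from the current population.
        parent_idx = max(random.sample(list(fitness), 4), key=fitness.get)
        # Generate several children and keep only the predictor's top pick,
        # filtering out inferior combinations before costly training.
        children = [mutate(population[parent_idx]) for _ in range(k_filter)]
        best_child = max(children, key=predictor_score)
        # Replace the weakest individual with the evaluated child.
        worst_idx = min(fitness, key=fitness.get)
        population[worst_idx] = best_child
        fitness[worst_idx] = evaluate(best_child)
    best = max(fitness, key=fitness.get)
    return population[best], fitness[best]

if __name__ == "__main__":
    combo, score = evolve()
    print("best combo:", combo, "score: %.2f" % score)
```

The key design point mirrored from the abstract is that the predictor acts as a cheap filter between mutation and evaluation, so only the most promising joint combination per generation incurs the cost of training.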


research · 07/09/2020
Neural Architecture Search with GBDT
Neural architecture search (NAS) with an accuracy predictor that predict...

research · 02/27/2023
An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters
In this paper, we propose an algorithmic framework to automatically gene...

research · 01/31/2021
AACP: Model Compression by Accurate and Automatic Channel Pruning
Channel pruning is formulated as a neural architecture search (NAS) prob...

research · 03/22/2021
AutoSpace: Neural Architecture Search with Less Human Interference
Current neural architecture search (NAS) algorithms still require expert...

research · 10/02/2022
Siamese-NAS: Using Trained Samples Efficiently to Find Lightweight Neural Architecture by Prior Knowledge
In the past decade, many architectures of convolution neural networks we...

research · 10/19/2021
NAS-HPO-Bench-II: A Benchmark Dataset on Joint Optimization of Convolutional Neural Network Architecture and Training Hyperparameters
The benchmark datasets for neural architecture search (NAS) have been de...

research · 03/23/2023
DetOFA: Efficient Training of Once-for-All Networks for Object Detection by Using Pre-trained Supernet and Path Filter
We address the challenge of training a large supernet for the object det...
