Neural Architecture Transfer

05/12/2020
by Zhichao Lu, et al.

Neural architecture search (NAS) has emerged as a promising avenue for automatically designing task-specific neural networks. Most existing NAS approaches require one complete search for each deployment specification of hardware or objective. This is computationally impractical given the potentially large number of application scenarios. In this paper, we propose Neural Architecture Transfer (NAT) to overcome this limitation. NAT is designed to efficiently generate task-specific custom models that are competitive even under multiple conflicting objectives. To realize this goal, we learn task-specific supernets from which specialized subnets can be sampled without any additional training. The key to our approach is an integrated online transfer learning and many-objective evolutionary search procedure: a pre-trained supernet is iteratively adapted while simultaneously searching for task-specific subnets. We demonstrate the efficacy of NAT on 11 benchmark image classification tasks, ranging from large-scale multi-class to small-scale fine-grained datasets. In all cases, including ImageNet, NATNets improve upon the state-of-the-art under mobile settings (≤ 600M Multiply-Adds). Surprisingly, small-scale fine-grained datasets benefit the most from NAT. At the same time, the architecture search and transfer are orders of magnitude more efficient than existing NAS methods. Overall, experimental evaluation indicates that, across diverse image classification tasks and computational objectives, NAT is an appreciably more effective alternative to fine-tuning-based transfer learning. Code is available at https://github.com/human-analysis/neural-architecture-transfer.
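As a reading aid, here is a minimal, self-contained Python sketch of the alternating loop the abstract describes: adapt a pre-trained supernet to the target task, then run a step of many-objective evolutionary search over subnet encodings. Everything in it is an illustrative assumption; the toy encoding, the surrogate accuracy and cost models, and the simplified non-dominated selection stand in for the paper's actual supernet weights and many-objective selection (see the linked repository for the real implementation).

```python
import random

# Illustrative sketch only: toy subnet encoding and surrogate objectives,
# not the authors' implementation. All names and models here are assumptions.

DEPTHS, WIDTHS, KERNELS = [2, 3, 4], [3, 4, 6], [3, 5, 7]
N_STAGES = 5  # number of searchable stages in the toy search space

def random_subnet():
    """Sample a subnet encoding: one (depth, width, kernel) choice per stage."""
    return tuple((random.choice(DEPTHS), random.choice(WIDTHS), random.choice(KERNELS))
                 for _ in range(N_STAGES))

def objectives(subnet):
    """Toy stand-ins for the two conflicting objectives: predicted error and
    Multiply-Adds, both minimized. NAT would instead evaluate subnets with
    weights inherited from the supernet, i.e. without any retraining."""
    madds = sum(d * w * k * k for d, w, k in subnet)        # fake cost model
    error = 100.0 / (1.0 + 0.05 * madds) \
            + 0.1 * sum(abs(k - 5) for _, _, k in subnet)   # fake accuracy proxy
    return error, madds

def dominates(a, b):
    """Pareto dominance for minimization of all objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def evolve(population, n_offspring=20):
    """One search step: mutate parents, then keep the non-dominated front.
    (A crude substitute for the paper's many-objective selection.)"""
    offspring = []
    for _ in range(n_offspring):
        child = list(random.choice(population))
        s = random.randrange(N_STAGES)                      # mutate one stage
        child[s] = (random.choice(DEPTHS), random.choice(WIDTHS),
                    random.choice(KERNELS))
        offspring.append(tuple(child))
    scored = [(net, objectives(net)) for net in population + offspring]
    front = [net for net, f in scored
             if not any(dominates(g, f) for _, g in scored)]
    return front[:len(population)]

def adapt_supernet(promising_subnets):
    """Placeholder for the online transfer-learning step: NAT fine-tunes the
    supernet on the target task, biased toward promising subnets. No-op here."""
    pass

population = [random_subnet() for _ in range(20)]
for _ in range(10):                    # alternate adaptation and search
    adapt_supernet(population)
    population = evolve(population)

for net in population[:3]:
    err, madds = objectives(net)
    print(f"error proxy = {err:.2f}, MAdds proxy = {madds}")
```

The design point to notice is that candidate evaluation never retrains a network from scratch: subnets inherit supernet weights, which is what makes repeated search across deployment targets cheap in NAT.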

Related research

NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search (07/20/2020)
In this paper, we propose an efficient NAS algorithm for generating task...

Two-Stage Architectural Fine-Tuning with Neural Architecture Search using Early-Stopping in Image Classification (02/17/2022)
Deep neural networks (NN) perform well in various tasks (e.g., computer ...

Colab NAS: Obtaining lightweight task-specific convolutional neural networks following Occam's razor (12/15/2022)
The current trend of applying transfer learning from CNNs trained on lar...

TOFA: Transfer-Once-for-All (03/27/2023)
Weight-sharing neural architecture search aims to optimize a configurabl...

NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search (10/12/2021)
Most existing neural architecture search (NAS) benchmarks and algorithms...

Deep Multimodal Neural Architecture Search (04/25/2020)
Designing effective neural networks is fundamentally important in deep m...

FSD: Fully-Specialized Detector via Neural Architecture Search (05/26/2023)
In this paper, we first propose and examine a fully-automatic pipeline t...
