Neural Architecture Transfer

05/12/2020
by Zhichao Lu et al.

Neural architecture search (NAS) has emerged as a promising avenue for automatically designing task-specific neural networks. Most existing NAS approaches require one complete search for each deployment specification of hardware or objective. This is a computationally impractical endeavor given the potentially large number of application scenarios. In this paper, we propose Neural Architecture Transfer (NAT) to overcome this limitation. NAT is designed to efficiently generate task-specific custom models that are competitive even under multiple conflicting objectives. To realize this goal, we learn task-specific supernets from which specialized subnets can be sampled without any additional training. The key to our approach is an integrated online transfer learning and many-objective evolutionary search procedure. A pre-trained supernet is iteratively adapted while simultaneously searching for task-specific subnets. We demonstrate the efficacy of NAT on 11 benchmark image classification tasks, ranging from large-scale multi-class to small-scale fine-grained datasets. In all cases, including ImageNet, NATNets improve upon the state-of-the-art under mobile settings (≤ 600M Multiply-Adds). Surprisingly, small-scale fine-grained datasets benefit the most from NAT. At the same time, the architecture search and transfer are orders of magnitude more efficient than existing NAS methods. Overall, experimental evaluation indicates that, across diverse image classification tasks and computational objectives, NAT is an appreciably more effective alternative to fine-tuning based transfer learning. Code is available at https://github.com/human-analysis/neural-architecture-transfer.
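The abstract describes an outer loop that alternates between adapting a shared supernet and running a many-objective evolutionary search over its subnets. The toy Python sketch below illustrates only that loop structure, not the paper's actual implementation: the integer encoding, the cost and error proxies (madds_proxy, toy_error), the no-op adapt_supernet stub, and the truncation-based survivor selection are all illustrative assumptions. NAT proper fine-tunes shared PyTorch supernet weights online and uses NSGA-style many-objective selection with learned accuracy predictors.

    import random

    N_LAYERS, N_CHOICES = 8, 4  # each gene selects an op/width for one layer


    def sample_encoding():
        return [random.randrange(N_CHOICES) for _ in range(N_LAYERS)]


    def madds_proxy(enc):
        # Assumed cost model: a larger choice index means a costlier op (in M MAdds).
        return sum(50 + 150 * g for g in enc)


    def toy_error(enc):
        # Stand-in for the validation error of a subnet sampled from the supernet;
        # here, cheaper subnets are (noisily) less accurate.
        return 100.0 / (1.0 + sum(enc)) + random.uniform(0.0, 0.5)


    def dominates(a, b):
        # a dominates b if it is no worse in every objective, better in at least one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


    def pareto_front(evaluated):
        return [(e, o) for e, o in evaluated
                if not any(dominates(o2, o) for _, o2 in evaluated if o2 != o)]


    def mutate(enc, p=0.2):
        return [random.randrange(N_CHOICES) if random.random() < p else g for g in enc]


    def adapt_supernet(evaluated):
        # In NAT proper, this step fine-tunes the shared supernet weights on
        # promising subnets (the online transfer-learning half); a no-op here.
        pass


    def nat_search(generations=20, pop_size=24):
        evaluated = [(e, (toy_error(e), madds_proxy(e)))
                     for e in (sample_encoding() for _ in range(pop_size))]
        for _ in range(generations):
            adapt_supernet(evaluated)                          # supernet adaptation
            parents = [e for e, _ in pareto_front(evaluated)]  # many-objective selection
            children = [mutate(random.choice(parents)) for _ in range(pop_size)]
            evaluated = [(e, (toy_error(e), madds_proxy(e))) for e in parents + children]
            # Simple truncation stands in for NSGA-style survivor selection.
            evaluated = sorted(evaluated, key=lambda t: t[1])[:pop_size]
        return pareto_front(evaluated)


    if __name__ == "__main__":
        for enc, (err, madds) in nat_search():
            print(f"err={err:5.2f}  MAdds={madds:5.0f}M  enc={enc}")

On each iteration the non-dominated front of subnets seeds the next population, so the accuracy/MAdds trade-off surface is explored in a single run rather than with one search per deployment target, which is the efficiency argument made in the abstract.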



Code Repositories

neural-architecture-transfer: Neural Architecture Transfer (arXiv'20), PyTorch implementation.