Neural Architecture Transfer 2: A Paradigm for Improving Efficiency in Multi-Objective Neural Architecture Search

07/03/2023
by Simone Sarti, et al.

Deep learning is increasingly impacting various aspects of contemporary society. Artificial neural networks have emerged as the dominant models for solving an expanding range of tasks. The introduction of Neural Architecture Search (NAS) techniques, which enable the automatic design of task-optimal networks, has led to remarkable advances, but the NAS process typically requires long execution times and significant computational resources. Once-For-All (OFA) and its successor, Once-For-All-2 (OFAv2), were developed to mitigate these challenges: they build a single super-network model from which sub-networks satisfying different constraints can be directly extracted, maintaining exceptional performance while eliminating the need for retraining. Neural Architecture Transfer (NAT) was then developed to maximise the effectiveness of extracting such sub-networks from a super-network. In this paper, we present NATv2, an extension of NAT that improves multi-objective search algorithms applied to dynamic super-network architectures. NATv2 achieves qualitative improvements in the extractable sub-networks by exploiting the improved super-networks generated by OFAv2 and by incorporating new policies for initialisation, pre-processing and updating of its network archive. In addition, a post-processing pipeline based on fine-tuning is introduced. Experimental results show that NATv2 successfully improves on NAT and is highly recommended for investigating high-performance architectures with a minimal number of parameters.
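To illustrate the kind of multi-objective selection the abstract describes, the sketch below maintains an archive of non-dominated sub-network configurations sampled from a toy search space. The search space, the proxy objectives (predicted error vs. parameter count), and all function names are illustrative assumptions for this example only; they do not reproduce NATv2's actual predictors, archive policies, or the OFAv2 super-networks.

```python
import random

# Toy dynamic search space: each sub-network is a (depth, width) choice.
# These names and ranges are hypothetical, for illustration only.
DEPTHS = [2, 3, 4]
WIDTHS = [16, 32, 64]

def sample_subnet():
    """Sample a random sub-network configuration from the toy space."""
    return (random.choice(DEPTHS), random.choice(WIDTHS))

def objectives(cfg):
    """Return (error, params), both to be minimised.

    Stand-in evaluation: in NAT-style search the error would come from
    an accuracy predictor and params from the extracted sub-network.
    """
    depth, width = cfg
    params = depth * width * width           # crude parameter-count proxy
    error = 1.0 / (1.0 + 0.001 * params)     # bigger models -> lower error
    return error, params

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, cfg):
    """Keep only non-dominated configurations (a simple archive policy)."""
    if cfg in archive:
        return archive
    f = objectives(cfg)
    if any(dominates(objectives(c), f) for c in archive):
        return archive                        # cfg is dominated: discard it
    # Drop archive members that the new configuration dominates.
    archive = [c for c in archive if not dominates(f, objectives(c))]
    archive.append(cfg)
    return archive

archive = []
for _ in range(50):
    archive = update_archive(archive, sample_subnet())

# The surviving archive approximates the Pareto front: each entry trades
# predicted error against parameter count.
for cfg in sorted(archive):
    err, params = objectives(cfg)
    print(f"depth={cfg[0]} width={cfg[1]} error={err:.3f} params={params}")
```

The archive-update rule shown is the standard non-dominated (Pareto) filter common to many multi-objective optimisers; NATv2's initialisation, pre-processing and archive-update policies are more elaborate, as described in the paper.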

