AIO-P: Expanding Neural Performance Predictors Beyond Image Classification

11/30/2022
by Keith G. Mills, et al.

Evaluating neural network performance is critical to deep neural network design, but it is a costly procedure. Neural predictors provide an efficient solution by treating architectures as samples and learning to estimate their performance on a given task. However, existing predictors are task-dependent, predominantly estimating neural network performance on image classification benchmarks. They are also search-space dependent; each predictor is designed to make predictions for a specific architecture search space with predefined topologies and a set of operations. In this paper, we propose a novel All-in-One Predictor (AIO-P), which aims to pretrain neural predictors on architecture examples from multiple, separate computer vision (CV) task domains and multiple architecture spaces, and then transfer to unseen downstream CV tasks or neural architectures. We describe techniques for general graph representation, efficient predictor pretraining, and knowledge infusion, as well as methods for transferring to downstream tasks and spaces. Extensive experimental results show that AIO-P can achieve Mean Absolute Error (MAE) and Spearman's Rank Correlation (SRCC) below 1% and above 0.5, respectively, on a breadth of target downstream CV tasks with or without fine-tuning, outperforming a number of baselines. Moreover, AIO-P can directly transfer to new architectures not seen during training, accurately rank them, and serve as an effective performance estimator when paired with an algorithm designed to preserve performance while reducing FLOPs.
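To make the core idea concrete, here is a minimal, hypothetical sketch of a neural performance predictor in the spirit the abstract describes: architectures are treated as samples, encoded as feature vectors, and a regressor learns to map encodings to accuracy. The operation vocabulary, encoding scheme, and ridge regressor below are illustrative assumptions, not AIO-P's actual graph-based method.

```python
import numpy as np

# Hypothetical operation vocabulary for a toy search space (an assumption,
# not taken from the paper).
OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def encode(architecture):
    """Encode an architecture (a list of op names) as op-count features."""
    return np.array([architecture.count(op) for op in OPS], dtype=float)

def fit_predictor(archs, accuracies, l2=1e-3):
    """Fit ridge-regression weights mapping encodings to accuracy."""
    X = np.stack([encode(a) for a in archs])
    X = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    y = np.asarray(accuracies, dtype=float)
    # Closed-form ridge solution: (X^T X + l2 I)^-1 X^T y
    w = np.linalg.solve(X.T @ X + l2 * np.eye(X.shape[1]), X.T @ y)
    return w

def predict(w, architecture):
    """Estimate accuracy for an unseen architecture."""
    x = np.append(encode(architecture), 1.0)
    return float(x @ w)

# Tiny synthetic example: deeper conv stacks score higher.
train = [["conv3x3"] * n + ["skip"] for n in range(1, 5)]
accs = [0.70, 0.75, 0.80, 0.85]
w = fit_predictor(train, accs)
est = predict(w, ["conv3x3"] * 5 + ["skip"])  # extrapolate to a deeper net
```

A real predictor of this kind replaces the op-count encoding with a learned graph representation and the ridge regressor with a neural network, but the workflow (encode architectures, fit on observed accuracies, predict for unseen ones) is the same.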

Related research

02/21/2023 · A General-Purpose Transferable Predictor for Neural Architecture Search
Understanding and modelling the performance of neural architectures is k...

11/30/2022 · GENNAPE: Towards Generalized Neural Architecture Performance Estimators
Predicting neural architecture performance is a challenging task and is ...

05/14/2020 · A Semi-Supervised Assessor of Neural Architectures
Neural architecture search (NAS) aims to automatically design deep neura...

06/04/2023 · Multi-Predict: Few Shot Predictors For Efficient Neural Architecture Search
Many hardware-aware neural architecture search (NAS) methods have been d...

08/04/2021 · Generic Neural Architecture Search via Regression
Most existing neural architecture search (NAS) algorithms are dedicated ...

08/18/2021 · RANK-NOSH: Efficient Predictor-Based Architecture Search via Non-Uniform Successive Halving
Predictor-based algorithms have achieved remarkable performance in the N...

09/11/2018 · Searching for Efficient Multi-Scale Architectures for Dense Image Prediction
The design of neural network architectures is an important component for...
