DetOFA: Efficient Training of Once-for-All Networks for Object Detection by Using Pre-trained Supernet and Path Filter

03/23/2023
by Yuiko Sakuma, et al.

We address the challenge of training a large supernet for the object detection task using a relatively small amount of training data. Specifically, we propose an efficient supernet-based neural architecture search (NAS) method that uses transfer learning and search space pruning. First, the supernet is pre-trained on a classification task, for which large datasets are available. Second, the search space defined by the supernet is pruned by removing candidate models that are predicted to perform poorly. To remove candidates effectively over a wide range of resource constraints, we design a performance predictor, called the path filter, which accurately predicts the relative performance of models that satisfy similar resource constraints; the path filter handles prediction for paths with different resource budgets. Supernet training is thereby focused on the best-performing candidates. Compared to once-for-all, our proposed method reduces the computational cost of finding the optimal network architecture by 30%, yielding a better accuracy vs. floating-point operations (FLOPs) Pareto front (improvements of 0.85 and 0.45 points in average precision on Pascal VOC and COCO, respectively).
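The pruning step described above can be sketched in a few lines: rank the candidate paths that fall under a given resource budget with a performance predictor, then keep only the top fraction for further supernet training. This is a minimal illustrative sketch, not the paper's implementation; `path_filter_score` and `flops` are hypothetical toy stand-ins for the learned path filter and a real FLOPs counter.

```python
def flops(path):
    # Toy resource proxy: sum of layer widths along the path.
    # A real implementation would count FLOPs of the sampled subnetwork.
    return sum(path)

def path_filter_score(path):
    # Hypothetical stand-in for the learned path filter, which predicts
    # the relative performance of paths under similar resource budgets.
    return sum(width * depth for depth, width in enumerate(path))

def prune_search_space(candidates, budget, keep_ratio=0.5):
    """Keep the top-scoring paths among candidates within the budget."""
    feasible = [p for p in candidates if flops(p) <= budget]
    feasible.sort(key=path_filter_score, reverse=True)
    keep = max(1, int(len(feasible) * keep_ratio))
    return feasible[:keep]
```

Applied per budget bucket, this concentrates subsequent supernet training on the candidates predicted to perform best at each resource level.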

Related research

03/08/2021
OPANAS: One-Shot Path Aggregation Network Architecture Search for Object Detection
Recently, neural architecture search (NAS) has been exploited to design ...

03/14/2016
Rapid building detection using machine learning
This work describes algorithms for performing discrete object detection,...

09/23/2022
Tiered Pruning for Efficient Differentiable Inference-Aware Neural Architecture Search
We propose three novel pruning techniques to improve the cost and result...

11/24/2021
GreedyNASv2: Greedier Search with a Greedy Path Filter
Training a good supernet in one-shot NAS methods is difficult since the ...

03/08/2022
UENAS: A Unified Evolution-based NAS Framework
Neural architecture search (NAS) has gained significant attention for au...

03/25/2020
GreedyNAS: Towards Fast One-Shot NAS with Greedy Supernet
Training a supernet matters for one-shot neural architecture search (NAS...

05/16/2018
Prediction Rule Reshaping
Two methods are proposed for high-dimensional shape-constrained regressi...
