Efficient Training Under Limited Resources

01/23/2023
by Mahdi Zolnouri, et al.

Training time budget and dataset size are among the factors affecting the performance of a Deep Neural Network (DNN). This paper shows that Neural Architecture Search (NAS), Hyperparameter Optimization (HPO), and Data Augmentation help DNNs perform much better when these two factors are limited. However, searching for an optimal architecture and the best hyperparameter values, along with a good combination of data augmentation techniques, under low resources requires many experiments. We present our approach to achieving such a goal in three steps: reducing training epoch time by compressing the model while maintaining performance comparable to the original model, preventing model overfitting when the dataset is small, and performing hyperparameter tuning. We used NOMAD, a blackbox optimization software package based on a derivative-free algorithm, to perform NAS and HPO. Our work achieved an accuracy of 86.0% in the Hardware-Aware Efficient Training (HAET) Challenge and won second place in the competition. The competition results can be found at haet2021.github.io/challenge and our source code can be found at github.com/DouniaLakhmiri/ICLR_HAET2021.
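NOMAD implements the Mesh Adaptive Direct Search (MADS) derivative-free algorithm, treating the training run as a blackbox: it polls candidate hyperparameter points, keeps improvements, and refines the step size otherwise. As a rough illustration of that kind of search (not the authors' code), here is a minimal coordinate pattern-search sketch in Python; the `train_and_evaluate` blackbox and its toy objective are hypothetical placeholders standing in for a real training-and-validation run.

```python
def train_and_evaluate(hparams):
    """Hypothetical blackbox: in the real setting this would train the
    compressed model with the given hyperparameters and return the
    validation error (lower is better). A toy smooth surrogate is used
    here so the sketch runs end to end."""
    lr, momentum = hparams
    return (lr - 0.05) ** 2 + (momentum - 0.9) ** 2

def pattern_search(blackbox, x0, step=0.1, shrink=0.5, tol=1e-4, max_evals=200):
    """Minimal coordinate pattern search: poll +/- step along each axis,
    move to any improving point, and shrink the step when no poll point
    improves. A rough, simplified stand-in for the MADS algorithm that
    NOMAD implements."""
    x = list(x0)
    fx = blackbox(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                cand = list(x)
                cand[i] += delta
                fc = blackbox(cand)
                evals += 1
                if fc < fx:
                    x, fx, improved = cand, fc, True
        if not improved:
            step *= shrink  # no improving poll point: refine the mesh
    return x, fx

best_x, best_f = pattern_search(train_and_evaluate, x0=[0.5, 0.5])
print(f"best hyperparameters: {best_x}, validation error: {best_f:.4f}")
```

Because each evaluation here is a full (and expensive) training run, a derivative-free method like this needs no gradients of the objective with respect to the hyperparameters, which is what makes it suitable for joint NAS and HPO under a tight compute budget.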



