Improving Multi-fidelity Optimization with a Recurring Learning Rate for Hyperparameter Tuning

09/26/2022
by Hyunjae Lee et al.

Despite the evolution of Convolutional Neural Networks (CNNs), their performance remains surprisingly sensitive to the choice of hyperparameters. However, efficiently exploring a large hyperparameter search space remains challenging due to the long training times of modern CNNs. Multi-fidelity optimization enables the exploration of more hyperparameter configurations within a given budget by terminating unpromising configurations early. However, it often selects a sub-optimal configuration, because training with a high-performing configuration typically converges slowly in the early phase. In this paper, we propose Multi-fidelity Optimization with a Recurring Learning rate (MORL), which incorporates the CNN optimization process into multi-fidelity optimization. MORL alleviates the slow-starter problem and achieves a more precise low-fidelity approximation. Our comprehensive experiments on general image classification, transfer learning, and semi-supervised learning demonstrate the effectiveness of MORL over other multi-fidelity optimization methods such as the Successive Halving Algorithm (SHA) and Hyperband. Furthermore, MORL achieves significant performance improvements over a hand-tuned hyperparameter configuration within a practical budget.
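
To make the baseline concrete, the sketch below shows plain Successive Halving together with a cosine learning-rate schedule that restarts at each evaluation rung, which is one plausible reading of a "recurring learning rate." This is a minimal illustration under those assumptions, not the authors' MORL implementation; the names recurring_lr, successive_halving, and fake_train_and_eval are hypothetical.

```python
import math
import random

def recurring_lr(step, period, lr_max=0.1, lr_min=0.001):
    """Cosine schedule that restarts every `period` steps, so training
    resumed at each rung begins again from a high learning rate.
    Hypothetical stand-in for the paper's recurring learning rate."""
    t = (step % period) / period
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t))

def successive_halving(configs, train_and_eval, min_epochs=1, eta=3,
                       max_epochs=27):
    """Plain Successive Halving: evaluate every configuration at a small
    budget, keep the top 1/eta, multiply the budget by eta, and repeat."""
    budget = min_epochs
    while budget <= max_epochs and len(configs) > 1:
        scores = [(train_and_eval(c, budget), c) for c in configs]
        scores.sort(key=lambda s: s[0], reverse=True)
        configs = [c for _, c in scores[: max(1, len(scores) // eta)]]
        budget *= eta
    return configs[0]

# Toy stand-in for a real training run: the score improves with budget
# and is noisy, so a slow-starting configuration can look weak at low
# fidelity and be culled in the first rung.
def fake_train_and_eval(config, epochs):
    return config["quality"] * (1 - math.exp(-epochs / 10)) \
        + random.gauss(0, 0.02)

configs = [{"quality": random.random()} for _ in range(27)]
best = successive_halving(configs, fake_train_and_eval)
```

In the toy run, low-budget scores are noisy and biased against configurations that improve slowly, so such configurations can be eliminated at the first rung even when they would win at full budget. That is exactly the slow-starter failure mode the abstract describes, and it is what a restarting learning-rate schedule is meant to mitigate by making each rung's low-fidelity evaluation better reflect final performance.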


