Warm Starting CMA-ES for Hyperparameter Optimization

12/13/2020
by Masahiro Nomura, et al.

Hyperparameter optimization (HPO), formulated as black-box optimization (BBO), is recognized as essential for the automation and high performance of machine learning approaches. The CMA-ES is a promising BBO approach with a high degree of parallelism; it has been applied to HPO tasks, often in parallel implementations, and has shown superior performance to other approaches, including Bayesian optimization (BO). However, when the budget of hyperparameter evaluations is severely limited, which is often the case for end users without access to parallel computing, the CMA-ES exhausts the budget without improving performance because of its long adaptation phase, and is consequently outperformed by BO approaches. To address this issue, we propose to transfer prior knowledge from similar HPO tasks through the initialization of the CMA-ES, which significantly shortens the adaptation time. The knowledge transfer is designed based on a novel definition of task similarity, whose correlation with the performance of the proposed approach is confirmed on synthetic problems. The proposed warm-starting CMA-ES, called WS-CMA-ES, is applied to different HPO tasks where some prior knowledge is available, and shows superior performance over the original CMA-ES as well as BO approaches with or without the prior knowledge.
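To make the warm-starting idea concrete, the sketch below estimates a "promising" region from evaluations collected on a similar prior task and derives an initial mean and step size for CMA-ES from it. This is a simplified illustration in numpy, not the paper's exact procedure: the function name `warm_start_params` and the top-fraction heuristic `top_frac` are assumptions for this example.

```python
import numpy as np

def warm_start_params(prior_X, prior_y, top_frac=0.1):
    """Derive a warm-start mean and step size from prior-task evaluations.

    prior_X : (n, d) array of hyperparameter configurations tried on a
              similar task (assumed already mapped to a continuous space)
    prior_y : (n,) array of losses for those configurations (lower is better)

    Returns (mean, sigma): a starting point and an isotropic step size
    that roughly cover the top-performing region of the prior task.
    """
    prior_X = np.asarray(prior_X, dtype=float)
    prior_y = np.asarray(prior_y, dtype=float)
    n = len(prior_y)
    k = max(2, int(np.ceil(top_frac * n)))
    # keep the k best prior configurations
    top = prior_X[np.argsort(prior_y)[:k]]
    # warm-start mean: center of the promising region
    mean = top.mean(axis=0)
    # warm-start step size: average spread of the promising region,
    # with a small floor so the search does not collapse
    sigma = max(float(np.sqrt(top.var(axis=0).mean())), 1e-3)
    return mean, sigma

# Synthetic prior task: quadratic loss with optimum at (1, 1)
rng = np.random.RandomState(0)
X = rng.uniform(-5.0, 5.0, size=(200, 2))
y = ((X - 1.0) ** 2).sum(axis=1)

mean, sigma = warm_start_params(X, y)
print(mean, sigma)  # mean lands near the prior optimum (1, 1)
```

If the `pycma` package is available, the returned values can be passed directly as the initial solution and step size, e.g. `cma.CMAEvolutionStrategy(mean, sigma)`, instead of starting CMA-ES from an uninformed point, which is what shortens the adaptation phase on a limited evaluation budget.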

Related research

12/11/2020
Better call Surrogates: A hybrid Evolutionary Algorithm for Hyperparameter optimization
In this paper, we propose a surrogate-assisted evolutionary algorithm (E...

06/24/2020
Simple and Scalable Parallelized Bayesian Optimization
In recent years, leveraging parallel and distributed computational resou...

10/15/2018
Hyperparameter Learning via Distributional Transfer
Bayesian optimisation is a popular technique for hyperparameter learning...

12/07/2020
Adaptive Local Bayesian Optimization Over Multiple Discrete Variables
In the machine learning algorithms, the choice of the hyperparameter is ...

09/30/2019
A Copula approach for hyperparameter transfer learning
Bayesian optimization (BO) is a popular methodology to tune the hyperpar...

03/07/2018
Transfer Automatic Machine Learning
Building effective neural networks requires many design choices. These i...

06/25/2022
Bayesian Optimization Over Iterative Learners with Structured Responses: A Budget-aware Planning Approach
The rising growth of deep neural networks (DNNs) and datasets in size mo...
