On the Performance of Differential Evolution for Hyperparameter Tuning

04/15/2019
by Mischa Schmidt, et al.

Automated hyperparameter tuning aspires to facilitate the application of machine learning for non-experts. In the literature, different optimization approaches are applied for that purpose. This paper investigates the performance of Differential Evolution for tuning hyperparameters of supervised learning algorithms for classification tasks. This empirical study involves a range of machine learning algorithms and datasets with various characteristics to compare the performance of Differential Evolution with Sequential Model-based Algorithm Configuration (SMAC), a reference Bayesian Optimization approach. The results indicate that Differential Evolution outperforms SMAC for most datasets when tuning a given machine learning algorithm - particularly when breaking ties in a first-to-report fashion. Only for the tightest computational budgets does SMAC perform better. On small datasets, Differential Evolution outperforms SMAC by 19% (after tie-breaking). In a second experiment across a range of representative datasets taken from the literature, Differential Evolution scores 15% (after tie-breaking) more wins than SMAC.
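The core idea can be illustrated with a short sketch. This is not the paper's experimental setup; it is a minimal, assumed example using SciPy's `differential_evolution` and scikit-learn to show how Differential Evolution can tune hyperparameters: the tuner searches a continuous hyperparameter space, and the objective it minimizes is the negated cross-validated accuracy of the model. The dataset, model (an SVM), search bounds, and budget are all illustrative choices, not the paper's.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

def objective(params):
    # Search in log10 space so the tuner covers several orders of magnitude.
    log_C, log_gamma = params
    model = SVC(C=10.0 ** log_C, gamma=10.0 ** log_gamma)
    # Differential Evolution minimizes, so negate the mean CV accuracy.
    return -cross_val_score(model, X, y, cv=3).mean()

result = differential_evolution(
    objective,
    bounds=[(-3, 3), (-4, 1)],  # log10(C), log10(gamma)
    maxiter=10,                 # deliberately small tuning budget
    popsize=8,
    seed=0,
    tol=1e-3,
)
best_C, best_gamma = 10.0 ** result.x
print(f"best C={best_C:.4g}, gamma={best_gamma:.4g}, "
      f"CV accuracy={-result.fun:.3f}")
```

Because Differential Evolution only needs objective evaluations, not gradients or a surrogate model, the same loop applies to any estimator whose hyperparameters can be mapped to a continuous vector; this model-free simplicity is what the paper contrasts against SMAC's model-based search.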


