Hyperparameter Importance Across Datasets

10/12/2017
by J. N. van Rijn, et al.

With the advent of automated machine learning, automated hyperparameter optimization methods are by now routinely used. However, this progress is not yet matched by equal progress on automatic analyses that yield information beyond performance-optimizing hyperparameter settings. In this work, we aim to answer the following two questions: Given an algorithm, what are generally its most important hyperparameters, and what are good priors over their ranges to draw values from? We present methodology and a framework to answer these questions based on meta-learning across many datasets. We apply this methodology using the experimental meta-data available on OpenML to determine the most important hyperparameters of support vector machines, random forests and AdaBoost, and to infer priors for all their hyperparameters. Our results, obtained fully automatically, provide a quantitative basis to focus efforts in both manual algorithm design and automated hyperparameter optimization. Our experiments confirm that the selected hyperparameters are indeed the most important ones and that our obtained priors also lead to improvements in hyperparameter optimization.
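
The following is a minimal sketch of the two ideas in the abstract: ranking hyperparameter importance from experimental meta-data collected across datasets, and fitting a prior over the best-performing values. It is not the paper's implementation (the paper's analysis is based on functional ANOVA over OpenML meta-data); here a random-forest surrogate with permutation importance stands in, and the input file "svm_runs.csv" with columns dataset, C, gamma, accuracy is a hypothetical export of per-configuration results.

```python
import numpy as np
import pandas as pd
from scipy import stats
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Hypothetical meta-data: one row per evaluated SVM configuration per dataset.
runs = pd.read_csv("svm_runs.csv")
hyperparams = ["C", "gamma"]

# 1) Per-dataset importance: fit a surrogate mapping hyperparameters to
#    performance, then measure how much each hyperparameter matters.
#    (Permutation importance is a simpler stand-in for functional ANOVA.)
importances = []
for _, group in runs.groupby("dataset"):
    X, y = group[hyperparams].values, group["accuracy"].values
    surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    pi = permutation_importance(surrogate, X, y, n_repeats=10, random_state=0)
    importances.append(pi.importances_mean)

# Average per-dataset importances to obtain a ranking across datasets.
mean_importance = np.mean(importances, axis=0)
for name, imp in sorted(zip(hyperparams, mean_importance), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")

# 2) Priors: take the best value of each hyperparameter on every dataset
#    and fit a kernel density estimate over them (in log space, since both
#    hyperparameters are typically searched on a log scale).
best = runs.loc[runs.groupby("dataset")["accuracy"].idxmax()]
priors = {hp: stats.gaussian_kde(np.log(best[hp].values)) for hp in hyperparams}
# priors["gamma"].resample(1) then draws a log-value to seed a new HPO run.
```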

research
08/30/2021

To tune or not to tune? An Approach for Recommending Important Hyperparameters

Novel technologies in automated machine learning ease the complexity of ...
research
06/28/2018

Automatic Exploration of Machine Learning Experiments on OpenML

Understanding the influence of hyperparameters on the performance of a m...
research
01/01/2021

ECG-Based Driver Stress Levels Detection System Using Hyperparameter Optimization

Stress and driving are a dangerous combination which can lead to crashes...
research
11/08/2021

Explaining Hyperparameter Optimization via Partial Dependence Plots

Automated hyperparameter optimization (HPO) can support practitioners to...
research
01/27/2022

Consolidated learning – a domain-specific model-free optimization strategy with examples for XGBoost and MIMIC-IV

For many machine learning models, a choice of hyperparameters is a cruci...
research
01/28/2020

An Adaptive and Near Parameter-free Evolutionary Computation Approach Towards True Automation in AutoML

A common claim of evolutionary computation methods is that they can achi...
research
02/27/2020

Using a thousand optimization tasks to learn hyperparameter search strategies

We present TaskSet, a dataset of tasks for use in training and evaluatin...
