A meta-learning recommender system for hyperparameter tuning: predicting when tuning improves SVM classifiers

06/04/2019
by Rafael Gomes Mantovani, et al.

For many machine learning algorithms, predictive performance is critically affected by the hyperparameter values used to train them. However, tuning these hyperparameters can come at a high computational cost, especially on larger datasets, and the tuned settings do not always significantly outperform the default values. This paper proposes a recommender system based on meta-learning that identifies, for each new dataset, when it is better to use default values and when hyperparameters should be tuned. In addition, an in-depth analysis is performed to understand what the meta-learners take into account in their decisions, providing useful insights. An extensive analysis of different categories of meta-features, meta-learners, and setups across 156 datasets is performed. Results show that it is possible to accurately predict when tuning will significantly improve the performance of the induced models. The proposed system reduces the time spent on optimization processes without reducing the predictive performance of the induced models when compared with those obtained using tuned hyperparameters. We also explain the decision-making process of the meta-learners in terms of linear separability-based hypotheses. Although this analysis focuses on the tuning of Support Vector Machines, it can also be applied to other algorithms, as shown in experiments performed with decision trees.
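
The abstract describes a meta-learning setup in which per-dataset meta-features feed a classifier that recommends either "tune" or "use defaults". The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes scikit-learn is available, uses only a handful of cheap meta-features, and replaces the real meta-labels (which in the paper come from statistical tests comparing tuned and default SVMs across 156 datasets) with synthetic placeholders.

# Minimal sketch of the meta-learning idea in the abstract (not the authors' code):
# a meta-learner trained on per-dataset meta-features predicts whether SVM
# hyperparameter tuning will significantly beat default values.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simple_meta_features(X, y):
    """A few cheap, commonly used meta-features of a classification dataset."""
    n, p = X.shape
    class_counts = np.bincount(y)
    return np.array([
        n,                                        # number of instances
        p,                                        # number of attributes
        p / n,                                    # dimensionality ratio
        class_counts.min() / class_counts.max(),  # class balance
        np.mean(np.std(X, axis=0)),               # average attribute dispersion
    ])

# Synthetic "meta-dataset": one row per base dataset; the label would be 1 if
# tuning the SVM significantly improved on defaults for that dataset. Here the
# datasets and labels are random placeholders, purely for illustration.
meta_X, meta_y = [], []
for _ in range(150):
    n = int(rng.integers(50, 2000))
    p = int(rng.integers(2, 100))
    X = rng.normal(size=(n, p))
    y = rng.integers(0, 2, size=n)
    meta_X.append(simple_meta_features(X, y))
    meta_y.append(int(rng.integers(0, 2)))        # placeholder "tune vs default" label
meta_X, meta_y = np.array(meta_X), np.array(meta_y)

# Meta-learner: recommends "tune" or "use defaults" for a new dataset.
meta_learner = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(meta_learner, meta_X, meta_y, cv=5).mean())

With real meta-labels, the same pipeline would let the system skip costly tuning on datasets where defaults are predicted to suffice, which is the time saving the abstract reports.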


