Explaining Hyperparameter Optimization via Partial Dependence Plots

11/08/2021
by Julia Moosbauer, et al.

Automated hyperparameter optimization (HPO) can help practitioners obtain peak performance in machine learning models. However, there is often a lack of valuable insight into the effects of different hyperparameters on the final model performance. This lack of explainability makes it difficult to trust and understand the automated HPO process and its results. We suggest using interpretable machine learning (IML) to gain insights from the experimental data obtained during HPO with Bayesian optimization (BO). BO tends to focus on promising regions with potentially high-performing configurations and thus induces a sampling bias. Hence, many IML techniques, such as the partial dependence plot (PDP), carry the risk of generating biased interpretations. By leveraging the posterior uncertainty of the BO surrogate model, we introduce a variant of the PDP with estimated confidence bands. We propose to partition the hyperparameter space to obtain more confident and reliable PDPs in relevant sub-regions. In an experimental study, we provide quantitative evidence for the increased quality of the PDPs within sub-regions.
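The core idea, a PDP computed on a Gaussian process surrogate whose posterior standard deviation yields confidence bands, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the toy objective, the 2-D hyperparameter space, and the simple averaging of posterior standard deviations are all assumptions made for the example (the paper derives the uncertainty of the PDP estimator more carefully).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy setup: 40 observed configurations in a 2-D hyperparameter space,
# e.g. collected during a BO run, with a synthetic "validation error".
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(40, 2))
y = (X[:, 0] - 0.3) ** 2 + 0.5 * X[:, 1]

# GP surrogate, as commonly used in Bayesian optimization.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

def pdp_with_bands(gp, grid, X_other, dim=0):
    """Partial dependence of hyperparameter `dim` with uncertainty.

    For each grid value, `dim` is fixed while the remaining hyperparameters
    take the values in the rows of X_other; the GP posterior mean and std
    are then averaged over those rows (a simplification of the paper's
    uncertainty estimate).
    """
    means, stds = [], []
    for g in grid:
        X_eval = X_other.copy()
        X_eval[:, dim] = g
        m, s = gp.predict(X_eval, return_std=True)
        means.append(m.mean())
        stds.append(s.mean())
    return np.array(means), np.array(stds)

grid = np.linspace(0.0, 1.0, 20)
mean, std = pdp_with_bands(gp, grid, X_other=X.copy(), dim=0)
# Plotting mean with mean ± 2*std gives the PDP with approximate
# confidence bands; wide bands flag regions the BO run barely explored.
```

Restricting `X_other` to a sub-region of the hyperparameter space, as the paper proposes, shrinks the bands where BO sampled densely.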


Related research

06/11/2022  Enhancing Explainability of Hyperparameter Optimization via Bayesian Algorithm Execution
Despite all the benefits of automated hyperparameter optimization (HPO),...

08/30/2021  To tune or not to tune? An Approach for Recommending Important Hyperparameters
Novel technologies in automated machine learning ease the complexity of ...

10/12/2017  Hyperparameter Importance Across Datasets
With the advent of automated machine learning, automated hyperparameter ...

04/05/2023  AutoRL Hyperparameter Landscapes
Although Reinforcement Learning (RL) has shown to be capable of producin...

04/25/2023  Bayesian Optimization Meets Self-Distillation
Bayesian optimization (BO) has contributed greatly to improving model pe...

09/04/2020  HyperTendril: Visual Analytics for User-Driven Hyperparameter Optimization of Deep Neural Networks
To mitigate the pain of manually tuning hyperparameters of deep neural n...

09/29/2022  Dynamic Surrogate Switching: Sample-Efficient Search for Factorization Machine Configurations in Online Recommendations
Hyperparameter optimization is the process of identifying the appropriat...
