A Comparative Study of Hyper-Parameter Optimization Tools

01/17/2022
by Shashank Shekhar, et al.

Most machine learning models have hyper-parameters in addition to their parameters. While the learning algorithm solves for the parameters, model performance depends heavily on the choice of hyper-parameters. For robust model performance, it is necessary to find the right hyper-parameter combination. Hyper-parameter optimization (HPO) is a systematic process for finding the right values for them. The conventional methods for this purpose, grid search and random search, both run into problems in industrial-scale applications. Hence, a set of strategies based on Bayesian optimization and evolutionary-algorithm principles has recently been proposed that addresses runtime issues in a production environment while delivering robust performance. In this paper, we compare the performance of four Python libraries proposed for hyper-parameter optimization, namely Optuna, Hyperopt, Optunity, and sequential model-based algorithm configuration (SMAC). The performance of these tools is tested using two benchmarks. The first is a combined algorithm selection and hyper-parameter optimization (CASH) problem. The second is the NeurIPS black-box optimization challenge, in which a multilayer perceptron (MLP) architecture has to be chosen from a set of related architecture constraints and hyper-parameters. The benchmarking is done with six real-world datasets. From the experiments, we found that Optuna performs better for the CASH problem and Hyperopt for the MLP problem.
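To make the CASH setting concrete, below is a minimal sketch of how such a combined search could be expressed with Optuna, the library that performed best on this benchmark. The dataset, candidate algorithms, search ranges, and trial budget are illustrative assumptions, not the paper's exact experimental setup.

    # Minimal CASH-style sketch with Optuna (illustrative, not the paper's setup).
    import optuna
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def objective(trial):
        # CASH: the trial first selects an algorithm, then samples that
        # algorithm's hyper-parameters from a conditional search space.
        X, y = load_breast_cancer(return_X_y=True)  # assumed example dataset
        algorithm = trial.suggest_categorical("algorithm", ["svc", "random_forest"])
        if algorithm == "svc":
            model = SVC(
                C=trial.suggest_float("svc_c", 1e-3, 1e3, log=True),
                gamma=trial.suggest_float("svc_gamma", 1e-4, 1e1, log=True),
            )
        else:
            model = RandomForestClassifier(
                n_estimators=trial.suggest_int("rf_n_estimators", 10, 200),
                max_depth=trial.suggest_int("rf_max_depth", 2, 16),
            )
        # Mean cross-validated accuracy is the objective to maximize.
        return cross_val_score(model, X, y, cv=3).mean()

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=50)  # trial budget is an assumption
    print(study.best_params)

Because the search space is conditional (each algorithm's hyper-parameters are sampled only when that algorithm is chosen), a Bayesian-style sampler can allocate trials across algorithms and their settings jointly, which is exactly the difficulty that grid and random search handle poorly at scale.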
