Discrete Simulation Optimization for Tuning Machine Learning Method Hyperparameters

01/16/2022
by   Varun Ramamohan, et al.

Machine learning methods are increasingly used in technical areas such as image recognition, product recommendation, financial analysis, medical diagnosis, and predictive maintenance. A key question arises: how do we control the learning process to suit the problem at hand? Hyperparameter tuning selects the set of hyperparameters that governs a model's learning process, and choosing appropriate hyperparameters directly affects the model's performance. We apply simulation optimization using discrete search methods, specifically ranking and selection (R&S) methods such as the KN method and the stochastic ruler method and its variations, to hyperparameter optimization, and we develop the theoretical basis for applying common R&S methods to this problem. The KN method identifies the best system with a statistical guarantee, while the stochastic ruler method converges asymptotically to the optimal solution and is also computationally very efficient. We benchmarked our results against state-of-the-art hyperparameter optimization libraries such as hyperopt and mango, finding that KN and the stochastic ruler consistently outperform hyperopt's rand sampler, and that the stochastic ruler is comparable in efficiency to hyperopt's tpe sampler in most cases, even though our computational implementations are not yet optimized to the level of professional packages.
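To illustrate the flavor of the stochastic ruler method described above, here is a minimal sketch in Python. It minimizes a toy noisy objective (a hypothetical stand-in for a model's validation loss as a function of one discrete hyperparameter); the neighborhood structure, the bounds (a, b), and the test-count schedule m_k are illustrative assumptions, not the authors' exact configuration.

```python
import random

# Hypothetical noisy objective: stands in for a model's validation loss
# evaluated at a discrete hyperparameter value x (noise mimics the
# randomness of training/validation splits).
def noisy_loss(x):
    return (x - 7) ** 2 + random.gauss(0.0, 1.0)

def stochastic_ruler(space, a, b, iterations=200, seed=0):
    """Minimize a noisy objective over a discrete space using the
    stochastic ruler idea: a candidate is accepted only if it beats a
    uniform random 'ruler' Theta ~ U(a, b) on every one of m_k tests,
    where a and b bound the objective values."""
    random.seed(seed)
    x = random.choice(space)
    for k in range(1, iterations + 1):
        # Candidate from the neighborhood (here: any other point).
        z = random.choice([s for s in space if s != x])
        m_k = 1 + k // 50  # number of ruler tests grows with iteration
        accept = True
        for _ in range(m_k):
            theta = random.uniform(a, b)  # draw the stochastic ruler
            if noisy_loss(z) > theta:     # failing any test rejects z
                accept = False
                break
        if accept:
            x = z
    return x

best = stochastic_ruler(space=list(range(16)), a=0.0, b=60.0)
print(best)
```

Because poor candidates rarely pass all m_k ruler tests while good ones usually do, the chain concentrates on near-optimal hyperparameter values as m_k grows; each iteration costs only a handful of objective evaluations, which is the source of the method's computational efficiency.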


