
Discrete Simulation Optimization for Tuning Machine Learning Method Hyperparameters

by Varun Ramamohan, et al.
Indian Institute of Technology Delhi

Machine learning methods are increasingly used in technical areas such as image recognition, product recommendation, financial analysis, medical diagnosis, and predictive maintenance. A key question is how to control the learning process to suit the problem at hand. Hyperparameter tuning selects the set of hyperparameters that governs a model's learning process, and choosing appropriate hyperparameters directly affects the model's performance measure. We apply simulation optimization using discrete search methods, specifically ranking and selection (R&S) methods such as the KN method and the stochastic ruler method and its variations, to hyperparameter optimization, and we develop the theoretical basis for applying common R&S methods to this problem. The KN method identifies the best system with a statistical guarantee, while the stochastic ruler method converges asymptotically to the optimal solution and is computationally very efficient. We benchmarked our results against state-of-the-art hyperparameter optimization libraries such as hyperopt and mango, and found that KN and the stochastic ruler consistently outperform hyperopt's random search, and that the stochastic ruler is comparable in efficiency to hyperopt's TPE in most cases, even though our computational implementations are not yet optimized to the level of professional packages.
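To make the stochastic ruler idea concrete, the following is a minimal sketch of a stochastic ruler search over a discrete hyperparameter space, minimizing a noisy loss. It is an illustration only, not the paper's exact variant: the neighborhood structure (uniform candidate sampling), the loss bounds `a` and `b`, and the test-count schedule `m_k` are all assumptions made for this sketch.

```python
import random

def stochastic_ruler(search_space, noisy_loss, a, b, n_iters=200, seed=0):
    """Minimal stochastic ruler search over a discrete space, minimizing
    a (possibly noisy) loss. `a` and `b` bound the range of the loss.
    Hypothetical sketch; the paper's variants may differ in detail."""
    rng = random.Random(seed)
    x = rng.choice(search_space)            # current solution
    for k in range(n_iters):
        z = rng.choice(search_space)        # candidate from the neighborhood
        m_k = 1 + k // 50                   # number of ruler tests grows over time
        accepted = True
        for _ in range(m_k):
            theta = rng.uniform(a, b)       # draw the stochastic "ruler"
            if noisy_loss(z) > theta:       # candidate fails this ruler test
                accepted = False
                break
        if accepted:                        # candidate passed every test: move
            x = z
    return x
```

A candidate is accepted only if its observed loss falls below an independently drawn uniform "ruler" in every test, so low-loss configurations are accepted with higher probability, and the growing test count makes acceptance increasingly selective. In a tuning setting, `search_space` would enumerate hyperparameter configurations and `noisy_loss` would return a validation error from a single stochastic training run.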


