Hyperparameter Optimization via Sequential Uniform Designs

09/08/2020
by Zebin Yang, et al.
Hyperparameter tuning or optimization plays a central role in the automated machine learning (AutoML) pipeline. It is a challenging task, as the response surfaces of hyperparameters are generally unknown and the evaluation of each experiment is expensive. In this paper, we reformulate hyperparameter optimization as a kind of computer experiment and propose a novel sequential uniform design (SeqUD) for hyperparameter optimization. It is advantageous because a) it adaptively explores the hyperparameter space with evenly spread design points, avoiding the expensive meta-modeling and acquisition optimization procedures of Bayesian optimization; b) sequential design points are generated in batches, which can be easily parallelized; and c) a real-time augmented uniform design (AugUD) algorithm is developed for the efficient generation of new design points. Experiments are conducted on both global optimization tasks and hyperparameter optimization applications. The results show that SeqUD outperforms related hyperparameter optimization methods and is a promising and competitive alternative to existing tools.
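To make the abstract's idea concrete, here is a minimal sketch of a SeqUD-style loop: evaluate a batch of evenly spread points, then zoom a shrunken search box onto the incumbent best and repeat. This is not the authors' implementation; a Latin hypercube stands in for the AugUD uniform-design generator, and the function names (`sequd`, `latin_hypercube`) and the `shrink` parameter are illustrative assumptions.

```python
import random


def latin_hypercube(n, dim, lo, hi, rng):
    """Stratified (Latin hypercube) sample in the box [lo, hi]^dim,
    used here as a simple stand-in for a uniform design."""
    strata = [list(range(n)) for _ in range(dim)]
    for s in strata:
        rng.shuffle(s)  # one random permutation of strata per dimension
    return [
        [lo[d] + (strata[d][i] + rng.random()) / n * (hi[d] - lo[d])
         for d in range(dim)]
        for i in range(n)
    ]


def sequd(objective, bounds, n_per_stage=10, n_stages=5, shrink=0.5, seed=0):
    """Sequential space-filling search: batch-evaluate a design,
    then recenter a shrunken box on the best point found so far."""
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    best_x, best_y = None, float("inf")
    for _ in range(n_stages):
        pts = latin_hypercube(n_per_stage, len(bounds), lo, hi, rng)
        for x in pts:  # whole batch is independent, so trivially parallelizable
            y = objective(x)
            if y < best_y:
                best_x, best_y = x, y
        # Zoom: shrink the box around the incumbent, clipped to original bounds.
        for d in range(len(bounds)):
            half = (hi[d] - lo[d]) * shrink / 2
            lo[d] = max(bounds[d][0], best_x[d] - half)
            hi[d] = min(bounds[d][1], best_x[d] + half)
    return best_x, best_y
```

For example, `sequd(lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2, [(0.0, 1.0), (0.0, 1.0)])` drives the objective close to its minimum at (0.3, 0.7) within a few dozen evaluations.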

Related research

02/10/2018
Bayesian Optimization Using Monotonicity Information and Its Application in Machine Learning Hyperparameter
We propose an algorithm for a family of optimization problems where the ...

12/19/2017
Hyperparameters Optimization in Deep Convolutional Neural Network / Bayesian Approach with Gaussian Process Prior
Convolutional Neural Network is known as ConvNet have been extensively u...

08/09/2018
OBOE: Collaborative Filtering for AutoML Initialization
Algorithm selection and hyperparameter tuning remain two of the most cha...

10/08/2018
CHOPT: Automated Hyperparameter Optimization Framework for Cloud-Based Machine Learning Platforms
Many hyperparameter optimization (HyperOpt) methods assume restricted co...

10/10/2022
Multi-step Planning for Automated Hyperparameter Optimization with OptFormer
As machine learning permeates more industries and models become more exp...

04/20/2023
PED-ANOVA: Efficiently Quantifying Hyperparameter Importance in Arbitrary Subspaces
The recent rise in popularity of Hyperparameter Optimization (HPO) for d...

04/26/2021
Efficient Hyperparameter Optimization for Physics-based Character Animation
Physics-based character animation has seen significant advances in recen...
