Random Search for Hyperparameters using Determinantal Point Processes

06/06/2017
by Jesse Dodge, et al.

We propose the use of k-determinantal point processes (k-DPPs) in hyperparameter optimization via random search. Compared to conventional approaches, where hyperparameter settings are sampled independently, a k-DPP promotes diversity among the sampled settings. We describe an approach that transforms hyperparameter search spaces for efficient use with a k-DPP. Our experiments show significant benefits over uniform random search in realistic scenarios with a limited budget for training supervised learners, whether in serial or parallel.
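To make the idea concrete, here is a minimal sketch of diversity-promoting hyperparameter selection in the spirit of a k-DPP. It is not the paper's exact sampler: instead of sampling from the k-DPP distribution, it uses a simple greedy MAP-style approximation that repeatedly adds the candidate maximizing the log-determinant of the selected kernel submatrix. The two-dimensional search space (log learning rate and dropout), the RBF kernel, and its `gamma` value are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Similarity matrix L[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def greedy_kdpp(L, k):
    """Greedy MAP-style approximation to k-DPP selection: at each step,
    add the candidate that most increases log det(L[S, S])."""
    n = L.shape[0]
    selected = []
    for _ in range(k):
        best_i, best_logdet = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            sub = L[np.ix_(idx, idx)]
            # Small ridge term keeps slogdet numerically stable.
            _, logdet = np.linalg.slogdet(sub + 1e-10 * np.eye(len(idx)))
            if logdet > best_logdet:
                best_i, best_logdet = i, logdet
        selected.append(best_i)
    return selected

# Hypothetical 2-D search space: log10(learning rate) in [-5, -1],
# dropout in [0.0, 0.9]. Draw a large pool of uniform candidates.
rng = np.random.default_rng(0)
candidates = rng.uniform([-5.0, 0.0], [-1.0, 0.9], size=(200, 2))

# Normalize each dimension to [0, 1] so the kernel treats the
# dimensions comparably (a simple stand-in for the paper's
# search-space transformation).
norm = (candidates - candidates.min(0)) / (candidates.max(0) - candidates.min(0))
L = rbf_kernel(norm, gamma=5.0)

# Select k = 10 diverse hyperparameter settings to evaluate.
picks = greedy_kdpp(L, k=10)
configs = candidates[picks]
```

The selected `configs` tend to spread out over the search space, whereas 10 independent uniform draws can cluster; under a limited training budget, the diverse batch covers more of the space per evaluation.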

