PHS: A Toolbox for Parallel Hyperparameter Search

02/26/2020
by   Peter Michael Habelitz, et al.

We introduce an open-source Python framework named PHS (Parallel Hyperparameter Search) that enables hyperparameter optimization of any arbitrary Python function across numerous compute instances. This is achieved with only minimal modifications inside the target function. Typical applications are expensive-to-evaluate numerical computations that depend strongly on hyperparameters, such as the training of machine learning models. Bayesian optimization is chosen as a sample-efficient method for proposing the next set of query parameters.
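PHS's actual API is not reproduced here; as an illustration of the general pattern the abstract describes — evaluating an arbitrary Python function in parallel over a set of proposed hyperparameter configurations and keeping the best one — here is a minimal self-contained sketch using only the standard library. The target function `objective`, its parameters, and the random proposal strategy (a simple stand-in for the sample-efficient Bayesian proposals the paper uses) are all hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor
import random

def objective(x, y):
    # Hypothetical expensive-to-evaluate target function; in practice this
    # would be e.g. a model-training run whose loss depends on the
    # hyperparameters x and y.
    return (x - 0.3) ** 2 + (y - 0.7) ** 2

def propose_configs(n, seed=0):
    # Propose n query points uniformly at random. PHS instead uses
    # Bayesian optimization to choose query points sample-efficiently;
    # random search merely keeps this sketch self-contained.
    rng = random.Random(seed)
    return [{"x": rng.uniform(0, 1), "y": rng.uniform(0, 1)} for _ in range(n)]

def evaluate(config):
    # Evaluate one configuration; each call is independent, so the
    # evaluations can run concurrently (PHS distributes them across
    # many compute instances).
    return config, objective(**config)

configs = propose_configs(16)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(evaluate, configs))

best_config, best_loss = min(results, key=lambda r: r[1])
print(best_config, best_loss)
```

For genuinely CPU-bound target functions, a process pool or a cluster scheduler would replace the thread pool; the search loop itself is unchanged.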


