HYPPO: A Surrogate-Based Multi-Level Parallelism Tool for Hyperparameter Optimization

10/04/2021
by Vincent Dumont et al.

We present HYPPO, a new software tool that enables automatic tuning of hyperparameters for a variety of deep learning (DL) models. Unlike other hyperparameter optimization (HPO) methods, HYPPO uses adaptive surrogate models and directly accounts for uncertainty in model predictions in order to find accurate and reliable models that make robust predictions. Using asynchronous nested parallelism, we significantly alleviate the computational burden of training complex architectures and quantifying the uncertainty. HYPPO is implemented in Python and can be used with both the TensorFlow and PyTorch libraries. We demonstrate various software features on time-series prediction and image classification problems, as well as on a scientific application in computed tomography image reconstruction. Finally, we show that (1) we can reduce by an order of magnitude the number of evaluations necessary to find the optimal region of the hyperparameter space and (2) we can reduce by two orders of magnitude the time required for such an HPO process to complete.
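For context on how a surrogate-based HPO loop of this kind operates, the sketch below shows a generic single-variable version in Python, using a Gaussian-process surrogate and a lower-confidence-bound rule to choose the next hyperparameter to evaluate. This is a minimal illustration of the surrogate idea only: the objective function, search range, and acquisition rule are hypothetical stand-ins, and the code reflects neither HYPPO's actual API nor its asynchronous nested parallelism.

```python
# Minimal sketch of surrogate-based hyperparameter optimization.
# NOT the HYPPO API: objective(), the search range, and the acquisition
# rule are hypothetical; parallelism and DL training are omitted.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(lr_exponent):
    # Hypothetical stand-in for training a DL model and returning its
    # validation loss; in practice this is the expensive step.
    return (lr_exponent + 3.0) ** 2 + np.random.normal(scale=0.1)

rng = np.random.default_rng(0)
# Initial design: a few random samples of the hyperparameter
# (here, log10 of the learning rate).
X = rng.uniform(-6, 0, size=(5, 1))
y = np.array([objective(x[0]) for x in X])

for _ in range(20):
    # Fit the surrogate to all evaluations collected so far.
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    # Score candidate points with a lower confidence bound, which uses
    # the surrogate's predictive uncertainty to trade off exploration
    # against exploitation, then evaluate the most promising one.
    cand = np.linspace(-6, 0, 200).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmin(mu - 1.96 * sigma)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best log10(lr):", X[np.argmin(y)][0], "loss:", y.min())
```

In a tool like HYPPO, the expensive objective (training the DL model) would additionally be dispatched asynchronously across parallel workers, and the uncertainty of the trained model's own predictions, not just the surrogate's, would feed into the search.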


Related research

05/30/2021 · Surrogate Model Based Hyperparameter Tuning for Deep Learning with SPOT
A surrogate model based hyperparameter tuning approach for deep learning...

03/24/2020 · Model-based Asynchronous Hyperparameter Optimization
We introduce a model-based asynchronous multi-fidelity hyperparameter op...

01/06/2021 · Hyperboost: Hyperparameter Optimization by Gradient Boosting surrogate models
Bayesian Optimization is a popular tool for tuning algorithms in automat...

07/30/2019 · pySOT and POAP: An event-driven asynchronous framework for surrogate optimization
This paper describes Plumbing for Optimization with Asynchronous Paralle...

02/05/2019 · How to "DODGE" Complex Software Analytics?
AI software is still software. Software engineers need better tools to m...

05/30/2019 · Meta-Surrogate Benchmarking for Hyperparameter Optimization
Despite the recent progress in hyperparameter optimization (HPO), availa...

05/27/2023 · Python Wrapper for Simulating Multi-Fidelity Optimization on HPO Benchmarks without Any Wait
Hyperparameter (HP) optimization of deep learning (DL) is essential for ...
