Massively Parallel Hyperparameter Tuning

10/13/2018
by Liam Li, et al.

Modern learning models are characterized by large hyperparameter spaces. In order to adequately explore these large spaces, we must evaluate a large number of configurations, typically orders of magnitude more configurations than available parallel workers. Given the growing costs of model training, we would ideally like to perform this search in roughly the same wall-clock time needed to train a single model. In this work, we tackle this challenge by introducing ASHA, a simple and robust hyperparameter tuning algorithm with solid theoretical underpinnings that exploits parallelism and aggressive early stopping. Our extensive empirical results show that ASHA slightly outperforms Fabolas and Population Based Training, two state-of-the-art hyperparameter tuning methods; scales linearly with the number of workers in distributed settings; converges to a high-quality configuration in half the time taken by Vizier (Google's internal hyperparameter tuning service) in an experiment with 500 workers; and beats the published result for a near state-of-the-art LSTM architecture in under 2x the time to train a single model.
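
The core of ASHA (asynchronous successive halving) is its promotion rule: whenever a worker becomes free, the master either promotes a configuration whose intermediate loss ranks in the top 1/eta of its current rung to the next, eta-times-larger budget, or starts a fresh configuration at the smallest budget, so no worker ever blocks waiting for a rung to fill. The sketch below illustrates that rule under our own assumptions; the class and method names, the default eta=4, and the dummy objective in the driver loop are hypothetical and are not the authors' reference implementation.

```python
import random


class ASHA:
    """Minimal sketch of asynchronous successive halving (ASHA).

    A configuration enters rung 0 with `min_resource` budget; a configuration
    at rung k is promoted to rung k+1 (with eta times more resource) as soon
    as its recorded loss ranks in the top 1/eta of all results seen at rung k.
    """

    def __init__(self, min_resource=1, eta=4, max_rung=5, sample_config=None):
        self.min_resource = min_resource
        self.eta = eta
        self.max_rung = max_rung
        # Random search over a toy space by default; replace with your own sampler.
        self.sample_config = sample_config or (lambda: {"lr": 10 ** random.uniform(-4, -1)})
        self.rungs = {k: [] for k in range(max_rung + 1)}        # rung -> [(loss, config), ...]
        self.promoted = {k: set() for k in range(max_rung + 1)}  # rung -> ids already promoted

    def resource(self, rung):
        return self.min_resource * (self.eta ** rung)

    def get_job(self):
        """Called whenever a worker is free; never waits on other workers."""
        # Prefer promotions from the highest rung that has one available.
        for rung in reversed(range(self.max_rung)):
            results = sorted(self.rungs[rung], key=lambda r: r[0])
            top_k = len(results) // self.eta  # top 1/eta of results seen so far
            for _, config in results[:top_k]:
                if id(config) not in self.promoted[rung]:
                    self.promoted[rung].add(id(config))
                    return config, rung + 1, self.resource(rung + 1)
        # No promotion is possible: grow the base rung with a new configuration.
        return self.sample_config(), 0, self.resource(0)

    def report(self, config, rung, loss):
        """Record the loss a worker observed after training `config` at `rung`."""
        self.rungs[rung].append((loss, config))


if __name__ == "__main__":
    # Serial driver standing in for a pool of workers; each get_job/report
    # pair would normally run on its own worker, fully asynchronously.
    asha = ASHA()
    for _ in range(200):
        config, rung, budget = asha.get_job()
        # Dummy objective in place of actual model training with `budget` resource.
        loss = (config["lr"] - 0.01) ** 2 + random.gauss(0, 0.01) / budget
        asha.report(config, rung, loss)
    top_rung = max(k for k, v in asha.rungs.items() if v)
    best = min(asha.rungs[top_rung], key=lambda r: r[0])
    print("best loss at highest populated rung:", best[0])
```

Because promotions are decided against the top 1/eta of results seen so far rather than waiting for a rung to fill, adding workers only increases throughput; this asynchrony is the property behind the linear scaling reported in the abstract.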

Related research

08/04/2022 | ACE: Adaptive Constraint-aware Early Stopping in Hyperparameter Optimization
Deploying machine learning models requires high model quality and needs ...

05/22/2020 | MANGO: A Python Library for Parallel Hyperparameter Tuning
Tuning hyperparameters for machine learning algorithms is a tedious task...

04/20/2018 | Autotune: A Derivative-free Optimization Framework for Hyperparameter Tuning
Machine learning applications often require hyperparameter tuning. The h...

09/16/2019 | Weighted Sampling for Combined Model Selection and Hyperparameter Tuning
The combined algorithm selection and hyperparameter tuning (CASH) proble...

03/13/2020 | Accelerating and Improving AlphaZero Using Population Based Training
AlphaZero has been very successful in many games. Unfortunately, it stil...

03/12/2019 | Exploiting Reuse in Pipeline-Aware Hyperparameter Tuning
Hyperparameter tuning of multi-stage pipelines introduces a significant ...
