PASHA: Efficient HPO with Progressive Resource Allocation

07/14/2022
by Ondrej Bohdal et al.

Hyperparameter optimization (HPO) and neural architecture search (NAS) are the methods of choice for obtaining best-in-class machine learning models, but in practice they can be costly to run. When models are trained on large datasets, tuning them with HPO or NAS rapidly becomes prohibitively expensive for practitioners, even when efficient multi-fidelity methods are employed. We propose an approach to tackle the challenge of tuning machine learning models trained on large datasets with limited computational resources. Our approach, named PASHA, dynamically increases the maximum resources allocated to the tuning procedure as they are needed. Our experimental comparison shows that PASHA identifies well-performing hyperparameter configurations and architectures while consuming significantly fewer computational resources than solutions like ASHA.
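To give a rough sense of the "allocate more resources only when needed" idea, below is a minimal Python sketch. It is not the authors' implementation: the synthetic objective, the successive-halving bookkeeping, and the simplified rule of growing the maximum budget while the top two rungs still disagree on the ranking of their shared configurations are illustrative assumptions based on the paper's high-level description.

```python
# Minimal illustrative sketch of PASHA-style progressive resource allocation.
# NOT the authors' code: the objective, rung bookkeeping, and ranking-agreement
# rule below are simplified assumptions used only to convey the idea that the
# maximum budget grows on demand instead of being fixed up front.

import random

random.seed(0)

REDUCTION_FACTOR = 3      # eta in successive halving
MIN_RESOURCE = 1          # smallest budget (e.g. epochs)
HARD_MAX_RESOURCE = 81    # the budget a fixed-maximum ASHA run would commit to


def evaluate(config, resource):
    """Synthetic validation error: better configs improve faster with budget."""
    return config["quality"] / (1.0 + resource) + 0.01 * random.random()


def rung_levels(current_max):
    """Geometric rung schedule 1, eta, eta^2, ... up to the current maximum."""
    levels, r = [], MIN_RESOURCE
    while r <= current_max:
        levels.append(r)
        r *= REDUCTION_FACTOR
    return levels


def rankings_agree(results, top_rungs):
    """Simplified stability check: do the two highest rungs rank the
    configurations they share in the same order?"""
    hi, lo = top_rungs
    shared = [c for c in results.get(hi, {}) if c in results.get(lo, {})]
    if len(shared) < 2:
        return True
    order_hi = sorted(shared, key=lambda c: results[hi][c])
    order_lo = sorted(shared, key=lambda c: results[lo][c])
    return order_hi == order_lo


# A small pool of random configurations (stand-in for an HPO search space).
configs = [{"id": i, "quality": random.uniform(0.1, 1.0)} for i in range(9)]

# Start with a small maximum resource and grow it only when needed.
current_max = MIN_RESOURCE * REDUCTION_FACTOR
results = {}  # rung level -> {config id: validation error}

while True:
    levels = rung_levels(current_max)
    # Promote the top 1/eta fraction of each rung, as in successive halving.
    survivors = configs
    for r in levels:
        results.setdefault(r, {})
        for cfg in survivors:
            results[r][cfg["id"]] = evaluate(cfg, r)
        ranked = sorted(survivors, key=lambda c: results[r][c["id"]])
        survivors = ranked[: max(1, len(ranked) // REDUCTION_FACTOR)]

    # Stop growing the budget once the top two rungs agree on the ranking,
    # or once we reach the hard maximum a fixed ASHA run would have used.
    if len(levels) >= 2 and rankings_agree(results, (levels[-1], levels[-2])):
        break
    if current_max >= HARD_MAX_RESOURCE:
        break
    current_max *= REDUCTION_FACTOR  # allocate more resources on demand

best = min(results[levels[-1]], key=results[levels[-1]].get)
print(f"max resource used: {current_max}, best config id: {best}")
```

In contrast to a run that commits to HARD_MAX_RESOURCE from the start, this loop only reaches that budget if the rankings keep changing as more resources are added, which is the source of the savings the abstract refers to.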


Related research

- Sherpa: Robust Hyperparameter Optimization for Machine Learning (05/08/2020)
  Sherpa is a hyperparameter optimization library for machine learning mod...
- EAT-NAS: Elastic Architecture Transfer for Accelerating Large-scale Neural Architecture Search (01/17/2019)
  Neural architecture search (NAS) methods have been proposed to release h...
- A resource-efficient method for repeated HPO and NAS problems (03/30/2021)
  In this work we consider the problem of repeated hyperparameter and neur...
- Efficient Training Under Limited Resources (01/23/2023)
  Training time budget and size of the dataset are among the factors affec...
- A Simple and Fast Baseline for Tuning Large XGBoost Models (11/12/2021)
  XGBoost, a scalable tree boosting algorithm, has proven effective for ma...
- Neural Architecture Search using Bayesian Optimisation with Weisfeiler-Lehman Kernel (06/13/2020)
  Bayesian optimisation (BO) has been widely used for hyperparameter optim...
- Is One Epoch All You Need For Multi-Fidelity Hyperparameter Optimization? (07/28/2023)
  Hyperparameter optimization (HPO) is crucial for fine-tuning machine lea...
