A multi-objective perspective on jointly tuning hardware and hyperparameters

06/10/2021
by   David Salinas, et al.

In addition to the best model architecture and hyperparameters, a full AutoML solution requires selecting appropriate hardware automatically. This can be framed as a multi-objective optimization problem: there is not a single best hardware configuration but a set of optimal ones achieving different trade-offs between cost and runtime. In practice, some configurations may be prohibitively costly or require days of training. To lift this burden, we adopt a multi-objective approach that selects and adapts the hardware configuration automatically alongside neural architectures and their hyperparameters. Our method builds on Hyperband and extends it in two ways. First, we replace the stopping rule used in Hyperband with a non-dominated sorting rule to preemptively stop unpromising configurations. Second, we leverage hyperparameter evaluations from related tasks via transfer learning by building a probabilistic estimate of the Pareto front that finds promising configurations more efficiently than random search. We show in extensive NAS and HPO experiments that both ingredients bring significant speed-ups and cost savings, with little to no impact on accuracy. In three benchmarks where hardware is selected in addition to hyperparameters, we obtain runtime and cost reductions of at least 5.8x and 8.8x, respectively. Furthermore, when applying our multi-objective method to the tuning of hyperparameters only, we obtain a 10% improvement in runtime while maintaining the same accuracy on two popular NAS benchmarks.
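To illustrate the idea behind the non-dominated sorting stopping rule, the sketch below ranks partially trained configurations by Pareto dominance over two objectives (accumulated cost and validation error) and keeps only the leading fronts, stopping the rest. This is a hypothetical, minimal illustration, not the paper's implementation; the names Candidate, non_dominated_sort, and select_survivors are assumptions made for the example.

```python
# Minimal sketch (assumptions, not the authors' code): rank partially trained
# configurations at a Hyperband rung by Pareto dominance on cost and
# validation error, then keep only the most promising trade-offs.

from dataclasses import dataclass
from typing import List


@dataclass
class Candidate:
    config_id: int
    cost: float        # accumulated training cost so far (e.g. dollars)
    val_error: float   # validation error at the current rung


def dominates(a: Candidate, b: Candidate) -> bool:
    """a dominates b if it is no worse on both objectives and strictly better on one."""
    return (a.cost <= b.cost and a.val_error <= b.val_error
            and (a.cost < b.cost or a.val_error < b.val_error))


def non_dominated_sort(cands: List[Candidate]) -> List[List[Candidate]]:
    """Split candidates into successive Pareto fronts (front 0 = non-dominated set)."""
    fronts, remaining = [], list(cands)
    while remaining:
        front = [c for c in remaining
                 if not any(dominates(o, c) for o in remaining if o is not c)]
        fronts.append(front)
        remaining = [c for c in remaining if c not in front]
    return fronts


def select_survivors(cands: List[Candidate], keep: int) -> List[Candidate]:
    """Promote configurations front by front; everything beyond `keep` is stopped."""
    survivors: List[Candidate] = []
    for front in non_dominated_sort(cands):
        for c in front:
            if len(survivors) < keep:
                survivors.append(c)
    return survivors


if __name__ == "__main__":
    rung = [Candidate(0, cost=1.0, val_error=0.30),
            Candidate(1, cost=4.0, val_error=0.10),
            Candidate(2, cost=2.0, val_error=0.35),  # dominated by candidate 0
            Candidate(3, cost=0.5, val_error=0.50)]
    print([c.config_id for c in select_survivors(rung, keep=2)])
```

In this toy example, candidate 2 is dominated (more expensive and less accurate than candidate 0) and would be stopped early, while the surviving configurations cover different cost/error trade-offs rather than optimizing a single objective.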


