Multi-objective Tree-structured Parzen Estimator Meets Meta-learning

12/13/2022
by Shuhei Watanabe, et al.

Hyperparameter optimization (HPO) is essential for strong deep-learning performance, and practitioners often need to trade off multiple metrics, such as error rate, latency, memory footprint, robustness, and algorithmic fairness. Given this demand and the heavy computational cost of deep learning, accelerating multi-objective (MO) optimization is increasingly important. Although meta-learning has been studied extensively to speed up HPO, existing methods are not applicable to the multi-objective tree-structured Parzen estimator (MO-TPE), a simple yet powerful MO-HPO algorithm. In this paper, we extend TPE's acquisition function to the meta-learning setting, using a task similarity defined by the overlap of the promising domains of each task. In a comprehensive set of experiments, we demonstrate that our method accelerates MO-TPE on tabular HPO benchmarks and achieves state-of-the-art performance. Our method was also validated externally by winning the AutoML 2022 competition on "Multiobjective Hyperparameter Optimization for Transformers".
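The key ingredient is a similarity between tasks measured by how much their "promising" hyperparameter regions overlap. The snippet below is a minimal toy sketch of that intuition, not the paper's exact estimator: it assumes both tasks were evaluated on the same configuration grid, treats each task's best gamma-quantile of observed losses as its promising region, and scores overlap via intersection over union. The function names (`promising_mask`, `task_similarity`) and the IoU choice are illustrative assumptions.

```python
# Toy sketch (illustrative only, NOT the paper's exact formulation):
# task similarity as the overlap of "promising" regions, where a task's
# promising region is the set of configurations whose observed loss falls
# in its best gamma-quantile. Both tasks are assumed to be evaluated on
# the same configuration grid; the names below are hypothetical.
import numpy as np


def promising_mask(losses: np.ndarray, gamma: float = 0.25) -> np.ndarray:
    """Mark configurations whose loss lies in the best gamma-quantile."""
    return losses <= np.quantile(losses, gamma)


def task_similarity(losses_a: np.ndarray, losses_b: np.ndarray,
                    gamma: float = 0.25) -> float:
    """Intersection-over-union of the two tasks' promising regions."""
    mask_a = promising_mask(losses_a, gamma)
    mask_b = promising_mask(losses_b, gamma)
    union = np.logical_or(mask_a, mask_b).sum()
    return float(np.logical_and(mask_a, mask_b).sum() / union) if union else 0.0


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    grid = np.linspace(0.0, 1.0, 200)  # shared 1-D hyperparameter grid
    # Two related tasks with similar (but not identical) loss landscapes.
    loss_meta = (grid - 0.30) ** 2 + 0.01 * rng.normal(size=grid.size)
    loss_target = (grid - 0.35) ** 2 + 0.01 * rng.normal(size=grid.size)
    print(f"estimated task similarity: {task_similarity(loss_meta, loss_target):.2f}")
```

In the paper, a similarity of this kind is used to weight the knowledge transferred from meta-tasks into TPE's acquisition function; the sketch above only illustrates the overlap idea, not the full MO-TPE machinery.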

