Hyperparameter Transfer Learning with Adaptive Complexity

02/25/2021
by Samuel Horvath, et al.

Bayesian optimization (BO) is a sample-efficient approach to automatically tuning the hyperparameters of machine learning models. In practice, one frequently has to solve similar hyperparameter tuning problems sequentially. For example, one might have to tune a type of neural network learned across a series of different classification problems. Recent work on multi-task BO exploits knowledge gained from previous tuning tasks to speed up a new tuning task. However, previous approaches do not account for the fact that BO is a sequential decision-making procedure. Hence, there is in general a mismatch between the number of evaluations collected in the current tuning task and the number of evaluations accumulated in all previously completed tasks. In this work, we enable multi-task BO to compensate for this mismatch, such that the transfer learning procedure is able to handle different data regimes in a principled way. We propose a new multi-task BO method that learns a set of ordered, non-linear basis functions of increasing complexity via nested dropout and automatic relevance determination. Experiments on a variety of hyperparameter tuning problems show that our method improves the sample efficiency of recently published multi-task BO methods.
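The two ingredients named in the abstract, nested dropout over an ordered basis and an automatic relevance determination (ARD) prior, can be illustrated compactly. The NumPy sketch below is an illustration under stated assumptions, not the authors' implementation: the polynomial basis, the geometric truncation distribution, and the parameter values (p, alpha, beta) are all placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def nested_dropout_mask(num_basis, p=0.2, rng=rng):
    """Sample a truncation index k ~ Geometric(p) and keep only the
    first k basis functions, inducing an ordering by importance."""
    k = min(rng.geometric(p), num_basis)
    mask = np.zeros(num_basis)
    mask[:k] = 1.0
    return mask

def ard_posterior(Phi, y, alpha, beta):
    """Bayesian linear regression head with a per-basis ARD prior
    w_i ~ N(0, 1/alpha_i). A large alpha_i effectively prunes basis
    function i, so model complexity adapts to the amount of data."""
    A = np.diag(alpha) + beta * Phi.T @ Phi      # posterior precision
    mean = beta * np.linalg.solve(A, Phi.T @ y)  # posterior mean
    return mean, A

# Toy data: a 1-D hyperparameter and a noisy tuning objective.
X = rng.uniform(-1, 1, size=(20, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(20)

# Hypothetical ordered basis: polynomial features of increasing degree.
num_basis = 8
Phi = np.hstack([X ** d for d in range(num_basis)])

# One training step with nested dropout: truncate the basis at random,
# then fit the ARD head on the truncated features.
mask = nested_dropout_mask(num_basis)
mean, A = ard_posterior(Phi * mask, y, alpha=np.ones(num_basis), beta=25.0)
print("posterior mean weights:", np.round(mean, 3))
```

Roughly, in the multi-task setting the ordered basis would be shared across tuning tasks, and the ARD precisions let the surrogate use few basis functions when the current task has only a handful of evaluations and progressively more as data accumulates, which is how the method handles the data-regime mismatch described above.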

Related Research

02/23/2020 · Weighting Is Worth the Wait: Bayesian Optimization with Importance Sampling
Many contemporary machine learning models require extensive tuning of hy...

09/30/2019 · A Copula approach for hyperparameter transfer learning
Bayesian optimization (BO) is a popular methodology to tune the hyperpar...

09/16/2021 · Automatic prior selection for meta Bayesian optimization with a case study on tuning deep neural network optimizers
The performance of deep neural networks can be highly sensitive to the c...

07/07/2022 · Pre-training helps Bayesian optimization too
Bayesian optimization (BO) has become a popular strategy for global opti...

04/27/2023 · π-Tuning: Transferring Multimodal Foundation Models with Optimal Multi-task Interpolation
Foundation models have achieved great advances in multi-task learning wi...

06/29/2023 · Obeying the Order: Introducing Ordered Transfer Hyperparameter Optimisation
We introduce ordered transfer hyperparameter optimisation (OTHPO), a ver...

11/17/2021 · Self-Learning Tuning for Post-Silicon Validation
Increasing complexity of modern chips makes design validation more diffi...
