Hyperparameter Transfer Learning with Adaptive Complexity

02/25/2021
by Samuel Horvath, et al.

Bayesian optimization (BO) is a sample-efficient approach to automatically tuning the hyperparameters of machine learning models. In practice, one frequently has to solve similar hyperparameter tuning problems sequentially. For example, one might have to tune a type of neural network across a series of different classification problems. Recent work on multi-task BO exploits knowledge gained from previous tuning tasks to speed up a new tuning task. However, previous approaches do not account for the fact that BO is a sequential decision-making procedure. Hence, there is in general a mismatch between the number of evaluations collected in the current tuning task and the number of evaluations accumulated across all previously completed tasks. In this work, we enable multi-task BO to compensate for this mismatch, such that the transfer learning procedure can handle different data regimes in a principled way. We propose a new multi-task BO method that learns a set of ordered, non-linear basis functions of increasing complexity via nested dropout and automatic relevance determination. Experiments on a variety of hyperparameter tuning problems show that our method improves the sample efficiency.
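The core idea of nested dropout mentioned in the abstract can be illustrated with a small sketch: a truncation index is sampled, and all basis-function activations beyond that index are zeroed, so earlier units are retained more often and learn the most important structure first. This is a hedged, illustrative sketch (the function name and geometric-sampling choice are assumptions, not the paper's exact implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def nested_dropout(features, p=0.3, rng=rng):
    """Illustrative nested dropout: sample a truncation index b and
    zero out all feature dimensions beyond b. Because small indices
    are kept more often, the basis functions become ordered by
    importance, i.e. of increasing complexity."""
    d = features.shape[-1]
    b = min(int(rng.geometric(p)), d)  # truncation index in 1..d
    mask = np.zeros(d)
    mask[:b] = 1.0                     # keep the first b units only
    return features * mask

phi = np.ones((1, 5))                  # toy basis-function activations
out = nested_dropout(phi)
```

During training, averaging over many sampled truncation indices pressures early units to carry as much predictive signal as possible, which is what lets the transfer procedure adapt its effective complexity to the amount of data available in the current task.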


02/23/2020

Weighting Is Worth the Wait: Bayesian Optimization with Importance Sampling

Many contemporary machine learning models require extensive tuning of hy...
09/30/2019

A Copula approach for hyperparameter transfer learning

Bayesian optimization (BO) is a popular methodology to tune the hyperpar...
09/16/2021

Automatic prior selection for meta Bayesian optimization with a case study on tuning deep neural network optimizers

The performance of deep neural networks can be highly sensitive to the c...
07/07/2022

Pre-training helps Bayesian optimization too

Bayesian optimization (BO) has become a popular strategy for global opti...
12/30/2019

Model-Agnostic Approaches to Multi-Objective Simultaneous Hyperparameter Tuning and Feature Selection

Highly non-linear machine learning algorithms have the capacity to handl...
11/17/2021

Self-Learning Tuning for Post-Silicon Validation

Increasing complexity of modern chips makes design validation more diffi...
10/15/2018

Hyperparameter Learning via Distributional Transfer

Bayesian optimisation is a popular technique for hyperparameter learning...