Learning to Warm-Start Bayesian Hyperparameter Optimization

10/17/2017
by Jungtaek Kim, et al.

Hyperparameter optimization requires extensive evaluation of validation errors in order to find the best configuration. Bayesian optimization is now popular for hyperparameter optimization, since it reduces the number of validation error evaluations required. Suppose that we are given a collection of datasets on which hyperparameters are already tuned, either by humans with domain expertise or by extensive trials of cross-validation. When a model is applied to a new dataset, it is desirable to let Bayesian optimization start from configurations that were successful on similar datasets. To this end, we construct a Siamese network with convolutional layers followed by bi-directional LSTM layers to learn meta-features over image datasets. The learned meta-features are used to select a few datasets that are similar to the new dataset, so that a set of configurations from those similar datasets is adopted as the initialization to warm-start Bayesian hyperparameter optimization. Experiments on image datasets demonstrate that our learned meta-features are useful in optimizing hyperparameters of deep residual networks for image classification.
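The warm-start step described above can be sketched as a nearest-neighbor lookup in the learned meta-feature space: the new dataset's meta-feature vector is compared against those of previously tuned datasets, and the best-known configurations of the closest datasets seed the optimizer. The snippet below is a minimal illustration under assumed names (`warm_start_configs`, Euclidean distance over meta-feature vectors); the paper's actual meta-features come from a trained Siamese network, which is not reproduced here.

```python
import numpy as np

def warm_start_configs(new_meta, dataset_metas, tuned_configs, k=3):
    """Select initial configurations for Bayesian optimization.

    new_meta      : meta-feature vector of the new dataset
    dataset_metas : (n_datasets, d) array of meta-feature vectors
                    for datasets with already-tuned hyperparameters
    tuned_configs : list of the best-known configuration per dataset
    k             : number of nearest datasets to borrow from

    Returns the configurations of the k datasets whose meta-features
    are closest (Euclidean distance, as a simple stand-in for the
    similarity the Siamese network is trained to capture).
    """
    dists = np.linalg.norm(dataset_metas - new_meta, axis=1)
    nearest = np.argsort(dists)[:k]
    return [tuned_configs[i] for i in nearest]

# Toy example: three previously tuned datasets with 2-d meta-features.
metas = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
configs = [{"lr": 0.1}, {"lr": 0.01}, {"lr": 1.0}]
inits = warm_start_configs(np.array([0.1, 0.1]), metas, configs, k=2)
```

In practice, `inits` would be passed to a Bayesian optimization loop as its initial design in place of random points, so the surrogate model starts with observations near promising regions.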


