Learning search spaces for Bayesian optimization: Another view of hyperparameter transfer learning

09/27/2019
by Valerio Perrone et al.

Bayesian optimization (BO) is a successful methodology for optimizing black-box functions that are expensive to evaluate. While traditional methods optimize each black-box function in isolation, there has been recent interest in speeding up BO by transferring knowledge across multiple related black-box functions. In this work, we introduce a method to automatically design the BO search space by relying on evaluations of previous black-box functions. We depart from the common practice of defining a set of arbitrary search ranges a priori by considering search space geometries that are learned from historical data. This simple yet effective strategy can be used to endow many existing BO methods with transfer learning properties. Despite its simplicity, we show that our approach considerably boosts BO by reducing the size of the search space, thus accelerating the optimization of a variety of black-box optimization problems. In particular, the proposed approach combined with random search results in a parameter-free, easy-to-implement, robust hyperparameter optimization strategy. We hope it will constitute a natural baseline for further research attempting to warm-start BO.
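To make the idea concrete, here is a minimal sketch of the random-search variant described above. It learns a bounding-box search space from the best configurations found on previously optimized tasks, then runs plain random search inside that box. The function names, the toy objective, and the historical data are all illustrative assumptions, not the paper's actual implementation.

```python
import random

def learn_box(best_configs):
    """Learn a per-dimension bounding box from the best configurations
    of previously optimized tasks (one possible search space geometry)."""
    dims = len(best_configs[0])
    low = [min(c[d] for c in best_configs) for d in range(dims)]
    high = [max(c[d] for c in best_configs) for d in range(dims)]
    return low, high

def random_search(objective, low, high, n_iter=50, seed=0):
    """Plain random search restricted to the learned box."""
    rng = random.Random(seed)
    best_x, best_y = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in zip(low, high)]
        y = objective(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Hypothetical historical data: best (learning_rate, weight_decay)
# found on three related tasks.
history = [[0.01, 1e-4], [0.03, 5e-4], [0.02, 2e-4]]
low, high = learn_box(history)

# Toy quadratic standing in for an expensive black-box function.
best_x, best_y = random_search(lambda x: (x[0] - 0.02) ** 2 + x[1],
                               low, high)
```

Because the learned box is typically much smaller than a hand-specified range, the same random-search budget covers it far more densely, which is where the speedup comes from.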


Related research:

- 12/18/2020: Solving Black-Box Optimization Challenge via Learning Search Space Partition for Local Bayesian Optimization
- 05/06/2022: Generative Evolutionary Strategy For Black-Box Optimizations
- 03/25/2022: LAMBDA: Covering the Solution Set of Black-Box Inequality by Search Space Quantization
- 09/30/2019: A Copula approach for hyperparameter transfer learning
- 05/18/2023: Neuromorphic Bayesian Optimization in Lava
- 05/03/2016: Blackbox: A procedure for parallel optimization of expensive black-box functions
- 06/11/2021: HPO-B: A Large-Scale Reproducible Benchmark for Black-Box HPO based on OpenML
