Bayesian Optimization Allowing for Common Random Numbers

10/21/2019
by Michael Pearce, et al.

Bayesian optimization is a powerful tool for expensive stochastic black-box optimization problems such as simulation-based optimization or machine learning hyperparameter tuning. Many stochastic objective functions implicitly require a random number seed as input. By explicitly reusing a seed, a user can exploit common random numbers, comparing two or more inputs under the same randomly generated scenario, such as a common customer stream in a job shop problem, or the same random partition of data into training and validation sets for a machine learning algorithm. With the aim of finding an input with the best average performance over infinitely many seeds, we propose a novel Gaussian process model that jointly models both the output for each seed and the average. We then introduce the Knowledge Gradient for Common Random Numbers, which iteratively determines a combination of input and random seed at which to evaluate the objective and automatically trades off reusing old seeds against querying new seeds, thus overcoming the need to evaluate inputs in batches or to measure differences of pairs as suggested in previous methods. We investigate the Knowledge Gradient for Common Random Numbers both theoretically and empirically, finding that it achieves significant performance improvements at only moderate added computational cost.
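The variance-reduction effect of common random numbers described above can be illustrated with a small sketch. The objective below is hypothetical (not the paper's benchmark): a quadratic whose noise is driven by the seed, so that evaluating two inputs under the same seed produces correlated outcomes and their difference has far lower variance than under independent seeds.

```python
import numpy as np

def noisy_objective(x, seed):
    """Hypothetical stochastic objective: a quadratic plus seed-driven noise.
    Reusing `seed` replays the same random scenario for any input x."""
    rng = np.random.default_rng(seed)
    # Noise magnitude depends mildly on x, so CRN differences are not exactly zero.
    return -(x - 2.0) ** 2 + (1.0 + 0.1 * x) * rng.normal(scale=2.0)

# Compare two nearby inputs under common random numbers (shared seeds) ...
seeds = range(50)
crn_diffs = [noisy_objective(1.9, s) - noisy_objective(2.1, s) for s in seeds]

# ... versus drawing an independent fresh seed for every evaluation.
rng = np.random.default_rng(0)
ind_diffs = [
    noisy_objective(1.9, int(rng.integers(1 << 30)))
    - noisy_objective(2.1, int(rng.integers(1 << 30)))
    for _ in seeds
]

# The paired (CRN) estimator of the performance difference is far less noisy.
print("CRN variance:", np.var(crn_diffs))
print("Independent variance:", np.var(ind_diffs))
```

This is exactly the setting the proposed Gaussian process model exploits: per-seed outputs are correlated across inputs, so jointly modeling them lets the acquisition function decide when replaying an old seed is more informative than sampling a new one.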


