Cost-aware Multi-objective Bayesian optimisation

09/09/2019
by Majid Abdolshah, et al.

The notion of expense in Bayesian optimisation generally refers to function evaluations that are uniformly expensive over the whole search space. In some scenarios, however, the evaluation cost of the black-box objective functions is non-uniform, since different inputs from the search space may incur different costs. We introduce a cost-aware multi-objective Bayesian optimisation method that handles non-uniform evaluation costs by defining cost-aware constraints over the search space. The cost-aware constraints are a sorted tuple of indexes that represent the ordering of the dimensions of the search space based on the user's prior knowledge of their cost of usage. We formulate a new multi-objective Bayesian optimisation acquisition function, with a detailed convergence analysis, that incorporates these cost-aware constraints while optimising the objective functions. We demonstrate our algorithm on synthetic problems and on real-world hyperparameter tuning of neural networks and random forests.
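The core idea sketched in the abstract, ordering search-space dimensions by their cost of usage and folding that ordering into the selection of the next evaluation point, can be illustrated with a small, self-contained example. The sketch below is not the paper's acquisition function: it uses a plain NumPy Gaussian-process surrogate, a single scalarised objective instead of multiple objectives, and a generic "expected improvement per unit cost" selection rule as a stand-in for the cost-aware acquisition. All names here (`evaluation_cost`, `cost_order`, the exponential cost weights, the toy objective) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a cost-aware selection step, assuming a NumPy GP surrogate
# and an "expected improvement per unit cost" rule. Illustrative only; not the
# acquisition function proposed in the paper.
import numpy as np
from scipy.stats import norm

def rbf(A, B, ls=0.3):
    # Squared-exponential kernel between two sets of points.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Exact GP posterior mean and standard deviation at candidate points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(np.diag(rbf(Xs, Xs)) - (v ** 2).sum(0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # Standard EI for minimisation.
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def evaluation_cost(Xs, cost_order, base=1.0, growth=2.0):
    # cost_order: tuple of dimension indexes sorted from cheapest to most
    # expensive (the user's prior knowledge). Here we simply assume that
    # larger values along more expensive dimensions cost more.
    ranks = np.argsort(cost_order)            # rank of each dimension
    weights = growth ** ranks
    return base + Xs @ weights

rng = np.random.default_rng(0)
f = lambda X: np.sin(3 * X[:, 0]) + X[:, 1] ** 2   # toy scalarised objective
X = rng.uniform(size=(8, 2))                       # initial design
y = f(X)

Xs = rng.uniform(size=(500, 2))                    # candidate pool
mu, sigma = gp_posterior(X, y, Xs)
ei = expected_improvement(mu, sigma, y.min())
cost = evaluation_cost(Xs, cost_order=(0, 1))      # dim 0 cheaper than dim 1
x_next = Xs[np.argmax(ei / cost)]                  # cost-aware selection
print("next evaluation point:", x_next)
```

Dividing the acquisition value by a user-informed cost model is only one simple way to bias the search towards cheap regions; the paper instead encodes the cost ordering as constraints inside the acquisition function itself.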

