Are we Forgetting about Compositional Optimisers in Bayesian Optimisation?

by Antoine Grosnit et al.

Bayesian optimisation presents a sample-efficient methodology for global optimisation. Within this framework, a crucial performance-determining subroutine is the maximisation of the acquisition function, a task complicated by the fact that acquisition functions tend to be non-convex and thus nontrivial to optimise. In this paper, we undertake a comprehensive empirical study of approaches to maximise the acquisition function. Additionally, by deriving novel, yet mathematically equivalent, compositional forms for popular acquisition functions, we recast the maximisation task as a compositional optimisation problem, allowing us to benefit from the extensive literature in this field. We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments comprising synthetic optimisation tasks as well as tasks from Bayesmark. Given the generality of the acquisition function maximisation subroutine, we posit that the adoption of compositional optimisers has the potential to yield performance improvements across all domains in which Bayesian optimisation is currently being applied.
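The compositional recasting described above turns acquisition-function maximisation into a nested objective of the form f(E[g(x; ξ)]), which compositional solvers handle by tracking a running estimate of the inner expectation. As a minimal illustrative sketch (the toy objective, function names, and step sizes below are assumptions for illustration, not taken from the paper), stochastic compositional gradient descent (SCGD) on such a nested problem looks like this:

```python
import numpy as np

# Illustrative sketch of stochastic compositional gradient descent (SCGD)
# for a nested objective min_x f(E[g(x; xi)]), the general problem class
# the paper's compositional optimisers target. Toy problem (assumed here):
#   g(x; xi) = x + xi   (noisy inner map, so E[g(x; xi)] = x)
#   f(y)     = ||y||^2  (smooth outer function)
# hence the true minimiser is x = 0.

def scgd(x0, steps=2000, alpha=0.05, beta=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    y = x.copy()                        # running estimate of E[g(x; xi)]
    for _ in range(steps):
        xi = rng.normal(scale=0.1, size=x.shape)
        g = x + xi                      # noisy evaluation of the inner map
        y = (1 - beta) * y + beta * g   # track the inner expectation
        grad_f = 2.0 * y                # gradient of f at the tracked value
        jac_g = 1.0                     # Jacobian of g w.r.t. x is identity here
        x = x - alpha * jac_g * grad_f  # chain-rule compositional update
    return x

x_star = scgd(x0=[3.0, -2.0])
print(x_star)  # converges near the minimiser x = 0
```

The key design point is the auxiliary variable `y`: evaluating the outer gradient at a single noisy inner sample would be biased, so the solver smooths inner evaluations over time instead, which is what distinguishes compositional methods from plain stochastic gradient descent.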




Related research

- HEBO: Heteroscedastic Evolutionary Bayesian Optimisation
- What Makes an Effective Scalarising Function for Multi-Objective Bayesian Optimisation?
- Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly
- Bayesian Optimisation for Constrained Problems
- Compositional ADAM: An Adaptive Compositional Solver
- Emergent Language Generalization and Acquisition Speed are not tied to Compositionality
- GLASSES: Relieving The Myopia Of Bayesian Optimisation