Testing Surrogate-Based Optimization with the Fortified Branin-Hoo Extended to Four Dimensions

07/16/2021
by Charles F. Jekel, et al.

Some popular functions used to test global optimization algorithms have multiple local optima that all share the same value, making them all global optima. It is easy to make such functions more challenging by fortifying them, that is, by adding a localized bump at the location of one of the optima. In previous work the authors illustrated this for the Branin-Hoo function and the popular differential evolution algorithm, showing that the fortified Branin-Hoo required an order of magnitude more function evaluations. This paper examines the effect of fortifying the Branin-Hoo function on surrogate-based optimization, which usually proceeds by adaptive sampling. Two algorithms are considered: the EGO algorithm, which is based on a Gaussian process (GP), and an algorithm based on radial basis functions (RBF). EGO is found to be more frugal in the number of function evaluations required to identify the correct basin, but it is expensive to run on a desktop, limiting the number of times the runs could be repeated to establish sound statistics on the number of required function evaluations. The RBF algorithm was cheaper to run, providing more sound statistics on performance. A four-dimensional version of the Branin-Hoo function was introduced in order to assess the effect of dimensionality. It was found that the difference between the ordinary function and the fortified one was much more pronounced for the four-dimensional function than for the two-dimensional one.
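The abstract does not reproduce the exact fortification used by the authors, so the sketch below is only illustrative: it evaluates the standard two-dimensional Branin-Hoo function and then deepens one of its three equal-valued optima by subtracting a localized Gaussian bump. The function name fortified_branin_hoo, the chosen optimum (-pi, 12.275), and the depth and width parameters are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def branin_hoo(x1, x2):
    """Standard Branin-Hoo function; three global minima, all ~0.397887."""
    a, b, c = 1.0, 5.1 / (4 * np.pi ** 2), 5.0 / np.pi
    r, s, t = 6.0, 10.0, 1.0 / (8 * np.pi)
    return a * (x2 - b * x1 ** 2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s

def fortified_branin_hoo(x1, x2, depth=1.0, width=0.5):
    """Illustrative fortification: subtract a Gaussian bump centered on one
    of the three optima (here (-pi, 12.275)) so that it becomes the unique
    global optimum. Bump shape and parameters are placeholders, not the
    exact form used in the paper."""
    x1_star, x2_star = -np.pi, 12.275
    bump = depth * np.exp(-((x1 - x1_star) ** 2 + (x2 - x2_star) ** 2) / (2 * width ** 2))
    return branin_hoo(x1, x2) - bump

if __name__ == "__main__":
    # The three standard optima share the same value...
    optima = [(-np.pi, 12.275), (np.pi, 2.275), (9.42478, 2.475)]
    print([round(branin_hoo(*x), 6) for x in optima])
    # ...but only the fortified one is deepened, making it the unique global optimum.
    print([round(fortified_branin_hoo(*x), 6) for x in optima])
```

A global optimizer, or a surrogate-based algorithm such as EGO, must then locate the single deepened basin among several nearly identical ones, which is what makes the fortified version harder.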

