Benchmarking the Hooke-Jeeves Method, MTS-LS1, and BSrr on the Large-scale BBOB Function Set

04/28/2022
by   Ryoji Tanabe, et al.

This paper investigates the performance of three separability-exploiting black-box optimizers, the Hooke-Jeeves method, MTS-LS1, and BSrr, on the 24 large-scale BBOB functions. Although BSrr was not specifically designed for large-scale optimization, the results show that it achieves state-of-the-art performance on the five separable large-scale BBOB functions. The results also show that asymmetry in the test functions significantly influences the performance of MTS-LS1, and that the Hooke-Jeeves method outperforms MTS-LS1 on the unimodal separable BBOB functions.
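MTS-LS1, the first local search of the Multiple Trajectory Search framework (Tseng and Chen, 2008), scans one coordinate at a time, which is why it can exploit separability, and its coordinate steps are asymmetric: it first tries a full step in the negative direction and only then a half step in the positive direction. That built-in asymmetry is a plausible reason why the asymmetry of a test function matters for the method. The following Python sketch shows the basic search loop; it is not the implementation benchmarked in the paper, and the shrink factor, range-reset rule, and evaluation budget are illustrative assumptions.

import numpy as np

def mts_ls1(f, x, lower, upper, max_evals=10_000):
    """Minimal sketch of the MTS-LS1 coordinate search.

    f            : objective function mapping a 1-D array to a float
    x            : initial solution (modified in place)
    lower, upper : per-coordinate bounds as 1-D arrays
    """
    sr = 0.4 * (upper - lower)               # per-coordinate search range
    fx = f(x)
    evals = 1
    while evals < max_evals:
        improved = False
        for i in range(len(x)):
            old_xi, old_fx = x[i], fx
            x[i] = old_xi - sr[i]            # full step, negative direction
            fy = f(x); evals += 1
            if fy < old_fx:
                fx, improved = fy, True
            else:
                x[i] = old_xi + 0.5 * sr[i]  # asymmetric half step, positive direction
                fy = f(x); evals += 1
                if fy < old_fx:
                    fx, improved = fy, True
                else:
                    x[i] = old_xi            # restore on failure
        if not improved:                     # no coordinate improved: shrink the range
            sr /= 2.0
            reset = sr < 1e-15
            sr[reset] = 0.4 * (upper - lower)[reset]
    return x, fx

# Usage: minimize the separable sphere function in 100 dimensions.
rng = np.random.default_rng(0)
d = 100
lo, hi = -5.0 * np.ones(d), 5.0 * np.ones(d)
x0 = rng.uniform(lo, hi)
x_best, f_best = mts_ls1(lambda z: float(np.sum(z ** 2)), x0.copy(), lo, hi)

The Hooke-Jeeves method probes the coordinates in a similar exploratory fashion, but follows a successful sweep with a pattern move along the combined direction of the accepted steps, which helps on unimodal landscapes.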

