On strong homogeneity of a class of global optimization algorithms working with infinite and infinitesimal scales

01/13/2018
by Yaroslav D. Sergeyev, et al.

The necessity to find the global optimum of multiextremal functions arises in many applied problems where finding local solutions is insufficient. One of the desirable properties of global optimization methods is strong homogeneity, meaning that a method produces the same sequence of points at which the objective function is evaluated regardless of multiplication of the function by a scaling constant or addition of a shifting constant. In this paper, several aspects of global optimization with strongly homogeneous methods are considered. First, it is shown that even if a method possesses this property theoretically, numerically very small or very large scaling constants can lead to ill-conditioning of the scaled problem. Second, a new class of global optimization problems is introduced in which the objective function can have not only finite but also infinite or infinitesimal Lipschitz constants. Third, the strong homogeneity of several Lipschitz global optimization algorithms is studied in the framework of the Infinity Computing paradigm, which allows one to work numerically with a variety of infinities and infinitesimals. Fourth, it is proved that a class of efficient univariate methods enjoys this property for finite, infinite, and infinitesimal scaling and shifting constants. Finally, it is shown that in certain cases the use of numerical infinities and infinitesimals can avoid the ill-conditioning produced by scaling. Numerical experiments illustrating the theoretical results are described.


