Derivative-Free Multiobjective Trust Region Descent Method Using Radial Basis Function Surrogate Models

02/26/2021
by Manuel Berkemeier, et al.

We present a flexible trust region descent algorithm for unconstrained and convexly constrained multiobjective optimization problems. It is targeted at heterogeneous and expensive problems, i.e., problems for which at least one objective function is computationally expensive. The method is derivative-free in the sense that derivative information need not be available for the expensive objectives, nor are gradients approximated via repeated function evaluations as in finite-difference methods. Instead, a multiobjective trust region approach is used that works similarly to its well-known scalar counterparts. Local surrogate models constructed from evaluation data of the true objective functions are employed to compute possible descent directions. In contrast to existing multiobjective trust region algorithms, these surrogates are not polynomials but carefully constructed radial basis function networks. This has the important advantage that the number of required data points scales linearly with the dimension of the parameter space. The local models qualify as fully linear, and the corresponding general scalar framework is adapted to problems with multiple objectives. Convergence to Pareto critical points is proven, and numerical examples illustrate our findings.
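To make the general idea concrete, the following is a minimal Python sketch, not the authors' implementation: the helper names (build_rbf_surrogate, trust_region_step), the cubic kernel with a linear polynomial tail, the random search used for the trust region step, and the toy objective f1 are all illustrative assumptions. It fits an RBF surrogate to a single objective from a handful of sample points and then approximately minimizes that surrogate inside a trust region ball.

```python
# Minimal sketch (not the paper's algorithm): fit a cubic RBF surrogate with a
# linear polynomial tail to sampled objective values and take a crude trust
# region step on the surrogate. All names and parameters are illustrative.
import numpy as np

def build_rbf_surrogate(X, f):
    """Fit a cubic RBF interpolant with a linear polynomial tail.
    X: (m, n) sample sites, f: (m,) objective values at those sites."""
    m, n = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = r ** 3                               # cubic kernel phi(r) = r^3
    P = np.hstack([np.ones((m, 1)), X])        # linear polynomial tail
    A = np.block([[Phi, P], [P.T, np.zeros((n + 1, n + 1))]])
    rhs = np.concatenate([f, np.zeros(n + 1)])
    coef = np.linalg.solve(A, rhs)
    w, b = coef[:m], coef[m:]

    def surrogate(x):
        rx = np.linalg.norm(X - x, axis=1)
        return rx ** 3 @ w + b[0] + x @ b[1:]
    return surrogate

def trust_region_step(surrogate, x, radius, n_samples=2000, seed=None):
    """Approximately minimize the surrogate over the ball ||s|| <= radius
    by sampling candidate steps uniformly inside the ball."""
    rng = np.random.default_rng(seed)
    S = rng.normal(size=(n_samples, x.size))
    S *= (radius * rng.uniform(size=(n_samples, 1)) ** (1 / x.size)
          / np.linalg.norm(S, axis=1, keepdims=True))
    vals = np.array([surrogate(x + s) for s in S])
    return x + S[np.argmin(vals)]

# Usage with a toy "expensive" objective.
f1 = lambda x: np.sum((x - 1.0) ** 2)
X = np.random.default_rng(0).uniform(-2, 2, size=(8, 2))
model = build_rbf_surrogate(X, np.array([f1(x) for x in X]))
x_new = trust_region_step(model, X[0], radius=0.5, seed=1)
print(f1(X[0]), "->", f1(x_new))
```

In the multiobjective setting of the paper, one such surrogate would be built per expensive objective, a common descent direction would be computed from the models, and steps would be accepted or rejected, with the trust region radius updated, based on the true objective values; the random search above merely stands in for that step computation.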

Related research

01/19/2022 - A trust region reduced basis Pascoletti-Serafini algorithm for multi-objective PDE-constrained parameter optimization
  In the present paper non-convex multi-objective parameter optimization p...

05/30/2018 - A Radial Basis Function based Optimization Algorithm with Regular Simplex set geometry in Ellipsoidal Trust-Regions
  We present a novel derivative-free interpolation based optimization algo...

01/22/2021 - Surrogate Models for Optimization of Dynamical Systems
  Driven by increased complexity of dynamical systems, the solution of sys...

01/18/2021 - Accelerating Derivative-Free Optimization with Dimension Reduction and Hyperparameter Learning
  We consider convex, black-box objective functions with additive or multi...

12/15/2021 - Error Analysis of Surrogate Models Constructed through Operations on Sub-models
  Model-based methods are popular in derivative-free optimization (DFO). I...

02/24/2014 - A hybrid swarm-based algorithm for single-objective optimization problems involving high-cost analyses
  In many technical fields, single-objective optimization procedures in co...

07/26/2020 - Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems
  Derivative-free - or zeroth-order - optimization (DFO) has gained recent...
