The Hypervolume Indicator Hessian Matrix: Analytical Expression, Computational Time Complexity, and Sparsity

11/08/2022
by André H. Deutz, et al.

The problem of approximating the Pareto front of a multiobjective optimization problem can be reformulated as the problem of finding a set that maximizes the hypervolume indicator. This paper establishes the analytical expression of the Hessian matrix of the mapping from a (fixed-size) collection of n points in the d-dimensional decision space (or m-dimensional objective space) to the scalar hypervolume indicator value. To define the Hessian matrix, the input set is vectorized, and the matrix is derived by analytical differentiation of the mapping from a vectorized set to the hypervolume indicator. The Hessian matrix plays a crucial role in second-order methods, such as the Newton-Raphson optimization method, and it can be used for the verification of locally optimal sets. So far, the full analytical expression had been established and analyzed only for the relatively simple bi-objective case. This paper derives the full expression for arbitrary dimensions (m ≥ 2 objective functions). For the practically important three-dimensional case, we also provide an asymptotically efficient algorithm with time complexity in O(n log n) for the exact computation of the Hessian matrix's non-zero entries, and we establish a sharp bound of 12m−6 on the number of non-zero entries. For the general m-dimensional case, a compact recursive analytical expression is established and its algorithmic implementation is discussed; this recursive expression also implies some sparsity results for the general case. To validate and illustrate the analytically derived algorithms and results, we provide a few numerical examples using Python and Mathematica implementations. Open-source implementations of the algorithms and testing data are made available as a supplement to this paper.
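The bi-objective case gives a concrete feel for the object the abstract describes. The following Python sketch is not the paper's algorithm, only a minimal illustration under stated assumptions: minimization, a fixed reference point r dominated by all points, and a mutually nondominated input set. It evaluates the 2-D hypervolume of a vectorized set and approximates the Hessian by central finite differences; since the 2-D hypervolume is locally bilinear in the coordinates, these second differences are exact as long as the perturbations do not change the sorting order, and the characteristic sparsity (zero diagonal, ±1 cross-terms only between neighboring points along the front) becomes visible.

```python
import numpy as np

def hv2d(Y, r):
    """2-D hypervolume (minimization) of a set Y w.r.t. reference point r.

    Y is an (n, 2) array, assumed mutually nondominated and dominating r.
    Slices the dominated region along the first objective.
    """
    Y = Y[np.argsort(Y[:, 0])]           # sort ascending by f1
    x_right = np.append(Y[1:, 0], r[0])  # right edge of each slice
    return np.sum((x_right - Y[:, 0]) * (r[1] - Y[:, 1]))

def hv_hessian_fd(Y, r, h=1e-3):
    """Central finite-difference Hessian of hv2d w.r.t. the vectorized set.

    Exact up to rounding while the +/- h steps preserve the f1 ordering,
    because hv2d is locally bilinear in the coordinates.
    """
    y = Y.ravel().astype(float)
    k = y.size
    H = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            ei, ej = np.eye(k)[i] * h, np.eye(k)[j] * h
            H[i, j] = (hv2d((y + ei + ej).reshape(-1, 2), r)
                       - hv2d((y + ei - ej).reshape(-1, 2), r)
                       - hv2d((y - ei + ej).reshape(-1, 2), r)
                       + hv2d((y - ei - ej).reshape(-1, 2), r)) / (4 * h * h)
    return H

# Hypothetical toy data: three nondominated points, reference point (5, 5).
Y = np.array([[1.0, 4.0], [2.0, 3.0], [3.0, 1.0]])
r = np.array([5.0, 5.0])
print(hv2d(Y, r))                      # 11.0
print(np.round(hv_hessian_fd(Y, r), 4))
```

For n = 3 points this produces a 6 × 6 matrix with zero diagonal and only ten non-zero entries (all ±1), each coupling a point's two coordinates or the coordinates of adjacent points on the front, consistent with the strong sparsity the abstract quantifies. The paper's contribution is the analytical (and, for m = 3, O(n log n)) computation of exactly these entries; the quadratic-cost finite-difference loop above serves only for verification on small instances.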
