Incremental Recursive Ranking Grouping for Large Scale Global Optimization

06/08/2022
by   Marcin Michal Komarnicki, et al.

Real-world optimization problems may differ in their underlying structure. In black-box optimization, the dependencies between decision variables remain unknown, although some techniques can discover such interactions accurately. In Large Scale Global Optimization (LSGO), problems are high-dimensional, and decomposing them into subproblems that are optimized separately has proven effective. The effectiveness of such approaches may depend heavily on the accuracy of the problem decomposition. Many state-of-the-art decomposition strategies are derived from Differential Grouping (DG). However, if a given problem consists of non-additively separable subproblems, their ability to detect only true interactions may decrease significantly. Therefore, we propose Incremental Recursive Ranking Grouping (IRRG), which does not suffer from this flaw. IRRG consumes more fitness function evaluations than recent DG-based propositions such as Recursive DG 3 (RDG3). Nevertheless, the effectiveness of the considered Cooperative Co-evolution frameworks after embedding IRRG or RDG3 was similar for problems with additively separable subproblems, which suit RDG3. However, when additive separability is replaced with non-additive separability, embedding IRRG leads to results of significantly higher quality.
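For context on the DG family of decomposition methods that the abstract contrasts IRRG with, the sketch below illustrates the standard pairwise Differential Grouping interaction check, whose reliance on comparing additive fitness differences is why such methods are tied to additive separability. This is an illustrative sketch only, not the paper's IRRG algorithm; the function name `dg_interaction`, the bounds, the perturbation size, and the threshold `eps` are assumed for the example.

```python
import numpy as np

def dg_interaction(f, lb, ub, i, j, eps=1e-3):
    """Classic pairwise Differential Grouping check: variables i and j are
    flagged as interacting when perturbing x_i changes the fitness by a
    different amount depending on the value of x_j. The test is exact only
    for additively separable structure, which is the limitation the
    abstract refers to."""
    delta = (ub - lb) * 0.5            # perturbation size (assumed here)

    x1 = lb.copy()                     # base point: all lower bounds
    x2 = x1.copy()
    x2[i] += delta[i]                  # perturb x_i
    d1 = f(x2) - f(x1)                 # fitness change with x_j at lb[j]

    x3 = x1.copy()
    x3[j] = (lb[j] + ub[j]) / 2.0      # move x_j to the middle of its range
    x4 = x3.copy()
    x4[i] += delta[i]                  # perturb x_i again
    d2 = f(x4) - f(x3)                 # fitness change with x_j moved

    return abs(d1 - d2) > eps          # differing changes => interaction


# Example: f(x) = x0*x1 + x2^2 is non-additively separable in (x0, x1)
f = lambda x: x[0] * x[1] + x[2] ** 2
lb, ub = np.zeros(3), np.full(3, 10.0)
print(dg_interaction(f, lb, ub, 0, 1))  # True  -- x0 and x1 interact
print(dg_interaction(f, lb, ub, 0, 2))  # False -- x0 and x2 are independent
```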
