Tuning metaheuristics by sequential optimization of regression models

09/11/2018
by Áthila R. Trindade et al.

Tuning parameters is an important step in applying metaheuristics to problem classes of interest. In this work we present a tuning framework based on the sequential optimization of perturbed regression models. Besides providing algorithm configurations with good expected performance, the proposed methodology can also provide insights into the relevance of each parameter and their interactions, as well as models of expected algorithm performance for a given problem class, conditional on the parameter values. A test case is presented for the tuning of six parameters of a decomposition-based multiobjective optimization algorithm, in which an instantiation of the proposed framework is compared against the results obtained by the most recent version of the Iterated Racing (Irace) procedure. The results suggest that the proposed approach returns solutions that are as good as those of Irace in terms of mean performance, with the advantage of providing more information on the relevance and effect of each parameter on the expected performance of the algorithm.
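To illustrate the general idea of tuning by sequential optimization of regression models, the sketch below shows a minimal loop of that flavor: evaluate an initial sample of configurations, fit a regression model of expected performance over the parameter space, optimize a perturbed copy of the model to propose the next configuration, evaluate it, and refit. This is not the authors' exact framework; the two-parameter space and the `run_algorithm` objective are hypothetical stand-ins for a real metaheuristic tuning task.

```python
# Minimal sketch of sequential model-based tuning (assumed setup, not the
# paper's implementation): a quadratic regression surrogate is refit on a
# noise-perturbed copy of the observed performances at each iteration and
# then minimized to propose the next configuration to evaluate.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import minimize

rng = np.random.default_rng(42)
bounds = np.array([[0.0, 1.0],   # hypothetical parameter 1 (e.g., mutation rate)
                   [0.0, 1.0]])  # hypothetical parameter 2 (e.g., crossover rate)

def run_algorithm(theta):
    """Noisy stand-in for the mean performance of the metaheuristic on a
    sample of problem instances at configuration theta (lower is better)."""
    x, y = theta
    return (x - 0.3) ** 2 + (y - 0.7) ** 2 + rng.normal(scale=0.02)

# Initial design: uniform random sample of configurations.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(15, 2))
y = np.array([run_algorithm(theta) for theta in X])

for it in range(20):
    # Fit the regression model on a perturbed copy of the observations so that
    # successive iterations do not keep proposing exactly the same point.
    y_perturbed = y + rng.normal(scale=0.01, size=y.shape)
    model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    model.fit(X, y_perturbed)

    # Minimize the fitted model within the parameter bounds.
    surrogate = lambda theta: model.predict(theta.reshape(1, -1))[0]
    x0 = rng.uniform(bounds[:, 0], bounds[:, 1])
    res = minimize(surrogate, x0, bounds=list(map(tuple, bounds)))

    # Evaluate the proposed configuration and add it to the training data.
    X = np.vstack([X, res.x])
    y = np.append(y, run_algorithm(res.x))

best = X[np.argmin(y)]
print("best configuration found:", best, "observed performance:", y.min())
```

Because the surrogate is a fitted regression model, its coefficients can also be inspected to gauge the relevance of each parameter and of their interactions, which is the kind of additional insight the abstract refers to.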
