
Towards Feature-Based Performance Regression Using Trajectory Data
Black-box optimization is a very active area of research, with many new algorithms being developed every year. This variety is needed, on the one hand, since different algorithms are most suitable for different types of optimization problems. But the variety also poses a meta-problem: which algorithm to choose for a given problem at hand? Past research has shown that per-instance algorithm selection based on exploratory landscape analysis (ELA) can be an efficient means to tackle this meta-problem. Existing approaches, however, require the approximation of problem features based on a significant number of samples, which are typically selected through uniform sampling or Latin Hypercube Designs. The evaluation of these points is costly, and the benefit of an ELA-based algorithm selection over a default algorithm must therefore be significant in order to pay off. One could hope to bypass the evaluations for the feature approximations by using the samples that a default algorithm would anyway perform, i.e., by using the points of the default algorithm's trajectory. We analyze in this paper how well such an approach can work. Concretely, we test how accurately trajectory-based ELA approaches can predict the final solution quality of CMA-ES after a fixed budget of function evaluations. We observe that the loss of trajectory-based predictions can be surprisingly small compared to the classical global sampling approach, if the remaining budget for which solution quality shall be predicted is not too large. Feature selection, in contrast, did not show any advantage in our experiments and rather led to worsened prediction accuracy. The inclusion of state variables of CMA-ES only has a moderate effect on the prediction accuracy.
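The regression pipeline the abstract describes — summarize a sample of evaluated points into landscape features, then learn a model mapping those features to the final solution quality — can be sketched as follows. This is an illustrative stand-in only: the three hand-rolled features below are a tiny proxy for a full ELA suite (e.g., the flacco feature sets), the "trajectory" is simulated by random sampling rather than an actual CMA-ES run, and the toy objective replaces a real BBOB benchmark problem.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def simple_ela_features(X, y):
    """A few cheap ELA-style summaries of a sample (proxy for a full feature suite)."""
    # Meta-model-style feature: R^2 of a linear least-squares fit to the sample.
    A = np.c_[X, np.ones(len(X))]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - resid.var() / y.var()
    # y-distribution features: third and fourth standardized moments.
    z = (y - y.mean()) / y.std()
    return np.array([r2, (z**3).mean(), (z**4).mean()])

def sphere(x):
    """Toy objective standing in for a BBOB problem."""
    return float(np.sum(x**2))

# One "instance" per shifted problem: features of the early sample -> final quality.
feats, targets = [], []
for shift in rng.uniform(-2, 2, size=(50, 5)):
    f = lambda x: sphere(x - shift)
    # Early "trajectory" points (simulated; a real study would log CMA-ES evaluations).
    X = rng.normal(0.0, 1.0, size=(40, 5))
    y = np.array([f(x) for x in X])
    feats.append(simple_ela_features(X, y))
    # Final quality after the full budget, proxied by the best of a larger sample;
    # predicting log-precision is common in fixed-budget settings.
    X_full = rng.normal(0.0, 1.0, size=(400, 5))
    targets.append(np.log10(min(f(x) for x in X_full)))

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(np.array(feats), np.array(targets))
print(round(model.score(np.array(feats), np.array(targets)), 3))  # training R^2
```

The key point of the paper's trajectory-based variant is that `X` and `y` come for free from the default algorithm's own evaluations, so no extra budget is spent on a dedicated Latin Hypercube design for feature computation.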