IOHprofiler: A Benchmarking and Profiling Tool for Iterative Optimization Heuristics
IOHprofiler is a new tool for analyzing and comparing iterative optimization heuristics. Given as input algorithms and problems written in C or Python, it provides as output a statistical evaluation of the algorithms' performance by means of the distributions of the fixed-target running times and the fixed-budget function values. In addition, IOHprofiler also allows tracking the evolution of algorithm parameters, making our tool particularly useful for the analysis, comparison, and design of (self-)adaptive algorithms. IOHprofiler is a ready-to-use software tool. It consists of two parts: an experimental part, which generates the running time data, and a post-processing part, which produces the summarizing comparisons and statistical evaluations. The experimental part is built on the COCO software, which has been adjusted to cope with optimization problems that are formulated as functions f: S^n → ℝ, with S being a discrete alphabet of integers. The post-processing part is our own work. It can be used as a stand-alone tool for the evaluation of running time data of arbitrary benchmark problems. It accepts as input not only the output files of IOHprofiler, but also original COCO data files. The post-processing tool is designed for interactive evaluation, allowing the user to choose the ranges and the precision of the displayed data according to his/her needs. IOHprofiler is available on GitHub at <https://github.com/IOHprofiler>.
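To make the two performance views concrete, the following is a minimal, self-contained Python sketch. It is not the IOHprofiler API; the trajectory format and function names are assumptions for illustration only. It shows how a fixed-target running time and a fixed-budget value can be read off the best-so-far trajectory of a single run.

```python
# Illustrative sketch (not the IOHprofiler API): given a logged trajectory of
# (evaluations, best-so-far value) pairs, derive the two summary quantities
# mentioned above. Maximization is assumed.

def fixed_target_running_time(trajectory, target):
    """First evaluation count at which the target value is reached,
    or None if the run never reaches it."""
    for evaluations, best_so_far in trajectory:
        if best_so_far >= target:
            return evaluations
    return None

def fixed_budget_value(trajectory, budget):
    """Best-so-far value observed within at most `budget` evaluations."""
    value = None
    for evaluations, best_so_far in trajectory:
        if evaluations > budget:
            break
        value = best_so_far
    return value

if __name__ == "__main__":
    # Hypothetical run on a problem with optimum 100.
    run = [(1, 40), (10, 55), (50, 80), (200, 95), (1000, 100)]
    print(fixed_target_running_time(run, target=95))  # -> 200
    print(fixed_budget_value(run, budget=100))        # -> 80
```

Aggregating such per-run quantities over many runs yields the running time and function value distributions that the post-processing part of the tool visualizes.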