Analyzing Search Techniques for Autotuning Image-based GPU Kernels: The Impact of Sample Sizes

03/25/2022
by   Jacob O. Tørring, et al.

Modern computing systems are increasingly complex, with multicore CPUs and GPU accelerators changing yearly, if not more often. It has thus become very challenging to write programs that use the associated complex memory systems efficiently and take advantage of the available parallelism. Autotuning addresses this by optimizing parameterized code for the targeted hardware, searching for the optimal set of parameters. Empirical autotuning has therefore gained interest over the past decades. While new autotuning algorithms are regularly presented and published, we will show why comparing these algorithms is a deceptively difficult task. In this paper, we describe our empirical study of state-of-the-art search techniques for autotuning, comparing them across a range of sample sizes, benchmarks and architectures. We optimize 6 tunable parameters with a search space of over 2 million configurations. The algorithms studied include Random Search (RS), Random Forest Regression (RF), Genetic Algorithms (GA), Bayesian Optimization with Gaussian Processes (BO GP) and Bayesian Optimization with Tree-Parzen Estimators (BO TPE). Our results on the ImageCL benchmark suite suggest that the ideal autotuning algorithm depends heavily on the sample size. In our study, BO GP and BO TPE outperform the other algorithms in most scenarios with sample sizes from 25 to 100. However, GA usually outperforms the others for sample sizes of 200 and beyond. The largest speedups over RS are generally found in the lower range of sample sizes (25-100), while the algorithms outperform RS more consistently for higher sample sizes (200-400). Hence, no single state-of-the-art algorithm outperforms the rest across all sample sizes. Some suggestions for future work are also included.
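To illustrate the kind of search the abstract describes, below is a minimal sketch of empirical autotuning with Random Search (RS), the baseline the paper compares against. The parameter names and the cost function are hypothetical stand-ins, not the paper's actual ImageCL tuning parameters; in a real autotuner, `evaluate` would compile and time the GPU kernel for each configuration.

```python
import random

# Illustrative 6-parameter tuning space (names are hypothetical, chosen to
# resemble typical GPU kernel parameters; not the paper's actual space).
SEARCH_SPACE = {
    "block_size_x": [1, 2, 4, 8, 16, 32, 64],
    "block_size_y": [1, 2, 4, 8, 16, 32, 64],
    "tile_size":    [1, 2, 4, 8],
    "unroll":       [1, 2, 4, 8],
    "vector_width": [1, 2, 4, 8],
    "prefetch":     [0, 1],
}

def evaluate(config):
    """Stand-in for compiling and benchmarking a kernel with `config`.

    Returns a synthetic cost (lower is better). A real autotuner would
    measure the kernel's runtime on the target hardware instead.
    """
    return (abs(config["block_size_x"] * config["block_size_y"] - 256)
            + abs(config["tile_size"] - 4)
            + abs(config["unroll"] - 2)
            + (0 if config["prefetch"] else 5)
            + config["vector_width"])

def random_search(sample_size, rng):
    """Draw `sample_size` random configurations and keep the best one."""
    best_cfg, best_cost = None, float("inf")
    for _ in range(sample_size):
        cfg = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        cost = evaluate(cfg)
        if cost < best_cost:
            best_cfg, best_cost = cfg, cost
    return best_cfg, best_cost

cfg, cost = random_search(100, random.Random(42))
print(cfg, cost)
```

The `sample_size` argument corresponds to the evaluation budgets studied in the paper (25 to 400 kernel runs); the model-based methods (RF, BO GP, BO TPE) and GA differ from this sketch only in how the next configuration to evaluate is chosen.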

