A Test for FLOPs as a Discriminant for Linear Algebra Algorithms

09/07/2022
by Aravind Sankaran, et al.

Linear algebra expressions, which play a central role in countless scientific computations, are often computed via a sequence of calls to existing libraries of building blocks (such as those provided by BLAS and LAPACK). A sequence identifies a computing strategy, i.e., an algorithm, and for one linear algebra expression many alternative algorithms normally exist. Although mathematically equivalent, those algorithms can exhibit significant differences in performance. Several high-level languages and tools for matrix computations, such as Julia, Armadillo, and Linnea, make algorithmic choices by minimizing the number of Floating Point Operations (FLOPs). However, several algorithms may share the same (or a nearly identical) number of FLOPs; in many cases, these algorithms exhibit execution times that are statistically equivalent, and one could arbitrarily select any of them as the best algorithm. It is, however, not uncommon to find cases where the execution times differ significantly from one another (despite the FLOP counts being almost the same). It is also possible that the algorithm that minimizes FLOPs is not the one that minimizes execution time. In this work, we develop a methodology to test the reliability of FLOPs as a discriminant for linear algebra algorithms. Given a set of algorithms (for an instance of a linear algebra expression) as input, the methodology ranks them into performance classes; i.e., multiple algorithms are allowed to share the same rank. To this end, we measure the algorithms iteratively until the changes in the ranks converge to a value close to zero. FLOPs are a valid discriminant for an instance if all the algorithms with minimum FLOPs are assigned the best rank; otherwise, the instance is regarded as an anomaly, which can then be used to investigate the root cause of performance differences.
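To make the scenario concrete, the sketch below (not the authors' implementation; the variable names, the 5% equivalence threshold, and the rough FLOP estimates are illustrative assumptions) times two mathematically equivalent evaluation orders of the expression A*B*v, groups statistically indistinguishable variants into rank classes, and then checks whether every minimum-FLOP variant ends up in the best class:

```python
# Minimal sketch: rank equivalent linear algebra algorithms by measured time
# and test whether FLOPs predict the best-performing class.
import time
import numpy as np

n = 1500
A = np.random.rand(n, n)
B = np.random.rand(n, n)
v = np.random.rand(n)

# Two equivalent algorithms for A*B*v with very different FLOP counts:
# left-to-right forms a full n x n product first, right-to-left stays matrix-vector.
algorithms = {
    "left_to_right": (lambda: (A @ B) @ v, 2 * n**3 + 2 * n**2),  # approx. FLOPs
    "right_to_left": (lambda: A @ (B @ v), 4 * n**2),             # approx. FLOPs
}

def measure(fn, reps=5):
    """Return a list of wall-clock execution times for fn."""
    times = []
    for _ in range(reps):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return times

timings = {name: measure(fn) for name, (fn, _) in algorithms.items()}

# Crude rank assignment: sort by median time and merge variants whose medians
# differ by less than 5% (an arbitrary threshold) into the same performance class.
medians = sorted((np.median(t), name) for name, t in timings.items())
ranks, current_rank, prev = {}, 0, None
for med, name in medians:
    if prev is not None and med > prev * 1.05:
        current_rank += 1
    ranks[name] = current_rank
    prev = med

# FLOPs are a valid discriminant here if every minimum-FLOP algorithm has rank 0.
min_flops = min(flops for _, flops in algorithms.values())
flops_ok = all(ranks[name] == 0
               for name, (_, flops) in algorithms.items() if flops == min_flops)
print(ranks, "FLOPs are a valid discriminant:", flops_ok)
```

The paper's methodology differs in that the measurements are repeated iteratively and the ranking is based on statistical equivalence of the timing distributions rather than a fixed median threshold; the sketch only illustrates the overall idea of performance classes and the FLOP-validity check.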

Related Research

- FLOPs as a Discriminant for Dense Linear Algebra Algorithms (07/05/2022): Expressions that involve matrices and vectors, known as linear algebra e...
- Linnea: Automatic Generation of Efficient Linear Algebra Programs (12/30/2019): The translation of linear algebra computations into efficient sequences ...
- Performance Comparison for Scientific Computations on the Edge via Relative Performance (02/25/2021): In a typical Internet-of-Things setting that involves scientific applica...
- Robust Ranking of Equivalent Algorithms via Relative Performance (10/14/2020): In scientific computing, it is common that one target computation can be...
- The Generalized Matrix Chain Algorithm (04/10/2018): In this paper, we present a generalized version of the matrix chain algo...
- Primitive idempotents in central simple algebras over 𝔽_q(t) with an application to coding theory (06/22/2020): We consider the algorithmic problem of computing a primitive idempotent ...
- A Simple Methodology for Computing Families of Algorithms (08/20/2018): Discovering "good" algorithms for an operation is often considered an ar...