Fast And Automatic Floating Point Error Analysis With CHEF-FP

04/13/2023
by Garima Singh et al.

As we reach the limits of Moore's Law, researchers are exploring different paradigms to achieve unprecedented performance. Approximate Computing (AC), which relies on the ability of applications to tolerate some error in their results in order to trade off accuracy for performance, has shown significant promise. Despite the success of AC in domains such as Machine Learning, its adoption in High-Performance Computing (HPC) has been limited by stringent accuracy requirements. Tools and techniques are needed to identify regions of code that are amenable to approximation and to quantify the impact of those approximations on application output quality, so that developers can apply approximation selectively. To this end, we propose CHEF-FP, a flexible, scalable, and easy-to-use source-code transformation tool based on Automatic Differentiation (AD) for analyzing approximation errors in HPC applications. CHEF-FP uses Clad, an efficient AD tool built as a plugin to the Clang compiler on top of the LLVM compiler infrastructure, as its backend, and exploits Clad's AD capabilities to evaluate approximation errors in C++ code. CHEF-FP works at the source level by injecting error-estimation code into the generated adjoints. This allows the error-estimation code to undergo compiler optimizations, improving analysis time and reducing memory usage. We also provide theoretical and architectural augmentations to source-code-transformation-based AD tools to perform floating point (FP) error analysis. This paper focuses primarily on analyzing errors introduced by mixed-precision AC techniques, but we also demonstrate the tool's applicability to other kinds of errors by evaluating it on codes that use approximate functions. Finally, we report the speedups in analysis time that CHEF-FP achieves over the existing state-of-the-art tool, owing to its ability to generate and insert approximation-error-estimation code directly into the derivative source.
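To make the described workflow concrete, below is a minimal sketch of how an AD-based error analysis can be driven from Clad. It assumes Clad's documented clad::estimate_error entry point, which generates an adjoint that accumulates a floating point error estimate alongside the gradients; the default Taylor-based model charges each assignment an error contribution of roughly |dF/dvar| * |var| * eps_M, where eps_M is the machine epsilon. The file name, build paths, and exact generated signatures are illustrative and may differ across Clad versions.

    // err_demo.cpp -- hedged sketch of Clad-based FP error estimation.
    #include "clad/Differentiator/Differentiator.h"
    #include <cstdio>

    // Toy kernel to analyze. Under the Taylor-based model, each assignment
    // contributes approximately |dF/dvar| * |var| * eps_M to the total error.
    double func(double x, double y) {
      double z = x * y;
      return z + x;
    }

    int main() {
      // Ask Clad to generate an adjoint of func that, in addition to the
      // gradients, accumulates the estimated FP error in a by-reference
      // parameter.
      auto df = clad::estimate_error(func);
      double dx = 0, dy = 0, fp_error = 0;
      df.execute(2.0, 3.0, &dx, &dy, fp_error);
      std::printf("d/dx = %g, d/dy = %g, estimated FP error = %g\n",
                  dx, dy, fp_error);
      return 0;
    }

    // Build (paths illustrative):
    //   clang++ -fplugin=/path/to/clad.so -I /path/to/clad/include err_demo.cpp

Because the error-estimation statements are injected into the adjoint source before compilation, they are visible to the optimizer like any other user code, which is the mechanism behind the analysis-time speedups claimed above.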
