Mixed Precision s-step Lanczos and Conjugate Gradient Algorithms

03/16/2021
by Erin Carson, et al.

Compared to the classical Lanczos algorithm, the s-step Lanczos variant has the potential to improve performance by asymptotically decreasing the synchronization cost per iteration. However, this comes at a cost. Despite being mathematically equivalent, the s-step variant is known to behave quite differently in finite precision, with potential for greater loss of accuracy and a decrease in the convergence rate relative to the classical algorithm. It has previously been shown that the errors that occur in the s-step version follow the same structure as the errors in the classical algorithm, but with the addition of an amplification factor that depends on the square of the condition number of the O(s)-dimensional Krylov bases computed in each outer loop. As the condition number of these s-step bases grows (in some cases very quickly) with s, this limits the parameter s that can be chosen and thus limits the performance that can be achieved. In this work we show that if a select few computations in s-step Lanczos are performed in double the working precision, the error terms then depend only linearly on the conditioning of the s-step bases. This has the potential for drastically improving the numerical behavior of the algorithm with little impact on per-iteration performance. Our numerical experiments demonstrate the improved numerical behavior possible with the mixed precision approach, and also show that this improved behavior extends to the s-step CG algorithm in mixed precision.
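The core idea described above — performing a select few computations in double the working precision while the rest of the algorithm stays in the working precision — can be sketched on the classical conjugate gradient method. The following is an illustrative toy only, not the s-step algorithm analyzed in the paper: vectors and matrix-vector products are kept in single precision, while the inner products (a common choice of "select computations") are accumulated in double precision. The function name and test matrix are invented for the example.

```python
import numpy as np

def cg_mixed(A, b, tol=1e-4, maxiter=200):
    """Toy CG in single precision with inner products accumulated in
    double precision (illustrates the mixed precision idea; this is
    not the s-step Lanczos/CG algorithm from the paper)."""
    A = A.astype(np.float32)
    b = b.astype(np.float32)
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    b_norm = float(np.linalg.norm(b.astype(np.float64)))
    # "Select few computations" done in double the working precision:
    rs_old = np.dot(r.astype(np.float64), r.astype(np.float64))
    for _ in range(maxiter):
        Ap = A @ p                       # working precision (float32)
        alpha = rs_old / np.dot(p.astype(np.float64),
                                Ap.astype(np.float64))
        x = x + np.float32(alpha) * p
        r = r - np.float32(alpha) * Ap
        rs_new = np.dot(r.astype(np.float64), r.astype(np.float64))
        if np.sqrt(rs_new) < tol * b_norm:
            break
        p = r + np.float32(rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small, well-conditioned SPD test system (invented for illustration)
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)
x_true = rng.standard_normal(50)
b = A @ x_true
x = cg_mixed(A, b)
print(np.linalg.norm(A @ x - b) / np.linalg.norm(b))  # relative residual
```

The higher-precision accumulations add only O(n) extra work per iteration on top of the O(nnz) matrix-vector product, which is why this style of mixed precision has little impact on per-iteration performance.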


