Two-Stage Gauss–Seidel Preconditioners and Smoothers for Krylov Solvers on a GPU cluster

04/02/2021
by   Luc Berger-Vergiat, et al.

Gauss–Seidel (GS) relaxation is often employed as a preconditioner for a Krylov solver or as a smoother for algebraic multigrid (AMG). However, the requisite sparse triangular solve is difficult to parallelize on many-core architectures such as graphics processing units (GPUs). In the present study, the performance of the traditional GS relaxation based on a triangular solve is compared with two-stage variants that replace the direct triangular solve with a fixed number of inner Jacobi–Richardson (JR) iterations. When a small number of inner iterations is sufficient to maintain the Krylov convergence rate, the two-stage GS (GS2) often outperforms the traditional algorithm on many-core architectures. We also compare GS2 with JR. When the two perform the same number of flops for sparse matrix–vector products (e.g., three JR sweeps compared to two GS sweeps with one inner JR sweep), the GS2 iterations, and the Krylov solver preconditioned with GS2, may converge faster than the JR iterations. Moreover, for some problems (e.g., elasticity), JR may diverge with a damping factor of one, whereas two-stage GS can improve its convergence by taking more inner iterations. Finally, to study the performance of the two-stage smoother and preconditioner on a practical problem, both were applied to incompressible fluid flow simulations on GPUs.
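To make the idea concrete, the following is a minimal NumPy/SciPy sketch of one forward GS2 sweep. It is an illustration only, not the paper's GPU implementation: the function name `gs2_sweep` and the parameter `num_inner` are hypothetical, and the code runs on the CPU for clarity. Writing A = L + D + U (strictly lower, diagonal, strictly upper), classical GS solves (D + L) z = r exactly with a sequential triangular solve; GS2 instead approximates z with a few Jacobi–Richardson inner iterations, each of which is only a parallel-friendly sparse matrix–vector product.

```python
import numpy as np
import scipy.sparse as sp

def gs2_sweep(A, b, x, num_inner=1):
    """One forward sweep of two-stage Gauss-Seidel (GS2).

    The exact triangular solve (D + L) z = r of classical GS is
    replaced by `num_inner` Jacobi-Richardson inner iterations,
    each costing one SpMV with the strictly lower part L.
    """
    D = A.diagonal()
    L = sp.tril(A, k=-1, format="csr")   # strictly lower-triangular part
    r = b - A @ x                        # outer residual
    z = r / D                            # inner initial guess (one Jacobi step)
    for _ in range(num_inner):
        # Jacobi-Richardson update for the inner system (D + L) z = r
        z = (r - L @ z) / D
    return x + z
```

Because D⁻¹L is strictly lower triangular (hence nilpotent), enough inner iterations reproduce the exact triangular solve and recover classical GS; the point of GS2 is that a small, fixed `num_inner` often suffices to preserve the outer Krylov convergence rate while exposing far more parallelism.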


