Multigrid deflation for Lattice QCD

by Eloy Romero et al.

Computing the trace of the inverse of large matrices is typically addressed through statistical methods. Deflating out the lowest eigenvectors or singular vectors of the matrix reduces the variance of the trace estimator. This work summarizes our efforts to reduce the computational cost of computing the deflation space while achieving the desired variance reduction for Lattice QCD applications. Previous efforts computed the lower part of the singular spectrum of the Dirac operator using an eigensolver preconditioned with a multigrid linear system solver. Despite the performance improvement in those applications, as the problem size grows, the runtime and storage demands of this approach will eventually dominate the stochastic estimation part of the computation. In this work, we propose to compute the deflation space in one of the following two ways: first, by using an inexact eigensolver on the Hermitian, but maximally indefinite, operator A γ_5; second, by exploiting the fact that the multigrid prolongator for this operator is rich in components toward the lower part of the singular spectrum. We show experimentally that the inexact eigensolver can approximate the lower part of the spectrum even for ill-conditioned operators. Also, the deflation based on the multigrid prolongator is more efficient to compute and apply, and, despite its limited ability to approximate the fine-level spectrum, it obtains variance reduction on the trace estimator similar to deflating with approximate eigenvectors of the fine-level operator.
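To make the deflated stochastic estimation concrete, the following is a minimal NumPy sketch of a Hutchinson trace estimator with exact deflation of the lowest eigenvectors. The matrix A, the sizes n and k, and the function name probe_values are all illustrative assumptions, not from the paper: a small dense Hermitian positive definite matrix stands in for the Dirac normal operator, and the deflation space is computed exactly, whereas the paper's contribution is obtaining such a space cheaply (via an inexact eigensolver or the multigrid prolongator).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dense Hermitian test matrix standing in for the operator;
# in the application, A would be the (much larger, sparse) lattice operator.
n, k = 200, 20
B = rng.standard_normal((n, n))
A = B @ B.T + 0.1 * np.eye(n)          # Hermitian positive definite, ill-conditioned
Ainv = np.linalg.inv(A)                # explicit inverse only for this small demo;
                                       # in practice each probe needs a linear solve

# Deflation space: the k lowest eigenvectors. Here they are exact; the point of
# the paper is to obtain such a space cheaply and with inexact accuracy.
w, V = np.linalg.eigh(A)
U = V[:, :k]
tr_defl = float(np.sum(1.0 / w[:k]))   # exact trace contribution of deflated modes

def probe_values(num_probes, deflate):
    """Per-probe Hutchinson estimates of tr(A^{-1}), optionally deflated."""
    vals = []
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)        # Rademacher probe vector
        if deflate:
            z = z - U @ (U.T @ z)                  # project out the deflation space
            vals.append(z @ (Ainv @ z) + tr_defl)  # stochastic remainder + exact part
        else:
            vals.append(z @ (Ainv @ z))
    return np.array(vals)
```

Averaging `probe_values(m, deflate=True)` gives an unbiased estimate of tr(A^{-1}) whose per-probe variance is governed by the deflated remainder, which excludes the large 1/λ contributions of the lowest modes; comparing its sample variance against `probe_values(m, deflate=False)` illustrates the variance reduction the abstract refers to.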






