Efficient computation of matrix-vector products with full observation weighting matrices in data assimilation

09/05/2021
by Guannan Hu, et al.

Recent studies have demonstrated improved skill in numerical weather prediction via the use of spatially correlated observation error covariance information in data assimilation systems. In this case, the observation weighting matrices (inverse error covariance matrices) used in the assimilation may be full matrices rather than diagonal. Thus, the computation of matrix-vector products in the variational minimization problem may be very time-consuming, particularly if the parallel computation of the matrix-vector product requires a high degree of communication between processing elements. Hence, we introduce a well-known numerical approximation method, the fast multipole method (FMM), to speed up the matrix-vector multiplications in data assimilation. We explore a particular type of FMM that uses a singular value decomposition (SVD-FMM) and adjust it to suit our new application in data assimilation. By approximating a large part of the computation of the matrix-vector product, the SVD-FMM technique greatly reduces the computational complexity compared with the standard approach. We develop a novel parallelization scheme of the SVD-FMM for our application, which can reduce the communication costs. We investigate the accuracy of the SVD-FMM technique in several numerical experiments: we first assess the accuracy using covariance matrices that are created using different correlation functions and lengthscales; then investigate the impact of reconditioning the covariance matrices on the accuracy; and finally examine the feasibility of the technique in the presence of missing observations. We also provide theoretical explanations for some of the numerical results. Our results show that the SVD-FMM technique has potential as an efficient way to assimilate a large volume of observational data within a short time interval.
