Sherman-Morrison-Woodbury Identity for Tensors

07/03/2020
by Shih Yu Chang, et al.

In linear algebra, the Sherman-Morrison-Woodbury identity states that the inverse of a rank-k correction of a matrix can be computed by applying a rank-k correction to the inverse of the original matrix. This identity is crucial for accelerating the computation of a matrix inverse when the matrix undergoes such a correction. Many scientific and engineering applications have to deal with this inverse-update problem, e.g., sensitivity analysis of linear systems, covariance matrix updates in the Kalman filter, etc. However, no similar identity exists for tensors. In this work, we first derive the Sherman-Morrison-Woodbury identity for invertible tensors. Since not all tensors are invertible, we further generalize the Sherman-Morrison-Woodbury identity to tensors with a Moore-Penrose generalized inverse by utilizing the orthogonal projection of the correction tensor part onto the original tensor and its Hermitian tensor. Based on this newly established Sherman-Morrison-Woodbury identity for tensors, we can perform sensitivity analysis for multilinear systems by deriving a normalized upper bound on the solution of a multilinear system. Several numerical examples are also presented to demonstrate how the normalized error upper bounds are affected by the degree of perturbation of the tensor coefficients.
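For reference, the classical matrix form of the identity that the paper generalizes reads (A + UCV)^{-1} = A^{-1} - A^{-1}U(C^{-1} + VA^{-1}U)^{-1}VA^{-1}. The short NumPy sketch below only verifies this matrix case numerically on random data; it is an illustration, not the paper's tensor construction, and the names A, U, C, V and the chosen dimensions are assumptions made here for the example.

```python
import numpy as np

# Minimal check of the classical (matrix) Sherman-Morrison-Woodbury identity:
#   (A + U C V)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}
# The tensor generalization in the paper is not reproduced here.
rng = np.random.default_rng(0)
n, k = 6, 2                                        # rank-k correction of an n x n matrix
A = rng.standard_normal((n, n)) + n * np.eye(n)    # well-conditioned base matrix
U = rng.standard_normal((n, k))
C = rng.standard_normal((k, k)) + k * np.eye(k)
V = rng.standard_normal((k, n))

A_inv = np.linalg.inv(A)
# Woodbury update: only a k x k system is inverted instead of the full n x n matrix.
woodbury = A_inv - A_inv @ U @ np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U) @ V @ A_inv
direct = np.linalg.inv(A + U @ C @ V)

print(np.allclose(woodbury, direct))               # True, up to floating-point error
```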
