# The antitriangular factorization of skew-symmetric matrices

In this paper we develop algorithms for similarity transformations of skew-symmetric matrices to simpler forms. The first algorithm is similar to the algorithm for the block antitriangular factorization of symmetric matrices, but in the case of skew-symmetric matrices the antitriangular form is always obtained. Moreover, a simple two-sided permutation of the antitriangular form transforms the matrix into a multi-arrowhead matrix. In addition, we show that the block antitriangular form of skew-Hermitian matrices has the same structure as the block antitriangular form of symmetric matrices.

## 1 Introduction

Skew-symmetric matrices are significantly less used than symmetric ones. Over the course of the last two decades, many algorithms designed for symmetric matrices have been adapted to work with skew-symmetric and other structured matrices, in order to avoid using algorithms for general, unstructured matrices.

Mastronardi and Van Dooren in Mastronardi-VanDooren-13 showed that every symmetric and indefinite matrix can be transformed into the block antitriangular form by orthogonal similarities. More precisely, if A = A^T ∈ R^{n×n} has inertia In(A) = (n₋, n₀, n₊), with n₁ = min(n₋, n₊) and n₂ = |n₊ − n₋|, there exists an orthogonal matrix Q ∈ R^{n×n} such that

 M = Q^T A Q = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & Y^T \\ 0 & 0 & X & Z^T \\ 0 & Y & Z & W \end{bmatrix}, (1.1)

where Y ∈ R^{n₁×n₁} is nonsingular and lower antitriangular, W ∈ R^{n₁×n₁} is symmetric, X ∈ R^{n₂×n₂} is symmetric and definite, and Z ∈ R^{n₁×n₂}.

Bujanović and Kressner in Bujanovic-Kressner-16 derived a computationally effective block algorithm that computes the block antitriangular factorization (1.1). Unfortunately, that algorithm sometimes fails to detect the inertia. A new algorithm for the antitriangular factorization was presented in Laudadio-Mastronardi-VanDooren-16.

Pestana and Wathen in Pestana-Wathen-14 simplified the algorithm for the special saddle point matrices

 A = \begin{bmatrix} H & B^T \\ B & 0 \end{bmatrix},

where H ∈ R^{n×n} is symmetric, but not necessarily positive definite, and B ∈ R^{m×n}, m ≤ n.

In this paper we show that skew-symmetric matrices have an antitriangular form, while skew-Hermitian matrices have a block antitriangular form similar to the block antitriangular form of real symmetric matrices.

In the next section of the paper we constructively prove that every skew-symmetric matrix can be transformed into the lower antitriangular form, and establish the connection between the number of nontrivial antidiagonals and the rank of the skew-symmetric matrix. In Section 3 we show that the antitriangular form can be reorganized into the multi-arrowhead form. Section 4 contains the results about the block antitriangular form of the Hermitian and, therefore, skew-Hermitian matrices.

## 2 Factorization of a skew-symmetric matrix into antitriangular form

In this section we constructively prove that every skew-symmetric matrix can be reduced to antitriangular form by using orthogonal similarity transformations.

To this end we use Givens rotations, since a Jacobi rotation cannot annihilate the element at position (i, j) in a skew-symmetric matrix A. Suppose that A_{ij} is the 2×2 skew-symmetric submatrix of A at the crossing of rows and columns i and j, and that Q_{ij} is a Jacobi rotation in the (i, j) plane. Then we have

 Q_{ij}^T A_{ij} Q_{ij} = \begin{bmatrix} \cos\varphi & \sin\varphi \\ -\sin\varphi & \cos\varphi \end{bmatrix} \begin{bmatrix} 0 & a_{ij} \\ -a_{ij} & 0 \end{bmatrix} \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{bmatrix} = \begin{bmatrix} 0 & a_{ij} \\ -a_{ij} & 0 \end{bmatrix} = A_{ij}.

Therefore, we will use Givens rotations acting in one plane to annihilate pairs of elements that lie in rows and columns outside of that plane.
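This invariance is easy to check numerically; a minimal sketch (using NumPy, with an arbitrary angle and an arbitrary entry of our choosing):

```python
import numpy as np

# Jacobi rotation Q and a 2x2 skew-symmetric block A
phi, a = 0.7, 3.5                      # arbitrary angle and entry
c, s = np.cos(phi), np.sin(phi)
Q = np.array([[c, -s], [s, c]])        # plane rotation
A = np.array([[0.0, a], [-a, 0.0]])    # skew-symmetric 2x2 block

# a two-sided Jacobi rotation leaves the skew-symmetric block unchanged,
# so it cannot annihilate the off-diagonal entry a
print(np.allclose(Q.T @ A @ Q, A))     # True
```

The reason is that 2×2 rotations commute with the matrix [0, 1; −1, 0], so the similarity cancels out exactly.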

###### Theorem 2.1.

Let A ∈ R^{n×n} be a skew-symmetric matrix. Then the matrix A can be factored as

 A = QMQ^T,

where Q is orthogonal, and M is an antitriangular matrix.

###### Proof.

The proof is by induction over the number of already annihilated antidiagonals of the skew-symmetric matrix A.

Note that A has a zero at position (1, 1), and this fact serves as the basis of the induction.

Suppose that after k−2 annihilated antidiagonals M_{k−1} has the following form,

 M_{k-1} := Q_{k-1}^T A Q_{k-1} = \begin{bmatrix} M_{11} & M_{12} \\ -M_{12}^T & M_{22} \end{bmatrix}, (2.1)

where

 M_{11} = \begin{bmatrix} 0 & \cdots & \cdots & 0 \\ \vdots & \iddots & & m_{2,k-1} \\ \vdots & \iddots & \iddots & \vdots \\ 0 & -m_{2,k-1} & \cdots & 0 \end{bmatrix} \in \mathbb{R}^{(k-1)\times(k-1)}, (2.2)

while the matrices M_{12} and M_{22} are generally full. In the matrix Q_{k−1} we keep the product of the used rotations. If k−1 = n, we have completed the job. Otherwise, in the next step we annihilate the kth antidiagonal.

First we annihilate the elements at positions (1, k) and (k, 1) by using the rotation Q_{k,k+1} in the (k, k+1) plane, that is equal to the identity matrix except at the crossings of the kth and (k+1)st rows and columns, where

 \hat{Q}_{k,k+1} = \begin{bmatrix} \cos\varphi_{k,k+1} & -\sin\varphi_{k,k+1} \\ \sin\varphi_{k,k+1} & -\cos\varphi_{k,k+1} \end{bmatrix}. (2.3)

We may assume that the elements at positions (1, k) and (k, 1) are nonzero. Otherwise, we may skip this transformation.

Since the element at position (1, k) is transformed only from the right-hand side (and the element at position (k, 1) only from the left-hand side), the new elements at these positions are

 m'_{1k} = m_{1k}\cos\varphi_{k,k+1} + m_{1,k+1}\sin\varphi_{k,k+1}, \qquad m'_{k1} = -(m_{1k}\cos\varphi_{k,k+1} + m_{1,k+1}\sin\varphi_{k,k+1}) = -m'_{1k}.

By choosing

 \cot\varphi_{k,k+1} = -\frac{m_{1,k+1}}{m_{1k}}, (2.4)

from the basic identity for the trigonometric functions, \sin^2\varphi + \cos^2\varphi = 1, it is easy to derive that the sines and cosines in (2.3) (which annihilate m'_{1k}) are

 \sin\varphi_{k,k+1} = \pm\frac{1}{\sqrt{1+\cot^2\varphi_{k,k+1}}}, \qquad \cos\varphi_{k,k+1} = \sin\varphi_{k,k+1}\cot\varphi_{k,k+1},

where \cot\varphi_{k,k+1} is defined by (2.4).
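As a sanity check, the choice (2.4) indeed annihilates the transformed element; a small numerical sketch with arbitrary values of our choosing for m_{1k} and m_{1,k+1}:

```python
import numpy as np

m1k, m1k1 = 2.0, -0.75                 # arbitrary nonzero entries
cot = -m1k1 / m1k                      # choice (2.4)
sin = 1.0 / np.sqrt(1.0 + cot**2)      # from sin^2 + cos^2 = 1 (taking '+')
cos = sin * cot

# the transformed element m'_{1k} vanishes
m1k_new = m1k * cos + m1k1 * sin
print(abs(m1k_new) < 1e-12)            # True
```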

The next step is to annihilate the elements at positions (2, k−1) and (k−1, 2) by using the rotation in the (k−1, k) plane. This transformation will not destroy the zero pattern, since rows/columns k−1 and k already have zeros as the first elements in the corresponding row/column.

In a similar way all the elements of the kth antidiagonal will be annihilated, without destroying the already introduced zeros.

After the annihilation in this step we obtain M_k, which has the same form as M_{k−1} from (2.1), but the matrix M_{11}, still antitriangular, has one row and column more than the matrix from (2.2). This was the step of the induction.

We proceed with the annihilation of one antidiagonal after another until M_{11} becomes the whole antitriangular matrix M. ∎

As one can expect, since the skew-symmetric matrices have purely imaginary eigenvalues in pairs ±iλ, one 'positive' and one 'negative' on the imaginary axis, there is no submatrix X in the symmetric block antitriangular form (1.1), whose dimension corresponds to the difference between the number of positive and negative eigenvalues of the symmetric matrix.

If a skew-symmetric matrix A is given by its antitriangular factors A = QMQ^T, then the determinant of A, where n = 2p, is

 \det(A) = \det(QMQ^T) = \det(M) = (-1)^{2p+1}\cdot(-1)^{2p}\cdots(-1)^{3}\cdot(-1)^{2}\cdot(-1)^{p}\,m_{1,2p}^{2}m_{2,2p-1}^{2}\cdots m_{p,p+1}^{2} = (-1)^{2(p^{2}+2p)}\,m_{1,2p}^{2}m_{2,2p-1}^{2}\cdots m_{p,p+1}^{2} = m_{1,2p}^{2}m_{2,2p-1}^{2}\cdots m_{p,p+1}^{2}.

Therefore, A (of even order) is singular if and only if at least one of the antidiagonal entries is equal to zero. If A is of odd order, one of the zeroes of the main diagonal lies on the antidiagonal, which proves the well-known fact that a skew-symmetric matrix of odd order is always singular. Now suppose that A is of even order and singular, and that the antidiagonal entry at position (i, n+1−i), i ≤ p, is zero. Obviously, due to skew-symmetry, the element at position (n+1−i, i) is also zero. If there is more than one pair of zeroes on the antidiagonal, we start from the zero with the smallest difference of the column minus row index.
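Both determinant facts are easy to confirm numerically; a sketch assuming nothing beyond NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)

# odd order: a skew-symmetric matrix is always singular
B = rng.standard_normal((5, 5))
A_odd = B - B.T
print(abs(np.linalg.det(A_odd)) < 1e-10)   # True

# even order: the determinant is a product of squares, hence nonnegative
C = rng.standard_normal((6, 6))
A_even = C - C.T
print(np.linalg.det(A_even) >= 0)          # True
```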

Now we apply a procedure similar to the procedure of annihilation of the elements of the antidiagonal from the previous theorem, but starting with the annihilation of the antidiagonal element adjacent to the zero at position (i, n+1−i). Due to skew-symmetry, each rotation also annihilates the mirrored element at the transposed position. We proceed with this annihilation process until all the elements on the antidiagonal between positions (i, n+1−i) and (n+1−i, i) are zeros.

If A is of even order, after the previous sequence of transformations our matrix has the middle part of the antidiagonal equal to zero. From now on, the procedure for the annihilation of the nonzero elements on the antidiagonal is valid no matter whether A is of odd or even order. If A is of odd order, the first elements to be annihilated are at positions (p, p+2) and (p+2, p). If A is of even order, we proceed with the annihilation of the elements at the next positions outward from the middle of the antidiagonal. The process is finished when the elements at positions (1, n) and (n, 1) are annihilated.

If all the elements on the first nontrivial antidiagonal of the obtained final matrix are nonzero, the rank of the matrix is equal to their count. Otherwise, we continue the process until all the elements of some antidiagonal are nonzero. Their count is the rank of the matrix.

The process of detecting the rank is illustrated in Figures 2.1 and 2.2. The first of them is for a matrix of even order, and the second for a matrix of odd order.

This result is of a purely theoretical nature; it is not a computational procedure for obtaining the rank of a skew-symmetric matrix.
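In floating-point practice one would use a rank-revealing method instead; a small check (NumPy) that the rank of a skew-symmetric matrix always comes out even, on a rank-deficient example built for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# a skew-symmetric matrix of order 7 with rank at most 4
B = rng.standard_normal((7, 2))
C = rng.standard_normal((7, 2))
A = B @ C.T - C @ B.T

r = np.linalg.matrix_rank(A)   # SVD-based rank estimate
print(r % 2 == 0)              # True: the rank of a skew-symmetric matrix is even
```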

## 3 Multi-arrowhead form of a skew-symmetric matrix

In the previous section we transformed a full skew-symmetric matrix to an antitriangular form. From the antitriangular form of a skew-symmetric matrix it is easy to obtain a new form – the multi-arrowhead form of a matrix.

###### Theorem 3.1.

Let M ∈ R^{n×n} be a skew-symmetric matrix in the antitriangular form. By a two-sided permutation P, the matrix M can be transformed into

 M = PSP^T,

where S has the following multi-arrowhead form. If n is odd, then S has a similar multi-arrowhead pattern, and if n is even, then

 S = \begin{bmatrix} 0 & s_{12} & 0 & s_{14} & 0 & \cdots & 0 & s_{1n} \\ -s_{12} & 0 & 0 & s_{24} & 0 & \cdots & 0 & s_{2n} \\ 0 & 0 & 0 & s_{34} & 0 & \cdots & 0 & s_{3n} \\ -s_{14} & -s_{24} & -s_{34} & 0 & 0 & \cdots & 0 & s_{4n} \\ 0 & 0 & 0 & 0 & 0 & \cdots & 0 & s_{5n} \\ \vdots & \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & 0 & 0 & \cdots & 0 & s_{n-1,n} \\ -s_{1n} & -s_{2n} & -s_{3n} & -s_{4n} & -s_{5n} & \cdots & -s_{n-1,n} & 0 \end{bmatrix}.

In addition, if n is odd, with an additional sequence of rotations the first row and the first column can be made zero.

###### Proof.

The required result is obtained by the symmetric permutation M = PSP^T, where

 P = [e_k, e_{k-1}, e_{k+1}, e_{k-2}, e_{k+2}, \ldots, e_1, e_n], \quad \text{if } n = 2k-1,

with the analogously interleaved ordering of the columns of the identity if n = 2k.

The remaining part of the proof, for the skew-symmetric matrices of odd order, is straightforward: a sequence of rotations annihilates the elements of the first row and the first column pair by pair, one rotation per pair, until the whole first row and column are zero. ∎
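For odd n = 2k−1, the permutation from the proof can be sketched as follows (NumPy; the function name is ours, and the even case, whose ordering is analogous, is omitted here):

```python
import numpy as np

def arrow_permutation_odd(n):
    """0-based ordering [k, k-1, k+1, k-2, k+2, ..., 1, n] (1-based in the
    paper's notation) for odd n = 2k - 1, as in the proof of Theorem 3.1."""
    k = (n + 1) // 2
    order = [k]
    for d in range(1, k):
        order += [k - d, k + d]
    return [j - 1 for j in order]      # convert to 0-based indices

n = 5
# build a lower antitriangular skew-symmetric matrix of order 5
rng = np.random.default_rng(2)
M = np.zeros((n, n))
for i in range(n):
    for j in range(i):
        if i + j >= n - 1:             # on or below the antidiagonal
            v = rng.standard_normal()
            M[i, j], M[j, i] = v, -v

p = arrow_permutation_odd(n)           # [2, 1, 3, 0, 4]
S = M[np.ix_(p, p)]                    # S = P^T M P

# the permuted matrix is sparse except for a few dense 'arrow' rows/columns
print((np.abs(S) > 0).astype(int))
```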

## 4 Factorization of a skew-Hermitian matrix into block antitriangular form

Skew-Hermitian matrices are complex generalizations of the skew-symmetric matrices, with purely imaginary eigenvalues, but now the eigenvalues need not come in complex-conjugate pairs. Therefore we can have a surplus of 'positive' or 'negative' signs on the imaginary axis.

For example, if U is any unitary matrix, then the matrix A = U(iD)U^*, where D is a real diagonal matrix with positive diagonal entries, is skew-Hermitian and cannot be transformed into the antitriangular form, since all of its eigenvalues lie on the same part of the imaginary axis.

On the other hand, H = iA is a Hermitian matrix if A is skew-Hermitian. Therefore, if H can be transformed into the block antitriangular form, the relation between skew-Hermitian and Hermitian matrices can be used to obtain the block antitriangular form of A.

If we look at the proof of Theorem 2.1 from Mastronardi-VanDooren-13, that theorem is also valid for Hermitian matrices if, in the statement of the theorem, orthogonal matrices are replaced by unitary ones and the transpose is replaced by the conjugate transpose. That proof relies on the properties of the nonnegative, nonpositive, neutral and null subspaces. In Gohberg-Lancaster-Rodman-05, all the required properties are derived, not only for the complex Euclidean scalar product, but for indefinite complex scalar products. Therefore, it is easy to prove the following theorem.

###### Theorem 4.1.

Let A = A^* ∈ C^{n×n} be a Hermitian indefinite matrix with In(A) = (n₋, n₀, n₊), n₁ = min(n₋, n₊), n₂ = |n₊ − n₋|. Then there exists a unitary matrix Q such that

 M = Q^* A Q = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & Y^* \\ 0 & 0 & X & Z^* \\ 0 & Y & Z & W \end{bmatrix},

where Y ∈ C^{n₁×n₁} is nonsingular and lower antitriangular, W ∈ C^{n₁×n₁} is Hermitian, X ∈ C^{n₂×n₂} is Hermitian and definite, and Z ∈ C^{n₁×n₂}.

In the next corollary we abuse the notation for the inertia of the skew-Hermitian matrices. If the skew-Hermitian matrix A has n₋ eigenvalues on the negative part of the imaginary axis, n₀ zero eigenvalues and n₊ eigenvalues on the positive part of the imaginary axis, we denote this by In(A) = (n₋, n₀, n₊).

###### Corollary 4.2.

Let A = −A^* ∈ C^{n×n} be a skew-Hermitian matrix with In(A) = (n₋, n₀, n₊), such that neither n₋ = 0 nor n₊ = 0, and let n₁ = min(n₋, n₊), n₂ = |n₊ − n₋|. Then there exists a unitary matrix Q such that

 M = Q^* A Q = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & -Y^* \\ 0 & 0 & X & -Z^* \\ 0 & Y & Z & -W \end{bmatrix}, (4.1)

where Y ∈ C^{n₁×n₁} is nonsingular and lower antitriangular, X ∈ C^{n₂×n₂} and W ∈ C^{n₁×n₁} are skew-Hermitian, and Z ∈ C^{n₁×n₂}. Then, either all eigenvalues of X lie on the positive part of the imaginary axis, or they all lie on the negative part.

###### Proof.

If the previous Theorem 4.1 is applied to H = iA, it holds that

 \tilde{M} = Q^* H Q = \begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & \tilde{Y}^* \\ 0 & 0 & \tilde{X} & \tilde{Z}^* \\ 0 & \tilde{Y} & \tilde{Z} & \tilde{W} \end{bmatrix}. (4.2)

If (4.2) is multiplied by −i we obtain (4.1) by setting Y = −i\tilde{Y}, X = −i\tilde{X}, Z = −i\tilde{Z}, W = i\tilde{W}. It is easy to see that Y is nonsingular and lower antitriangular, and that X and W are skew-Hermitian. According to Theorem 4.1, the matrix \tilde{X} is definite, therefore all eigenvalues of X,

 \lambda_k(X) = \lambda_k(-i\tilde{X}) = -i\lambda_k(\tilde{X}),

are situated on the same part of the imaginary axis. ∎
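The correspondence H = iA used in the proof is easy to illustrate numerically; a sketch (NumPy) on a random skew-Hermitian matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = B - B.conj().T            # skew-Hermitian: A* = -A
H = 1j * A                    # Hermitian:      H* = H

print(np.allclose(H, H.conj().T))                       # True

# eigenvalues of A = -iH are -i times the real eigenvalues of H,
# hence purely imaginary
mu = np.linalg.eigvalsh(H)    # real spectrum of H
lam = np.linalg.eigvals(A)    # purely imaginary spectrum of A
print(np.allclose(np.sort(lam.imag), np.sort(-mu)))     # True
```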

## References

• (1) Z. Bujanović, D. Kressner, A block algorithm for computing antitriangular factorizations of symmetric matrices, Numer. Algorithms 71 (1) (2016) 41–57.
• (2) I. Gohberg, P. Lancaster, L. Rodman, Indefinite Linear Algebra and Applications, Birkhäuser, Basel, 2005.
• (3) T. Laudadio, N. Mastronardi, P. Van Dooren, Numerical issues in computing the antitriangular factorization of symmetric indefinite matrices, Appl. Numer. Math. 116 (2016) 204–214.
• (4) N. Mastronardi, P. Van Dooren, The antitriangular factorization of symmetric matrices, SIAM J. Matrix Anal. Appl. 34 (1) (2013) 173–196.
• (5) J. Pestana, A. J. Wathen, The antitriangular factorization of saddle point matrices, SIAM J. Matrix Anal. Appl. 35 (2) (2014) 339–353.