# Structure-preserving diagonalization of matrices in indefinite inner product spaces

In this work some results on the structure-preserving diagonalization of selfadjoint and skewadjoint matrices in indefinite inner product spaces are presented. In particular, necessary and sufficient conditions on the symplectic diagonalizability of (skew)-Hamiltonian matrices and the perplectic diagonalizability of per(skew)-Hermitian matrices are provided. Assuming the structured matrix at hand is additionally normal, it is shown that any symplectic or perplectic diagonalization can always be constructed to be unitary. As a consequence of this fact, the existence of a unitary, structure-preserving diagonalization is equivalent to the existence of a specially structured additive decomposition of such matrices. The implications of this decomposition are illustrated by several examples.

07/01/2020


## 1 Introduction

Structured matrices are omnipresent in many areas of mathematics. For instance, in the theory of matrix equations [13], structures arising from the consideration of selfadjoint and skewadjoint matrices with respect to certain inner products play a crucial role. Often, these inner products are indefinite, so that the underlying bilinear or sesquilinear form does not define a scalar product. Hence, results from Hilbert space theory are not available in this case and an independent mathematical analysis is required. In this work, some results in this direction are presented.

Considering the (definite) standard Euclidean inner product ⟨x,y⟩ = x^Hy on ℂ^m, it is well known that selfadjoint and skewadjoint matrices (i.e. Hermitian and skew-Hermitian matrices) have very special properties. For example, (skew)-Hermitian matrices are always diagonalizable by a unitary matrix. The unitary matrices constitute the automorphism group of the scalar product ⟨·,·⟩, which means that ⟨Ux,Uy⟩ = ⟨x,y⟩ always holds for any unitary matrix U ∈ M_{m×m}(ℂ) and all vectors x,y ∈ ℂ^m. The automorphism group is sometimes called the Lie group with respect to ⟨·,·⟩, whereas the selfadjoint and skewadjoint matrices are referred to as the Jordan and Lie algebras, respectively [15]. The Euclidean scalar product is a special case of a sesquilinear form [x,y] = x^HBy on ℂ^m with B = I_m being the identity matrix. Often, sesquilinear forms appear in mathematics where B ≠ I_m. In particular, cases that have been intensively studied are those where B is some (positive/negative definite or indefinite) Hermitian matrix [8] or a skew-Hermitian matrix [4]. The Lie group, the Jordan algebra and the Lie algebra for such forms are defined analogously to the Euclidean case as the group of automorphisms and the sets of selfadjoint and skewadjoint matrices with respect to [·,·].

In this work, selfadjoint and skewadjoint matrices with respect to indefinite Hermitian or skew-Hermitian sesquilinear forms are considered from the viewpoint of diagonalizability. In particular, since Hermitian and skew-Hermitian matrices are always diagonalizable by a unitary matrix (i.e. an automorphism with respect to ⟨·,·⟩), we will consider the question under what conditions a similar statement holds for the automorphic diagonalization of selfadjoint and skewadjoint matrices with respect to other (indefinite) sesquilinear forms. For two particular sesquilinear forms (the symplectic and the perplectic sesquilinear form) this question will be fully analyzed and answered in Sections 3 and 4. For the symplectic bilinear form, this question was already addressed in [5]. In Section 5 we consider these results in the context of normal matrices, for which there always exists a unitary diagonalization. In particular, the results presented in this section apply to selfadjoint and skewadjoint matrices for which a unitary diagonalization exists. We will show that this subclass of matrices has very nice properties with respect to unitary and automorphic diagonalization and how both types of diagonalization interact. In Section 2 the notation used throughout this work is introduced, whereas in Section 6 some concluding remarks are given.

## 2 Notation

For any m, n ∈ ℕ and 𝕂 ∈ {ℝ, ℂ} we denote by 𝕂^m the m-dimensional vector space over 𝕂 and by M_{m×n}(𝕂) the vector space of all m×n matrices over 𝕂. The vector subspace of 𝕂^m which is obtained from all possible linear combinations of some vectors v_1, …, v_k ∈ 𝕂^m is called the span of v_1, …, v_k and is denoted by span(v_1, …, v_k). A basis of some subspace S ⊆ 𝕂^m is a linearly independent set of vectors v_1, …, v_k such that span(v_1, …, v_k) = S. In this case we say that the dimension of S equals k, that is, dim(S) = k. The symbol S² is used to denote the direct product of S with itself, i.e. S² = S × S. For any matrix A ∈ M_{m×n}(𝕂), the notions im(A) and ker(A) refer to the image and the nullspace (kernel) of A, i.e. im(A) = {Ax : x ∈ 𝕂^n} and ker(A) = {x ∈ 𝕂^n : Ax = 0}. The rank of A is defined as the dimension of its image. For any matrix A the superscripts A^T and A^H denote the transpose of A and the Hermitian transpose A^H = (Ā)^T. The overbar denotes the conjugation of a complex number and applies entrywise to matrices. The identity matrix is throughout denoted by I_m whereas the zero matrix, the zero vector in 𝕂^m or the number zero are simply denoted by 0 (to specify dimensions, 0_{m×n} is used in some places to refer to the zero matrix). A Hermitian matrix A ∈ M_{m×m}(ℂ) satisfies A = A^H and a skew-Hermitian matrix A = −A^H. Moreover, a matrix U ∈ M_{m×m}(ℂ) is called unitary if U^HU = UU^H = I_m holds and normal in case U^HU = UU^H holds. For two matrices A, B ∈ M_{m×m}(ℂ) the notation A ⊕ B is used to denote their direct sum, i.e. the matrix C ∈ M_{2m×2m}(ℂ) given by

 C = \begin{bmatrix} A & 0_{m×m} \\ 0_{m×m} & B \end{bmatrix}.
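The direct sum can be formed numerically. The following sketch (NumPy; the helper name `direct_sum` is ours, not from the text) builds the block matrix above:

```python
import numpy as np

def direct_sum(A, B):
    # direct sum A ⊕ B: the block matrix [[A, 0], [0, B]]
    m, n = A.shape
    p, q = B.shape
    C = np.zeros((m + p, n + q), dtype=np.result_type(A, B))
    C[:m, :n] = A
    C[m:, n:] = B
    return C

A = np.array([[1, 2], [3, 4]])
B = np.array([[5]])
print(direct_sum(A, B))
```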

For a given matrix A ∈ M_{m×m}(ℂ), any scalar λ ∈ ℂ which satisfies Av = λv for some nonzero vector v ∈ ℂ^m is called an eigenvalue of A. The set of all eigenvalues of A is denoted by σ(A) and equals the zero set of the degree-m polynomial det(A − zI_m). The algebraic multiplicity of λ as an eigenvalue of A equals the multiplicity of λ as a zero of det(A − zI_m). Whenever λ is some eigenvalue of A, any nonzero vector v satisfying Av = λv is called an eigenvector of A (for λ). The set of all eigenvectors of A for λ (together with the zero vector) is a vector subspace of ℂ^m and is called the corresponding eigenspace (of A for λ). Its dimension is referred to as the geometric multiplicity of λ. The matrix A is called diagonalizable if there exist m linearly independent eigenvectors of A. These vectors consequently form a basis of ℂ^m. A matrix is diagonalizable if and only if the geometric and algebraic multiplicities of all eigenvalues of A coincide.
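The multiplicity criterion for diagonalizability can be tested numerically. The sketch below (NumPy; the helper `is_diagonalizable` and the clustering tolerance are our choices) compares algebraic and geometric multiplicities:

```python
import numpy as np

def is_diagonalizable(A, tol=1e-6):
    # compare the algebraic multiplicity (size of an eigenvalue cluster)
    # with the geometric multiplicity dim ker(A - lambda*I) for each eigenvalue
    w = np.linalg.eigvals(A)
    m = A.shape[0]
    for lam in w:
        alg = int(np.sum(np.abs(w - lam) < tol))
        geo = m - np.linalg.matrix_rank(A - lam * np.eye(m), tol=tol)
        if alg != geo:
            return False
    return True

assert is_diagonalizable(np.diag([1.0, 2.0, 2.0]))
assert not is_diagonalizable(np.array([[2.0, 1.0], [0.0, 2.0]]))  # 2x2 Jordan block
```

Clustering eigenvalues by a fixed tolerance is only a numerical heuristic; diagonalizability is not a continuous property of the matrix entries.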

## 3 Sesquilinear Forms

In this section we introduce the notion of a sesquilinear form on ℂ^m and some related basic concepts. Notice that Definition 1 slightly deviates from the definition of a sesquilinear form given in [12, Sec. 5.1].

###### Definition 1.

A sesquilinear form on ℂ^m is a mapping [·,·] : ℂ^m × ℂ^m → ℂ so that for all u, v, w ∈ ℂ^m and all α, β ∈ ℂ the following relations (i) and (ii) hold:

 (i) [αu + βv, w] = ᾱ[u,w] + β̄[v,w]  (ii) [u, αv + βw] = α[u,v] + β[u,w].

If [·,·] is some sesquilinear form and x = αe_i, y = βe_j with α, β ∈ ℂ are two vectors that are multiples of the i-th and j-th unit vectors e_i and e_j, then [x,y] = ᾱβ[e_i,e_j]. Thus any sesquilinear form is uniquely determined by the images [e_i,e_j] of the standard unit vectors e_1, …, e_m. In particular, [·,·] on ℂ^m can be expressed as

 [x,y] = x^HBy (1)

for the particular matrix B = (b_{ij}) ∈ M_{m×m}(ℂ) with b_{ij} = [e_i,e_j], 1 ≤ i,j ≤ m. A form as in (1) is called Hermitian if [x,y] = \overline{[y,x]} holds for all x, y ∈ ℂ^m. It is easy to see that [·,·] is Hermitian if and only if B is Hermitian, i.e. B = B^H [8, Sec. 2.1]. The form is called skew-Hermitian if [x,y] = −\overline{[y,x]} holds for all x, y ∈ ℂ^m. This is the case if and only if B = −B^H.
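These characterizations are easy to verify numerically. In the sketch below (NumPy; names, sizes and seed are our choices), `form` implements (1) and the (skew)-Hermitian symmetry is checked for random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

def form(x, y, B):
    # the sesquilinear form [x, y] = x^H B y from (1)
    return x.conj() @ B @ y

m = 4
X = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
x = rng.standard_normal(m) + 1j * rng.standard_normal(m)
y = rng.standard_normal(m) + 1j * rng.standard_normal(m)

B = X + X.conj().T   # Hermitian matrix: B = B^H
assert np.isclose(form(x, y, B), np.conj(form(y, x, B)))    # [x,y] =  conj([y,x])

K = X - X.conj().T   # skew-Hermitian matrix: K = -K^H
assert np.isclose(form(x, y, K), -np.conj(form(y, x, K)))   # [x,y] = -conj([y,x])
```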

The following Definition 2 introduces two classes of subspaces related in a particular fashion to a sesquilinear form (see, e.g., [8, Sec. 2.3]).

###### Definition 2.

Let [·,·] be some sesquilinear form on ℂ^m.

1. A subspace S ⊆ ℂ^m of dimension k is called neutral (with respect to [·,·]) if [x_i, x_j] = 0 for any basis x_1, …, x_k of S and all 1 ≤ i, j ≤ k.

2. A subspace S ⊆ ℂ^m of dimension k is called nondegenerate (with respect to [·,·]) if the matrix G = ([x_i, x_j])_{i,j} ∈ M_{k×k}(ℂ) is nonsingular, i.e. rank(G) = k, for any basis x_1, …, x_k of S. Otherwise, S is called degenerate.

In case m is even, any neutral subspace S ⊆ ℂ^m with dim(S) = m/2 is called a Lagrangian (subspace) (see, e.g., [6, Def. 1.2]). Some analysis on this kind of subspaces is presented in Section 5.1. A sesquilinear form as in (1) is called nondegenerate if ℂ^m is nondegenerate with respect to [·,·]. In the sequel, nondegenerate sesquilinear forms are called indefinite inner products. Note that the sesquilinear form in (1) is nondegenerate, i.e. an indefinite inner product, if and only if B is nonsingular [14, Sec. 2.1].

###### Proposition 1.

For any indefinite inner product [·,·] on ℂ^m and any A ∈ M_{m×m}(ℂ) there exists a unique matrix A^⋆ ∈ M_{m×m}(ℂ) such that

 [Ax, y] = [x, A^⋆y] holds for all x, y ∈ ℂ^m.

The matrix A^⋆ corresponding to A in Proposition 1 is called the adjoint of A. It can be expressed as A^⋆ = B^{−1}A^HB and also satisfies [x, Ay] = [A^⋆x, y] for all x, y ∈ ℂ^m [14, Sec. 2.2]. A matrix A that commutes with its adjoint A^⋆, i.e. AA^⋆ = A^⋆A, is called normal with respect to [·,·] or simply B-normal. For any indefinite inner product on ℂ^m there are three classes of B-normal matrices that deserve special attention (see also [14, Sec. 2.2]).

###### Definition 3.

Let [·,·] be some indefinite inner product on ℂ^m.

1. A matrix A ∈ M_{m×m}(ℂ) with the property A^⋆ = A^{−1} is called an automorphism (with respect to [·,·]).

2. A matrix A satisfying A^⋆ = A is called selfadjoint (with respect to [·,·]) whereas a matrix satisfying A^⋆ = −A is called skewadjoint.

Notice that, if A is an automorphism, [Ax, Ay] = [x, y] holds for all x, y ∈ ℂ^m since [Ax, Ay] = [x, A^⋆Ay] and A^⋆A = I_m. In particular, any automorphism is nonsingular. For the standard Euclidean scalar product ⟨x,y⟩ = x^Hy, automorphisms, selfadjoint and skewadjoint matrices are those which are unitary, Hermitian or skew-Hermitian, respectively. Beside these, special names have also been given to matrices which are automorphic, selfadjoint or skewadjoint with respect to the indefinite inner products on ℂ^{2n} induced by the matrices J_{2n} and R_{2n} given by

 J_{2n} = \begin{bmatrix} & I_n \\ −I_n & \end{bmatrix}, \quad R_{2n} = \begin{bmatrix} & R_n \\ R_n & \end{bmatrix} \quad \text{with} \quad R_n = \begin{bmatrix} & & 1 \\ & \iddots & \\ 1 & & \end{bmatrix}.
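Both matrices and their basic properties can be generated and checked directly (NumPy sketch; the function names are ours):

```python
import numpy as np

def J(n):
    # symplectic form matrix J_2n = [[0, I_n], [-I_n, 0]]
    I, Z = np.eye(n), np.zeros((n, n))
    return np.block([[Z, I], [-I, Z]])

def R(n):
    # flip ("anti-identity") matrix R_n with ones on the anti-diagonal
    return np.fliplr(np.eye(n))

n = 3
Jn, R2n = J(n), R(2 * n)
assert np.array_equal(Jn.T, -Jn)                   # J_2n is skew-symmetric
assert np.array_equal(Jn @ Jn, -np.eye(2 * n))     # J_2n^2 = -I_2n
assert np.array_equal(R2n, R2n.T)                  # R_2n is symmetric
assert np.array_equal(R2n @ R2n, np.eye(2 * n))    # R_2n^2 = I_2n
```

Note that `R(2*n)` indeed has the block form above, with the two anti-diagonal blocks equal to `R(n)`.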

These names are listed in the table from Figure 1.¹ For instance, a skew-Hamiltonian matrix A and a per-Hermitian matrix C have expressions of the form

 A = \begin{bmatrix} A_1 & A_2 \\ A_3 & A_1^H \end{bmatrix} \quad \text{and} \quad C = \begin{bmatrix} C_1 & C_2 \\ C_3 & R_nC_1^HR_n \end{bmatrix}, \quad A_j, C_j ∈ M_{n×n}(ℂ), (2)

where it holds that A_2^H = −A_2, A_3^H = −A_3 and that C_2, C_3 are themselves per-Hermitian with respect to [x,y] = x^HR_ny on ℂ^n. Notice that for any indefinite inner product on ℂ^m the selfadjoint and skewadjoint structures are preserved under similarity transformations with automorphisms. This fact is well known and easily confirmed for unitary similarity transformations of Hermitian and skew-Hermitian matrices. In our setting this means that, whenever A is (skew)-Hamiltonian (per(skew)-Hermitian) and S is symplectic (perplectic), then S^{−1}AS is again (skew)-Hamiltonian (per(skew)-Hermitian). From now on, only the indefinite inner products induced by J_{2n} and R_{2n} on ℂ^{2n} will be considered.

¹ Notice that these names are not consistently used in the literature. For instance, a Hamiltonian matrix here and in [6] is called J-Hermitian in [14].

The result from Proposition 2 below is central for the upcoming discussion and can be found in, e.g., [11, Sec. 4.5] (for the case A = A^H). The statement for A = −A^H is easily verified by noting that A is Hermitian if and only if iA is skew-Hermitian.

###### Proposition 2 (Sylvester's Law of Inertia).

Let A ∈ M_{m×m}(ℂ) and assume that either A = A^H or A = −A^H holds. Then there exists a nonsingular matrix U ∈ M_{m×m}(ℂ) so that

 U^HAU = \begin{bmatrix} −αI_p & & \\ & αI_q & \\ & & 0_{r×r} \end{bmatrix}

where α = 1 if A is Hermitian and α = i otherwise. Hereby, p coincides with the number of negative real/purely imaginary eigenvalues of A, q coincides with the number of positive real/purely imaginary eigenvalues of A and r is the algebraic multiplicity of zero as an eigenvalue of A.

The triple (p, q, r) from Proposition 2 is usually referred to as the inertia of A [11, Sec. 4.5]. Two Hermitian or skew-Hermitian matrices with the same inertia are called congruent. Following directly from Proposition 2 we obtain the following proposition (see also [11, Thm. 4.5.8]).
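The inertia of a Hermitian matrix is readily computed from its eigenvalues. A minimal NumPy sketch (the helper name and the tolerance are our choices; the ordering (p, q, r) = (negative, positive, zero) follows Proposition 2 with α = 1):

```python
import numpy as np

def inertia(A, tol=1e-10):
    # inertia (p, q, r) of a Hermitian matrix A: counts of negative,
    # positive and zero eigenvalues (Proposition 2 with alpha = 1)
    w = np.linalg.eigvalsh(A)
    p = int(np.sum(w < -tol))   # negative eigenvalues
    q = int(np.sum(w > tol))    # positive eigenvalues
    r = len(w) - p - q          # zero eigenvalues
    return p, q, r

print(inertia(np.diag([-2.0, -1.0, 3.0, 0.0])))  # → (2, 1, 1)
```

For a skew-Hermitian matrix K one can apply the same helper to the Hermitian matrix iK (α = i in Proposition 2).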

###### Proposition 3.

Let A, C ∈ M_{m×m}(ℂ) be two matrices which are either both Hermitian or both skew-Hermitian. Then there exists a nonsingular matrix X ∈ M_{m×m}(ℂ) so that X^HAX = C if and only if A and C have the same inertia.

## 4 Symplectic and Perplectic Diagonalizability

In this section the symplectic and perplectic diagonalization of (skew)-Hamiltonian and per(skew)-Hermitian matrices is analyzed. As those matrices need not be diagonalizable per se, cf. [8, Ex. 4.2.1], their diagonalizability has to be assumed throughout the whole section. At first, we consider arbitrary (skew)-Hermitian indefinite inner products and provide two auxiliary results related to their selfadjoint matrices. These results will turn out to be useful in Sections 4.1 and 4.2 where we derive necessary and sufficient conditions for (skew)-Hamiltonian or per(skew)-Hermitian matrices to be diagonalizable by a symplectic (perplectic, respectively) similarity transformation. This section is based on [18, Chap. 9].

Let [·,·] be some (skew)-Hermitian indefinite inner product on ℂ^m induced by B and let A ∈ M_{m×m}(ℂ) be selfadjoint with respect to [·,·]. Then, as A = A^⋆ = B^{−1}A^HB, we have σ(A) = σ(A^H) = \overline{σ(A)}. In particular, for each λ ∈ σ(A), λ̄ is an eigenvalue of A, too, with the same multiplicity. Proposition 4 shows that, among the eigenvectors of A, those corresponding to λ and λ̄, respectively, are the only candidates for having a nonzero inner product with one another. This result can also be found in, e.g., [14, Thm. 7.8].

###### Proposition 4.

Let [·,·] be some (skew)-Hermitian indefinite inner product and A be selfadjoint. Moreover, assume x, y ∈ ℂ^m are eigenvectors of A corresponding to some eigenvalues λ, μ ∈ σ(A), respectively. Then λ̄ ≠ μ implies that [x,y] = 0. Consequently, each eigenspace of A for an eigenvalue λ of A with λ ≠ λ̄ is neutral.

###### Proof.

Under the given assumptions we have

 λ̄[x,y] = [λx,y] = [Ax,y] = [x,A^⋆y] = [x,Ay] = [x,μy] = μ[x,y],

so, whenever λ̄ ≠ μ, then [x,y] = 0 has to hold. In particular, any two eigenvectors x, y for the same eigenvalue λ with λ ≠ λ̄ satisfy [x,y] = 0, so the corresponding eigenspace is neutral. ∎

Now assume that A is diagonalizable. Let λ ∈ σ(A), λ ≠ λ̄, and suppose x_1, …, x_ℓ and y_1, …, y_ℓ are eigenbases corresponding to λ and λ̄, respectively. Additionally, let z_1, …, z_{m−2ℓ} be eigenvectors of A completing x_1, …, x_ℓ, y_1, …, y_ℓ to a basis of ℂ^m and set V = [x_1 ⋯ x_ℓ y_1 ⋯ y_ℓ z_1 ⋯ z_{m−2ℓ}]. According to Proposition 4 we have

 V^HBV = \begin{bmatrix} 0 & S_ℓ & 0 \\ ±S_ℓ^H & 0 & 0 \\ 0 & 0 & X \end{bmatrix} ∈ M_{m×m}(ℂ) (3)

for some matrices S_ℓ ∈ M_{ℓ×ℓ}(ℂ) and X ∈ M_{(m−2ℓ)×(m−2ℓ)}(ℂ). In case B = B^H we have +S_ℓ^H in (3), whereas we have −S_ℓ^H in case B = −B^H. As V and B are nonsingular, so is V^HBV. This implies S_ℓ and X in (3) to be nonsingular, too. As span(x_1, …, x_ℓ, y_1, …, y_ℓ) equals the direct sum of the eigenspaces of A corresponding to λ and λ̄, the nonsingularity of S_ℓ gives the following Corollary 1 taking Definition 2 into account.

###### Corollary 1.

Let [·,·] be some (skew)-Hermitian indefinite inner product and let A be selfadjoint and diagonalizable. Then, for any λ ∈ σ(A), λ ≠ λ̄, the direct sum of the eigenspaces of A corresponding to λ and λ̄ is always nondegenerate.

Similarly to the derivation preceding Corollary 1 one shows that the eigenspace of a selfadjoint matrix corresponding to some real eigenvalue λ = λ̄ is always nondegenerate, too. We are now in the position to derive statements on the symplectic and perplectic diagonalizability of (skew)-Hamiltonian and per(skew)-Hermitian matrices.

### 4.1 Symplectic Diagonalization of (skew)-Hamiltonian Matrices

The following Theorem 1 states the main result of this section characterizing those (diagonalizable) (skew)-Hamiltonian matrices which can be brought to diagonal form by a symplectic similarity transformation. Recall that, according to (2), a diagonal skew-Hamiltonian matrix has the form

 D̃ = \begin{bmatrix} D & \\ & D^H \end{bmatrix} \quad \text{with} \quad D = diag(λ_1, …, λ_n) ∈ M_{n×n}(ℂ). (4)
###### Theorem 1.

Let A ∈ M_{2n×2n}(ℂ) be diagonalizable.

1. Assume that A is skew-Hamiltonian. Then A is symplectic diagonalizable if and only if for any real eigenvalue λ ∈ σ(A) and some basis v_1, …, v_k of the corresponding eigenspace, the matrix G = ([v_i, v_j])_{i,j} for [x,y] = x^HJ_{2n}y has equally many positive and negative purely imaginary eigenvalues.

2. Assume that A is Hamiltonian. Then A is symplectic diagonalizable if and only if for any purely imaginary eigenvalue λ ∈ σ(A) and some basis v_1, …, v_k of the corresponding eigenspace, the matrix G = ([v_i, v_j])_{i,j} for [x,y] = x^HJ_{2n}y has equally many positive and negative purely imaginary eigenvalues.

###### Proof.

1. Let A be skew-Hamiltonian, that is A^⋆ = J_{2n}^{−1}A^HJ_{2n} = A, and let S ∈ M_{2n×2n}(ℂ) be symplectic such that

 S^{−1}AS = \begin{bmatrix} D & \\ & D^H \end{bmatrix} = S^{−1}A^⋆S, \quad S^HJ_{2n}S = J_{2n}, (5)

with D = diag(λ_1, …, λ_n) is a (symplectic) diagonalization of A. If λ ∈ σ(A) is real, it follows from (5) that λ has even multiplicity, say 2k, with k instances of λ appearing in D and D^H, respectively (w. l. o. g. on the diagonal positions 1, …, k). Let s_1, …, s_k, s_{n+1}, …, s_{n+k} be the corresponding eigenvectors (appearing as columns in the corresponding positions in S) which span the eigenspace of A for λ, and set S_j = [s_1 ⋯ s_k s_{n+1} ⋯ s_{n+k}]. Then we have

 S_j^HJ_{2n}S_j = \begin{bmatrix} & I_k \\ −I_k & \end{bmatrix} ∈ M_{2k×2k}(ℂ)

which follows directly from S^HJ_{2n}S = J_{2n}. The eigenvalues of S_j^HJ_{2n}S_j are i and −i, both with the same multiplicity k. As λ was arbitrary, this holds for any real eigenvalue of A.

Now let A be skew-Hamiltonian and diagonalizable. Moreover, assume that the condition stated above holds for all real eigenvalues of A. We now generate bases for the different eigenspaces of A according to the following rules:

1. For each pair of eigenvalues λ, λ̄ ∈ σ(A), λ ≠ λ̄, both with multiplicity m_j, let x_1, …, x_{m_j} be corresponding eigenvectors of A for λ and y_1, …, y_{m_j} corresponding eigenvectors of A for λ̄. Set S_j = [x_1 ⋯ x_{m_j} y_1 ⋯ y_{m_j}]. Then, according to Proposition 4 and Corollary 1, both eigenspaces are neutral and S_j^HJ_{2n}S_j is nonsingular. Therefore the form of S_j^HJ_{2n}S_j is

 S_j^HJ_{2n}S_j = \begin{bmatrix} 0 & Ŝ_j \\ −Ŝ_j^H & 0 \end{bmatrix} ∈ M_{2m_j×2m_j}(ℂ)

for some nonsingular matrix Ŝ_j ∈ M_{m_j×m_j}(ℂ). Now, multiplying by Ŝ_j^{−H} ⊕ I_{m_j} and (Ŝ_j^{−H} ⊕ I_{m_j})^H (from the right and the left) we observe that

 (Ŝ_j^{−H} ⊕ I_{m_j})^H S_j^HJ_{2n}S_j (Ŝ_j^{−H} ⊕ I_{m_j}) = \begin{bmatrix} Ŝ_j^{−1} & \\ & I_{m_j} \end{bmatrix} \begin{bmatrix} 0 & Ŝ_j \\ −Ŝ_j^H & 0 \end{bmatrix} \begin{bmatrix} Ŝ_j^{−H} & \\ & I_{m_j} \end{bmatrix} = \begin{bmatrix} & I_{m_j} \\ −I_{m_j} & \end{bmatrix}.

Let x_1′, …, x_{m_j}′, y_1, …, y_{m_j} denote the columns of S_j(Ŝ_j^{−H} ⊕ I_{m_j}) and notice that, due to the block-diagonal form of Ŝ_j^{−H} ⊕ I_{m_j}, x_1′, …, x_{m_j}′ and y_1, …, y_{m_j} are still bases for the eigenspaces of A for λ and λ̄, respectively. According to Proposition 4, the inner products [x_i′, v] and [y_i, v] for any i and any eigenvector v of A corresponding to some eigenvalue μ ∉ {λ, λ̄} are zero.

2. For each real λ ∈ σ(A), λ = λ̄, let v_1, …, v_{2k_j} be a basis of the corresponding eigenspace (assuming the even multiplicity of λ is 2k_j). For [x,y] = x^HJ_{2n}y the skew-Hermitian matrix G_j = ([v_i, v_l])_{i,l} is nonsingular and has, according to our assumptions, exactly k_j positive and k_j negative purely imaginary eigenvalues. Thus, it has the same inertia as J_{2k_j} and there exists some nonsingular matrix Y_j such that Y_j^HG_jY_j = J_{2k_j} according to Proposition 3. Let v_1′, …, v_{2k_j}′ denote the columns of [v_1 ⋯ v_{2k_j}]Y_j and note that v_1′, …, v_{2k_j}′ is still a basis for the eigenspace of A corresponding to λ. According to Proposition 4, the inner products [v_i′, w] for any i and any eigenvector w of A corresponding to some eigenvalue μ ≠ λ are zero.

If bases of the eigenspaces for all eigenvalues of A have been constructed according to rule 1 (if λ ≠ λ̄) and rule 2 (if λ = λ̄), the new eigenvectors obtained this way are collected in a matrix S ∈ M_{2n×2n}(ℂ). Note that S is nonsingular and that S^{−1}AS is diagonal. Due to the construction of S, the skew-Hermitian matrix S^HJ_{2n}S has only +1 and −1 as nonzero entries. Hence, it is permutation-similar to J_{2n}. In other words, there exists a (real) permutation matrix P with P^T(S^HJ_{2n}S)P = J_{2n}. Now (SP)^HJ_{2n}(SP) = P^T(S^HJ_{2n}S)P = J_{2n}, so SP is symplectic. Moreover, (SP)^{−1}A(SP) remains diagonal as P is a permutation matrix, and statement 1. is proven.

2. If A is Hamiltonian, notice that Â = iA is skew-Hamiltonian. Thus, whenever S is symplectic and S^{−1}AS = D ⊕ (−D^H) is a symplectic diagonalization of A for some diagonal matrix D ∈ M_{n×n}(ℂ), we have that

 S^{−1}ÂS = S^{−1}(iA)S = iS^{−1}AS = \begin{bmatrix} iD & \\ & −iD^H \end{bmatrix} = \begin{bmatrix} D̂ & \\ & D̂^H \end{bmatrix}

for D̂ = iD is a symplectic diagonalization of Â. From 1. it is known that this diagonalization exists if and only if each real eigenvalue λ of Â has even multiplicity and, given any basis of the corresponding eigenspace, the matrix G = ([v_i, v_j])_{i,j} for [x,y] = x^HJ_{2n}y has equally many positive and negative purely imaginary eigenvalues. As λ is a real eigenvalue of Â = iA if and only if −iλ is a purely imaginary eigenvalue of A (with the same eigenvectors), this implies that the symplectic diagonalization of A exists if and only if each purely imaginary eigenvalue of A has even multiplicity and, given any basis of the corresponding eigenspace, the matrix G for [x,y] = x^HJ_{2n}y has equally many positive and negative purely imaginary eigenvalues. ∎

The following Corollary 2 is a direct consequence of Theorem 1; it guarantees the existence of a symplectic diagonalization whenever no real or purely imaginary eigenvalues are present. To understand Corollary 2 correctly, zero should be regarded as both real and purely imaginary.

###### Corollary 2.
1. A diagonalizable skew-Hamiltonian matrix A ∈ M_{2n×2n}(ℂ) is always symplectic diagonalizable if A has no purely real eigenvalues.

2. A diagonalizable Hamiltonian matrix A ∈ M_{2n×2n}(ℂ) is always symplectic diagonalizable if A has no purely imaginary eigenvalues.

###### Example 1.

Let A, B ∈ M_{n×n}(ℂ) be skew-Hermitian matrices. Taking (2) into account it is easy to check that the matrix

 M = \begin{bmatrix} A & B \\ B & −A \end{bmatrix} ∈ M_{2n×2n}(ℂ) (6)

is skew-Hamiltonian and skew-Hermitian. The skew-Hermitian structure implies that M has only purely imaginary eigenvalues. Therefore, Corollary 2 applies and, whenever M is nonsingular, it can be diagonalized by a symplectic similarity transformation. The diagonalizability of M is always guaranteed since any skew-Hermitian matrix can be diagonalized (by a unitary matrix). In Section 5.2 we will show that a symplectic diagonalization of M can always be constructed to be unitary, too. An analogous statement holds for nonsingular matrices of the form (6) where A and B are Hermitian. Such matrices are consequently Hermitian and Hamiltonian, i.e. they have no purely imaginary eigenvalues except possibly zero.
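The claims of Example 1 are easily confirmed numerically (NumPy sketch; the size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

def skew_herm(n):
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return X - X.conj().T

A, B = skew_herm(n), skew_herm(n)
M = np.block([[A, B], [B, -A]])          # the matrix from (6)
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])

assert np.allclose(M.conj().T, -M)                         # M is skew-Hermitian
assert np.allclose(np.linalg.solve(J, M.conj().T) @ J, M)  # M* = J^{-1} M^H J = M
assert np.allclose(np.linalg.eigvals(M).real, 0.0)         # purely imaginary spectrum
```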

### 4.2 Perplectic Diagonalization of per(skew)-Hermitian Matrices

The main result on the perplectic diagonalization of per-Hermitian and perskew-Hermitian matrices is similar to the statement from Theorem 1. In particular, the proof of Theorem 2 below is analogous to the proof of Theorem 1, the only significant change being the replacement of the skew-Hermitian structures appearing in the proof of Theorem 1 (due to the skew-Hermitian matrix J_{2n}) by Hermitian structures caused by R_{2n}. Therefore, statements on purely imaginary eigenvalues turn into statements on real eigenvalues. The proof is consequently omitted.

###### Theorem 2.

Let A ∈ M_{2n×2n}(ℂ) be diagonalizable.

1. Assume that A is per-Hermitian. Then A is perplectic diagonalizable if and only if for any real eigenvalue λ ∈ σ(A) and some basis v_1, …, v_k of the corresponding eigenspace, the matrix G = ([v_i, v_j])_{i,j} for [x,y] = x^HR_{2n}y has equally many positive and negative real eigenvalues.

2. Assume that A is perskew-Hermitian. Then A is perplectic diagonalizable if and only if for any purely imaginary eigenvalue λ ∈ σ(A) and some basis v_1, …, v_k of the corresponding eigenspace, the matrix G = ([v_i, v_j])_{i,j} for [x,y] = x^HR_{2n}y has equally many positive and negative real eigenvalues.

The following Corollary 3 is an immediate consequence of Theorem 2 and is the analogous result to Corollary 2 for per(skew)-Hermitian matrices.

###### Corollary 3.
1. A diagonalizable per-Hermitian matrix A is always perplectic diagonalizable if A has no purely real eigenvalues.

2. A diagonalizable perskew-Hermitian matrix A is always perplectic diagonalizable if A has no purely imaginary eigenvalues.

## 5 Normal Structured Matrices

In this section we analyze the matrix structures from Section 4 assuming the matrix at hand is additionally normal. Recall that a matrix A ∈ M_{m×m}(ℂ) is called normal if AA^H = A^HA holds. It is well known that for any normal matrix A there exists a unitary matrix Q ∈ M_{m×m}(ℂ) so that Q^HAQ = D = diag(λ_1, …, λ_m) is diagonal (where λ_1, …, λ_m are the eigenvalues of A) [9]. Now partition Q and D as Q = [Q_1 Q_2] with Q_1 ∈ M_{m×k}(ℂ) and D = D_1 ⊕ D_2 with D_1 ∈ M_{k×k}(ℂ) for some 1 ≤ k < m. We now obtain from A = QDQ^H that

 A = [Q_1 Q_2] \begin{bmatrix} D_1 & \\ & D_2 \end{bmatrix} \begin{bmatrix} Q_1^H \\ Q_2^H \end{bmatrix} = Q_1D_1Q_1^H + Q_2D_2Q_2^H =: E + F (7)

holds, where E = Q_1D_1Q_1^H and F = Q_2D_2Q_2^H. Notice that E and F are normal themselves. Moreover, since Q is unitary, i.e. Q^HQ = I_m, we have Q_1^HQ_2 = 0. It is now seen directly that EF = FE = 0 holds. Beside this property there are no more obvious relations between E and F. This situation changes whenever the normal matrix A is (skew)-Hamiltonian or per(skew)-Hermitian. In case of symplectic or perplectic diagonalizability, the matrices E and F are related in a particular way. This relation between E and F is investigated in this section, giving some new insights on the symplectic and perplectic diagonalization of those matrices. To this end, the following subsection provides some facts about Lagrangian and neutral subspaces which will be of advantage for our discussion in the sequel. This section is based on [18, Chap. 10].
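The decomposition (7) is straightforward to reproduce numerically. In the sketch below (NumPy; the sizes, seed and split index k are our choices) a normal matrix is built from a random unitary Q, and the properties E + F = A and EF = FE = 0 are checked:

```python
import numpy as np

rng = np.random.default_rng(3)
m, k = 4, 2

# normal matrix A = Q D Q^H from a random unitary Q and a diagonal D
Q, _ = np.linalg.qr(rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m)))
d = rng.standard_normal(m) + 1j * rng.standard_normal(m)
A = Q @ np.diag(d) @ Q.conj().T
assert np.allclose(A @ A.conj().T, A.conj().T @ A)   # A is normal

# split as in (7): E from the first k eigenpairs, F from the remaining ones
Q1, Q2 = Q[:, :k], Q[:, k:]
E = Q1 @ np.diag(d[:k]) @ Q1.conj().T
F = Q2 @ np.diag(d[k:]) @ Q2.conj().T

assert np.allclose(E + F, A)
assert np.allclose(E @ F, 0.0) and np.allclose(F @ E, 0.0)  # Q1^H Q2 = 0 forces EF = FE = 0
```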

### 5.1 Lagrangian Subspaces

Let [·,·] be either the perplectic form with B = R_{2n} or the symplectic form with B = J_{2n} on ℂ^{2n}. In this section we briefly collect some information about neutral subspaces² with respect to the indefinite inner product [·,·]. At first, it is obvious that the set of all neutral subspaces in ℂ^{2n} is partially ordered under the relation of set-inclusion. That is, for any neutral subspaces S, T, U we have reflexivity (S ⊆ S), transitivity (S ⊆ T and T ⊆ U yields S ⊆ U) and anti-symmetry (S ⊆ T and T ⊆ S yields S = T). Moreover, for any chain S_1 ⊆ S_2 ⊆ S_3 ⊆ ⋯ of neutral subspaces the space ⋃_j S_j contains all other spaces from this chain [7, Def. O-1.6]. In other words, each chain of neutral subspaces has a neutral subspace as an upper bound. According to the lemma of Zorn [20], these facts lead to the observation that the (partially ordered) set of neutral subspaces has maximal elements. The next proposition presents an upper bound for the dimensions of neutral subspaces.

² The results from this section (in particular Corollary 4) are likely to be known although they are not readily found in the literature. They have already been stated in [18, Sec. 10.1].

###### Proposition 5.

For the symplectic inner product and the perplectic inner product on