
On the Distribution of GSVD

In this paper, some new results on the distribution of the generalized singular value decomposition (GSVD) are presented.


I Some new results on GSVD

In this section, the GSVD of two Gaussian matrices is defined first. Then, the distribution of the squared generalized singular values is presented.

I-A Definition of GSVD

Consider two matrices and , whose entries are i.i.d. complex Gaussian random variables with zero mean and unit variance. Let us define , , and . Then, the GSVD of and can be expressed as follows [1]:

(1)

where and are two nonnegative diagonal matrices, and are two unitary matrices, and can be expressed as in (9).

Moreover, and have the following form:

(2)

where and are two nonnegative diagonal matrices, satisfying . Then, the squared generalized singular values can be defined as , .
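Since the inline symbols and dimensions in the definition above are not reproduced here, the following sketch is only a hedged illustration: it assumes A is m-by-n and B is p-by-n with p >= n, both with i.i.d. CN(0,1) entries, and computes the squared generalized singular values as the generalized eigenvalues of the Hermitian pencil (A^H A, B^H B), which is the standard numerical route when B has full column rank.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical dimensions; the paper's exact m, n, p are not shown above.
m, n, p = 8, 4, 6            # A is m x n, B is p x n, with p >= n so B^H B is invertible
rng = np.random.default_rng(0)

def complex_gaussian(rows, cols, rng):
    """i.i.d. CN(0, 1) entries: real and imaginary parts each N(0, 1/2)."""
    return (rng.standard_normal((rows, cols))
            + 1j * rng.standard_normal((rows, cols))) / np.sqrt(2)

A = complex_gaussian(m, n, rng)
B = complex_gaussian(p, n, rng)

# Squared generalized singular values of (A, B): the generalized eigenvalues of
# the Hermitian pencil (A^H A, B^H B), valid when B has full column rank.
lam = eigh(A.conj().T @ A, B.conj().T @ B, eigvals_only=True)
print(np.sort(lam)[::-1])
```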

I-B Distribution of the squared generalized singular values

To characterize the distribution of the squared generalized singular value , a relationship between and the eigenvalues of a common matrix model is first established in the following theorem.

Theorem 1

Suppose that and are two Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance, and that their GSVD is defined as in (1). Without loss of generality, it is assumed that . Then, the distribution of their squared generalized singular values, , is identical to that of the nonzero eigenvalues of , where

(3)

and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance. Moreover, , and can be expressed as follows:

(4)

When , and and are deterministic.

Proof:

See Appendix A.
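Because the matrix model in (3) is not reproduced above, the following Monte Carlo harness is only a sketch of how Theorem 1 could be checked numerically: it collects the empirical squared generalized singular values of independent (A, B) pairs on one side, and leaves a placeholder model_matrix stub (hypothetical name) to be filled in with the construction from (3) on the other, so that the two empirical spectra can be compared.

```python
import numpy as np
from scipy.linalg import eigh, eigvals

rng = np.random.default_rng(1)

def complex_gaussian(rows, cols, rng):
    return (rng.standard_normal((rows, cols))
            + 1j * rng.standard_normal((rows, cols))) / np.sqrt(2)

def squared_gsv(A, B):
    """Squared generalized singular values of (A, B); assumes B has full column rank."""
    return eigh(A.conj().T @ A, B.conj().T @ B, eigvals_only=True)

def model_matrix(rng):
    """Placeholder for the construction in (3); fill in with the paper's model."""
    raise NotImplementedError

m, n, p, trials = 8, 4, 6, 2000     # hypothetical dimensions
gsv_samples, model_samples = [], []
for _ in range(trials):
    A = complex_gaussian(m, n, rng)
    B = complex_gaussian(p, n, rng)
    gsv_samples.extend(squared_gsv(A, B))
    # model_samples.extend(eigvals(model_matrix(rng)).real)  # enable once (3) is filled in

# Compare the two empirical spectra, e.g. through quantiles or histograms.
print(np.quantile(gsv_samples, [0.25, 0.5, 0.75]))
```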

The distribution of the nonzero eigenvalues of can be characterized as in the following corollary.

Corollary 1

Suppose that , and and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance. It is further assumed that . Then, the joint probability density function (p.d.f.) of the nonzero eigenvalues of can be characterized as follows:

(5)

where , , , and can be expressed as follows:

(6)

where and is the complex multivariate gamma function [2].

Proof:

Following steps similar to those in [3, Appendix A], the distribution of the nonzero eigenvalues of can be obtained.
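For reference, the complex multivariate gamma function appearing in the normalizing constant of (6) has the standard product form pi^(m(m-1)/2) * prod_{k=1}^{m} Gamma(a - k + 1); a minimal helper (hypothetical function name) evaluating it:

```python
import numpy as np
from scipy.special import gamma

def complex_multivariate_gamma(a, m):
    """Complex multivariate gamma function:
    Gamma_m(a) = pi^(m(m-1)/2) * prod_{k=1..m} Gamma(a - k + 1)."""
    return np.pi ** (m * (m - 1) / 2) * np.prod([gamma(a - k + 1) for k in range(1, m + 1)])

print(complex_multivariate_gamma(5.0, 3))   # example evaluation
```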

Moreover, the marginal p.d.f. of , , can be characterized as in the following lemma.

Lemma 1

The marginal p.d.f. derived from (5) can be expressed as follows:

(7)

where and can be expressed as follows:

where , are permutations of length , , , is if the permutation is even and if it is odd, and is the Beta function [4].

Proof:

See Appendix B.

I-C Some properties about

As shown in [1], the GSVD decomposition matrix is often used to construct the precoding matrix at the transmitter side. In this subsection, some properties about are discussed.

First, define . As shown in [5, eq.(2.2)], can be expressed as

(9)

where and are two unitary matrices, and is a nonnegative diagonal matrix whose singular values are the nonzero singular values of . Thus, the power of can be expressed as

where , , are the nonzero eigenvalues of .

Note that and are two independent Gaussian matrices. Hence, is a Wishart matrix. Finally, directly from [6, Lemma 2.10], the following corollary can be derived.

Corollary 2

Suppose that and are two Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance, and that their GSVD is defined as in (1). The average power of the GSVD decomposition matrix can be expressed as follows:

(11)
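Since (11) and the symbols it involves are not reproduced above, the sketch below only illustrates the kind of Wishart trace identity that such a power expression typically rests on: for an m-by-n complex Gaussian matrix X with n > m, the matrix W = X X^H is complex Wishart and E[tr(W^{-1})] = m/(n - m). Whether this is exactly the statement of [6, Lemma 2.10] is not confirmed here; the comparison below is a hedged Monte Carlo check of that standard identity.

```python
import numpy as np

rng = np.random.default_rng(2)

def complex_gaussian(rows, cols, rng):
    return (rng.standard_normal((rows, cols))
            + 1j * rng.standard_normal((rows, cols))) / np.sqrt(2)

m, n, trials = 4, 8, 5000   # hypothetical sizes, n > m so W = X X^H is invertible
acc = 0.0
for _ in range(trials):
    X = complex_gaussian(m, n, rng)
    W = X @ X.conj().T                      # complex Wishart, m x m, n degrees of freedom
    acc += np.trace(np.linalg.inv(W)).real
print("Monte Carlo E[tr(W^{-1})]:", acc / trials)
print("closed form m/(n-m):      ", m / (n - m))
```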

Appendix A: Proof of Theorem 1

1. The case when

When , , , and . Thus, from (1), the GSVD of and can be further expressed as follows:

(12)

Moreover, when , as shown in (9), is nonsingular and it can be shown that

(13)

Furthermore, from (2), it can be shown that

(14)

Thus, can be expressed as

(15)

Recall that and . Then, it can be shown that . Finally, it is easy to see that the distribution of , is identical to that of the nonzero eigenvalues of , where

and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance. Moreover, .

2. The case when

When , , , and . As shown in (2), are the squared generalized singular values of and , and .

On the other hand, define and the SVD of as . Note that is a Haar matrix. Divide into the following four blocks:

(17)

where and . From [5, eq.(2.7)], it is easy to see that equal the non-one eigenvalues of . Moreover, from the fact that is a Haar matrix, it can be shown that

(18)

Thus, the following equations can be derived:

(19)

Note that and have the same non-zero eigenvalues. Thus, and have the same non-one eigenvalues. Therefore, equal the non-one eigenvalues of . Define as

(20)

It is easy to see that and is also a Haar matrix.

Then, from the above discussion, it can be concluded that the distribution of is identical to the distribution of the non-one singular values of the or truncated sub-matrix of a Haar matrix. Thus, define and , whose entries are i.i.d. complex Gaussian random variables with zero mean and unit variance. The distribution of the squared generalized singular values of and , is identical to that of the squared generalized singular values of and . Since , from Appendix A-1, it can be concluded that the distribution of the squared generalized singular values of and , is identical to that of the eigenvalues of , where and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance.

This completes the proof of the theorem.
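The pivotal step in the second case is that the squared generalized singular values are governed by the non-one singular values of a truncated sub-block of a Haar-distributed unitary matrix. The sketch below uses hypothetical block sizes (the exact truncation is not shown above): it samples a Haar unitary by QR of a complex Gaussian matrix with the usual phase correction, takes a leading sub-block, and maps its non-one singular values s to candidate squared generalized singular values s^2/(1 - s^2), assuming the normalization in (2) in which the two squared diagonal factors sum to one.

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_unitary(k, rng):
    """Sample a k x k Haar-distributed unitary via QR of a complex Gaussian matrix."""
    Z = (rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    d = np.diagonal(R)
    return Q * (d / np.abs(d))   # phase correction so that Q is exactly Haar distributed

# Hypothetical partition sizes; the exact truncation used in the proof is not shown above.
m, p, n = 8, 6, 4
U = haar_unitary(m + p, rng)
U11 = U[:m, :n]                          # truncated sub-block conformal with the A block
s = np.linalg.svd(U11, compute_uv=False)
s = s[s < 1.0 - 1e-12]                   # keep only the non-one singular values
print("candidate squared generalized singular values:", s**2 / (1.0 - s**2))
```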

Appendix B: Proof of Lemma 1

First, the marginal p.d.f. derived from (5) can be expressed as follows:

where , and

Note that can be expressed as

Thus, can be further expressed as

(24)

Therefore, can be expressed as

Moreover, from Eq. (3.194.3) of [4], it can be shown that

Then, can be expressed as

On the other hand, as shown earlier in this appendix, can be expressed as

Moreover, define , . Then, can be further expressed as

where . Thus, it follows that .

This completes the proof of the lemma.
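The proof invokes Eq. (3.194.3) of [4]; if [4] is the standard table of integrals, that entry is the Beta-function integral int_0^inf x^(mu-1) (1 + beta*x)^(-nu) dx = beta^(-mu) B(mu, nu - mu) for nu > mu > 0. A quick numerical check of this identity with hypothetical parameter values:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import beta

# Hypothetical parameter values satisfying nu > mu > 0.
mu, nu, b = 2.5, 6.0, 1.7

numeric, _ = quad(lambda x: x**(mu - 1) * (1 + b * x)**(-nu), 0, np.inf)
closed_form = b**(-mu) * beta(mu, nu - mu)
print(numeric, closed_form)   # the two values should agree to quadrature accuracy
```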

References