I Some new results on GSVD
In this section, the GSVD of two Gaussian matrices is defined first. Then, the distribution of the squared generalized singular values is presented.
I-A Definition of GSVD
Consider two matrices and
, whose entries are i.i.d. complex Gaussian random variables with zero mean and unit variance. Let us define
, , and . Then, the GSVD of and can be expressed as follows [1]:
(1)
where and are two nonnegative diagonal matrices, and are two unitary matrices, and can be expressed as in (9).
Moreover, and have the following form:
(2) 
where and are two nonnegative diagonal matrices, satisfying . Then, the squared generalized singular values can be defined as , .
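Whenever the second matrix of the pair has full column rank (almost surely the case for a tall Gaussian matrix), the squared generalized singular values admit the standard characterization as the eigenvalues of the matrix obtained by "dividing" one Gram matrix by the other. A minimal numerical sketch; the names H1, H2 and the dimensions are illustrative, not taken from the text:

```python
import numpy as np

def gaussian(m, n, rng):
    # i.i.d. CN(0, 1) entries: real and imaginary parts are N(0, 1/2)
    return (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)

def squared_gsv(H1, H2):
    # Squared generalized singular values of the pair (H1, H2), assuming
    # H2 has full column rank: the eigenvalues of (H2^H H2)^{-1} H1^H H1.
    G1 = H1.conj().T @ H1
    G2 = H2.conj().T @ H2
    lam = np.linalg.eigvals(np.linalg.solve(G2, G1))
    return np.sort(lam.real)[::-1]

rng = np.random.default_rng(0)
m, k, n = 6, 8, 4                      # illustrative dimensions (k >= n)
H1, H2 = gaussian(m, n, rng), gaussian(k, n, rng)
lam = squared_gsv(H1, H2)              # n nonnegative values, sorted descending
```

Each returned value solves the generalized eigenproblem det(H1^H H1 − λ H2^H H2) = 0, which matches the ratio definition above.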
I-B Distribution of the squared generalized singular values
To characterize the distribution of the squared generalized singular value , a relationship between
and the eigenvalues of a common matrix model is established first, as in the following theorem.
Theorem 1
Suppose that and are two Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance and their GSVD is defined as in (1). Without loss of generality, it is assumed that . Then, the distribution of their squared generalized singular values, , is identical to that of the nonzero eigenvalues of , where
(3) 
and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance. Moreover, , and can be expressed as follows:
(4) 
When , and and are deterministic.
Proof:
See Appendix A.
The distribution of the nonzero eigenvalues of can be characterized as in the following corollary.
Corollary 1
Suppose that , and and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance. It is further assumed that
. Then, the joint probability density function (p.d.f.) of the nonzero eigenvalues of
can be characterized as follows:
(5)
where , , , and can be expressed as follows:
(6) 
where and is the complex multivariate gamma function [2].
Proof:
Following steps similar to those in [3, Appendix A], the distribution of the nonzero eigenvalues of can be obtained.
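The normalizing constant in (6) involves the complex multivariate gamma function of [2], defined by Gamma_m(a) = pi^{m(m-1)/2} * prod_{i=1}^m Gamma(a - i + 1). A small helper, evaluated in log form to avoid overflow; the function name is ours:

```python
import math

def complex_multigamma_log(a, m):
    """Log of the complex multivariate gamma function [2]:
    Gamma_m(a) = pi^{m(m-1)/2} * prod_{i=1}^m Gamma(a - i + 1)."""
    val = (m * (m - 1) / 2) * math.log(math.pi)
    for i in range(1, m + 1):
        val += math.lgamma(a - i + 1)
    return val
```

For m = 1 it reduces to the ordinary log-gamma function, as expected.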
Moreover, the marginal p.d.f. of , , can be characterized as in the following lemma.
Lemma 1
Proof:
See Appendix B.
I-C Some properties of
As shown in [1], the GSVD decomposition matrix is often used to construct the precoding matrix at the transmitting end. In this section, some properties of are discussed.
First, define . As shown in [5, eq.(2.2)], can be expressed as
(9) 
where and are two unitary matrices, is a nonnegative diagonal matrix, and has the same singular values as the nonzero singular values of . Thus, the power of can be expressed as
where , , are the nonzero eigenvalues of .
Note that and are two independent Gaussian matrices. Then, it follows that is a Wishart matrix. Finally, the following corollary can be derived directly from [6, Lemma 2.10].
Corollary 2
Suppose that and are two Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance and their GSVD is defined as in (1). The average power of the GSVD decomposition matrix can be expressed as follows:
(11) 
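Corollary 2 rests on [6, Lemma 2.10], which for an m-by-m complex Wishart matrix W with n > m degrees of freedom and identity covariance gives E[tr(W^{-1})] = m / (n - m). A Monte Carlo sanity check; the dimensions are illustrative:

```python
import numpy as np

def gaussian(m, n, rng):
    # i.i.d. CN(0, 1) entries: real and imaginary parts are N(0, 1/2)
    return (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)

rng = np.random.default_rng(1)
m, n, trials = 4, 12, 2000          # illustrative dimensions, n > m
acc = 0.0
for _ in range(trials):
    G = gaussian(m, n, rng)
    W = G @ G.conj().T              # m x m complex Wishart, n degrees of freedom
    acc += np.trace(np.linalg.inv(W)).real
estimate = acc / trials             # should approach m / (n - m) = 0.5
```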
Appendix A: Proof of Theorem 1
1. The case when
When , , , and . Thus, from (1), the GSVD of and can be further expressed as follows:
(12) 
Moreover, when , as shown in (9), is nonsingular and it can be shown that
(13) 
Furthermore, from (2), it can be shown that
(14) 
Thus, can be expressed as
(15) 
Recall that and . Then, it can be shown that . Finally, it is easy to see that the distribution of , is identical to that of the nonzero eigenvalues of , where
and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance. Moreover, .
2. The case when
When , , , and . As shown in (2), are the squared generalized singular values of and , and .
On the other hand, define and the SVD of as . Note that is a Haar matrix. Divide into the following four blocks:
(17) 
where and . From [5, eq. (2.7)], it is easy to see that equal the non-unit eigenvalues of . Moreover, from the fact that is a Haar matrix, it can be shown that
(18) 
Thus, the following equations can be derived:
(19) 
Note that and have the same nonzero eigenvalues. Thus, and have the same non-unit eigenvalues. Therefore, equal the non-unit eigenvalues of . Define as
(20) 
It is easy to see that and is also a Haar matrix.
Then, from the above discussion, it can be concluded that the distribution of is identical to the distribution of the non-unit singular values of the or truncated submatrix of a Haar matrix. Thus, define and , whose entries are i.i.d. complex Gaussian random variables with zero mean and unit variance. The distribution of the squared generalized singular values of and is identical to that of the squared generalized singular values of and . Since , from Appendix A1, it can be concluded that the distribution of the squared generalized singular values of and is identical to that of the eigenvalues of , where and are two independent Gaussian matrices whose elements are i.i.d. complex Gaussian random variables with zero mean and unit variance.
This completes the proof of the theorem.
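The Haar-matrix property used in the second case of the proof can be checked numerically: for the blocks of a truncated Haar unitary, the two Gram matrices sum to the identity, so the squared cosine and sine values pair up to one. A sketch with illustrative dimensions:

```python
import numpy as np
from scipy.stats import unitary_group

N, p = 6, 3                          # illustrative: N x N Haar unitary, first p columns
V = unitary_group.rvs(N, random_state=2)
V11, V21 = V[:p, :p], V[p:, :p]      # top and bottom blocks of the truncation
# Unitarity of the first p columns gives V11^H V11 + V21^H V21 = I_p, so the
# singular values c_i of V11 and s_i of V21 pair up with c_i^2 + s_i^2 = 1.
gram = V11.conj().T @ V11 + V21.conj().T @ V21
```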
Appendix B: Proof of Lemma 1
First, the marginal p.d.f. derived from (5) can be expressed as follows:
where , and
Note that can be expressed as
Thus, can be further expressed as
(24) 
Therefore, can be expressed as
Moreover, from Eq. (3.194.3) of [4], it can be shown that
Then, can be expressed as
On the other hand, as shown earlier in this appendix, can be expressed as
Moreover, define , . Then, can be further expressed as
where . Thus, it follows that .
This completes the proof of the lemma.
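The table-integral step in the proof invokes Eq. (3.194.3) of [4], the Beta-function integral: the integral over [0, inf) of x^(mu-1) (1 + b x)^(-nu) equals b^(-mu) B(mu, nu - mu) for nu > mu > 0. A quick numerical check; the parameter values are illustrative:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import beta

mu, nu, b = 2.5, 6.0, 1.7   # illustrative values with nu > mu > 0
# Left-hand side: numerical quadrature of the integrand on [0, inf)
numeric, _ = quad(lambda x: x ** (mu - 1) * (1 + b * x) ** (-nu), 0, np.inf)
# Right-hand side: the closed form b^(-mu) * B(mu, nu - mu)
closed_form = b ** (-mu) * beta(mu, nu - mu)
```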
References
 [1] Z. Chen, Z. Ding, X. Dai, and R. Schober, “Asymptotic performance analysis of GSVD-NOMA systems with a large-scale antenna array.” Submitted to IEEE Trans. Wireless Commun. https://arxiv.org/abs/1805.09066.
 [2] A. T. James, “Distributions of matrix variates and latent roots derived from normal samples,” The Annals of Mathematical Statistics, vol. 35, no. 2, pp. 475–501, 1964.
 [3] Z. Chen, Z. Ding, and X. Dai, “On the distribution of the squared generalized singular values and its applications.” Submitted to IEEE Trans. Veh. Technol.
 [4] I. S. Gradshteyn and I. M. Ryzhik, Table of Integrals, Series and Products, 6th ed. Academic Press, 2000.
 [5] C. C. Paige and M. A. Saunders, “Towards a generalized singular value decomposition,” SIAM Journal on Numerical Analysis, vol. 18, no. 3, pp. 398–405, 1981.

 [6] A. Tulino and S. Verdú, Random Matrix Theory and Wireless Communications, Foundations and Trends in Communications and Information Theory. Now Publishers Inc., 2004.