I Introduction, Main Concepts, Literature, Main Results
In information theory and communications, an important class of theoretical and practical problems is of a multiuser nature, such as lossless and lossy network source coding for data compression over noiseless channels, network channel coding for data transmission over noisy channels [1], and secure communication [2]. A subclass of network source coding problems deals with two sources that generate, at each time instant, symbols that are stationary, memoryless, multivariate, and jointly Gaussian distributed; similarly for network channel coding problems, i.e., Gaussian multiple access channels (MAC) with two or more multivariate correlated sources and a multivariate output.
In this paper we show the relevance of three fundamental concepts of statistics and probability, found in the report by Charalambous and van Schuppen [3], to the network problems discussed above, that involve a tuple of multivariate, jointly independent and identically distributed Gaussian random variables (RVs),
(X_{1,t}, X_{2,t}) \in G(0, Q_{(X_1,X_2)}), \quad t = 1, \ldots, N,  (1)
(X_{1,t}, X_{2,t}) \ \text{indep. of} \ (X_{1,s}, X_{2,s}), \quad \forall s \neq t.  (2)
We illustrate their application to the calculation of rates that lie in the Gray and Wyner rate region [4] of the simple network shown in Fig. 1, with respect to the average square-error distortions at the two decoders
D_{X_1}(x_1^N, \hat{x}_1^N) = \frac{1}{N} \sum_{t=1}^{N} ||x_{1,t} - \hat{x}_{1,t}||_{\mathbb{R}^{p_1}}^2,  (3)
D_{X_2}(x_2^N, \hat{x}_2^N) = \frac{1}{N} \sum_{t=1}^{N} ||x_{2,t} - \hat{x}_{2,t}||_{\mathbb{R}^{p_2}}^2,  (4)
where ||\cdot||_{\mathbb{R}^{p_i}}, i = 1, 2, are Euclidean distances on \mathbb{R}^{p_i}.
The rest of this section and the remainder of the paper are organized as follows. In Section I-A we introduce the three concepts, which are further described in Charalambous and van Schuppen [3]; in Sections I-B and I-C we recall the Gray and Wyner characterization of the rate region [4], and the characterization of the minimum lossy common message rate on the Gray and Wyner rate region due to Viswanatha, Akyol and Rose [5], and Xu, Liu, and Chen [6]. In Section II we present our main results in the form of theorems.
In Section III we give the proofs of the main theorems, while citing [3] if necessary.
I-A Three Concepts of Statistics and Probability
Notation. An \mathbb{R}^n-valued Gaussian RV, denoted by X \in G(m_X, Q_X), with as parameters the mean value m_X \in \mathbb{R}^n and the variance Q_X \in \mathbb{R}^{n \times n}, Q_X = Q_X^T \succeq 0, is a function X : \Omega \to \mathbb{R}^n which is a RV and such that the measure of this RV equals a Gaussian measure described by its characteristic function. This definition includes the case of a singular variance Q_X. The effective dimension of the RV is denoted by \mathrm{rank}(Q_X). An n \times n identity matrix is denoted by I_n.
A tuple of Gaussian RVs will be denoted by (X_1, X_2) \in G(0, Q_{(X_1,X_2)}) to save space, rather than by ((X_1, X_2) : \Omega \to \mathbb{R}^{p_1} \times \mathbb{R}^{p_2}). Then the variance matrix of this tuple is denoted by
Q_{(X_1,X_2)} = \begin{bmatrix} Q_{X_1} & Q_{X_1,X_2} \\ Q_{X_1,X_2}^T & Q_{X_2} \end{bmatrix}.
The variance Q_{(X_1,X_2)} is distinguished from Q_{X_1,X_2} = E[X_1 X_2^T].
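For concreteness, the block structure of the tuple's variance matrix can be assembled and validated numerically; the sketch below uses hypothetical covariance values chosen only for illustration (not taken from the source).

```python
import numpy as np

# Hypothetical variance parameters for a tuple (X1, X2), X1 in R^2, X2 in R^2.
Q_x1 = np.array([[1.0, 0.2], [0.2, 1.0]])       # Var(X1)
Q_x2 = np.array([[1.0, 0.0], [0.0, 1.0]])       # Var(X2)
Q_x1x2 = np.array([[0.5, 0.0], [0.0, 0.3]])     # Cov(X1, X2)

# Variance matrix of the tuple (X1, X2): 2x2 block structure
Q = np.block([[Q_x1, Q_x1x2], [Q_x1x2.T, Q_x2]])

# A valid Gaussian variance matrix must be symmetric positive semidefinite.
assert np.allclose(Q, Q.T)
assert np.linalg.eigvalsh(Q).min() >= -1e-12
```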
The first concept is Hotelling's [7] geometric approach to Gaussian RVs [8, 9], where the underlying geometric object of a Gaussian RV X is the \sigma-algebra generated by X. A basis transformation of such a RV is then the transformation X \mapsto SX defined by a nonsingular matrix S, and it then directly follows that SX \in G(S m_X, S Q_X S^T). For the tuple of jointly Gaussian multivariate RVs (X_1, X_2), a basis transformation of this tuple consists of a matrix composed of two square and nonsingular matrices (see [3, Algorithm 2.10]),
(X_1, X_2) \mapsto (S_1 X_1, S_2 X_2),  (5)
S = \mathrm{Blockdiag}(S_1, S_2), \quad S_i \in \mathbb{R}^{p_i \times p_i} \ \text{nonsingular}, \ i = 1, 2,  (6)
which maps (X_1, X_2) into the so-called canonical variable form of the tuple of RVs (the full specification is given in [3, Section 2.2, Definition 2.2]), which identifies identical, correlated, and private information, as interpreted in the table below,
X_{11}  identical information of X_1 and X_2
X_{12}  correlated information of X_1 w.r.t. X_2
X_{13}  private information of X_1 w.r.t. X_2
X_{21}  identical information of X_2 and X_1
X_{22}  correlated information of X_2 w.r.t. X_1
X_{23}  private information of X_2 w.r.t. X_1
where
X_1 = (X_{11}^T, X_{12}^T, X_{13}^T)^T, \quad X_2 = (X_{21}^T, X_{22}^T, X_{23}^T)^T,  (7)
X_{1i} : \Omega \to \mathbb{R}^{p_{1i}}, \quad X_{2i} : \Omega \to \mathbb{R}^{p_{2i}}, \quad i = 1, 2, 3,  (8)
Q_{X_1} = I_{p_1}, \quad Q_{X_2} = I_{p_2},  (9)
X_{11} = X_{21} \ \text{a.s.}, \quad p_{11} = p_{21},  (10)
E[X_{12} X_{22}^T] = D = \mathrm{Diag}(d_1, \ldots, d_{p_{12}}), \quad p_{12} = p_{22},  (11)
X_{13} and X_2 are independent, and X_{23} and X_1 are independent,  (12)
1 > d_1 \geq d_2 \geq \cdots \geq d_{p_{12}} > 0,  (13)
Q_{X_1, X_2} = \begin{bmatrix} I_{p_{11}} & 0 & 0 \\ 0 & D & 0 \\ 0 & 0 & 0 \end{bmatrix}.  (14)
The entries d_i of D are called the canonical correlation coefficients. For d_i = 1 the term identical information is used. The linear transformation (5)-(6) is equivalent to a preprocessing of (X_1, X_2) by a linear pre-encoder (see [3] for applications to network problems). The expression of the mutual information between X_1 and X_2, denoted by I(X_1; X_2), as a function of the canonical correlation coefficients, discussed in [10], is given in Theorem II.1.
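The canonical correlation coefficients themselves can be computed numerically as the singular values of Q_{X_1}^{-1/2} Q_{X_1,X_2} Q_{X_2}^{-1/2}, the standard construction from canonical correlation analysis; a minimal sketch with hypothetical covariance values (not taken from [3]):

```python
import numpy as np

def canonical_correlations(Q1, Q12, Q2):
    """Canonical correlation coefficients of a jointly Gaussian tuple
    (X1, X2) with Var(X1)=Q1, Cov(X1,X2)=Q12, Var(X2)=Q2, computed as
    the singular values of Q1^{-1/2} Q12 Q2^{-1/2}."""
    def inv_sqrt(Q):
        # symmetric inverse square root via eigendecomposition (Q > 0 assumed)
        w, V = np.linalg.eigh(Q)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    M = inv_sqrt(Q1) @ Q12 @ inv_sqrt(Q2)
    return np.linalg.svd(M, compute_uv=False)   # ordered d_1 >= d_2 >= ...

# Hypothetical example: already in canonical variable form, so the
# coefficients are simply the diagonal of Cov(X1, X2).
Q1 = np.eye(2)
Q2 = np.eye(2)
Q12 = np.diag([0.8, 0.3])
d = canonical_correlations(Q1, Q12, Q2)
```

Here the tuple is already standardized, so the computed coefficients reduce to the diagonal entries 0.8 and 0.3.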
The second concept is van Putten's and van Schuppen's [11] parametrization of the family of all jointly Gaussian probability distributions by an auxiliary Gaussian RV W that makes X_1 and X_2 conditionally independent, defined by
\mathcal{P} \triangleq \{ P_{(X_1, X_2, W)} : X_1, X_2 \ \text{are conditionally independent given} \ W; \ \text{the marginal dist. of} \ (X_1, X_2) \ \text{is the fixed dist.} \ P_{(X_1,X_2)} \},  (15)
and its subset of the set \mathcal{P}, with the additional constraint that the dimension of the RV W is minimal while all other conditions hold. The parametrization is in terms of a set of matrices. Consequences are found in [3, Section 2.3].
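In the Gaussian case, membership in this family reduces to a linear-algebra check: the conditional cross-covariance of X_1 and X_2 given W must vanish. A minimal sketch with hypothetical scalar parameters (the factor-model realization below is an assumption chosen for illustration):

```python
import numpy as np

# Hypothetical scalar example: X1 = sqrt(d) W + sqrt(1-d) Z1,
# X2 = sqrt(d) W + sqrt(1-d) Z2, with W, Z1, Z2 independent standard Gaussians.
d = 0.6
Q_w = 1.0
Q_x1w = np.sqrt(d) * Q_w          # Cov(X1, W)
Q_x2w = np.sqrt(d) * Q_w          # Cov(X2, W)
Q_x1x2 = d                        # Cov(X1, X2)

# Gaussian conditional cross-covariance given W:
# Cov(X1, X2 | W) = Cov(X1, X2) - Cov(X1, W) Var(W)^{-1} Cov(W, X2)
cond_cross_cov = Q_x1x2 - Q_x1w * (1.0 / Q_w) * Q_x2w

# Zero conditional cross-covariance <=> conditional independence (Gaussian case)
assert abs(cond_cross_cov) < 1e-12
```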
The third concept is the weak stochastic realization of RVs that induce distributions in the sets introduced above (see [11, Def. 2.17 and Prop. 2.18] and [3, Def. 2.17 and Prop. 2.18]). Theorem II.2 (our main theorem) gives as a special case (part (d)) an achievable lower bound on Wyner's single-letter information theoretic characterization of common information,
C(X_1; X_2) \triangleq \inf I(X_1, X_2; W), \quad \text{infimum over all} \ W \ \text{such that} \ X_1, X_2 \ \text{are conditionally independent given} \ W,  (16)
and the weak stochastic realization of RVs that induce distributions in these sets.
I-B The Gray and Wyner Lossy Rate Region
Now, we describe our results with respect to the fundamental question posed by Gray and Wyner [4] for the simple network shown in Fig. 1, which is: determine which channel capacity triples (R_0, R_1, R_2) are necessary and sufficient for each source sequence to be reliably reproduced at the intended decoders, while satisfying the average distortions (\Delta_1, \Delta_2) with respect to the single-letter distortion functions in (3)-(4). Gray and Wyner characterized the operational rate region, denoted by \mathcal{R}_{GW}(\Delta_1, \Delta_2), by a coding scheme that uses the auxiliary RV W, as described below. Define the family of probability distributions P_{(X_1, X_2, W)} for some auxiliary random variable W : \Omega \to \mathbb{W}.
Theorem 8 in [4]: Let \mathcal{R}_{GW}(\Delta_1, \Delta_2) denote the Gray and Wyner rate region. Suppose there exists (\hat{x}_1, \hat{x}_2) such that E[D_{X_i}] < \infty, i = 1, 2. For each P_{(X_1, X_2, W)} and (\Delta_1, \Delta_2), define the subset of Euclidean space
\mathcal{R}^{W}(\Delta_1, \Delta_2) \triangleq \{ (R_0, R_1, R_2) : R_0 \geq I(X_1, X_2; W), \ R_i \geq R_{X_i|W}(\Delta_i), \ i = 1, 2 \},  (17)
where R_{X_i|W}(\Delta_i) is the rate distortion function (RDF) of X_i, conditioned on W, at decoder i, i = 1, 2, and R_{X_1, X_2}(\Delta_1, \Delta_2) is the joint RDF of joint decoding of (X_1, X_2). Let
\mathcal{R}^{*}(\Delta_1, \Delta_2) \triangleq \Big( \bigcup_{W} \mathcal{R}^{W}(\Delta_1, \Delta_2) \Big)^{c},  (18)
where (\cdot)^{c} denotes the closure of the indicated set. Then the achievable Gray-Wyner lossy rate region is given by
\mathcal{R}_{GW}(\Delta_1, \Delta_2) = \mathcal{R}^{*}(\Delta_1, \Delta_2).  (19)
I-C Wyner's Lossy Common Information
Viswanatha, Akyol, and Rose [5], and Xu, Liu, and Chen [6], characterized the minimum lossy common message rate R_0 on the rate region \mathcal{R}_{GW}(\Delta_1, \Delta_2), as follows.
Theorem 4 in [6]: Let C_W(X_1, X_2; \Delta_1, \Delta_2) denote the minimum common message rate on the Gray and Wyner lossy rate region \mathcal{R}_{GW}(\Delta_1, \Delta_2), with sum rate not exceeding the joint rate distortion function R_{X_1, X_2}(\Delta_1, \Delta_2).
Then C_W(X_1, X_2; \Delta_1, \Delta_2) is characterized by
C_W(X_1, X_2; \Delta_1, \Delta_2) = \inf I(X_1, X_2; W)  (22)
such that the following identity holds:
R_{X_1|W}(\Delta_1) + R_{X_2|W}(\Delta_2) + I(X_1, X_2; W) = R_{X_1, X_2}(\Delta_1, \Delta_2),  (23)
where the infimum is over all RVs W, which parametrize the source distribution via P_{(X_1, X_2, W)}, having a marginal source distribution P_{(X_1, X_2)}, and induce joint distributions which satisfy the constraint.
C_W(X_1, X_2; \Delta_1, \Delta_2) is also given the interpretation of Wyner's lossy common information, due to its operational meaning [5, 6]. We should mention that from Appendix B in [6] a necessary condition for the equality constraint (23) follows, and a sufficient condition for this equality to hold is the conditional independence condition [6]: P_{X_1, X_2 | W} = P_{X_1 | W} P_{X_2 | W}. Hence, a sufficient condition for any rate to lie on the Pangloss plane, i.e., to satisfy (23), is the conditional independence.
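The identity (23) can be verified numerically in the scalar Gaussian case under the conditional independence condition; the sketch below assumes unit-variance sources with correlation coefficient rho, the standard realization X_i = sqrt(rho) W + sqrt(1-rho) Z_i, and a per-source MSE distortion Delta in the small-distortion (no water-filling) regime Delta <= 1 - rho. All numerical values are hypothetical.

```python
import numpy as np

rho, Delta = 0.6, 0.1   # hypothetical correlation and distortion, Delta <= 1 - rho

# With X_i = sqrt(rho) W + sqrt(1-rho) Z_i (W, Z1, Z2 independent standard
# Gaussians), X1 and X2 are conditionally independent given W and
# Var(X_i | W) = 1 - rho.
R0 = 0.5 * np.log((1 + rho) / (1 - rho))     # I(X1, X2; W)
R1 = 0.5 * np.log((1 - rho) / Delta)         # R_{X1|W}(Delta)
R2 = 0.5 * np.log((1 - rho) / Delta)         # R_{X2|W}(Delta)

# Joint RDF of the bivariate Gaussian source in the small-distortion regime
# (reverse water-filling over eigenvalues 1+rho and 1-rho):
R_joint = 0.5 * np.log((1 - rho**2) / Delta**2)

# Pangloss-plane identity (23): R0 + R1 + R2 = R_{X1,X2}(Delta, Delta)
assert abs((R0 + R1 + R2) - R_joint) < 1e-12
```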
II Main Results
Given the tuple of multivariate Gaussian RVs and distortion functions (1)-(4), the main contributions of the paper are:
(1) the theorem and the proof of Wyner's common information (information definition). The existing proof of this result in [12] is incomplete (see the discussion below Theorem II.2).
(2) the parametrization of rate triples (R_0, R_1, R_2) \in \mathcal{R}_{GW}(\Delta_1, \Delta_2), and Wyner's lossy common information.
Below we state the expression of the mutual information I(X_1; X_2) as a function of the canonical correlation coefficients, discussed in Gelfand and Yaglom [10].
Theorem II.1
Consider a tuple of multivariate jointly Gaussian RVs (X_1, X_2) \in G(0, Q_{(X_1, X_2)}), X_i : \Omega \to \mathbb{R}^{p_i}, i = 1, 2. Compute the canonical variable form of the tuple of Gaussian RVs according to Algorithm 2.2 of [3]. This yields the indices p_{11} = p_{21}, p_{12} = p_{22}, p_{13}, p_{23}, and the diagonal matrix D with canonical correlation coefficients d_i \in (0, 1) for i = 1, \ldots, p_{12} (as in [3, Definition 2.2]). Then the mutual information is given by the formula
I(X_1; X_2) = \begin{cases} +\infty, & \text{if } p_{11} \geq 1, \\ -\frac{1}{2} \sum_{i=1}^{p_{12}} \ln(1 - d_i^2), & \text{if } p_{11} = 0, \end{cases}
where d_i are the canonical correlation coefficients.
Theorem II.1 is a generalization of the well-known formula for a tuple of scalar RVs, i.e., I(X_1; X_2) = -\frac{1}{2} \ln(1 - \rho^2), where \rho is the correlation coefficient.
The case p_{11} \geq 1 gives I(X_1; X_2) = +\infty; if such components are present they should be removed. Hence, we state the next theorem under the restriction p_{11} = 0.
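As a numerical sanity check of the Gelfand-Yaglom formula (a sketch with hypothetical canonical correlation coefficients, assuming p_{11} = 0 so that the mutual information is finite), the canonical-form expression can be compared against the direct Gaussian mutual information computed from determinants:

```python
import numpy as np

# Hypothetical pair in canonical variable form with no identical parts:
# Q_X1 = Q_X2 = I, Cov(X1, X2) = D = diag(d_i).
d = np.array([0.8, 0.3])                 # canonical correlation coefficients
D = np.diag(d)
Q = np.block([[np.eye(2), D], [D, np.eye(2)]])

# Gelfand-Yaglom formula: I(X1; X2) = -1/2 sum_i ln(1 - d_i^2)
I_canonical = -0.5 * np.sum(np.log(1 - d**2))

# Direct Gaussian mutual information:
# I(X1; X2) = 1/2 ln( det(Q_X1) det(Q_X2) / det(Q) ), here det(Q_Xi) = 1
I_direct = 0.5 * np.log(1.0 / np.linalg.det(Q))

assert abs(I_canonical - I_direct) < 1e-10
```

The agreement follows because det(Q) factors as the product of the terms (1 - d_i^2) for a pair in canonical variable form.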
Theorem II.2
Consider a tuple of multivariate jointly Gaussian RVs (X_1, X_2) \in G(0, Q_{(X_1, X_2)}), and without loss of generality assume the tuple is in a canonical variable form such that p_{11} = 0 (see [3, Definition 2.2]).
For any joint distribution parametrized by an arbitrary RV W with fixed marginal distribution P_{(X_1, X_2)} the following hold.
(a) The mutual information satisfies
(24)  
(25)  
(26) 
where the lower bound is parametrized by
(27)
and such that (X_1, X_2, W) is jointly Gaussian.
(b) The lower bound in (25) is achieved if (X_1, X_2, W) is jointly Gaussian and X_1, X_2 are conditionally independent given W, and a realization of the RVs which achieves the lower bound is
X_{12} = D^{1/2} W + (I_{p_{12}} - D)^{1/2} Z_1,  (28)
X_{22} = D^{1/2} W + (I_{p_{12}} - D)^{1/2} Z_2,  (29)
W \in G(0, Q_W),  (30)
Z_1 \in G(0, I_{p_{12}}), \quad Z_2 \in G(0, I_{p_{12}}),  (31)
(W, Z_1, Z_2) \ \text{mutually independent}.  (32)
(c) A lower bound on (26) occurs if Q_W is diagonal, and it is achieved by the realization (28)-(32), with Q_W = I_{p_{12}}.
(d) Wyner's common information (information definition) is given by
C(X_1; X_2) = \sum_{i=1}^{p_{12}} \frac{1}{2} \ln \frac{1 + d_i}{1 - d_i},  (33)
and it is achieved by a Gaussian RV W with an identity covariance matrix, and the realization of part (b) with Q_W = I_{p_{12}}.
The characterization of the subset of conditionally independent distributions of two RVs in canonical variable form is due to Van Putten and Van Schuppen [11].
In [12] the proof of (33) is incomplete because there is no optimization over the set of measures achieving the conditional independence. In that reference it is assumed that three cross-covariances can be simultaneously diagonalized, which is not true in general. This assumption implies that case (d) of the above theorem holds. The same assumption is repeated in [13].
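The closed-form expression (33) and the conditionally independent realization can be checked numerically; the sketch below uses hypothetical canonical correlation coefficients and the standard Gaussian common-information realization with identity covariances (our reading of part (b) with Q_W = I), verifying by Monte Carlo that it reproduces the canonical covariance structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Canonical correlation coefficients (hypothetical values)
d = np.array([0.8, 0.3])

# Wyner's common information, eq. (33):
# C(X1; X2) = sum_i 1/2 ln((1 + d_i) / (1 - d_i))
C = np.sum(0.5 * np.log((1 + d) / (1 - d)))

# Realization: X_{1i} = sqrt(d_i) W_i + sqrt(1-d_i) Z_{1i},
#              X_{2i} = sqrt(d_i) W_i + sqrt(1-d_i) Z_{2i},
# with W, Z1, Z2 independent standard Gaussians; X1, X2 are then
# conditionally independent given W.
n = 200_000
W = rng.standard_normal((n, d.size))
Z1 = rng.standard_normal((n, d.size))
Z2 = rng.standard_normal((n, d.size))
X1 = np.sqrt(d) * W + np.sqrt(1 - d) * Z1
X2 = np.sqrt(d) * W + np.sqrt(1 - d) * Z2

# Empirical Cov(X_{1i}, X_{2i}) should be close to d_i, and Var(X_{ji}) to 1.
emp_cross = (X1 * X2).mean(axis=0)
assert np.allclose(emp_cross, d, atol=0.02)
assert np.allclose((X1**2).mean(axis=0), 1.0, atol=0.02)
```

For the scalar case this reduces to the familiar expression C = (1/2) ln((1+rho)/(1-rho)).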
The proposition below follows directly from Theorem II.2.
Proposition II.3
Proof: Follows from Gray and Wyner [4, (4) of page 1703, eqn. (42)] and Theorem II.2; (34) follows from the RDF of Gaussian RVs.
Theorem II.4
Consider the tuple of jointly Gaussian RVs of Theorem II.2. Then
(36)  
(37)  
III Proofs of Main Theorems
We present in this section further discussion of the concepts of Section I-A, and outlines of the proofs of the main theorems (see [3] for additional exposition).
III-A Further Discussion on the Three Concepts
First we state a few facts.
(A1) The parametrizations of the families of Gaussian probability distributions introduced in Section I-A require the solution of the weak stochastic realization problem of Gaussian RVs (defined by Problem 2.15 in [3]), given in [14, Theorem 4.2] (see also [3, Theorem 3.8]), and reproduced below.
Theorem III.1
[14, Theorem 4.2] Consider a tuple of Gaussian RVs in the canonical variable form. Restrict attention to the correlated parts of these RVs, as follows:
(X_{12}, X_{22}) \in G(0, Q_{(X_{12}, X_{22})}),  (38)
Q_{(X_{12}, X_{22})} = \begin{bmatrix} I_{p_{12}} & D \\ D & I_{p_{12}} \end{bmatrix},  (39)
D = \mathrm{Diag}(d_1, \ldots, d_{p_{12}}), \quad 1 > d_1 \geq \cdots \geq d_{p_{12}} > 0.  (40)
(a) There exists a probability measure, and a triple of Gaussian RVs (X_{12}, X_{22}, W) defined on it, such that (i) the marginal distribution of (X_{12}, X_{22}) is the one specified above, and (ii) X_{12} and X_{22} are conditionally independent given W, with W having minimal dimension.
(b) There exists a family of Gaussian measures that satisfy (i) and (ii) of (a), and moreover this family is parametrized by the matrices and sets:
(41) (42) (43)
(A2) The weak stochastic realization of a Gaussian measure on the Borel space is then defined and characterized as in Def. 2.17 and Prop. 2.18, Alg. 3.4 of [3].