1 Introduction
In this paper we introduce a new measure of correlation, defined for a range of parameters $p$ by
(1) 
where the norm appearing in (1) is a certain vector-valued norm which in special cases reduces to the usual $L_p$-norm. Since Rényi mutual information (according to Sibson’s proposal) can also be expressed in terms of this norm, our measure of correlation is also related to Rényi mutual information.
The main motivation for introducing these measures of correlation is their application in decoupling theorems. The point is that the average of our measure, when evaluated on the outcome of a certain random CPTP map applied to a bipartite quantum state, admits an upper bound involving a universal constant. Thus our measures of correlation can be used to prove decoupling-type theorems in information theory.
Decoupling theorems have already found several applications in information theory. Most achievability results in quantum information theory are based on the phenomenon of decoupling (see [1] and references therein). Also, in classical information theory the OSRB method of [2] provides a similar decoupling-type tool for proving achievability results. The advantage of our decoupling theorem based on the measure introduced here, compared to previous ones, is that it works for all values of the parameter $p$. Given the relation between our measure and Rényi mutual information mentioned above, the parameters appearing in our decoupling theorem are related to Rényi mutual information, which in the appropriate limit reduces to Shannon’s mutual information. Therefore, we can use our decoupling theorems not only for proving achievability results but also for proving interesting bounds on random coding exponents. We demonstrate this application via the examples of entanglement generation over a noisy quantum communication channel, and secure communication over a (classical) wiretap channel. In particular, we prove a bound on the secrecy exponent of random coding over a wiretap channel in terms of Rényi mutual information according to Csiszár’s proposal.
Another application of our new measures of correlation is in secrecy. To measure the security of a communication system, one has to quantify the amount of information leaked to an eavesdropper. While the common security metrics for measuring the leakage are mutual information (see, e.g., [3]) and the total variation distance [2, 4], a few recent works motivate and define other measures of correlation to quantify leakage [5, 6, 7, 8, 9, 10, 11, 12]. Here, we suggest the use of our metric instead of mutual information because it is a stronger metric and has a better rate-security tradeoff curve. To explain the rate-security tradeoff, consider a secure transmission protocol over a communication channel that achieves a given communication rate with a certified bound on the leakage according to the mutual information metric. Now, if the transmitter obtains a classified message for which this level of leakage is no longer acceptable, it can sacrifice communication rate for improved transmission security. We show that the rate-security tradeoff with the mutual information metric is far worse than that of our metric. We will discuss this fact in more detail via the problem of privacy amplification.

The definition of our measure of correlation is based on the theory of vector-valued $L_p$ spaces. These spaces are generalizations of the usual $L_p$ spaces and are defined via the theory of complex interpolation
. The proofs of our main theorems then rely heavily on interpolation theory. In particular, we use the Riesz-Thorin interpolation theorem several times, in order to establish an inequality for all values of $p$ by interpolating between the two endpoint cases.

In the following section, we fix our notation and introduce vector-valued norms. Section 3 introduces our new measure of correlation and presents some of its properties. Section 4 contains the main technical results of this paper. Sections 5 and 6 contain applications of our results to privacy amplification and to bounding random coding exponents.
2 Vector-valued norms
For a finite set $\mathcal X$, let $\ell(\mathcal X)$ be the vector space of functions $f:\mathcal X\to\mathbb C$. For any $p\ge 1$ and $f\in\ell(\mathcal X)$ we define
$$\|f\|_p = \Big(\sum_{x\in\mathcal X} |f(x)|^p\Big)^{1/p}.$$
For $p\ge 1$ this quantity satisfies the triangle inequality and turns $\ell(\mathcal X)$ into a normed space. The dual of the $p$-norm is the $p'$-norm, where $p'$ is the Hölder conjugate of $p$ given by
(2) $$\frac{1}{p} + \frac{1}{p'} = 1.$$
More generally, for any $p, q, r\ge 1$ with $\frac{1}{r} = \frac{1}{p} + \frac{1}{q}$ and any $f, g\in\ell(\mathcal X)$ we have
$$\|fg\|_r \le \|f\|_p\, \|g\|_q,$$
where $fg\in\ell(\mathcal X)$ denotes the pointwise product, $(fg)(x) = f(x)g(x)$.
Suppose that $\mathcal Y$ is another finite set, and equip the vector space $\ell(\mathcal Y)$ with the $q$-norm. The question is how to naturally define a norm on the space $\ell(\mathcal X\times\mathcal Y)$ that is compatible with the norms of the individual spaces $\ell(\mathcal X)$ and $\ell(\mathcal Y)$. By compatible we mean that if $f = g\otimes h$ with $g\in\ell(\mathcal X)$ and $h\in\ell(\mathcal Y)$ (i.e., $f(x,y) = g(x)h(y)$) then
(3) $$\|f\|_{(p,q)} = \|g\|_p\, \|h\|_q.$$
To this end, any vector $f\in\ell(\mathcal X\times\mathcal Y)$ can be thought of as a collection of vectors $f_x\in\ell(\mathcal Y)$ indexed by $x\in\mathcal X$, where $f_x(y) = f(x,y)$. Then we may define
$$\|f\|_{(p,q)} = \Big(\sum_{x\in\mathcal X} \|f_x\|_q^p\Big)^{1/p}.$$
This definition of the $(p,q)$-norm satisfies (3). Moreover, when $p=q$, this norm coincides with the usual $p$-norm on $\ell(\mathcal X\times\mathcal Y)$. Finally, it is not hard to verify that the $(p,q)$-norm, for $p,q\ge 1$, is indeed a norm and satisfies the triangle inequality.
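As a small numerical check of the definition just given, the following sketch (the function name `pq_norm` is ours, not from the paper) computes the $(p,q)$-norm of a vector on $\mathcal X\times\mathcal Y$ stored as a matrix, and verifies the compatibility property (3) on a product vector:

```python
import numpy as np

def pq_norm(f, p, q):
    """(p,q)-norm of f on X x Y: view f as a collection of rows f_x in l(Y),
    take the q-norm of each row, then the p-norm of the resulting values."""
    row_q_norms = np.sum(np.abs(f) ** q, axis=1) ** (1.0 / q)
    return np.sum(row_q_norms ** p) ** (1.0 / p)

rng = np.random.default_rng(0)
g, h = rng.standard_normal(4), rng.standard_normal(5)
f = np.outer(g, h)  # product vector f(x,y) = g(x) h(y)

# Compatibility (3): the (p,q)-norm of a product factorizes as ||g||_p * ||h||_q.
p, q = 1.0, 2.0
lhs = pq_norm(f, p, q)
rhs = np.sum(np.abs(g) ** p) ** (1 / p) * np.sum(np.abs(h) ** q) ** (1 / q)
assert np.isclose(lhs, rhs)

# When p = q the (p,q)-norm is the usual p-norm of the flattened vector.
assert np.isclose(pq_norm(f, 2, 2), np.linalg.norm(f))
```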
The $p$-norm can also be defined in the noncommutative case. Suppose that $\mathcal H$ is a Hilbert space of finite dimension $d$. Let $\mathcal B(\mathcal H)$ be the space of linear operators acting on $\mathcal H$. Again we can define
$$\|X\|_p = \big(\mathrm{tr}\, |X|^p\big)^{1/p},$$
where $|X| = \sqrt{X^\dagger X}$, and $X^\dagger$ is the adjoint of $X$. For $p\ge 1$ this equips $\mathcal B(\mathcal H)$ with a norm, called the Schatten $p$-norm, that satisfies the triangle inequality. Hölder’s inequality is also satisfied for Schatten norms [13]: if $\frac{1}{r} = \frac{1}{p} + \frac{1}{q}$ with $p, q, r\ge 1$, then for $X, Y\in\mathcal B(\mathcal H)$ we have
(4) $$\|XY\|_r \le \|X\|_p\, \|Y\|_q.$$
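Since the Schatten $p$-norm is the $\ell_p$-norm of the singular values, it is easy to evaluate numerically; the following sketch (helper names are ours) checks Hölder's inequality (4) on random complex matrices:

```python
import numpy as np

def schatten_norm(X, p):
    """Schatten p-norm of X: the l_p norm of its singular values,
    which equals (tr |X|^p)^(1/p)."""
    s = np.linalg.svd(X, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

rng = np.random.default_rng(1)
d = 4
X = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
Y = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))

# Hoelder's inequality (4): ||XY||_r <= ||X||_p ||Y||_q with 1/r = 1/p + 1/q.
p, q = 3.0, 1.5
r = 1.0 / (1.0 / p + 1.0 / q)
assert schatten_norm(X @ Y, r) <= schatten_norm(X, p) * schatten_norm(Y, q) + 1e-9
```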
Our notation in the noncommutative case can be made compatible with the commutative case. By abuse of notation, an element $f\in\ell(\mathcal X)$ can be thought of as a diagonal matrix of the form
$$\sum_{x\in\mathcal X} f(x)\, |x\rangle\langle x|,$$
acting on a Hilbert space with orthonormal basis $\{|x\rangle:\, x\in\mathcal X\}$. Therefore, $\ell(\mathcal X)$ can be thought of as a subspace of $\mathcal B(\mathcal H)$. We also have that the $p$-norm of $f$ as a vector coincides with the Schatten $p$-norm of the associated diagonal matrix.
Now the question is how to define the $(p,q)$-norm in the noncommutative case. Let us start with the easy case where the first subsystem is classical. Then, following the above notation, an element $X\in\ell(\mathcal X)\otimes\mathcal B(\mathcal H)$ can be written as
$$X = \sum_{x\in\mathcal X} |x\rangle\langle x| \otimes X_x,$$
with $X_x\in\mathcal B(\mathcal H)$. Similar to the fully commutative case we can define
(5) $$\|X\|_{(p,q)} = \Big(\sum_{x\in\mathcal X} \|X_x\|_q^p\Big)^{1/p}.$$
Now let us turn to the fully noncommutative case. In this case, the definition of the $(p,q)$-norm is not easy and is derived from interpolation theory [14]. Here, we present an equivalent definition provided in [15] (see also [16]). We also focus on the case of the $(1,p)$-norm, which is the one we need in this paper. Since $p\ge 1$, there exists $p'$ such that $\frac{1}{p} + \frac{1}{p'} = 1$. Then for any $X_{AB}\in\mathcal B(\mathcal H_A\otimes\mathcal H_B)$ we define
(6) $$\|X_{AB}\|_{(1,p)} = \inf_{\sigma_A,\,\tau_A} \big\| (\sigma_A^{-1/2p'}\otimes I_B)\, X_{AB}\, (\tau_A^{-1/2p'}\otimes I_B) \big\|_p,$$
where the infimum is taken over all density matrices $\sigma_A, \tau_A$ (a density matrix is a positive semidefinite operator with trace one), and $I_B$ is the identity operator acting on $\mathcal H_B$. In the following, for simplicity we sometimes suppress the identity operators in expressions of the form $\sigma_A\otimes I_B$ and write, e.g., $\sigma_A^{-1/2p'} X_{AB}\, \tau_A^{-1/2p'}$. Therefore,
$$\|X_{AB}\|_{(1,p)} = \inf_{\sigma_A,\,\tau_A} \big\| \sigma_A^{-1/2p'}\, X_{AB}\, \tau_A^{-1/2p'} \big\|_p.$$
When $p\ge 1$, the $(1,p)$-norm satisfies the triangle inequality and is indeed a norm. Some remarks are in order.
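Evaluating the infimum in (6) exactly requires an optimization over pairs of density matrices, but any particular choice of $\sigma_A, \tau_A$ yields an upper bound on $\|X_{AB}\|_{(1,p)}$. The sketch below (a crude random search, with helper names of our own, and assuming the formula for the $(1,p)$-norm as stated above with $p>1$) illustrates this:

```python
import numpy as np

def mat_power(S, t):
    """S**t for a positive definite matrix S, via eigendecomposition."""
    w, v = np.linalg.eigh(S)
    return (v * w ** t) @ v.conj().T

def random_density(d, rng):
    """A random full-rank density matrix (positive definite, trace one)."""
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    S = G @ G.conj().T
    return S / np.trace(S).real

def schatten_norm(X, p):
    s = np.linalg.svd(X, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

def one_p_norm_upper_bound(X_AB, dA, dB, p, trials=200, seed=0):
    """Upper-bound the (1,p)-norm by evaluating the objective in (6)
    at randomly sampled density matrices sigma_A, tau_A.  Requires p > 1."""
    pprime = p / (p - 1.0)  # Hoelder conjugate, 1/p + 1/p' = 1
    rng = np.random.default_rng(seed)
    I_B = np.eye(dB)
    best = np.inf
    for _ in range(trials):
        sigma, tau = random_density(dA, rng), random_density(dA, rng)
        L = np.kron(mat_power(sigma, -1.0 / (2 * pprime)), I_B)
        R = np.kron(mat_power(tau, -1.0 / (2 * pprime)), I_B)
        best = min(best, schatten_norm(L @ X_AB @ R, p))
    return best
```

By Hölder's inequality, every value of the objective in (6) is at least the trace norm of $X_{AB}$, so for a density matrix input the returned bound is never below one; this serves as a sanity check on the sketch.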
Remark 1.
As in the commutative case, the order of subsystems in the above definition is important; that is, the $(1,p)$-norms computed with respect to the two orderings of the subsystems are in general different.
Remark 2.
From Hölder’s inequality (4), one can derive that if , then
Remark 3.
Remark 5.
We will compare our measure of correlation with Rényi mutual information, which interestingly can also be written in terms of these norms. For $p>1$ the sandwiched Rényi relative entropy is defined by (all logarithms in this paper are in base two)
$$D_p(\rho\|\sigma) = p' \log \big\| \sigma^{-1/2p'}\, \rho\, \sigma^{-1/2p'} \big\|_p,$$
where $p'$ is the Hölder conjugate of $p$ given by (2). The Rényi mutual information (Sibson’s proposal) is given by
$$I_p(A;B)_\rho = \min_{\sigma_B} D_p(\rho_{AB}\,\|\,\rho_A\otimes\sigma_B),$$
where the minimum is taken over density matrices $\sigma_B$ (see [17] for different definitions and properties of Rényi mutual information).
Using the definition of the $(1,p)$-norm and Remark 5, we find that $I_p(A;B)_\rho$ can be expressed in terms of the $(1,p)$-norm.
In particular, for classical random variables $X$ and $Y$ with joint distribution $p_{XY}$ we have
(8) $$I_p(X;Y) = p' \log \sum_y \Big( \sum_x p_X(x)\, p_{Y|X}(y|x)^p \Big)^{1/p}.$$
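In the classical case this quantity is a simple sum, as in (8) above for Sibson's proposal. A minimal sketch (function names are ours) computes it from a joint pmf and checks two standard sanity properties, that it vanishes for independent variables and approaches Shannon's mutual information as $p\to 1$:

```python
import numpy as np

def sibson_mi(p_xy, p):
    """Sibson's Renyi mutual information of order p for a joint pmf p_xy
    (rows indexed by x, columns by y), base-2 logs:
    I_p = p' * log2( sum_y ( sum_x p(x) p(y|x)^p )^(1/p) ), p' = p/(p-1)."""
    pprime = p / (p - 1.0)
    p_x = p_xy.sum(axis=1, keepdims=True)
    cond = p_xy / p_x                       # p(y|x)
    inner = (p_x * cond ** p).sum(axis=0)   # sum_x p(x) p(y|x)^p, for each y
    return pprime * np.log2(np.sum(inner ** (1.0 / p)))

def shannon_mi(p_xy):
    """Shannon mutual information in bits."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2((p_xy / (p_x * p_y))[mask])))

# Independent variables: Sibson mutual information is zero for every order.
p_indep = np.outer([0.3, 0.7], [0.5, 0.5])
assert abs(sibson_mi(p_indep, 2.0)) < 1e-12

# As p -> 1, Sibson's quantity approaches Shannon mutual information.
p_xy = np.array([[0.4, 0.1], [0.1, 0.4]])
assert abs(sibson_mi(p_xy, 1.001) - shannon_mi(p_xy)) < 0.01
```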
Finally, the Rényi conditional entropy is defined by
(9) $$H_p(A|B)_\rho = -\min_{\sigma_B} D_p(\rho_{AB}\,\|\, I_A\otimes\sigma_B),$$
where the minimum is taken over density matrices $\sigma_B$.
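The sandwiched relative entropy underlying these definitions is straightforward to evaluate for explicit matrices. The sketch below (assuming the standard sandwiched form, consistent with the expression given earlier, and with helper names of our own) computes it and checks that it vanishes on equal states and is nonnegative on pairs of states:

```python
import numpy as np

def mat_power(S, t):
    """S**t for a positive semidefinite matrix S (eigenvalues floored
    at a small cutoff for numerical stability)."""
    w, v = np.linalg.eigh(S)
    return (v * np.maximum(w, 1e-12) ** t) @ v.conj().T

def random_density(d, rng):
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    S = G @ G.conj().T
    return S / np.trace(S).real

def sandwiched_renyi(rho, sigma, p):
    """Sandwiched Renyi relative entropy of order p > 1 (base-2 logs):
    D_p(rho||sigma) = (1/(p-1)) log2 tr[(sigma^((1-p)/2p) rho sigma^((1-p)/2p))^p]."""
    s = mat_power(sigma, (1.0 - p) / (2.0 * p))
    Y = s @ rho @ s
    w = np.linalg.eigvalsh(Y)
    return np.log2(np.sum(np.maximum(w, 0.0) ** p)) / (p - 1.0)

rng = np.random.default_rng(3)
rho, sigma = random_density(3, rng), random_density(3, rng)

assert abs(sandwiched_renyi(rho, rho, 2.0)) < 1e-6   # D(rho||rho) = 0
assert sandwiched_renyi(rho, sigma, 2.0) >= -1e-9    # nonnegative on states
```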
We finish this section by stating a lemma about the monotonicity of the $(1,p)$-norm.
Lemma 6.
For any operator $X$ and any density matrix $\sigma$, the function $p\mapsto p'\log \big\|\sigma^{-1/2p'}\, X\, \sigma^{-1/2p'}\big\|_p$ is nondecreasing on $(1,\infty)$.
Proof.
Let $p\le q$, and choose the exponents so that Hölder’s inequality applies. Using Hölder’s inequality, for arbitrary density matrices we obtain the desired pointwise bound. Taking the infimum over density matrices we obtain the desired result. ∎
2.1 Completely bounded norm
The completely bounded norm of a superoperator $\Phi$ is defined by
$$\|\Phi\|_{\mathrm{cb}} = \sup_{\mathcal K} \big\|\Phi\otimes \mathrm{id}_{\mathcal K}\big\|,$$
where the supremum is taken over all auxiliary Hilbert spaces $\mathcal K$ of arbitrary dimension and $\mathrm{id}_{\mathcal K}$ is the identity superoperator on $\mathcal B(\mathcal K)$. In the above definition, the supremum may be restricted to a suitable class of auxiliary spaces; see [14]. That is, for any such choice we have
(10) 
We say that a superoperator between spaces with certain norms is a complete contraction if its completely bounded norm is at most $1$.
Lemma 7.
For any and we have
3 A new measure of correlation
In this section, we define our measure of correlation and study some of its properties.
Definition 1.
Let $\rho_{AB}$ be an arbitrary bipartite density matrix. For any $p\ge 1$ we define
(11)  
(12) 
where $\rho_A$ and $\rho_B$ are the marginal states of $\rho_{AB}$ on the $A$ and $B$ subsystems, respectively.
As will be seen below, the quantity defined in (11) is a measure of correlation, while the quantity in (12) is a related quantity that may be thought of as a conditional entropy.
By Remark 4, both quantities can be expressed in terms of the $(1,p)$-norm:
(13)  
(14) 
As an immediate property of the above definitions, both quantities are nonnegative. Moreover, since they are defined in terms of a norm, the measure of correlation in (11) vanishes if and only if $\rho_{AB} = \rho_A\otimes\rho_B$; an analogous equality condition holds for the quantity in (12).
Proposition 8.
For any bipartite density matrix $\rho_{AB}$, the quantities defined in (11) and (12) are nondecreasing functions of $p$. In particular, for any $p\le q$ the value at $p$ is bounded by the value at $q$.
Proof.
For the monotonicity of the quantity in (11), apply Lemma 6 with the appropriate substitutions. The monotonicity of the quantity in (12) follows similarly with the other choice of substitutions. ∎
We now prove the main property of our two quantities, namely their monotonicity under local operations.
Theorem 9 (Monotonicity under local operations).

For any bipartite density matrix and all CPTP maps acting on the two subsystems, we have
where .

For any and any CPTP map
where and is the identity superoperator.
Proof.
For (i) we compute
where $\|\cdot\|$ denotes the superoperator norm:
Now using equation (3.5) and Theorem 13 of [16] we have
On the other hand, using Lemma 9 of [18] (see also [19]) we have
Moreover, by Lemma 5 of [16] we have
since the map is CPTP. Combining the above bounds, we conclude the desired inequality.
The proof of (ii) is similar, so we skip it. ∎
We now state the relation between and Rényi information measures.
Proposition 10.
For any bipartite density matrix we have
where $p'$ is the Hölder conjugate of $p$. For $p$ in the stated range we have
where .
Proof.
By the triangle inequality we have
Moreover, by Remark 2 we have
These give the first inequality. The proof of the second inequality is similar. ∎
Theorem 11.
Let $\rho_{ABC}$ be a tripartite density matrix. Then the following hold:

For any we have

Assume that . Then for any we have
Moreover, if is classical (and ) then
where we define
Proof.
The proof of (ii) is immediate once we have (i) since if then
Moreover, when is classical and
with , we have
So we only need to prove (i).
Define by
We claim that
(15) 
Since vector-valued $L_p$ spaces form an interpolation family [14], by the Riesz-Thorin theorem (see Appendix A) it suffices to prove this for the two endpoint values of $p$. For the first endpoint, by the triangle inequality we have
where the second inequality comes from a fact that is easy to verify. We now prove the inequality for the other endpoint. We compute
Then (15) holds for all and for any we have
Letting
in the above inequality we obtain the desired result.
∎
The next theorem gives a “weak converse” of the above inequalities.
Theorem 12.
For every tripartite density matrix $\rho_{ABC}$ and every admissible $p$, the following hold:

.

If then