Wyner's Common Information: Generalizations and A New Lossy Source Coding Interpretation
Wyner's common information was originally defined for a pair of dependent discrete random variables. Its significance is largely reflected in, and hence also confined to, several existing interpretations in specific source coding problems. This paper attempts both to generalize its definition and to expand its practical significance by providing a new operational interpretation. The generalization is twofold: the number of dependent variables can be arbitrary, and so can the alphabets of those random variables. New properties are established for the generalized Wyner's common information of N dependent variables. More importantly, a lossy source coding interpretation of Wyner's common information is developed using the Gray-Wyner network. In particular, it is established that the common information equals the smallest common message rate when the total rate is arbitrarily close to the rate distortion function with joint decoding. A surprising observation is that this equality holds independently of the values of the distortion constraints, as long as the distortions lie within a certain distortion region. Examples of the computation of the common information are given, including that of a pair of dependent Gaussian random variables.
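For context, a brief recap of the quantities involved (standard definitions, not part of the original abstract): for a pair of dependent random variables $(X_1, X_2)$, Wyner's common information is

$$ C(X_1; X_2) = \inf_{W :\, X_1 - W - X_2} I(X_1, X_2; W), $$

where the infimum is over all auxiliary variables $W$ such that $X_1$ and $X_2$ are conditionally independent given $W$ (the Markov chain $X_1 - W - X_2$). The generalization to $N$ dependent variables studied here replaces the Markov condition with conditional independence of all $N$ variables given $W$:

$$ C(X_1, \ldots, X_N) = \inf_{W} I(X_1, \ldots, X_N; W), \quad \text{s.t. } X_1, \ldots, X_N \text{ conditionally independent given } W. $$

Consistent with the Gaussian example mentioned in the abstract, the known closed form for a pair of jointly Gaussian random variables with correlation coefficient $\rho \in (-1, 1)$ is

$$ C(X_1; X_2) = \frac{1}{2} \log \frac{1 + |\rho|}{1 - |\rho|}. $$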