The Common Information of N Dependent Random Variables

10/18/2010
by Wei Liu, et al.

This paper generalizes Wyner's definition of the common information of a pair of random variables to that of N random variables. We prove coding theorems showing that the operational meanings of the common information of two random variables carry over to N random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to N source sequences with N decoders. We also establish a monotone property of Wyner's common information, in contrast to other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness. Examples of computing Wyner's common information of N random variables are also given.
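As a point of reference for one of the notions contrasted above, here is a minimal sketch of computing Shannon's mutual information I(X;Y) from a joint pmf. This illustrates the standard definition only, not the paper's method for Wyner's common information; the function name and the example distribution are illustrative choices, not taken from the paper.

```python
import numpy as np

def mutual_information(p_xy):
    """Shannon mutual information I(X;Y) in bits for a joint pmf matrix p_xy."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    mask = p_xy > 0                          # skip zero-probability cells
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Example: a doubly symmetric binary source with crossover probability a
a = 0.1
p = np.array([[(1 - a) / 2, a / 2],
              [a / 2, (1 - a) / 2]])
print(mutual_information(p))
```

For independent variables the function returns 0, and for two identical fair bits it returns 1 bit; Wyner's common information, by contrast, is defined as a minimization of I(X,Y;W) over auxiliary variables W making X and Y conditionally independent, and is in general at least as large as I(X;Y).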
