Relaxed Wyner's Common Information

12/15/2019 ∙ by Erixhen Sula, et al.

A natural relaxation of Wyner's Common Information is studied. Specifically, the constraint of conditional independence is replaced by an upper bound on the conditional mutual information. While of interest in its own right, this relaxation has operational significance in a source coding problem that models coded caching. For the special case of jointly Gaussian random variables, it is shown that (relaxed) Wyner's Common Information is attained by a Gaussian auxiliary, and a closed-form formula is found. In the case of Gaussian vectors, this is shown to lead to a novel allocation problem. Finally, using the same techniques, it is also shown that for the lossy Gray-Wyner network with Gaussian sources and mean-squared error, Gaussian auxiliaries are optimal, which leads to closed-form solutions.
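For concreteness, the relaxation described above can be sketched in standard notation (a sketch only; the paper's exact notation and conventions may differ):

```latex
% Wyner's common information requires conditional independence, I(X;Y|W) = 0.
% The relaxed version replaces this with an upper bound \delta \ge 0:
C_\delta(X;Y) \;=\; \inf_{P_{W \mid X,Y}\,:\; I(X;Y \mid W) \le \delta} I(X,Y;W).
% Setting \delta = 0 recovers Wyner's common information C(X;Y).
```

The infimum is over all auxiliary random variables $W$ jointly distributed with $(X,Y)$; the main Gaussian result states that for jointly Gaussian $(X,Y)$ the infimum is attained by a Gaussian $W$.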
