Relaxed Wyner's Common Information

12/15/2019
by Erixhen Sula, et al.

A natural relaxation of Wyner's Common Information is studied. Specifically, the constraint of conditional independence is replaced by an upper bound on the conditional mutual information. While of interest in its own right, this relaxation has operational significance in a source coding problem that models coded caching. For the special case of jointly Gaussian random variables, it is shown that relaxed Wyner's Common Information is attained by a Gaussian auxiliary random variable, and a closed-form formula is found. For Gaussian vectors, this result is shown to lead to a novel allocation problem. Finally, using the same techniques, it is also shown that for the lossy Gray-Wyner network with Gaussian sources under mean-squared error distortion, Gaussian auxiliaries are optimal, which leads to closed-form solutions.
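For concreteness, the relaxation described in the abstract can be written down explicitly. The following is a sketch in standard notation; the auxiliary variable W and the relaxation level \gamma \ge 0 are our labels, not notation taken from the paper:

  C_\gamma(X;Y) = \min_{ P_{W|X,Y} \,:\, I(X;Y|W) \le \gamma } I(X,Y;W)

Setting \gamma = 0 enforces I(X;Y|W) = 0, i.e., the Markov chain X - W - Y, and recovers Wyner's original Common Information. As a point of reference, for a jointly Gaussian pair with correlation coefficient \rho, the unrelaxed quantity is the classical

  C_0(X;Y) = \frac{1}{2} \log \frac{1+\rho}{1-\rho}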


Related research

02/01/2021 · On conditional Sibson's α-Mutual Information
In this work, we analyse how to define a conditional version of Sibson's...

05/03/2021 · Partial Information Decomposition via Deficiency for Multivariate Gaussians
We consider the problem of decomposing the information content of three ...

02/02/2020 · The Gaussian lossy Gray-Wyner network
We consider the problem of source coding subject to a fidelity criterion...

07/02/2018 · Gaussian Signalling for Covert Communications
In this work, we examine the optimality of Gaussian signalling for cover...

11/12/2020 · Bottleneck Problems: Information and Estimation-Theoretic View
Information bottleneck (IB) and privacy funnel (PF) are two closely rela...

12/16/2019 · On Zero-Delay RDF for Vector-Valued Gauss-Markov Sources with Additional Noise
We consider a zero-delay remote source coding problem where a hidden sou...
