
The Gaussian lossy Gray-Wyner network

by Erixhen Sula et al.

We consider the problem of source coding subject to a fidelity criterion for the Gray-Wyner network, which connects a single source with two receivers via a common channel and two private channels. General lower bounds are derived for jointly Gaussian sources subject to the mean-squared error criterion, leveraging convex duality and an argument involving the factorization of convex envelopes. The Pareto-optimal trade-off between the sum-rate of the private channels and the rate of the common channel is completely characterized. Specifically, it is attained by selecting the auxiliary random variable to be jointly Gaussian with the sources.
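Two closed-form quantities from this setting can be evaluated numerically: Wyner's common information of a bivariate Gaussian pair with correlation coefficient ρ, which governs the minimal common-channel rate in the lossless limit, and the quadratic rate-distortion function of a scalar Gaussian source, which governs each private channel. The sketch below is illustrative only (the function names are ours, not from the paper) and uses the standard closed forms, not the paper's lossy characterization.

```python
import math

def gaussian_wyner_common_information(rho):
    """Wyner's common information of a bivariate Gaussian pair with
    correlation coefficient rho, in bits: C = 0.5 * log2((1+|rho|)/(1-|rho|)).
    Requires |rho| < 1."""
    r = abs(rho)
    return 0.5 * math.log2((1 + r) / (1 - r))

def gaussian_rate_distortion(sigma2, d):
    """Quadratic rate-distortion function of a N(0, sigma2) source, in bits:
    R(D) = 0.5 * log2(sigma2 / D) for 0 < D < sigma2, else 0."""
    if d >= sigma2:
        return 0.0
    return 0.5 * math.log2(sigma2 / d)

# Example: correlation 0.6 gives C = 0.5 * log2(1.6 / 0.4) = 1 bit,
# and a unit-variance source at distortion 0.25 needs R = 1 bit.
print(gaussian_wyner_common_information(0.6))  # 1.0
print(gaussian_rate_distortion(1.0, 0.25))     # 1.0
```

As ρ → 1 the common information diverges, reflecting that highly correlated sources are best served by a large common-channel rate.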



Related papers:

- Shannon Bounds on Lossy Gray-Wyner Networks
- On Distributed Lossy Coding of Symmetrically Correlated Gaussian Sources
- Relaxed Wyner's Common Information
- Secret Key Generation from Vector Gaussian Sources with Public and Private Communications
- Zero-Delay Rate Distortion via Filtering for Vector-Valued Gaussian Sources
- One-shot achievability and converse bounds of Gaussian random coding in AWGN channels under covert constraints