On Distributed Lossy Coding of Symmetrically Correlated Gaussian Sources

01/19/2022
by Siyao Zhou, et al.

In this paper, we consider a distributed lossy compression network with L encoders and a decoder. Each encoder observes a source, compresses it, and sends the compressed version to the decoder. Each observed source can be written as the sum of a target signal and a noise, generated independently from two symmetric multivariate Gaussian distributions. The decoder jointly reconstructs the target signals subject to a threshold on the mean squared error distortion. We are interested in the minimum compression rate of this network as a function of the distortion threshold, known as the rate-distortion function. We derive a lower bound on the rate-distortion function by explicitly solving a convex program. The proposed lower bound matches the well-known Berger-Tung upper bound for some values of the distortion threshold. Asymptotic expressions of the upper and lower bounds are derived in the large-L limit. Under specific constraints, the two bounds coincide in the asymptotic regime, yielding a characterization of the rate-distortion function.
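As a rough illustration of the observation model in the abstract (not code from the paper, and with hypothetical parameter values), each encoder sees Y_i = X_i + N_i, where the target vector X and the noise vector N are drawn independently from symmetric (exchangeable) multivariate Gaussian distributions, i.e. covariance matrices with a common variance on the diagonal and a common covariance off the diagonal:

```python
import numpy as np

def symmetric_cov(L, variance, covariance):
    """Covariance matrix of an exchangeable Gaussian vector:
    a common variance on the diagonal, a common covariance off it."""
    return (variance - covariance) * np.eye(L) + covariance * np.ones((L, L))

rng = np.random.default_rng(0)
L, n = 4, 100_000                      # number of encoders, sample length
# Hypothetical parameter values, chosen only for illustration.
Sigma_X = symmetric_cov(L, variance=1.0, covariance=0.5)   # target signal
Sigma_N = symmetric_cov(L, variance=0.2, covariance=0.05)  # noise

X = rng.multivariate_normal(np.zeros(L), Sigma_X, size=n)  # target signals
N = rng.multivariate_normal(np.zeros(L), Sigma_N, size=n)  # independent noise
Y = X + N                              # what the encoders actually observe

# Because the target and noise are independent, the observation
# covariance is approximately Sigma_X + Sigma_N.
print(np.round(np.cov(Y, rowvar=False), 2))
```

Since the sum of the two independent symmetric Gaussian vectors is again symmetric, the observations themselves form a symmetrically correlated Gaussian source, which is the setting the rate-distortion analysis addresses.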
