Incremental Refinements and Multiple Descriptions with Feedback

11/05/2020
by Jan Østergaard, et al.

It is well known that independent (separate) encoding of K correlated sources may incur a rate loss compared to joint encoding, even when the decoding is done jointly. This loss is particularly evident in the multiple descriptions problem, where the sources are repetitions of the same source, yet each description must be individually good. We observe that, under mild conditions on the source and distortion measure, the rate ratio R_independent(K)/R_joint goes to one in the limit of small rate/high distortion. Moreover, we consider the excess rate with respect to the rate-distortion function, R_independent(K, M) - R(D), incurred in M rounds of K independent encodings with a final distortion level D. We provide two examples, a Gaussian source with mean-squared error and an exponential source with one-sided error, for which the excess rate vanishes in the limit as the number of rounds M goes to infinity, for any fixed D and K. This result has an interesting interpretation for a multi-round variant of the multiple descriptions problem, in which after each round the encoder receives (block) feedback indicating which of the descriptions arrived: in the limit of many incremental rounds (M going to infinity), the total rate of the received descriptions approaches the rate-distortion function. We provide theoretical and experimental evidence that this phenomenon is in fact more general than these two examples.
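To make the Gaussian/MSE example concrete, here is a minimal Python sketch of the joint (successive-refinement) baseline; it is our own illustration, not the paper's coding scheme, and the geometric distortion schedule D_m = sigma^2 * (D/sigma^2)^(m/M) is an assumption made for simplicity. A Gaussian source is successively refinable, so the M per-round rates sum exactly to R(D) = (1/2) log2(sigma^2/D), while each round's rate shrinks as R(D)/M:

```python
import numpy as np

def gaussian_rd(D, sigma2=1.0):
    """Rate-distortion function of a Gaussian source under MSE, in bits."""
    return 0.5 * np.log2(sigma2 / D) if D < sigma2 else 0.0

def per_round_rates(D, M, sigma2=1.0):
    """Per-round rates for the (assumed) geometric distortion schedule
    D_m = sigma2 * (D/sigma2)**(m/M), m = 0..M, under successive refinement."""
    Ds = sigma2 * (D / sigma2) ** (np.arange(M + 1) / M)
    # Rate of round m is 0.5 * log2(D_{m-1} / D_m).
    return 0.5 * np.log2(Ds[:-1] / Ds[1:])

D = 0.01  # final distortion; source variance sigma2 = 1
for M in (1, 4, 16, 64):
    r = per_round_rates(D, M)
    print(f"M={M:3d}: per-round rate {r[0]:.4f} bits, "
          f"total {r.sum():.4f}, R(D) = {gaussian_rd(D):.4f}")
```

The total rate stays at R(D) for every M, while the per-round rate R(D)/M vanishes. That vanishing per-round rate is exactly the small-rate/high-distortion regime in which R_independent(K)/R_joint approaches one, which is the mechanism behind the multi-round result stated in the abstract.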


