Vector Gaussian Successive Refinement With Degraded Side Information
We investigate the successive refinement problem for Wyner-Ziv coding with degraded side information and obtain a complete characterization of the rate region in the quadratic vector Gaussian case. The achievability part follows from an evaluation of the Tian-Diggavi inner bound with Gaussian auxiliary random vectors. For the converse part, a matching outer bound is obtained with the aid of a new extremal inequality, whose proof combines the monotone path argument and the doubling trick with information-estimation relations.
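For orientation, the classical single-stage counterpart of this problem is the quadratic Gaussian Wyner-Ziv setting, whose scalar rate-distortion function is well known; the sketch below is background under standard assumptions and is not taken from the paper itself (the symbols $X$, $Y_1$, $Y_2$, $D$, and $\sigma^2_{X|Y}$ are illustrative).

% Background sketch (assumed notation, not the paper's):
% X is a Gaussian source, Y is decoder side information, D is the
% mean-squared distortion target, and \sigma^2_{X|Y} is the conditional
% variance of X given Y. The scalar quadratic Gaussian Wyner-Ziv
% rate-distortion function is
\[
  R_{\mathrm{WZ}}(D) \;=\; \frac{1}{2}\log^{+}\!\frac{\sigma^{2}_{X|Y}}{D},
  \qquad \log^{+}x \;\triangleq\; \max\{\log x,\,0\}.
\]
% In the successive-refinement setting with degraded side information,
% degradedness is usually modeled by a Markov chain X -- Y_2 -- Y_1,
% i.e. the refinement decoder's side information Y_2 is statistically
% stronger than the base decoder's Y_1.

The paper's contribution concerns the vector extension of this setting, where the rate region is characterized via log-determinant expressions rather than the scalar variance ratio above.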