Towards Empirical Sandwich Bounds on the Rate-Distortion Function

11/23/2021
by Yibo Yang, et al.

The rate-distortion (R-D) function, a key quantity in information theory, characterizes the fundamental limit on how much a data source can be compressed, by any compression algorithm, subject to a fidelity criterion. As researchers push for ever-better compression performance, establishing the R-D function of a given data source is not only of scientific interest but also sheds light on how much room remains for improving compression algorithms. Previous work on this problem relied on distributional assumptions about the data source (Gibson, 2017) or applied only to discrete data. By contrast, this paper makes the first attempt at an algorithm for sandwiching the R-D function of a general (not necessarily discrete) source, requiring only i.i.d. data samples. We estimate R-D sandwich bounds for Gaussian and high-dimensional banana-shaped sources, as well as GAN-generated images. Our R-D upper bound on natural images indicates room for improving the performance of state-of-the-art image compression methods by 1 dB in PSNR at various bitrates.
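For context on what such sandwich bounds target, the Gaussian source is one of the few cases where the R-D function is known in closed form: for a source N(0, σ²) under squared-error distortion, R(D) = max(0, ½ log₂(σ²/D)) bits per sample. A minimal sketch of this reference curve (the function name `gaussian_rd` is ours, for illustration only):

```python
import math

def gaussian_rd(sigma2: float, dist: float) -> float:
    """Closed-form R-D function of a Gaussian source N(0, sigma2)
    under squared-error distortion, in bits per sample:
    R(D) = max(0, 0.5 * log2(sigma2 / D))."""
    if dist >= sigma2:
        # Distortion budget is met by always outputting the mean: zero rate.
        return 0.0
    return 0.5 * math.log2(sigma2 / dist)

# Any compressor's operational (rate, distortion) point for this source
# must lie on or above this curve, which is what makes an estimated
# upper bound on R(D) a statement about remaining room for improvement.
```

For a unit-variance source, halving the distortion from σ² costs exactly half a bit per sample; e.g. `gaussian_rd(1.0, 0.25)` gives 1.0 bit.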


Related research

Rate Distortion For Model Compression: From Theory To Practice (10/09/2018)
As the size of neural network models increases dramatically today, study...

On Distributed Lossy Coding of Symmetrically Correlated Gaussian Sources (01/19/2022)
In this paper, we consider a distributed lossy compression network with ...

Exact Rate-Distortion in Autoencoders via Echo Noise (04/15/2019)
Compression is at the heart of effective representation learning. Howeve...

The Dispersion of the Gauss-Markov Source (04/25/2018)
The Gauss-Markov source produces U_i = a U_{i-1} + Z_i for i ≥ 1, where U_0 ...

Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models (01/26/2023)
Lossy image compression aims to represent images in as few bits as possi...

Fundamental Limits of Communication Efficiency for Model Aggregation in Distributed Learning: A Rate-Distortion Approach (06/28/2022)
One of the main focuses in distributed learning is communication efficie...

On Lossy Compression of Directed Graphs (11/27/2021)
The method of types presented by Csiszár and Körner is a central tool us...
