
A coding theorem for the rate-distortion-perception function

04/28/2021
by   Lucas Theis, et al.

The rate-distortion-perception function (RDPF; Blau and Michaeli, 2019) has emerged as a useful tool for thinking about realism and distortion of reconstructions in lossy compression. Unlike the rate-distortion function, however, it is unknown whether encoders and decoders exist that achieve the rate suggested by the RDPF. Building on results by Li and El Gamal (2018), we show that the RDPF can indeed be achieved using stochastic, variable-length codes. For this class of codes, we also prove that the RDPF lower-bounds the achievable rate.
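For reference, the RDPF of Blau and Michaeli (2019) extends the classical rate-distortion function with a constraint on the divergence between the source and reconstruction distributions. A standard way to write it (the distortion measure Δ and divergence d follow their formulation; exact notation may differ slightly from the paper) is:

```latex
R(D, P) \;=\; \min_{p_{\hat{X} \mid X}} \; I(X; \hat{X})
\quad \text{s.t.} \quad
\mathbb{E}\!\left[\Delta(X, \hat{X})\right] \le D,
\qquad
d\!\left(p_X, \, p_{\hat{X}}\right) \le P.
```

Setting P = ∞ removes the perception constraint and recovers Shannon's rate-distortion function; the coding theorem discussed above concerns whether the rate R(D, P) is operationally achievable for finite P.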


04/12/2022

On the Rate-Distortion-Perception Function

Rate-distortion-perception theory generalizes Shannon's rate-distortion ...
03/29/2022

Mismatched Rate-Distortion Theory: Ensembles, Bounds, and General Alphabets

In this paper, we consider the mismatched rate-distortion problem, in wh...
11/30/2018

Rate-Distortion-Perception Tradeoff of Variable-Length Source Coding for General Information Sources

Blau and Michaeli recently introduced a novel concept for inverse proble...
02/21/2018

Non-Asymptotic Bounds and a General Formula for the Rate-Distortion Region of the Successive Refinement Problem

In the successive refinement problem, a fixed-length sequence emitted fr...
11/27/2021

On Lossy Compression of Directed Graphs

The method of types presented by Csiszár and Körner is a central tool us...
02/18/2021

On the advantages of stochastic encoders

Stochastic encoders have been used in rate-distortion theory and neural ...
06/18/2021

Universal Rate-Distortion-Perception Representations for Lossy Compression

In the context of lossy compression, Blau & Michaeli (2019) adopt a ma...