
A coding theorem for the rate-distortion-perception function

by Lucas Theis et al.

The rate-distortion-perception function (RDPF; Blau and Michaeli, 2019) has emerged as a useful tool for reasoning about the realism and distortion of reconstructions in lossy compression. Unlike the rate-distortion function, however, it was unknown whether encoders and decoders exist that achieve the rate suggested by the RDPF. Building on results by Li and El Gamal (2018), we show that the RDPF can indeed be achieved using stochastic, variable-length codes. For this class of codes, we also prove that the RDPF lower-bounds the achievable rate.
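For context, the RDPF of Blau and Michaeli (2019) augments Shannon's rate-distortion function with a perception constraint measuring the divergence between the source and reconstruction distributions. In common notation (symbols here follow the usual convention and are not taken from this page), it is defined as

```latex
R(D, P) \;=\; \min_{p_{\hat{X} \mid X}} \; I(X; \hat{X})
\quad \text{subject to} \quad
\mathbb{E}\big[\Delta(X, \hat{X})\big] \le D,
\qquad
d\big(p_X,\, p_{\hat{X}}\big) \le P,
```

where \(\Delta\) is a distortion measure and \(d\) is a divergence between distributions (e.g., total variation or Wasserstein distance). Setting \(P = \infty\) recovers the classical rate-distortion function.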

