Compressing Images by Encoding Their Latent Representations with Relative Entropy Coding

10/02/2020
by Gergely Flamich, et al.

Variational Autoencoders (VAEs) have seen widespread use in learned image compression. They are used to learn expressive latent representations on which downstream compression methods can operate with high efficiency. Recently proposed 'bits-back' methods can indirectly encode the latent representation of images with codelength close to the relative entropy between the latent posterior and the prior. However, due to the underlying algorithm, these methods can only be used for lossless compression, and they only achieve their nominal efficiency when compressing multiple images simultaneously; they are inefficient for compressing single images. As an alternative, we propose a novel method, Relative Entropy Coding (REC), that can directly encode the latent representation with codelength close to the relative entropy for single images, supported by our empirical results obtained on the Cifar10, ImageNet32 and Kodak datasets. Moreover, unlike previous bits-back methods, REC is immediately applicable to lossy compression, where it is competitive with the state-of-the-art on the Kodak dataset.
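For intuition, the codelength that both bits-back methods and REC target is the relative entropy D_KL(q(z|x) || p(z)) between the latent posterior and the prior. The sketch below is an illustration of that information-theoretic target only, not the authors' encoding algorithm: it evaluates the relative entropy in bits for a hypothetical diagonal Gaussian posterior against a standard Gaussian prior, with the latent dimensionality and parameter values chosen purely for the example.

import numpy as np


def gaussian_kl_bits(mu, sigma):
    """Relative entropy D_KL(q || p) in bits for a diagonal Gaussian
    posterior q(z|x) = N(mu, diag(sigma^2)) against a standard Gaussian
    prior p(z) = N(0, I).

    This is only the nominal codelength that REC and bits-back methods
    aim for; it is not the REC encoding procedure itself.
    """
    kl_nats = 0.5 * np.sum(sigma**2 + mu**2 - 1.0 - 2.0 * np.log(sigma))
    return kl_nats / np.log(2.0)


# Hypothetical posterior parameters for one image's 64-dimensional latent.
rng = np.random.default_rng(0)
mu = rng.normal(scale=0.5, size=64)
sigma = np.full(64, 0.8)
print(f"Nominal codelength: {gaussian_kl_bits(mu, sigma):.1f} bits")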

