End-to-end Optimized Image Compression

11/05/2016
by Johannes Ballé, et al.

We describe an image compression method, consisting of a nonlinear analysis transformation, a uniform quantizer, and a nonlinear synthesis transformation. The transforms are constructed in three successive stages of convolutional linear filters and nonlinear activation functions. Unlike most convolutional neural networks, the joint nonlinearity is chosen to implement a form of local gain control, inspired by those used to model biological neurons. Using a variant of stochastic gradient descent, we jointly optimize the entire model for rate-distortion performance over a database of training images, introducing a continuous proxy for the discontinuous loss function arising from the quantizer. Under certain conditions, the relaxed loss function may be interpreted as the log likelihood of a generative model, as implemented by a variational autoencoder. Unlike these models, however, the compression model must operate at any given point along the rate-distortion curve, as specified by a trade-off parameter. Across an independent set of test images, we find that the optimized method generally exhibits better rate-distortion performance than the standard JPEG and JPEG 2000 compression methods. More importantly, we observe a dramatic improvement in visual quality for all images at all bit rates, which is supported by objective quality estimates using MS-SSIM.
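The abstract names three ingredients: a local gain-control nonlinearity (generalized divisive normalization), a uniform quantizer whose discontinuity is relaxed during training, and a rate-distortion objective with a trade-off parameter. A minimal NumPy sketch of these ideas, with illustrative parameter values that are not taken from the paper:

```python
import numpy as np

def gdn(x, beta, gamma):
    """Local gain control (generalized divisive normalization):
    y_i = x_i / sqrt(beta_i + sum_j gamma_ij * x_j**2)."""
    return x / np.sqrt(beta + gamma @ (x ** 2))

def quantize(y, rng=None):
    """Uniform quantizer: hard rounding at test time. During training,
    the discontinuous rounding is replaced by a continuous proxy --
    additive uniform noise on (-0.5, 0.5)."""
    if rng is None:
        return np.round(y)                           # test-time quantization
    return y + rng.uniform(-0.5, 0.5, size=y.shape)  # training relaxation

def rd_loss(rate, distortion, lam):
    """Rate-distortion objective R + lambda * D; lambda selects the
    operating point along the rate-distortion curve."""
    return rate + lam * distortion

# Toy usage with made-up values:
x = np.array([1.0, -2.0, 0.5])       # stand-in for a transform output
beta = np.ones(3)
gamma = 0.1 * np.ones((3, 3))
y = gdn(x, beta, gamma)              # normalized response
q = quantize(y)                      # hard-quantized code
loss = rd_loss(rate=2.0, distortion=0.5, lam=4.0)
```

In the full model, `gdn` would follow each convolutional stage of the analysis transform, and the rate term would come from an entropy model over the quantized codes; both are omitted here for brevity.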


