Substitutional Neural Image Compression

05/16/2021
by Xiao Wang, et al.

We describe Substitutional Neural Image Compression (SNIC), a general approach for enhancing any neural image compression model that requires no additional data or tuning of the trained model. It boosts compression performance toward a flexible distortion metric and enables bit-rate control using a single model instance. The key idea is to replace the image to be compressed with a substitutional one that outperforms the original in a desired way. Finding such a substitute is inherently difficult for conventional codecs, yet surprisingly favorable for neural compression models thanks to their fully differentiable structures. With gradients of a particular loss backpropagated to the input, a desired substitute can be crafted efficiently and iteratively. We demonstrate the effectiveness of SNIC, when combined with various neural compression models and target metrics, in improving compression quality and performing bit-rate control as measured by rate-distortion curves. Empirical results on control precision and generation speed are also discussed.
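The core mechanism can be illustrated with a minimal sketch. The toy linear "codec" below (fixed `enc`/`dec` matrices, an L1 rate proxy on the latent, and the weight `lam`) is a hypothetical stand-in for a trained, differentiable neural compression model, not the paper's actual architecture; the point is only the SNIC loop itself: gradients of a rate-distortion loss, measured against the original image, are backpropagated to the codec *input*, which is updated iteratively to craft the substitute.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained neural codec: a fixed linear
# autoencoder.  SNIC itself works with any differentiable model.
D, K = 16, 4
enc = rng.normal(size=(K, D)) / np.sqrt(D)   # "analysis" transform
dec = rng.normal(size=(D, K)) / np.sqrt(K)   # "synthesis" transform
lam = 0.05                                   # rate-distortion trade-off

def rd_loss_and_grad(x, target):
    """R-D loss of compressing x, with distortion measured against the
    ORIGINAL image, plus its gradient w.r.t. the codec input x."""
    y = enc @ x                               # latent code
    x_hat = dec @ y                           # reconstruction
    dist = np.sum((x_hat - target) ** 2)      # distortion term
    rate = np.sum(np.abs(y))                  # L1 rate proxy
    # Backpropagate through the two linear maps by hand.
    grad = enc.T @ (dec.T @ (2.0 * (x_hat - target)) + lam * np.sign(y))
    return dist + lam * rate, grad

# Craft the substitute: start from the original and descend on the input.
x_orig = rng.normal(size=D)
x_sub = x_orig.copy()
loss_orig, _ = rd_loss_and_grad(x_orig, x_orig)
for _ in range(200):
    loss_sub, grad = rd_loss_and_grad(x_sub, x_orig)
    x_sub -= 0.01 * grad                      # gradient step on the input

# The substitute compresses to a better R-D point than the original.
print(loss_orig, loss_sub)
```

Note that only the input is updated; the model weights stay frozen, which is why SNIC needs no data or retraining. Changing `lam` at crafting time is what would enable bit-rate control from a single model instance.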


Related research

04/26/2022 - Estimating the Resize Parameter in End-to-end Learned Image Compression
We describe a search-free resizing framework that can further improve th...

06/24/2021 - Rate Distortion Characteristic Modeling for Neural Image Compression
End-to-end optimization capability offers neural image compression (NIC)...

05/08/2020 - Lossy Compression with Distortion Constrained Optimization
When training end-to-end learned models for lossy compression, one has t...

09/25/2021 - Revisiting Pre-analysis Information Based Rate Control in x265
Due to the excellent compression and high real-time performance, x265 is...

03/29/2021 - Slimmable Compressive Autoencoders for Practical Neural Image Compression
Neural image compression leverages deep neural networks to outperform tr...

02/18/2020 - Variable-Bitrate Neural Compression via Bayesian Arithmetic Coding
Deep Bayesian latent variable models have enabled new approaches to both...

01/26/2023 - Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models
Lossy image compression aims to represent images in as few bits as possi...
