Learning Context-Based Non-local Entropy Modeling for Image Compression

05/10/2020
by   Mu Li, et al.

The entropy of the latent codes usually serves as the rate loss in recent learned lossy image compression methods, so precise estimation of their probability distribution plays a vital role in compression performance. However, existing deep-learning-based entropy modeling methods generally assume the latent codes are statistically independent or depend only on side information or a local context, failing to exploit the global similarity within the context and thus hindering accurate entropy estimation. To address this issue, we propose a non-local operation for context modeling that exploits the global similarity within the context. Specifically, we first introduce proxy similarity functions and spatial masks to handle the missing-reference problem in context modeling. We then combine the local and global context via a non-local attention block and employ it in masked convolutional networks for entropy modeling. The entropy model is further adopted as the rate loss in a joint rate-distortion optimization to guide the training of the analysis and synthesis transform networks in a transform coding framework. Since the width of the transforms is essential for training low-distortion models, we finally introduce a U-Net block in the transforms to increase the width with manageable memory consumption and time complexity. Experiments on the Kodak and Tecnick datasets demonstrate the superiority of the proposed context-based non-local attention block for entropy modeling and of the U-Net block for low-distortion compression against existing image compression standards and recent deep image compression models.
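The context modeling described above builds on masked convolutions: when predicting the distribution of each latent code, the network may only see codes that have already been decoded (those before the current position in raster-scan order). A minimal sketch of such a causal mask and a single masked 2-D convolution is below; this is an illustrative toy in NumPy, not the paper's implementation, and the uniform kernel weights stand in for learned ones.

```python
import numpy as np

def causal_mask(k: int, include_center: bool = False) -> np.ndarray:
    """k x k mask that is 1 at positions strictly before the center
    in raster-scan order (rows above, plus cells to the left in the
    center row), 0 elsewhere."""
    m = np.zeros((k, k))
    c = k // 2
    m[:c, :] = 1.0        # all rows above the center
    m[c, :c] = 1.0        # same row, columns left of the center
    if include_center:
        m[c, c] = 1.0
    return m

def masked_conv2d(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Same-size 2-D convolution of a single-channel map x with an
    already-masked kernel w, using zero padding at the borders."""
    k = w.shape[0]
    p = k // 2
    xp = np.pad(x, p)
    out = np.zeros_like(x, dtype=float)
    H, W = x.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + k, j:j + k] * w)
    return out

mask = causal_mask(3)
w = mask * np.ones((3, 3))   # toy uniform weights; a real model learns these
x = np.arange(16, dtype=float).reshape(4, 4)
y = masked_conv2d(x, w)
# y[0, 0] is 0: the top-left code has no decoded neighbors, which is
# exactly the missing-reference problem the proxy similarity functions
# and spatial masks in the paper are designed to handle.
```

Stacking such masked convolutions gives each position a growing local causal context; the paper's contribution is to augment this local context with a non-local attention block over globally similar (already-decoded) regions.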


Related research

- Efficient and Effective Context-Based Convolutional Entropy Modeling for Image Compression (06/24/2019)
- Non-local Attention Optimized Deep Image Compression (04/22/2019)
- Learned Image Compression with Mixed Transformer-CNN Architectures (03/27/2023)
- Multi-Context Dual Hyper-Prior Neural Image Compression (09/19/2023)
- Multi-Reference Entropy Model for Learned Image Compression (11/14/2022)
- A Cross Channel Context Model for Latents in Deep Image Compression (03/04/2021)
- Nonlinear Transform Coding (07/06/2020)
