Binary Probability Model for Learning Based Image Compression

02/21/2020
by Théo Ladune, et al.

In this paper, we propose to enhance learned image compression systems with a richer probability model for the latent variables. Previous works model the latents with a Gaussian or a Laplace distribution. Inspired by binary arithmetic coding, we propose to signal the latents with three binary values and one integer, each with its own probability model. A relaxation method is designed to perform gradient-based training. The richer probability model results in better entropy coding and therefore a lower rate. Experiments under the Challenge on Learned Image Compression (CLIC) test conditions demonstrate that this method achieves 18
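The abstract does not spell out how a latent value is split into three binary symbols and one integer. The sketch below is a minimal illustration, assuming a CABAC-style decomposition (zero flag, sign flag, greater-than-one flag, remaining magnitude); it is a hypothetical example, not the authors' implementation.

```python
# Hypothetical sketch: decompose a quantized latent into three binary symbols
# plus one integer, in the spirit of CABAC-style binarization. The exact
# decomposition used in the paper may differ; this is an assumption for
# illustration only.

def binarize(latent: int):
    """Split a quantized latent into (is_nonzero, sign, is_gt_one, remainder)."""
    is_nonzero = int(latent != 0)                   # binary symbol 1
    if not is_nonzero:
        return (0, 0, 0, 0)
    sign = int(latent < 0)                          # binary symbol 2
    magnitude = abs(latent)
    is_gt_one = int(magnitude > 1)                  # binary symbol 3
    remainder = magnitude - 2 if is_gt_one else 0   # integer symbol
    return (is_nonzero, sign, is_gt_one, remainder)


def debinarize(symbols) -> int:
    """Inverse mapping back to the quantized latent value."""
    is_nonzero, sign, is_gt_one, remainder = symbols
    if not is_nonzero:
        return 0
    magnitude = remainder + 2 if is_gt_one else 1
    return -magnitude if sign else magnitude


if __name__ == "__main__":
    for v in (-3, -1, 0, 1, 5):
        assert debinarize(binarize(v)) == v
```

Under this (assumed) decomposition, each binary symbol can be assigned its own probability model for entropy coding, while the integer remainder could be coded with a parametric distribution; the relaxation that makes such a decomposition usable in gradient-based training is described in the paper and not reproduced here.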

