Unified Multivariate Gaussian Mixture for Efficient Neural Image Compression

03/21/2022
by Xiaosu Zhu, et al.

Modeling latent variables with priors and hyperpriors is an essential problem in variational image compression. Formally, the trade-off between rate and distortion is handled well if the priors and hyperpriors precisely describe the latent variables. Current practices adopt only univariate priors and process each variable individually. However, we find that inter-correlations and intra-correlations exist when latent variables are observed from a vectorized perspective. These findings reveal visual redundancies that can improve rate-distortion performance, as well as parallel-processing opportunities that can speed up compression. This motivates us to propose a novel vectorized prior. Specifically, we propose a multivariate Gaussian mixture whose means and covariances are to be estimated. A novel probabilistic vector quantization is then utilized to effectively approximate the means, and the remaining covariances are further induced into a unified mixture and solved by cascaded estimation without involving context models. Furthermore, the codebooks involved in quantization are extended to multi-codebooks to reduce complexity, which formulates an efficient compression procedure. Extensive experiments on benchmark datasets against state-of-the-art methods indicate that our model achieves better rate-distortion performance and an impressive 3.18× compression speedup, giving us the ability to perform real-time, high-quality variational image compression in practice. Our source code is publicly available at <https://github.com/xiaosu-zhu/McQuic>.
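The core step the abstract describes, mapping latent vectors to nearest codewords in a learned codebook, can be sketched as plain nearest-neighbor vector quantization. This is a minimal illustration under assumed shapes and names (`vector_quantize`, `codebook`, `latents` are all hypothetical), not the paper's actual implementation, which uses a probabilistic variant and multiple codebooks.

```python
import numpy as np

def vector_quantize(latents, codebook):
    """Map each latent vector to its nearest codeword (hard assignment).

    latents:  (N, D) array of latent vectors
    codebook: (K, D) array of K learned codewords
    returns:  (N,) codeword indices and the (N, D) quantized vectors
    """
    # Squared Euclidean distance between every latent and every codeword,
    # computed via broadcasting: result has shape (N, K).
    dists = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = dists.argmin(axis=1)  # index of the nearest codeword per latent
    return indices, codebook[indices]

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))    # K = 8 codewords of dimension D = 4
latents = rng.normal(size=(16, 4))    # 16 latent vectors to quantize
idx, quantized = vector_quantize(latents, codebook)
print(idx.shape, quantized.shape)     # (16,) (16, 4)
```

Only the integer indices need to be entropy-coded; the decoder recovers the quantized vectors by looking them up in the shared codebook, which is what makes a vectorized prior amenable to parallel processing.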

Related research

05/25/2023  NVTC: Nonlinear Vector Transform Coding
In theory, vector quantization (VQ) is always better than scalar quantiz...

03/10/2023  Context-Based Trit-Plane Coding for Progressive Image Compression
Trit-plane coding enables deep progressive image compression, but it can...

09/14/2022  Lossy Image Compression with Conditional Diffusion Models
Denoising diffusion models have recently marked a milestone in high-qual...

06/16/2020  On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models
Thanks to the reparameterization trick, deep latent Gaussian models have...

08/27/2022  Lossy Image Compression with Quantized Hierarchical VAEs
Recent work has shown a strong theoretical connection between variationa...

02/15/2022  Post-Training Quantization for Cross-Platform Learned Image Compression
It has been witnessed that learned image compression has outperformed co...

03/03/2022  Region-of-Interest Based Neural Video Compression
Humans do not perceive all parts of a scene with the same resolution, bu...
