Frequency Disentangled Features in Neural Image Compression

08/04/2023
by Ali Zafari, et al.

The design of a neural image compression network is governed by how well the entropy model matches the true distribution of the latent code. Beyond model capacity, this ability is indirectly affected by how closely the relaxed quantization approximates the actual hard quantization. Optimizing the parameters of a rate-distortion variational autoencoder (R-D VAE) depends on this approximate quantization scheme. In this paper, we propose a feature-level frequency disentanglement that helps the relaxed scalar quantization achieve lower bit rates by guiding the high-entropy latent features to capture most of the low-frequency texture of the image. In addition, to strengthen the decorrelating power of the transformer-based analysis/synthesis transform, an augmented self-attention score based on the Hadamard product is used during both encoding and decoding. Channel-wise autoregressive entropy modeling benefits from the proposed frequency separation, as it inherently directs highly informative low-frequency channels to the first chunks and conditions later chunks on them. The proposed network outperforms not only hand-engineered codecs but also neural codecs built on computation-heavy spatially autoregressive entropy models.
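The abstract hinges on the gap between relaxed and hard quantization. A minimal NumPy sketch of the standard relaxation used in R-D VAEs (additive uniform noise at training time standing in for hard rounding at inference) is shown below; the function name and interface are illustrative, not the authors' implementation:

```python
import numpy as np

def quantize(y, training, rng=None):
    """Relaxed scalar quantization of a latent tensor y.

    During training, hard rounding is replaced by additive uniform noise
    in [-0.5, 0.5), which keeps the operation differentiable while
    matching the quantization error's support. At inference, actual
    rounding to the nearest integer is applied.
    """
    if training:
        rng = rng if rng is not None else np.random.default_rng()
        # Noise has the same support as the rounding error, so the
        # entropy model sees statistics close to the hard-quantized code.
        return y + rng.uniform(-0.5, 0.5, size=y.shape)
    return np.round(y)
```

For example, `quantize(np.array([0.2, 1.7]), training=False)` yields the integer code `[0., 2.]`, while the training-time output stays within 0.5 of the continuous latent. The closer the noisy statistics track the rounded ones, the tighter the rate estimate used to optimize the network.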

Related research

12/14/2021 · Modeling Image Quantization Tradeoffs for Optimal Compression
All Lossy compression algorithms employ similar compression schemes – fr...

07/17/2020 · Channel-wise Autoregressive Entropy Models for Learned Image Compression
In learning-based approaches to image compression, codecs are developed ...

05/25/2023 · NVTC: Nonlinear Vector Transform Coding
In theory, vector quantization (VQ) is always better than scalar quantiz...

12/14/2022 · Image Compression with Product Quantized Masked Image Modeling
Recent neural compression methods have been based on the popular hyperpr...

11/25/2022 · Homology-constrained vector quantization entropy regularizer
This paper describes an entropy regularization term for vector quantizat...

05/04/2023 · Catch Missing Details: Image Reconstruction with Frequency Augmented Variational Autoencoder
The popular VQ-VAE models reconstruct images through learning a discrete...

04/07/2021 · Learned transform compression with optimized entropy encoding
We consider the problem of learned transform compression where we learn ...
