Homology-constrained vector quantization entropy regularizer

11/25/2022
by Ivan Volkov, et al.

This paper describes an entropy regularization term for vector quantization (VQ) based on the persistent homology of the VQ embeddings. Higher embedding entropy correlates positively with higher codebook utilization, mitigating overfitting toward the identity mapping and codebook collapse in VQ-based autoencoders [1]. We show that homology-constrained regularization is an effective way to increase the entropy of the VQ process (bringing it closer to the input entropy) while preserving the approximate topology of the quantized latent space, averaged over mini-batches. This work further explores patterns in the persistent homology diagrams of latents formed by vector quantization. We implement and test the proposed algorithm as a module integrated into a sample VQ-VAE. The linked code repository provides a working implementation of the proposed architecture, referred to below as homology-constrained vector quantization (HC-VQ).
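To make the entropy-utilization link concrete, here is a minimal sketch of how the entropy of codebook usage can be measured over a mini-batch. This is a hypothetical helper, not the paper's HC-VQ regularizer: the actual method additionally constrains persistent homology, which is not reproduced here. Maximizing this entropy (e.g. by adding its negative to the VQ loss) is one standard way to encourage higher codebook utilization and counter codebook collapse.

```python
import numpy as np

def codebook_usage_entropy(indices, codebook_size):
    """Entropy (in nats) of the empirical codebook-usage distribution.

    `indices` holds the code indices assigned to a mini-batch by the
    quantizer; `codebook_size` is the number of codebook entries.
    A collapsed codebook (one code used) gives entropy 0; uniform
    usage over K codes gives log(K), the maximum.
    """
    counts = np.bincount(np.asarray(indices).ravel(), minlength=codebook_size)
    probs = counts / counts.sum()
    # Entropy H(p) = -sum p log p, treating 0 * log 0 as 0.
    nz = probs[probs > 0]
    return float(-(nz * np.log(nz)).sum())

# Collapsed usage: every input mapped to code 0.
collapsed = codebook_usage_entropy([0, 0, 0, 0], codebook_size=8)

# Uniform usage over 4 codes: entropy log(4).
uniform = codebook_usage_entropy([0, 1, 2, 3], codebook_size=4)
```

A regularizer of the form `loss += -weight * codebook_usage_entropy(...)` (made differentiable in practice via soft assignments) would push the quantizer toward broader code usage; HC-VQ's contribution is doing this while constraining the latent topology via persistent homology.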


