Lossless Compression with Probabilistic Circuits

11/23/2021 ∙ by Anji Liu, et al.

Despite extensive progress on image generation, deep generative models are suboptimal when applied to lossless compression. For example, models such as VAEs suffer from a compression cost overhead due to their latent variables, which can only be partially eliminated with elaborate schemes such as bits-back coding, often resulting in poor single-sample compression rates. To overcome such problems, we establish a new class of tractable lossless compression models that permit efficient encoding and decoding: Probabilistic Circuits (PCs). These are a class of neural networks involving |p| computational units that support efficient marginalization over arbitrary subsets of the D feature dimensions, enabling efficient arithmetic coding. We derive efficient encoding and decoding schemes that both have time complexity 𝒪(log(D) · |p|), whereas a naive scheme would have costs linear in D and |p|, making the approach highly scalable. Empirically, our PC-based (de)compression algorithm runs 5-20x faster than neural compression algorithms that achieve similar bitrates. By scaling up the traditional PC structure learning pipeline, we achieve state-of-the-art results on image datasets such as MNIST. Furthermore, PCs can be naturally integrated with existing neural compression algorithms to improve the performance of these base models on natural image datasets. Our results highlight the potential impact that non-standard learning architectures may have on neural data compression.
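
To make the coding scheme concrete, here is a minimal Python sketch, not the authors' implementation: a toy PC (a mixture of product-of-Bernoulli distributions, i.e., one sum node over product nodes over Bernoulli leaves) whose marginal queries supply the autoregressive conditionals p(x_i | x_<i) that an arithmetic coder needs. All names here (`MixtureOfBernoullisPC`, `encode`, `decode`) are illustrative assumptions, and the sketch deliberately uses the naive scheme that reruns a marginal query per dimension (cost linear in D and |p|); the paper's 𝒪(log(D) · |p|) amortization is exactly the part this toy omits.

```python
from fractions import Fraction
import math


class MixtureOfBernoullisPC:
    """A tiny PC: one sum node over product nodes over Bernoulli leaves."""

    def __init__(self, weights, leaf_probs):
        self.weights = [Fraction(w) for w in weights]  # mixture weights, sum to 1
        self.leaf_probs = [[Fraction(p) for p in comp] for comp in leaf_probs]

    def marginal(self, evidence):
        """Exact p(evidence); unset variables are marginalized out.

        evidence: dict mapping dimension index -> bit. A marginalized
        Bernoulli leaf integrates to 1, which is what makes this query
        tractable in a smooth, decomposable circuit.
        """
        total = Fraction(0)
        for w, probs in zip(self.weights, self.leaf_probs):
            term = w
            for i, b in evidence.items():
                term *= probs[i] if b else 1 - probs[i]
            total += term
        return total


def encode(pc, bits):
    """Shannon-Fano-Elias-style arithmetic encoding driven by marginals."""
    low, width = Fraction(0), Fraction(1)
    evidence = {}
    for i, b in enumerate(bits):
        # p(x_i = 0 | x_<i) from two marginal queries (the naive scheme;
        # the paper amortizes these queries to O(log(D) * |p|) overall).
        p0 = pc.marginal({**evidence, i: 0}) / pc.marginal(evidence)
        if b == 0:
            width *= p0
        else:
            low, width = low + width * p0, width * (1 - p0)
        evidence[i] = b
    # Emit enough bits to pin down a dyadic point inside [low, low + width).
    n = math.ceil(-math.log2(float(width))) + 1
    return format(math.ceil(low * 2**n), f"0{n}b")


def decode(pc, codeword, num_dims):
    """Invert encode() by re-deriving the same conditionals."""
    v = Fraction(int(codeword, 2), 2 ** len(codeword))
    low, width, evidence = Fraction(0), Fraction(1), {}
    for i in range(num_dims):
        p0 = pc.marginal({**evidence, i: 0}) / pc.marginal(evidence)
        if v < low + width * p0:
            evidence[i] = 0
            width *= p0
        else:
            evidence[i] = 1
            low, width = low + width * p0, width * (1 - p0)
    return [evidence[i] for i in range(num_dims)]
```

Exact arithmetic via `fractions.Fraction` makes the round trip verifiable:

```python
pc = MixtureOfBernoullisPC(
    weights=["3/5", "2/5"],
    leaf_probs=[["9/10", "1/10", "8/10", "2/10"],
                ["1/10", "9/10", "3/10", "7/10"]],
)
msg = [1, 0, 1, 1]
code = encode(pc, msg)                    # a short bitstring such as '0110...'
assert decode(pc, code, len(msg)) == msg  # lossless round trip
```

The codeword length is at most -log2 p(x) + 2 bits, the standard arithmetic-coding guarantee, so a better-fitting circuit directly yields a shorter code.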

