A Universal Lossless Compression Method Applicable to Sparse Graphs and Heavy-Tailed Sparse Graphs

07/18/2021
by Payam Delgosha, et al.

Graphical data arises naturally in several modern applications, including but not limited to internet graphs, social networks, genomics and proteomics. The typically large size of graphical data argues for the importance of designing universal compression methods for such data. In most applications, the graphical data is sparse, meaning that the number of edges in the graph scales more slowly than n^2, where n denotes the number of vertices. Although in some applications the number of edges scales linearly with n, in others the number of edges is much smaller than n^2 but appears to scale superlinearly with n. We call the former sparse graphs and the latter heavy-tailed sparse graphs. In this paper we introduce a universal lossless compression method which is simultaneously applicable to both classes. We do this by employing the local weak convergence framework for sparse graphs and the sparse graphon framework for heavy-tailed sparse graphs.
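To make the two regimes concrete, below is a minimal Python sketch. It is illustrative only, not the paper's compression method: it merely contrasts expected edge counts under an Erdős-Rényi model G(n, c/n), a standard stand-in for the sparse regime, and under a constant-density sparse-graphon-style model with edge probability n^(alpha - 1), a stand-in for the heavy-tailed sparse regime. Both model choices and the parameters c and alpha are assumptions made for illustration.

def expected_edges_sparse(n, c=3.0):
    # Erdos-Renyi G(n, c/n): each of the n(n-1)/2 vertex pairs is an edge
    # with probability c/n, so E[#edges] = c(n-1)/2 -- linear in n.
    return (c / n) * n * (n - 1) / 2

def expected_edges_heavy_tailed(n, alpha=0.5):
    # Illustrative constant sparse-graphon model (an assumption, not the
    # paper's construction): every pair is an edge with probability
    # rho_n = n^(alpha - 1), 0 < alpha < 1, so E[#edges] ~ n^(1 + alpha) / 2:
    # superlinear in n, yet vanishing relative to n^2.
    rho_n = n ** (alpha - 1)
    return rho_n * n * (n - 1) / 2

for n in (10**3, 10**4, 10**5):
    e_lin = expected_edges_sparse(n)
    e_ht = expected_edges_heavy_tailed(n)
    print(f"n={n:>6}  sparse: edges/n = {e_lin / n:5.2f}   "
          f"heavy-tailed: edges/n = {e_ht / n:8.1f}, edges/n^2 = {e_ht / n**2:.5f}")

Running this shows edges/n staying near c/2 in the first model, while in the second model edges/n grows like n^alpha / 2 even as edges/n^2 shrinks toward zero: exactly the superlinear-but-subquadratic scaling the abstract attributes to heavy-tailed sparse graphs.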
