Regular decomposition of large graphs and other structures: scalability and robustness towards missing data

11/23/2017
by Hannu Reittu, et al.

A method for compressing large graphs and matrices into a block structure is further developed. Szemerédi's regularity lemma serves as a generic motivation for the significance of stochastic block models. Another ingredient of the method is Rissanen's minimum description length (MDL) principle. We continue our previous work on the subject, considering cases of missing data and the scaling of algorithms to extremely large graphs. In this way it becomes possible to uncover the large-scale structure of a huge graph of a certain type using only a tiny fraction of the graph's information, and to obtain a compact representation of such graphs that is useful in computation and visualization.
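The idea of fitting a block structure by minimizing description length can be illustrated with a minimal sketch. This is not the authors' implementation: the two-block model, the greedy single-node moves, and the simplified Shannon-code cost (bits for the edges at each block pair's empirical density) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-block stochastic block model (all parameters are illustrative).
n, k = 40, 2
labels_true = np.repeat([0, 1], n // 2)
p_in, p_out = 0.6, 0.1
P = np.where(labels_true[:, None] == labels_true[None, :], p_in, p_out)
A = (rng.random((n, n)) < P).astype(int)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency matrix, no self-loops

def block_cost(A, labels, k):
    """Code length (bits) of the edges given the partition: each block pair
    is encoded with a Shannon code at its empirical edge density."""
    cost = 0.0
    for a in range(k):
        for b in range(a, k):
            ia, ib = labels == a, labels == b
            if a == b:
                na = int(ia.sum())
                m = na * (na - 1) // 2
                e = int(np.triu(A[np.ix_(ia, ia)], 1).sum())
            else:
                sub = A[np.ix_(ia, ib)]
                m = sub.size
                e = int(sub.sum())
            if m == 0:
                continue
            p = min(max(e / m, 1e-9), 1 - 1e-9)
            cost -= e * np.log2(p) + (m - e) * np.log2(1 - p)
    return cost

# Greedy refinement in the spirit of regular decomposition: reassign one
# node at a time to the block giving the shortest description, until no
# single move shortens the code.
labels = rng.integers(0, k, size=n)
init_cost = block_cost(A, labels, k)
improved = True
while improved:
    improved = False
    for v in range(n):
        orig = labels[v]
        costs = []
        for a in range(k):
            labels[v] = a
            costs.append(block_cost(A, labels, k))
        best = int(np.argmin(costs))
        if costs[best] < costs[orig] - 1e-9:
            labels[v] = best
            improved = True
        else:
            labels[v] = orig

print(init_cost, block_cost(A, labels, k))
```

Each accepted move strictly lowers the total code length, so the loop terminates; in the paper's setting the same cost would be evaluated on a sampled submatrix, which is how only a tiny fraction of the graph is needed.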
