Random Processes with High Variance Produce Scale Free Networks

02/02/2022
by Josh Johnston, et al.

Real-world networks tend to be scale free, having heavy-tailed degree distributions with more hubs than predicted by classical random graph generation methods. Preferential attachment and growth are the most commonly accepted mechanisms leading to these networks and are incorporated in the Barabási-Albert (BA) model. We provide an alternative model using a randomly stopped linking process inspired by a generalized Central Limit Theorem (CLT) for geometric distributions with widely varying parameters. The common characteristic of both the BA model and our randomly stopped linking model is the mixture of widely varying geometric distributions, suggesting the critical characteristic of scale free networks is high variance, not growth or preferential attachment. The limitation of classical random graph models is low variance in parameters, while scale free networks are the natural, expected result of real-world variance.
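The abstract's core claim is that mixing geometric distributions whose parameters vary widely produces a heavy tail, whereas a single low-variance geometric does not. The paper's exact construction is in the full text; the following is only a minimal illustrative sketch, in which the per-node success probabilities are drawn log-uniformly (an assumed spread chosen for illustration, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Baseline: every node draws its degree from the same geometric
# distribution (low parameter variance) -> light, exponential tail.
fixed = rng.geometric(p=0.1, size=n)

# Mixture: each node gets its own success probability p, spread over
# three orders of magnitude (high parameter variance).  The log-uniform
# choice here is an illustrative assumption.
p_varied = 10.0 ** rng.uniform(-3, 0, size=n)  # p in [0.001, 1]
mixed = rng.geometric(p=p_varied)

# Compare tail weight: fraction of nodes whose degree exceeds a cutoff.
cut = 50
print(f"P(deg > {cut}), fixed p : {np.mean(fixed > cut):.4f}")
print(f"P(deg > {cut}), mixture : {np.mean(mixed > cut):.4f}")
print(f"max degree, fixed p : {fixed.max()}")
print(f"max degree, mixture : {mixed.max()}")
```

The mixture's tail probability and maximum degree are far larger than the fixed-parameter baseline's, consistent with the abstract's point that high variance in parameters, rather than growth or preferential attachment per se, is what drives the heavy tail.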


