On the Distribution of Random Geometric Graphs

01/15/2018 · Mihai-Alin Badiu, et al. · University of Oxford, Aalborg University

Random geometric graphs (RGGs) are commonly used to model networked systems that depend on the underlying spatial embedding. We concern ourselves with the probability distribution of an RGG, which is crucial for studying its random topology, properties (e.g., connectedness), or Shannon entropy as a measure of the graph's topological uncertainty (or information content). Moreover, the distribution is also relevant for determining average network performance or designing protocols. However, a major impediment in deducing the graph distribution is that it requires the joint probability distribution of the n(n-1)/2 distances between n nodes randomly distributed in a bounded domain. As no such result exists in the literature, we make progress by obtaining the joint distribution of the distances between three nodes confined in a disk in R^2. This enables the calculation of the probability distribution and entropy of a three-node graph. For arbitrary n, we derive a series of upper bounds on the graph entropy; in particular, the bound involving the entropy of a three-node graph is tighter than the existing bound which assumes distances are independent. Finally, we provide numerical results on graph connectedness and the tightness of the derived entropy bounds.


I Introduction

Uncertainty is pervasive in modern wireless networks. The sources of this uncertainty range from the humans that interact with the networks and the locations of the nodes in space down to the transmission protocols and the underlying scattering processes that affect signal propagation. To date, some progress has been made towards characterizing the structural uncertainty of wireless networks by modeling these networks as random geometric graphs (RGGs) where the probability that two particular nodes are connected is a function of the distance between them [1, 2, 3]. RGGs with probabilistic pair connection functions are known in the mathematics community as soft RGGs [4]. Work on these graphs has mostly been focused on analyzing their percolation (in an infinite domain) or connectivity (in a finite domain) properties [5, 6, 7]. In the case of finite (but dense) graphs, this sort of investigation typically amounts to obtaining an understanding of the probability that a single isolated node exists.

Ideally, one would like to obtain information about the complete distribution of the graphs in the ensemble. This information would enable us to study not only connectivity, but also important features such as topological structure and complexity through the lens of graph entropy [8]. Applications of entropy-based methods to the study of networked systems are abundant and include problems related to molecular structure classification [9], social networks [10, 11], data compression [12], and quantum entanglement [13, 14]. Graph entropy has also been invoked in the study of communication networks to quantify node and route stability [15] with the aim of improving link prediction [16] and routing protocols [17, 18]. Topological uncertainty in dynamic mobile ad hoc networks was investigated in [19] from a network layer perspective, and [20] treated self-organisation in networks using a basic graph entropy framework. More recently, an analytical approach for studying topological uncertainty in wireless networks was proposed in [21, 22, 23].

In this paper, we study the probability distribution of the RGG formed by nodes randomly distributed in a bounded domain. The joint distribution of all inter-node distances is central to the distribution of the RGG. Finding distance distributions is a very challenging task in probabilistic geometry, as it often leads to intractable definite integrals; the existing literature focuses on the distance between two nodes or the distances between a node and its neighbours (e.g., see [24, 25, 26, 27]). We derive the joint distribution of the inter-node distances in closed form for three nodes confined in a disk in R^2; to our knowledge, this is the first time such a result has been obtained. We avoid intractable integrations by using a conditioning technique and expect that the same approach could be used for larger numbers of nodes. Also, for arbitrary n, we derive a series of upper bounds on the graph entropy; in particular, the bound involving the entropy of a three-node graph is tighter than the existing bound, which assumes the distances are independent. Finally, we provide numerical results on graph connectedness and the tightness of the derived entropy bounds.

II Random Geometric Graph

II-A Model

Consider a set of n nodes that are randomly located in a space A ⊂ R^d of finite volume and diameter l_0 (the largest distance between two points of A). We assume that the locations X_1, ..., X_n of the nodes are independently and uniformly distributed in A. The existence of an (undirected) edge between nodes i and j depends on the Euclidean distance between the two nodes and is indicated by the binary random variable E_ij being one. Specifically, given the node locations, the variables E_ij are independent and each edge exists with probability

(1)   \Pr(E_{ij} = 1 \mid X_i = x_i, X_j = x_j) = g(\lVert x_i - x_j \rVert),

where g : [0, l_0] → [0, 1] is the pair connection function. For example, in the hard disk model, g is an indicator function that equals one when its argument is less than r_0 and zero otherwise, where r_0 denotes the maximum connection range. We define the binary vector E to include all edge variables, i.e., E = (E_ij)_{i<j} ∈ {0,1}^{n(n-1)/2}. The random geometric graph G_n with edge set {(i,j) : E_ij = 1} is distributed in the set of all 2^{n(n-1)/2} possible graphs on n nodes.
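To make the model concrete, here is a minimal simulation sketch of the hard disk model just described: n points are drawn uniformly in a disk of diameter l_0 and an edge is placed whenever a pair distance is below r_0. This code is illustrative and not from the paper; all function names and parameter values are ours.

```python
import numpy as np

def sample_uniform_disk(n, l0, rng):
    """Draw n points independently and uniformly from a disk of diameter l0."""
    radius = (l0 / 2.0) * np.sqrt(rng.uniform(size=n))  # sqrt gives a uniform area density
    angle = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.column_stack((radius * np.cos(angle), radius * np.sin(angle)))

def sample_hard_disk_rgg(n, l0, r0, rng):
    """Return one realization of the edge vector E = (E_ij)_{i<j} under the hard disk model."""
    x = sample_uniform_disk(n, l0, rng)
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)  # pairwise distance matrix
    iu = np.triu_indices(n, k=1)                                # pair indices with i < j
    return (d[iu] < r0).astype(int)

rng = np.random.default_rng(0)
print(sample_hard_disk_rgg(n=5, l0=1.0, r0=0.4, rng=rng))  # 10 edge indicators for n = 5
```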

II-B Probability Distribution and Entropy

The distribution of G_n is determined by both the distribution of the node locations and the probabilistic connection model specified by g. The graph G_n is uniquely determined by E, which has a multivariate Bernoulli distribution. Therefore, we study the pmf p(e) = Pr(E = e), for each e ∈ {0,1}^{n(n-1)/2}. Since the conditional probability of edge existence depends on distance, it is more convenient to work with inter-node distances instead of node locations. Let the random vector collecting the n(n-1)/2 pair distances D_ij = ||X_i − X_j||, i < j, have pdf f. We now write

(2)   p(e) = \int_{\mathcal{D}} \prod_{i<j} g(d_{ij})^{e_{ij}} \bigl[1 - g(d_{ij})\bigr]^{1 - e_{ij}} f(\mathbf{d}) \, \mathrm{d}\mathbf{d},

where the integration domain is \mathcal{D} = [0, l_0]^{n(n-1)/2}. The distribution of E is symmetric, since the node locations are identically distributed and the pair connection function is the same for all edges. The topological uncertainty (or information content) of G_n can be quantified by the Shannon entropy, i.e.,

(3)   H(G_n) = H(E) = -\sum_{e \in \{0,1\}^{n(n-1)/2}} p(e) \log p(e).

It is clear from (2) that the joint pdf of inter-node distances is highly important for the graph distribution and its entropy. For n = 2, the sought pdf reduces to the pdf of the distance between two nodes, which has been extensively studied for various shapes of the embedding space (e.g., see [24, 25, 26, 27]). Obtaining the joint pdf analytically for n ≥ 3 is very challenging and no such results have been reported previously. In the next section, we make progress by obtaining the joint pdf for n = 3 in closed form by using a conditioning technique. This enables the calculation of the pmf (2) and entropy (3) for n = 3, which can then be used to bound the graph entropy when n > 3, as shown in Sec. IV.
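As a complement to (2) and (3), the pmf and entropy of a small hard-disk RGG can also be approximated by straightforward Monte Carlo sampling of node locations. The sketch below is ours (the paper instead uses the closed-form joint pdf (5) and numerical integration); names and parameter values are illustrative.

```python
import numpy as np
from collections import Counter

def estimate_pmf_and_entropy(n, l0, r0, trials=100_000, seed=0):
    """Monte Carlo estimate of the graph pmf p(e), cf. (2), and the entropy (3) in bits."""
    rng = np.random.default_rng(seed)
    counts = Counter()
    iu = np.triu_indices(n, k=1)
    for _ in range(trials):
        radius = (l0 / 2.0) * np.sqrt(rng.uniform(size=n))
        angle = rng.uniform(0.0, 2.0 * np.pi, size=n)
        x = np.column_stack((radius * np.cos(angle), radius * np.sin(angle)))
        d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        counts[tuple((d[iu] < r0).astype(int))] += 1   # count each realized edge vector e
    pmf = {e: c / trials for e, c in counts.items()}
    entropy = -sum(p * np.log2(p) for p in pmf.values())
    return pmf, entropy

pmf, H3 = estimate_pmf_and_entropy(n=3, l0=1.0, r0=0.4)
print(H3)  # Monte Carlo estimate of H(G_3) in bits
```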

III Joint PDF of Inter-Node Distances for n = 3

We consider n = 3 and assume that A is a disk of diameter l_0 in R^2. Even though the locations of the three nodes are independently and uniformly distributed, determining the joint pdf of the three distances by direct integration is very difficult. For example, one could attempt to transform the Cartesian coordinates (i.e., six variables) to other coordinates that include the three distances, apply the transformation theorem and integrate out the redundant coordinates. However, this leads to complicated definite integrals, because the triangle inequalities and the condition that the points lie inside the disk need to be enforced.

Computing integrals over complicated regions is often required in probabilistic geometry. Crofton’s technique [28] has proven useful for simplifying such evaluations in many problems, such as finding the distribution of the distance between two random points [29]. The work [30] shows that Crofton’s method is essentially equivalent to the technique of computing expectations by conditioning. We use the latter in the following.

Our approach is to compute the joint pdf conditioned on an additional (suitably chosen) random variable, which is easier than the original problem. Then, we obtain the desired joint pdf by taking the expectation of the conditional pdf over the density of the additional variable. We expect that this approach is also useful for n > 3.

Before presenting the result, we fix some notation. For a triangle with side lengths d_1, d_2 and d_3, let c(d_1, d_2, d_3) be the diameter of its circumscribed circle, i.e.,

(4)   c(d_1, d_2, d_3) = \frac{d_1 d_2 d_3}{2\sqrt{s(s - d_1)(s - d_2)(s - d_3)}},

where s = (d_1 + d_2 + d_3)/2; note that c(d_1, d_2, d_3) being real and finite is equivalent to d_1, d_2, d_3 satisfying the triangle inequalities. We denote the largest side length by d_max = max{d_1, d_2, d_3}. We also use an auxiliary function in the statement of (5).
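As a quick sanity check of (4), the helper below (ours, purely illustrative) computes the circumscribed diameter via Heron's formula and reports whether given side lengths form a valid triangle and whether the triangle is obtuse.

```python
import math

def circumscribed_diameter(d1, d2, d3):
    """Diameter of the circumscribed circle, eq. (4); returns inf when the
    triangle inequalities are violated (non-positive squared area)."""
    s = 0.5 * (d1 + d2 + d3)                       # semi-perimeter
    area_sq = s * (s - d1) * (s - d2) * (s - d3)   # Heron's formula (squared area)
    if area_sq <= 0.0:
        return math.inf
    return d1 * d2 * d3 / (2.0 * math.sqrt(area_sq))

def is_obtuse(d1, d2, d3):
    """True if the largest angle of the triangle exceeds pi/2."""
    a, b, c = sorted((d1, d2, d3))
    return c * c > a * a + b * b

print(circumscribed_diameter(3.0, 4.0, 5.0))  # 5.0: right triangle, circumdiameter = hypotenuse
print(is_obtuse(2.0, 2.0, 3.0))               # True
```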

Proposition 1

Assume three points are independently and uniformly distributed inside a circle of diameter l_0 and let D_1, D_2 and D_3 be the side lengths of the random triangle determined by the points. Then, for all d_1, d_2, d_3 > 0 that satisfy the triangle inequalities and d_max ≤ l_0, the joint pdf of the side lengths is given by eq. (5) below. The pdf depends on whether the realized triangle is obtuse or acute, and on whether the diameter (4) of its circumscribed circle is larger or smaller than l_0.

Proof:

An outline of the proof is given in the appendix.

(5)

IV Bounding the Graph Entropy

In [21, 22], the upper bound H(G_n) ≤ \binom{n}{2} H(G_2) is obtained for any n by assuming that the edge variables E_ij are independent (or, equivalently, that the pair distances are independent). While such an upper bound is simple and amenable to further analysis, it may not always be sufficiently tight. We set out to find tighter upper bounds that preserve some of the dependence between pair distances. First, we establish the following result.

Proposition 2

For any m and n such that 2 ≤ m < n, the entropies of G_n and G_m are related by

(6)   H(G_n) \le \frac{\binom{n}{2}}{\binom{m}{2}} H(G_m).
Proof:

The entropy of G_n is given by the entropy of the binary variables in E, see (3). Our intention is to relate H(E) to the entropy of RGGs with a smaller number of nodes. Specifically, for m < n, we consider all the subsets of {1, ..., n} that have m nodes. Let S_i be the i-th such subset, i = 1, ..., \binom{n}{m}. The set of pair indices corresponding to S_i is denoted by P_i = {(k, l) : k, l ∈ S_i, k < l}. We further define the set P = {P_1, ..., P_{\binom{n}{m}}} collecting all the sets of pair indices. In this construction, each pair index (k, l) with k < l appears in \binom{n-2}{m-2} members of P. According to Shearer’s inequality, which is a generalization of the subadditivity of joint entropy [31, 32], we have

(7)   \binom{n-2}{m-2} H(E) \le \sum_{i=1}^{\binom{n}{m}} H(E_{P_i}),

where E_{P_i} = (E_{kl})_{(k,l) ∈ P_i}. Each term in the r.h.s. of (7) is the entropy of a graph with m nodes; by invoking the system’s symmetry, all terms are equal to H(G_m), such that

(8)   H(G_n) \le \frac{\binom{n}{m}}{\binom{n-2}{m-2}} H(G_m),

and (6) follows immediately, since \binom{n}{m}/\binom{n-2}{m-2} = \binom{n}{2}/\binom{m}{2}.
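The final step rests on the combinatorial identity \binom{n}{m}/\binom{n-2}{m-2} = \binom{n}{2}/\binom{m}{2}, which the tiny check below (ours, purely illustrative) verifies numerically for small n.

```python
from math import comb  # Python 3.8+

for n in range(3, 12):
    for m in range(2, n):
        lhs = comb(n, m) / comb(n - 2, m - 2)
        rhs = comb(n, 2) / comb(m, 2)
        assert abs(lhs - rhs) < 1e-9, (n, m)
print("binom(n,m)/binom(n-2,m-2) == binom(n,2)/binom(m,2) verified for 2 <= m < n < 12")
```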

The following corollary gives a series of successively tighter upper bounds on H(G_n), for all n ≥ 3.

Corollary 1

The normalized (i.e., per edge) entropy decreases with the number of nodes, i.e.,

(9)   \frac{H(G_n)}{\binom{n}{2}} \le \frac{H(G_{n-1})}{\binom{n-1}{2}} \le \cdots \le \frac{H(G_3)}{3} \le H(G_2).
Proof:

We immediately obtain (9) by successively applying (6) for consecutive integers.

V Numerical Experiments

In the following we assume that the random nodes are confined in a disk with diameter l_0 = 1; any two nodes are connected by an edge if and only if the distance between them is less than r_0 (i.e., the hard disk model).

We first take an example from ad hoc communications, where it is relevant to know the conditions under which any two nodes of the network can communicate. If multi-hop communication is possible, this is equivalent to the requirement that the graph be connected; otherwise, the graph needs to be complete. We consider a three-node graph and evaluate Pr(G_3 is connected) and Pr(G_3 is complete) as functions of r_0 (which can be thought of as being monotonically related to the transmit power). We compute the pmf (2) by using the derived joint pdf (5) and numerical integration. The results in Fig. 1 show that two-hop relaying significantly improves the probability that any two of the three nodes can communicate.

Fig. 1: Probability of connectedness and probability of completeness for an RGG with n = 3 nodes and maximum connection range r_0; the three nodes are randomly located inside a circle with diameter one.
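For three nodes, connectedness and completeness reduce to simple conditions on the edge vector: the graph is connected if and only if it has at least two edges, and complete if and only if it has all three. The sketch below (ours, not the numerical-integration procedure behind Fig. 1) computes both probabilities from any pmf over the eight possible edge vectors; the pmf used here is a toy independent-edge example, not the one derived from (5).

```python
from itertools import product

def pr_connected_and_complete(pmf):
    """Given a pmf over 3-node edge vectors (e_12, e_13, e_23), e.g. estimated as in the
    Sec. II-B sketch, return (Pr(G_3 connected), Pr(G_3 complete))."""
    pr_conn = sum(p for e, p in pmf.items() if sum(e) >= 2)  # two edges already connect three nodes
    pr_comp = sum(p for e, p in pmf.items() if sum(e) == 3)
    return pr_conn, pr_comp

# Toy pmf in which the three edges are independent with probability q (illustration only).
q = 0.4
toy_pmf = {e: q ** sum(e) * (1 - q) ** (3 - sum(e)) for e in product((0, 1), repeat=3)}
print(pr_connected_and_complete(toy_pmf))
```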

We now study the entropy bounds derived in Sec. IV. We consider n = 5 nodes and compute H(G_5) using Monte Carlo simulation. From (9), we have H(G_5) ≤ (10/3) H(G_3) ≤ 10 H(G_2). We use the derived joint pdf (5) to compute the pmf (2), which then gives H(G_3). We similarly obtain H(G_2) based on the pdf of the distance between two points inside a circle [24]. Fig. 2 shows that H(G_5) approaches zero when r_0 → 0 or r_0 → 1 (i.e., when the RGG becomes deterministically empty or complete, respectively). The entropy is significant at intermediate values of r_0 and always less than 10 bits, which is the entropy of a five-node graph whose potential edges exist independently with probability 1/2. We can also observe that the bound based on H(G_3) provides an improvement over the one obtained by assuming the inter-node distances are independent.

Fig. 2: Entropy of an RGG with n = 5 nodes and maximum connection range r_0, together with upper bounds; the five nodes are randomly located inside a circle with diameter one.

VI Conclusion

In this paper, we studied the distribution of a random geometric graph and its entropy. The distribution provides insights into properties of the random graph, such as topological structure or connectivity, while the entropy is useful for understanding topological complexity. We showed that the normalized (per edge) entropy decreases with the number of nodes. This result gave a series of upper bounds on the entropy, each bound involving the entropy of a graph with a smaller number of nodes. We pointed out the importance of the joint distribution of pair distances in determining the graph’s distribution and its entropy, and the lack of such results in the literature. We made progress by deriving the joint distribution of the distances between three nodes confined in a disk, and expect that the approach we used could be applied to a larger number of nodes.

Appendix

Let O be the center of the disk A of diameter l_0. We denote by B_i the minimum diameter of a disk centred at O that includes the i-th point and define B = max{B_1, B_2, B_3}. We write

(10)   f(d_1, d_2, d_3) = \int_0^{l_0} f(d_1, d_2, d_3 \mid B = b)\, f_B(b)\, \mathrm{d}b.

Conditioning on B = b is very convenient because, in the computation of f(d_1, d_2, d_3 | B = b), one of the three points is on the circle C_b of center O and diameter b, while the other two points are inside C_b; this is a great simplification. The density f_B is obtained as follows: we have Pr(B_i ≤ b) = (b/l_0)^2, for each i ∈ {1, 2, 3}; therefore, Pr(B ≤ b) = (b/l_0)^6, which gives the pdf f_B(b) = 6 b^5 / l_0^6, 0 ≤ b ≤ l_0.
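Since the density f_B above is part of our reconstruction, a quick simulation check (ours, purely illustrative) compares the empirical cdf of B with (b/l_0)^6.

```python
import numpy as np

l0, trials = 1.0, 200_000
rng = np.random.default_rng(0)
# B_i = 2 * (distance of point i from the centre); the radius of a uniform point is (l0/2)*sqrt(U)
b_i = l0 * np.sqrt(rng.uniform(size=(trials, 3)))
b_max = b_i.max(axis=1)

for b in (0.3, 0.6, 0.9):
    print(b, np.mean(b_max <= b), (b / l0) ** 6)  # empirical cdf vs (b/l0)^6
```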

To compute f(d_1, d_2, d_3 | B = b), we study the “number of ways” in which one can fit a triangle of side lengths d_1, d_2 and d_3 inside C_b when one of the triangle’s vertices is fixed on the circle. The side lengths must satisfy the triangle inequalities, which is equivalent to c(d_1, d_2, d_3) being real and finite. It is also required that the triangle fits inside C_b.


Fig. 3: Illustration of the circle C_b of center O and diameter b; the point X_3 is on the circle, while X_1 and X_2 are inside C_b.

In Fig. 3, the point X_3 represents node 3. Assuming X_3 is on C_b, we have

(11)

where the superscript indicates conditioning on node 3 being on C_b, and D_i' = ||X_i − X_3|| for i ∈ {1, 2}. For each i ∈ {1, 2}, the pdf of D_i' is [24]

(12)   f_{D_i'}(r \mid b) = \frac{8r}{\pi b^2} \arccos\!\left(\frac{r}{b}\right), \qquad 0 \le r \le b.
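Equation (12) as written here is reconstructed from the stated geometry (the distance from a point on a circle of diameter b to a point uniformly distributed inside it). The sketch below (ours) checks it against a Monte Carlo histogram.

```python
import numpy as np

b, trials = 0.8, 200_000
rng = np.random.default_rng(0)
# One point fixed on the circle of diameter b (centred at the origin), one uniform inside it.
p_on = np.array([b / 2.0, 0.0])
radius = (b / 2.0) * np.sqrt(rng.uniform(size=trials))
angle = rng.uniform(0.0, 2.0 * np.pi, size=trials)
inside = np.column_stack((radius * np.cos(angle), radius * np.sin(angle)))
dist = np.linalg.norm(inside - p_on, axis=1)

pdf = lambda r: 8.0 * r * np.arccos(r / b) / (np.pi * b ** 2)   # reconstructed (12)
hist, edges = np.histogram(dist, bins=20, range=(0.0, b), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - pdf(centres))))  # should be small (sampling/binning noise only)
```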

To obtain f(d_1, d_2, d_3 | B = b), we use the law of cosines, which relates the distance between nodes 1 and 2 to D_1', D_2' and the angle Φ between the chords X_3X_1 and X_3X_2, i.e., ||X_1 − X_2||^2 = D_1'^2 + D_2'^2 − 2 D_1' D_2' cos Φ. For each i ∈ {1, 2}, we further define Φ_i as the angle between the chord X_3X_i and the diameter of C_b through X_3; given D_i' = r_i, Φ_i is uniformly distributed on [−arccos(r_i/b), arccos(r_i/b)], with Φ_1 and Φ_2 conditionally independent. Since Φ = Φ_1 − Φ_2 (i.e., the difference between two independent and uniformly distributed variables), it follows that Φ has a trapezoidal distribution with pdf

f_\Phi(\varphi \mid r_1, r_2) = \frac{\min\{a_1 + a_2 - |\varphi|,\, 2\min\{a_1, a_2\}\}}{4 a_1 a_2}, \qquad |\varphi| \le a_1 + a_2,

where a_i = arccos(r_i/b).

Now, we transform Φ through the cosine function (which appears in the law of cosines) and obtain the pdf of the transformed variable from its cdf. Then, we use the law of cosines and the transformation theorem to obtain the conditional pdf of the three side lengths. We distinguish between several cases depending on the diameter (4) of the circumscribed circle. When c(d_1, d_2, d_3) > b, the only way in which the triangle can lie inside C_b while node 3 is on C_b is when the triangle is obtuse (i.e., its largest angle exceeds π/2) and its largest side length is either D_1' or D_2'. Using (12) in (11), we obtain

(13)

for all d_1, d_2, d_3 that satisfy the triangle inequalities and for which the triangle fits inside C_b. Since each node can be on the circle with probability 1/3, it follows that f(d_1, d_2, d_3 | B = b) is the average of the three conditional pdfs f^{(k)}(d_1, d_2, d_3 | B = b), k = 1, 2, 3, which gives

(14)

We have used the fact that the node corresponding to the obtuse angle cannot be on C_b, so the corresponding term in the average vanishes.

Finally, by plugging (14) into (10), we calculate the integral by distinguishing between the cases c(d_1, d_2, d_3) ≤ l_0 and c(d_1, d_2, d_3) > l_0, and arrive at the closed-form expression (5).

Acknowledgment

This work was supported by Independent Research Fund Denmark grant number DFF–5054-00212 and by EPSRC grant number EP/N002350/1 (“Spatially Embedded Networks”). The research was carried out during a visit to the University of Oxford, and M.-A. Badiu would like to thank the Communications Research Group for their hospitality.

References

  • [1] M. Franceschetti and R. Meester, Random Networks for Communication: From Statistical Physics to Information Systems, ser. Cambridge Series in Statistical and Probabilistic Mathematics.   Cambridge University Press, 2008.
  • [2] F. Baccelli and B. Błaszczyszyn, “Stochastic geometry and wireless networks: Volume II applications,” Found. Trends Netw., vol. 4, no. 1-2, pp. 1–312, Jan. 2009.
  • [3] C. P. Dettmann and O. Georgiou, “Random geometric graphs with general connection functions,” Physical Review E, vol. 93, no. 3, p. 032313, 2016.
  • [4] M. D. Penrose, “Connectivity of soft random geometric graphs,” The Annals of Applied Probability, vol. 26, no. 2, pp. 986–1028, 2016.
  • [5] ——, “On a continuum percolation model,” Advances in applied probability, vol. 23, no. 3, pp. 536–556, 1991.
  • [6] R. Meester and R. Roy, Continuum percolation.   Cambridge University Press, 1996, vol. 119.
  • [7] J. Coon, C. P. Dettmann, and O. Georgiou, “Full connectivity: corners, edges and faces,” Journal of Statistical Physics, vol. 147, no. 4, pp. 758–778, 2012.
  • [8] G. Simonyi, “Graph entropy: a survey,” Combinatorial Optimization, vol. 20, pp. 399–441, 1995.
  • [9] D. Bonchev, Information theoretic indices for characterization of chemical structures.   Research Studies Press, 1983, no. 5.
  • [10] M. G. Everett, “Role similarity and complexity in social networks,” Social Networks, vol. 7, no. 4, pp. 353–359, 1985.
  • [11] M. Dehmer and A. Mowshowitz, “A history of graph entropy measures,” Information Sciences, vol. 181, no. 1, pp. 57–78, 2011.
  • [12] Y. Choi and W. Szpankowski, “Compression of graphical structures: Fundamental limits, algorithms, and experiments,” IEEE Transactions on Information Theory, vol. 58, no. 2, pp. 620–638, 2012.
  • [13] N. de Beaudrap, V. Giovannetti, S. Severini, and R. Wilson, “Interpreting the von Neumann entropy of graph Laplacians, and coentropic graphs,” A Panorama of Mathematics: Pure and Applied, vol. 658, p. 227, 2016.
  • [14] D. E. Simmons, J. P. Coon, and A. Datta, “Symmetric Laplacians, quantum density matrices and their von Neumann entropy,” Linear Algebra and its Applications, vol. 532, pp. 534–549, Nov. 2017.
  • [15] J. L. Guo, W. Wu, and S. B. Xu, “Study on route stability based on the metrics of local topology transformation entropy in mobile ad hoc networks,” Advanced Engineering Forum, vol. 1, pp. 288–292, Sep. 2011.
  • [16] M.-H. Zayani, “Link prediction in dynamic and human-centered mobile wireless networks,” Ph.D. dissertation, 2012.
  • [17] B. An and S. Papavassiliou, “An entropy-based model for supporting and evaluating route stability in mobile ad hoc wireless networks,” IEEE Communications Letters, vol. 6, no. 8, pp. 328–330, Aug. 2002.
  • [18] M. Boushaba, A. Hafid, and M. Gendreau, “Node stability-based routing in wireless mesh networks,” Journal of Network and Computer Applications, vol. 93, pp. 1–12, 2017.
  • [19] R. Timo, K. Blackmore, and L. Hanlen, “On entropy measures for dynamic network topologies: Limits to MANET,” in Communications Theory Workshop, 2005. Proceedings. 6th Australian.   IEEE, 2005, pp. 95–101.
  • [20] J. Lu, F. Valois, M. Dohler, and D. Barthel, “Quantifying organization by means of entropy,” IEEE Communications Letters, vol. 12, no. 3, pp. 185–187, Mar. 2008.
  • [21] J. P. Coon, “Topological uncertainty in wireless networks,” in 2016 IEEE Global Communications Conference, GLOBECOM 2016, Washington, DC, USA, December 4-8, 2016.   IEEE, 2016, pp. 1–6.
  • [22] J. P. Coon, C. P. Dettmann, and O. Georgiou, “Entropy of Spatial Network Ensembles,” ArXiv e-prints, Jul. 2017.
  • [23] A. Cika, J. P. Coon, and S. Kim, “Effects of directivity on wireless network complexity,” in 2017 15th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt).   IEEE, May 2017, pp. 1–7.
  • [24] A. M. Mathai, An Introduction to Geometrical Probability, ser. Statistical Distributions and Models with Applications.   Newark: Gordon and Breach, 1999.
  • [25] S. Srinivasa and M. Haenggi, “Distance distributions in finite uniformly random networks: Theory and applications,” IEEE Trans. Vehicular Technology, vol. 59, no. 2, pp. 940–949, 2010.
  • [26] Z. Khalid and S. Durrani, “Distance distributions in regular polygons,” IEEE Transactions on Vehicular Technology, vol. 62, no. 5, pp. 2363–2368, Jun 2013.
  • [27] R. Pure and S. Durrani, “Computing exact closed-form distance distributions in arbitrarily shaped polygons with arbitrary reference points,” The Mathematica Journal, vol. 17, pp. 1–27, 2015.
  • [28] H. Solomon, Geometric Probability.   Society for Industrial and Applied Mathematics, 1978.
  • [29] V. S. Alagar, “The distribution of the distance between random points,” Journal of Applied Probability, vol. 13, no. 3, pp. 558–566, 1976.
  • [30] B. Eisenberg and R. Sullivan, “Crofton’s differential equation,” The American Mathematical Monthly, vol. 107, pp. 129–139, 2000.
  • [31] F. Chung, R. Graham, P. Frankl, and J. Shearer, “Some intersection theorems for ordered sets and graphs,” Journal of Combinatorial Theory, Series A, vol. 43, no. 1, pp. 23–37, 1986.
  • [32] M. Yanagida and Y. Horibe, “A dropping proof of an entropy inequality,” Applied Mathematics Letters, vol. 21, no. 8, pp. 840–842, 2008.