Universally Consistent Latent Position Estimation and Vertex Classification for Random Dot Product Graphs

07/29/2012
by Daniel L. Sussman, et al.

In this work we show that, using the eigen-decomposition of the adjacency matrix, we can consistently estimate latent positions for random dot product graphs provided the latent positions are i.i.d. from some distribution. If class labels are observed for a number of vertices tending to infinity, then we show that the remaining vertices can be classified with error converging to Bayes optimal using the k-nearest-neighbors classification rule. We evaluate the proposed methods on simulated data and a graph derived from Wikipedia.
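The procedure the abstract describes can be sketched in a few lines: embed each vertex by scaling the top eigenvectors of the adjacency matrix (adjacency spectral embedding), then classify unlabeled vertices by k-nearest neighbors in the embedded space. The sketch below is a minimal illustration on a simulated two-class random dot product graph, not the paper's implementation; the latent-position values (0.3 and 0.8), the train/test split, and k = 5 are illustrative choices.

```python
import numpy as np

def adjacency_spectral_embedding(A, d):
    """Embed vertices using the top-d (by magnitude) eigenpairs of A."""
    vals, vecs = np.linalg.eigh(A)
    idx = np.argsort(np.abs(vals))[::-1][:d]
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

def knn_classify(X_train, y_train, X_test, k=5):
    """Plain k-nearest-neighbors vote on embedded positions."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

# Simulate an RDPG: i.i.d. latent positions from two point masses,
# edge probabilities given by dot products of latent positions.
rng = np.random.default_rng(0)
n, d = 200, 1
z = rng.integers(0, 2, n)                    # class labels
X = np.where(z == 1, 0.8, 0.3)[:, None]      # latent positions in [0, 1]
P = X @ X.T                                  # P[i, j] = <X_i, X_j>
A = rng.binomial(1, P)
A = np.triu(A, 1)
A = A + A.T                                  # symmetric, hollow adjacency

Xhat = adjacency_spectral_embedding(A, d)
yhat = knn_classify(Xhat[:100], z[:100], Xhat[100:], k=5)
print("test accuracy:", (yhat == z[100:]).mean())
```

On this toy graph the two classes are well separated in the one-dimensional embedding, so the k-NN accuracy is high; the paper's result is that, as the number of labeled vertices grows, this error converges to the Bayes optimum.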


Related research:

- 12/05/2012: Universally consistent vertex classification for latent positions graphs
- 06/04/2018: On estimation and inference in latent structure random graphs
- 08/03/2020: Two-sample Testing on Latent Distance Graphs With Unknown Link Functions
- 11/07/2019: Improving Power of 2-Sample Random Graph Tests with Applications in Connectomics
- 04/15/2020: Learning 1-Dimensional Submanifolds for Subsequent Inference on Random Dot Product Graphs
- 10/10/2019: Efficient Estimation for Random Dot Product Graphs via a One-step Procedure
- 05/21/2013: Out-of-sample Extension for Latent Position Graphs
