Efficient Estimation for Random Dot Product Graphs via a One-step Procedure

10/10/2019 ∙ by Fangzheng Xie, et al.

We propose a one-step procedure to efficiently estimate the latent positions in random dot product graphs. Unlike classical spectral-based methods such as the adjacency and Laplacian spectral embeddings, the proposed one-step procedure simultaneously takes into account both the low-rank structure of the expected adjacency matrix and the Bernoulli likelihood of the sampling model. We show that for each individual vertex, the corresponding row of the one-step estimator, after proper scaling and centering and up to an orthogonal transformation, converges to a multivariate normal distribution with an efficient covariance matrix, provided that the initial estimator satisfies the so-called approximate linearization property. The one-step estimator improves upon the commonly adopted spectral embedding methods in the following sense: globally, across all vertices, it yields a smaller asymptotic sum of squared errors, and locally, for each individual vertex, the asymptotic covariance matrix of the corresponding row of the one-step estimator is smaller, in the spectral sense, than that of the spectral embedding. The usefulness of the proposed one-step procedure is demonstrated via numerical examples and the analysis of a real-world Wikipedia graph dataset.
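The procedure described above can be sketched as follows: compute an initial adjacency spectral embedding of the graph, then refine each vertex's latent position with a single Newton step on that vertex's Bernoulli log-likelihood, whose score and Fisher information are weighted by the fitted edge probabilities. The sketch below is an illustrative reconstruction under stated assumptions, not the authors' implementation; the simulated latent-position distribution, the graph size, and the clipping threshold `eps` are all choices made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate a small random dot product graph (RDPG). ---
# Latent positions are drawn so every inner product lies in (0, 1);
# this particular distribution is an assumption for the demo.
n, d = 200, 2
X = rng.uniform(0.2, 0.7, size=(n, d)) / np.sqrt(d)
P = X @ X.T                                   # edge probability matrix
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # symmetric, hollow adjacency

# --- Initial estimator: adjacency spectral embedding (ASE). ---
vals, vecs = np.linalg.eigh(A)
idx = np.argsort(vals)[::-1][:d]              # top-d eigenpairs
X0 = vecs[:, idx] * np.sqrt(vals[idx])        # n x d spectral embedding

# --- One Newton step per vertex on its Bernoulli log-likelihood. ---
def one_step(A, X0, eps=1e-6):
    n, d = X0.shape
    X1 = np.empty_like(X0)
    for i in range(n):
        mask = np.arange(n) != i              # exclude the self-loop term
        Xo = X0[mask]
        p = np.clip(Xo @ X0[i], eps, 1 - eps) # fitted edge probabilities
        w = 1.0 / (p * (1.0 - p))             # Bernoulli variance weights
        score = Xo.T @ ((A[mask, i] - p) * w) # gradient of the log-likelihood
        info = (Xo * w[:, None]).T @ Xo       # per-vertex Fisher information
        X1[i] = X0[i] + np.linalg.solve(info, score)
    return X1

X1 = one_step(A, X0)

# Compare sums of squared errors after Procrustes alignment, since the
# latent positions are only identifiable up to an orthogonal transform.
def procrustes_sse(Xhat, X):
    U, _, Vt = np.linalg.svd(Xhat.T @ X)
    return float(np.sum((Xhat @ (U @ Vt) - X) ** 2))

print("ASE SSE:     ", procrustes_sse(X0, X))
print("one-step SSE:", procrustes_sse(X1, X))
```

The Procrustes alignment at the end mirrors how the abstract's global squared-error comparison is meaningful only up to an orthogonal transformation; on typical draws the one-step refinement reduces the aligned sum of squared errors relative to the ASE initializer.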


