Entrywise limit theorems of eigenvectors and their one-step refinement for sparse random graphs

06/17/2021
by   Fangzheng Xie, et al.

We establish finite-sample Berry-Esseen theorems for the entrywise limits of the eigenvectors and their one-step refinement for sparse random graphs. For the entrywise limits of the eigenvectors, the average expected degree is allowed to grow at the rate Ω(log n), where n is the number of vertices; for the entrywise limits of the one-step refinement of the eigenvectors, we require the expected degree to grow at the rate ω(log n). The one-step refinement is shown to have a smaller entrywise covariance than the eigenvectors. The key technical contribution underlying these limit theorems is a sharp finite-sample entrywise eigenvector perturbation bound. In particular, the existing error bounds on the two-to-infinity norms of the higher-order remainders are not sufficient when the average expected degree of the graph is proportional to log n. Our proof relies on a decoupling strategy that uses a "leave-one-out" construction of auxiliary matrices.
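To make the entrywise perturbation bound concrete, the following sketch simulates a rank-one random graph (an Erdős–Rényi-type model, chosen here purely for illustration and not taken from the paper) in the sparse regime where the expected degree is proportional to log n. It compares the sample eigenvector u to the population eigenvector u*, and checks that the higher-order remainder u − Au*/λ* is entrywise smaller than the first-order fluctuation Au*/λ* − u*, which is the phenomenon a sharp two-to-infinity remainder bound captures. All model parameters (n, the constant in front of log n, the random seed) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-one edge-probability matrix P = rho * 1 1^T, so the population
# eigenvector is u* = 1/sqrt(n) entrywise and lambda* = n * rho.
# rho is scaled so the average expected degree is ~ 5 log n (sparse regime).
n = 2000
rho = 5 * np.log(n) / n
lam_star = n * rho
u_star = np.ones(n) / np.sqrt(n)

# Sample a symmetric adjacency matrix with no self-loops.
upper = (rng.random((n, n)) < rho).astype(float)
A = np.triu(upper, 1)
A = A + A.T

# Leading sample eigenvector, sign-aligned with u*.
vals, vecs = np.linalg.eigh(A)
u = vecs[:, -1]
u *= np.sign(u @ u_star)

# First-order (linearized) approximation of u and its two error terms.
linearized = A @ u_star / lam_star
flu_err = np.max(np.abs(linearized - u_star))  # first-order fluctuation
rem_err = np.max(np.abs(u - linearized))       # higher-order remainder

print(f"entrywise fluctuation: {flu_err:.4e}")
print(f"entrywise remainder:   {rem_err:.4e}")
```

In this simulation the remainder is dominated by the fluctuation term, illustrating why the first-order expansion u ≈ Au*/λ* drives the entrywise limit theorem; the paper's contribution is a bound sharp enough for this domination to hold even at the Ω(log n) degree rate.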


