Almost sure convergence of the largest and smallest eigenvalues of high-dimensional sample correlation matrices

01/30/2020
by Johannes Heiny, et al.

In this paper, we show that the largest and smallest eigenvalues of a sample correlation matrix stemming from n independent observations of a p-dimensional time series with iid components converge almost surely to (1+√γ)^2 and (1-√γ)^2, respectively, as n → ∞, if p/n → γ ∈ (0,1] and the truncated variance of the entry distribution is 'almost slowly varying', a condition we describe via moment properties of self-normalized sums. Moreover, the empirical spectral distributions of these sample correlation matrices converge weakly, with probability 1, to the Marchenko-Pastur law, which extends a result in Bai and Zhou (2008). We compare the behavior of the eigenvalues of the sample covariance and sample correlation matrices and argue that the latter seems more robust, in particular in the case of infinite fourth moment. We briefly address some practical issues for the estimation of extreme eigenvalues in a simulation study. In our proofs we use the method of moments combined with a Path-Shortening Algorithm, which exploits the structure of sample correlation matrices, to derive precise bounds for matrix norms. We believe that this new approach could be of further use in random matrix theory.
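The stated limits can be checked numerically. The sketch below is not from the paper; it is a minimal illustration, assuming iid Student-t(3) entries (finite variance, infinite fourth moment) and an illustrative choice of n, p, of how the extreme eigenvalues of the sample correlation matrix sit near (1 ± √γ)^2 for moderately large dimensions.

```python
# Minimal numerical sketch (illustrative, not the paper's simulation study):
# for iid entries with p/n -> gamma in (0,1], the extreme eigenvalues of the
# sample correlation matrix should be close to (1 +/- sqrt(gamma))^2.
import numpy as np

rng = np.random.default_rng(0)
n, p = 4000, 1000            # illustrative choice, gamma = p/n = 0.25
gamma = p / n

# n observations of a p-dimensional vector with iid t(3) entries:
# finite variance but infinite fourth moment, the regime where the
# correlation matrix is argued to be more robust than the covariance matrix.
X = rng.standard_t(df=3, size=(n, p))

R = np.corrcoef(X, rowvar=False)        # p x p sample correlation matrix
eigvals = np.linalg.eigvalsh(R)         # ascending order

print("largest eigenvalue :", eigvals[-1], " limit:", (1 + np.sqrt(gamma))**2)
print("smallest eigenvalue:", eigvals[0],  " limit:", (1 - np.sqrt(gamma))**2)
```

For this choice of γ = 0.25, the limits are 2.25 and 0.25; the empirical extremes should lie close to these values and tighten as n grows with p/n held near γ.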

