Nonparametric estimation of low rank matrix valued function
Let A:[0,1]→H_m (the space of m×m Hermitian matrices) be a low-rank matrix-valued function whose entries belong to the Hölder class Σ(β,L). The goal of this paper is to study statistical estimation of A based on the regression model E(Y_j|τ_j,X_j) = 〈A(τ_j), X_j〉, where the τ_j are i.i.d. uniformly distributed on [0,1], the X_j are i.i.d. matrix completion sampling matrices, and the Y_j are independent bounded responses. We propose a novel nuclear norm penalized local polynomial estimator and establish an upper bound on its pointwise risk measured in the Frobenius norm. We then extend this estimator globally and prove an upper bound on its integrated risk measured in the L_2-norm. We also propose another estimator based on bias-reducing kernels to handle the case when A is not necessarily low rank, and establish an upper bound on its risk measured in the L_∞-norm. We show that the obtained rates are all optimal, up to a logarithmic factor, in the minimax sense. Finally, we propose an adaptive estimation procedure based on Lepski's method and a penalized data splitting technique, which is computationally efficient and can be easily implemented and parallelized.
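To make the pointwise estimator concrete, the following is a minimal sketch of one plausible implementation, not the authors' code: it assumes a local-constant (degree-0) fit over real symmetric matrices, an Epanechnikov kernel, and a proximal gradient solver with eigenvalue soft-thresholding; the bandwidth h, penalty lam, and all function names are illustrative.

    import numpy as np

    def eig_soft_threshold(S, tau):
        # Proximal map of tau * (nuclear norm) on symmetric matrices:
        # soft-threshold the eigenvalues of S.
        w, V = np.linalg.eigh(S)
        w = np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)
        return (V * w) @ V.T

    def local_constant_estimate(t0, taus, Xs, Ys, h, lam, n_iter=500):
        # Local-constant kernel fit of A(t0) with a nuclear norm penalty,
        # solved by proximal gradient (ISTA). Sketch only; hyperparameters
        # are illustrative, not tuned.
        m = Xs.shape[1]
        u = (taus - t0) / h
        w = np.maximum(1.0 - u**2, 0.0)                      # Epanechnikov weights around t0
        L = 2.0 * np.sum(w * np.sum(Xs**2, axis=(1, 2))) + 1e-12
        step = 1.0 / L                                       # safe step size for the smooth part
        A_hat = np.zeros((m, m))
        for _ in range(n_iter):
            resid = np.einsum('jkl,kl->j', Xs, A_hat) - Ys   # <A_hat, X_j> - Y_j
            grad = np.einsum('j,jkl->kl', 2.0 * w * resid, Xs)
            A_hat = eig_soft_threshold(A_hat - step * grad, step * lam)
            A_hat = (A_hat + A_hat.T) / 2.0                  # keep the iterate symmetric
        return A_hat

A global estimate of A on a grid of points in [0,1] could be obtained by repeating the call at each grid point; since such pointwise fits are independent of one another, they are straightforward to parallelize.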