Projected Estimation for Large-dimensional Matrix Factor Models

03/23/2020
by   Long Yu, et al.

Large-dimensional factor models are drawing growing attention and are widely applied to analyze the correlations within large datasets. Most related work focuses on vector-valued data, while matrix-valued or higher-order tensor datasets are now ubiquitous thanks to access to multiple data sources. In this article, we propose a projected estimation method for the matrix factor model under flexible conditions. We show that the averaged squared Frobenius norms of our projected estimators of the row (or column) loading matrix achieve the convergence rate min{(Tp_2)^-1, (Tp_1)^-2, (p_1p_2)^-2} (or min{(Tp_1)^-1, (Tp_2)^-2, (p_1p_2)^-2}), where p_1 and p_2 are the row and column dimensions of each data matrix and T is the number of observations. These rates are faster than the typical rates T^-1 and min{(Tp_2)^-1, p_1^-2} (or min{(Tp_1)^-1, p_2^-2}) conceivable from the literature on vector factor models. An easily satisfied sufficient condition on the projection direction for achieving these rates is provided. Moreover, we establish the asymptotic distributions of the estimated row and column factor loadings, and we introduce an iterative approach to consistently determine the numbers of row and column factors. Two real data examples, from financial engineering and image recognition, show that the projected estimators help explain portfolio variances and achieve accurate classification of digit images.
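To illustrate the idea behind projection-based estimation, the following is a minimal numerical sketch in Python. It is not the authors' exact algorithm; it assumes the standard matrix factor model X_t = R F_t C' + E_t and uses a common two-step scheme: an initial PCA-type estimate of the column loadings C, followed by projecting each X_t onto that estimated column space before re-estimating the row loadings R. All dimensions, the simulation design, and the helper function top_eigvecs are illustrative assumptions, not quantities taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper)
T, p1, p2, k1, k2 = 200, 20, 30, 3, 2

# Simulate a matrix factor model X_t = R F_t C' + E_t
R = rng.normal(size=(p1, k1))
C = rng.normal(size=(p2, k2))
F = rng.normal(size=(T, k1, k2))
X = np.einsum('ij,tjk,lk->til', R, F, C) + rng.normal(size=(T, p1, p2))

def top_eigvecs(M, k):
    # Return the k leading eigenvectors of a symmetric matrix M
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, ::-1][:, :k]

# Step 1: initial estimate of the column loading space from sum_t X_t' X_t
M_col = sum(X[t].T @ X[t] for t in range(T)) / (T * p1 * p2)
C_hat0 = np.sqrt(p2) * top_eigvecs(M_col, k2)

# Step 2: project each X_t onto the estimated column space, Y_t = X_t C_hat0 / p2,
# then re-estimate the row loadings from sum_t Y_t Y_t'
Y = X @ C_hat0 / p2
M_row = sum(Y[t] @ Y[t].T for t in range(T)) / (T * p1)
R_hat = np.sqrt(p1) * top_eigvecs(M_row, k1)

# Compare column spaces of R and R_hat via their projection matrices
# (loading matrices are only identified up to rotation)
P_true = R @ np.linalg.pinv(R)
P_hat = R_hat @ np.linalg.pinv(R_hat)
print("row-space error:", np.linalg.norm(P_true - P_hat) / np.sqrt(p1))

The intuition the sketch tries to convey is that projecting each X_t onto an estimated column loading space averages noise across the p_2 columns before the row loadings are re-estimated, which is consistent with the faster convergence rates stated above.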

