Statistical Inference for Large-dimensional Matrix Factor Model from Least Squares and Huber Loss Points of View
In this article we focus on large-dimensional matrix factor models and propose estimators of the factor loading matrices and the factor score matrix by minimizing a least squares objective function. The resulting estimators turn out to be equivalent to the corresponding projected estimators in Yu et al. (2021), which enjoy the nice property of reducing the magnitudes of the idiosyncratic error components and thereby increasing the signal-to-noise ratio. We derive the convergence rates of the theoretical minimizers under sub-Gaussian tails, rather than of the one-step iteration estimators of Yu et al. (2021). Motivated by the least squares formulation, we further propose a robust method for estimating the large-dimensional matrix factor model based on the Huber loss function. Theoretically, we derive the convergence rates of the robust estimators of the factor loading matrices under finite fourth-moment conditions. We also propose an iterative procedure to robustly estimate the pair of row and column factor numbers. We conduct extensive numerical studies to investigate the empirical performance of the proposed robust methods relative to the state-of-the-art ones. The results show that the proposed estimators perform much better than the existing ones when the data are heavy-tailed, and comparably to the projected estimators when the data are light-tailed, so they can serve as a safe replacement for the existing methods. An application to a Fama-French financial portfolios dataset illustrates their empirical usefulness.
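To make the least squares formulation concrete, the following is a minimal NumPy sketch (not the authors' code) of the alternating projected-PCA idea the abstract describes for the matrix factor model X_t = R F_t C' + E_t: given the current column loading C, the data are projected onto its column space and the row loading R is recovered by PCA, and vice versa. The function names, initialization, and normalization (R'R/p = I, C'C/q = I) are illustrative assumptions.

```python
import numpy as np

def projected_pca(Y, k):
    """Top-k eigenvectors of the row sample covariance of the projected data,
    scaled so that the estimated loading satisfies L'L / dim = I_k."""
    T, m = len(Y), Y[0].shape[1]
    M = sum(Yt @ Yt.T for Yt in Y) / (T * m)
    _, vecs = np.linalg.eigh(M)                     # ascending eigenvalues
    return np.sqrt(Y[0].shape[0]) * vecs[:, ::-1][:, :k]

def ls_matrix_factor(X, k, r, n_iter=10):
    """Alternating least-squares style estimation of the matrix factor model.
    X: list of p x q observation matrices; k, r: row and column factor numbers."""
    p, q = X[0].shape
    C = np.sqrt(q) * np.eye(q)[:, :r]               # crude initial column loading
    for _ in range(n_iter):
        R = projected_pca([Xt @ C / q for Xt in X], k)   # project columns, PCA rows
        C = projected_pca([Xt.T @ R / p for Xt in X], r) # project rows, PCA columns
    F = [R.T @ Xt @ C / (p * q) for Xt in X]        # estimated factor score matrices
    return R, C, F
```

A robust variant in the spirit of the abstract would replace the squared residuals in each step with the Huber loss, e.g. via iteratively reweighted least squares, down-weighting observations with large residuals when the data are heavy-tailed.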