
Iterative Least Squares Algorithm for Large-dimensional Matrix Factor Model by Random Projection

by Yong He, et al.

The matrix factor model has drawn growing attention for its ability to achieve dimension reduction in both directions simultaneously for matrix-structured observations. In this paper, we propose a simple iterative least squares algorithm for matrix factor models, in contrast to the Principal Component Analysis (PCA)-based methods in the literature. In detail, we first propose to estimate the latent factor matrices by projecting the observations with two deterministic weight matrices, which are chosen to diversify away the idiosyncratic components. We show that inference on the factors remains asymptotically valid even if we overestimate both the row and column factor numbers. We then estimate the row/column loading matrices by minimizing the squared loss function under certain identifiability conditions. The resultant estimators of the loading matrices are treated as the new weight/projection matrices, so the above update procedure can be performed iteratively until convergence. Theoretically, given the true dimensions of the factor matrices, we derive the convergence rates of the estimators for the loading matrices and common components at the s-th iteration. Thorough numerical simulations are conducted to investigate the finite-sample performance of the proposed methods, and two real datasets on financial portfolios and multinational macroeconomic indices are used to illustrate their practical usefulness.
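The iterative scheme described in the abstract can be sketched in a few lines of NumPy. The sketch below is an illustrative simplification, not the paper's exact algorithm: it assumes the model X_t = R F_t C' + E_t, initializes the weight/projection matrices at random (echoing the random-projection idea in the title), and then alternates between projecting out the factor matrices and least squares updates of the loading matrices. The function name and all details of initialization and convergence are assumptions for illustration.

```python
import numpy as np

def iterative_ls_matrix_factor(X, k, r, n_iter=50, seed=0):
    """Alternating least squares sketch for the matrix factor model
    X_t ~ R F_t C' + E_t, where X has shape (T, p, q), R is p x k,
    C is q x r, and each latent factor matrix F_t is k x r.

    Illustrative simplification of the paper's iterative procedure:
    random initial weight matrices, then repeated projection and
    least squares updates until the iteration budget is exhausted.
    """
    rng = np.random.default_rng(seed)
    T, p, q = X.shape
    R = rng.standard_normal((p, k))  # initial row weight/projection matrix
    C = rng.standard_normal((q, r))  # initial column weight/projection matrix
    for _ in range(n_iter):
        # Step 1: project each observation to estimate the factor matrix,
        # the least squares solution F_t = (R'R)^{-1} R' X_t C (C'C)^{-1}
        F = np.stack([np.linalg.pinv(R) @ X[t] @ np.linalg.pinv(C).T
                      for t in range(T)])
        # Step 2: least squares update of the row loadings R,
        # minimizing sum_t ||X_t - R F_t C'||_F^2 with C, F fixed
        A = sum(X[t] @ C @ F[t].T for t in range(T))
        B = sum(F[t] @ C.T @ C @ F[t].T for t in range(T))
        R = A @ np.linalg.pinv(B)
        # Step 3: symmetric least squares update of the column loadings C
        A2 = sum(X[t].T @ R @ F[t] for t in range(T))
        B2 = sum(F[t].T @ R.T @ R @ F[t] for t in range(T))
        C = A2 @ np.linalg.pinv(B2)
    # Estimated common components R F_t C'
    S = np.stack([R @ F[t] @ C.T for t in range(T)])
    return R, C, F, S

# Usage on synthetic data: check that the common component is recovered
rng = np.random.default_rng(1)
T, p, q, k, r = 50, 20, 15, 2, 2
R0 = rng.standard_normal((p, k))
C0 = rng.standard_normal((q, r))
F0 = rng.standard_normal((T, k, r))
S0 = np.stack([R0 @ F0[t] @ C0.T for t in range(T)])
X = S0 + 0.1 * rng.standard_normal((T, p, q))

R_hat, C_hat, F_hat, S_hat = iterative_ls_matrix_factor(X, k, r)
rel_err = np.linalg.norm(S_hat - S0) / np.linalg.norm(S0)
```

Note that R and C are only identified up to an invertible transformation, so the sketch compares the estimated common components R F_t C' (which are identified) rather than the loadings themselves.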



