Iterative Least Squares Algorithm for Large-dimensional Matrix Factor Model by Random Projection

01/01/2023
by Yong He, et al.

The matrix factor model has drawn growing attention for its advantage in achieving two-directional dimension reduction simultaneously for matrix-structured observations. In this paper, we propose a simple iterative least squares algorithm for matrix factor models, in contrast to the Principal Component Analysis (PCA)-based methods in the literature. Specifically, we first estimate the latent factor matrices by projecting the observations with two deterministic weight matrices, which are chosen to diversify away the idiosyncratic components. We show that inference on the factors remains asymptotically valid even if both the row and column factor numbers are overestimated. We then estimate the row and column loading matrices by minimizing the squared loss function under certain identifiability conditions. The resulting estimators of the loading matrices serve as the new weight/projection matrices, so the above update procedure can be iterated until convergence. Theoretically, given the true dimensions of the factor matrices, we derive the convergence rates of the estimators of the loading matrices and common components at any s-th iteration step. Thorough numerical simulations are conducted to investigate the finite-sample performance of the proposed methods, and two real datasets on financial portfolios and multinational macroeconomic indices illustrate their practical usefulness.
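To make the alternating scheme concrete, below is a minimal sketch in Python/NumPy of one way to implement the projection and least-squares updates for the model X_t = R F_t C' + E_t. This is not the authors' code: the function name, the random orthonormal initialization (the paper prescribes deterministic weight matrices), the QR-based normalization, and the stopping rule are all illustrative assumptions.

```python
import numpy as np

def iterative_ls_matrix_factor(X, k1, k2, n_iter=100, tol=1e-8, seed=0):
    """Hypothetical sketch of an iterative least-squares fit of the
    matrix factor model X_t = R F_t C' + E_t.

    X  : ndarray of shape (T, p1, p2), matrix-valued observations
    k1 : number of row factors (may be overestimated)
    k2 : number of column factors (may be overestimated)
    """
    T, p1, p2 = X.shape
    rng = np.random.default_rng(seed)
    # Initial weight/projection matrices.  The paper uses deterministic
    # weights chosen to diversify away the idiosyncratic components;
    # random orthonormal starts are only a placeholder here.
    R, _ = np.linalg.qr(rng.standard_normal((p1, k1)))
    C, _ = np.linalg.qr(rng.standard_normal((p2, k2)))

    prev_loss = np.inf
    for _ in range(n_iter):
        # Step 1: recover the factor matrices by projecting each observation,
        # F_t = R' X_t C (up to the scaling fixed by identifiability conditions).
        F = np.stack([R.T @ X[t] @ C for t in range(T)])

        # Step 2: update R by least squares given F_t and C.
        G = np.stack([F[t] @ C.T for t in range(T)])           # G_t = F_t C'
        R = sum(X[t] @ G[t].T for t in range(T)) @ np.linalg.inv(
            sum(G[t] @ G[t].T for t in range(T)))
        R, _ = np.linalg.qr(R)                                  # re-normalize

        # Step 3: update C symmetrically from the transposed model.
        H = np.stack([F[t].T @ R.T for t in range(T)])          # H_t = F_t' R'
        C = sum(X[t].T @ H[t].T for t in range(T)) @ np.linalg.inv(
            sum(H[t] @ H[t].T for t in range(T)))
        C, _ = np.linalg.qr(C)

        # Step 4: stop once the squared reconstruction loss stabilizes.
        F = np.stack([R.T @ X[t] @ C for t in range(T)])
        loss = sum(np.linalg.norm(X[t] - R @ F[t] @ C.T) ** 2 for t in range(T))
        if abs(prev_loss - loss) < tol * max(1.0, loss):
            break
        prev_loss = loss

    return R, C, F
```

Each pass corresponds to one s-th iteration step in the paper's analysis: the loadings estimated at step s become the projection matrices used to extract the factors at step s+1.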


