Modeling High-Dimensional Matrix-Variate Observations by Tensor Factorization

09/29/2022
by   Xu Zhang, et al.

In the era of big data, high-dimensional matrix-variate observations, whether independent or dependent, are prevalent. Unsupervised learning of matrix objects through low-rank approximation has aided the discovery of hidden patterns and structure, while the accompanying statistical inference is known to be challenging and remains in its infancy: the limited existing work focuses entirely on a class of bilinear-form matrix factor models. In this paper, we propose a novel class of hierarchical CP product matrix factor models that model the rank-1 components of the low-rank CP decomposition of a matrix object using high-dimensional vector factor models. The induced CP tensor-decomposition-based matrix factor model (TeDFaM) is more informative in that it naturally incorporates row-wise and column-wise interrelated information. Furthermore, the inner separable covariance structure yields efficient moment estimators of the loading matrices and thus approximate least squares estimators for the factor scores. The proposed TeDFaM model and estimation procedure achieve a better peak signal-to-noise ratio for the signal part, as evidenced both in theory and in numerical analyses, compared to bilinear-form matrix factor models and existing methods. We establish an inferential theory for TeDFaM estimation, including consistency, rates of convergence, and limiting distributions under regular conditions. In applications, the proposed model and estimation procedure are superior in terms of matrix reconstruction for both independent two-dimensional image data and serially correlated matrix time series. The algorithm is fast and can be implemented conveniently through the accompanying R package TeDFaM.
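To make the low-rank idea concrete: for a single matrix, the best sum of R rank-1 components (the matrix case of a CP decomposition) is given by the truncated SVD. The sketch below is an illustration of that generic fact and of the peak signal-to-noise ratio (PSNR) used to judge reconstruction; the simulated shapes, rank, and noise level are assumptions for demonstration, and the code is not the TeDFaM estimator itself.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, R = 50, 40, 3  # assumed matrix dimensions and target rank

# Simulate a rank-R signal matrix observed with additive noise
signal = rng.standard_normal((p, R)) @ rng.standard_normal((R, q))
X = signal + 0.1 * rng.standard_normal((p, q))

# Truncated SVD: keep the leading R rank-1 terms of X
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_hat = (U[:, :R] * s[:R]) @ Vt[:R]

# Reconstruction quality via PSNR against the true signal
mse = np.mean((signal - X_hat) ** 2)
psnr = 10 * np.log10(np.max(signal) ** 2 / mse)
print(f"PSNR: {psnr:.1f} dB")
```

The hierarchical models in the paper go further by modeling the rank-1 components themselves through vector factor models, which is what injects the row-wise and column-wise interrelated information.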
