A Two-Way Transformed Factor Model for Matrix-Variate Time Series
We propose a new framework for modeling high-dimensional matrix-variate time series via a two-way transformation, under which the transformed data consist of a dynamically dependent matrix-variate factor process and three additional blocks of white noise. Specifically, for a given p_1× p_2 matrix-variate time series, we seek common nonsingular transformations that project the rows and columns onto new sets of p_1 and p_2 directions, ordered by the strength of the series' dynamic dependence on its past values. Consequently, we treat the data as nonsingular linear row and column transformations of dynamically dependent common factors and white-noise idiosyncratic components. We propose a common orthonormal projection method to estimate the front and back loading matrices of the matrix-variate factors. Under the setting in which the largest eigenvalues of the covariance of the vectorized idiosyncratic term diverge for large p_1 and p_2, we introduce a two-way projected Principal Component Analysis (PCA) to estimate the associated loading matrices of the idiosyncratic terms and thereby mitigate such diverging noise effects. A diagonal-path white noise testing procedure is proposed to determine the order of the factor matrix, with the null hypothesis that the remaining term is a matrix-variate white noise process. Asymptotic properties of the proposed method are established for both fixed and diverging dimensions as the sample size increases to infinity. We use simulated and real examples to assess the performance of the proposed method. We also compare our method with some existing ones in the literature and find that the proposed approach not only provides interpretable results but also performs well in out-of-sample forecasting.
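As an illustration of the kind of eigen-analysis underlying loading-matrix estimation in matrix factor models, the following sketch simulates a matrix-variate series Y_t = R F_t C' + E_t with a dynamically dependent factor process and recovers the row and column loading spaces by PCA on pooled row/column covariances. This is a simplified textbook-style estimator, not the paper's exact two-way transformed procedure; all variable names and the AR(1) factor dynamics are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: one-sided PCA recovery of row/column loading
# spaces for a matrix factor model Y_t = R F_t C^T + E_t. This is NOT the
# paper's exact two-way transformed estimator; dimensions, the AR(1)
# factor dynamics, and the noise level are assumptions for demonstration.
rng = np.random.default_rng(0)
T, p1, p2, k1, k2 = 500, 10, 8, 2, 2

# True loadings with orthonormal columns, and a dynamically dependent
# (AR(1)) matrix factor process F_t.
R, _ = np.linalg.qr(rng.standard_normal((p1, k1)))
C, _ = np.linalg.qr(rng.standard_normal((p2, k2)))
F = np.zeros((T, k1, k2))
for t in range(1, T):
    F[t] = 0.7 * F[t - 1] + rng.standard_normal((k1, k2))

# Observed series: Y_t = R F_t C^T + E_t (E_t is white noise).
Y = np.einsum('ij,tjk,lk->til', R, F, C) + 0.3 * rng.standard_normal((T, p1, p2))

# Row loading space: top eigenvectors of (1/T) sum_t Y_t Y_t^T;
# column loading space analogously from (1/T) sum_t Y_t^T Y_t.
M_row = np.einsum('tij,tkj->ik', Y, Y) / T
M_col = np.einsum('tij,tik->jk', Y, Y) / T
R_hat = np.linalg.eigh(M_row)[1][:, -k1:]   # eigh: ascending order
C_hat = np.linalg.eigh(M_col)[1][:, -k2:]

def space_dist(A, B):
    """Spectral-norm distance between projectors onto the column spaces
    of A and B (both assumed to have orthonormal columns)."""
    return np.linalg.norm(A @ A.T - B @ B.T, 2)

print(space_dist(R_hat, R), space_dist(C_hat, C))  # both should be small
```

Loadings are identified only up to rotation, so accuracy is measured by the distance between projection matrices onto the estimated and true column spaces rather than by comparing the matrices entrywise.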