Inference for Low-Rank Models

07/06/2021
by Victor Chernozhukov, et al.

This paper studies inference in linear models whose parameter of interest is a high-dimensional matrix. We focus on the case where this matrix parameter is well approximated by a “spiked low-rank matrix” whose rank grows slowly relative to its dimensions and whose nonzero singular values diverge to infinity. We show that this framework covers a broad class of latent-variable models, accommodating matrix completion problems, factor models, varying coefficient models, principal components analysis with missing data, and heterogeneous treatment effects. For inference, we propose a new “rotation-debiasing” method for product parameters initially estimated using nuclear norm penalization. We present general high-level conditions under which our procedure yields asymptotically normal estimators, and we then give low-level conditions under which these high-level conditions are verified in a treatment effects example.
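For readers unfamiliar with the initial estimation step the abstract mentions, the sketch below illustrates a generic nuclear-norm-penalized estimator computed by proximal gradient descent with singular-value soft-thresholding. It is only a minimal illustration of that standard technique, not the paper's rotation-debiasing procedure; the function names, penalty level, step size, and toy data are all assumptions made for the example.

```python
# Minimal sketch of nuclear-norm-penalized matrix estimation via
# proximal gradient (singular-value soft-thresholding). Illustrative
# only; not the authors' rotation-debiasing method, and all names
# and tuning values are assumptions.
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: prox of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def nuclear_norm_estimate(Y, mask, lam=0.5, step=1.0, n_iter=200):
    """Estimate a low-rank matrix from partially observed entries of Y
    (observed where mask == 1) by minimizing
    0.5 * ||mask * (Theta - Y)||_F^2 + lam * ||Theta||_*."""
    Theta = np.zeros_like(Y)
    for _ in range(n_iter):
        grad = mask * (Theta - Y)                      # gradient of the squared loss
        Theta = svt(Theta - step * grad, step * lam)   # proximal (shrinkage) step
    return Theta

# Toy usage: rank-2 signal observed with noise on roughly half the entries.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 40))
mask = (rng.random(A.shape) < 0.5).astype(float)
Y = mask * (A + 0.1 * rng.normal(size=A.shape))
Theta_hat = nuclear_norm_estimate(Y, mask)
```

The soft-thresholding of singular values is what induces the low-rank (spiked) structure in the fitted matrix; the paper's contribution concerns how to debias and conduct inference on parameters built from such an initial estimate.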
