Adaptive Estimation of Multivariate Regression with Hidden Variables

03/30/2020
by   Xin Bing, et al.

This paper studies the estimation of the coefficient matrix Θ^* in multivariate regression with hidden variables, Y = (Θ^*)^T X + (B^*)^T Z + E, where Y is an m-dimensional response vector, X is a p-dimensional vector of observable features, Z is a K-dimensional vector of unobserved hidden variables, possibly correlated with X, and E is an independent error vector. The number of hidden variables K is unknown, and both m and p are allowed, but not required, to grow with the sample size n. Since only Y and X are observable, we provide necessary conditions for the identifiability of Θ^*. The same set of conditions is shown to be sufficient when the error E is homoscedastic. Our identifiability proof is constructive and leads to a novel and computationally efficient estimation algorithm, called HIVE. The first step of the algorithm estimates the best linear prediction of Y given X, in which the unknown coefficient matrix admits an additive decomposition into Θ^* and a dense matrix originating from the correlation between X and the hidden variables Z. Under a row-sparsity assumption on Θ^*, we propose to minimize a penalized least squares loss, regularizing Θ^* via a group-lasso penalty and the dense matrix via a multivariate ridge penalty. Non-asymptotic deviation bounds on the in-sample prediction error are established. Our second step estimates the row space of B^* by leveraging the covariance structure of the residual vector from the first step. In the last step, we remove the effect of the hidden variables by projecting Y onto the complement of the estimated row space of B^*. Non-asymptotic error bounds for our final estimator are established. The model identifiability, parameter estimation, and statistical guarantees are further extended to the setting with heteroscedastic errors.
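The three steps above can be sketched in a small numpy simulation. Everything here is illustrative rather than the paper's actual procedure: all dimensions, penalty choices, and variable names are made up, and step 1 substitutes plain least squares for the group-lasso-plus-ridge fit described in the abstract. The hidden-variable coefficient matrix is generated to satisfy an orthogonality-style identifiability condition so that the projection step recovers the target.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m, K = 1000, 5, 8, 2  # sample size and dimensions (all illustrative)

# Synthetic model:  Y = Theta^T X + B^T Z + E,  with Z correlated with X.
B = rng.normal(size=(K, m))
P_B = B.T @ np.linalg.solve(B @ B.T, B)      # projection onto row space of B
Theta0 = np.zeros((p, m))
Theta0[:2] = rng.normal(size=(2, m))         # row-sparse coefficient matrix
# Make Theta's rows orthogonal to the row space of B, a stand-in for the
# identifiability condition, so the projection step below targets Theta itself.
Theta = Theta0 @ (np.eye(m) - P_B)

X = rng.normal(size=(n, p))
Z = 0.5 * X[:, :K] + rng.normal(size=(n, K)) # hidden and correlated with X
Y = X @ Theta + Z @ B + 0.1 * rng.normal(size=(n, m))

# Step 1 (simplified): estimate the best linear predictor of Y given X.
# The paper penalizes Theta with a group lasso and the dense part with a
# multivariate ridge; ordinary least squares stands in for that fit here.
F_hat = np.linalg.lstsq(X, Y, rcond=None)[0]

# Step 2: the residual covariance is dominated by B^T B, so its top-K
# eigenvectors estimate the row space of B.
R = Y - X @ F_hat
evals, evecs = np.linalg.eigh(R.T @ R / n)   # eigenvalues in ascending order
V = evecs[:, -K:]                            # top-K eigenvectors

# Step 3: project Y onto the orthogonal complement of the estimated row
# space, removing the hidden-variable effect, then re-estimate Theta.
P_perp = np.eye(m) - V @ V.T
Theta_hat = np.linalg.lstsq(X, Y @ P_perp, rcond=None)[0]

rel_err = np.linalg.norm(Theta_hat - Theta) / np.linalg.norm(Theta)
print(f"relative estimation error: {rel_err:.3f}")
```

In this toy setup the hidden-variable contribution Z B would badly bias a naive regression of Y on X; projecting onto the complement of the estimated row space removes it, and the relative error of `Theta_hat` is small.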

