Strong consistency of the AIC, BIC, C_p and KOO methods in high-dimensional multivariate linear regression

10/30/2018
by Zhidong Bai, et al.

Variable selection is essential for improving inference and interpretation in multivariate linear regression. Although a number of alternative regressor selection criteria have been suggested, the most prominent and widely used are the Akaike information criterion (AIC), Bayesian information criterion (BIC), Mallows' C_p, and their modifications. However, for high-dimensional data, experience has shown that the performance of these classical criteria is not always satisfactory. In the present article, we begin by presenting necessary and sufficient conditions (NSC) for the strong consistency of the high-dimensional AIC, BIC, and C_p, from which some reasons for their poor high-dimensional performance can be identified. Specifically, we show that under certain mild high-dimensional conditions, if the BIC is strongly consistent, then the AIC is strongly consistent, but not vice versa. This result reverses the classical, fixed-dimension understanding, in which the BIC is consistent while the AIC is not. In addition, we establish NSC for the strong consistency of the high-dimensional kick-one-out (KOO) methods introduced by Zhao et al. (1986) and Nishii et al. (1988). Furthermore, we propose two general methods based on the KOO methods and prove their strong consistency. The proposed general methods dispense with penalty terms while weakening the conditions on the dimensions and sizes of the regressors. A simulation study supports our consistency conclusions and shows that the convergence rates of the two proposed general KOO methods are much faster than those of the original methods.
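To fix ideas, the following Python sketch (not from the paper; the function names, the parameter-count convention, and the threshold choice are all illustrative assumptions) computes the Gaussian AIC/BIC of a candidate submodel in a multivariate linear regression and a kick-one-out style statistic of the kind the KOO methods compare against a threshold:

```python
import numpy as np

def rss(Y, X):
    """Residual cross-product matrix R'R for the least-squares fit Y ~ X."""
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    R = Y - X @ B
    return R.T @ R

def aic_bic(Y, X, cols):
    """Gaussian AIC/BIC of the submodel using regressor columns `cols`.

    Counts only the p * k_j regression parameters; additive constants shared
    by all submodels are dropped, as they do not affect the selection.
    """
    n, p = Y.shape
    logdet = np.linalg.slogdet(rss(Y, X[:, cols]) / n)[1]  # log |Sigma_hat_j|
    npar = p * len(cols)
    return n * logdet + 2 * npar, n * logdet + np.log(n) * npar

def koo_select(Y, X, threshold):
    """KOO-style selection: keep regressor j iff kicking it out inflates
    the generalized residual variance by more than `threshold`."""
    n, k = X.shape
    full = np.linalg.slogdet(rss(Y, X))[1]
    keep = []
    for j in range(k):
        drop_j = np.linalg.slogdet(rss(Y, np.delete(X, j, axis=1)))[1]
        if drop_j - full > threshold:  # KOO statistic for regressor j
            keep.append(j)
    return keep

# Toy usage: only the first 4 of 10 regressors carry signal.
rng = np.random.default_rng(0)
n, k, p = 200, 10, 3
X = rng.standard_normal((n, k))
B = np.zeros((k, p)); B[:4] = 1.0
Y = X @ B + rng.standard_normal((n, p))
print(koo_select(Y, X, threshold=p * np.log(n) / n))  # illustrative threshold
```

One practical point the sketch makes visible: an exhaustive AIC/BIC search over k candidate regressors requires fitting up to 2^k submodels, whereas the KOO comparison needs only the full model and k leave-one-out fits, which is part of why KOO-type procedures remain tractable when k grows with n. The thresholds that guarantee strong consistency are derived in the paper; the one used above is purely illustrative.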


