High-dimensional variable selection via low-dimensional adaptive learning

04/17/2019
by Christian Staerk, et al.

A stochastic search method, the so-called Adaptive Subspace (AdaSub) method, is proposed for variable selection in high-dimensional linear regression models. The method aims to find the best model with respect to a given model selection criterion and is based on the idea of adaptively solving low-dimensional sub-problems in order to provide a solution to the original high-dimensional problem. Any of the usual ℓ_0-type model selection criteria can be used, such as Akaike's Information Criterion (AIC), the Bayesian Information Criterion (BIC) or the Extended BIC (EBIC), with the last being particularly suitable for high-dimensional settings. The limiting properties of the new algorithm are analysed and it is shown that, under certain conditions, AdaSub converges to the best model according to the considered criterion. In a simulation study, the performance of AdaSub is compared with that of alternative methods. The effectiveness of the proposed method is illustrated on various simulated datasets and on a high-dimensional real data example.
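The core idea of adaptively solving low-dimensional sub-problems can be sketched in a few lines. The toy Python implementation below is an illustrative simplification, not the paper's algorithm: it samples small subspaces of variables according to per-variable inclusion probabilities, solves each low-dimensional sub-problem exactly by best-subset search under BIC (one of the ℓ_0-type criteria mentioned above), and reinforces the probabilities of variables that get selected. The specific probability-update rule and the parameter names `q` (expected subspace size) and `K` (learning rate) are assumptions chosen for the sketch.

```python
import itertools
import numpy as np

def bic(X, y, subset):
    """BIC of an OLS fit on the given variable subset (smaller is better)."""
    n = len(y)
    subset = list(subset)
    if subset:
        Xs = X[:, subset]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        resid = y - Xs @ beta
    else:
        resid = y
    rss = float(resid @ resid)
    return n * np.log(rss / n) + len(subset) * np.log(n)

def adasub(X, y, q=5, K=2, n_iter=100, seed=0):
    """Toy adaptive-subspace search (simplified stand-in for AdaSub):
    sample low-dimensional subspaces, solve each sub-problem exactly,
    and adapt the sampling probabilities toward selected variables."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    r = np.full(p, q / p)                 # adaptive inclusion probabilities
    best_subset, best_score = frozenset(), bic(X, y, [])
    for t in range(1, n_iter + 1):
        V = np.flatnonzero(rng.random(p) < r)   # sampled low-dim subspace
        if V.size == 0 or V.size > 10:          # keep exhaustive search cheap
            continue
        # exact best-subset search within the sampled subspace
        sub_best, sub_score = frozenset(), bic(X, y, [])
        for k in range(1, V.size + 1):
            for S in itertools.combinations(V, k):
                s = bic(X, y, S)
                if s < sub_score:
                    sub_best, sub_score = frozenset(S), s
        # adaptive update (assumed rule): reinforce selected variables
        for j in V:
            r[j] = (t * r[j] + K * (j in sub_best)) / (t + K)
        r = np.clip(r, 1e-3, 1 - 1e-3)
        if sub_score < best_score:
            best_subset, best_score = sub_best, sub_score
    return sorted(int(j) for j in best_subset), best_score
```

On a sparse signal with a few strong coefficients, the probabilities of the relevant variables are quickly pushed up, so the small sampled subspaces soon contain the true support and the exhaustive sub-problem search identifies it; the actual AdaSub update and its convergence guarantees are given in the paper.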
