Sparse-Input Neural Network using Group Concave Regularization

07/01/2023 · by Bin Luo, et al.

Simultaneous feature selection and non-linear function estimation are challenging, especially in high-dimensional settings where the number of variables exceeds the available sample size. In this article, we investigate the problem of feature selection in neural networks. Although the group LASSO has been utilized to select variables for learning with neural networks, it tends to select unimportant variables into the model to compensate for its over-shrinkage. To overcome this limitation, we propose a framework of sparse-input neural networks using group concave regularization for feature selection in both low-dimensional and high-dimensional settings. The main idea is to apply a proper concave penalty to the l_2 norm of the weights on all outgoing connections of each input node, thereby obtaining a neural network that uses only a small subset of the original variables. In addition, to tackle the challenge of complex optimization landscapes, we develop an effective algorithm based on backward path-wise optimization that yields stable solution paths. Our extensive simulation studies and real data examples demonstrate satisfactory finite-sample performance of the proposed estimator in feature selection and prediction for modeling continuous, binary, and time-to-event outcomes.
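To make the main idea concrete, here is a minimal sketch of a group concave penalty of the kind the abstract describes: a concave penalty (the MCP is used here purely as an illustrative choice; the paper's exact penalty and hyperparameters are not specified in this abstract) applied to the l_2 norm of each input node's outgoing first-layer weights. The function names and the weight-matrix layout are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def mcp(t, lam, gamma):
    """Minimax concave penalty (MCP) evaluated at t >= 0.

    rho(t) = lam * t - t^2 / (2 * gamma)   for t <= gamma * lam,
           = gamma * lam^2 / 2             otherwise.
    The penalty flattens out for large t, which avoids the
    over-shrinkage of the (group) LASSO.
    """
    t = np.asarray(t, dtype=float)
    return np.where(
        t <= gamma * lam,
        lam * t - t**2 / (2.0 * gamma),
        0.5 * gamma * lam**2,
    )

def group_concave_penalty(W, lam=0.1, gamma=3.0):
    """Sum of the MCP applied to the l_2 norm of the outgoing
    weights of each input node.

    W : array of shape (n_inputs, n_hidden), the first-layer
        weight matrix; row i holds all outgoing weights of
        input variable i, so a zero row drops that variable.
    """
    group_norms = np.linalg.norm(W, axis=1)  # one l_2 norm per input
    return float(mcp(group_norms, lam, gamma).sum())
```

In training, this quantity would be added to the data-fitting loss; driving a group norm to zero removes the corresponding input variable from the network entirely, which is what yields a sparse-input model.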

