Provable Identifiability of Two-Layer ReLU Neural Networks via LASSO Regularization

05/07/2023 ∙ by Gen Li, et al.
LASSO regularization is a popular regression tool that enhances the prediction accuracy of statistical models by performing variable selection through the ℓ_1 penalty; it was initially formulated for the linear model and its variants. In this paper, the territory of LASSO is extended to two-layer ReLU neural networks, a popular and powerful class of nonlinear regression models. Specifically, given a neural network whose output y depends only on a small subset of the input x, denoted by 𝒮^⋆, we prove that the LASSO estimator can stably reconstruct the neural network and identify 𝒮^⋆ when the number of samples scales logarithmically with the input dimension. This challenging regime is well understood for linear models but has barely been studied for neural networks. Our theory rests on an extended Restricted Isometry Property (RIP)-based analysis framework for two-layer ReLU neural networks, which may be of independent interest in other LASSO or neural network settings. Building on this result, we advocate a neural network-based variable selection method. Experiments on simulated and real-world datasets show promising performance of the variable selection approach compared with existing techniques.
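The variable selection idea the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): train a two-layer ReLU network with an ℓ_1 penalty on the first-layer weights, then rank each input by the ℓ_1 norm of its outgoing weights. The data, network width, penalty strength, and step size below are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (hypothetical): y depends only on the first 3 of 20 inputs.
n, d, s = 200, 20, 3
X = rng.standard_normal((n, d))
y = np.maximum(X[:, 0] + 2 * X[:, 1] - X[:, 2], 0.0)

# Two-layer ReLU network: f(x) = relu(x @ W1) @ w2
h = 10
W1 = 0.1 * rng.standard_normal((d, h))
w2 = 0.1 * rng.standard_normal(h)

lam, lr = 1e-2, 1e-2  # l1 penalty weight and step size (illustrative values)
for _ in range(2000):
    Z = X @ W1                      # pre-activations, shape (n, h)
    A = np.maximum(Z, 0.0)          # ReLU activations
    r = A @ w2 - y                  # residuals
    gA = np.outer(r, w2) * (Z > 0)  # gradient w.r.t. pre-activations
    gW1 = X.T @ gA / n + lam * np.sign(W1)  # squared loss + LASSO subgradient
    gw2 = A.T @ r / n
    W1 -= lr * gW1
    w2 -= lr * gw2

# Score each input by the l1 norm of its outgoing first-layer weights;
# large scores indicate membership in the active set S*.
scores = np.abs(W1).sum(axis=1)
support = np.argsort(scores)[-s:]
print(sorted(support.tolist()))
```

Because the ℓ_1 penalty drives the rows of W1 corresponding to irrelevant inputs toward zero while the data-fitting term keeps the relevant rows large, the score gap between the two groups is what makes support identification possible.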

