Projected Neural Network for a Class of Sparse Regression with Cardinality Penalty
In this paper, we consider a class of sparse regression problems whose objective function is the sum of a convex loss function and a cardinality penalty. By constructing a smoothing function for the cardinality function, we propose a projected neural network and design a correction method for solving this problem. The solution of the proposed neural network is unique, globally existent, bounded, and globally Lipschitz continuous. Moreover, we prove that all accumulation points of the proposed neural network share a common support set and a unified lower bound on their nonzero entries. Combining the proposed neural network with the correction method, any corrected accumulation point is a local minimizer of the considered sparse regression problem. We also analyze the equivalence of local minimizers between the considered sparse regression problem and another sparse regression problem. Finally, numerical experiments are provided to show the efficiency of the proposed neural network in solving some sparse regression problems in practice.
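To make the overall scheme concrete, the following is a minimal sketch of a projected gradient flow applied to a smoothed cardinality-penalized least-squares problem. The specific smooth surrogate (a rational approximation t^2/(t^2 + mu) of the indicator |t|_0), the box feasible set, the squared loss, the forward-Euler discretization, and the thresholding correction step are all illustrative assumptions, not the construction or correction method proposed in the paper.

```python
import numpy as np

# Illustrative sketch only: the smoothing function, projection set, step size,
# and correction rule below are assumptions, not the paper's construction.

def smoothed_cardinality(x, mu=1e-2):
    """Smooth surrogate of ||x||_0: each term t^2 / (t^2 + mu) approaches 1 as |t| grows."""
    return np.sum(x**2 / (x**2 + mu))

def grad_smoothed_cardinality(x, mu=1e-2):
    """Gradient of the smooth surrogate above."""
    return 2 * mu * x / (x**2 + mu)**2

def project_box(x, radius=10.0):
    """Projection onto the box [-radius, radius]^n (assumed feasible set)."""
    return np.clip(x, -radius, radius)

def projected_network_flow(A, b, lam=0.5, mu=1e-2, step=1e-3, iters=20000):
    """Forward-Euler discretization of the projected gradient flow
       dx/dt = P_Omega(x - grad F_mu(x)) - x
       for F_mu(x) = 0.5*||Ax - b||^2 + lam * smoothed_cardinality(x)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + lam * grad_smoothed_cardinality(x, mu)
        x = x + step * (project_box(x - grad) - x)
    return x

def correction(x, threshold=1e-3):
    """Assumed correction step: zero out entries with small magnitude."""
    x = x.copy()
    x[np.abs(x) < threshold] = 0.0
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 50, 200, 5
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true + 0.01 * rng.standard_normal(m)
    x_hat = correction(projected_network_flow(A, b))
    print("recovered support:", np.nonzero(x_hat)[0])
```

In this sketch, the continuous-time dynamics are approximated by a fixed-step Euler scheme and the corrected output simply truncates small entries; the paper's analysis instead establishes the support and lower-bound properties of the network's accumulation points directly.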