Robust Multi-class Feature Selection via l_2,0-Norm Regularization Minimization

10/08/2020
by Zhenzhen Sun, et al.

Feature selection is an important data preprocessing step in data mining and machine learning, which can reduce the number of features without deteriorating the model's performance. Recently, sparse-regression-based feature selection methods have received considerable attention due to their good performance. However, these methods generally cannot determine the number of selected features automatically without a predefined threshold, so obtaining a satisfactory result often requires significant time and effort to tune the number of selected features carefully. To this end, this paper proposes a novel framework that directly solves the l_2,0-norm regularized least squares problem for multi-class feature selection. It produces an exactly row-sparse solution for the weight matrix; the features corresponding to non-zero rows are selected, so the number of selected features is determined automatically. An efficient homotopy iterative hard thresholding (HIHT) algorithm is derived to solve this optimization problem and find a stable local solution. In addition, to reduce the computational time of HIHT, an accelerated version of HIHT (AHIHT) is derived. Extensive experiments on eight biological datasets show that the proposed method achieves higher classification accuracy with the fewest selected features compared with approximate convex counterparts and state-of-the-art feature selection methods. The robustness of classification accuracy to the regularization parameter is also demonstrated.
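
For illustration, the objective behind the framework can be written as min_W ||XW - Y||_F^2 + lambda * ||W||_{2,0}, where the l_2,0 norm counts the non-zero rows of W. Below is a minimal Python sketch of a row-wise iterative hard thresholding loop for this objective; the function name hiht_sketch, the fixed step size, and the stopping rule are illustrative assumptions, not the authors' HIHT/AHIHT implementation (which additionally uses a homotopy strategy on the regularization parameter).

import numpy as np

def hiht_sketch(X, Y, lam, n_iters=200, tol=1e-6):
    # Hypothetical sketch: row-wise iterative hard thresholding for
    # min_W ||XW - Y||_F^2 + lam * ||W||_{2,0}. Not the authors' exact algorithm.
    n, d = X.shape
    c = Y.shape[1]
    # Step size from the Lipschitz constant of the gradient of the
    # least-squares term (2 * largest eigenvalue of X^T X).
    eta = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    W = np.zeros((d, c))
    for _ in range(n_iters):
        # Gradient step on the smooth least-squares term.
        G = 2.0 * X.T @ (X @ W - Y)
        V = W - eta * G
        # Row-wise hard thresholding: the proximal operator of
        # eta * lam * ||.||_{2,0} keeps a row only if its l2 norm
        # exceeds sqrt(2 * eta * lam), and zeros it out otherwise.
        row_norms = np.linalg.norm(V, axis=1)
        W_new = np.where((row_norms > np.sqrt(2.0 * eta * lam))[:, None], V, 0.0)
        if np.linalg.norm(W_new - W) < tol:
            W = W_new
            break
        W = W_new
    # Features corresponding to non-zero rows are the selected ones.
    selected = np.flatnonzero(np.linalg.norm(W, axis=1) > 0)
    return W, selected

Warm-starting such a loop across a decreasing sequence of threshold (or regularization) values is one common way to realize a homotopy scheme; the exact schedule and acceleration used in HIHT/AHIHT are given in the paper.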
