Unsupervised Hypergraph Feature Selection via a Novel Point-Weighting Framework and Low-Rank Representation

08/25/2018
by   Ammar Gilani, et al.

Feature selection methods are widely used to address the 'curse of dimensionality'. Many proposed feature selection frameworks treat all data points equally, neglecting their differing representation power and importance. In this paper, we propose an unsupervised hypergraph feature selection method built on a novel point-weighting framework and low-rank representation that captures the importance of individual data points. We introduce a novel soft hypergraph with low complexity to model the data, and then formulate feature selection as an optimization problem that preserves both the local relationships and the global structure of the data. Our approach to global structure preservation helps the framework cope with the unavailability of data labels in unsupervised learning. The proposed method treats data points differently according to their importance in defining the data structure and their representation power. Moreover, since robustness against noise and outliers is of great importance for feature selection, we adopt low-rank representation in our model. We also provide an efficient algorithm to solve the proposed optimization problem; its computational cost is lower than that of many state-of-the-art methods, which matters in feature selection tasks. We conducted comprehensive experiments with various evaluation methods on different benchmark data sets. These experiments indicate significant improvement over state-of-the-art feature selection methods.
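To make the "preserve local relationships" idea in the abstract concrete, the sketch below implements one classical graph-based unsupervised criterion, the Laplacian score, which ranks features by how smoothly they vary over a k-nearest-neighbor similarity graph. This is a minimal illustration of the general graph-based family, not the paper's hypergraph point-weighting method; the function name, the Gaussian-kernel affinity, and the parameters `k` and `sigma` are all choices made for this example.

```python
import numpy as np

def laplacian_score(X, k=5, sigma=1.0):
    """Rank features (columns of X) by local-structure preservation.

    Lower score = the feature varies smoothly over the kNN graph,
    i.e. it better preserves local relationships among data points.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between all points.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbor affinity with a Gaussian (heat) kernel.
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sq[i])[1:k + 1]      # skip the point itself
        W[i, nbrs] = np.exp(-sq[i, nbrs] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                     # symmetrize the graph
    d = W.sum(1)                               # vertex degrees
    L = np.diag(d) - W                         # graph Laplacian
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        f = X[:, j]
        f = f - (f @ d) / d.sum()              # remove the trivial (constant) component
        denom = f @ (d * f)
        scores[j] = (f @ L @ f) / denom if denom > 0 else np.inf
    return scores

# Toy usage: one feature encodes two clusters, the other is pure noise.
rng = np.random.default_rng(0)
cluster = np.repeat([0.0, 5.0], 30)
X = np.column_stack([cluster + 0.1 * rng.standard_normal(60),
                     rng.standard_normal(60)])
scores = laplacian_score(X)
# the cluster-aligned feature (column 0) should receive the lower score
```

Methods like the one in this paper differ from this baseline in, among other things, weighting points unequally and using a (soft) hypergraph rather than a pairwise graph, but the underlying objective of selecting features consistent with the neighborhood structure is the same.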


