Where to Pay Attention in Sparse Training for Feature Selection?

11/26/2022
by Ghada Sokar, et al.

A new line of research on feature selection based on neural networks has recently emerged. Despite its superiority to classical methods, it requires many training iterations to converge on the informative features, and the computational time becomes prohibitively long for datasets with many samples or a very high-dimensional feature space. In this paper, we present a new efficient unsupervised method for feature selection based on sparse autoencoders. In particular, we propose a new sparse training algorithm that optimizes a model's sparse topology during training so that it quickly pays attention to informative features. This attention-based adaptation of the sparse topology enables the informative features to be detected after only a few training iterations. We performed extensive experiments on 10 datasets of different types, including image, speech, text, artificial, and biological data. They cover a wide range of characteristics, such as low- and high-dimensional feature spaces and small and large numbers of training samples. Our proposed approach outperforms the state-of-the-art methods in selecting informative features while substantially reducing training iterations and computational cost. Moreover, the experiments show the robustness of our method in extremely noisy environments.
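The paper's exact training algorithm is not reproduced here, but the general idea it builds on, training a sparse autoencoder whose connectivity pattern is periodically pruned and regrown so that connections concentrate on informative inputs, can be illustrated with a minimal NumPy sketch. All names, hyperparameters, and the error-driven regrowth heuristic below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def sparse_ae_feature_selection(X, n_hidden=32, density=0.1, epochs=50,
                                lr=0.01, prune_frac=0.2, k=5, seed=0):
    """Illustrative sketch (not the paper's algorithm): train a one-hidden-layer
    sparse autoencoder with tied weights, periodically prune the weakest
    connections and regrow new ones at poorly reconstructed input features,
    then rank features by the strength of their surviving connections."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Sparse encoder weight matrix: dense values gated by a random binary mask.
    W = rng.normal(0.0, 0.1, (d, n_hidden))
    mask = rng.random((d, n_hidden)) < density
    W *= mask
    for _ in range(epochs):
        H = np.maximum(X @ W, 0.0)         # ReLU encoding
        X_hat = H @ W.T                     # tied-weight decoding
        err = X_hat - X
        # Gradient of 0.5 * ||X_hat - X||^2 w.r.t. W (encoder + decoder paths).
        grad = X.T @ ((err @ W) * (H > 0)) + err.T @ H
        W -= lr * grad / n
        W *= mask                           # keep the sparse topology
        # Topology adaptation: drop the weakest active connections and regrow
        # the same number, biased toward inputs with large reconstruction error.
        active = np.argwhere(mask)
        mags = np.abs(W[mask])
        n_drop = int(prune_frac * len(active))
        if n_drop > 0:
            drop = active[np.argsort(mags)[:n_drop]]
            mask[drop[:, 0], drop[:, 1]] = False
            W[drop[:, 0], drop[:, 1]] = 0.0
            feat_err = np.abs(err).mean(axis=0) + 1e-12  # per-feature error
            probs = feat_err / feat_err.sum()
            for _ in range(n_drop):
                i = rng.choice(d, p=probs)
                j = rng.integers(n_hidden)
                if not mask[i, j]:          # skip already-occupied slots
                    mask[i, j] = True
                    W[i, j] = rng.normal(0.0, 0.1)
    # Score each input feature by the summed magnitude of its connections.
    scores = np.abs(W).sum(axis=1)
    return np.argsort(scores)[::-1][:k]

if __name__ == "__main__":
    demo_rng = np.random.default_rng(1)
    # Features 0-2 carry most of the variance; the rest are near-constant noise.
    X = np.hstack([3.0 * demo_rng.normal(size=(100, 3)),
                   0.05 * demo_rng.normal(size=(100, 17))])
    print(sparse_ae_feature_selection(X, k=5))
```

In this sketch, connections to high-error inputs are regrown preferentially, so over the iterations the sparse topology accumulates connections on informative features, and the final per-feature connection strength serves as the selection score.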

