Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders

12/01/2020
by   Zahra Atashgahi, et al.

Major complications arise from the recent increase in the amount of high-dimensional data, including high computational costs and memory requirements. Feature selection, which identifies the most relevant and informative attributes of a dataset, has been introduced as a solution to this problem. Most existing feature selection methods are computationally inefficient, and inefficient algorithms lead to high energy consumption, which is undesirable for devices with limited computational and energy resources. In this paper, a novel and flexible method for unsupervised feature selection is proposed. This method, named QuickSelection, introduces the strength of a neuron in sparse neural networks as a criterion for measuring feature importance. This criterion, combined with sparsely connected denoising autoencoders trained with the sparse evolutionary training procedure, derives the importance of all input features simultaneously. We implement QuickSelection in a purely sparse manner, as opposed to the typical approach of using a binary mask over connections to simulate sparsity. This results in a considerable speed increase and memory reduction. When tested on several benchmark datasets, including five low-dimensional and three high-dimensional datasets, the proposed method achieves the best trade-off among classification and clustering accuracy, running time, and maximum memory usage, compared with widely used approaches for feature selection. Moreover, our proposed method requires the least amount of energy among the state-of-the-art autoencoder-based feature selection methods.
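The core criterion described above, neuron strength, can be sketched in a few lines: an input feature's importance is taken as the aggregate magnitude of its surviving connections into the (sparse) first hidden layer, and the top-k strongest features are selected. This is a minimal illustration of the idea, not the paper's implementation; the function names (`neuron_strength`, `quickselect_features`) and the toy dense matrix with zeroed-out entries standing in for a truly sparse layer are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def neuron_strength(w):
    # Strength of input neuron i = sum of the absolute weights of its
    # outgoing connections (row i of the input-to-hidden weight matrix).
    return np.abs(w).sum(axis=1)

def quickselect_features(w, k):
    # Rank input features by neuron strength and keep the k strongest.
    strength = neuron_strength(w)
    return np.argsort(strength)[::-1][:k]

# Toy "sparse" input-to-hidden matrix: 6 input features, 4 hidden units,
# with ~70% of the connections pruned (set to zero).
w = rng.normal(size=(6, 4))
w[rng.random(w.shape) < 0.7] = 0.0

top2 = quickselect_features(w, k=2)
```

In the paper's setting the weight matrix comes from a denoising autoencoder trained with sparse evolutionary training, so the surviving connections (and hence the strengths) reflect which inputs the network found useful for reconstruction; the ranking itself is this simple row-sum.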


Related research

03/10/2023 — Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
Feature selection that selects an informative subset of variables from d...

11/26/2022 — Where to Pay Attention in Sparse Training for Feature Selection?
A new line of research for feature selection based on neural networks ha...

10/28/2022 — End-to-end Ensemble-based Feature Selection for Paralinguistics Tasks
The events of recent years have highlighted the importance of telemedici...

12/17/2022 — An Evolutionary Multitasking Algorithm with Multiple Filtering for High-Dimensional Feature Selection
Recently, evolutionary multitasking (EMT) has been successfully used in ...

06/25/2020 — Stochastic Subset Selection
Current machine learning algorithms are designed to work with huge volum...

09/03/2023 — Carbon Emission Prediction and Clean Industry Transformation Based on Machine Learning: A Case Study of Sichuan Province
This study preprocessed 2000-2019 energy consumption data for 46 key Sic...

03/22/2021 — Feature Selection for Imbalanced Data with Deep Sparse Autoencoders Ensemble
Class imbalance is a common issue in many domain applications of learnin...
