Improving feature selection algorithms using normalised feature histograms

02/05/2012
by Alex Pappachen James, et al.

The proposed feature selection method builds a histogram of the most stable features drawn from random subsets of a training set, and ranks the features using classifier-based cross-validation. This approach reduces the instability of features obtained by conventional feature selection methods, an instability that arises from variation in the training data and in the selection criteria. Classification results on four microarray and three image datasets, using three major feature selection criteria and a naive Bayes classifier, show considerable improvement over benchmark results.
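The histogram idea can be illustrated with a short sketch: repeatedly score features on random subsets of the training data, count how often each feature lands in the top-k, normalise the counts into a histogram, and evaluate the most stable features with naive Bayes cross-validation. This is an illustrative reconstruction, not the authors' implementation; the Fisher-style scoring criterion, the subset size, and all parameter values below are assumptions.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic two-class data: 200 samples, 50 features;
# only the first 5 features are informative (assumption for the demo).
n, d, k_top = 200, 50, 5
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d))
X[:, :k_top] += y[:, None] * 2.0  # shift informative features by class

def fisher_score(X, y):
    # Simple univariate criterion: between-class separation over
    # within-class spread (one of many possible selection criteria).
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    v0, v1 = X[y == 0].var(0), X[y == 1].var(0)
    return (m0 - m1) ** 2 / (v0 + v1 + 1e-12)

# Normalised histogram: how often each feature ranks in the top-k
# across random half-sized subsets of the training set.
n_subsets = 100
hist = np.zeros(d)
for _ in range(n_subsets):
    idx = rng.choice(n, size=n // 2, replace=False)
    top = np.argsort(fisher_score(X[idx], y[idx]))[::-1][:k_top]
    hist[top] += 1
hist /= n_subsets

# Keep the most stable features and score them with a naive Bayes
# classifier under cross-validation.
selected = np.argsort(hist)[::-1][:k_top]
acc = cross_val_score(GaussianNB(), X[:, selected], y, cv=5).mean()
print(sorted(selected.tolist()), round(acc, 3))
```

Because the histogram aggregates rankings over many resampled subsets, a feature must score well consistently (not just on one lucky split) to be selected, which is the stability property the abstract emphasises.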


Related research

06/12/2015 – Search Strategies for Binary Feature Selection for a Naive Bayes Classifier
We compare in this paper several feature selection methods for the Naive...

10/26/2020 – Fast-Ensembles of Minimum Redundancy Feature Selection
Finding relevant subspaces in very high-dimensional data is a challengin...

11/30/2020 – Utilizing stability criteria in choosing feature selection methods yields reproducible results in microbiome data
Feature selection is indispensable in microbiome data analysis, but it c...

05/28/2019 – Efficient Wrapper Feature Selection using Autoencoder and Model Based Elimination
We propose a computationally efficient wrapper feature selection method ...

01/15/2020 – Outlier Detection Ensemble with Embedded Feature Selection
Feature selection plays an important role in improving the performance ...

09/27/2020 – RENT – Repeated Elastic Net Technique for Feature Selection
In this study we present the RENT feature selection method for binary cl...

02/04/2018 – Heuristic Feature Selection for Clickbait Detection
We study feature selection as a means to optimize the baseline clickbait...
