Feature Selection from High-Dimensional Data with Very Low Sample Size: A Cautionary Tale

08/27/2020
by Ludmila I. Kuncheva et al.

In classification problems, the purpose of feature selection is to identify a small, highly discriminative subset of the original feature set. In many applications, the dataset may have thousands of features but only a few dozen samples (such data are sometimes termed 'wide'). This study is a cautionary tale demonstrating why feature selection in such cases may lead to undesirable results. To highlight the sample-size issue, we derive the sample size required to declare two features different. Using an example, we illustrate the heavy dependency between feature set and classifier, which calls classifier-agnostic feature selection methods into question. However, the choice of a good selector-classifier pair is hampered by the low correlation between estimated and true error rate, as illustrated by another example. While previous studies raising similar issues validate their message mostly with synthetic data, here we carried out an experiment with 20 real datasets. We created an exaggerated scenario whereby we cut a very small portion of the data (10 instances per class) for feature selection and used the rest of the data for testing. The results reinforce the caution and suggest that it may be better to refrain from feature selection on very wide datasets than to return misleading output to the user.
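
The "exaggerated scenario" described above can be sketched in a few lines. The snippet below is a hypothetical illustration, not the authors' code: it uses synthetic 'wide' data (the paper uses 20 real datasets) and assumes scikit-learn's univariate F-test filter (SelectKBest with f_classif) and a 3-nearest-neighbour classifier as stand-ins for the selector and classifier. Only 10 instances per class are used for feature selection and training; everything else is held out for testing.

# Hypothetical sketch (not the authors' code) of the protocol described in the abstract:
# keep 10 instances per class for feature selection and training, test on the rest,
# and compare against training on all features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic 'wide' data: many features, few of them informative.
X, y = make_classification(n_samples=400, n_features=2000, n_informative=10,
                           n_redundant=0, random_state=0)

# 10 instances per class for selection/training; the remainder is the test set.
train_idx = np.hstack([rng.choice(np.where(y == c)[0], size=10, replace=False)
                       for c in np.unique(y)])
test_idx = np.setdiff1d(np.arange(len(y)), train_idx)
X_tr, y_tr, X_te, y_te = X[train_idx], y[train_idx], X[test_idx], y[test_idx]

# Classifier-agnostic filter: keep the 20 highest-scoring features (univariate F-test).
selector = SelectKBest(f_classif, k=20).fit(X_tr, y_tr)
clf_sel = KNeighborsClassifier(n_neighbors=3).fit(selector.transform(X_tr), y_tr)
clf_all = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)

print("with feature selection   :",
      accuracy_score(y_te, clf_sel.predict(selector.transform(X_te))))
print("without feature selection:",
      accuracy_score(y_te, clf_all.predict(X_te)))

With so few training instances, the selected subset varies strongly with the random split, which is the instability the paper cautions against.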
