Feature Selection via Probabilistic Outputs

06/27/2012
by Andrea Danyluk, et al.

This paper investigates two feature-scoring criteria that make use of estimated class probabilities: one method proposed by Shen and a complementary approach proposed below. We develop a theoretical framework to analyze each criterion and show that both estimate the spread (across all values of a given feature) of the probability that an example belongs to the positive class. Based on our analysis, we predict when each scoring technique will be advantageous over the other and give empirical results validating our predictions.
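To make the underlying idea concrete: a feature is scored highly when the estimated probability of the positive class varies a lot across the feature's values. The sketch below is a generic illustration of that principle, not the paper's two criteria; the choice of empirical frequencies as the probability estimate and of max-minus-min as the spread measure are assumptions made here for simplicity.

```python
from collections import defaultdict

def probability_spread_score(feature_values, labels):
    """Score a discrete feature by the spread, across its values, of the
    estimated probability that an example belongs to the positive class.

    Illustrative only: uses raw empirical frequencies as the probability
    estimate and (max - min) as the spread measure, which are assumptions
    of this sketch rather than the criteria analyzed in the paper.
    """
    counts = defaultdict(lambda: [0, 0])  # feature value -> [positives, total]
    for v, y in zip(feature_values, labels):
        counts[v][0] += int(y == 1)
        counts[v][1] += 1
    probs = [pos / total for pos, total in counts.values()]
    return max(probs) - min(probs)

# A feature whose values separate the classes scores high;
# a feature independent of the label scores zero.
print(probability_spread_score([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0
print(probability_spread_score([0, 1, 0, 1], [0, 0, 1, 1]))  # 0.0
```

In this toy example, the first feature's values perfectly predict the class (the conditional positive-class probabilities are 0 and 1, so the spread is maximal), while the second feature gives probability 0.5 for every value, so the spread is zero.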


Related research

- Canonical-Correlation-Based Fast Feature Selection (06/15/2021): This paper proposes a canonical-correlation-based filter method for feat...
- Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring (02/12/2019): We consider the two-group classification problem and propose a kernel cl...
- Feature selection for classification with class-separability strategy and data envelopment analysis (05/06/2014): In this paper, a novel feature selection method is presented, which is b...
- Outlier Detection Ensemble with Embedded Feature Selection (01/15/2020): Feature selection plays an important role in improving the performance ...
- Converting College Football Point Spread Differentials to Probabilities (12/15/2022): For NCAA football, we provide a method for sports bettors to determine i...
- DimReduction - Interactive Graphic Environment for Dimensionality Reduction (05/26/2008): Feature selection is a pattern recognition approach to choose important ...
- Likelihood-free Model Choice for Simulator-based Models with the Jensen–Shannon Divergence (06/08/2022): Choice of appropriate structure and parametric dimension of a model in t...
