Sequential Feature Classification in the Context of Redundancies

04/01/2020
by Lukas Pfannschmidt, et al.

The problem of all-relevant feature selection is concerned with finding a relevant feature set while preserving redundancies. Several approximate solutions to this problem exist, but only one of them can distinguish between strongly and weakly relevant features, and that approach is limited to linear problems. In this work, we present a new solution that makes this distinction in the non-linear case, using random forest models and statistical methods.
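The strong/weak distinction follows the classical relevance definitions: a feature is strongly relevant if no other feature can substitute for it, and weakly relevant if it is redundant yet still informative in some context. The sketch below illustrates that distinction with a drop-column test on synthetic data; it is only a minimal illustration, using a linear surrogate model and an ad-hoc 0.05 threshold for brevity (the paper itself employs random forests and statistical tests for the non-linear case), and all variable names are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
s = rng.normal(size=n)                  # hidden signal shared by two features
x0 = rng.normal(size=n)                 # strongly relevant: carries unique information
x1 = s + 0.01 * rng.normal(size=n)      # weakly relevant: redundant copy of s
x2 = s + 0.01 * rng.normal(size=n)      # weakly relevant: redundant copy of s
x3 = rng.normal(size=n)                 # irrelevant noise
X = np.column_stack([x0, x1, x2, x3])
y = (x0 + s > 0).astype(float)

def accuracy(X, y):
    """Fit a ridge model on (X, y) and return its 0/1 accuracy (linear surrogate)."""
    Xb = np.column_stack([X, np.ones(len(X))])
    w = np.linalg.solve(Xb.T @ Xb + 1e-3 * np.eye(Xb.shape[1]), Xb.T @ y)
    return float(np.mean((Xb @ w > 0.5) == y))

base = accuracy(X, y)
majority = max(y.mean(), 1 - y.mean())  # accuracy of always guessing the majority class
labels = {}
for j in range(X.shape[1]):
    # Strong relevance: accuracy drops when the feature is removed and the model
    # is retrained, i.e. no other feature can substitute for it.
    drop = base - accuracy(np.delete(X, j, axis=1), y)
    if drop > 0.05:
        labels[j] = "strong"
    # Weak relevance: removal is harmless (a redundant partner covers for it),
    # but the feature still carries signal in some context -- here, on its own.
    elif accuracy(X[:, [j]], y) - majority > 0.05:
        labels[j] = "weak"
    else:
        labels[j] = "irrelevant"

print(labels)  # expect {0: 'strong', 1: 'weak', 2: 'weak', 3: 'irrelevant'}
```

Note that a plain permutation-importance test on the full model would not separate weak from irrelevant features here, since the model can lean on the redundant partner; the drop-column retraining step is what exposes the redundancy.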
