Feature selection in machine learning: Rényi min-entropy vs Shannon entropy

01/27/2020
by Catuscia Palamidessi, et al.

Feature selection, in the context of machine learning, is the process of separating the highly predictive features from those that might be irrelevant or redundant. Information theory has been recognized as a useful framework for this task, since predictive power stems from the correlation, i.e., the mutual information, between features and labels. Many feature selection algorithms in the literature have adopted Shannon-entropy-based mutual information. In this paper, we explore the possibility of using Rényi min-entropy instead. In particular, we propose an algorithm based on a notion of conditional Rényi min-entropy that has recently been adopted in the field of security and privacy, and which is strictly related to the Bayes error. We prove that in general the two approaches are incomparable: there exist datasets on which the Rényi-based algorithm performs better than the corresponding Shannon-based one, and datasets on which the situation is reversed. In practice, however, on real data the Rényi-based algorithm tends to outperform the other one. We have performed several experiments on the BASEHOCK, SEMEION, and GISETTE datasets, and in all of them the Rényi-based algorithm indeed gave better results.
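For reference, the notion of conditional Rényi min-entropy used in the security and privacy literature (which, to our reading, is the one the abstract refers to) is defined for discrete X and Y as

    H_\infty(Y) = -\log_2 \max_y p(y)
    H_\infty(Y \mid X) = -\log_2 \sum_x \max_y p(x, y)
    I_\infty(X; Y) = H_\infty(Y) - H_\infty(Y \mid X)

The connection to the Bayes error P_e of predicting Y from X is immediate: P_e = 1 - 2^{-H_\infty(Y \mid X)}, so maximizing I_\infty between a candidate feature set and the labels directly targets the error of the optimal classifier on those features.

The following is a minimal sketch of how such a criterion can drive greedy forward feature selection. It is not a reproduction of the paper's algorithm, whose details are in the full text; the function names (min_entropy, cond_min_entropy, renyi_mi, greedy_select) are illustrative, and the joint distribution is estimated by simple counting, which assumes discrete (or discretized) features.

```python
import numpy as np
from collections import Counter

def min_entropy(labels):
    """H_inf(Y) = -log2 max_y p(y), with p estimated from counts."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    return -np.log2(counts.max() / counts.sum())

def cond_min_entropy(features, labels):
    """H_inf(Y|X) = -log2 sum_x max_y p(x, y).

    `features` is a sequence of hashable values (tuples of the
    selected feature values); `labels` the corresponding classes.
    """
    joint = Counter(zip(features, labels))
    best = {}  # for each x, the joint mass of its most likely label
    for (x, y), c in joint.items():
        best[x] = max(best.get(x, 0), c)
    return -np.log2(sum(best.values()) / len(labels))

def renyi_mi(features, labels):
    """I_inf(X; Y) = H_inf(Y) - H_inf(Y|X)."""
    return min_entropy(labels) - cond_min_entropy(features, labels)

def greedy_select(X, y, k):
    """Forward selection: repeatedly add the feature that maximizes the
    Renyi min-entropy mutual information of the selected set with y."""
    selected = []
    for _ in range(k):
        best_j, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            feats = [tuple(row) for row in X[:, selected + [j]]]
            score = renyi_mi(feats, y)
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

Swapping renyi_mi for a Shannon mutual information estimator in greedy_select yields the corresponding Shannon-based selector, which is the comparison the experiments above concern.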


