Powershap: A Power-full Shapley Feature Selection Method

06/16/2022
by Jarne Verhaeghe, et al.

Feature selection is a crucial step in developing robust and powerful machine learning models. Feature selection techniques can be divided into two categories: filter and wrapper methods. While wrapper methods commonly result in strong predictive performance, they suffer from high computational complexity and therefore take a significant amount of time to complete, especially when dealing with high-dimensional feature sets. Alternatively, filter methods are considerably faster, but suffer from several other disadvantages, such as (i) requiring a threshold value, (ii) not taking into account intercorrelation between features, and (iii) ignoring feature interactions with the model. To this end, we present powershap, a novel wrapper feature selection method that leverages statistical hypothesis testing and power calculations in combination with Shapley values for quick and intuitive feature selection. Powershap is built on the core assumption that an informative feature will have a larger impact on the prediction than a known random feature. Benchmarks and simulations show that powershap outperforms other filter methods, with predictive performance on par with wrapper methods while being significantly faster, often requiring only half to a third of their execution time. As such, powershap provides a competitive and fast algorithm that can be used with various models in different domains. Furthermore, powershap is implemented as a plug-and-play, open-source sklearn component, enabling easy integration in conventional data science pipelines. User experience is further enhanced by an automatic mode that tunes the hyper-parameters of the powershap algorithm, allowing it to be used without any manual configuration.
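The core assumption lends itself to a compact illustration. The sketch below is a simplified, hypothetical re-implementation of that idea using the shap and scikit-learn libraries rather than the powershap package itself: a known uniform-random feature is appended to the data, a model is trained over several iterations, and a feature is kept only if its mean absolute Shapley value consistently exceeds that of the random feature. The model choice, iteration count, and the empirical p-value are illustrative assumptions; the actual algorithm additionally uses power calculations to decide how many iterations are required.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split


def powershap_sketch(X, y, n_iterations=10, alpha=0.01):
    """Simplified powershap-style selection: return indices of features whose
    mean |SHAP| value consistently beats an injected random feature.

    X is assumed to be a 2D numpy array and y a binary target; these, like the
    GradientBoostingClassifier and the empirical p-value below, are
    illustrative choices, not the powershap implementation.
    """
    rng = np.random.default_rng(42)
    n_features = X.shape[1]
    real_impacts = np.empty((n_iterations, n_features))
    random_impacts = np.empty(n_iterations)

    for it in range(n_iterations):
        # Append a known uniform-random reference feature.
        X_aug = np.column_stack([X, rng.uniform(size=len(X))])
        X_tr, X_val, y_tr, y_val = train_test_split(
            X_aug, y, test_size=0.2, random_state=it
        )
        model = GradientBoostingClassifier(random_state=it).fit(X_tr, y_tr)

        # Mean absolute Shapley value per feature on the held-out split
        # (binary classification, so TreeExplainer yields one value per feature).
        shap_values = shap.TreeExplainer(model).shap_values(X_val)
        impact = np.abs(shap_values).mean(axis=0)
        real_impacts[it] = impact[:n_features]
        random_impacts[it] = impact[n_features]

    # Simplified empirical p-value: fraction of iterations in which the random
    # feature matched or exceeded the real feature's impact.
    p_values = (real_impacts <= random_impacts[:, None]).mean(axis=0)
    return np.where(p_values <= alpha)[0]


if __name__ == "__main__":
    from sklearn.datasets import make_classification

    # Synthetic example: 20 features, of which 5 are informative.
    X, y = make_classification(
        n_samples=500, n_features=20, n_informative=5, random_state=0
    )
    print("Selected feature indices:", powershap_sketch(X, y))
```

The released powershap package wraps this kind of procedure in a standard sklearn selector, so it can be used with fit/transform inside a conventional Pipeline as described in the abstract.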

