Feature Importance Measurement based on Decision Tree Sampling

07/25/2023
by Chao Huang, et al.

Random forest is effective for prediction tasks, but the randomness of tree generation hinders the interpretability of feature importance analysis. To address this, we propose DT-Sampler, a SAT-based method for measuring feature importance in tree-based models. Our method has fewer parameters than random forest and provides higher interpretability and stability for feature importance analysis in real-world problems. An implementation of DT-Sampler is available at https://github.com/tsudalab/DT-sampler.
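As a rough illustration of frequency-based feature importance over a set of sampled trees, the sketch below scores each feature by the fraction of sampled trees whose splits use it. The SAT-based tree sampling at the core of DT-Sampler is not reproduced here; as a stand-in assumption, trees are drawn by fitting shallow scikit-learn decision trees to bootstrap resamples, so the sampling distribution differs from the paper's method.

```python
# Minimal sketch: feature importance as appearance frequency across sampled trees.
# NOTE: DT-Sampler samples trees with a SAT solver (see the linked repository);
# bootstrap-refitted scikit-learn trees below are an illustrative assumption only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
n_trees, n_features = 200, X.shape[1]
counts = np.zeros(n_features)

for _ in range(n_trees):
    idx = rng.integers(0, len(X), size=len(X))   # bootstrap resample
    tree = DecisionTreeClassifier(max_depth=3,
                                  random_state=int(rng.integers(1 << 31)))
    tree.fit(X[idx], y[idx])
    # Internal split nodes have feature index >= 0; leaves are marked -2.
    used = np.unique(tree.tree_.feature[tree.tree_.feature >= 0])
    counts[used] += 1

importance = counts / n_trees  # fraction of sampled trees that split on each feature
for f in np.argsort(importance)[::-1][:5]:
    print(f"feature {f}: appears in {importance[f]:.2f} of sampled trees")
```

Because each score is an average over many sampled trees rather than a property of one random fit, it is stable across runs, which is the kind of stability the abstract claims for DT-Sampler's principled tree sampling.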

