Splitting matters: how monotone transformation of predictor variables may improve the predictions of decision tree models

11/14/2016
by   Tal Galili, et al.

It is widely believed that the prediction accuracy of decision tree models is invariant under any strictly monotone transformation of the individual predictor variables. However, this statement may be false when predicting new observations whose values were not seen in the training set and are close to the location of the split point of a tree rule. The sensitivity of the prediction error to the split-point interpolation is high when the split point of the tree is estimated from very few observations, reaching a 9% misclassification error when only 10 observations are used for constructing a split, and shrinking to 1% as more observations are used. This study compares the performance of alternative methods for split-point interpolation and concludes that the best choice is taking the mid-point between the two points closest to the split point of the tree. Furthermore, if the (continuous) distribution of the predictor variable is known, then using its probability integral transform ("quantile transformation") will reduce the model's interpolation error by up to about a half on average. Accordingly, this study provides guidelines for both developers and users of decision tree models (including bagging and random forest).
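To make the interpolation issue concrete, here is a minimal sketch (not the authors' code; the function name `best_split_midpoint` and the toy data are my own) of a one-variable decision stump that places its threshold at the mid-point between the two adjacent training values, and of how a strictly monotone transformation of the predictor can change the prediction for a new observation that falls between training points:

```python
def best_split_midpoint(xs, ys):
    """Return the binary-split threshold on a 1-D predictor that minimises
    training misclassification, placing the threshold at the mid-point
    between the two adjacent training values."""
    pairs = sorted(zip(xs, ys))
    best_threshold, best_err = None, float("inf")
    for i in range(len(pairs) - 1):
        threshold = (pairs[i][0] + pairs[i + 1][0]) / 2.0  # mid-point interpolation
        left = [y for x, y in pairs if x <= threshold]
        right = [y for x, y in pairs if x > threshold]
        # misclassification count if each side predicts its majority class
        err = (min(left.count(0), left.count(1))
               + min(right.count(0), right.count(1)))
        if err < best_err:
            best_threshold, best_err = threshold, err
    return best_threshold

xs, ys = [1, 2, 8, 9], [0, 0, 1, 1]
t_raw = best_split_midpoint(xs, ys)                    # 5.0
t_sq = best_split_midpoint([x ** 2 for x in xs], ys)   # 34.0, i.e. about 5.83 on the raw scale

# A new observation at x = 5.5 -- unseen in training and close to the split --
# is routed differently by the two "equivalent" trees:
print(5.5 > t_raw)      # True  -> right branch (class 1)
print(5.5 ** 2 > t_sq)  # False -> left branch  (class 0)
```

Both trees fit the training data perfectly, yet they disagree on the new point, illustrating why monotone-invariance fails in the gap between observed values, and why the choice of interpolation rule (and the scale on which it is applied, e.g. after a quantile transformation) matters.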


research
08/09/2021

Collapsing the Decision Tree: the Concurrent Data Predictor

A family of concurrent data predictors is derived from the decision tree...
research
05/30/2022

bsnsing: A decision tree induction method based on recursive optimal boolean rule composition

This paper proposes a new mixed-integer programming (MIP) formulation to...
research
09/16/2018

Mobility Mode Detection Using WiFi Signals

We utilize Wi-Fi communications from smartphones to predict their mobili...
research
12/19/2019

Extreme Learning Tree

The paper proposes a new variant of a decision tree, called an Extreme L...
research
10/20/2020

An Eager Splitting Strategy for Online Decision Trees

We study the effectiveness of replacing the split strategy for the state...
research
11/30/2020

Using dynamical quantization to perform split attempts in online tree regressors

A central aspect of online decision tree solutions is evaluating the inc...
research
11/19/2021

MURAL: An Unsupervised Random Forest-Based Embedding for Electronic Health Record Data

A major challenge in embedding or visualizing clinical patient data is t...
