Optimal trees selection for classification via out-of-bag assessment and sub-bagging

12/30/2020
by Zardad Khan et al.

The effect of training data size on machine learning methods has been well investigated over the past two decades. The predictive performance of tree-based machine learning methods generally improves, at a decreasing rate, as the size of the training data increases. We investigate this in the optimal trees ensemble (OTE), where the method fails to learn from some of the training observations due to internal validation. Modified tree selection methods are therefore proposed for OTE to compensate for the training observations lost to internal validation. In the first method, the corresponding out-of-bag (OOB) observations are used in both the individual and collective performance assessment of each tree. Trees are ranked by their individual performance on the OOB observations. A certain number of top-ranked trees is selected and, starting from the most accurate tree, subsequent trees are added one by one; the impact of each addition is assessed on the OOB observations left out of the bootstrap sample taken for the tree being added. A tree is selected if it improves the predictive accuracy of the ensemble. In the second approach, trees are grown on random subsets of the training data taken without replacement (known as sub-bagging) instead of bootstrap samples (taken with replacement). The observations left out of each sample are used in both the individual and collective assessment of the corresponding tree, as in the first method. Analyses of 21 benchmark datasets and simulation studies show improved performance of the modified methods in comparison to OTE and other state-of-the-art methods.
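The selection procedure described above lends itself to a compact sketch. The following Python/scikit-learn code illustrates both variants; it is a minimal sketch under assumptions, not the authors' implementation. The function names (`select_trees`, `majority_vote`), the tree counts (`n_trees`, `n_top`), and the 0.7 sub-sample fraction are illustrative choices, and class labels are assumed to be integers 0..K-1.

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def majority_vote(trees, X):
    """Plurality vote of a list of fitted trees (integer class labels)."""
    votes = np.stack([t.predict(X) for t in trees])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

def select_trees(X, y, n_trees=500, n_top=100, subbag=False, frac=0.7):
    """Grow trees, rank them by individual OOB accuracy, then add the
    top-ranked trees one by one, keeping a tree only when it improves
    the ensemble's accuracy on that tree's own left-out observations."""
    n = len(y)
    trees, oob_sets, solo = [], [], []
    for _ in range(n_trees):
        if subbag:  # second method: sub-sample without replacement
            idx = rng.choice(n, size=int(frac * n), replace=False)
        else:       # first method: ordinary bootstrap sample
            idx = rng.integers(0, n, size=n)
        oob = np.setdiff1d(np.arange(n), idx)  # rows unseen by this tree
        t = DecisionTreeClassifier().fit(X[idx], y[idx])
        trees.append(t)
        oob_sets.append(oob)
        solo.append(accuracy_score(y[oob], t.predict(X[oob])))

    order = np.argsort(solo)[::-1][:n_top]  # rank by individual OOB score
    selected = [trees[order[0]]]            # start from the most accurate tree
    for i in order[1:]:
        oob = oob_sets[i]                   # left-out rows of the candidate
        before = accuracy_score(y[oob], majority_vote(selected, X[oob]))
        after = accuracy_score(y[oob],
                               majority_vote(selected + [trees[i]], X[oob]))
        if after > before:                  # keep only if the ensemble improves
            selected.append(trees[i])
    return selected
```

With `subbag=False` the sketch corresponds to the first (bootstrap/OOB) method; setting `subbag=True` swaps the bootstrap for sampling without replacement, as in the second method, so the observations left out of each sub-sample play the role of the OOB observations.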
