Multiple decision trees

03/27/2013
by Suk Wah Kwok, et al.

This paper describes experiments on two domains that investigate the effect of averaging over the predictions of multiple decision trees instead of using a single tree. Other authors have pointed out theoretical and commonsense reasons for preferring the multiple-tree approach. Ideally, we would like to consider the predictions of all trees, weighted by their probability. However, the number of distinct trees is vast, and it is difficult to estimate the probability of each tree. We sidestep the estimation problem by using a modified version of the ID3 algorithm to build good trees, and we average over only these trees. Our results are encouraging. For each domain, we managed to produce a small number of good trees. We find that it is best to average across sets of trees with different structure; this usually gives better performance than any of the constituent trees, including the ID3 tree.
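As a rough illustration of the averaging scheme the abstract describes, the Python sketch below grows several structurally different decision trees and averages their class-probability estimates before picking a class. It is not the paper's implementation: scikit-learn's DecisionTreeClassifier stands in for the modified-ID3 trees, the Iris data set stands in for the paper's two domains, and the uniform weighting is an assumption (the paper motivates weighting trees by their probability, an estimate it deliberately sidesteps).

# Minimal sketch of averaging predictions over several decision trees.
# Assumptions: scikit-learn trees instead of modified ID3, Iris instead
# of the paper's domains, and uniform weights instead of tree probabilities.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Grow a small set of structurally different trees by randomizing the
# split selection; each seed tends to produce a different tree shape.
trees = [
    DecisionTreeClassifier(splitter="random", random_state=seed).fit(X_train, y_train)
    for seed in range(5)
]

# Average the class-probability estimates across trees (uniform weights),
# then predict the class with the highest averaged probability.
avg_proba = np.mean([t.predict_proba(X_test) for t in trees], axis=0)
y_pred = avg_proba.argmax(axis=1)
print("averaged-tree accuracy:", (y_pred == y_test).mean())

In the same spirit as the paper's finding, the averaged prediction often matches or beats the best single tree in the set, because the trees' individual errors are only partially correlated.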

Related research

Inapproximability of sufficient reasons for decision trees (04/05/2023)
In this note, we establish the hardness of approximation of the problem ...

Combining Predictions under Uncertainty: The Case of Random Decision Trees (08/15/2022)
A common approach to aggregate classification estimates in an ensemble o...

MOB-ESP and other Improvements in Probability Estimation (07/11/2012)
A key prerequisite to optimal reasoning under uncertainty in intelligent...

Interfering Paths in Decision Trees: A Note on Deodata Predictors (02/24/2022)
A technique for improving the prediction accuracy of decision trees is p...

Classification of the Chess Endgame problem using Logistic Regression, Decision Trees, and Neural Networks (11/10/2021)
In this study we worked on the classification of the Chess Endgame probl...

GBDT-MO: Gradient Boosted Decision Trees for Multiple Outputs (09/10/2019)
Gradient boosted decision trees (GBDTs) are widely used in machine learn...

Distributed Soft Bayesian Additive Regression Trees (08/26/2021)
Bayesian Additive Regression Trees (BART) is a Bayesian nonparametric app...