Trees, Forests, Chickens, and Eggs: When and Why to Prune Trees in a Random Forest

03/30/2021
by Siyu Zhou, et al.

Due to their long-standing reputation as excellent off-the-shelf predictors, random forests remain a go-to model of choice for applied statisticians and data scientists. Despite their widespread use, however, until recently little was known about their inner workings and about which aspects of the procedure were driving their success. Two competing hypotheses have emerged: one based on interpolation and the other based on regularization. This work argues in favor of the latter by utilizing the regularization framework to reexamine the decades-old question of whether individual trees in an ensemble ought to be pruned. Although default constructions of random forests in most popular software packages use near-full-depth trees, here we provide strong evidence that tree depth should be seen as a natural form of regularization across the entire procedure. In particular, our work suggests that random forests with shallow trees are advantageous when the signal-to-noise ratio in the data is low. In building up this argument, we also critique the newly popular notion of "double descent" in random forests by drawing parallels to U-statistics and arguing that the noticeable jumps in random forest accuracy are the result of simple averaging rather than interpolation.
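The claimed advantage of shallow trees under a low signal-to-noise ratio can be probed with a small experiment. The sketch below is illustrative only and is not the paper's own experimental setup: it uses scikit-learn's `RandomForestRegressor`, with `max_depth` standing in for tree depth as a regularization knob, on synthetic data where a weak linear signal is buried in heavy noise.

```python
# Illustrative sketch: deep vs. shallow trees in a random forest when the
# signal-to-noise ratio is low. Assumes scikit-learn; not the paper's code.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
signal = X[:, 0]                        # weak linear signal in one feature
y = signal + 3.0 * rng.normal(size=n)   # noise variance 9 -> SNR of 1/9

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Default-style construction: trees grown to (near) full depth.
deep = RandomForestRegressor(n_estimators=200, max_depth=None, random_state=0)
# Regularized construction: shallow trees.
shallow = RandomForestRegressor(n_estimators=200, max_depth=3, random_state=0)

deep.fit(X_tr, y_tr)
shallow.fit(X_tr, y_tr)

mse_deep = mean_squared_error(y_te, deep.predict(X_te))
mse_shallow = mean_squared_error(y_te, shallow.predict(X_te))
print(f"deep-tree forest test MSE:    {mse_deep:.3f}")
print(f"shallow-tree forest test MSE: {mse_shallow:.3f}")
```

On noisy data like this, limiting `max_depth` typically reduces test error relative to fully grown trees, which is the regularization effect the abstract describes.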


research
05/25/2019

Asymptotic Distributions and Rates of Convergence for Random Forests and other Resampled Ensemble Learners

Random forests remain among the most popular off-the-shelf supervised le...
research
11/01/2019

Randomization as Regularization: A Degrees of Freedom Explanation for Random Forest Success

Random forests remain among the most popular off-the-shelf supervised ma...
research
04/28/2015

Explaining the Success of AdaBoost and Random Forests as Interpolating Classifiers

There is a large literature explaining why AdaBoost is a successful clas...
research
04/25/2016

Neural Random Forests

Given an ensemble of randomized regression trees, it is possible to rest...
research
08/03/2019

The Use of Binary Choice Forests to Model and Estimate Discrete Choice Models

We show the equivalence of discrete choice models and the class of binar...
research
02/08/2022

Is interpolation benign for random forests?

Statistical wisdom suggests that very complex models, interpolating trai...
research
02/18/2018

Training Big Random Forests with Little Resources

Without access to large compute clusters, building random forests on lar...
