Randomization as Regularization: A Degrees of Freedom Explanation for Random Forest Success

11/01/2019
by   Lucas Mentch, et al.

Random forests remain among the most popular off-the-shelf supervised machine learning tools with a well-established track record of predictive accuracy in both regression and classification settings. Despite their empirical success as well as a bevy of recent work investigating their statistical properties, a full and satisfying explanation for their success has yet to be put forth. Here we aim to take a step forward in this direction by demonstrating that the additional randomness injected into individual trees serves as a form of implicit regularization, making random forests an ideal model in low signal-to-noise ratio (SNR) settings. Specifically, from a model-complexity perspective, we show that the mtry parameter in random forests serves much the same purpose as the shrinkage penalty in explicitly regularized regression procedures like lasso and ridge regression. To highlight this point, we design a randomized linear-model-based forward selection procedure intended as an analogue to tree-based random forests and demonstrate its surprisingly strong empirical performance. Numerous demonstrations on both real and synthetic data are provided.
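The randomized forward selection procedure described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: at each step, only a random subset of the remaining predictors (of size `mtry`, mirroring the random forest parameter) is eligible to enter the linear model, and the candidate that most reduces residual sum of squares is added. The function name and all details are illustrative assumptions.

```python
import numpy as np

def randomized_forward_selection(X, y, mtry, n_steps, rng=None):
    """Greedy forward selection for linear regression in which each step
    may only choose among a random subset of `mtry` remaining features,
    analogous to the mtry parameter in random forests.
    (Illustrative sketch; not the paper's exact procedure.)"""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    selected = []                 # indices of features already in the model
    remaining = list(range(p))    # indices still eligible for selection
    for _ in range(min(n_steps, p)):
        # Draw the random candidate subset for this step.
        k = min(mtry, len(remaining))
        candidates = rng.choice(remaining, size=k, replace=False)
        best_j, best_rss = None, np.inf
        for j in candidates:
            cols = selected + [int(j)]
            # Least-squares fit on the tentative feature set.
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ beta) ** 2)
            if rss < best_rss:
                best_rss, best_j = rss, int(j)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

With `mtry` equal to the number of features this reduces to ordinary forward selection; smaller values of `mtry` inject the same kind of randomness that distinguishes a random forest from bagged trees, which is the analogy the paper draws.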


Related research

- 03/30/2021: Trees, Forests, Chickens, and Eggs: When and Why to Prune Trees in a Random Forest
  "Due to their long-standing reputation as excellent off-the-shelf predict..."

- 05/25/2019: Asymptotic Distributions and Rates of Convergence for Random Forests and other Resampled Ensemble Learners
  "Random forests remain among the most popular off-the-shelf supervised le..."

- 04/28/2015: Explaining the Success of AdaBoost and Random Forests as Interpolating Classifiers
  "There is a large literature explaining why AdaBoost is a successful clas..."

- 03/07/2020: Getting Better from Worse: Augmented Bagging and a Cautionary Tale of Variable Importance
  "As the size, complexity, and availability of data continues to grow, sci..."

- 08/10/2017: Subset Selection with Shrinkage: Sparse Linear Modeling when the SNR is low
  "We study the behavior of a fundamental tool in sparse statistical modeli..."

- 04/16/2019: Scalable and Efficient Hypothesis Testing with Random Forests
  "Throughout the last decade, random forests have established themselves a..."

- 11/11/2019: Simplifying Random Forests: On the Trade-off between Interpretability and Accuracy
  "We analyze the trade-off between model complexity and accuracy for rando..."
