Confidence Intervals for the Generalisation Error of Random Forests

01/26/2022
by Samyak Rajanala, et al.

Out-of-bag error is commonly used as an estimate of the generalisation error of ensemble learning methods such as random forests. We present confidence intervals for this quantity based on the delta-method-after-bootstrap and the jackknife-after-bootstrap techniques; neither requires growing any additional trees. On real and simulated examples, we show that these new confidence intervals have better coverage than the naive confidence interval.
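To make the jackknife-after-bootstrap idea concrete, the sketch below shows how such an interval can be computed for the OOB error of a bagged ensemble of decision trees, reusing only the trees that were already grown. This is a minimal illustration under assumptions of my own: the helper names (fit_bagged_trees, jab_interval), the majority-vote OOB error, and all settings are hypothetical, not the paper's estimator, and the delta-method-after-bootstrap variant is omitted.

```python
# Illustrative sketch: jackknife-after-bootstrap (JAB) interval for the OOB error
# of a bagged tree ensemble.  Not the paper's implementation; no extra trees are
# grown beyond the original ensemble, in the spirit of the abstract.
import numpy as np
from scipy.stats import norm
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier


def fit_bagged_trees(X, y, n_trees=500, seed=0):
    """Fit a plain bagged ensemble and record each tree's bootstrap (in-bag) counts."""
    rng = np.random.default_rng(seed)
    n = len(y)
    trees, in_bag = [], []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)                  # bootstrap sample for this tree
        tree = DecisionTreeClassifier(max_features="sqrt",
                                      random_state=int(rng.integers(1 << 31)))
        tree.fit(X[idx], y[idx])
        trees.append(tree)
        in_bag.append(np.bincount(idx, minlength=n))      # how often each obs was drawn
    return trees, np.array(in_bag)                        # in_bag has shape (n_trees, n)


def jab_interval(trees, in_bag, X, y, alpha=0.05):
    """Jackknife-after-bootstrap CI for the OOB error, reusing only the fitted trees."""
    n = len(y)
    votes = np.array([t.predict(X) for t in trees])       # (n_trees, n) per-tree predictions
    oob_mask = in_bag == 0                                 # tree b is OOB for observation i

    def oob_error(tree_subset):
        # Majority vote over the OOB trees within `tree_subset`, then misclassification rate.
        errs = []
        for i in range(n):
            b = tree_subset & oob_mask[:, i]
            if b.any():
                pred = np.bincount(votes[b, i].astype(int)).argmax()
                errs.append(pred != y[i])
        return np.mean(errs)

    theta_hat = oob_error(np.ones(len(trees), dtype=bool))
    # Leave-one-observation-out replicates: keep only trees whose bootstrap sample
    # excludes observation i, mimicking a bootstrap drawn from the data without i.
    theta_i = np.array([oob_error(oob_mask[:, i]) for i in range(n)])
    var_jab = (n - 1) / n * np.sum((theta_i - theta_i.mean()) ** 2)
    half = norm.ppf(1 - alpha / 2) * np.sqrt(var_jab)
    return theta_hat, (theta_hat - half, theta_hat + half)


X, y = make_classification(n_samples=200, n_features=10, random_state=1)
trees, in_bag = fit_bagged_trees(X, y)
err, (lo, hi) = jab_interval(trees, in_bag, X, y)
print(f"OOB error = {err:.3f}, 95% JAB interval = ({lo:.3f}, {hi:.3f})")
```

The point the sketch tries to capture is that the leave-one-observation-out replicates are obtained simply by discarding trees whose bootstrap samples contain that observation, so the variance estimate comes for free from the ensemble that was already fitted.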

Related research

03/14/2019  On confidence intervals centered on bootstrap smoothed estimators
We assess the performance, in terms of coverage probability and expected...

04/25/2014  Quantifying Uncertainty in Random Forests via Confidence Intervals and Hypothesis Tests
This work develops formal statistical inference procedures for machine l...

08/24/2018  Applications of the Fractional-Random-Weight Bootstrap
The bootstrap, based on resampling, has, for several decades, been a wid...

12/07/2017  Asymptotic coverage probabilities of bootstrap percentile confidence intervals for constrained parameters
The asymptotic behaviour of the commonly used bootstrap percentile confi...

11/18/2013  Confidence Intervals for Random Forests: The Jackknife and the Infinitesimal Jackknife
We study the variability of predictions made by bagged learners and rand...

06/01/2023  Confidence Intervals for Error Rates in Matching Tasks: Critical Review and Recommendations
Matching algorithms are commonly used to predict matches between items i...
