WildWood: a new Random Forest algorithm

09/16/2021
by Stéphane Gaïffas, et al.

We introduce WildWood (WW), a new ensemble algorithm for supervised learning of Random Forest (RF) type. While standard RF algorithms use bootstrap out-of-bag samples only to compute out-of-bag scores, WW uses these samples to produce improved predictions given by an aggregation of the predictions of all possible subtrees of each fully grown tree in the forest. This is achieved through aggregation with exponential weights over out-of-bag samples; these weights can be computed exactly and very efficiently thanks to an algorithm called context tree weighting. This improvement, combined with a histogram strategy to accelerate split finding, makes WW fast and competitive compared with other well-established ensemble methods, such as standard RF and extreme gradient boosting algorithms.
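To make the aggregation step concrete, here is a minimal sketch of how an exponential-weight aggregation over all prunings (subtrees) of a fully grown tree can be computed exactly with a context-tree-weighting-style recursion along the root-to-leaf path of a test point. This is an illustration only, not WildWood's implementation: the names (Node, subtree_weight, aggregated_predict), the regression setting, and the toy losses are assumptions, and a real implementation would use out-of-bag losses and work in log space for numerical stability.

```python
# Illustrative sketch of exponential-weight aggregation over all prunings of a
# fully grown tree, computed exactly with a CTW-style recursion (assumptions noted above).
import math
from dataclasses import dataclass
from typing import Optional


@dataclass
class Node:
    pred: float                    # node prediction (e.g., mean target of samples reaching the node)
    loss: float                    # cumulative loss of `pred` on (out-of-bag) samples at the node
    feature: int = -1              # split feature (unused for leaves)
    threshold: float = 0.0         # split threshold (unused for leaves)
    left: Optional["Node"] = None
    right: Optional["Node"] = None

    @property
    def is_leaf(self) -> bool:
        return self.left is None


def subtree_weight(v: Node, eta: float) -> float:
    # w(v) = exp(-eta * L_v) at leaves,
    # w(v) = 1/2 exp(-eta * L_v) + 1/2 w(left) w(right) at internal nodes.
    # This equals the sum over all prunings rooted at v of prior(pruning) * exp(-eta * loss(pruning)).
    if v.is_leaf:
        return math.exp(-eta * v.loss)
    return 0.5 * math.exp(-eta * v.loss) + \
        0.5 * subtree_weight(v.left, eta) * subtree_weight(v.right, eta)


def aggregated_predict(root: Node, x, eta: float) -> float:
    # Prediction at x aggregated over all prunings with exponential weights,
    # computed along the root-to-leaf path of x (numerator / denominator recursion).
    def numerator(v: Node) -> float:
        if v.is_leaf:
            return math.exp(-eta * v.loss) * v.pred
        on_path, off_path = (v.left, v.right) if x[v.feature] <= v.threshold else (v.right, v.left)
        return 0.5 * math.exp(-eta * v.loss) * v.pred + \
            0.5 * numerator(on_path) * subtree_weight(off_path, eta)

    return numerator(root) / subtree_weight(root, eta)


if __name__ == "__main__":
    # Tiny hand-built tree: the root splits on feature 0 at threshold 0.5.
    tree = Node(pred=0.5, loss=4.0, feature=0, threshold=0.5,
                left=Node(pred=0.1, loss=1.0), right=Node(pred=0.9, loss=1.5))
    # The result is a convex combination of the root and left-leaf predictions,
    # with weights driven by their respective losses.
    print(aggregated_predict(tree, x=[0.2], eta=1.0))
```

The point of the recursion is that the sum over exponentially many prunings never has to be enumerated: each node only combines its own loss with the weights of its two children, so the aggregated prediction costs time proportional to the depth of the tree per query.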


