Guided Random Forest and its application to data approximation

09/02/2019
by Prashant Gupta, et al.

We present a new way of constructing an ensemble classifier, named the Guided Random Forest (GRAF) in the sequel. GRAF extends the idea of building oblique decision trees with localized partitioning to obtain a global partitioning of the feature space. We show that this global partitioning bridges the gap between decision trees and boosting algorithms, and we empirically demonstrate that it reduces the generalization error bound. Experiments on 115 benchmark datasets show that GRAF yields comparable or better results on a majority of them. We also present a new way of approximating datasets within the framework of random forests.
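To make the contrast between localized and global partitioning concrete, below is a minimal sketch in this spirit: random oblique hyperplanes that each cut the entire feature space, with every sample encoded by the side of each hyperplane it falls on. This is an illustration under our own assumptions (random orientations, planes anchored at training points, a fixed count of 64 planes, and a Hamming nearest-neighbor vote over the binary codes), not the authors' implementation; GRAF's actual guidance of the hyperplanes and its aggregation scheme are described in the paper.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def random_hyperplanes(X, n_planes):
    # Sample oblique hyperplanes w.x + b = 0, each anchored at a random
    # training point so that it actually crosses the data. Every plane
    # cuts the ENTIRE space (a global partition), unlike a decision-tree
    # split, which only refines the region owned by a single node.
    n, d = X.shape
    W = rng.normal(size=(n_planes, d))
    W /= np.linalg.norm(W, axis=1, keepdims=True)
    anchors = X[rng.integers(0, n, size=n_planes)]
    b = -np.einsum("pd,pd->p", W, anchors)
    return W, b

def encode(X, W, b):
    # Binary code per sample: which side of each global hyperplane it lies on.
    return (X @ W.T + b >= 0).astype(np.uint8)

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

W, b = random_hyperplanes(X_tr, n_planes=64)

# Cells of the global partition act like leaves shared by all samples; a
# Hamming-distance nearest-neighbor vote over the codes stands in here for
# a proper aggregation scheme.
clf = KNeighborsClassifier(metric="hamming")
clf.fit(encode(X_tr, W, b), y_tr)
print("accuracy on hyperplane codes:", clf.score(encode(X_te, W, b), y_te))

Note that because every hyperplane is shared by all samples, the resulting codes can also serve as a compact representation of the data, which is the sense in which such a partitioning lends itself to data approximation.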

Related research

11/03/2021: Oblique and rotation double random forest
An ensemble of decision trees is known as Random Forest. As suggested by...

10/23/2018: On PAC-Bayesian Bounds for Random Forests
Existing guarantees in terms of rigorous upper bounds on the generalizat...

12/13/2014: Oriented Edge Forests for Boundary Detection
We present a simple, efficient model for learning boundary detection bas...

07/02/2013: Comparing various regression methods on ensemble strategies in differential evolution
Differential evolution possesses a multitude of various strategies for g...

09/28/2018: Minimization of Gini impurity via connections with the k-means problem
The Gini impurity is one of the measures used to select attribute in Dec...

01/30/2019: Classifier Suites for Insider Threat Detection
Better methods to detect insider threats need new anticipatory analytics...

08/31/2016: hi-RF: Incremental Learning Random Forest for large-scale multi-class Data Classification
In recent years, dynamically growing data and incrementally growing numb...
