Oblique and rotation double random forest

11/03/2021
by M. A. Ganaie, et al.

A random forest is an ensemble of decision trees. As Breiman suggested, the strength of the unstable base learners and the diversity among them are the core strengths of ensemble models. In this paper, we propose two approaches, known as oblique and rotation double random forests. In the first approach, we propose a rotation-based double random forest. Here, a transformation (rotation) of the feature space is generated at each node: a different random feature subspace is chosen at each node for evaluation, so the transformation differs from node to node. Different transformations yield greater diversity among the base learners and hence better generalization performance. With the double random forest as the base learner, the data at each node are transformed via two different transformations, namely principal component analysis and linear discriminant analysis. In the second approach, we propose an oblique double random forest. Decision trees in the random forest and double random forest are univariate, which results in axis-parallel splits that fail to capture the geometric structure of the data. Moreover, the standard random forest may not grow sufficiently deep decision trees, resulting in suboptimal performance. To capture the geometric properties of the data and to grow decision trees of sufficient depth, we propose the oblique double random forest, whose decision trees are multivariate. At each non-leaf node, a multisurface proximal support vector machine generates the optimal splitting plane for better generalization performance. In addition, two regularization techniques (Tikhonov regularization and axis-parallel split regularization) are employed to tackle small-sample-size problems in the decision trees of the oblique double random forest.
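To make the two ideas concrete, here is a minimal sketch of a single node split combining both ingredients described above: a PCA rotation of a randomly chosen feature subspace, followed by an oblique hyperplane fitted with a multisurface-proximal-SVM-style criterion (a Tikhonov-regularized generalized eigenvalue problem). The function names and the specific choices (subspace size, `delta`) are my own illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def mpsvm_split_direction(X_a, X_b, delta=1e-4):
    """MPSVM-style oblique split: find the augmented direction z = [w; b]
    minimizing ||X_a w + b||^2 / ||X_b w + b||^2, i.e. a plane close to
    class A and far from class B. Tikhonov regularization (delta * I)
    keeps both matrices well conditioned at small-sample nodes."""
    A = np.hstack([X_a, np.ones((X_a.shape[0], 1))])
    B = np.hstack([X_b, np.ones((X_b.shape[0], 1))])
    G = A.T @ A + delta * np.eye(A.shape[1])
    H = B.T @ B + delta * np.eye(B.shape[1])
    # The smallest generalized eigenvector of G z = lambda H z minimizes
    # the Rayleigh-quotient ratio above; eigh returns eigenpairs ascending.
    _, vecs = eigh(G, H)
    return vecs[:, 0]

def rotated_oblique_split(X, y, rng):
    """One tree node, sketched: pick a random feature subspace, rotate it
    with PCA, then fit an oblique (multivariate) hyperplane with the
    MPSVM criterion. Returns a boolean mask routing samples left."""
    n_feats = max(1, int(np.sqrt(X.shape[1])))
    feats = rng.choice(X.shape[1], size=n_feats, replace=False)
    Xc = X[:, feats] - X[:, feats].mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # PCA rotation axes
    Xr = Xc @ Vt.T                                     # rotated subspace
    z = mpsvm_split_direction(Xr[y == 0], Xr[y == 1])
    w, b = z[:-1], z[-1]
    return (Xr @ w + b) <= 0
```

Because the rotation is recomputed from a fresh random subspace at every node, no two nodes see the same transformed feature space, which is the source of the extra diversity claimed above. A linear-discriminant rotation could be substituted for the SVD step where class labels warrant it.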


