Heterogeneous Oblique Double Random Forest

04/13/2023
by M. A. Ganaie, et al.

Decision tree ensembles split the data at each node using a single feature. However, splitting in this manner may fail to capture the geometric properties of the data. Oblique decision trees instead generate an oblique hyperplane for splitting the data at each non-leaf node; by capturing the geometric properties of the data, they show better generalization. The performance of oblique decision trees depends on the way the oblique hyperplanes are generated and on the data used to generate them. Recently, multiple classifiers have been used in a heterogeneous random forest (RaF) classifier; however, it fails to generate trees of proper depth. Moreover, recent double RaF studies highlighted that larger trees can be generated by bootstrapping the data at each non-leaf node and splitting the original data instead of the bootstrapped data. The heterogeneous RaF study lacks the generation of larger trees, while the double RaF based model fails to capture the geometric characteristics of the data. To address these shortcomings, we propose heterogeneous oblique double RaF. The proposed model trains several linear classifiers at each non-leaf node on the bootstrapped data and splits the original data based on the optimal linear classifier, where the optimal hyperplane corresponds to the classifier that optimizes the impurity criterion. The experimental analysis indicates that the proposed heterogeneous oblique double random forest performs better than the baseline models. To demonstrate its effectiveness, we also applied the proposed model to the diagnosis of schizophrenia, where it predicted the disease more accurately than the baseline models.
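The node-splitting step described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes binary labels, uses scikit-learn linear classifiers as the pool of candidate hyperplanes, and uses weighted Gini impurity as the split criterion; the function names are hypothetical.

```python
# Minimal sketch of one non-leaf node split in a heterogeneous oblique double RaF:
# fit several linear classifiers on a bootstrap sample of the node data, keep the
# one with the lowest weighted Gini impurity, then split the ORIGINAL node data.
import numpy as np
from sklearn.linear_model import LogisticRegression, RidgeClassifier
from sklearn.svm import LinearSVC

def gini(y):
    """Gini impurity of a label array (0.0 for an empty array)."""
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_node(X, y, random_state=0):
    """Select the best linear hyperplane on bootstrapped data and use it
    to partition the original (non-bootstrapped) node data."""
    rng = np.random.default_rng(random_state)
    idx = rng.integers(0, len(X), size=len(X))      # bootstrap sample of the node
    Xb, yb = X[idx], y[idx]

    # heterogeneous pool of linear classifiers (illustrative choice)
    candidates = [LogisticRegression(max_iter=1000),
                  RidgeClassifier(),
                  LinearSVC(max_iter=5000)]

    best_clf, best_score = None, np.inf
    for clf in candidates:
        clf.fit(Xb, yb)                             # train on the bootstrap data
        side = clf.decision_function(Xb) >= 0       # split induced on the bootstrap
        score = (side.sum() * gini(yb[side]) +
                 (~side).sum() * gini(yb[~side])) / len(yb)
        if score < best_score:                      # keep the lowest weighted impurity
            best_clf, best_score = clf, score

    # split the original node data with the chosen oblique hyperplane
    mask = best_clf.decision_function(X) >= 0
    return best_clf, (X[mask], y[mask]), (X[~mask], y[~mask])
```

In a full tree, `split_node` would be applied recursively to each resulting partition until a stopping condition (e.g., node purity or minimum node size) is met; those details are omitted here.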


