Tree in Tree: from Decision Trees to Decision Graphs

10/01/2021
by Bingzhao Zhu, et al.

Decision trees have been widely used as classifiers in many machine learning applications thanks to their lightweight and interpretable decision process. This paper introduces the Tree in Tree decision graph (TnT), a framework that extends the conventional decision tree to a more generic and powerful directed acyclic graph. TnT constructs decision graphs by recursively growing decision trees inside the internal or leaf nodes, instead of relying on greedy training. The time complexity of TnT is linear in the number of nodes in the graph, so it can construct decision graphs on large datasets. Compared to decision trees, we show that TnT achieves better classification performance with a reduced model size, both as a stand-alone classifier and as a base estimator in bagging/AdaBoost ensembles. Our proposed model is a novel, more efficient, and accurate alternative to the widely used decision tree.
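The abstract describes the construction only at a high level. As a rough illustration of the "grow a small tree inside a node" idea, the Python sketch below recursively fits shallow scikit-learn decision trees on the samples routed to each node. It is a hypothetical simplification, not the authors' implementation: it omits the node sharing that turns the structure into a true directed acyclic graph, and the names TnTNode, grow, and predict_one are illustrative only.

import numpy as np
from sklearn.tree import DecisionTreeClassifier


class TnTNode:
    """A node of the (simplified) structure: holds either a shallow "micro"
    decision tree that routes samples onward, or acts as a leaf predicting
    `label`."""

    def __init__(self, label):
        self.tree = None       # micro decision tree grown inside this node
        self.children = []     # (micro-tree leaf id, child TnTNode) pairs
        self.label = label     # fallback prediction if the node stays a leaf


def grow(node, X, y, depth=0, max_depth=3, micro_depth=1, min_samples=20):
    """Recursively grow a shallow decision tree inside `node` on the samples
    routed to it, then repeat inside each partition the micro tree creates."""
    if depth >= max_depth or len(y) < min_samples or len(np.unique(y)) == 1:
        return
    micro = DecisionTreeClassifier(max_depth=micro_depth).fit(X, y)
    node.tree = micro
    leaf_ids = micro.apply(X)  # micro-tree leaf index for each training sample
    for leaf in np.unique(leaf_ids):
        mask = leaf_ids == leaf
        child = TnTNode(label=int(np.bincount(y[mask]).argmax()))
        node.children.append((leaf, child))
        grow(child, X[mask], y[mask], depth + 1, max_depth, micro_depth, min_samples)


def predict_one(node, x):
    """Follow micro trees until a node with no internal tree is reached."""
    while node.tree is not None:
        leaf = node.tree.apply(x.reshape(1, -1))[0]
        node = dict(node.children)[leaf]
    return node.label

A minimal usage example on a toy dataset, again purely illustrative:

from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
root = TnTNode(label=int(np.bincount(y).argmax()))
grow(root, X, y)
preds = np.array([predict_one(root, x) for x in X])
print("training accuracy:", (preds == y).mean())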


