Using dynamical quantization to perform split attempts in online tree regressors

A central aspect of online decision tree solutions is evaluating the incoming data and enabling model growth. To do so, trees must handle different kinds of input features and partition them to learn from the data. Numerical features are no exception, and they pose additional challenges compared to other feature types, since there is no trivial strategy for choosing the best split point. The problem is even more challenging in regression tasks, where both the features and the target are continuous. Typical online solutions evaluate and store all the points observed between split attempts, which conflicts with the constraints of real-time applications. In this paper, we introduce the Quantization Observer (QO), a simple yet effective hashing-based algorithm for monitoring and evaluating split-point candidates in numerical features for online tree regressors. QO can be easily integrated into incremental decision trees, such as Hoeffding Trees; it has a monitoring cost of O(1) per instance and a sub-linear cost to evaluate split candidates. Previous solutions had an O(log n) insertion cost (in the best case) and a linear cost to evaluate split points. Our extensive experimental evaluation highlights QO's effectiveness in providing accurate split-point suggestions while using much less memory and processing time than its competitors.
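The abstract does not detail QO's internal data structure, but the core idea it describes, a hash-based quantizer with O(1) insertion and sub-linear candidate evaluation, can be illustrated with a minimal sketch. The snippet below assumes a fixed-radius scheme: each incoming value is hashed to the bin floor(x / radius), and each bin keeps running target statistics (count, sum, sum of squares) so variance-reduction split candidates can later be scored by scanning the bins rather than every stored observation. The class name QuantizationSketch, the radius parameter, and the helper functions are illustrative, not the paper's actual API.

```python
import math
from collections import defaultdict


class QuantizationSketch:
    """Fixed-radius, hash-based quantizer for numeric split monitoring.

    Illustrative sketch, not the paper's implementation: each value x is
    mapped to bin floor(x / radius) in O(1), and the bin accumulates
    running target statistics so split candidates can be scored by
    scanning the bins instead of all observations.
    """

    def __init__(self, radius=0.25):
        self.radius = radius
        # bin key -> [count, sum of y, sum of y squared]
        self.bins = defaultdict(lambda: [0, 0.0, 0.0])

    def update(self, x, y):
        """O(1) insertion: hash x to its bin and update the statistics."""
        stats = self.bins[math.floor(x / self.radius)]
        stats[0] += 1
        stats[1] += y
        stats[2] += y * y

    def split_candidates(self):
        """Yield (threshold, left_stats, right_stats) per bin boundary.

        Cost is linear in the number of bins, typically far smaller than
        the number of observations seen so far.
        """
        keys = sorted(self.bins)
        total = [sum(self.bins[k][i] for k in keys) for i in range(3)]
        left = [0, 0.0, 0.0]
        for k in keys[:-1]:
            for i in range(3):
                left[i] += self.bins[k][i]
            right = [total[i] - left[i] for i in range(3)]
            yield (k + 1) * self.radius, tuple(left), tuple(right)


def variance(stats):
    """Population variance from running sums; clamped at zero because
    floating-point error can make the difference slightly negative."""
    n, s, ss = stats
    return max(ss / n - (s / n) ** 2, 0.0) if n else 0.0
```

Continuing the sketch, a split attempt would pick the threshold with the largest weighted variance reduction over the bins, for example:

```python
import random

sketch = QuantizationSketch(radius=0.25)
for _ in range(10_000):
    x = random.uniform(0, 4)
    y = (1.0 if x > 2.0 else 0.0) + random.gauss(0, 0.1)  # true split at x = 2
    sketch.update(x, y)

parent = [sum(b[i] for b in sketch.bins.values()) for i in range(3)]
best = max(
    sketch.split_candidates(),
    key=lambda c: variance(parent)
    - (c[1][0] * variance(c[1]) + c[2][0] * variance(c[2])) / parent[0],
)
print(f"suggested threshold: {best[0]:.2f}")  # expected near 2.0
```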


