Convergence Rates of Oblique Regression Trees for Flexible Function Libraries

10/26/2022
by Matias D. Cattaneo, et al.

We develop a theoretical framework for the analysis of oblique decision trees, where the splits at each decision node occur at linear combinations of the covariates (as opposed to conventional tree constructions that force axis-aligned splits involving only a single covariate). While this methodology has garnered significant attention from the computer science and optimization communities since the mid-1980s, the advantages oblique trees offer over their axis-aligned counterparts remain only empirically justified, and explanations for their success are largely based on heuristics. Filling this long-standing gap between theory and practice, we show that oblique regression trees (constructed by recursively minimizing squared error) satisfy a type of oracle inequality and can adapt to a rich library of regression models consisting of linear combinations of ridge functions and their limit points. This provides a quantitative baseline for comparing and contrasting decision trees with other, less interpretable methods, such as projection pursuit regression and neural networks, which target similar model forms. Contrary to popular belief, one need not always trade off interpretability for accuracy. Specifically, we show that, under suitable conditions, oblique decision trees achieve predictive accuracy similar to that of neural networks for the same library of regression models. To address the combinatorial complexity of finding the optimal splitting hyperplane at each decision node, our proposed theoretical framework can accommodate many existing computational tools in the literature. Our results rely on (arguably surprising) connections between recursive adaptive partitioning and sequential greedy approximation algorithms for convex optimization problems (e.g., orthogonal greedy algorithms), which may be of independent theoretical interest.
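To make the object of study concrete, the sketch below implements a single oblique split: it searches for a hyperplane w·x ≤ b that minimizes the total squared error of the two resulting children. This is not the paper's algorithm; since finding the exact optimal hyperplane is combinatorial (as the abstract notes), the sketch uses a simple random-direction heuristic, one of many computational tools such a framework could accommodate. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def sse(y):
    # Sum of squared errors about the mean: the impurity minimized
    # when regression trees are grown by recursive squared-error fitting.
    return float(np.sum((y - y.mean()) ** 2)) if y.size else 0.0

def best_oblique_split(X, y, n_directions=100, rng=None):
    """Heuristic search for an oblique split w·x <= b minimizing total SSE.

    Samples random unit directions w and scans every threshold b along
    each 1-D projection, using prefix sums so each scan is O(n log n).
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    best = (np.inf, None, None)  # (total SSE, w, b)
    for _ in range(n_directions):
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)           # candidate direction (unit norm)
        z = X @ w                         # project covariates onto w
        order = np.argsort(z)
        zs, ys = z[order], y[order]
        c1, c2 = np.cumsum(ys), np.cumsum(ys ** 2)  # prefix sums of y, y^2
        i = np.arange(1, n)               # left-child sizes 1..n-1
        sse_left = c2[:-1] - c1[:-1] ** 2 / i
        sse_right = (c2[-1] - c2[:-1]) - (c1[-1] - c1[:-1]) ** 2 / (n - i)
        total = sse_left + sse_right
        valid = zs[1:] != zs[:-1]         # split only between distinct projections
        if not valid.any():
            continue
        j = int(np.argmin(np.where(valid, total, np.inf)))
        if total[j] < best[0]:
            best = (float(total[j]), w, 0.5 * (zs[j] + zs[j + 1]))
    return best

# Toy example: the response depends on x1 + x2, a structure that no single
# axis-aligned cut can capture but one oblique cut can.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
total, w, b = best_oblique_split(X, y, rng=1)
print(total <= sse(y))  # splitting never increases total within-node SSE
```

Recursively applying such a split to each child, until a depth or node-size limit is reached, yields the oblique regression trees the abstract analyzes; the greedy node-by-node SSE minimization is what connects the construction to sequential greedy approximation schemes.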


