
Learning Binary Trees via Sparse Relaxation

by   Valentina Zantedeschi, et al.

One of the most classical problems in machine learning is how to learn binary trees that split data into useful partitions. From classification and regression via decision trees to hierarchical clustering, binary trees are useful because they (a) are often easy to visualize; (b) make computationally efficient predictions; and (c) allow for flexible partitioning. Because of this, there has been extensive research on learning such trees, which generally falls into one of three categories: 1. greedy node-by-node optimization; 2. probabilistic relaxations for differentiability; 3. mixed-integer programs (MIP). Each of these has downsides: greedy methods can myopically choose poor splits, probabilistic relaxations lack principled ways to prune trees, and MIP methods can be slow on large problems and may not generalize. In this work we derive a novel sparse relaxation for binary tree learning. By deriving a new MIP and sparsely relaxing it, our approach is able to learn both tree splits and tree pruning using argmin differentiation. We demonstrate that our approach is easily visualizable and competitive with current tree-based approaches on classification, regression, and hierarchical clustering tasks. Source code is available at .
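The abstract does not spell out the relaxation itself, but sparse relaxations of discrete choices are commonly built on a sparsemax-style Euclidean projection onto the probability simplex: it is differentiable almost everywhere (enabling argmin differentiation) yet produces exact zeros, unlike a softmax, so relaxed routing or pruning decisions can collapse back to hard ones. The sketch below is not the authors' code; the function name `sparsemax` and the NumPy implementation are illustrative assumptions.

```python
import numpy as np

def sparsemax(z):
    """Euclidean projection of z onto the probability simplex.

    Unlike softmax, the result can contain exact zeros, which is what
    makes sparse relaxations of discrete tree decisions tractable.
    (Illustrative sketch; not the paper's implementation.)
    """
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]          # sort in decreasing order
    k = np.arange(1, z.size + 1)
    cumsum = np.cumsum(z_sorted)
    # largest k such that 1 + k * z_sorted[k-1] > cumsum[k-1]
    support = 1 + k * z_sorted > cumsum
    k_max = k[support][-1]
    tau = (cumsum[k_max - 1] - 1.0) / k_max
    return np.maximum(z - tau, 0.0)

# A dominant score snaps to a hard (one-hot) decision:
print(sparsemax([3.0, 1.0, 0.2]))   # yields [1.0, 0.0, 0.0]
# Close scores stay soft, keeping gradients informative:
print(sparsemax([0.5, 0.4]))        # yields [0.55, 0.45]
```

In a relaxed tree, such a projection could govern which child a sample is routed to, or whether a subtree is kept or pruned, with the sparsity pattern recovering discrete structure at convergence.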




Bayesian Rose Trees

Hierarchical structure is ubiquitous in data across many domains. There ...

Mixed integer linear optimization formulations for learning optimal binary classification trees

Decision trees are powerful tools for classification and regression that...

Decision trees as partitioning machines to characterize their generalization properties

Decision trees are popular machine learning models that are simple to bu...

Discovering Descriptive Tile Trees by Mining Optimal Geometric Subtiles

When analysing binary data, the ease at which one can interpret results ...

From Trees to Continuous Embeddings and Back: Hyperbolic Hierarchical Clustering

Similarity-based Hierarchical Clustering (HC) is a classical unsupervise...

Sparse learning with CART

Decision trees with binary splits are popularly constructed using Classi...

Neural Regression Trees

Regression-via-Classification (RvC) is the process of converting a regre...