
Optimally Pruning Decision Tree Ensembles With Feature Cost

01/05/2016
by Feng Nan, et al.

We consider the problem of learning decision rules for prediction under a feature budget constraint. In particular, we are interested in pruning an ensemble of decision trees to reduce expected feature cost while maintaining high prediction accuracy for any test example. We propose a novel 0-1 integer program formulation for ensemble pruning. Our formulation is general: it takes any ensemble of decision trees as input. By explicitly accounting for feature sharing across trees together with the accuracy/cost trade-off, our method significantly reduces feature cost by pruning subtrees whose feature cost outweighs their contribution to prediction accuracy. Theoretically, we prove that a linear programming relaxation produces the exact solution of the original integer program. This allows us to use efficient convex optimization tools to obtain an optimally pruned ensemble for any given budget. Empirically, our pruning algorithm significantly improves the performance of the state-of-the-art ensemble method BudgetRF.
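The abstract's key technical idea is to pose pruning as a 0-1 integer program and solve its linear programming relaxation, which is shown to be exact. Below is a minimal illustrative sketch, not the paper's formulation: it prunes a single toy tree (no ensemble, no shared feature costs), uses made-up error/cost numbers and a hypothetical trade-off parameter lam, and relies on scipy.optimize.linprog. It only shows the shape of such a program, where each root-to-leaf path must cross exactly one pruning leaf, and that the relaxation already returns an integral solution for this tree-structured constraint matrix.

```python
# Minimal sketch (assumed, simplified setup): prune one small decision tree by
# solving the LP relaxation of a 0-1 pruning program and observe an integral
# optimum. The paper's method handles an ensemble with shared feature costs;
# the err/cost/lam numbers below are purely hypothetical.
import numpy as np
from scipy.optimize import linprog

# Complete depth-2 tree: node 0 is the root, nodes 1-2 its children,
# nodes 3-6 the original leaves.
paths = [[0, 1, 3], [0, 1, 4], [0, 2, 5], [0, 2, 6]]  # root-to-leaf paths
n_nodes = 7

# err[v]: error incurred if the subtree rooted at v is collapsed to a leaf.
# cost[v]: expected feature-acquisition cost of routing examples down to v.
err  = np.array([0.40, 0.10, 0.22, 0.04, 0.03, 0.02, 0.02])
cost = np.array([0.00, 1.00, 1.00, 2.00, 2.00, 2.00, 2.00])
lam = 0.05  # accuracy/cost trade-off parameter (hypothetical)

# Variable z[v] = 1 iff node v becomes a leaf of the pruned tree.
# Constraint: every root-to-leaf path of the original tree crosses exactly
# one pruning leaf, so the selection is a valid prefix of the tree.
A_eq = np.zeros((len(paths), n_nodes))
for i, p in enumerate(paths):
    A_eq[i, p] = 1.0
b_eq = np.ones(len(paths))

c = err + lam * cost
res = linprog(c, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * n_nodes, method="highs")

z = res.x
# For this toy tree the path-node matrix has the consecutive-ones property,
# so the LP vertex is already integral despite solving only the relaxation.
print("LP optimum z:", np.round(z, 3))
print("pruning leaves:", [v for v in range(n_nodes) if z[v] > 0.5])
print("objective:", res.fun)
```

Running this returns an all-integer z (pruning the left subtree at node 1 and keeping the right subtree) even though only the relaxation is solved, mirroring in miniature the exactness result claimed above; the paper's formulation additionally couples trees through shared feature costs, which this toy omits.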

