Variational Boosted Soft Trees

02/21/2023
by Tristan Cinquin, et al.

Gradient boosting machines (GBMs) based on decision trees consistently demonstrate state-of-the-art results on regression and classification tasks with tabular data, often outperforming deep neural networks. However, these models do not provide well-calibrated predictive uncertainties, which prevents their use for decision making in high-risk applications. The Bayesian treatment is known to improve predictive uncertainty calibration, but previously proposed Bayesian GBM methods are either computationally expensive or resort to crude approximations. Variational inference is often used to implement Bayesian neural networks, but it is difficult to apply to GBMs because the decision trees used as weak learners are non-differentiable. In this paper, we propose to implement Bayesian GBMs using variational inference with soft decision trees, a fully differentiable alternative to standard decision trees introduced by Irsoy et al. Our experiments demonstrate that variational soft trees and variational soft GBMs provide useful uncertainty estimates while retaining good predictive performance. The proposed models achieve higher test likelihoods than state-of-the-art Bayesian GBMs on 7/10 tabular regression datasets and improved out-of-distribution detection on 5/10 datasets.
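The ingredient that makes variational inference applicable here is the soft decision tree of Irsoy et al.: each internal node routes an input to both of its children with sigmoid probabilities instead of a hard split, so the whole tree is differentiable end to end. Below is a minimal sketch of such a tree in PyTorch. The class name, the fixed depth, and the plain point-estimate parameters with a squared-error loss are illustrative assumptions for clarity; they are not the paper's full variational model.

```python
# Minimal soft decision tree sketch (assumption: regression with scalar leaves,
# point-estimate parameters rather than a variational posterior).
import torch
import torch.nn as nn


class SoftDecisionTree(nn.Module):
    def __init__(self, in_features: int, depth: int = 3):
        super().__init__()
        self.depth = depth
        n_internal = 2 ** depth - 1         # internal (gating) nodes, BFS order
        n_leaves = 2 ** depth               # leaves holding scalar outputs
        self.gates = nn.Linear(in_features, n_internal)   # one sigmoid gate per node
        self.leaf_values = nn.Parameter(torch.zeros(n_leaves))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Probability of routing right at every internal node: (batch, n_internal).
        right = torch.sigmoid(self.gates(x))
        left = 1.0 - right
        # Path probability of reaching each node, propagated level by level.
        path = torch.ones(x.shape[0], 1, device=x.device)
        for level in range(self.depth):
            start = 2 ** level - 1           # first node index at this level
            end = 2 ** (level + 1) - 1
            go_left = left[:, start:end]
            go_right = right[:, start:end]
            # Each parent's path probability splits softly into its two children.
            path = torch.stack([path * go_left, path * go_right], dim=-1)
            path = path.flatten(start_dim=1)
        # Prediction = expected leaf value under the soft routing distribution.
        return path @ self.leaf_values


if __name__ == "__main__":
    tree = SoftDecisionTree(in_features=8, depth=3)
    x = torch.randn(32, 8)
    y = torch.randn(32)
    loss = ((tree(x) - y) ** 2).mean()       # differentiable end to end
    loss.backward()
```

In the paper's setting, the gate and leaf parameters would instead carry a variational distribution (for example a factorized Gaussian) trained by maximizing the evidence lower bound, and several such trees would be combined by boosting; the sketch above only illustrates the differentiable routing that makes that treatment possible.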


