BooST: Boosting Smooth Trees for Partial Effect Estimation in Nonlinear Regressions

by Yuri Fonseca, et al.
Columbia University
Yahoo! Inc.

In this paper, we introduce a new machine learning (ML) model for nonlinear regression called the Boosting Smooth Transition Regression Tree (BooST). The main advantage of BooST is that it estimates the derivatives (partial effects) of very general nonlinear models, making the mapping between the covariates and the dependent variable more interpretable than in other tree-based models. We provide asymptotic theory showing consistency of the estimated partial derivatives, and we present examples on both simulated and empirical data.
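The central idea — boosting trees whose hard splits are replaced by smooth logistic transitions, so the fitted ensemble is differentiable and partial effects come out analytically — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the stump-only trees, quantile split search, and fixed transition slope `gamma` are simplifying assumptions.

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

class SmoothStump:
    """One-split smooth tree: two leaf values blended by a logistic gate,
    so the prediction is differentiable in the covariates."""
    def __init__(self, j, c, gamma, b0, b1):
        self.j, self.c, self.gamma = j, c, gamma
        self.b0, self.b1 = b0, b1

    def predict(self, X):
        s = logistic(self.gamma * (X[:, self.j] - self.c))
        return self.b0 * (1.0 - s) + self.b1 * s

    def derivative(self, X, j):
        # Analytic partial derivative of predict() w.r.t. covariate j.
        if j != self.j:
            return np.zeros(X.shape[0])
        s = logistic(self.gamma * (X[:, self.j] - self.c))
        return (self.b1 - self.b0) * self.gamma * s * (1.0 - s)

def fit_stump(X, r, gamma, n_candidates=10):
    """Least-squares fit of one smooth stump to the current residuals r,
    searching split variables and quantile-based split points."""
    best, best_loss = None, np.inf
    for j in range(X.shape[1]):
        for c in np.quantile(X[:, j], np.linspace(0.1, 0.9, n_candidates)):
            s = logistic(gamma * (X[:, j] - c))
            A = np.column_stack([1.0 - s, s])   # design: two soft leaves
            b, *_ = np.linalg.lstsq(A, r, rcond=None)
            loss = np.sum((r - A @ b) ** 2)
            if loss < best_loss:
                best_loss = loss
                best = SmoothStump(j, c, gamma, b[0], b[1])
    return best

def boost(X, y, n_trees=50, lr=0.2, gamma=5.0):
    """L2 gradient boosting: each smooth stump fits the current residuals."""
    f0 = y.mean()
    r = y - f0
    trees = []
    for _ in range(n_trees):
        t = fit_stump(X, r, gamma)
        r = r - lr * t.predict(X)
        trees.append(t)
    return f0, trees, lr

def predict(model, X):
    f0, trees, lr = model
    return f0 + lr * sum(t.predict(X) for t in trees)

def partial_effect(model, X, j):
    # Because every tree is smooth, the ensemble's partial effect is
    # just the learning-rate-scaled sum of the trees' analytic derivatives.
    f0, trees, lr = model
    return lr * sum(t.derivative(X, j) for t in trees)
```

Fitting this on, say, `y = sin(3 * x0)` and calling `partial_effect(model, X, 0)` returns the estimated derivative at each observation; with hard (non-smooth) splits this quantity would be zero almost everywhere, which is the interpretability gap the smooth transitions address.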




