BooST: Boosting Smooth Trees for Partial Effect Estimation in Nonlinear Regressions

08/10/2018
by   Yuri Fonseca, et al.

In this paper we introduce a new machine learning (ML) model for nonlinear regression called the Boosting Smooth Transition Regression Tree (BooST). The main advantage of BooST is that it estimates the derivatives (partial effects) of very general nonlinear models, providing more interpretability than other tree-based models with respect to the mapping between the covariates and the dependent variable. We provide asymptotic theory showing consistency of the estimated partial derivatives, and we present examples on both simulated and empirical data.
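The core idea can be illustrated with a small sketch: replace each hard tree split with a logistic transition, so the fitted function is differentiable and partial effects follow by summing the analytic derivatives of the boosted trees. The class names, the single-split "stump" simplification, and all parameter choices below are illustrative assumptions, not the paper's actual estimator or fitting procedure.

```python
import numpy as np

def logistic(z):
    """Smooth transition function used in place of a hard split indicator."""
    return 1.0 / (1.0 + np.exp(-z))

class SmoothStump:
    """One smooth-transition split (hypothetical simplification of a BooST tree).

    Blends two leaf values via a logistic gate on variable j:
        f(x) = b_left * (1 - g) + b_right * g,
        g = logistic(gamma * (x_j - c0)).
    """
    def __init__(self, j, c0, gamma, b_left, b_right):
        self.j, self.c0, self.gamma = j, c0, gamma
        self.b_left, self.b_right = b_left, b_right

    def predict(self, X):
        g = logistic(self.gamma * (X[:, self.j] - self.c0))
        return self.b_left * (1.0 - g) + self.b_right * g

    def partial_effect(self, X, k):
        """Analytic derivative d f / d x_k; nonzero only for the split variable."""
        if k != self.j:
            return np.zeros(X.shape[0])
        g = logistic(self.gamma * (X[:, self.j] - self.c0))
        # d/dz logistic(z) = logistic(z) * (1 - logistic(z)), chain rule gives gamma
        return (self.b_right - self.b_left) * self.gamma * g * (1.0 - g)

class BoostedSmoothTrees:
    """Additive ensemble: prediction and partial effects are sums over trees."""
    def __init__(self, stumps, v=0.2, intercept=0.0):
        self.stumps, self.v, self.intercept = stumps, v, intercept

    def predict(self, X):
        return self.intercept + self.v * sum(s.predict(X) for s in self.stumps)

    def partial_effect(self, X, k):
        # Differentiation is linear, so the ensemble derivative is the
        # shrunken sum of the per-tree derivatives.
        return self.v * sum(s.partial_effect(X, k) for s in self.stumps)
```

Because the logistic gate is smooth, the ensemble's partial effect agrees with a finite-difference derivative of its own predictions, which is the property that makes the partial effects directly estimable rather than requiring post-hoc approximation.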


Related research:

- Infinitesimal gradient boosting (04/26/2021): We define infinitesimal gradient boosting as a limit of the popular tree...
- Using Model-Based Trees with Boosting to Fit Low-Order Functional ANOVA Models (07/14/2022): Low-order functional ANOVA (fANOVA) models have been rediscovered in the...
- Recycled Two-Stage Estimation in Nonlinear Mixed Effects Regression Models (02/03/2019): We consider a re-sampling scheme for estimation of the population parame...
- Tree Boosted Varying Coefficient Models (04/01/2019): This paper investigates the integration of gradient boosted decision tre...
- Boost-R: Gradient Boosted Trees for Recurrence Data (07/03/2021): Recurrence data arise from multi-disciplinary domains spanning reliabili...
- A copula-based boosting model for time-to-event prediction with dependent censoring (10/10/2022): A characteristic feature of time-to-event data analysis is possible cens...
- Performance and Interpretability Comparisons of Supervised Machine Learning Algorithms: An Empirical Study (04/27/2022): This paper compares the performances of three supervised machine learnin...
