Feature Interactions in XGBoost

07/11/2020
by Kshitij Goyal, et al.

In this paper, we investigate how feature interactions can be identified and used as constraints in gradient boosting tree models, using XGBoost's implementation. Our results show that accurate identification of these constraints can significantly improve the performance of the baseline XGBoost model. Further, the improved model structure can also lead to better interpretability.
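The constraints described in the abstract correspond to XGBoost's public interaction_constraints parameter, which restricts which features may be combined along a single tree branch. The sketch below shows the general mechanism only; the feature groups and the synthetic data are illustrative assumptions, not the interaction sets identified in the paper.

```python
# Minimal sketch: training an XGBoost model with interaction constraints.
# The groups of feature indices below are hypothetical placeholders.
import xgboost as xgb
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=500, n_features=6, random_state=0)

# Each inner list is a group of feature indices allowed to interact;
# splits within one branch of a tree may only use features from one group.
constraints = [[0, 1], [2, 3, 4], [5]]

model = xgb.XGBRegressor(
    n_estimators=200,
    max_depth=4,
    tree_method="hist",                    # constraints require hist/exact/approx
    interaction_constraints=constraints,
)
model.fit(X, y)
print(model.predict(X[:5]))
```

A baseline model would simply omit interaction_constraints; comparing the two fits is one way to gauge the effect of a chosen constraint set.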


Related research

02/08/2023 · Decision trees compensate for model misspecification
The best-performing models in ML are not interpretable. If we can explai...

07/14/2022 · Using Model-Based Trees with Boosting to Fit Low-Order Functional ANOVA Models
Low-order functional ANOVA (fANOVA) models have been rediscovered in the...

06/08/2015 · Interpretable Selection and Visualization of Features and Interactions Using Bayesian Forests
It is becoming increasingly important for machine learning methods to ma...

12/21/2021 · Explanation of Machine Learning Models Using Shapley Additive Explanation and Application for Real Data in Hospital
When using machine learning techniques in decision-making processes, the...

06/21/2023 · Feature Interactions Reveal Linguistic Structure in Language Models
We study feature interactions in the context of feature attribution meth...

05/13/2021 · Extending Models Via Gradient Boosting: An Application to Mendelian Models
Improving existing widely-adopted prediction models is often a more effi...

04/16/2021 · Learning Feature Interactions With and Without Specifications
Features in product lines and highly configurable systems can interact i...
