A Comparative Analysis of XGBoost

11/05/2019
by Candice Bentéjac, et al.

XGBoost is a scalable ensemble technique based on gradient boosting that has proven to be a reliable and efficient solver of machine learning challenges. This work proposes a practical analysis of how this technique performs in terms of training speed, generalization performance and parameter setup. In addition, a comprehensive comparison between XGBoost, random forests and gradient boosting has been performed using carefully tuned models as well as the default settings. The results of this comparison indicate that XGBoost is not necessarily the best choice in all circumstances. Finally, an extensive analysis of the XGBoost parameter tuning process is carried out.
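As an illustration (this is not the authors' experimental code), a minimal Python sketch of the kind of default-settings comparison the abstract describes, assuming the scikit-learn and xgboost packages and a small built-in dataset:

# Illustrative sketch, not the paper's setup: compare XGBoost, random
# forests and gradient boosting with default settings via cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Small stand-in dataset; the paper uses a broader benchmark suite.
X, y = load_breast_cancer(return_X_y=True)

models = {
    "random forest": RandomForestClassifier(random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "XGBoost": XGBClassifier(random_state=0),
}

for name, model in models.items():
    # 5-fold cross-validated accuracy with out-of-the-box parameters.
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")

A tuned comparison of the sort the paper also reports would wrap each model in a search over its main parameters (for example learning rate, tree depth and number of trees) rather than relying on the defaults.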


Related research:

- Condensed Gradient Boosting (11/26/2022): This paper presents a computationally efficient variant of gradient boos...
- Forecasting the Short-Term Energy Consumption Using Random Forests and Gradient Boosting (07/25/2022): This paper analyzes comparatively the performance of Random Forests and ...
- Feature Encodings for Gradient Boosting with Automunge (09/25/2022): Selecting a default feature encoding strategy for gradient boosted learn...
- An information criterion for automatic gradient tree boosting (08/13/2020): An information theoretic approach to learning the complexity of classifi...
- RoNGBa: A Robustly Optimized Natural Gradient Boosting Training Approach with Leaf Number Clipping (12/05/2019): Natural gradient has been recently introduced to the field of boosting t...
- Effective Email Spam Detection System using Extreme Gradient Boosting (12/27/2020): The popularity, cost-effectiveness and ease of information exchange that...
