RoNGBa: A Robustly Optimized Natural Gradient Boosting Training Approach with Leaf Number Clipping

12/05/2019
by Liliang Ren, et al.

The natural gradient has recently been introduced to boosting to enable generic probabilistic prediction. Natural gradient boosting shows promising performance improvements on small datasets thanks to better training dynamics, but it suffers from a significant training speed overhead, especially on large datasets. We present a replication study of NGBoost (Duan et al., 2019) training that carefully examines the impact of key hyperparameters when the base decision trees are grown best-first. We find that with leaf number clipping as a regularizer, the performance of NGBoost can be substantially improved through a better choice of hyperparameters. Experiments show that our approach significantly outperforms the state of the art on a variety of datasets from the UCI Machine Learning Repository while still achieving up to a 4.85x speedup over the original NGBoost approach.
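The core recipe in the abstract, natural gradient boosting whose best-first-grown base trees have their leaf count clipped, can be sketched with off-the-shelf tools. The snippet below is a minimal illustration, assuming the open-source ngboost package and scikit-learn; the hyperparameter values and the diabetes dataset are placeholders standing in for the paper's tuned settings and UCI benchmarks, not the authors' exact configuration. In scikit-learn, a tree is grown best-first whenever max_leaf_nodes is set, so capping it plays the role of the leaf number clipping described above.

# A minimal sketch (not the authors' exact setup) of NGBoost training with a
# best-first base learner whose leaf count is clipped. Hyperparameter values
# below are illustrative placeholders, not those reported in the paper.
from ngboost import NGBRegressor
from ngboost.distns import Normal
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Stand-in for one of the UCI regression datasets used in the paper.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# scikit-learn grows the tree best-first when max_leaf_nodes is set,
# so capping it acts as the leaf-number-clipping regularizer.
base_learner = DecisionTreeRegressor(
    criterion="friedman_mse",
    max_leaf_nodes=31,   # illustrative leaf-number cap
    max_depth=None,      # depth left unconstrained; the leaf cap regularizes
)

ngb = NGBRegressor(
    Dist=Normal,          # predictive distribution per example
    Base=base_learner,
    n_estimators=500,     # illustrative values only
    learning_rate=0.04,
    natural_gradient=True,
)
ngb.fit(X_train, y_train)

# Probabilistic prediction: a full predictive distribution for each test point,
# evaluated here by the average negative log-likelihood.
y_dist = ngb.pred_dist(X_test)
print("Test NLL:", -y_dist.logpdf(y_test).mean())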


