Gradient Boosting Neural Networks: GrowNet

02/19/2020
by Sarkhan Badirli, et al.

A novel gradient boosting framework is proposed in which shallow neural networks are employed as "weak learners". General loss functions are considered under this unified framework, with specific examples presented for classification, regression, and learning to rank. A fully corrective step is incorporated to remedy the pitfall of the greedy function approximation used by classic gradient boosting decision trees. The proposed model achieved state-of-the-art results in all three tasks on multiple datasets. An ablation study is performed to shed light on the effect of each model component and of the model hyperparameters.
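
To make the two ingredients named in the abstract concrete, here is a minimal, hypothetical PyTorch sketch for squared-error regression. It is not the authors' implementation: names such as WeakMLP and grownet_fit are invented, the per-stage boost rate is fixed rather than learned, and GrowNet's propagation of each learner's hidden features to the next learner is omitted. It only illustrates fitting shallow networks to functional gradients (here, plain residuals) and a fully corrective step that re-optimizes all stages jointly instead of freezing earlier learners.

```python
# Hypothetical sketch of the GrowNet idea for squared-error regression.
# Names (WeakMLP, grownet_fit, boost_rate) are illustrative, not the paper's code.
import torch
import torch.nn as nn

class WeakMLP(nn.Module):
    """A shallow (one-hidden-layer) network used as a weak learner."""
    def __init__(self, in_dim, hidden=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)

def grownet_fit(X, y, n_stages=5, boost_rate=0.3, inner_steps=200,
                corrective_steps=100, lr=1e-2):
    learners, ensemble_out = [], torch.zeros_like(y)
    for _ in range(n_stages):
        # Negative functional gradient of 0.5*(y - F)^2 w.r.t. F is the residual.
        residual = (y - ensemble_out).detach()
        weak = WeakMLP(X.shape[1])
        opt = torch.optim.Adam(weak.parameters(), lr=lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            loss = ((weak(X) - residual) ** 2).mean()
            loss.backward()
            opt.step()
        learners.append(weak)
        # Fully corrective step: refit ALL stages against the original targets,
        # rather than freezing earlier learners (the greedy pitfall).
        params = [p for m in learners for p in m.parameters()]
        copt = torch.optim.Adam(params, lr=lr / 10)
        for _ in range(corrective_steps):
            copt.zero_grad()
            pred = boost_rate * sum(m(X) for m in learners)
            loss = ((pred - y) ** 2).mean()
            loss.backward()
            copt.step()
        with torch.no_grad():
            ensemble_out = boost_rate * sum(m(X) for m in learners)
    return learners

# Toy usage on synthetic data: learn y = sin(x1) + x2.
torch.manual_seed(0)
X = torch.randn(256, 2)
y = torch.sin(X[:, 0]) + X[:, 1]
models = grownet_fit(X, y)
```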

Related research

01/20/2020
SGLB: Stochastic Gradient Langevin Boosting
In this paper, we introduce Stochastic Gradient Langevin Boosting (SGLB)...

11/05/2019
A Deep Gradient Boosting Network for Optic Disc and Cup Segmentation
Segmentation of optic disc (OD) and optic cup (OC) is critical in automa...

12/12/2022
GWRBoost: A geographically weighted gradient boosting method for explainable quantification of spatially-varying relationships
The geographically weighted regression (GWR) is an essential tool for es...

12/05/2019
RoNGBa: A Robustly Optimized Natural Gradient Boosting Training Approach with Leaf Number Clipping
Natural gradient has been recently introduced to the field of boosting t...

09/05/2011
Learning Nonlinear Functions Using Regularized Greedy Forest
We consider the problem of learning a forest of nonlinear decision rules...

07/12/2016
Improved Multi-Class Cost-Sensitive Boosting via Estimation of the Minimum-Risk Class
We present a simple unified framework for multi-class cost-sensitive boo...

09/28/2022
TRBoost: A Generic Gradient Boosting Machine based on Trust-region Method
A generic Gradient Boosting Machine called Trust-region Boosting (TRBoos...
