CatBoost: gradient boosting with categorical features support

10/24/2018
by Anna Veronika Dorogush, et al.

In this paper we present CatBoost, a new open-sourced gradient boosting library that successfully handles categorical features and outperforms existing publicly available implementations of gradient boosting in terms of quality on a set of popular publicly available datasets. The library has a GPU implementation of the learning algorithm and a CPU implementation of the scoring algorithm, which are significantly faster than other gradient boosting libraries on ensembles of similar sizes.
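
To illustrate the categorical-feature handling described in the abstract, here is a minimal sketch using the library's public Python interface. The toy dataset, column layout, and hyperparameter values are placeholders chosen for illustration, not taken from the paper; the cat_features and task_type="GPU" options are part of CatBoost's documented API.

    from catboost import CatBoostClassifier, Pool

    # Toy dataset mixing numerical and categorical columns (illustrative only).
    train_data = [
        [1875, "summer", 4],
        [2320, "winter", 1],
        [1098, "summer", 3],
        [4508, "spring", 7],
    ]
    train_labels = [0, 1, 0, 1]

    # Columns listed in cat_features are handled natively by CatBoost,
    # so no manual one-hot or label encoding is required.
    train_pool = Pool(train_data, label=train_labels, cat_features=[1])

    model = CatBoostClassifier(
        iterations=100,
        learning_rate=0.1,
        depth=4,
        # task_type="GPU",  # uncomment to train on GPU, as the paper describes
        verbose=False,
    )
    model.fit(train_pool)

    # Score a new example; inference runs on CPU.
    print(model.predict([[1500, "winter", 2]]))

The same pattern extends to CatBoostRegressor; the key point is that categorical columns are passed by index (or name) rather than pre-encoded by the user.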


Related research:

- Out-of-Core GPU Gradient Boosting (05/19/2020): GPU-based algorithms have greatly accelerated many machine learning meth...
- Wide Boosting (07/20/2020): Gradient boosting (GB) is a popular methodology used to solve prediction...
- Benchmarking state-of-the-art gradient boosting algorithms for classification (05/26/2023): This work explores the use of gradient boosting in the context of classi...
- StructureBoost: Efficient Gradient Boosting for Structured Categorical Variables (07/08/2020): Gradient boosting methods based on Structured Categorical Decision Trees...
- JoinBoost: Grow Trees Over Normalized Data Using Only SQL (07/01/2023): Although dominant for tabular data, ML libraries that train tree models ...
- TF Boosted Trees: A scalable TensorFlow based framework for gradient boosting (10/31/2017): TF Boosted Trees (TFBT) is a new open-sourced framework for the distrib...
- A Generalized Stacking for Implementing Ensembles of Gradient Boosting Machines (10/12/2020): The gradient boosting machine is one of the powerful tools for solving r...
