Multi-Target XGBoostLSS Regression

10/13/2022
by Alexander März, et al.

Current implementations of Gradient Boosting Machines are mostly designed for single-target regression tasks and commonly assume independence between responses when used in multivariate settings. As such, these models are not well suited if non-negligible dependencies exist between targets. To overcome this limitation, we present an extension of XGBoostLSS that models multiple targets and their dependencies in a probabilistic regression setting. Empirical results show that our approach outperforms existing GBMs with respect to runtime and compares well in terms of accuracy.
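The core idea of multivariate distributional regression is to predict the full joint distribution of the targets rather than independent marginals, so that correlations between responses are captured. A minimal sketch of why this matters: parameterize a multivariate Gaussian via a Cholesky factor of its covariance (the standard trick for guaranteeing positive-definiteness) and compare its negative log-likelihood on correlated targets against the same model with the off-diagonal dependence dropped. This is an illustrative numpy example, not the XGBoostLSS implementation; the data, parameters, and function names are made up for the sketch.

```python
import numpy as np

def mvn_nll(y, mu, L):
    """Per-sample negative log-likelihood of y under N(mu, Sigma),
    with Sigma = L @ L.T parameterized by its Cholesky factor L."""
    d = y.shape[-1]
    z = np.linalg.solve(L, (y - mu).T).T          # whitened residuals
    log_det = 2.0 * np.sum(np.log(np.diag(L)))    # log|Sigma| via diag(L)
    quad = np.sum(z ** 2, axis=-1)                # Mahalanobis term
    return 0.5 * (d * np.log(2.0 * np.pi) + log_det + quad)

rng = np.random.default_rng(0)

# Simulate two strongly correlated targets (hypothetical data).
L_true = np.array([[1.0, 0.0],
                   [0.9, 0.3]])
y = rng.standard_normal((500, 2)) @ L_true.T
mu = np.zeros(2)

# Dependence-aware model: full Cholesky factor.
nll_dep = mvn_nll(y, mu, L_true).mean()

# Independence assumption: keep the marginal variances, drop the covariance.
sigma = np.sqrt(np.diag(L_true @ L_true.T))
nll_indep = mvn_nll(y, mu, np.diag(sigma)).mean()

print(nll_dep < nll_indep)  # the joint model fits correlated targets better
```

In a boosting setting such as XGBoostLSS, each entry of `mu` and `L` would be the output of a boosted ensemble, trained by differentiating this likelihood with respect to the distributional parameters.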


Related research

- Component-Wise Boosting of Targets for Multi-Output Prediction (04/08/2019): Multi-output prediction deals with the prediction of several targets of ...
- Quantile Surfaces – Generalizing Quantile Regression to Multivariate Targets (09/29/2020): In this article, we present a novel approach to multivariate probabilist...
- Multi-target regression via output space quantization (03/22/2020): Multi-target regression is concerned with the prediction of multiple con...
- Multivariate Probabilistic Regression with Natural Gradient Boosting (06/07/2021): Many single-target regression problems require estimates of uncertainty ...
- Distributional Gradient Boosting Machines (04/02/2022): We present a unified probabilistic gradient boosting framework for regre...
- Prediction of Search Targets From Fixations in Open-World Settings (02/18/2015): Previous work on predicting the target of visual search from human fixat...
- Deep Autoregressive Regression (11/14/2022): In this work, we demonstrate that a major limitation of regression using...
