Task-wise Split Gradient Boosting Trees for Multi-center Diabetes Prediction

08/16/2021
by Mingcheng Chen, et al.

Diabetes prediction is an important data science application in the social healthcare domain. The task poses two main challenges: data heterogeneity, since demographic and metabolic data are of different types, and data insufficiency, since the number of diabetes cases in a single medical center is usually limited. To tackle these challenges, we employ gradient boosting decision trees (GBDT) to handle data heterogeneity and introduce multi-task learning (MTL) to alleviate data insufficiency. To this end, we propose Task-wise Split Gradient Boosting Trees (TSGB) for the multi-center diabetes prediction task. Specifically, we first introduce task gain to evaluate each task separately during tree construction, with a theoretical analysis of GBDT's learning objective. Second, we reveal a problem that arises when directly applying GBDT to MTL: the negative task gain problem. Finally, based on the task gain statistics, we propose a novel split method for GBDT in MTL, named task-wise split, as an alternative to the standard feature-wise split that overcomes the negative task gain problem. Extensive experiments on a large-scale real-world diabetes dataset and a commonly used benchmark dataset demonstrate that TSGB achieves superior performance over several state-of-the-art methods. Detailed case studies further support our analysis of the negative task gain problem and provide insightful findings. The proposed TSGB method has been deployed as online diabetes risk assessment software for early diagnosis.
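To make the negative task gain problem concrete, the sketch below shows one plausible way to decompose a split's gain by task under an XGBoost-style second-order objective. This framing is our assumption based on the abstract's reference to GBDT's learning objective; the paper's exact definition of task gain may differ, and the function names (task_gains, leaf_weight) and the regularization value LAM are hypothetical. Because the leaf weights are fitted on samples from all centers jointly, a split that improves the overall objective can still increase the approximated loss of an individual center, yielding a negative per-task gain.

import numpy as np

LAM = 1.0  # L2 penalty on leaf weights; illustrative value, not from the paper

def leaf_weight(g_sum, h_sum, lam=LAM):
    # Optimal leaf weight under a second-order (XGBoost-style) objective.
    return -g_sum / (h_sum + lam)

def task_gains(grad, hess, go_left, task_ids):
    # Per-task gain of a candidate feature-wise split. Parent and child
    # leaf weights are fitted on ALL samples in the node; each task's gain
    # is the reduction of its own second-order loss under those shared
    # weights, so a task dominated by the others can come out negative.
    w_parent = leaf_weight(grad.sum(), hess.sum())
    w_left = leaf_weight(grad[go_left].sum(), hess[go_left].sum())
    w_right = leaf_weight(grad[~go_left].sum(), hess[~go_left].sum())

    def task_loss(mask, w):
        # Second-order approximation of the task's loss at leaf weight w.
        return grad[mask].sum() * w + 0.5 * hess[mask].sum() * w ** 2

    gains = {}
    for t in np.unique(task_ids):
        m = task_ids == t
        before = task_loss(m, w_parent)
        after = task_loss(m & go_left, w_left) + task_loss(m & ~go_left, w_right)
        gains[t] = before - after
    return gains

# Toy usage: three "centers" (tasks), one candidate split.
rng = np.random.default_rng(0)
grad = rng.normal(size=100)          # per-sample gradients at this node
hess = np.ones(100)                  # e.g. squared loss gives constant hessians
go_left = rng.random(100) < 0.5      # membership in the split's left child
task_ids = rng.integers(0, 3, size=100)

gains = task_gains(grad, hess, go_left, task_ids)
hurt = [t for t, g in gains.items() if g < 0]

When hurt is non-empty, the task-wise split described in the abstract would partition the node by task id rather than by a feature threshold, routing the negatively affected centers into their own child so that later splits can fit them without interference from the dominant tasks.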


