Random forests for survival analysis using maximally selected rank statistics

05/11/2016
by Marvin N. Wright, et al.

The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption is not always fulfilled. An alternative approach is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split point selection bias, but current software for conditional inference forests uses linear rank statistics to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. We therefore use maximally selected rank statistics for split point selection in random forests for survival analysis. As in conditional inference forests, p-values for the association between split points and survival time are minimized. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split point selection is possible; however, there is a trade-off between unbiased split point selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories, and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives if a simple p-value approximation is used.

