BigSurvSGD: Big Survival Data Analysis via Stochastic Gradient Descent

02/28/2020
by Aliasghar Tarkhan, et al.

In many biomedical applications, the outcome of interest is a “time-to-event” (e.g., time to disease progression or death). To assess the association between patient features and this outcome, it is common to assume a proportional hazards model and fit a proportional hazards regression (or Cox regression). The model is fit by maximizing a log-concave objective function known as the “partial likelihood.” For moderate-sized datasets, an efficient Newton-Raphson algorithm that leverages the structure of this objective can be employed. However, on large datasets this approach has two issues: 1) the computational tricks that leverage structure can also lead to numerical instability; 2) the objective does not naturally decouple, so if the dataset does not fit in memory, the model can be very computationally expensive to fit. The lack of decoupling also means the objective is not directly amenable to stochastic gradient-based optimization methods. To overcome these issues, we propose a simple, new framing of proportional hazards regression that yields an objective function amenable to stochastic gradient descent. We show that this simple modification allows us to efficiently fit survival models to very large datasets, and that it facilitates training complex models, e.g., neural networks, on survival data.
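To make the idea concrete, below is a minimal sketch in Python/NumPy of stratum-based SGD for Cox regression; it is an illustration of the general approach described above, not the authors' exact algorithm. The Cox partial likelihood for coefficients beta multiplies, over each observed event i, the term exp(x_i' beta) / sum_{j : t_j >= t_i} exp(x_j' beta). The full-data version couples every at-risk patient, so the sketch instead evaluates the same objective on a small random stratum of s patients at each step, touching only s records at a time. The function names (big_surv_sgd, stratum_gradient), the step-size choices, and the naive handling of tied event times are illustrative assumptions.

# Minimal sketch (illustrative, not the authors' exact algorithm):
# SGD where each step evaluates the Cox partial likelihood on a small
# random stratum of s patients, so no step touches the full dataset.
# Assumes: X is an (n, p) feature matrix, time an (n,) array of observed
# times, status an (n,) array of event indicators (1 = event, 0 = censored).
import numpy as np

def stratum_gradient(beta, X, time, status):
    """Gradient of the negative Cox partial log-likelihood on one stratum."""
    order = np.argsort(-time)            # decreasing time: risk set = prefix
    X, status = X[order], status[order]
    eta = X @ beta
    w = np.exp(eta - eta.max())          # risk weights, shifted for stability
    cum_w = np.cumsum(w)                 # total weight over each risk set
    cum_wx = np.cumsum(w[:, None] * X, axis=0)
    risk_mean = cum_wx / cum_w[:, None]  # weighted feature mean over risk set
    # d/d(beta) of -sum_i status_i * (eta_i - log sum_{j in risk set} exp(eta_j))
    return -np.sum(status[:, None] * (X - risk_mean), axis=0)

def big_surv_sgd(X, time, status, s=20, lr=0.05, steps=5000, seed=0):
    """SGD over small-stratum Cox partial likelihoods (ties handled naively)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(steps):
        idx = rng.choice(n, size=s, replace=False)   # one random stratum
        beta -= (lr / s) * stratum_gradient(beta, X[idx], time[idx], status[idx])
    return beta

Because each step needs only s records, the data can be streamed from disk rather than held in memory, and the same stratum-wise loss can be minimized with any stochastic-gradient optimizer, which is what makes neural-network survival models straightforward to train in this framing.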
