
An evaluation of randomized machine learning methods for redundant data: Predicting short and medium-term suicide risk from administrative records and risk assessments

by Thuong Nguyen et al.
Deakin University

Accurate prediction of suicide risk in mental health patients remains an open problem. Existing methods, including clinician judgments, have acceptable sensitivity but yield many false positives. Exploiting administrative data has great potential, but such data are high-dimensional and contain redundancies introduced by the recording processes. We investigate the efficacy of three effective randomized machine learning techniques (random forests, gradient boosting machines, and deep neural networks with dropout) in predicting suicide risk. Using a cohort of mental health patients from a regional Australian hospital, we compare their predictive performance with popular traditional approaches: clinician judgments based on a checklist, sparse logistic regression, and decision trees. The randomized methods demonstrated robustness against data redundancy and superior predictive performance on AUC and F-measure.
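The comparison described above can be sketched in a few lines. The following is a minimal illustration, not the authors' code: it uses synthetic data with deliberately redundant features as a stand-in for administrative records, fits two of the randomized methods alongside a sparse (L1-penalized) logistic regression baseline, and scores each on AUC and F-measure. All model settings here are illustrative assumptions, not values from the paper.

```python
# Hedged sketch (not the paper's implementation): compare randomized
# ensembles against sparse logistic regression on high-dimensional,
# redundant data, evaluating with AUC and F-measure as in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for administrative records: 100 features, most of
# them redundant copies/combinations of the 10 informative ones, with a
# class imbalance resembling rare-outcome risk prediction.
X, y = make_classification(
    n_samples=2000, n_features=100, n_informative=10,
    n_redundant=60, weights=[0.9], random_state=0,
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gbm": GradientBoostingClassifier(random_state=0),
    "sparse_logreg": LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
}

scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]       # scores for AUC
    preds = model.predict(X_te)                   # hard labels for F1
    scores[name] = (roc_auc_score(y_te, proba), f1_score(y_te, preds))

for name, (auc, f1) in scores.items():
    print(f"{name}: AUC={auc:.3f}, F1={f1:.3f}")
```

A dropout neural network (the third randomized method in the paper) would slot into the same loop, e.g. via a Keras or PyTorch model wrapped with a scikit-learn-compatible interface; it is omitted here to keep the sketch self-contained.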
