Transformers for prompt-level EMA non-response prediction

11/01/2021
by Supriya Nagesh, et al.

Ecological Momentary Assessments (EMAs) are an important psychological data source for measuring current cognitive states, affect, behavior, and environmental factors from participants in mobile health (mHealth) studies and treatment programs. Non-response, in which participants fail to respond to EMA prompts, is an endemic problem. The ability to accurately predict non-response could be used to improve EMA delivery and develop compliance interventions. Prior work has explored classical machine learning models for predicting non-response. However, as increasingly large EMA datasets become available, there is the potential to leverage deep learning models that have been effective in other fields. Recently, transformer models have shown state-of-the-art performance in NLP and other domains. This work is the first to explore the use of transformers for EMA data analysis. We address three key questions in applying transformers to EMA data: (1) input representation, (2) encoding temporal information, and (3) the utility of pre-training for improving downstream prediction performance. The transformer model achieves a non-response prediction AUC of 0.77, significantly outperforming classical machine learning and LSTM-based deep learning models. We will make our predictive model, trained on a corpus of 40K EMA samples, freely available to the research community, in order to facilitate future work on transformer-based EMA analysis.
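To make the three questions above concrete, here is a minimal illustrative sketch, not the authors' implementation: a transformer encoder over a participant's past EMA prompts, where each prompt's responses are projected into the model dimension (input representation) and a sinusoidal encoding of the continuous prompt timestamp stands in for token position (temporal encoding). All class names, hyperparameters, and the specific encoding choice are assumptions made for illustration.

```python
# Hypothetical sketch of a transformer for prompt-level non-response
# prediction; names and hyperparameters are illustrative, not the paper's.
import math
import torch
import torch.nn as nn

class TimeEncoding(nn.Module):
    """Sinusoidal encoding of real-valued timestamps (e.g., hours since study start)."""
    def __init__(self, d_model: int, max_period: float = 10_000.0):
        super().__init__()
        self.d_model = d_model
        self.max_period = max_period

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, seq_len) continuous prompt timestamps
        half = self.d_model // 2
        freqs = torch.exp(-math.log(self.max_period)
                          * torch.arange(half, device=t.device) / half)
        angles = t.unsqueeze(-1) * freqs              # (batch, seq_len, half)
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

class EMATransformer(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)   # EMA items -> embedding
        self.time_enc = TimeEncoding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)                  # P(non-response to next prompt)

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) past EMA responses; t: (batch, seq_len)
        h = self.input_proj(x) + self.time_enc(t)
        h = self.encoder(h)
        return torch.sigmoid(self.head(h[:, -1]))         # predict from last prompt's state
```

On the third question, the abstract reports that pre-training helps the downstream task but does not specify the objective here. One common self-supervised choice for sequence models is masked reconstruction, in the spirit of masked language modeling; the step below is a hypothetical sketch of that objective reusing the model above, where recon_head is an assumed extra linear layer mapping d_model back to n_features.

```python
# Hypothetical masked-reconstruction pre-training step; the paper's actual
# pre-training objective may differ.
import torch
import torch.nn.functional as F

def masked_pretrain_step(model, recon_head, x, t, optimizer, mask_rate: float = 0.15):
    mask = torch.rand(x.shape[:2], device=x.device) < mask_rate  # (batch, seq_len)
    x_in = x.clone()
    x_in[mask] = 0.0                                   # hide the masked prompts
    h = model.encoder(model.input_proj(x_in) + model.time_enc(t))
    loss = F.mse_loss(recon_head(h)[mask], x[mask])    # reconstruct only masked prompts
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```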

Related research

09/18/2023
FactoFormer: Factorized Hyperspectral Transformers with Self-Supervised Pre-Training
Hyperspectral images (HSIs) contain rich spectral and spatial informatio...

02/17/2022
Graph Masked Autoencoder
Transformers have achieved state-of-the-art performance in learning grap...

11/09/2021
Multi-Task Prediction of Clinical Outcomes in the Intensive Care Unit using Flexible Multimodal Transformers
Recent deep learning research based on Transformer model architectures h...

12/30/2020
Optimizing Deeper Transformers on Small Datasets: An Application on Text-to-SQL Semantic Parsing
Due to the common belief that training deep transformers from scratch re...

07/13/2022
Entry-Flipped Transformer for Inference and Prediction of Participant Behavior
Some group activities, such as team sports and choreographed dances, inv...

09/17/2023
A Few-Shot Approach to Dysarthric Speech Intelligibility Level Classification Using Transformers
Dysarthria is a speech disorder that hinders communication due to diffic...

11/16/2020
Detecting Receptivity for mHealth Interventions in the Natural Environment
JITAI is an emerging technique with great potential to support health be...
