Exploring Workplace Behaviors through Speaking Patterns using Large-scale Multimodal Wearable Recordings: A Study of Healthcare Providers

12/18/2022
by   Tiantian Feng, et al.

Interpersonal spoken communication is central to human interaction and the exchange of information. Such interactive processes involve not only speech and spoken language but also non-verbal cues, such as hand gestures, facial expressions, and nonverbal vocalizations, that express feelings and provide feedback. These multimodal communication signals carry a variety of information about people: traits such as gender and age, as well as physical and psychological states and behaviors. This work uses wearable multimodal sensors to investigate interpersonal communication behaviors, focusing on the speaking patterns of healthcare providers, specifically nurses. We analyze longitudinal data collected from 99 nurses in a large hospital setting over ten weeks. The results indicate differences in speaking patterns across shift schedules and working units. Moreover, the results show that speaking patterns combined with physiological measures can be used to predict affect measures and life satisfaction scores. The implementation of this work can be accessed at https://github.com/usc-sail/tiles-audio-arousal.
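As a rough illustration of the kind of analysis described above, the sketch below derives a simple speaking-pattern feature (fraction of a shift spent speaking) from hypothetical voice-activity segments and fits a least-squares line against self-reported affect scores. The segment data, the speaking-ratio feature, and the linear predictor are illustrative assumptions for exposition, not the paper's actual pipeline (see the linked repository for that):

```python
import statistics

# Hypothetical foreground-speech segments (start_sec, end_sec) for one
# nurse's shift, e.g. as produced by a voice-activity detector on
# wearable audio. These values are made up for illustration.
def speaking_ratio(segments, shift_seconds):
    """Fraction of the shift spent speaking: a simple speaking-pattern feature."""
    spoken = sum(end - start for start, end in segments)
    return spoken / shift_seconds

def fit_line(x, y):
    """Ordinary least-squares fit y ~ a*x + b (stand-in for a learned predictor)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Toy data: per-shift speaking ratios vs. self-reported affect scores.
shifts = [
    [(0, 1200), (5000, 6800)],
    [(0, 3000), (10000, 12400)],
    [(100, 4500), (20000, 24000)],
]
ratios = [speaking_ratio(s, 8 * 3600) for s in shifts]  # 8-hour shifts
affect = [2.1, 3.4, 4.0]

slope, intercept = fit_line(ratios, affect)
predicted = [slope * r + intercept for r in ratios]
```

In the study itself, such speaking-pattern features are combined with physiological measures; a richer model in that setting would simply take a feature vector per shift instead of the single ratio used here.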

