Emotion-Recognition Using Smart Watch Accelerometer Data: Preliminary Findings

09/26/2017
by Juan C. Quiroz, et al.

This study investigates the use of accelerometer data from a smart watch to infer an individual's emotional state. We present preliminary findings from a user study with 50 participants. Participants were primed with either audio-visual stimuli (movie clips) or audio stimuli (classical music) to elicit emotional responses, and then walked while wearing a smart watch on one wrist and a heart rate strap on their chest. Our hypothesis is that the accelerometer signal exhibits different patterns depending on the emotion priming a participant received. We divided the accelerometer data using sliding windows, extracted features from each window, and used the features to train supervised machine learning algorithms to infer an individual's emotion from their walking pattern. We describe the methodology, the data collected, and early results.
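
The abstract does not specify the window length, the features, or the classifiers used. As a rough illustration of this kind of pipeline, the sketch below assumes 3-axis accelerometer samples, a fixed-size sliding window with 50% overlap, simple per-axis statistical features, and a scikit-learn random forest; all of these are illustrative choices, not the authors' actual setup, and the data here is synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative parameters -- the paper does not state the actual values.
WINDOW_SIZE = 128   # samples per window
STEP = 64           # 50% overlap between consecutive windows

def sliding_windows(signal, window_size=WINDOW_SIZE, step=STEP):
    """Yield fixed-size windows over an (n_samples, 3) accelerometer array."""
    for start in range(0, len(signal) - window_size + 1, step):
        yield signal[start:start + window_size]

def extract_features(window):
    """Simple per-axis statistical features (mean, std, min, max)."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

def build_dataset(recordings):
    """recordings: list of (accel_array, emotion_label) pairs,
    one per walking session."""
    X, y = [], []
    for accel, label in recordings:
        for window in sliding_windows(accel):
            X.append(extract_features(window))
            y.append(label)
    return np.array(X), np.array(y)

# Synthetic stand-in for real smart watch recordings.
rng = np.random.default_rng(0)
recordings = [(rng.normal(size=(1000, 3)), label)
              for label in ["happy", "sad"] for _ in range(5)]
X, y = build_dataset(recordings)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("Cross-validated accuracy:", scores.mean())
```

In a study like this, evaluation would typically use subject-wise splits (for example, leave-one-participant-out) rather than the naive cross-validation shown here, so that windows from the same participant do not appear in both the training and test sets.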
