Eye-based Continuous Affect Prediction

07/23/2019
by Jonny O'Dwyer, et al.

Eye-based information channels include the pupils, gaze, saccades, fixational movements, and numerous forms of eye opening and closure. Pupil size variation indicates cognitive load and emotion, while a person's gaze direction is said to be congruent with the motivation to approach or avoid stimuli. The eyelids are involved in facial expressions that can encode basic emotions. Additionally, eye-based cues can have implications for human annotators of emotions or feelings. Despite this, the use of eye-based cues in affective computing remains in its infancy, and this work begins to address that gap. Eye-based feature sets are proposed that incorporate data from all of the aforementioned information channels and can be estimated from video. The feature sets are refined through continuous arousal and valence learning and prediction experiments on the RECOLA validation set. The eye-based features are then combined with a speech feature set to confirm their usefulness and to compare affect prediction performance against group-of-humans-level performance on the RECOLA test set. The core contribution of this paper, a refined eye-based feature set, is shown to benefit affect prediction. It is hoped that this work stimulates further research into eye-based affective computing.
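The pipeline the abstract describes, combining eye-based features with speech features and regressing continuous arousal/valence, can be sketched minimally. The feature names, dimensions, and data below are purely illustrative (synthetic, not RECOLA), and ridge regression stands in for whichever learner the paper actually uses; the concordance correlation coefficient (CCC) is the metric conventionally reported for continuous affect prediction on RECOLA.

```python
import numpy as np

def ccc(y_true, y_pred):
    """Concordance correlation coefficient: agreement between two
    continuous signals, penalising both decorrelation and bias."""
    mu_t, mu_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = np.mean((y_true - mu_t) * (y_pred - mu_p))
    return 2 * cov / (var_t + var_p + (mu_t - mu_p) ** 2)

rng = np.random.default_rng(0)
n = 500
# Hypothetical per-frame features (names/dimensions illustrative only).
eye_feats = rng.normal(size=(n, 6))     # e.g. pupil size, gaze, blink statistics
speech_feats = rng.normal(size=(n, 4))  # e.g. prosodic descriptors
X = np.hstack([eye_feats, speech_feats])  # early (feature-level) fusion

# Synthetic arousal target: a linear function of the features plus noise.
w_true = rng.normal(size=X.shape[1])
y = X @ w_true + 0.1 * rng.normal(size=n)

# Ridge regression in closed form, with a simple train/validation split.
tr, va = slice(0, 400), slice(400, n)
lam = 1.0
A = X[tr].T @ X[tr] + lam * np.eye(X.shape[1])
w = np.linalg.solve(A, X[tr].T @ y[tr])
y_hat = X[va] @ w
print(f"validation CCC: {ccc(y[va], y_hat):.3f}")
```

Feature-set refinement, as in the paper's validation experiments, would then amount to repeating this fit with different eye-feature subsets and keeping the subset that maximises validation CCC.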

Related research

03/05/2018 - Continuous Affect Prediction using Eye Gaze
  In recent times, there has been significant interest in the machine reco...

07/23/2019 - Speech, Head, and Eye-based Cues for Continuous Affect Prediction
  Continuous affect prediction involves the discrete time-continuous regre...

08/14/2018 - Looking Beyond a Clever Narrative: Visual Context and Attention are Primary Drivers of Affect in Video Advertisements
  Emotion evoked by an advertisement plays a key role in influencing brand...

05/17/2018 - Affective computing using speech and eye gaze: a review and bimodal system proposal for continuous affect prediction
  Speech has been a widely used modality in the field of affective computi...

03/05/2018 - Continuous Affect Prediction Using Eye Gaze and Speech
  Affective computing research traditionally focused on labeling a person'...

06/23/2020 - Gender and Emotion Recognition from Implicit User Behavior Signals
  This work explores the utility of implicit behavioral cues, namely, Elec...

08/29/2017 - Discovering Gender Differences in Facial Emotion Recognition via Implicit Behavioral Cues
  We examine the utility of implicit behavioral cues in the form of EEG br...
