Continuous Affect Prediction Using Eye Gaze and Speech

03/05/2018
by Jonny O'Dwyer, et al.

Affective computing research has traditionally focused on labeling a person's emotion as one of a discrete number of classes, e.g. happy or sad. More recently, attention has shifted to continuous affect prediction across dimensions of the emotional space, e.g. arousal and valence: the task of predicting a numerical value for each emotion dimension. Continuous affect prediction is powerful in domains involving real-time audio-visual communication, including remote or assistive technologies for the psychological assessment of subjects. Modalities used for continuous affect prediction include speech, facial expressions and physiological responses. Rather than analysing a single modality, the research community has combined multiple modalities to improve prediction accuracy. In this context, this paper investigates a continuous affect prediction system using the novel combination of speech and eye gaze, and proposes a new eye gaze feature set. The approach uses open-source software for real-time affect prediction in audio-visual communication environments. A unique advantage of the human-computer interface used here is that the subject is not required to wear a specialized, expensive eye-tracking headset or other intrusive device. The results indicate that combining eye gaze with speech improves arousal prediction by 3.5% compared to using speech alone.
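The pipeline described above can be sketched in a few lines: per-frame speech and eye-gaze feature vectors are concatenated (feature-level fusion) and a regressor maps the fused vector to a continuous arousal value. The sketch below is illustrative only, with synthetic data and a plain ridge regressor; the feature names, dimensions, and labels are assumptions, not the paper's actual feature set or model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_frames = 200
# Hypothetical per-frame features: speech statistics (e.g. MFCC-style
# functionals) and eye-gaze statistics (e.g. gaze angles, fixation counts).
speech_feats = rng.normal(size=(n_frames, 10))
gaze_feats = rng.normal(size=(n_frames, 6))

# Feature-level (early) fusion: concatenate the two modalities per frame.
X = np.hstack([speech_feats, gaze_feats])  # shape (200, 16)

# Synthetic continuous arousal labels in (-1, 1), for the sketch only.
w_true = rng.normal(size=X.shape[1])
y = np.tanh(X @ w_true)

# Ridge regression in closed form: w = (X^T X + lam*I)^-1 X^T y.
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
y_pred = X @ w  # continuous arousal predictions, one per frame
```

In practice the evaluation would use a metric such as the concordance correlation coefficient against gold-standard annotations, and the fusion could equally be done at the decision level by averaging per-modality predictions.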

Related research

- Continuous Affect Prediction using Eye Gaze (03/05/2018): In recent times, there has been significant interest in the machine reco...
- Affective computing using speech and eye gaze: a review and bimodal system proposal for continuous affect prediction (05/17/2018): Speech has been a widely used modality in the field of affective computi...
- Speech, Head, and Eye-based Cues for Continuous Affect Prediction (07/23/2019): Continuous affect prediction involves the discrete time-continuous regre...
- Eye-based Continuous Affect Prediction (07/23/2019): Eye-based information channels include the pupils, gaze, saccades, fixat...
- CORAE: A Tool for Intuitive and Continuous Retrospective Evaluation of Interactions (06/29/2023): This paper introduces CORAE, a novel web-based open-source tool for COnt...
- CalmResponses: Displaying Collective Audience Reactions in Remote Communication (04/05/2022): We propose a system displaying audience eye gaze and nod reactions for e...
- Looking Beyond a Clever Narrative: Visual Context and Attention are Primary Drivers of Affect in Video Advertisements (08/14/2018): Emotion evoked by an advertisement plays a key role in influencing brand...
