Continuous Affect Prediction using Eye Gaze

03/05/2018
by Jonny O'Dwyer, et al.

In recent years there has been significant interest in the machine recognition of human emotions, owing to the wide range of applications this knowledge enables. The affective computing research community has investigated a number of modalities, such as speech or facial expression, individually and in combination with eye gaze, either to classify emotions (e.g. sad, happy, angry) or to predict the continuous values of affective dimensions (e.g. valence, arousal, dominance) at each moment in time. Surprisingly, an extensive literature review found no prior work considering eye gaze as a unimodal input to a continuous affect prediction system. In this context, this paper evaluates the use of eye gaze as a unimodal input to a continuous affect prediction system. The performance of continuous arousal and valence prediction using eye gaze is compared with that of a speech system using the AVEC 2014 speech feature set. In the experimental evaluation, eye gaze as the single modality produced a higher correlation for valence prediction than the AVEC 2014 speech feature set. Furthermore, the eye gaze feature set proposed in this paper contains only 98 features, far fewer than the AVEC 2014 feature set.
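The comparison described above scores each system by the correlation between its predicted affect trace and the gold-standard annotation over time. As a minimal sketch of that evaluation step, the snippet below implements Pearson's correlation coefficient and applies it to two short, entirely synthetic valence traces; the trace values and lengths are illustrative assumptions, not data from the paper.

```python
import math

def pearson_corr(pred, truth):
    """Pearson correlation between a predicted and a gold-standard
    per-frame affect trace (e.g. valence values over time)."""
    n = len(pred)
    mean_p = sum(pred) / n
    mean_t = sum(truth) / n
    cov = sum((p - mean_p) * (t - mean_t) for p, t in zip(pred, truth))
    std_p = math.sqrt(sum((p - mean_p) ** 2 for p in pred))
    std_t = math.sqrt(sum((t - mean_t) ** 2 for t in truth))
    return cov / (std_p * std_t)

# Synthetic example traces (hypothetical, for illustration only):
gold = [0.10, 0.25, 0.40, 0.35, 0.20, 0.05]          # annotated valence
gaze_pred = [0.12, 0.22, 0.38, 0.30, 0.24, 0.08]     # model output

print(round(pearson_corr(gaze_pred, gold), 3))
```

A higher coefficient (closer to 1.0) indicates that the predicted trace tracks the annotated affect dimension more closely, which is the sense in which the gaze-based valence result is reported as better than the speech baseline.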


