
Automated Alertness and Emotion Detection for Empathic Feedback During E-Learning

by   S L Happy, et al.

In the context of education technology, empathic interaction with the user and feedback from the learning system, drawing on multiple inputs such as video, voice, and text, is an important area of research. In this paper, a non-intrusive, standalone model is proposed for intelligent assessment of the user's alertness and emotional state, together with generation of appropriate feedback. Using non-intrusive visual cues (facial expressions, ocular parameters, postures, and gestures), the system classifies the user's emotional and alertness states and provides feedback appropriate to the detected cognitive state. Specifically, the alertness level is assessed from ocular parameters such as PERCLOS and saccadic parameters, the emotional state is inferred from facial expression analysis, and further relevant cognitive and emotional states are detected from upper-body gestures and postures. Integrating such a system into an e-learning environment is expected to enhance students' performance through interaction, feedback, and positive mood induction.
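To illustrate the PERCLOS metric mentioned in the abstract (the proportion of time the eyes are closed over a sliding window, a widely used drowsiness indicator), here is a minimal sketch. The closedness threshold, the per-frame openness representation, and the function name are illustrative assumptions, not details from the paper:

```python
def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames in which the eye counts as closed.

    eye_openness: per-frame eye-openness values in [0, 1]
        (e.g., derived from eyelid landmark distances).
    closed_threshold: openness below this is treated as 'closed';
        0.2 follows the common convention of >= 80% eyelid closure.
    """
    if not eye_openness:
        return 0.0
    closed = sum(1 for o in eye_openness if o < closed_threshold)
    return closed / len(eye_openness)

# Example: a 10-frame window with 3 mostly-closed frames
window = [0.9, 0.8, 0.1, 0.05, 0.85, 0.9, 0.15, 0.8, 0.9, 0.95]
print(perclos(window))  # 0.3
```

In practice such a score would be computed over a window of a minute or more of video and compared against a calibrated alertness threshold; the window length and threshold here are placeholders.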

