Evaluating Temporal Patterns in Applied Infant Affect Recognition

09/07/2022
by Allen Chang, et al.

Agents must monitor their partners' affective states continuously in order to understand and engage in social interactions. However, methods for evaluating affect recognition do not account for changes in classification performance that may occur during occlusions or transitions between affective states. This paper addresses temporal patterns in affect classification performance in the context of an infant-robot interaction, where infants' affective states contribute to their ability to participate in a therapeutic leg movement activity. To support robustness to facial occlusions in video recordings, we trained infant affect recognition classifiers using both facial and body features. Next, we conducted an in-depth analysis of our best-performing models to evaluate how performance changed over time as the models encountered missing data and changing infant affect. During time windows when features were extracted with high confidence, a unimodal model trained on facial features achieved the same optimal performance as multimodal models trained on both facial and body features. However, multimodal models outperformed unimodal models when evaluated on the entire dataset. Additionally, model performance was weakest when predicting an affective state transition and improved after multiple predictions of the same affective state. These findings emphasize the benefits of incorporating body features in continuous affect recognition for infants. Our work highlights the importance of evaluating variability in model performance both over time and in the presence of missing data when applying affect recognition to social interactions.
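To make the multimodal setup concrete, the sketch below shows one plausible early-fusion approach: per-frame facial and body feature vectors are concatenated into a single classifier input, with occluded frames flagged so the model can fall back on body features. The feature dimensions, the zero-imputation scheme, the visibility flag, and the random-forest classifier are all illustrative assumptions; the abstract does not specify the feature extractors or model family used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature dimensions; real extractors (e.g., facial action
# units, body pose keypoints) would determine these.
N_FACE, N_BODY = 35, 50

def fuse_features(face, body):
    """Early fusion of per-frame face and body features.

    Frames with an occluded face carry NaN face features; we impute zeros
    and append a binary face-visibility flag so a model can learn to rely
    on body features during occlusions.
    """
    face_visible = ~np.isnan(face).any(axis=1, keepdims=True)
    face = np.nan_to_num(face, nan=0.0)
    return np.hstack([face, body, face_visible.astype(float)])

# Toy stand-in data: 200 frames, roughly 30% with the face occluded.
rng = np.random.default_rng(0)
face = rng.normal(size=(200, N_FACE))
face[rng.random(200) < 0.3] = np.nan
body = rng.normal(size=(200, N_BODY))
labels = rng.integers(0, 2, size=200)  # e.g., 0 = negative, 1 = positive affect

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(fuse_features(face, body), labels)
```

The temporal finding reported above (weakest performance at affective state transitions, improving as the same state persists) can be approximated by grouping per-frame accuracy by position within a run of identical ground-truth labels. The helper below is a hypothetical, simplified analog of that analysis, not the paper's actual evaluation procedure; grouping by run position and capping at `max_pos` are assumptions.

```python
import numpy as np

def accuracy_by_run_position(y_true, y_pred, max_pos=5):
    """Accuracy grouped by how many consecutive frames the ground-truth
    affective state has persisted (position 0 marks a state transition)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    pos = np.zeros(len(y_true), dtype=int)
    for i in range(1, len(y_true)):
        pos[i] = pos[i - 1] + 1 if y_true[i] == y_true[i - 1] else 0
    pos = np.minimum(pos, max_pos)  # pool everything past max_pos
    return {p: float((y_pred[pos == p] == y_true[pos == p]).mean())
            for p in range(max_pos + 1) if np.any(pos == p)}
```

Under the paper's reported pattern, the accuracy returned for position 0 would be lowest, rising over subsequent positions as predictions of the same state accumulate.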


