Joint Engagement Classification using Video Augmentation Techniques for Multi-person Human-robot Interaction

12/28/2022
by Yubin Kim, et al.

Affect understanding is essential for social robots to interact autonomously with a group of users in an intuitive and reciprocal way. The challenge of multi-person affect understanding, however, lies not only in accurately perceiving each user's affective state (e.g., engagement) but also in recognizing the affective interplay between members (e.g., joint engagement), which manifests as complex but subtle nonverbal exchanges between them. Here we present a novel hybrid framework for identifying a parent-child dyad's joint engagement by combining a deep learning framework with various video augmentation techniques. Using a dataset of parent-child dyads reading storybooks together with a social robot at home, we first train RGB frame- and skeleton-based joint engagement recognition models on datasets augmented with four video augmentation techniques (General Aug, DeepFake, CutOut, and Mixed) to improve joint engagement classification performance. Second, we report experimental results on the use of the trained models in the robot-parent-child interaction context. Third, we introduce a behavior-based metric for evaluating the models' learned representations, in order to investigate model interpretability when recognizing joint engagement. This work is a first step toward fully unlocking the potential of end-to-end video understanding models, pre-trained on large public datasets and enhanced with data augmentation and visualization techniques, for affect recognition in multi-person human-robot interaction in the wild.
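The augmentation step can be illustrated with a minimal CutOut-style sketch for video clips. This is a hypothetical implementation, not the paper's code: the patch size, zero-fill value, and the choice to mask the same location across all frames of a clip are assumptions made here for illustration.

```python
import numpy as np

def cutout(frames, size=16, rng=None):
    """Zero out one random square patch at the same location in every
    frame of a clip (frames: T x H x W x C array).

    Hypothetical sketch of CutOut-style video augmentation; the paper's
    exact patch size and placement policy are not specified here.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = frames.copy()
    _, h, w, _ = out.shape
    # Sample the top-left corner so the patch lies fully inside the frame.
    y = int(rng.integers(0, max(1, h - size + 1)))
    x = int(rng.integers(0, max(1, w - size + 1)))
    out[:, y:y + size, x:x + size, :] = 0
    return out

# Example: a tiny 8-frame "clip" of all-white 64x64 RGB frames.
clip = np.full((8, 64, 64, 3), 255, dtype=np.uint8)
aug = cutout(clip, size=16, rng=np.random.default_rng(0))
```

Masking the same spatial region across every frame of the clip, rather than resampling per frame, keeps the occlusion temporally consistent, which is one plausible way to apply CutOut to video rather than still images.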



Related research

- 09/29/2017: Detection of Social Signals for Recognizing Engagement in Human-Robot Interaction
  Detection of engagement during a conversation is an important function o…
- 01/10/2023: Sentiment-based Engagement Strategies for Intuitive Human-Robot Interaction
  Emotion expressions serve as important communicative signals and are cru…
- 08/05/2019: Speech Driven Backchannel Generation using Deep Q-Network for Enhancing Engagement in Human-Robot Interaction
  We present a novel method for training a social robot to generate backch…
- 02/04/2018: Personalized Machine Learning for Robot Perception of Affect and Engagement in Autism Therapy
  Robots have great potential to facilitate future therapies for children …
- 04/20/2020: On-the-fly Detection of User Engagement Decrease in Spontaneous Human-Robot Interaction (International Journal of Social Robotics, 2019)
  In this paper, we consider the detection of a decrease of engagement by …
- 03/31/2023: Affective Computing for Human-Robot Interaction Research: Four Critical Lessons for the Hitchhiker
  Social Robotics and Human-Robot Interaction (HRI) research relies on dif…
- 03/02/2019: Complex Stiffness Model of Physical Human-Robot Interaction: Implications for Control of Performance Augmentation Exoskeletons
  Human joint dynamic stiffness plays an important role in the stability o…
