Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots

12/08/2020
by Maitreyee Wairagkar, et al.

We introduce the conceptual formulation, design, fabrication, control, and IoT-connected commercial translation of a hybrid-face social robot, together with validation of human emotional responses to its affective interactions. The hybrid-face robot integrates a 3D-printed faceplate and a digital display to simplify conveyance of complex facial movements while providing the impression of three-dimensional depth for natural interaction. We map the space of potential emotions of the robot to specific facial feature parameters and characterise the recognisability of the humanoid hybrid-face robot's archetypal facial expressions. We introduce pupil dilation as an additional degree of freedom for conveying emotive states. Human interaction experiments demonstrate that the hybrid-robot face effectively conveys emotion to human observers, assessed by mapping their neurophysiological electroencephalography (EEG) responses to perceived emotional information and through interviews. Results show that the main hybrid-face robotic expressions can be discriminated with recognition rates above 80%, with a neurophysiological response similar to that evoked by actual human faces, as measured by the face-specific N170 event-related potential in EEG. The hybrid-face robot concept has been modified, implemented, and released in the commercial IoT robotic platform Miko (My Companion), an affective robot with facial and conversational features currently in use for human-robot interaction with children, by Emotix Inc. We demonstrate that human EEG responses to Miko emotions are comparable to neurophysiological responses during actual human face recognition. Finally, interviews show emotion recognition rates above 90%. We conclude that the simplified hybrid-face abstraction conveys emotions effectively and enhances human-robot interaction.
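As an illustration of how an emotion space can be mapped to facial feature parameters, the minimal Python sketch below places emotions on a valence-arousal circumplex and derives display parameters, including pupil dilation as the extra affective degree of freedom the abstract describes. All names and coefficients (`FaceParams`, `brow_angle`, the archetype coordinates) are illustrative assumptions, not the paper's actual control variables.

```python
from dataclasses import dataclass

@dataclass
class FaceParams:
    brow_angle: float      # degrees; negative = furrowed, positive = raised
    mouth_curve: float     # -1 (full frown) .. +1 (full smile)
    eye_open: float        # 0 (closed) .. 1 (wide open)
    pupil_dilation: float  # 0 (constricted) .. 1 (fully dilated)

def emotion_to_face(valence: float, arousal: float) -> FaceParams:
    """Map a point on the valence-arousal circumplex (each in -1..1)
    to displayable facial feature parameters."""
    return FaceParams(
        brow_angle=15.0 * arousal - 10.0 * max(0.0, -valence),
        mouth_curve=valence,
        eye_open=0.5 + 0.5 * arousal,
        # Pupil dilation as an additional degree of freedom:
        # dilate for positive, high-arousal states.
        pupil_dilation=0.5 + 0.5 * arousal * max(0.0, valence),
    )

# Archetypal expressions as (valence, arousal) coordinates (illustrative).
ARCHETYPES = {
    "happy":     (0.8, 0.5),
    "sad":       (-0.7, -0.4),
    "angry":     (-0.8, 0.7),
    "surprised": (0.2, 0.9),
    "neutral":   (0.0, 0.0),
}

for name, (v, a) in ARCHETYPES.items():
    print(name, emotion_to_face(v, a))
```

Pinning archetypal expressions to circumplex coordinates keeps intermediate emotional states reachable by interpolation rather than requiring a hand-tuned pose per emotion.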

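The face-specific N170 comparison could be reproduced along these lines with MNE-Python: epoch the EEG around stimulus onsets and read out the negative deflection around 140-200 ms at occipito-temporal electrodes. The file name, event codes, and electrode picks below are assumptions for illustration, not the paper's actual analysis pipeline.

```python
import mne

# Load a continuous EEG recording (hypothetical file name).
raw = mne.io.read_raw_fif("face_stim_raw.fif", preload=True)
raw.filter(0.1, 30.0)  # band-pass typical for ERP analysis

# Stimulus events: robot-face vs. human-face trials (assumed event codes).
events = mne.find_events(raw, stim_channel="STI 014")
event_id = {"robot_face": 1, "human_face": 2}

epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.5, baseline=(None, 0.0), preload=True)

# N170: negative deflection ~140-200 ms at occipito-temporal sites (P7/P8).
for cond in event_id:
    evoked = epochs[cond].average()
    n170 = evoked.copy().pick(["P7", "P8"]).crop(0.14, 0.20)
    amp = n170.data.min() * 1e6  # most negative value, in microvolts
    print(f"{cond}: N170 amplitude ≈ {amp:.1f} µV")
```

Comparing the per-condition amplitudes (robot face vs. human face) is one straightforward way to operationalise the "similar neurophysiological response" claim.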

