
Fuzzy Gesture Expression Model for an Interactive and Safe Robot Partner

by Alexis Stoven-Dubois, et al.

Interaction with a robot partner requires many elements, including not only speech but also embodiment; gestural and facial expressions are therefore important for communication. Furthermore, understanding human movements is essential for safe and natural interaction. This paper proposes an interactive fuzzy emotional model for the robot partner's gesture expression, following its facial emotional model. First, we describe the physical interaction between the user and the robot partner. Next, we propose a kinematic model for the robot partner based on the Denavit-Hartenberg convention and solve the inverse kinematics using a Bacterial Memetic Algorithm. Then, the emotional model, along with its interactivity with the user, is discussed. Finally, we present experimental results for the proposed model.
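The abstract's kinematic model is built on the Denavit-Hartenberg convention, in which each link is described by four parameters (theta, d, a, alpha) and the end-effector pose is the product of the per-link homogeneous transforms. A minimal sketch of that forward-kinematics chain is shown below; the 2-DOF planar arm and its link lengths are illustrative assumptions, not the paper's actual robot model, and the inverse-kinematics search via the Bacterial Memetic Algorithm is not reproduced here.

```python
# Sketch of forward kinematics under the Denavit-Hartenberg convention.
# The link parameters below describe a hypothetical 2-DOF planar arm,
# not the robot partner used in the paper.
import math

def dh_matrix(theta, d, a, alpha):
    """4x4 homogeneous transform for one DH link (theta, d, a, alpha)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """Multiply two 4x4 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_params):
    """Chain the per-link DH transforms; return end-effector (x, y, z)."""
    T = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    for params in dh_params:
        T = mat_mul(T, dh_matrix(*params))
    return T[0][3], T[1][3], T[2][3]

# Hypothetical arm: two links of length 1.0, first joint at 90 degrees.
x, y, z = forward_kinematics([(math.pi / 2, 0.0, 1.0, 0.0),
                              (0.0,         0.0, 1.0, 0.0)])
# End effector lands at (0, 2, 0): both links point along the +y axis.
```

An inverse-kinematics solver such as the Bacterial Memetic Algorithm would search the joint-angle space to minimize the error between this forward-kinematics output and a target pose.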



