Sociable and Ergonomic Human-Robot Collaboration through Action Recognition and Augmented Hierarchical Quadratic Programming

07/07/2022
by Francesco Tassi, et al.

The recognition of actions performed by humans and the anticipation of their intentions are important enablers for sociable and successful collaboration in human-robot teams. At the same time, robots should be able to handle multiple objectives and constraints arising from the collaborative task or from the human. To this end, we propose vision techniques for human action recognition and image classification, integrated into an Augmented Hierarchical Quadratic Programming (AHQP) scheme that hierarchically optimizes both the robot's reactive behavior and human ergonomics. The proposed framework allows the user to intuitively command the robot in space while a task is being executed. Experiments confirm improved ergonomics and usability, which are fundamental to reducing musculoskeletal disorders and increasing trust in automation.
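The core idea behind hierarchical quadratic programming, on which the AHQP scheme builds, is lexicographic optimization: each lower-priority task is solved only within the null space of the higher-priority tasks, so it can never degrade the primary objective. A minimal sketch of that priority mechanism, with illustrative task matrices that are not taken from the paper:

```python
# Minimal sketch of lexicographic (hierarchical) least-squares, the
# priority mechanism underlying HQP controllers. Each level optimizes
# only inside the null space of all higher-priority levels.
import numpy as np
from scipy.linalg import null_space, lstsq

def hierarchical_lsq(tasks):
    """Solve min ||A_k x - b_k|| level by level, restricted to the null
    space of higher-priority tasks. tasks = [(A1, b1), (A2, b2), ...]."""
    n = tasks[0][0].shape[1]
    x = np.zeros(n)
    Z = np.eye(n)                      # basis of the remaining freedom
    for A, b in tasks:
        AZ = A @ Z
        z, *_ = lstsq(AZ, b - A @ x)   # optimize within remaining freedom
        x = x + Z @ z
        N = null_space(AZ)             # freedom left for lower priorities
        if N.size == 0:
            break                      # no freedom left; stop descending
        Z = Z @ N
    return x

# Priority 1: fix the first two coordinates; priority 2: pull all toward 1.
A1, b1 = np.hstack([np.eye(2), np.zeros((2, 2))]), np.array([0.5, -0.5])
A2, b2 = np.eye(4), np.ones(4)
x = hierarchical_lsq([(A1, b1), (A2, b2)])
print(x)  # task 1 is met exactly; task 2 is met only where freedom remains
```

In a robot controller the decision variable would be joint velocities or accelerations and the levels would encode constraints, the collaborative task, and ergonomic objectives; the null-space projection is what makes the prioritization strict rather than a weighted trade-off.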


Related research

- Toward Mutual Trust Modeling in Human-Robot Collaboration (11/02/2020): The recent revolution of intelligent systems made it possible for robots...
- Learning Action Duration and Synergy in Task Planning for Human-Robot Collaboration (10/21/2022): A good estimation of the actions' cost is key in task planning for human...
- ARTiS: Appearance-based Action Recognition in Task Space for Real-Time Human-Robot Collaboration (10/18/2016): To have a robot actively supporting a human during a collaborative task,...
- Human-Robot Handovers using Task-Space Quadratic Programming (06/18/2022): Bidirectional object handover between a human and a robot enables an imp...
- "Grip-that-there": An Investigation of Explicit and Implicit Task Allocation Techniques for Human-Robot Collaboration (02/01/2021): In ad-hoc human-robot collaboration (HRC), humans and robots work on a t...
- Beyond the Self: Using Grounded Affordances to Interpret and Describe Others' Actions (02/26/2019): We propose a developmental approach that allows a robot to interpret and...
- Interactive Robot Learning of Gestures, Language and Affordances (11/24/2017): A growing field in robotics and Artificial Intelligence (AI) research is...
