ATTACH Dataset: Annotated Two-Handed Assembly Actions for Human Action Understanding

by Dustin Aganian, et al.

With the emergence of collaborative robots (cobots), human-robot collaboration in industrial manufacturing is coming into focus. For a cobot to act autonomously and as an assistant, it must understand human actions during assembly. To effectively train models for this task, a dataset containing suitable assembly actions in a realistic setting is crucial. For this purpose, we present the ATTACH dataset, which contains 51.6 hours of assembly with 95.2k annotated fine-grained actions monitored by three cameras, which represent potential viewpoints of a cobot. Since in an assembly context workers tend to perform different actions simultaneously with their two hands, we annotated the performed actions for each hand separately. Therefore, in the ATTACH dataset, more than 68% of annotations overlap with other annotations, which is more than four times the amount in related datasets, which typically feature more simplistic assembly tasks. For better generalization with respect to the background of the working area, we not only recorded color and depth images, but also used the Azure Kinect body tracking SDK to estimate 3D skeletons of the worker. To create a first baseline, we report the performance of state-of-the-art methods for action recognition as well as action detection on video and skeleton-sequence inputs. The dataset is available at .
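Since actions are annotated separately for each hand, a single recording yields temporally overlapping action intervals. As an illustration of what such per-hand annotations might look like and how an overlap statistic like the one above could be computed, here is a minimal sketch; the `Annotation` structure and field names are hypothetical, not the dataset's actual annotation format.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    start: float  # start time in seconds (hypothetical field)
    end: float    # end time in seconds (hypothetical field)
    hand: str     # "left" or "right"
    label: str    # fine-grained action class

def overlap_fraction(annotations):
    """Fraction of annotations that temporally overlap at least one other
    annotation (e.g. a simultaneous action of the other hand)."""
    n = len(annotations)
    overlapping = 0
    for i, a in enumerate(annotations):
        # Two intervals [a.start, a.end) and [b.start, b.end) overlap
        # iff a.start < b.end and b.start < a.end.
        if any(j != i and a.start < b.end and b.start < a.end
               for j, b in enumerate(annotations)):
            overlapping += 1
    return overlapping / n if n else 0.0

anns = [
    Annotation(0.0, 2.0, "left",  "hold"),
    Annotation(1.0, 3.0, "right", "screw"),
    Annotation(4.0, 5.0, "left",  "pick"),
]
print(overlap_fraction(anns))  # 2 of 3 intervals overlap -> 0.666...
```

The quadratic pairwise check is fine for annotations of a single recording; a sweep over sorted interval endpoints would scale better across the whole dataset.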


Fusing Hand and Body Skeletons for Human Action Recognition in Assembly

As collaborative robots (cobots) continue to gain popularity in industri...

HA-ViD: A Human Assembly Video Dataset for Comprehensive Assembly Knowledge Understanding

Understanding comprehensive assembly knowledge from videos is critical f...

Challenges of the Creation of a Dataset for Vision Based Human Hand Action Recognition in Industrial Assembly

This work presents the Industrial Hand Action Dataset V1, an industrial ...

Fine-grained activity recognition for assembly videos

In this paper we address the task of recognizing assembly actions as a s...

Two-Stage Clustering of Human Preferences for Action Prediction in Assembly Tasks

To effectively assist human workers in assembly tasks a robot must proac...

BusyHands: A Hand-Tool Interaction Database for Assembly Tasks Semantic Segmentation

Visual segmentation has seen tremendous advancement recently with ready ...

How Object Information Improves Skeleton-based Human Action Recognition in Assembly Tasks

As the use of collaborative robots (cobots) in industrial manufacturing ...
