Transformer-based Action recognition in hand-object interacting scenarios

10/20/2022
by Hoseong Cho, et al.

This report describes the 2nd place solution to the ECCV 2022 Human Body, Hands, and Activities (HBHA) from Egocentric and Multi-view Cameras Challenge: Action Recognition. The challenge targets recognizing hand-object interactions in an egocentric view. We propose a framework that estimates the keypoints of two hands and an object with a Transformer-based keypoint estimator and recognizes actions from the estimated keypoints. We achieved a top-1 accuracy of 87.19%.
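As a rough illustration of the two-stage design the abstract describes (a Transformer-based keypoint estimator followed by an action classifier over estimated keypoints), a minimal PyTorch-style sketch is given below. This is not the authors' released code: all module names, layer counts, feature sizes, and the keypoint/action counts are assumptions made only for the example.

```python
# Illustrative sketch of a two-stage pipeline: a Transformer-based keypoint
# estimator followed by an action classifier over the estimated keypoints.
# All hyperparameters (42 hand keypoints, 8 object keypoints, 37 action
# classes, feature sizes) are placeholders, not values from the paper.
import torch
import torch.nn as nn


class KeypointEstimator(nn.Module):
    """Predicts keypoints for two hands and an object from per-frame image tokens."""

    def __init__(self, feat_dim=256, num_keypoints=42 + 8, coord_dim=3):
        super().__init__()
        enc_layer = nn.TransformerEncoderLayer(d_model=feat_dim, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=4)
        # One learnable query per keypoint, decoded against the image tokens.
        self.queries = nn.Parameter(torch.randn(num_keypoints, feat_dim))
        dec_layer = nn.TransformerDecoderLayer(d_model=feat_dim, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=4)
        self.head = nn.Linear(feat_dim, coord_dim)

    def forward(self, frame_tokens):            # (B, N_tokens, feat_dim)
        memory = self.encoder(frame_tokens)
        q = self.queries.unsqueeze(0).expand(frame_tokens.size(0), -1, -1)
        out = self.decoder(q, memory)
        return self.head(out)                   # (B, num_keypoints, coord_dim)


class ActionRecognizer(nn.Module):
    """Classifies an action from a sequence of per-frame keypoint sets."""

    def __init__(self, num_keypoints=50, coord_dim=3, d_model=256, num_actions=37):
        super().__init__()
        self.embed = nn.Linear(num_keypoints * coord_dim, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True)
        self.temporal = nn.TransformerEncoder(enc_layer, num_layers=4)
        self.classifier = nn.Linear(d_model, num_actions)

    def forward(self, keypoint_seq):            # (B, T, num_keypoints, coord_dim)
        B, T = keypoint_seq.shape[:2]
        x = self.embed(keypoint_seq.reshape(B, T, -1))
        x = self.temporal(x).mean(dim=1)        # pool over time
        return self.classifier(x)               # (B, num_actions) action logits
```

In this sketch the estimator runs per frame and the recognizer consumes the resulting keypoint sequence; how the two stages are trained and connected in the actual submission is not specified by the abstract.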

