Egocentric Hand Track and Object-based Human Action Recognition

05/02/2019
by Georgios Kapidis, et al.

Egocentric vision is an emerging field of computer vision characterized by the acquisition of images and video from the first-person perspective. In this paper we address the challenge of egocentric human action recognition by explicitly utilizing the presence and position of detected regions of interest in the scene, without further use of visual features. First, recognizing that human hands are essential to the execution of actions, we focus on their movements as the principal cues that define actions. We employ object detection and region tracking techniques to locate hands and capture their movements, and prior knowledge about egocentric views facilitates distinguishing the left hand from the right. With regard to detection and tracking, we contribute a pipeline that successfully operates on unseen egocentric videos to find the camera wearer's hands and associate them through time. Moreover, we emphasize the value of scene information for action recognition: the presence of objects is significant both for the execution of actions and, in general, for the description of a scene. To acquire this information, we apply object detection for the specific classes that are relevant to the actions we want to recognize. Our experiments target videos of kitchen activities from the Epic-Kitchens dataset. We model action recognition as a sequence learning problem over the detected spatial positions in the frames. Our results show that explicit hand and object detections, with no other visual information, can be relied upon to classify hand-related human actions. Testing against methods fully dependent on visual features indicates that, for actions where hand motions are conceptually important, a region-of-interest-based description of a video contains equally expressive information and achieves comparable classification performance.
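The central modeling idea, classifying an action from the sequence of per-frame detection coordinates alone, can be illustrated with a small sequence classifier. The sketch below is an assumption for illustration only, not the authors' implementation: the choice of PyTorch, the per-frame feature layout (three bounding boxes plus presence/class indicators), and all layer sizes are hypothetical.

# Minimal sketch (assumed, not the paper's code): classify an action from a
# sequence of per-frame detection descriptors; no visual features are used.
import torch
import torch.nn as nn

class DetectionSequenceClassifier(nn.Module):
    def __init__(self, feat_dim=14, hidden_dim=128, num_actions=10):
        super().__init__()
        # One recurrent layer over the per-frame detection vectors.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_actions)

    def forward(self, x):
        # x: (batch, num_frames, feat_dim) sequence of detection descriptors,
        # e.g. left-hand box, right-hand box, most confident object box + ids.
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])  # logits over action classes

# Usage example: a batch of 2 clips, 32 frames each, 14 numbers per frame
# (3 boxes x 4 coordinates + 2 indicators); the layout is an assumption.
model = DetectionSequenceClassifier()
clips = torch.randn(2, 32, 14)
logits = model(clips)  # shape: (2, 10)

A recurrent classifier is only one way to realize the "sequence learning over detected positions" formulation described in the abstract; the key point it illustrates is that the per-frame input is a handful of coordinates rather than image features.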
