Okutama-Action: An Aerial View Video Dataset for Concurrent Human Action Detection

06/09/2017
by Mohammadamin Barekatain, et al.

Despite significant progress in the development of human action detection datasets and algorithms, no current dataset is representative of real-world aerial view scenarios. We present Okutama-Action, a new video dataset for aerial view concurrent human action detection. It consists of 43 fully-annotated, minute-long sequences with 12 action classes. Okutama-Action features many challenges missing in current datasets, including dynamic transitions of actions, significant changes in scale and aspect ratio, abrupt camera movement, and multi-labeled actors. As a result, our dataset is more challenging than existing ones, and will help push the field forward to enable real-world applications.
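To make "multi-labeled actors" concrete, the sketch below shows one way per-frame annotations of this kind could be represented and parsed in Python. The column layout (track id, box coordinates, frame index, then one or more action labels) and the names ActorAnnotation and parse_line are illustrative assumptions for this sketch, not the dataset's documented format.

```python
# Hypothetical sketch: the real Okutama-Action annotation format may differ.
# Illustrates per-frame annotations where a single actor can carry several
# concurrent action labels, as described in the abstract.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ActorAnnotation:
    """One actor's bounding box and action labels in a single frame."""
    track_id: int
    frame: int
    bbox: Tuple[int, int, int, int]  # (xmin, ymin, xmax, ymax) in pixels
    actions: List[str] = field(default_factory=list)  # concurrent labels


def parse_line(line: str) -> ActorAnnotation:
    """Parse one whitespace-separated line of the assumed form:
    track_id xmin ymin xmax ymax frame action1 [action2 ...]"""
    parts = line.split()
    track_id, xmin, ymin, xmax, ymax, frame = map(int, parts[:6])
    return ActorAnnotation(
        track_id=track_id,
        frame=frame,
        bbox=(xmin, ymin, xmax, ymax),
        actions=parts[6:],  # zero or more action labels for this actor
    )


# Example: an actor carrying two concurrent action labels in frame 451.
example = parse_line("3 120 80 180 260 451 Walking Carrying")
print(example.actions)  # ['Walking', 'Carrying']
```

A representation like this makes the multi-label case explicit: detection and evaluation code must accept a set of labels per actor per frame rather than a single class.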


Related research

05/19/2017
The Kinetics Human Action Video Dataset
We describe the DeepMind Kinetics human action video dataset. The datase...

02/08/2022
Untrimmed Action Anticipation
Egocentric action anticipation consists in predicting a future action th...

04/02/2023
From Isolated Islands to Pangea: Unifying Semantic Space for Human Action Understanding
Action understanding matters and attracts attention. It can be formed as...

06/12/2020
ESAD: Endoscopic Surgeon Action Detection Dataset
In this work, we take aim towards increasing the effectiveness of surgic...

10/21/2020
A Short Note on the Kinetics-700-2020 Human Action Dataset
We describe the 2020 edition of the DeepMind Kinetics human action datas...

04/07/2021
The SARAS Endoscopic Surgeon Action Detection (ESAD) dataset: Challenges and methods
For an autonomous robotic system, monitoring surgeon actions and assisti...

05/06/2019
Emergent Leadership Detection Across Datasets
Automatic detection of emergent leaders in small groups from nonverbal b...
