UCF101: A Dataset of 101 Human Actions Classes From Videos in The Wild

by Khurram Soomro, et al.

We introduce UCF101, which is currently the largest dataset of human actions. It consists of 101 action classes, over 13k clips, and 27 hours of video data. The database consists of realistic user-uploaded videos containing camera motion and cluttered backgrounds. Additionally, we provide baseline action recognition results on this new dataset using a standard bag-of-words approach, with an overall performance of 44.5%. To the best of our knowledge, UCF101 is currently the most challenging dataset of actions due to its large number of classes, large number of clips, and the unconstrained nature of those clips.
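The bag-of-words baseline mentioned above follows the standard pipeline: local descriptors extracted from each clip are quantized against a learned codebook of "visual words," and the clip is represented by a normalized word histogram that can be fed to any classifier. The sketch below is purely illustrative (a toy k-means codebook over synthetic descriptors), not the authors' actual feature extraction or classifier:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    """Toy k-means: learn a codebook of k visual words from descriptors X."""
    # Initialize centroids from randomly chosen descriptors.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest centroid.
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned descriptors.
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids

def bow_histogram(descriptors, centroids):
    """Quantize a clip's descriptors and return its L1-normalized histogram."""
    dists = np.linalg.norm(descriptors[:, None] - centroids[None], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(centroids)).astype(float)
    return hist / hist.sum()

# Synthetic stand-in for local spatiotemporal descriptors from training clips.
train_descriptors = rng.normal(size=(500, 16))
codebook = kmeans(train_descriptors, k=8)

# Represent one "clip" (its descriptor set) as a histogram over visual words.
clip_hist = bow_histogram(rng.normal(size=(60, 16)), codebook)
```

In the full pipeline, `clip_hist` would be computed for every training clip and used to train a classifier (e.g., an SVM) over the 101 action classes.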





ARID: A New Dataset for Recognizing Action in the Dark

The task of action recognition in dark videos is useful in various scena...

Hand Action Detection from Ego-centric Depth Sequences with Error-correcting Hough Transform

Detecting hand actions from ego-centric depth sequences is a practically...

ESAD: Endoscopic Surgeon Action Detection Dataset

In this work, we take aim towards increasing the effectiveness of surgic...

Building a Video-and-Language Dataset with Human Actions for Multimodal Logical Inference

This paper introduces a new video-and-language dataset with human action...

Every Moment Counts: Dense Detailed Labeling of Actions in Complex Videos

Every moment counts in action recognition. A comprehensive understanding...

The SARAS Endoscopic Surgeon Action Detection (ESAD) dataset: Challenges and methods

For an autonomous robotic system, monitoring surgeon actions and assisti...

Predicting Ergonomic Risks During Indoor Object Manipulation Using Spatiotemporal Convolutional Networks

Automated real-time prediction of the ergonomic risks of manipulating ob...