Fast-Forward Video Based on Semantic Extraction

Thanks to the low operational cost and large storage capacity of smartphones and wearable devices, people are recording many hours of daily activities, sports actions, and home videos. These videos, also known as egocentric videos, are generally long, unedited streams, which makes them tedious and visually unpalatable to watch and raises the challenge of making them more appealing. In this work we propose a novel methodology to compose a fast-forward video by selecting frames based on semantic information extracted from the images. The experiments show that our approach outperforms the state of the art as far as semantic information is concerned, and that it also produces videos that are more pleasant to watch.
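To make the idea of semantics-driven frame selection concrete, the sketch below shows one simple way such a selector could look: each frame receives a semantic relevance score (here a stand-in random scorer; in practice it could be, for example, summed detector confidences), and within every window of `speedup` consecutive frames the highest-scoring frame is kept. This is a minimal illustration under assumed names (`semantic_scores`, `select_frames`), not the paper's actual selection algorithm.

```python
import numpy as np

def semantic_scores(frames):
    """Placeholder semantic scorer: in a real pipeline this could be the
    summed confidence of a face or pedestrian detector run on each frame
    (hypothetical choice for illustration)."""
    rng = np.random.default_rng(0)
    return rng.random(len(frames))  # stand-in for real detector output

def select_frames(scores, speedup=10):
    """Greedy sketch: keep roughly one frame out of every `speedup`,
    preferring the frame with the highest semantic score in each window."""
    scores = np.asarray(scores)
    selected = []
    for start in range(0, len(scores), speedup):
        window = scores[start:start + speedup]
        selected.append(start + int(np.argmax(window)))
    return selected

if __name__ == "__main__":
    frames = list(range(300))              # stand-in for decoded video frames
    scores = semantic_scores(frames)       # per-frame semantic relevance
    keep = select_frames(scores, speedup=10)
    print(f"kept {len(keep)} of {len(frames)} frames:", keep[:5], "...")
```

A full semantic fast-forward method would additionally balance semantic coverage against smooth frame transitions, but the core idea of ranking frames by extracted semantics and subsampling accordingly is what the sketch conveys.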
