Joint Stabilization and Direction of 360° Videos

01/14/2019
by Chengzhou Tang et al.

360° video provides an immersive experience for viewers, allowing them to freely explore the world by turning their head. However, creating high-quality 360° video content can be challenging, as viewers may miss important events by looking in the wrong direction, or they may see things that ruin the immersion, such as stitching artifacts and the film crew. We take advantage of the fact that not all directions are equally likely to be observed; due to ergonomic constraints, most viewers are more likely to see content located at "true north", i.e. in front of them. We therefore propose 360° video direction, where the video is jointly optimized to orient important events to the front of the viewer and visual clutter behind them, while producing smooth camera motion. Unlike traditional video, viewers can still explore the space as desired, but with the knowledge that the most important content is likely to be in front of them. Constraints can be user-guided, either by adding them directly on the equirectangular projection or by recording "guidance" viewing directions while watching the video in a VR headset, or they can be computed automatically, for example from visual saliency or the forward motion direction. To accomplish this, we propose a new motion estimation technique specifically designed for 360° video, which outperforms the commonly used 5-point algorithm on wide-angle video. We additionally formulate the direction problem as an optimization in which a novel parametrization of spherical warping allows us to correct for some degree of parallax effects. We compare our approach to recent methods that address stabilization only and to methods that convert 360° video to narrow field-of-view video.
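
As a concrete illustration of the reorientation at the heart of the abstract (rotating each 360° frame so that a chosen viewing direction lands at "true north", the centre of the equirectangular projection), here is a minimal numpy sketch. It is not the paper's method: the helper names rotation_to_front and rotate_equirect are hypothetical, the rotation is applied per frame with nearest-neighbour sampling, and the temporal smoothing and parallax-correcting spherical warp that the paper optimizes for are omitted.

```python
import numpy as np

def rotation_to_front(target_dir):
    """Rotation matrix that maps the unit vector `target_dir` onto the
    forward axis (0, 0, 1), i.e. the centre of the equirectangular frame.
    Axis-angle construction via Rodrigues' formula."""
    d = np.asarray(target_dir, dtype=np.float64)
    d = d / np.linalg.norm(d)
    front = np.array([0.0, 0.0, 1.0])
    v = np.cross(d, front)                 # rotation axis (unnormalised)
    c = np.dot(d, front)                   # cosine of the rotation angle
    if np.linalg.norm(v) < 1e-8:           # already aligned, or exactly opposite
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

def rotate_equirect(frame, R):
    """Re-render an equirectangular frame (H x W x 3) under rotation R.
    For every output pixel, compute the direction it shows, rotate that
    direction back into the input frame, and sample nearest neighbour."""
    H, W = frame.shape[:2]
    # Output grid: longitude in [-pi, pi), latitude in [-pi/2, pi/2).
    lon = (np.arange(W) + 0.5) / W * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(H) + 0.5) / H * np.pi
    lon, lat = np.meshgrid(lon, lat)
    dirs = np.stack([np.cos(lat) * np.sin(lon),       # x: right
                     np.sin(lat),                     # y: up
                     np.cos(lat) * np.cos(lon)], -1)  # z: forward
    src = dirs @ R                                    # R^T applied row-wise
    src_lon = np.arctan2(src[..., 0], src[..., 2])
    src_lat = np.arcsin(np.clip(src[..., 1], -1.0, 1.0))
    u = ((src_lon + np.pi) / (2 * np.pi) * W).astype(int) % W
    v = ((np.pi / 2 - src_lat) / np.pi * H).astype(int).clip(0, H - 1)
    return frame[v, u]

# Usage sketch: bring the content 90° to the viewer's right to the front.
# frame    = equirectangular image as an (H, W, 3) uint8 array
# R        = rotation_to_front([1.0, 0.0, 0.0])
# directed = rotate_equirect(frame, R)
```

In the paper's setting the per-frame target directions come from user guidance or automatic cues such as saliency, and the resulting rotations are smoothed over time rather than applied independently as above.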

Related research

09/07/2023 · BOLA360: Near-optimal View and Bitrate Adaptation for 360-degree Video Streaming
Recent advances in omnidirectional cameras and AR/VR headsets have spurr...

04/11/2018 · Measurement of exceptional motion in VR video contents for VR sickness assessment using deep convolutional autoencoder
This paper proposes a new objective metric of exceptional motion in VR v...

09/21/2023 · Using Saliency and Cropping to Improve Video Memorability
Video memorability is a measure of how likely a particular video is to b...

11/20/2020 · ATSal: An Attention Based Architecture for Saliency Prediction in 360 Videos
The spherical domain representation of 360 video/image presents many cha...

06/13/2023 · 360TripleView: 360-Degree Video View Management System Driven by Convergence Value of Viewing Preferences
360-degree video has become increasingly popular in content consumption....

03/01/2017 · Making 360° Video Watchable in 2D: Learning Videography for Click Free Viewing
360° video requires human viewers to actively control "where" to look w...

12/12/2017 · Learning Compressible 360° Video Isomers
Standard video encoders developed for conventional narrow field-of-view ...
