Photometric Stabilization for Fast-forward Videos

09/29/2017
by Xuaner Cecilia Zhang, et al.

Videos captured by consumer cameras often exhibit temporal variations in color and tone caused by camera auto-adjustments such as white balance and exposure. When such videos are sub-sampled to play fast-forward, as in the increasingly popular timelapse and hyperlapse formats, these temporal variations are exacerbated and appear as visually disturbing high-frequency flickering. Previous techniques for photometrically stabilizing videos typically rely on computing dense correspondences between video frames and use these correspondences to remove all color changes in the video sequence. However, this approach is ill-suited to fast-forward videos, which often have large content changes between frames and may also exhibit changes in scene illumination that should be preserved. In this work, we propose a novel photometric stabilization algorithm for fast-forward videos that is robust to large content variation across frames. We compute pairwise color and tone transformations between neighboring frames and smooth these pairwise transformations while taking into account the possibility of scene/content variations. This allows us to eliminate high-frequency fluctuations while still adapting to real variations in scene characteristics. We evaluate our technique on a new dataset consisting of controlled synthetic and real videos, and demonstrate that it outperforms the state of the art.
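The pipeline described above (pairwise color/tone transforms followed by temporal smoothing that removes only high-frequency fluctuation) can be illustrated with a small NumPy sketch. This is not the authors' implementation: it assumes a simple per-channel gain/offset model fit by least squares, replaces the paper's content-aware smoothing with a plain Gaussian filter over the transform parameters, and ignores spatial alignment and tone curves. The function names (`pairwise_gain_offset`, `stabilize`) and parameters (`sigma`, `stride`) are invented for this sketch.

```python
# Minimal sketch of pairwise-transform estimation + temporal smoothing.
# Assumptions (not from the paper): per-channel gain/offset color model,
# Gaussian low-pass smoothing, roughly aligned frames.

import numpy as np
from scipy.ndimage import gaussian_filter1d


def pairwise_gain_offset(prev, curr, stride=4):
    """Least-squares per-channel gain/offset mapping curr -> prev."""
    gains, offsets = np.ones(3), np.zeros(3)
    for c in range(3):
        # Subsample pixels for speed; assumes frames are roughly aligned.
        x = curr[::stride, ::stride, c].ravel().astype(np.float64)
        y = prev[::stride, ::stride, c].ravel().astype(np.float64)
        A = np.stack([x, np.ones_like(x)], axis=1)
        (g, o), *_ = np.linalg.lstsq(A, y, rcond=None)
        gains[c], offsets[c] = g, o
    return gains, offsets


def stabilize(frames, sigma=5.0):
    """frames: list of HxWx3 float images in [0, 1]; returns corrected copies."""
    n = len(frames)
    gains = np.ones((n, 3))
    offsets = np.zeros((n, 3))
    # Chain pairwise transforms into per-frame transforms w.r.t. frame 0.
    for t in range(1, n):
        g, o = pairwise_gain_offset(frames[t - 1], frames[t])
        gains[t] = gains[t - 1] * g
        offsets[t] = gains[t - 1] * o + offsets[t - 1]

    # Temporally smooth the accumulated parameters; the deviation from the
    # smoothed path is the high-frequency flicker to remove.
    smooth_g = np.maximum(gaussian_filter1d(gains, sigma, axis=0), 1e-6)
    smooth_o = gaussian_filter1d(offsets, sigma, axis=0)

    out = []
    for t in range(n):
        # Correction = (smoothed transform)^-1 composed with the accumulated
        # transform, applied per channel: maps frame t onto the smoothed path.
        g = gains[t] / smooth_g[t]
        o = (offsets[t] - smooth_o[t]) / smooth_g[t]
        out.append(np.clip(frames[t] * g + o, 0.0, 1.0))
    return out
```

Because only the deviation from the smoothed transform path is undone, gradual illumination changes survive while frame-to-frame flicker is suppressed. In the actual method, the smoothing additionally accounts for scene/content variation rather than using a fixed Gaussian filter.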
