MVStylizer: An Efficient Edge-Assisted Video Photorealistic Style Transfer System for Mobile Phones

05/24/2020
by Ang Li, et al.

Recent research has made great progress in neural style transfer of images, i.e., rendering an image in a desired style. Many users now use their mobile phones to record their daily lives and then edit and share the captured images and videos with other users. However, directly applying existing style transfer approaches to videos, i.e., transferring the style frame by frame, requires an extremely large amount of computing resources, so performing video style transfer on mobile phones remains technically unaffordable. To address this challenge, we propose MVStylizer, an efficient edge-assisted photorealistic video style transfer system for mobile phones. Instead of stylizing every frame, only the key frames of the original video are processed by a pre-trained deep neural network (DNN) on edge servers, while the remaining stylized intermediate frames are generated by our optical-flow-based frame interpolation algorithm on the mobile phone. A meta-smoothing module is also proposed to simultaneously upscale a stylized frame to an arbitrary resolution and remove the style-transfer-related distortions in the upscaled frame. In addition, to continuously enhance the performance of the DNN models on edge servers, we adopt a federated learning scheme that keeps retraining each edge DNN model on data collected from mobile clients and syncing it with a global DNN model on the cloud server. This scheme effectively leverages the diversity of data collected from various mobile clients and efficiently improves system performance. Our experiments demonstrate that MVStylizer generates stylized videos with even better visual quality than the state-of-the-art method while achieving a 75.5× speedup on 1920×1080 videos.
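As a rough illustration of the frame-interpolation idea, the sketch below propagates a stylized key frame to a neighboring intermediate frame by backward-warping it with dense optical flow (here OpenCV's Farneback estimator). The function name and parameter choices are illustrative assumptions, not the paper's implementation; the actual algorithm may differ in flow estimation, occlusion handling, and blending between key frames.

```python
# Hedged sketch: propagate a stylized key frame to an intermediate frame
# via dense optical flow (illustrative, not the paper's exact algorithm).
import cv2
import numpy as np

def propagate_style(key_frame, stylized_key, intermediate_frame):
    """Warp the stylized key frame onto an intermediate frame using optical flow."""
    key_gray = cv2.cvtColor(key_frame, cv2.COLOR_BGR2GRAY)
    mid_gray = cv2.cvtColor(intermediate_frame, cv2.COLOR_BGR2GRAY)

    # Dense flow from the intermediate frame to the key frame (Farneback).
    flow = cv2.calcOpticalFlowFarneback(
        mid_gray, key_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Backward warping: sample the stylized key frame at the displaced coordinates.
    h, w = mid_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(stylized_key, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```

Because only the key frames go through the DNN on the edge server, this per-frame warp is cheap enough to run on the phone, which is the source of the reported speedup.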

Related research:

NLUT: Neural-based 3D Lookup Tables for Video Photorealistic Style Transfer (03/16/2023)
Video photorealistic style transfer is desired to generate videos with a...

Characterizing and Improving Stability in Neural Style Transfer (05/05/2017)
Recent progress in style transfer on images has focused on improving the...

Instant Photorealistic Style Transfer: A Lightweight and Adaptive Approach (09/18/2023)
In this paper, we propose an Instant Photorealistic Style Transfer (IPST...

Learning to Transfer Visual Effects from Videos to Images (12/03/2020)
We study the problem of animating images by transferring spatio-temporal...

Neural Stereoscopic Image Style Transfer (02/27/2018)
Neural style transfer is an emerging technique which is able to endow da...

Kunster – AR Art Video Maker – Real time video neural style transfer on mobile devices (05/07/2020)
Neural style transfer is a well-known branch of deep learning research, ...

DeepMovie: Using Optical Flow and Deep Neural Networks to Stylize Movies (05/26/2016)
A recent paper by Gatys et al. describes a method for rendering an image...
