Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer

07/10/2020
by   Xinghao Chen, et al.

Video style transfer techniques inspire many exciting applications on mobile devices. However, their efficiency and stability are still far from satisfactory. To boost transfer stability across frames, optical flow is widely adopted despite its high computational complexity, e.g. occupying over 97% of the total inference time. In this paper, we propose to learn a lightweight video style transfer network via a knowledge distillation paradigm. We adopt two teacher networks, one of which takes optical flow during inference while the other does not. The output difference between these two teacher networks highlights the improvements made by optical flow, which is then adopted to distill the target student network. Furthermore, a low-rank distillation loss is employed to stabilize the output of the student network by mimicking the rank of the input videos. Extensive experiments demonstrate that our student network, without an optical flow module, is still able to generate stable video and runs much faster than the teacher network.
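The two-teacher distillation and the low-rank loss described above can be illustrated with a minimal NumPy sketch. All function names, the difference-weighted mimicry term, and the nuclear-norm formulation below are illustrative assumptions, not the paper's exact losses:

```python
import numpy as np

def flow_distillation_loss(student_out, teacher_flow_out, teacher_noflow_out):
    # Hypothetical formulation: the difference between the flow-based and
    # flow-free teachers highlights regions that optical flow improved;
    # weight the student's mimicry of the flow-based teacher toward them.
    diff = np.abs(teacher_flow_out - teacher_noflow_out)
    weight = diff / (diff.max() + 1e-8)
    return np.mean(weight * (student_out - teacher_flow_out) ** 2)

def low_rank_loss(student_frames, input_frames):
    # Hypothetical low-rank term: flatten each frame into a row of a matrix
    # and compare nuclear norms (sums of singular values), encouraging the
    # stylized video's rank to mimic that of the input video.
    s = np.linalg.svd(student_frames.reshape(len(student_frames), -1),
                      compute_uv=False)
    i = np.linalg.svd(input_frames.reshape(len(input_frames), -1),
                      compute_uv=False)
    return abs(s.sum() - i.sum())
```

Both terms are differentiable in a deep-learning framework, so in training they would simply be added to the usual style-transfer objective with weighting coefficients.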


Related research

- 11/06/2018 · Evolvement Constrained Adversarial Learning for Video Style Transfer
  Video style transfer is a useful component for applications such as augm...
- 02/11/2021 · Frame Difference-Based Temporal Loss for Video Stylization
  Neural style transfer models have been used to stylize an ordinary video...
- 03/17/2022 · Delta Distillation for Efficient Video Processing
  This paper aims to accelerate video stream processing, such as object de...
- 03/24/2020 · ShadowTutor: Distributed Partial Distillation for Mobile Video DNN Inference
  Following the recent success of deep neural networks (DNN) on video comp...
- 10/25/2022 · GlobalFlowNet: Video Stabilization using Deep Distilled Global Motion Estimates
  Videos shot by laymen using hand-held cameras contain undesirable shaky ...
- 10/09/2022 · Students taught by multimodal teachers are superior action recognizers
  The focal point of egocentric video understanding is modelling hand-obje...
- 05/26/2016 · DeepMovie: Using Optical Flow and Deep Neural Networks to Stylize Movies
  A recent paper by Gatys et al. describes a method for rendering an image...
