Towards General and Fast Video Derain via Knowledge Distillation

08/10/2023
by Defang Cai, et al.

As a common natural weather condition, rain can obscure video frames and thus degrade the performance of visual systems, so video deraining has received considerable attention. In natural environments, rain exhibits a wide variety of streak types, which increases the difficulty of rain removal. In this paper, we propose a Rain Review-based General video derain Network via knowledge distillation (named RRGNet) that handles different rain streak types with a single set of pre-trained weights. Specifically, we design a frame-grouping-based encoder-decoder network that makes full use of the temporal information in the video. Furthermore, we use the old-task model to guide the current model in learning new rain streak types while avoiding forgetting. To consolidate the network's deraining ability, we design a rain review module that replays data from old tasks for the current model. Experimental results show that our general method achieves the best results in terms of running speed and deraining quality.
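The abstract outlines a review-style distillation scheme: a frozen model trained on earlier rain-streak types guides the current model, while a rain review module replays old-task data. As a rough illustration of that idea only, the hypothetical PyTorch-style sketch below combines a supervised deraining loss on the new task with a review loss on replayed frames; the function names, loss choices, and weighting are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of review-based distillation for continual video deraining.
# "old_model" is a frozen copy trained on previous rain-streak types (the teacher);
# "current_model" is being trained on a new type (the student). All names and
# loss weights here are illustrative assumptions, not the paper's actual code.
import torch
import torch.nn.functional as F

def training_step(current_model, old_model, new_frames, clean_new,
                  replay_frames, clean_replay, alpha=0.5):
    """One step: supervised deraining on the new task plus a review loss on
    frames replayed from old tasks, distilled from the frozen old-task model."""
    old_model.eval()

    # Supervised loss on the new rain-streak type.
    loss_new = F.l1_loss(current_model(new_frames), clean_new)

    # Review: the old-task model's predictions on replayed frames act as soft
    # targets, anchoring the current model to previously learned behaviour.
    with torch.no_grad():
        teacher_out = old_model(replay_frames)
    student_out = current_model(replay_frames)
    loss_review = F.l1_loss(student_out, teacher_out) + F.l1_loss(student_out, clean_replay)

    # Total objective: learn the new streak type while reviewing old ones.
    return loss_new + alpha * loss_review
```

In this sketch the replayed frames are simply passed in as an extra batch argument; in the paper they would be produced by the rain review module.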


Related research

08/22/2022 · Tree-structured Auxiliary Online Knowledge Distillation
Traditional knowledge distillation adopts a two-stage training process i...

06/11/2019 · Incremental Classifier Learning Based on PEDCC-Loss and Cosine Distance
The main purpose of incremental learning is to learn new knowledge while...

04/09/2019 · Back to the Future: Knowledge Distillation for Human Action Anticipation
We consider the task of training a neural network to anticipate human ac...

07/03/2023 · Review helps learn better: Temporal Supervised Knowledge Distillation
Reviewing plays an important role when learning knowledge. The knowledge...

10/20/2020 · Fast Video Salient Object Detection via Spatiotemporal Knowledge Distillation
Since the wide employment of deep learning frameworks in video salient o...

02/15/2023 · Offline-to-Online Knowledge Distillation for Video Instance Segmentation
In this paper, we present offline-to-online knowledge distillation (OOKD...

04/09/2019 · Ultrafast Video Attention Prediction with Coupled Knowledge Distillation
Large convolutional neural network models have recently demonstrated imp...
