Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution

07/18/2022
by   Han Zhu, et al.

Knowledge distillation (KD), which efficiently transfers knowledge from a cumbersome network (teacher) to a compact network (student), has demonstrated its advantages in several computer vision applications. The representation of knowledge is vital for knowledge transfer and student learning, yet it is generally defined in hand-crafted ways or taken directly from intermediate features. In this paper, we propose a model-agnostic meta knowledge distillation method under the teacher-student architecture for single image super-resolution. It provides a more flexible and accurate way for the teacher to transmit knowledge according to the student's ability, via knowledge representation networks (KRNets) with learnable parameters. To make the knowledge representation more perceptive of the student's needs, we model the transformation from intermediate outputs to transferred knowledge using the student features and the teacher-student correlation within the KRNets. Specifically, texture-aware dynamic kernels are generated to extract the texture features to be improved and the corresponding teacher guidance, decomposing the distillation problem into texture-wise supervision that further promotes the recovery of high-frequency details. In addition, the KRNets are optimized in a meta-learning manner to ensure that knowledge transfer and student learning benefit the student's reconstruction quality. Experiments on various single image super-resolution datasets demonstrate that the proposed method outperforms existing distillation methods with pre-defined knowledge representations, and helps super-resolution algorithms achieve better reconstruction quality without introducing any inference complexity.
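The abstract describes a teacher-student objective in which a reconstruction loss on the student is combined with feature-level guidance from the teacher. As a rough orientation only (this is not the paper's KRNet or meta-learning code), the generic form of such a distillation objective can be sketched with plain numpy; the function and parameter names (`distillation_loss`, `alpha`) are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def l1_loss(pred, target):
    # Mean absolute error, a common reconstruction loss in SR.
    return float(np.mean(np.abs(pred - target)))

def distillation_loss(student_sr, hr, student_feat, teacher_feat, alpha=0.5):
    """Generic teacher-student SR objective (illustrative, not the paper's):
    reconstruction loss against the ground-truth HR image, plus a weighted
    feature-matching term that pulls student features toward the teacher's."""
    rec = l1_loss(student_sr, hr)           # student output vs. ground truth
    kd = l1_loss(student_feat, teacher_feat)  # student vs. teacher features
    return rec + alpha * kd

# Toy tensors standing in for network outputs and intermediate features.
rng = np.random.default_rng(0)
hr = rng.random((1, 3, 8, 8))       # ground-truth high-resolution image
student_sr = hr + 0.1               # imperfect student reconstruction
s_feat = rng.random((1, 16, 4, 4))  # student intermediate features
t_feat = s_feat + 0.05              # teacher features the student should match
loss = distillation_loss(student_sr, hr, s_feat, t_feat)
```

The paper's contribution replaces the fixed feature-matching term here with a learnable transformation (the KRNets) whose parameters are themselves optimized via meta-learning, so that what the teacher transfers adapts to the student.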


