Learning with Privileged Information for Efficient Image Super-Resolution

07/15/2020
by Wonkyung Lee, et al.

Convolutional neural networks (CNNs) have enabled remarkable advances in single image super-resolution (SISR) over the last decade. Most CNN-based SR methods have focused on achieving gains in quality metrics, such as PSNR and SSIM, over classical approaches, and they typically require a large amount of memory and computation. FSRCNN, which consists of a small number of convolutional layers, has shown promising results while using an extremely small number of network parameters. In this paper, we introduce a novel distillation framework, consisting of teacher and student networks, that drastically boosts the performance of FSRCNN. To this end, we propose to use ground-truth high-resolution (HR) images as privileged information. The encoder in the teacher learns the degradation process, that is, the subsampling of HR images, using an imitation loss. The student and the decoder in the teacher, both having the same network architecture as FSRCNN, try to reconstruct HR images. Intermediate features in the decoder, which are affordable for the student to learn, are transferred to the student through feature distillation. Experimental results on standard benchmarks demonstrate the effectiveness and the generalization ability of our framework, which significantly boosts the performance of FSRCNN as well as other SR methods. Our code and model are available online: https://cvlab.yonsei.ac.kr/projects/PISR.
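The exact loss functions are defined in the full paper; purely as an illustration of the training scheme described above, a student objective of this kind typically combines a reconstruction term on the super-resolved output with a feature-distillation term that pulls the student's intermediate features toward the teacher decoder's. The function names, the weighting factor `lam`, and the choice of L1/L2 distances below are assumptions for this sketch, not the paper's specification:

```python
import numpy as np


def l1_loss(pred, target):
    # Mean absolute error, a common reconstruction loss in SR.
    return np.abs(pred - target).mean()


def distillation_loss(student_feat, teacher_feat):
    # Mean squared distance between intermediate feature maps; the
    # teacher's features act as fixed targets (no gradient flows
    # back into the teacher during student training).
    return ((student_feat - teacher_feat) ** 2).mean()


def total_student_loss(sr, hr, student_feat, teacher_feat, lam=1.0):
    # Overall student objective (hypothetical weighting `lam`):
    # reconstruct the HR image while imitating the teacher's
    # intermediate representation.
    return l1_loss(sr, hr) + lam * distillation_loss(student_feat, teacher_feat)
```

Since the student and the teacher's decoder share the FSRCNN architecture, corresponding intermediate feature maps have matching shapes, so the distillation term can be computed without extra adaptation layers.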


Related research

05/25/2021 · Towards Compact Single Image Super-Resolution via Contrastive Self-distillation
Convolutional neural networks (CNNs) are highly successful for super-res...

07/18/2022 · Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution
Knowledge distillation (KD), which can efficiently transfer knowledge fr...

10/06/2021 · Inter-Domain Alignment for Predicting High-Resolution Brain Networks Using Teacher-Student Learning
Accurate and automated super-resolution image synthesis is highly desire...

03/26/2018 · Fast and Accurate Single Image Super-Resolution via Information Distillation Network
Recently, deep convolutional neural networks (CNNs) have been demonstrat...

11/29/2022 · Feature-domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution
Recently, CNN-based SISR has numerous parameters and high computational ...

02/26/2021 · Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks
In recent years, Convolutional Neural Networks (CNNs) have enabled ubiqu...

11/22/2021 · Local-Selective Feature Distillation for Single Image Super-Resolution
Recent improvements in convolutional neural network (CNN)-based single i...
