Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition

12/17/2021
by Guangyu Guo et al.

The great success of deep learning is mainly due to large-scale network architectures and high-quality training data. However, it remains challenging to deploy recent deep models on portable devices with limited memory and imaging ability. Some existing works have attempted to compress the model via knowledge distillation. Unfortunately, these methods cannot handle images of reduced quality, such as low-resolution (LR) images. To this end, we make a pioneering effort to distill helpful knowledge from a heavy network model learned from high-resolution (HR) images to a compact network model that handles LR images, thus advancing the current knowledge distillation technique with the novel pixel distillation. To achieve this goal, we propose a Teacher-Assistant-Student (TAS) framework, which disentangles knowledge distillation into a model compression stage and a high-resolution representation transfer stage. By equipping a novel Feature Super Resolution (FSR) module, our approach learns a lightweight network model that achieves accuracy similar to the heavy teacher model but with far fewer parameters, faster inference, and lower-resolution inputs. Comprehensive experiments on three widely used benchmarks, i.e., CUB-200-2011, PASCAL VOC 2007, and ImageNetSub, demonstrate the effectiveness of our approach.
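The core idea of the high-resolution representation transfer stage can be illustrated with a minimal sketch: student features computed from an LR input are spatially upsampled and channel-projected to match the teacher's HR feature map, then penalized by a mean-squared-error distillation loss. The function names (`fsr_upsample`, `distill_loss`), the nearest-neighbor upsampling, and the 1x1 projection matrix are illustrative assumptions, not the paper's actual FSR module.

```python
import numpy as np

def fsr_upsample(feat, scale, proj):
    """Toy stand-in for an FSR module (assumption, not the paper's design).

    feat: student feature map of shape (C_s, H, W) from an LR input.
    scale: integer spatial upsampling factor.
    proj: (C_t, C_s) matrix acting as a 1x1 conv to match teacher channels.
    """
    # Nearest-neighbor spatial upsampling along height and width.
    up = feat.repeat(scale, axis=1).repeat(scale, axis=2)
    c, h, w = up.shape
    # Apply the 1x1 channel projection to every spatial location.
    return (proj @ up.reshape(c, -1)).reshape(proj.shape[0], h, w)

def distill_loss(student_feat, teacher_feat, scale, proj):
    """MSE between super-resolved student features and teacher features."""
    sr = fsr_upsample(student_feat, scale, proj)
    assert sr.shape == teacher_feat.shape, "shapes must match after FSR"
    return float(np.mean((sr - teacher_feat) ** 2))

# Example: a 4-channel 8x8 student map distilled against a 16x16 teacher map.
rng = np.random.default_rng(0)
s = rng.standard_normal((4, 8, 8))
t = rng.standard_normal((4, 16, 16))
loss = distill_loss(s, t, scale=2, proj=np.eye(4))
```

In practice this loss would be combined with the ordinary task loss and optimized jointly; the projection and upsampling here would be learned layers rather than fixed operations.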

Related research:

03/13/2023 · Continuous sign language recognition based on cross-resolution knowledge distillation
The goal of continuous sign language recognition (CSLR) research is to ap...

04/01/2016 · Adapting Models to Signal Degradation using Distillation
Model compression and knowledge distillation have been successfully appl...

03/08/2022 · PyNET-QxQ: A Distilled PyNET for QxQ Bayer Pattern Demosaicing in CMOS Image Sensor
The deep learning-based ISP models for mobile cameras produce high-quali...

04/09/2019 · Ultrafast Video Attention Prediction with Coupled Knowledge Distillation
Large convolutional neural network models have recently demonstrated imp...

02/10/2023 · CEN-HDR: Computationally Efficient neural Network for real-time High Dynamic Range imaging
High dynamic range (HDR) imaging is still a challenging task in modern d...

03/22/2022 · SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images
Skin cancer is one of the most common types of malignancy, affecting a l...

12/02/2022 · StructVPR: Distill Structural Knowledge with Weighting Samples for Visual Place Recognition
Visual place recognition (VPR) is usually considered as a specific image...
