Improving Knowledge Distillation Via Transferring Learning Ability

04/24/2023
by Long Liu, et al.

Existing knowledge distillation methods generally follow a teacher-student paradigm in which the student network learns solely from a well-trained teacher. However, this setup overlooks the inherent difference in learning ability between the teacher and student networks, which gives rise to the capacity-gap problem. To address this limitation, we propose a novel method called SLKD.
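The abstract does not describe SLKD's training procedure, so the sketch below only illustrates the conventional teacher-student distillation objective that such methods build on: a frozen teacher's temperature-softened logits supervise the student alongside the usual cross-entropy on ground-truth labels. The function name, temperature, and weighting factor `alpha` are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Conventional teacher-student KD loss (Hinton-style soft targets).

    `temperature` and `alpha` are illustrative hyperparameters, not
    settings taken from the SLKD paper.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2

    # Hard targets: standard cross-entropy with the true labels.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce


# Minimal usage sketch: the teacher is frozen; only the student is updated.
if __name__ == "__main__":
    batch, num_classes = 8, 10
    student_logits = torch.randn(batch, num_classes, requires_grad=True)
    teacher_logits = torch.randn(batch, num_classes)  # from a well-trained, frozen teacher
    labels = torch.randint(0, num_classes, (batch,))
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(loss.item())
```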
