Orderly Dual-Teacher Knowledge Distillation for Lightweight Human Pose Estimation

04/21/2021
by Zhong-Qiu Zhao, et al.

Although deep convolutional neural networks (DCNNs) have achieved excellent performance in human pose estimation, these networks often have a large number of parameters and computations, leading to slow inference. An effective solution to this issue is knowledge distillation, which transfers knowledge from a large pre-trained network (teacher) to a small network (student). However, existing approaches have several defects: (I) only a single teacher is adopted, neglecting the potential for a student to learn from multiple teachers; (II) the human segmentation mask, which can serve as additional prior information to constrain keypoint locations, is never utilized; (III) a student with few parameters cannot fully imitate the heatmaps provided by datasets and teachers; (IV) heatmaps generated by teachers contain noise, which causes model degradation. To overcome these defects, we propose an orderly dual-teacher knowledge distillation (ODKD) framework consisting of two teachers with different capabilities. Specifically, the weaker one (primary teacher, PT) teaches keypoint information, while the stronger one (senior teacher, ST) transfers both segmentation and keypoint information by adding the human segmentation mask. Building on the dual teachers, an orderly learning strategy is proposed to promote knowledge absorption. Moreover, we employ a binarization operation that further improves the learning ability of the student and reduces noise in the heatmaps. Experimental results on the COCO and OCHuman keypoint datasets show that our proposed ODKD improves the performance of different lightweight models by a large margin, and HRNet-W16 equipped with ODKD achieves state-of-the-art performance for lightweight human pose estimation.
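The abstract's core ideas, distilling from binarized teacher heatmaps and switching teachers in an orderly fashion, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the loss weighting `alpha`, the binarization threshold, and all function names are illustrative assumptions.

```python
import numpy as np

def binarize(heatmap, thresh=0.5):
    # Binarization suppresses low-confidence responses, reducing the
    # noise present in teacher-generated heatmaps (defect IV).
    return (heatmap >= thresh).astype(heatmap.dtype)

def distill_loss(student_hm, teacher_hm, gt_hm, alpha=0.5, thresh=0.5):
    # Weighted sum of (a) MSE to the binarized teacher heatmap and
    # (b) MSE to the ground-truth heatmap. alpha balances the two terms;
    # the paper's actual weighting is not specified here.
    t = binarize(teacher_hm, thresh)
    mse_teacher = np.mean((student_hm - t) ** 2)
    mse_gt = np.mean((student_hm - gt_hm) ** 2)
    return alpha * mse_teacher + (1 - alpha) * mse_gt

def orderly_training_step(stage, student_hm, pt_hm, st_hm, gt_hm):
    # Orderly strategy (sketch): distill keypoint knowledge from the
    # primary teacher first, then richer keypoint + segmentation
    # knowledge from the senior teacher in a later stage.
    teacher_hm = pt_hm if stage == "primary" else st_hm
    return distill_loss(student_hm, teacher_hm, gt_hm)
```

A student that already matches the binarized teacher and the ground truth incurs zero loss, while deviations from either target increase the penalty in proportion to `alpha`.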

Related research

- 01/15/2020: Lightweight 3D Human Pose Estimation Network Training Using Teacher-Student Learning. "We present MoVNect, a lightweight deep neural network to capture 3D huma..."
- 08/04/2021: Online Knowledge Distillation for Efficient Pose Estimation. "Existing state-of-the-art human pose estimation methods require heavy co..."
- 05/24/2023: On Correlated Knowledge Distillation for Monitoring Human Pose with Radios. "In this work, we propose and develop a simple experimental testbed to st..."
- 12/17/2020: Invariant Teacher and Equivariant Student for Unsupervised 3D Human Pose Estimation. "We propose a novel method based on teacher-student learning framework fo..."
- 09/20/2019: Learning Lightweight Pedestrian Detector with Hierarchical Knowledge Distillation. "It remains very challenging to build a pedestrian detection system for r..."
- 04/11/2019: Improved training of binary networks for human pose estimation and image recognition. "Big neural networks trained on large datasets have advanced the state-of..."
- 08/17/2023: Learning Through Guidance: Knowledge Distillation for Endoscopic Image Classification. "Endoscopy plays a major role in identifying any underlying abnormalities..."
