Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering

07/20/2023
by   Yijun Dong, et al.

Despite the empirical success and practical significance of (relational) knowledge distillation that matches (the relations of) features between teacher and student models, the corresponding theoretical interpretations remain limited for various knowledge distillation paradigms. In this work, we take an initial step toward a theoretical understanding of relational knowledge distillation (RKD), with a focus on semi-supervised classification problems. We start by casting RKD as spectral clustering on a population-induced graph unveiled by a teacher model. Via a notion of clustering error that quantifies the discrepancy between the predicted and ground truth clusterings, we illustrate that RKD over the population provably leads to low clustering error. Moreover, we provide a sample complexity bound for RKD with limited unlabeled samples. For semi-supervised learning, we further demonstrate the label efficiency of RKD through a general framework of cluster-aware semi-supervised learning that assumes low clustering errors. Finally, by unifying data augmentation consistency regularization into this cluster-aware framework, we show that despite the common effect of learning accurate clusterings, RKD facilitates a "global" perspective through spectral clustering, whereas consistency regularization focuses on a "local" perspective via expansion.
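The paper's central casting of RKD as spectral clustering on a teacher-induced graph can be illustrated with a minimal NumPy sketch. This is not the paper's algorithm or its population-level analysis; it is an assumed toy pipeline in which teacher features define a Gaussian similarity graph (`teacher_similarity` and the `sigma` bandwidth are illustrative choices), and the bottom eigenvectors of the normalized Laplacian recover the cluster structure that a student could then be trained to match.

```python
import numpy as np

def teacher_similarity(features, sigma=1.0):
    """Gaussian similarity graph over teacher features (an illustrative choice)."""
    sq = np.sum(features ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * features @ features.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def spectral_embedding(W, k):
    """Bottom-k eigenvectors of the normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.

    For a graph with k well-separated clusters, these eigenvectors are
    (approximately) piecewise constant on the clusters, so simple
    post-processing (e.g. k-means, or a sign split for k = 2) reads off
    the clustering.
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return vecs[:, :k]
```

For two well-separated clusters, the sign of the second eigenvector (the Fiedler vector) partitions the samples, and the fraction of samples on the wrong side of the ground-truth partition is one concrete stand-in for the paper's notion of clustering error.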


