Contrastive Knowledge Amalgamation for Unsupervised Image Classification

07/27/2023
by Shangde Gao, et al.

Knowledge amalgamation (KA) aims to learn a compact student model that handles the joint objective of multiple teacher models, each specialized for its own task. Current methods focus on coarsely aligning teachers and the student in a common representation space, making it difficult for the student to learn proper decision boundaries from a set of heterogeneous teachers. Moreover, the KL divergence used in previous works only minimizes the difference between the probability distributions of teachers and the student, ignoring the intrinsic characteristics of the teachers. We therefore propose a novel Contrastive Knowledge Amalgamation (CKA) framework, which introduces contrastive losses and an alignment loss to achieve intra-class cohesion and inter-class separation. Intra- and inter-model contrastive losses are designed to widen the distance between representations of different classes. The alignment loss minimizes the sample-level distribution differences between teacher and student models in the common representation space. Furthermore, the student learns heterogeneous unsupervised classification tasks efficiently and flexibly through soft targets in the task-level amalgamation. Extensive experiments on benchmarks demonstrate the generalization capability of CKA in amalgamating both a specific task and multiple tasks. Comprehensive ablation studies provide further insight into our CKA.
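The two loss families the abstract describes can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names (`kl_alignment_loss`, `contrastive_loss`), the temperature values, and the supervised-contrastive formulation are illustrative assumptions; the sketch only shows the general shape of a KL-based soft-target alignment term and a contrastive term that pulls same-class representations together and pushes different-class representations apart.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_alignment_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions --
    an illustrative stand-in for a sample-level alignment loss."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)))

def contrastive_loss(features, labels, temperature=0.1):
    """Supervised-contrastive-style loss: for each anchor, same-class
    samples are positives and all other samples form the denominator,
    so same-class pairs are drawn together and other classes pushed apart."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T / temperature
    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue
        others = np.delete(sim[i], i)                  # similarities to all other samples
        log_denominator = np.log(np.sum(np.exp(others)))
        for j in positives:
            loss += -(sim[i, j] - log_denominator)     # -log softmax over the batch
            count += 1
    return loss / max(count, 1)
```

For example, `kl_alignment_loss(x, x)` is zero when student and teacher agree exactly, and `contrastive_loss` is smaller when features of the same class cluster tightly than when classes are interleaved.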


