ClonalNet: Classifying Better by Focusing on Confusing Categories

10/14/2021
by   Xue Zhang, et al.

Existing neural classification networks predominantly adopt one-hot encoding because of its simplicity in representing categorical data. However, the one-hot representation neglects inter-category correlations, which may result in poor generalization. We observe that a pre-trained baseline network attends to the target image region even when it misclassifies the image, revealing which categories confuse the baseline. This observation motivates us to exploit inter-category correlations. We therefore propose a clonal network, named ClonalNet, which learns to discriminate between the confusing categories derived from the pre-trained baseline. The ClonalNet architecture can be identical to or smaller than the baseline architecture. When identical, ClonalNet is a clonal version of the baseline but does not share its weights. When smaller, the training of ClonalNet resembles standard knowledge distillation. The difference from knowledge distillation is that we design a focusing-picking loss to optimize ClonalNet. This loss forces ClonalNet to concentrate on the confusing categories and to make more confident predictions on ground-truth labels, using the baseline as a reference. Experiments show that ClonalNet significantly outperforms both the baseline networks and knowledge distillation.
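
The abstract does not spell out the loss, but the idea can be sketched. Below is a minimal PyTorch sketch of what a focusing-picking loss might look like, assuming the "confusing categories" are taken as the frozen baseline's top-k scoring classes; the function name, the top-k selection rule, and the weighting parameter `alpha` are illustrative assumptions, not the authors' formulation.

```python
import torch
import torch.nn.functional as F


def focusing_picking_loss(clonal_logits, baseline_logits, targets, k=5, alpha=0.5):
    """Hypothetical focusing-picking loss (a sketch, not the paper's exact form).

    clonal_logits:   (B, C) logits from ClonalNet (being trained)
    baseline_logits: (B, C) logits from the frozen, pre-trained baseline
    targets:         (B,)   ground-truth class indices
    """
    with torch.no_grad():
        # "Focusing": take the k categories the baseline scores highest
        # as the confusing categories for each image.
        _, confusing = baseline_logits.topk(k, dim=1)                 # (B, k)
        # "Picking": append the ground-truth label so it is always in
        # the subset. (If it already appears in the top-k it is counted
        # twice; a fuller implementation would deduplicate.)
        picked = torch.cat([confusing, targets.unsqueeze(1)], dim=1)  # (B, k+1)

    # Cross-entropy restricted to the picked categories: ClonalNet must
    # be confident on the true label relative to its confusers.
    sub_logits = clonal_logits.gather(1, picked)
    sub_targets = torch.full_like(targets, k)  # true label sits at index k
    focus_loss = F.cross_entropy(sub_logits, sub_targets)

    # Standard cross-entropy over all classes keeps global calibration.
    full_loss = F.cross_entropy(clonal_logits, targets)
    return alpha * focus_loss + (1 - alpha) * full_loss
```

Restricting the cross-entropy to the picked subset sharpens the margin between the ground truth and its closest confusers, which matches the abstract's stated goal of more confident predictions on ground-truth labels with the baseline as reference.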
