The Fixed Sub-Center: A Better Way to Capture Data Complexity

03/24/2022
by   Zhemin Zhang, et al.
Treating each class with a single center can hardly capture the complexity of the data distribution. Using multiple sub-centers per class is an alternative way to address this problem; however, existing multi-subclass methods suffer from three typical issues: highly correlated sub-classes, classifier parameters that grow linearly with the number of classes, and a lack of intra-class compactness. To this end, we propose the Fixed Sub-Center (F-SC), which allows the model to create more discrepant sub-centers while saving memory and considerably cutting computational costs. Specifically, F-SC first samples a class center Ui for each class from a uniform distribution, and then generates a normal distribution for each class whose mean equals Ui. Finally, the sub-centers are sampled from the normal distribution corresponding to each class and are kept fixed during training, avoiding the overhead of gradient computation. Moreover, F-SC penalizes the Euclidean distance between each sample and its corresponding sub-centers, which helps maintain intra-class compactness. Experimental results show that F-SC significantly improves accuracy on both image classification and fine-grained recognition tasks.
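The sub-center construction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the uniform range, the standard deviation `sigma`, the nearest-sub-center assignment, and all function names are assumptions chosen for the sketch.

```python
import numpy as np

def make_fixed_sub_centers(num_classes, num_sub, dim, sigma=0.1, seed=0):
    """Sample a class center Ui per class from a uniform distribution,
    then draw num_sub fixed sub-centers from N(Ui, sigma^2 I)."""
    rng = np.random.default_rng(seed)
    # Class centers Ui ~ Uniform(-1, 1)^dim (the range is an assumed choice).
    centers = rng.uniform(-1.0, 1.0, size=(num_classes, dim))
    # Sub-centers ~ N(Ui, sigma^2 I); fixed, never updated by gradients.
    noise = sigma * rng.standard_normal((num_classes, num_sub, dim))
    return centers[:, None, :] + noise

def intra_class_penalty(features, labels, sub_centers):
    """Average Euclidean distance from each sample to its closest fixed
    sub-center of the ground-truth class (encourages intra-class compactness)."""
    total = 0.0
    for f, y in zip(features, labels):
        # Distance from the sample to each sub-center of its class.
        d = np.linalg.norm(sub_centers[y] - f, axis=1)
        total += d.min()  # pull the sample toward its nearest sub-center
    return total / len(features)
```

Because the sub-centers are generated once from a fixed seed and never trained, they add no parameters to optimize; only the distance penalty enters the loss.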

