Positive-Negative Equal Contrastive Loss for Semantic Segmentation

07/04/2022
by Jing Wang, et al.

Contextual information is critical for various computer vision tasks; previous works commonly design plug-and-play modules and structural losses to effectively extract and aggregate global context. These methods use fine-grained labels to optimize the model but ignore that well-trained features are themselves a precious training resource, as they can introduce a preferable distribution to hard pixels (i.e., misclassified pixels). Inspired by contrastive learning in the unsupervised paradigm, we apply the contrastive loss in a supervised manner and re-design the loss function to cast off the stereotypes of unsupervised learning (e.g., the imbalance of positives and negatives, and confusion in computing anchors). To this end, we propose the Positive-Negative Equal contrastive loss (PNE loss), which increases the latent impact of positive embeddings on the anchor and treats positive and negative sample pairs equally. The PNE loss can be plugged directly into existing semantic segmentation frameworks and leads to excellent performance with negligible extra computational cost. We employ a number of classic segmentation methods (e.g., DeepLabV3, OCRNet, UperNet) and backbones (e.g., ResNet, HRNet, Swin Transformer) to conduct comprehensive experiments, and achieve state-of-the-art performance on two benchmark datasets (Cityscapes and COCO-Stuff). Our code will be publicly available soon.
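The abstract does not give the exact formulation, but the core idea — a supervised contrastive loss in which the positive set and the negative set weigh against each other symmetrically, rather than one positive pair being drowned out by many negatives — can be sketched roughly as below. This is a hypothetical NumPy sketch, not the paper's actual PNE loss: the function name, the cosine-similarity choice, and the temperature `tau` are all illustrative assumptions.

```python
import numpy as np

def pne_style_loss(anchor, positives, negatives, tau=0.1):
    """Sketch of a supervised contrastive loss where the whole positive
    set appears in the numerator, so positives and negatives contribute
    as equally weighted sets (illustrative; not the paper's exact loss).

    anchor:    (d,)   embedding of one (hard) pixel
    positives: (P, d) embeddings of same-class pixels
    negatives: (N, d) embeddings of other-class pixels
    """
    def sims(a, rows):
        # Cosine similarity of the anchor to each row embedding.
        a = a / np.linalg.norm(a)
        rows = rows / np.linalg.norm(rows, axis=1, keepdims=True)
        return rows @ a

    pos = np.exp(sims(anchor, positives) / tau).sum()
    neg = np.exp(sims(anchor, negatives) / tau).sum()
    # Summing over all positives (instead of using a single positive
    # pair) balances the positive term against the negative term.
    return -np.log(pos / (pos + neg))
```

The loss falls toward zero when same-class pixels dominate the anchor's similarity mass, and grows when other-class pixels do — the pull toward positives and the push from negatives enter through symmetric terms.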


