Self Correspondence Distillation for End-to-End Weakly-Supervised Semantic Segmentation

02/27/2023
by   Rongtao Xu, et al.

Efficiently training accurate deep models for weakly-supervised semantic segmentation (WSSS) with image-level labels is both important and challenging. End-to-end WSSS methods have recently become a research focus due to their high training efficiency. However, current methods extract insufficiently comprehensive semantic information, resulting in low-quality pseudo-labels and sub-optimal end-to-end WSSS solutions. To this end, we propose a simple yet novel Self Correspondence Distillation (SCD) method that refines pseudo-labels without introducing external supervision. SCD lets the network use feature correspondence derived from itself as a distillation target, enhancing its feature learning by complementing semantic information. In addition, to further improve segmentation accuracy, we design a Variation-aware Refine Module that enhances the local consistency of pseudo-labels by computing pixel-level variation. Finally, we present an efficient end-to-end Transformer-based framework (TSCD) built on SCD and the Variation-aware Refine Module for accurate WSSS. Extensive experiments on the PASCAL VOC 2012 and MS COCO 2014 datasets demonstrate that our method significantly outperforms other state-of-the-art methods. Our code is available at https://github.com/Rongtao-Xu/RepresentationLearning/tree/main/SCD-AAAI2023.
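To illustrate the core idea of self correspondence distillation, the sketch below computes a pairwise feature-correspondence matrix (cosine similarities between spatial locations of a feature map) and matches the correspondence of one feature map against a correspondence target derived from the network itself. This is a minimal, hypothetical NumPy sketch of the general mechanism, not the authors' implementation; the function names, the MSE distillation objective, and the choice of which features serve as the target are assumptions for illustration.

```python
import numpy as np

def feature_correspondence(feats):
    """Pairwise cosine-similarity ("self correspondence") matrix.

    feats: (C, H, W) feature map -> (HW, HW) correspondence matrix,
    where entry (i, j) measures how similar spatial location i is to j.
    """
    c, h, w = feats.shape
    f = feats.reshape(c, h * w)
    # L2-normalize each spatial feature vector so dot products are cosines.
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + 1e-8)
    return f.T @ f

def scd_loss(student_feats, target_feats):
    """Match the student's correspondence map to a correspondence target
    derived from the network itself (e.g., features from another layer or
    view). In a real training loop the target would be detached from the
    gradient graph; here we just compare the two matrices with MSE.
    """
    corr_student = feature_correspondence(student_feats)
    corr_target = feature_correspondence(target_feats)
    return float(np.mean((corr_student - corr_target) ** 2))
```

Using the network's own correspondence as the distillation target means no extra annotations or external teacher are required: the loss simply encourages different parts of the network to agree on which image regions are semantically alike, which is the property the abstract credits with improving pseudo-label quality.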


