Weakly Supervised Semantic Segmentation via Alternative Self-Dual Teaching

12/17/2021
by Dingwen Zhang, et al.

Current weakly supervised semantic segmentation (WSSS) frameworks usually consist of a separate mask-refinement model and a main semantic region mining model. These approaches contain redundant feature extraction backbones and biased learning objectives, making them computationally complex yet sub-optimal for addressing the WSSS task. To solve this problem, this paper establishes a compact learning framework that embeds the classification and mask-refinement components into a unified deep model. With a shared feature extraction backbone, our model facilitates knowledge sharing between the two components while keeping computational complexity low. To encourage high-quality knowledge interaction, we propose a novel alternative self-dual teaching (ASDT) mechanism. Unlike the conventional distillation strategy, the knowledge of the two teacher branches in our model is alternately distilled to the student branch, guided by a pulse-width modulation (PWM) scheme that generates a PW-wave-like selection signal for the knowledge distillation process. In this way, the student branch is prevented from falling into local-minimum solutions caused by the imperfect knowledge provided by either teacher branch. Comprehensive experiments on the PASCAL VOC 2012 and COCO-Stuff 10K datasets demonstrate the effectiveness of the proposed alternative self-dual teaching mechanism as well as the new state-of-the-art performance of our approach.
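The alternating-distillation idea can be pictured with a short sketch. Below is a minimal PyTorch-style illustration of the mechanism as the abstract describes it: a square-wave (PW-wave-like) selection signal decides, at each training step, which of the two teacher branches supervises the student branch. The function names (pwm_select, asdt_distill_loss), the period/duty parameters, and the KL-based distillation loss are all illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn.functional as F


def pwm_select(step: int, period: int = 100, duty: float = 0.5) -> int:
    """PW-wave-like selection signal: returns 0 (classification teacher)
    for the first `duty` fraction of each period, 1 (mask-refinement
    teacher) for the remainder."""
    return 0 if (step % period) < duty * period else 1


def asdt_distill_loss(student_logits: torch.Tensor,
                      cls_teacher_logits: torch.Tensor,
                      ref_teacher_logits: torch.Tensor,
                      step: int,
                      period: int = 100,
                      duty: float = 0.5,
                      tau: float = 1.0) -> torch.Tensor:
    """Distill the teacher selected by the PWM signal into the student.

    All logits are (N, C, H, W) segmentation score maps over C classes;
    teachers are detached so gradients flow only into the student branch.
    """
    teacher = (cls_teacher_logits if pwm_select(step, period, duty) == 0
               else ref_teacher_logits)
    p_teacher = F.softmax(teacher.detach() / tau, dim=1)
    log_p_student = F.log_softmax(student_logits / tau, dim=1)
    # Pixel-wise KL divergence, scaled by tau^2 as in standard distillation.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * tau * tau

In a training loop, this loss would be evaluated once per optimizer step with the current global step counter and added to the task losses; because the selection signal alternates, neither teacher's imperfect pseudo-supervision dominates the student branch for long.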

Related research

03/15/2021 - Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation
  Knowledge distillation is a method of transferring the knowledge from a ...

08/30/2021 - Seminar Learning for Click-Level Weakly Supervised Semantic Segmentation
  Annotation burden has become one of the biggest barriers to semantic seg...

09/06/2023 - DMKD: Improving Feature-based Knowledge Distillation for Object Detection Via Dual Masking Augmentation
  Recent mainstream masked distillation methods function by reconstructing...

07/11/2023 - The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework
  In the context of label-efficient learning on video data, the distillati...

08/09/2023 - Branches Mutual Promotion for End-to-End Weakly Supervised Semantic Segmentation
  End-to-end weakly supervised semantic segmentation aims at optimizing a ...

09/07/2021 - Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
  Knowledge distillation (KD) is an effective framework that aims to trans...

12/18/2021 - Anomaly Discovery in Semantic Segmentation via Distillation Comparison Networks
  This paper aims to address the problem of anomaly discovery in semantic ...
