Contrastive Multi-Task Dense Prediction

07/16/2023
by Siwei Yang, et al.

This paper targets multi-task dense prediction, which aims to achieve simultaneous learning and inference on multiple dense prediction tasks within a single framework. A core design objective is how to effectively model cross-task interactions so that different tasks improve together by exploiting their inherent complementarity and consistency. Existing works typically design extra, expensive distillation modules to perform explicit interaction computations among different task-specific features during both training and inference, which makes them difficult to adapt to different task sets and reduces efficiency due to the clearly increased size of the multi-task models. In contrast, we introduce feature-wise contrastive consistency into the modeling of cross-task interactions for multi-task dense prediction. We propose a novel multi-task contrastive regularization method based on this consistency that effectively boosts the representation learning of the different sub-tasks, can be easily generalized to different multi-task dense prediction frameworks, and incurs no additional computation at inference. Extensive experiments on two challenging datasets (i.e., NYUD-v2 and Pascal-Context) clearly demonstrate the superiority of the proposed multi-task contrastive learning approach for dense prediction, establishing new state-of-the-art performance.
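The abstract describes the regularizer only at a high level. The minimal PyTorch-style sketch below illustrates one plausible way to impose feature-wise contrastive consistency between two task-specific feature maps using a symmetric InfoNCE loss over matched spatial locations; the function name, tensor shapes, sampling scheme, and hyperparameters are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only (assumed shapes and names), not the paper's exact method:
# an InfoNCE-style cross-task contrastive consistency loss in which features from
# two task heads at the same spatial location form a positive pair and features at
# other sampled locations act as negatives.
import torch
import torch.nn.functional as F


def cross_task_contrastive_loss(feat_a, feat_b, temperature=0.07, num_samples=256):
    """feat_a, feat_b: (B, C, H, W) task-specific feature maps from two task heads."""
    b, c, h, w = feat_a.shape

    # Flatten spatial dimensions and L2-normalize per-location embeddings.
    za = F.normalize(feat_a.flatten(2).transpose(1, 2).reshape(-1, c), dim=-1)  # (B*H*W, C)
    zb = F.normalize(feat_b.flatten(2).transpose(1, 2).reshape(-1, c), dim=-1)

    # Randomly subsample locations to keep the similarity matrix small.
    idx = torch.randperm(za.size(0), device=za.device)[:num_samples]
    za, zb = za[idx], zb[idx]

    # Cosine-similarity logits; diagonal entries are the cross-task positive pairs.
    logits = za @ zb.t() / temperature                  # (N, N)
    targets = torch.arange(za.size(0), device=za.device)

    # Symmetric InfoNCE: align task-A features to task-B features and vice versa.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```

In training, such a term would simply be added, with a weighting coefficient, to the sum of the per-task supervised losses; because it only regularizes intermediate features, it contributes nothing to the inference graph, which is consistent with the abstract's claim of no additional inference cost.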

