MoP-CLIP: A Mixture of Prompt-Tuned CLIP Models for Domain Incremental Learning

07/11/2023
by Julien Nicolas, et al.

Despite recent progress in incremental learning, addressing catastrophic forgetting under distributional drift remains an open and important problem. While state-of-the-art domain incremental learning (DIL) methods perform satisfactorily within known domains, their performance degrades substantially in the presence of novel domains. This limitation hampers their generalizability and restricts their scalability to more realistic settings where train and test data are drawn from different distributions. To address these limitations, we present a novel DIL approach based on a mixture of prompt-tuned CLIP models (MoP-CLIP), which generalizes the S-Prompting paradigm to handle both in-distribution and out-of-distribution data at inference. At training time, we model the feature distribution of every class in each domain, learning individual text and visual prompts to adapt to that domain. At inference, the learned distributions allow us to identify whether a given test sample belongs to a known domain, in which case the corresponding prompt is selected for classification, or to an unseen domain, in which case a mixture of the prompt-tuned CLIP models is leveraged. Our empirical evaluation reveals the poor performance of existing DIL methods under domain shift and shows that the proposed MoP-CLIP performs competitively in standard DIL settings while outperforming state-of-the-art methods in OOD scenarios. These results demonstrate the superiority of MoP-CLIP, offering a robust and general solution to domain incremental learning.
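The abstract describes the inference-time routing only at a high level. Below is a minimal sketch of how such routing could look, assuming per-domain, per-class image features are modeled as Gaussians and domain membership is scored by Mahalanobis distance to the nearest class centroid. The names DomainExpert, mop_clip_predict, and ood_threshold are illustrative, not taken from the paper, and the classify function stands in for a full prompt-tuned CLIP forward pass.

import numpy as np

class DomainExpert:
    """One prompt-tuned CLIP model plus its per-class feature statistics.

    Assumption (not from the paper): class features follow Gaussians with a
    shared covariance, so domain membership reduces to a Mahalanobis test.
    """

    def __init__(self, class_means, shared_cov, classify_fn):
        self.class_means = class_means            # {class_id: mean feature vector}
        self.cov_inv = np.linalg.inv(shared_cov)  # invert the shared covariance once
        self.classify = classify_fn               # feature -> class logits for this domain

    def min_mahalanobis(self, feat):
        """Distance from feat to this domain's nearest class centroid."""
        dists = []
        for mu in self.class_means.values():
            d = feat - mu
            dists.append(float(d @ self.cov_inv @ d))
        return min(dists)

def mop_clip_predict(feat, experts, ood_threshold):
    """Route a test feature to one expert (in-distribution) or mix all experts (OOD)."""
    scores = np.array([e.min_mahalanobis(feat) for e in experts])
    best = int(np.argmin(scores))
    if scores[best] <= ood_threshold:
        # Known domain: use the matching prompt-tuned model alone.
        return experts[best].classify(feat)
    # Unseen domain: weight each expert by a softmax over negative distances.
    w = np.exp(-scores)
    w /= w.sum()
    return sum(wi * e.classify(feat) for wi, e in zip(w, experts))

Scoring against the nearest class centroid rather than a single domain centroid keeps the in-distribution test class-aware, which is consistent with the abstract's per-class, per-domain distribution modeling; the threshold and softmax temperature would in practice be tuned on held-out data.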


Related research

05/06/2022
Forget Less, Count Better: A Domain-Incremental Self-Distillation Learning Benchmark for Lifelong Crowd Counting
Crowd counting has important applications in public safety and pandemic ...

04/07/2022
Incremental Prototype Prompt-tuning with Pre-trained Representation for Class Incremental Learning
Class incremental learning has attracted much attention, but most existing ...

07/23/2021
VisDA-2021 Competition: Universal Domain Adaptation to Improve Performance on Out-of-Distribution Data
Progress in machine learning is typically measured by training and testing ...

06/17/2022
TKIL: Tangent Kernel Approach for Class Balanced Incremental Learning
When learning new tasks in a sequential manner, deep neural networks tend ...

03/24/2023
Principles of Forgetting in Domain-Incremental Semantic Segmentation in Adverse Weather Conditions
Deep neural networks for scene perception in automated vehicles achieve ...

02/01/2022
Finding lost DG: Explaining domain generalization via model complexity
The domain generalization (DG) problem setting challenges a model trained ...

05/24/2023
MRN: Multiplexed Routing Network for Incremental Multilingual Text Recognition
Traditional Multilingual Text Recognition (MLTR) usually targets a fixed ...
