Language-Aware Soft Prompting for Vision Language Foundation Models

10/03/2022
by Adrian Bulat, et al.

This paper is on soft prompt learning for Vision & Language (V&L) models. Similarly to their NLP counterparts, V&L models can be adapted to a downstream task by learning soft continuous prompts from a few training examples. Current methods learn the soft prompts by minimizing a cross-entropy loss, using as class weights the features obtained by passing the prompts plus the class names through the text encoder. Such methods, however, significantly overfit the training data, suffering large accuracy degradation when tested on unseen classes from the same domain. Our main contribution, in this paper, is a surprisingly simple approach to alleviate this problem: we use a second cross-entropy loss to minimize the distance between the learned soft prompts and a set of hand-engineered prompts (obtained through prompt engineering). The proposed loss can be interpreted in multiple ways: as a regularizer, as a means of language-based augmentation, and as a way of learning more discriminative class centroids. Importantly, our formulation is inherently amenable to including, during training, virtual classes, i.e., class names for which no visual samples are available, further increasing the robustness of the learned prompts. Through extensive evaluations on 11 datasets, we show that our approach (a) significantly outperforms all prior works on soft prompting, and (b) matches and surpasses, for the first time, the accuracy on novel classes obtained by hand-crafted prompts and CLIP for the majority of the test datasets. Code will be made available.
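The authors' code was not available at the time of writing; the PyTorch sketch below illustrates one plausible reading of the two-loss objective described in the abstract. Everything here is an assumption for illustration: the function name lasp_losses, the tensor arguments, the temperature tau, and the choice to realize the prompt-alignment term as a text-to-text cross-entropy are not taken from the authors' implementation.

    import torch
    import torch.nn.functional as F

    def lasp_losses(image_feats, labels, learned_txt_feats, manual_txt_feats, tau=0.07):
        # image_feats:       (B, D) L2-normalized image embeddings for the batch
        # labels:            (B,)   ground-truth class indices
        # learned_txt_feats: (C, D) text-encoder features of [soft prompts + class name];
        #                    C may include "virtual" classes that have no images
        # manual_txt_feats:  (C, D) text-encoder features of hand-engineered prompts,
        #                    e.g. "a photo of a {class}" (hypothetical template)

        # Standard vision-to-text cross-entropy: the learned-prompt text features
        # act as the classifier weights, as in CoOp-style soft prompting.
        logits_v = image_feats @ learned_txt_feats.t() / tau
        loss_v = F.cross_entropy(logits_v, labels)

        # Second, language-aware cross-entropy: each hand-engineered prompt feature
        # should be classified as its own class when scored against the learned-prompt
        # features, pulling the soft prompts toward the manual ones (a regularizer).
        logits_t = manual_txt_feats @ learned_txt_feats.t() / tau
        targets_t = torch.arange(manual_txt_feats.size(0), device=manual_txt_feats.device)
        loss_t = F.cross_entropy(logits_t, targets_t)

        return loss_v + loss_t

Because the second term operates purely on text features, learned_txt_feats and manual_txt_feats can include rows for virtual classes with no visual samples: those rows contribute to loss_t even though they never appear in labels for loss_v.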
