ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning

11/08/2022
by Jinta Weng, et al.

Prompt learning has recently become an effective linguistic tool for eliciting the knowledge of pretrained language models (PLMs) in few-shot tasks. However, studies have shown that prompt learning still lacks robustness, since a suitable initialization of the continuous prompt and expert-written manual prompts are essential to the fine-tuning process. Moreover, humans also draw on their comparative abilities, using existing knowledge to distinguish between different examples. Motivated by this, we explore how to use contrastive samples to strengthen prompt learning. In detail, we first propose ConsPrompt, which combines a prompt encoding network, a contrastive sampling module, and a contrastive scoring module. Subsequently, two sampling strategies, similarity-based and label-based, are introduced to realize differential contrastive learning. The effectiveness of the proposed ConsPrompt is demonstrated on five few-shot learning tasks, showing that the similarity-based sampling strategy is more effective than the label-based one when combined with contrastive learning. Our results also exhibit state-of-the-art performance and robustness across different few-shot settings, suggesting that ConsPrompt can serve as a better knowledge probe for motivating PLMs.
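
For illustration, below is a minimal sketch of the two contrastive sampling strategies named in the abstract (similarity-based and label-based). The function names, the use of cosine similarity over precomputed sentence embeddings, and the NumPy-based setup are assumptions for exposition, not the authors' released implementation.

    # Minimal sketch of similarity-based vs. label-based contrastive sampling.
    # Names and the cosine-similarity choice are illustrative assumptions,
    # not the ConsPrompt authors' code.
    import numpy as np

    def select_similarity_based(anchor_emb, pool_embs, k):
        """Pick the k pool examples whose embeddings are closest (cosine) to the anchor."""
        anchor = anchor_emb / np.linalg.norm(anchor_emb)
        pool = pool_embs / np.linalg.norm(pool_embs, axis=1, keepdims=True)
        sims = pool @ anchor                 # cosine similarity to the anchor
        return np.argsort(-sims)[:k]         # indices of the k most similar examples

    def select_label_based(anchor_label, pool_labels, k, rng=None):
        """Pick k pool examples at random from those sharing the anchor's label."""
        rng = rng or np.random.default_rng()
        same = np.flatnonzero(np.asarray(pool_labels) == anchor_label)
        return rng.choice(same, size=min(k, len(same)), replace=False)

    # Toy usage with random embeddings and binary labels:
    rng = np.random.default_rng(0)
    pool_embs = rng.normal(size=(8, 4))
    pool_labels = [0, 1, 0, 1, 0, 1, 0, 1]
    anchor_emb, anchor_label = rng.normal(size=4), 0
    print(select_similarity_based(anchor_emb, pool_embs, k=2))
    print(select_label_based(anchor_label, pool_labels, k=2, rng=rng))
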

Related research

Hard Negative Sampling Strategies for Contrastive Representation Learning (06/02/2022)
One of the challenges in contrastive learning is the selection of approp...

Research on the application of contrastive learning in multi-label text classification (12/01/2022)
The effective application of contrastive learning technology in natural ...

Multi-Similarity Contrastive Learning (07/06/2023)
Given a similarity metric, contrastive methods learn a representation in...

Contrastive Fine-tuning Improves Robustness for Neural Rankers (05/27/2021)
The performance of state-of-the-art neural rankers can deteriorate subst...

Contrastive Learning for Prompt-Based Few-Shot Language Learners (05/03/2022)
The impressive performance of GPT-3 using natural language prompts and i...

STPrompt: Semantic-guided and Task-driven prompts for Effective Few-shot Classification (10/29/2022)
The effectiveness of prompt learning has been demonstrated in different ...

Function Contrastive Learning of Transferable Representations (10/14/2020)
Few-shot-learning seeks to find models that are capable of fast-adaptati...
