Inconsistent Few-Shot Relation Classification via Cross-Attentional Prototype Networks with Contrastive Learning

10/13/2021
by Hongru Wang, et al.

Standard few-shot relation classification (RC) aims to learn a robust classifier from only a few labeled examples per class. However, previous work rarely investigates the effect of varying the number of classes (i.e., N-way) and the number of labeled examples per class (i.e., K-shot) between training and testing. In this work, we define a new task, inconsistent few-shot RC, in which the model must handle inconsistency in N and K between training and testing. To address this task, we propose Prototype Network-based cross-attention contrastive learning (ProtoCACL), which captures the rich mutual interactions between the support set and the query set. Experimental results demonstrate that ProtoCACL outperforms the state-of-the-art baseline under both the inconsistent-K and inconsistent-N settings, owing to its more robust and discriminative representations. Moreover, we find that in the inconsistent few-shot setting, models can achieve better performance with less data than in the standard few-shot setting when N and K are carefully selected. At the end of the paper, we provide further analyses and suggestions to systematically guide the selection of N and K under different scenarios.
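To make the episodic setup concrete, below is a minimal sketch of the two ingredients the abstract names: query-conditioned prototypes built via cross-attention between the support and query sets, and a supervised contrastive loss that encourages discriminative representations. The abstract does not specify ProtoCACL's architecture, so the attention scheme, the Euclidean distance metric, the temperature, and all shapes here are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of cross-attended prototypes + contrastive loss.
# Not the paper's ProtoCACL implementation; shapes and losses are assumed.
import torch
import torch.nn.functional as F

def cross_attended_prototypes(support, query):
    """support: (N, K, D) embeddings, N classes x K shots; query: (Q, D).
    Each query re-weights a class's K support instances by attention,
    yielding query-conditioned prototypes (Q, N, D) rather than a fixed
    per-class mean as in a vanilla prototypical network."""
    N, K, D = support.shape
    scores = torch.einsum('qd,nkd->qnk', query, support) / D ** 0.5
    weights = F.softmax(scores, dim=-1)                      # (Q, N, K)
    return torch.einsum('qnk,nkd->qnd', weights, support)

def classify(support, query):
    """Nearest-prototype classification via squared Euclidean distance."""
    protos = cross_attended_prototypes(support, query)       # (Q, N, D)
    dists = ((query.unsqueeze(1) - protos) ** 2).sum(-1)     # (Q, N)
    return F.softmax(-dists, dim=-1)                         # class probs

def supervised_contrastive_loss(emb, labels, tau=0.1):
    """Pulls same-class embeddings together and pushes different-class
    ones apart, one common way to obtain discriminative features."""
    emb = F.normalize(emb, dim=-1)
    sim = emb @ emb.t() / tau                                # (B, B)
    self_mask = torch.eye(len(emb), dtype=torch.bool)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - sim.logsumexp(dim=-1, keepdim=True)
    per_anchor = -log_prob.masked_fill(~pos_mask, 0.0).sum(-1)
    return (per_anchor / pos_mask.sum(-1).clamp(min=1)).mean()

# Toy episode: note that N and K at test time need not match training,
# which is exactly the inconsistency the paper studies.
N, K, Q, D = 5, 3, 4, 64
support, query = torch.randn(N, K, D), torch.randn(Q, D)
print(classify(support, query).shape)                        # torch.Size([4, 5])
labels = torch.arange(N).repeat_interleave(K)
print(supervised_contrastive_loss(support.reshape(N * K, D), labels))
```

Because the prototypes are recomputed per query, the classifier does not bake a fixed N or K into its parameters, which is one plausible reason such a design can tolerate train/test inconsistency.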
