Prompt-Learning for Short Text Classification
In short texts, the extremely short length, feature sparsity, and high ambiguity pose significant challenges for classification tasks. Recently, prompt-learning has attracted considerable attention and research as an effective method for tuning pre-trained language models on specific downstream tasks. The main intuition behind prompt-learning is to insert a template into the input and convert the text classification task into an equivalent cloze-style task. However, most prompt-learning methods expand label words manually or consider only the class name when incorporating knowledge into the cloze-style prediction, which inevitably introduces omissions and bias in classification. In this paper, we propose a simple short text classification approach that uses prompt-learning based on knowledgeable expansion, considering both the short text itself and the class name when expanding the label-word space. Specifically, the top N concepts related to the entities in a short text are retrieved from an open knowledge graph such as Probase, and the expanded label words are further refined by computing the distance between the selected concepts and the class label. Experimental results show that our approach obtains clear improvements over fine-tuning, prompt-learning, and knowledgeable prompt-tuning methods, outperforming the state of the art by up to 6 accuracy points on three well-known datasets.
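The refinement step described above can be sketched as follows. This is a minimal illustration only: the paper does not specify its distance metric or retrieval API, so `embed` (a toy embedding table) and the cosine-similarity filter are assumptions standing in for the actual concept embeddings and Probase lookup.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def expand_label_words(concepts, class_label_vec, embed, top_n=2, threshold=0.3):
    """Keep the top-N retrieved concepts whose embeddings lie close
    to the class-label embedding; drop those below the threshold."""
    scored = [(c, cosine_sim(embed[c], class_label_vec)) for c in concepts]
    scored.sort(key=lambda x: x[1], reverse=True)
    return [c for c, s in scored[:top_n] if s >= threshold]

# Toy 2-d embeddings (hypothetical, for illustration only)
embed = {
    "company": np.array([1.0, 0.1]),
    "fruit":   np.array([0.0, 1.0]),
    "firm":    np.array([0.9, 0.2]),
}
label_vec = np.array([1.0, 0.0])  # embedding of a class such as "business"

# Concepts retrieved for an entity (e.g. "Apple") are filtered by
# their distance to the class label before use as label words.
print(expand_label_words(["company", "fruit", "firm"], label_vec, embed))
# → ['company', 'firm']
```

In this toy setup, "fruit" is discarded because it is far from the class-label embedding, mirroring how distance-based refinement removes concepts irrelevant to the target class.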