Label Agnostic Pre-training for Zero-shot Text Classification

05/25/2023
by Christopher Clarke, et al.

Conventional approaches to text classification typically assume the existence of a fixed set of predefined labels into which a given text can be classified. However, in real-world applications, there exists an effectively infinite label space for describing a given text. In addition, depending on the aspect (sentiment, topic, etc.) and domain of the text (finance, legal, etc.), the interpretation of a label can vary greatly. This makes the task of text classification, particularly in the zero-shot scenario, extremely challenging. In this paper, we investigate the task of zero-shot text classification with the aim of improving the ability of pre-trained language models (PLMs) to generalize to both seen and unseen data across varying aspects and domains. To address this, we introduce two new, simple yet effective pre-training strategies: Implicit and Explicit pre-training. These methods inject aspect-level understanding into the model at training time with the goal of conditioning the model to build task-level understanding. To evaluate this, we construct and release the Universal Text Classification Dataset (UTCD), a new benchmark for evaluating text classification in zero-shot settings. Experimental results on UTCD show that our approach achieves improved zero-shot generalization on a suite of challenging datasets across an array of zero-shot formalizations.
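For readers unfamiliar with the zero-shot setup the abstract refers to, the sketch below shows one common formalization: candidate labels are supplied only at inference time, and an NLI-trained PLM scores each label against the input text. This is a generic illustration using the Hugging Face transformers zero-shot pipeline with an assumed model name and example label sets; it is not the Implicit/Explicit pre-training proposed in the paper, but it highlights how the same text can be classified against different label spaces depending on the aspect (topic vs. sentiment).

```python
# Minimal sketch of zero-shot text classification via an NLI-based pipeline.
# Model name, input text, and label sets are illustrative assumptions,
# not the paper's method or data.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

text = "The quarterly earnings beat analyst expectations."

# The same text maps to different label spaces depending on the aspect of interest.
topic_labels = ["finance", "sports", "politics", "technology"]
sentiment_labels = ["positive", "negative", "neutral"]

print(classifier(text, candidate_labels=topic_labels))      # topic aspect
print(classifier(text, candidate_labels=sentiment_labels))  # sentiment aspect
```

Each call returns the candidate labels ranked by entailment-derived scores, illustrating why label interpretation, and hence model behavior, shifts with the chosen aspect and domain.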

