Exploring Zero and Few-shot Techniques for Intent Classification

05/11/2023
by Soham Parikh, et al.

Conversational NLU providers often need to scale to thousands of intent-classification models, and new customers frequently face the cold-start problem. Scaling to so many customers also constrains storage space. In this paper, we explore four different zero- and few-shot intent classification approaches under this low-resource constraint: 1) domain adaptation, 2) data augmentation, 3) zero-shot intent classification using descriptions with large language models (LLMs), and 4) parameter-efficient fine-tuning of instruction-finetuned language models. Our results show that all these approaches are effective to different degrees in low-resource settings. Parameter-efficient fine-tuning using the T-Few recipe (Liu et al., 2022) on Flan-T5 (Chung et al., 2022) yields the best performance even with just one sample per intent. We also show that the zero-shot method of prompting LLMs using intent descriptions…


