Few-Shot Dialogue Summarization via Skeleton-Assisted Prompt Transfer

05/20/2023
by Kaige Xie, et al.

In real-world scenarios, labeled samples for dialogue summarization are usually limited (i.e., few-shot) because annotating high-quality dialogue summaries is costly. To learn efficiently from few-shot samples, previous works have leveraged massive annotated data from other downstream tasks and then performed prompt transfer in prompt tuning to enable cross-task knowledge transfer. However, existing general-purpose prompt transfer techniques do not account for dialogue-specific information. In this paper, we focus on improving the prompt transfer from dialogue state tracking to dialogue summarization and propose Skeleton-Assisted Prompt Transfer (SAPT), which leverages skeleton generation as extra supervision: the skeletons serve as a medium connecting the distinct source and target tasks and help the model make better use of dialogue state information. To automatically extract dialogue skeletons as supervised training data for skeleton generation, we design a novel approach based on perturbation-based probes that requires neither annotation effort nor domain knowledge. Training the model on such skeletons also helps preserve model capability during prompt transfer. Our method significantly outperforms existing baselines. In-depth analyses demonstrate the effectiveness of our method in facilitating cross-task knowledge transfer in few-shot dialogue summarization.
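The abstract does not spell out the probe design, but the general idea of perturbation-based probing can be illustrated with a minimal sketch: delete each dialogue turn, measure how much a frozen probe model's score changes, and keep the most influential turns as the skeleton. Everything below (the score_fn interface, the keep_ratio parameter, and the toy keyword scorer) is a hypothetical illustration under these assumptions, not the authors' implementation.

```python
# Illustrative sketch of perturbation-based skeleton extraction.
# score_fn stands in for a frozen probe model (e.g., the likelihood it
# assigns to a reference dialogue state or summary); here it is abstract.

from typing import Callable, List


def extract_skeleton(
    turns: List[str],
    score_fn: Callable[[List[str]], float],
    keep_ratio: float = 0.3,
) -> List[str]:
    """Keep the turns whose removal changes the probe score the most."""
    base = score_fn(turns)

    # Importance of turn i = |score(dialogue) - score(dialogue without turn i)|
    importance = []
    for i in range(len(turns)):
        perturbed = turns[:i] + turns[i + 1:]
        importance.append(abs(base - score_fn(perturbed)))

    k = max(1, int(len(turns) * keep_ratio))
    top = sorted(range(len(turns)), key=lambda i: importance[i], reverse=True)[:k]

    # Preserve the original turn order in the extracted skeleton.
    return [turns[i] for i in sorted(top)]


if __name__ == "__main__":
    # Toy scorer as a stand-in for a frozen model: counts slot-like keywords.
    keywords = {"book", "hotel", "friday", "two", "nights"}
    toy_score = lambda ts: float(
        sum(w.lower().strip(".,") in keywords for t in ts for w in t.split())
    )

    dialogue = [
        "User: I need to book a hotel on the east side.",
        "Agent: Sure, how long will you stay?",
        "User: Two nights starting Friday.",
        "Agent: Anything else I can help with?",
    ]
    print(extract_skeleton(dialogue, toy_score, keep_ratio=0.5))
```

With the toy scorer, the two user turns that carry the booking information are retained, which matches the intuition that the skeleton should capture state-bearing turns without any manual annotation.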

Related research

03/03/2022 · Dialogue Summaries as Dialogue States (DS2), Template-Guided Summarization for Few-shot Dialogue State Tracking
Annotating task-oriented dialogues is notorious for the expensive and di...

10/17/2022 · Leveraging Non-dialogue Summaries for Dialogue Summarization
To mitigate the lack of diverse dialogue summarization datasets in acade...

01/15/2022 · Prompt Learning for Few-Shot Dialogue State Tracking
Collecting dialogue state labels, slots and values, for learning dialogu...

08/16/2019 · Few-Shot Dialogue Generation Without Annotated Data: A Transfer Learning Approach
Learning with minimal data is one of the key challenges in the developme...

02/18/2023 · Zero and Few-Shot Localization of Task-Oriented Dialogue Agents with a Distilled Representation
Task-oriented Dialogue (ToD) agents are mostly limited to a few widely-s...

12/16/2021 · CONFIT: Toward Faithful Dialogue Summarization with Linguistically-Informed Contrastive Fine-tuning
Factual inconsistencies in generated summaries severely limit the practi...

10/27/2022 · He Said, She Said: Style Transfer for Shifting the Perspective of Dialogues
In this work, we define a new style transfer task: perspective shift, wh...