Exploring Effective Information Utilization in Multi-Turn Topic-Driven Conversations

09/01/2022
by   Jiatong Li, et al.

Conversations always revolve around certain topics. However, current dialogue generation models struggle to fuse dialogue history and topic information from multiple sources simultaneously because of the input length limit of pre-trained language models (PLMs). To expand the information that PLMs can utilize, we encode topic and dialogue history information with dedicated prompts in multiple channels of Fusion-in-Decoder (FiD) and explore the influence of three different channel settings. Our experiments focus on a Chinese dataset named NaturalConv, where each conversation revolves around a piece of recent news. We thoroughly compare different dialogue models and FiD channel settings. Empirical results show that combining our proposed whole-passage channel with an additional history channel achieves competitive performance on NaturalConv, making it possible to encode diverse information from excessively long texts.
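The core idea of the multi-channel FiD setup described above is that each source of context is prompted and encoded independently, so the total context can exceed a single encoder's input limit. A minimal sketch of how such channels might be constructed is below; the channel names, prompt wording, and length budget are illustrative assumptions, not the paper's exact configuration.

```python
def build_fid_channels(topic_passage, history_turns, max_chars=200):
    """Split long context into separately encoded FiD channels.

    Each channel is a (prompt + content) string whose content is kept
    under a per-channel length budget, so the combined context can
    exceed what one encoder pass could hold.
    """
    channels = []
    # Whole-passage channel: the news article driving the conversation,
    # chunked so each piece fits one encoder pass.
    for i in range(0, len(topic_passage), max_chars):
        chunk = topic_passage[i:i + max_chars]
        channels.append(f"topic: {chunk}")
    # History channel: dialogue turns, most recent first,
    # truncated to the same budget.
    history = " ".join(reversed(history_turns))[:max_chars]
    channels.append(f"history: {history}")
    return channels
```

In a full FiD model, each channel string would be encoded separately and the encoder outputs concatenated before the decoder attends over all of them jointly.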

