Beyond Goldfish Memory: Long-Term Open-Domain Conversation

by Jing Xu, et al.

Despite recent improvements in open-domain dialogue models, state-of-the-art models are trained and evaluated on short conversations with little context. In contrast, the long-term conversation setting has hardly been studied. In this work we collect and release a human-human dataset consisting of multiple chat sessions in which the speaking partners learn about each other's interests and discuss what they have learnt from past sessions. We show that models trained on existing datasets perform poorly in this long-term conversation setting, in both automatic and human evaluations, and we study long-context models that perform much better. In particular, we find that retrieval-augmented methods, and methods able to summarize and recall previous conversations, outperform the standard encoder-decoder architectures currently considered state of the art.
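The "summarize and recall" idea described above can be sketched in a few lines. This is a minimal illustrative toy, not the paper's implementation: the summarizer is a trivial stand-in for an abstractive model, and the retriever scores stored memories by word overlap with the current utterance; all names and the session data are assumptions made up for the example.

```python
def summarize(session):
    """Toy stand-in for an abstractive summarizer: keep only the
    first sentence of each turn as a persistent 'memory' line."""
    return [turn.split(".")[0].strip() for turn in session]

def recall(memories, query, k=2):
    """Retrieve the k memory lines sharing the most words with the
    query (a crude stand-in for a learned retriever)."""
    def overlap(memory_line):
        return len(set(memory_line.lower().split()) &
                   set(query.lower().split()))
    return sorted(memories, key=overlap, reverse=True)[:k]

# Memories accumulated over past chat sessions.
past_sessions = [
    ["I adopted a dog named Rex. He loves the park."],
    ["My favorite hobby is baking bread. I bake every weekend."],
]
memory = [line for session in past_sessions for line in summarize(session)]

# At the start of a new session, recalled memories are prepended to the
# dialogue context so the model can refer back to earlier sessions.
context = recall(memory, "Tell me about your dog Rex", k=1)
print(context)  # → ['I adopted a dog named Rex']
```

A real system would replace both toy functions with trained models, but the control flow (summarize each finished session, retrieve relevant summaries into the context of the next one) is the same shape.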
