Attentive Memory Networks: Efficient Machine Reading for Conversational Search

12/19/2017
by Tom Kenter, et al.

Recent advances in conversational systems have changed the search paradigm. Traditionally, a user poses a query to a search engine that returns an answer based on its index, possibly leveraging external knowledge bases and conditioning the response on earlier interactions in the search session. In a natural conversation, there is an additional source of information to take into account: utterances produced earlier in a conversation can also be referred to, and a conversational IR system has to keep track of information conveyed by the user during the conversation, even if it is implicit. We argue that the process of building a representation of the conversation can be framed as a machine reading task, where an automated system is presented with a number of statements about which it should answer questions. The questions should be answered solely by referring to the statements provided, without consulting external knowledge. The time is right for the information retrieval community to embrace this task, both as a stand-alone task and integrated in a broader conversational search setting. In this paper, we focus on machine reading as a stand-alone task and present the Attentive Memory Network (AMN), an end-to-end trainable machine reading algorithm. Its key contribution is efficiency, achieved by a hierarchical input encoder that iterates over the input only once. Speed is an important requirement in the setting of conversational search, as gaps between conversational turns have a detrimental effect on naturalness. On 20 datasets commonly used for evaluating machine reading algorithms, we show that the AMN achieves performance comparable to state-of-the-art models while using considerably fewer computations.
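The hierarchical encoding and single attention pass described above can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: the AMN uses recurrent (GRU-based) encoders at the word and sentence level, whereas the sketch below substitutes mean-pooled word embeddings for the sentence memories; all names and the toy vocabulary are hypothetical.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_read(story, query_vec, embed):
    # Word level -> sentence level: pool token embeddings into one memory
    # vector per sentence (the paper uses GRU encoders; mean-pooling here
    # is a simplification for illustration).
    memories = np.stack([
        np.mean([embed[w] for w in sent], axis=0) for sent in story
    ])
    # A single attention pass over the sentence memories, conditioned on
    # the question: the input is iterated over only once.
    weights = softmax(memories @ query_vec)
    return weights @ memories  # attended summary of the conversation

# Toy vocabulary with random (hypothetical) embeddings.
rng = np.random.default_rng(0)
vocab = ["mary", "went", "kitchen", "john", "garden", "where", "is"]
embed = {w: rng.normal(size=8) for w in vocab}

story = [["mary", "went", "kitchen"], ["john", "went", "garden"]]
query = np.mean([embed[w] for w in ["where", "is", "mary"]], axis=0)
summary = hierarchical_read(story, query, embed)
print(summary.shape)
```

In a trained model, the attended summary would feed an answer decoder; the key point the sketch preserves is that memories are built in one pass over the input rather than by repeated hops over the full text.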

