Toward Connecting Speech Acts and Search Actions in Conversational Search Tasks

05/08/2023
by Souvick Ghosh, et al.

Conversational search systems can improve the user experience in digital libraries by providing a natural and intuitive way to interact with library content. However, most conversational systems are limited to performing simple tasks and controlling smart devices. There is therefore a need for systems that can accurately understand the user's information requirements and perform the appropriate search activity. Prior research on intelligent systems suggests that it is possible to comprehend the functional aspect of discourse (search intent) by identifying the speech acts in user dialogues. In this work, we automatically identify the speech acts associated with spoken utterances and use them to predict system-level search actions. First, we conducted a Wizard-of-Oz study to collect data from 75 search sessions. We performed thematic analysis to curate a gold standard dataset – containing 1,834 utterances and 509 system actions – of human-system interactions in three information-seeking scenarios. Next, we developed attention-based deep neural networks to understand natural language and predict speech acts. The predicted speech acts were then fed to a second model to predict the corresponding system-level search actions. We also annotated a second dataset to validate our results. Across the two datasets, the best-performing classification model achieved a maximum accuracy of 90.2% for search act classification.
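The abstract describes a two-stage pipeline: utterances are first mapped to speech acts, and the speech acts are then mapped to system-level search actions. Below is a minimal, purely illustrative sketch of that two-stage structure. The speech-act labels, cue words, and action names are hypothetical assumptions for demonstration – the paper itself uses attention-based deep neural networks, not keyword rules.

```python
# Illustrative two-stage pipeline: utterance -> speech act -> search action.
# All labels and cue lists below are assumptions, not the paper's taxonomy.

SPEECH_ACT_CUES = {
    "question": ("what", "how", "where", "which", "?"),
    "request": ("find", "search", "show", "look up"),
    "feedback": ("thanks", "good", "not quite", "wrong"),
}

# Stage 2: map each speech act to a system-level search action.
ACT_TO_ACTION = {
    "question": "EXECUTE_SEARCH",
    "request": "EXECUTE_SEARCH",
    "feedback": "REFINE_RESULTS",
    "statement": "ASK_CLARIFICATION",
}

def classify_speech_act(utterance: str) -> str:
    """Stage 1: assign a speech act to an utterance (toy rule-based stand-in)."""
    text = utterance.lower()
    for act, cues in SPEECH_ACT_CUES.items():
        if any(cue in text for cue in cues):
            return act
    return "statement"

def predict_search_action(utterance: str) -> str:
    """Full pipeline: the speech act drives the system-level action."""
    return ACT_TO_ACTION[classify_speech_act(utterance)]

print(predict_search_action("Where can I find books on digital libraries?"))
# → EXECUTE_SEARCH
```

In the paper, each stage is a learned classifier rather than a lookup, but the information flow – the speech act as an intermediate representation between the utterance and the search action – is the same.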


research
10/29/2019

Towards a Model for Spoken Conversational Search

Conversation is the natural mode for information exchange in daily life,...
research
06/12/2023

Language of Bargaining

Leveraging an established exercise in negotiation education, we build a ...
research
04/23/2018

Analyzing and Characterizing User Intent in Information-seeking Conversations

Understanding and characterizing how people interact in information-seek...
research
02/19/2023

A Planning-Based Explainable Collaborative Dialogue System

Eva is a multimodal conversational system that helps users to accomplish...
research
12/09/2021

"What can I cook with these ingredients?" – Understanding cooking-related information needs in conversational search

As conversational search becomes more pervasive, it becomes increasingly...
research
05/02/2022

Supporting Complex Information-Seeking Tasks with Implicit Constraints

Current interactive systems with natural language interface lack an abil...
research
09/12/2018

Deep Learning Based Multi-modal Addressee Recognition in Visual Scenes with Utterances

With the widespread use of intelligent systems, such as smart speakers, ...
