Snowy: Recommending Utterances for Conversational Visual Analysis

10/08/2021
by Arjun Srinivasan, et al.

Natural language interfaces (NLIs) have become a prevalent medium for conducting visual data analysis, enabling people with varying levels of analytic experience to ask questions of and interact with their data. While these systems have seen notable improvements in language understanding, fundamental user experience and interaction challenges persist, including the lack of analytic guidance (i.e., knowing what aspects of the data to consider) and the limited discoverability of natural language input (i.e., knowing how to phrase input utterances). To address these challenges, we investigate utterance recommendations that contextually provide analytic guidance by suggesting data features (e.g., attributes, values, trends) while implicitly making users aware of the types of phrasings that an NLI supports. We present SNOWY, a prototype system that generates and recommends utterances for visual analysis based on a combination of data interestingness metrics and language pragmatics. Through a preliminary user study, we found that utterance recommendations in SNOWY support conversational visual analysis by guiding participants' analytic workflows and making them aware of the system's language interpretation capabilities. Based on feedback and observations from the study, we discuss implications and considerations for incorporating recommendations in future NLIs for visual analysis.
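
The abstract describes SNOWY as combining data interestingness metrics with language pragmatics to generate utterance recommendations, but it does not specify how the two signals are blended. The sketch below illustrates one plausible way to rank candidate utterances under that framing; the `Candidate` structure, the attribute-overlap proxy for pragmatic relevance, and the `alpha` weighting are illustrative assumptions, not SNOWY's actual implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: rank candidate utterances by blending a
# data "interestingness" score with a pragmatic-relevance score derived
# from the ongoing conversation. All names and metrics here are assumed,
# not taken from the SNOWY paper.

@dataclass
class Candidate:
    utterance: str           # e.g., "Show the trend of sales over time"
    attributes: set          # data attributes the utterance refers to
    interestingness: float   # precomputed score (e.g., variance, correlation strength)

def pragmatic_relevance(candidate: Candidate, context_attributes: set) -> float:
    """Crude pragmatics proxy: overlap between the candidate's attributes
    and attributes already mentioned in the conversation."""
    if not candidate.attributes:
        return 0.0
    return len(candidate.attributes & context_attributes) / len(candidate.attributes)

def rank_candidates(candidates, context_attributes, alpha=0.6, top_k=3):
    """Blend interestingness and pragmatic relevance, return the top-k utterances."""
    scored = [
        (alpha * c.interestingness
         + (1 - alpha) * pragmatic_relevance(c, context_attributes), c)
        for c in candidates
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c.utterance for _, c in scored[:top_k]]

# Toy usage: the conversation so far has mentioned the "sales" attribute.
candidates = [
    Candidate("Show the trend of sales over time", {"sales", "date"}, 0.8),
    Candidate("Compare profit across regions", {"profit", "region"}, 0.6),
    Candidate("Filter to the top 10 products by sales", {"product", "sales"}, 0.5),
]
print(rank_candidates(candidates, context_attributes={"sales"}))
```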

