
Exploring Interactions Between Trust, Anthropomorphism, and Relationship Development in Voice Assistants

08/04/2021
by William Seymour, et al.

Modern conversational agents such as Alexa and Google Assistant represent significant progress in speech recognition, natural language processing, and speech synthesis. But as these agents have grown more realistic, concerns have been raised over how their social nature might unconsciously shape our interactions with them. Through a survey of 500 voice assistant users, we explore whether users' relationships with their voice assistants can be quantified using the same metrics as social, interpersonal relationships, and whether this correlates with how much they trust their devices and the extent to which they anthropomorphise them. Using Knapp's staircase model of human relationships, we find not only that human-device interactions can be modelled in this way, but also that relationship development with voice assistants correlates with increased trust and anthropomorphism.
