Exploring Interactions Between Trust, Anthropomorphism, and Relationship Development in Voice Assistants

by William Seymour et al.

Modern conversational agents such as Alexa and Google Assistant represent significant progress in speech recognition, natural language processing, and speech synthesis. But as these agents have grown more realistic, concerns have been raised over how their social nature might unconsciously shape our interactions with them. Through a survey of 500 voice assistant users, we explore whether users' relationships with their voice assistants can be quantified using the same metrics as social, interpersonal relationships, and whether this correlates with how much users trust their devices and the extent to which they anthropomorphise them. Using Knapp's staircase model of human relationships, we find not only that human-device interactions can be modelled in this way, but also that relationship development with voice assistants correlates with increased trust and anthropomorphism.
