Toward quantifying trust dynamics: How people adjust their trust after moment-to-moment interaction with automation

07/15/2021
by   X. Jessie Yang, et al.

Objective: We examine how human operators adjust their trust in automation as a result of their moment-to-moment interaction with automation.

Background: Most existing studies measured trust by administering questionnaires at the end of an experiment. Only a limited number of studies have viewed trust as a dynamic variable that can strengthen or decay over time.

Method: Seventy-five participants took part in an aided memory recognition task. Participants viewed a series of images and later performed 40 trials of a recognition task, identifying a target image presented alongside a distractor. In each trial, participants performed the initial recognition by themselves, received a recommendation from an automated decision aid, and then performed the final recognition. After each trial, participants reported their trust on a visual analog scale.

Results: Outcome bias and the contrast effect significantly influence human operators' trust adjustments. An automation failure leads to a larger trust decrement if the final outcome is undesirable, and a marginally larger trust decrement if the human operator succeeds at the task on their own. An automation success engenders a greater trust increment if the human operator fails the task. Additionally, automation failures have a larger effect on trust adjustment than automation successes.

Conclusion: Human operators adjust their trust in automation as a result of their moment-to-moment interaction with automation, and their trust adjustments are significantly influenced by decision-making heuristics and biases.

Application: Understanding the trust adjustment process enables accurate prediction of operators' moment-to-moment trust in automation and informs the design of trust-aware adaptive automation.
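The trial-by-trial adjustment pattern described above can be sketched as a simple asymmetric update rule. This is a hypothetical illustration, not the authors' model: the gain constants and the linear update form are assumptions chosen only to reflect the qualitative findings (failures weigh more than successes, outcome bias, and the contrast effect).

```python
def update_trust(trust, automation_correct, outcome_desirable, human_correct):
    """Return trust in [0, 1] after one trial.

    Illustrative update rule mirroring the reported asymmetries:
    - automation failures decrease trust more than successes increase it
    - a failure hurts more when the final outcome is undesirable (outcome bias)
    - a failure hurts marginally more when the human succeeded alone (contrast)
    - a success helps more when the human failed alone (contrast)
    All gain values are made up for illustration, not estimated from the data.
    """
    if automation_correct:
        gain = 0.05                      # baseline increment (illustrative)
        if not human_correct:
            gain += 0.03                 # contrast: aid succeeded where human failed
        trust += gain * (1.0 - trust)    # saturating increase toward 1
    else:
        loss = 0.10                      # failures weigh more than successes
        if not outcome_desirable:
            loss += 0.05                 # outcome bias: undesirable final outcome
        if human_correct:
            loss += 0.02                 # contrast: human was right, aid was wrong
        trust -= loss * trust            # proportional decrease toward 0
    return min(1.0, max(0.0, trust))
```

For example, starting from trust of 0.5, a single automation failure under this sketch lowers trust by more than a single automation success raises it, reproducing the reported negativity asymmetry.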

Related research

research
06/03/2022

Clustering Trust Dynamics in a Human-Robot Sequential Decision-Making Task

In this paper, we present a framework for trust-aware sequential decisio...
research
07/05/2023

The Effects of Interaction Conflicts, Levels of Automation, and Frequency of Automation on Human Automation Trust and Acceptance

In the presence of interaction conflicts, user trust in automation plays...
research
06/29/2020

Human Trust-based Feedback Control: Dynamically varying automation transparency to optimize human-machine interactions

Human trust in automation plays an essential role in interactions betwee...
research
09/24/2020

Toward Adaptive Trust Calibration for Level 2 Driving Automation

Properly calibrated human trust is essential for successful interaction ...
research
07/26/2020

Modeling and Predicting Trust Dynamics in Human-Robot Teaming: A Bayesian Inference Approach

Trust in automation, or more recently trust in autonomy, has received ex...
research
11/18/2022

What Makes An Apology More Effective? Exploring Anthropomorphism, Individual Differences, And Emotion In Human-Automation Trust Repair

Recent advances in technology have allowed an automation system to recog...
research
02/01/2023

Chatbots for Robotic Process Automation: Investigating Perceived Trust and User Satisfaction

Driven by ongoing improvements in machine learning, chatbots have increa...
