Human Trust-based Feedback Control: Dynamically varying automation transparency to optimize human-machine interactions

06/29/2020
by Kumar Akash, et al.

Human trust in automation plays an essential role in interactions between humans and automation. While a lack of trust can result in a human's disuse of automation, over-trust can cause the human to rely on a faulty autonomous system, with potentially harmful consequences. Therefore, human trust should be calibrated to optimize human-machine interactions with respect to context-specific performance objectives. In this article, we present a probabilistic framework to model and calibrate a human's trust and workload dynamics during their interaction with an intelligent decision-aid system. This calibration is achieved by varying the automation's transparency, that is, the amount and utility of the information provided to the human. The model is parameterized using behavioral data collected through human-subject experiments, and three feedback control policies are experimentally validated and compared against a non-adaptive decision-aid system. The results show that human-automation team performance can be optimized when the transparency is dynamically updated according to the proposed control policy. This framework is a first step toward the widespread design and implementation of real-time adaptive automation for use in human-machine interactions.
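The control loop described above can be pictured as a partially observable decision process: the controller maintains a belief over the human's hidden trust state, updates it from observed behavior, and selects a transparency level to optimize expected performance. Below is a minimal sketch under that framing. The binary trust state, the transition and observation probabilities, the reward values, and the greedy one-step policy are all illustrative assumptions for exposition, not the paper's model or its experimentally fitted parameters.

```python
# Minimal sketch (not the authors' implementation) of trust-calibration
# feedback control: a hidden Markov model over a binary trust state, a
# Bayes belief update from observed reliance behavior, and a greedy policy
# that picks the transparency level maximizing expected one-step reward.
# All probabilities and rewards below are illustrative assumptions.

import numpy as np

TRANSPARENCY = ["low", "medium", "high"]  # decision-aid transparency levels

# P(next trust | current trust, transparency); rows = [low, high] trust.
# Higher transparency tends to raise trust in this toy model.
TRANSITION = {
    "low":    np.array([[0.85, 0.15], [0.30, 0.70]]),
    "medium": np.array([[0.70, 0.30], [0.15, 0.85]]),
    "high":   np.array([[0.55, 0.45], [0.10, 0.90]]),
}

# P(human relies on the aid | trust state): reliance is the observable proxy.
P_RELY = np.array([0.30, 0.90])

# Illustrative one-step reward: high trust is good, but higher transparency
# carries a workload cost, so simply maximizing transparency is not optimal.
REWARD = {"low":    np.array([0.0, 1.0]),
          "medium": np.array([0.0, 0.9]),
          "high":   np.array([0.0, 0.8])}

def belief_update(belief, transparency, relied):
    """Predict trust dynamics under the chosen transparency, then apply a
    Bayes correction using the observed reliance decision."""
    predicted = belief @ TRANSITION[transparency]
    likelihood = P_RELY if relied else 1.0 - P_RELY
    posterior = predicted * likelihood
    return posterior / posterior.sum()

def choose_transparency(belief):
    """Greedy policy: pick the transparency whose predicted next-step
    belief yields the highest expected reward."""
    def expected_reward(u):
        return (belief @ TRANSITION[u]) @ REWARD[u]
    return max(TRANSPARENCY, key=expected_reward)

# Closed loop over a short, made-up trace of reliance decisions.
belief = np.array([0.5, 0.5])
for relied in [False, False, True, True, True]:
    u = choose_transparency(belief)
    belief = belief_update(belief, u, relied)
    print(f"transparency={u:<6}  P(high trust)={belief[1]:.2f}")
```

A greedy one-step policy is used here only to keep the sketch short; a full treatment would solve the underlying partially observable decision problem over a longer horizon and would also carry a workload state alongside trust.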


Related research

09/24/2020 | Toward Adaptive Trust Calibration for Level 2 Driving Automation
Properly calibrated human trust is essential for successful interaction ...

08/03/2020 | Enhancing autonomy transparency: an option-centric rationale approach
While the advances in artificial intelligence and machine learning empow...

04/02/2020 | In Automation We Trust: Investigating the Role of Uncertainty in Active Learning Systems
We investigate how different active learning (AL) query policies coupled...

07/15/2021 | Toward quantifying trust dynamics: How people adjust their trust after moment-to-moment interaction with automation
Objective: We examine how human operators adjust their trust in automati...

03/27/2018 | A Classification Model for Sensing Human Trust in Machines Using EEG and GSR
Today, intelligent machines interact and collaborate with humans in a wa...

01/21/2020 | Impedance Modulation for Negotiating Control Authority in a Haptic Shared Control Paradigm
Communication and cooperation among team members can be enhanced signifi...

12/19/2021 | A Predictive Autonomous Decision Aid for Calibrating Human-Autonomy Reliance in Multi-Agent Task Assignment
In this work, we develop a game-theoretic modeling of the interaction be...
